Task and Batch Monitoring with InfluxDB

This section describes how to monitor the applications that were deployed as part of a Task definition in SCDF. The setup for each platform is different, but the general architecture is the same across the platforms.

The Data Flow 2.x metrics architecture is designed around the Micrometer library, which is a vendor-neutral application metrics facade. It provides a simple facade over the instrumentation clients for the most popular monitoring systems. See the Micrometer documentation for the list of supported monitoring systems. Starting with Spring Boot 2.0, Micrometer is the instrumentation library that powers the delivery of application metrics from Spring Boot. Spring Batch provides additional integration to expose metrics around task durations, rates, and errors, which is critical to the monitoring of deployed batch jobs.

The core of the Micrometer task integration is part of the Spring Cloud Task 2.2.0 release line, which is a prerequisite for the Task metrics and Data Flow integration. Task applications built on Spring Cloud Task 2.2.0 can be configured to emit Task and Batch metrics to the pre-configured monitoring systems supported by Micrometer.

To enable Task metrics integration with Data Flow, you must add the spring-boot-starter-actuator dependency and include the desired Micrometer registry as a dependency in the Task POM.

For example:


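A minimal POM fragment for the InfluxDB case would look like the following (other monitoring systems follow the same pattern with their own registry artifact, such as micrometer-registry-prometheus):

```xml
<!-- Exposes the actuator endpoints that drive metrics publication -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
<!-- Micrometer meter registry for InfluxDB -->
<dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-registry-influx</artifactId>
</dependency>
```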
To help you get started monitoring tasks, Data Flow provides Grafana Dashboards that you can install and customize for your needs.

The following image shows the general architecture of how applications are monitored:

Task Monitoring Architecture

To allow aggregating metrics per application type and per instance id or per task name, the Spring Cloud Task applications are configured to use the following Micrometer tags:

  • task.name: The name of the task that contains the applications that send the metrics.
  • task.execution.id: The instance ID of the executed task.
  • task.external.execution.id: The external task ID as present on the target platform (such as Cloud Foundry or Kubernetes).
  • task.parent.execution.id: The parent task ID, used to identify a task that executes another task or tasks.
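As an illustration, once the metrics reach InfluxDB, these tags can be used to filter and group queries per task. The measurement and field names below are assumptions for the sketch; the actual names depend on the meters the task registers:

```sql
-- Mean task duration over the last hour, grouped per task name and execution
-- ("spring_cloud_task" and the "mean" field are illustrative names)
SELECT MEAN("mean") FROM "spring_cloud_task"
WHERE time > now() - 1h
GROUP BY "task.name", "task.execution.id"
```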

Because setting up InfluxDB differs depending on the platform on which you run, we provide instructions for each platform. In Spring Cloud Data Flow 2.x, InfluxDB instructions are provided for the local server and Cloud Foundry.


Local

This section describes how to set up InfluxDB on a local machine.


InfluxDB is a popular open-source push-based time series database. It supports downsampling, automatically expiring and deleting unwanted data, and backup and restore. Analysis of data is done through an SQL-like query language.
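For example, expiration and downsampling are configured through retention policies and continuous queries in InfluxDB 1.x; the database and policy names below are illustrative:

```sql
-- Keep raw data for 30 days, then expire it automatically
CREATE DATABASE "myinfluxdb" WITH DURATION 30d REPLICATION 1 NAME "thirty_days"

-- Downsample all measurements to 5-minute averages into a longer-lived policy
CREATE RETENTION POLICY "one_year" ON "myinfluxdb" DURATION 52w REPLICATION 1
CREATE CONTINUOUS QUERY "cq_5m" ON "myinfluxdb" BEGIN
  SELECT MEAN(*) INTO "one_year".:MEASUREMENT FROM /.*/ GROUP BY time(5m), *
END
```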

To enable Micrometer’s Influx meter registry for Spring Cloud Task application starters, start the Data Flow server with the following properties:


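For the local server, these properties can be passed as command-line arguments. The following is a sketch: the database name, URIs, and jar version are placeholders to adjust for your environment, while the property names match those used in the Cloud Foundry section below:

```shell
java -jar spring-cloud-dataflow-server-2.2.0.BUILD-SNAPSHOT.jar \
    --spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.enabled=true \
    --spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.db=myinfluxdb \
    --spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.uri=http://localhost:8086 \
    --spring.cloud.dataflow.grafana-info.url=http://localhost:3000
```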
Instead of having to install these components manually, for a quick bootstrap, Spring Cloud Data Flow provides a Docker Compose Influx file, which brings up Spring Cloud Data Flow, Skipper, Apache Kafka, InfluxDB, and prebuilt dashboards for Grafana. The instructions below leverage this approach.

Upgrade to latest version of Docker

We recommend that you upgrade to the latest version of Docker before running the docker-compose command. We have tested with Docker Engine version 18.09.2.

  • Downloading the Docker Compose Influx file

To download the Spring Cloud Data Flow Server Docker Compose file, run the following command:

wget https://raw.githubusercontent.com/spring-cloud/spring-cloud-dataflow/master/spring-cloud-dataflow-server/docker-compose-influxdb.yml
  • Starting Docker Compose

In the directory where you downloaded docker-compose-influxdb.yml, start the system by running the following command:

docker-compose -f ./docker-compose-influxdb.yml up

Now that Docker Compose is up, you can access the Spring Cloud Data Flow Dashboard at http://localhost:9393/dashboard. You can also reach the Grafana dashboard at http://localhost:3000 with the credentials user: admin and password: admin.

Now you can register a custom Task application (task-demo-metrics) and define two tasks (task1 and task2):

dataflow:>app register --name myTask --type task --uri https://github.com/tzolov/task-demo-metrics/raw/master/apps/task-demo-metrics-0.0.1-SNAPSHOT.jar

dataflow:>task create --name task1 --definition "myTask"
dataflow:>task create --name task2 --definition "myTask"

Launch the tasks several times:

dataflow:>task launch --name task1
dataflow:>task launch --name task2

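To verify that the launches actually produced metrics, you can query InfluxDB from inside its container. The container name below assumes the Docker Compose defaults, and the database and measurement names are illustrative:

```shell
# Open the Influx CLI inside the InfluxDB container
docker exec -it influxdb influx

# Then, at the influx prompt:
#   show databases
#   use myinfluxdb
#   show measurements
#   select * from "spring_cloud_task" limit 10
```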
In the Data Flow task execution UI, you should see a list like the one in the following image:

SCDF Task Execution

You should see dashboards similar to those shown in the following image:

SCDF Task Grafana InfluxDB

Cloud Foundry

This section describes how to set up InfluxDB for Cloud Foundry.


You can follow the general Manifest-based installation on Cloud Foundry instructions for installing Skipper and Data Flow on Cloud Foundry. To enable the Task metrics integration, you need to extend the Data Flow manifest with the following SPRING_APPLICATION_JSON properties:

  "spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.enabled": true,
  "spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.db": "yourinfluxdb",
  "spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.autoCreateDb": false,
  "spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.uri": "https://your-influx-uri:port",
  "spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.userName": "influxusername",
  "spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.password": "******",
  "spring.cloud.dataflow.grafana-info.url": "https://your-grafana-uri:443"

See the Influx Actuator properties for further details about the management.metrics.export.influx.XXX properties.

A complete example of a Data Flow manifest that enables metrics collection for both Streams and Tasks would look like this:

applications:
- name: data-flow-server
  host: data-flow-server
  memory: 2G
  disk_quota: 2G
  instances: 1
  path: ./spring-cloud-dataflow-server-2.2.0.BUILD-SNAPSHOT.jar
  env:
    SPRING_APPLICATION_NAME: data-flow-server
    MAVEN_REMOTEREPOSITORIES[REPO1]_URL: https://repo.spring.io/libs-snapshot
    SPRING_CLOUD_SKIPPER_CLIENT_SERVER_URI: http://your-skipper-server-uri/api
    SPRING_APPLICATION_JSON: '{"spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.enabled": true,
      "spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.db": "defaultdb",
      "spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.autoCreateDb": false,
      "spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.uri": "https://influx-uri:port",
      "spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.userName": "guest",
      "spring.cloud.dataflow.applicationProperties.task.management.metrics.export.influx.password": "******",
      "spring.cloud.dataflow.applicationProperties.stream.management.metrics.export.influx.enabled": true,
      "spring.cloud.dataflow.applicationProperties.stream.management.metrics.export.influx.db": "defaultdb",
      "spring.cloud.dataflow.applicationProperties.stream.management.metrics.export.influx.autoCreateDb": false,
      "spring.cloud.dataflow.applicationProperties.stream.management.metrics.export.influx.uri": "https://influx-uri:port",
      "spring.cloud.dataflow.applicationProperties.stream.management.metrics.export.influx.userName": "guest",
      "spring.cloud.dataflow.applicationProperties.stream.management.metrics.export.influx.password": "******",
      "spring.cloud.dataflow.grafana-info.url": "https://grafana-uri:port"}'
services:
- mysql
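With the manifest saved to a file (manifest.yml is assumed here), the server is deployed in the usual Cloud Foundry way:

```shell
# Push the Data Flow server using the manifest above
cf push -f manifest.yml
```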