This page contains information about the 2.x (Einstein.SR9) release of stream-applications, which is incompatible with the latest (2020.0.0+) release. If you are looking for information about the latest release, see pre-packaged applications.
The Spring team provides and supports a selection of pre-packaged applications that you can use to assemble various data integration and processing pipelines and to support Spring Cloud Data Flow development, learning, and experimentation.
If you are interested in upgrading existing data pipelines to use
3.x applications, see the Migration Guide.
All pre-packaged streaming applications:
- Are available as Apache Maven artifacts or Docker images.
- Use RabbitMQ or Apache Kafka.
- Support monitoring through Prometheus and InfluxDB.
- Contain metadata for application properties that are used in the UI and code completion in the shell.
You can register stream and task applications by using the Data Flow UI or the shell.
You can register applications individually by using the
app register command or in bulk by using the
app import command.
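For example, from the Data Flow Shell you can register a single application with app register, or pull in a whole catalog with app import. The commands below are a minimal sketch; the application name, type, and Maven coordinates shown are illustrative, not pinned to a particular release:

```
dataflow:>app register --name http --type source --uri maven://org.springframework.cloud.stream.app:http-source-rabbit:2.1.0.RELEASE
dataflow:>app import --uri https://dataflow.spring.io/rabbitmq-maven-einstein
```

With app register, the --type value must be one of the supported application types (such as source, processor, sink, or task), and --uri can point to a Maven artifact or a Docker image.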
For streams, depending on whether you use Kafka or RabbitMQ, you can register the applications by using their respective URLs:
- Kafka (Docker): https://dataflow.spring.io/kafka-docker-einstein
- Kafka (Maven): https://dataflow.spring.io/kafka-maven-einstein
- RabbitMQ (Docker): https://dataflow.spring.io/rabbitmq-docker-einstein
- RabbitMQ (Maven): https://dataflow.spring.io/rabbitmq-maven-einstein
For tasks, you can use the following URLs:
- Docker: https://dataflow.spring.io/task-docker-latest
- Maven: https://dataflow.spring.io/task-maven-latest
When you use the Data Flow UI, these bulk-registration links are offered as pre-fill options on the application import screen.
From the Data Flow Shell, you can bulk import and register the applications, as the following example shows:
dataflow:>app import --uri https://dataflow.spring.io/kafka-maven-latest
The latest bulk-registration links are updated as part of the release process. There are also additional bulk-registration links that point to the pre-packaged applications for a specific release.
The Spring team develops and maintains stream applications and publishes these applications to the Spring public Maven repository and to Dockerhub in accordance with a release schedule, normally following significant Spring Boot or Spring Cloud Stream releases. The pre-packaged stream applications are Spring Boot executable jars that are built with a specific binder implementation. For each stream app, we provide separate executable applications for RabbitMQ and Kafka.
The following table shows the currently available stream applications:
The Spring team develops and maintains task and batch applications and publishes these applications to the Spring public Maven repository and to Dockerhub in accordance with a planned release schedule, normally following significant Spring Boot, Spring Cloud Task, or Spring Batch releases.
The currently available task and batch applications are as follows:
Spring Cloud Data Flow supports bulk registration of applications through a standard properties file format. For convenience, we publish static properties files with application URIs (for either Maven or Docker) for all the out-of-the-box stream, task, and batch apps. You can use these files in Spring Cloud Data Flow to register all the application URIs in bulk. Alternatively, you can register applications individually or provide your own custom properties file containing only the application URIs you need. Bulk registration is convenient for getting started with SCDF. To reduce clutter, we recommend maintaining a "focused" list of desired application URIs in a custom properties file.
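A custom bulk-registration file is a plain Java properties file that maps a key of the form type.name to an application URI. A minimal sketch of such a file might look as follows; the application names and versions here are illustrative rather than tied to a specific release:

```
source.http=maven://org.springframework.cloud.stream.app:http-source-rabbit:2.1.0.RELEASE
processor.transform=maven://org.springframework.cloud.stream.app:transform-processor-rabbit:2.1.0.RELEASE
sink.log=maven://org.springframework.cloud.stream.app:log-sink-rabbit:2.1.0.RELEASE
```

You can then register the file's contents in bulk by passing its location to app import, for example with a file: URI such as app import --uri file:///path/to/my-apps.properties.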