Pre-packaged Applications (Einstein.SR9)

This page contains information about the 2.x release line (Einstein.SR9) of stream-applications, which is incompatible with the latest (2020.0.0+) release. If you are looking for information about the latest release, see pre-packaged applications.

The Spring team provides and supports a selection of pre-packaged applications that you can use to assemble various data integration and processing pipelines and to support Spring Cloud Data Flow development, learning, and experimentation.

Getting Started

If you are interested in upgrading existing data pipelines to use 3.x applications, see the Migration Guide.

All pre-packaged streaming applications:

  • Are available as Apache Maven artifacts or Docker images.
  • Use RabbitMQ or Apache Kafka.
  • Support monitoring through Prometheus and InfluxDB.
  • Contain metadata about application properties, which is used in the UI and for code completion in the shell.

You can register stream and task applications by using the Data Flow UI or the shell.

You can register applications individually by using the app register command or in bulk by using the app import command.
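For example, an individual registration from the Data Flow shell might look like the following. The Maven coordinates and version shown here are illustrative; substitute the artifact and version you actually need:

```
dataflow:>app register --name http --type source --uri maven://org.springframework.cloud.stream.app:http-source-rabbit:2.1.2.RELEASE
```

The `--type` value identifies the application role (`source`, `processor`, `sink`, or `task`), and the `--uri` value may point at either a Maven artifact or a Docker image.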

For streams, depending on whether you use Kafka or RabbitMQ, you can register the applications by using their respective URLs:

For tasks, you can use the following URLs:

When you use the Data Flow UI, the bulk-registration links shown in the following image are pre-filled:

Bulk register applications

From the Data Flow Shell, you can bulk import and register the applications, as the following example shows:

dataflow:>app import --uri

The latest bulk-registration links are updated as part of the release process. Additional bulk-registration links are available for the pre-packaged applications of a specific release.

Stream Applications

The Spring team develops and maintains stream applications and publishes them to the Spring public Maven repository and to Docker Hub in accordance with a release schedule, normally following significant Spring Boot or Spring Cloud Stream releases. The pre-packaged stream applications are Spring Boot executable jars, each built with a specific binder implementation. For each stream app, we provide separate executable applications for RabbitMQ and Kafka.
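As an illustration of the per-binder packaging, the `http` source is published as a separate artifact for each binder. The coordinates below follow the `<app>-<type>-<binder>` naming convention; the version is illustrative:

```
# Kafka binder variant of the http source (illustrative version)
maven://org.springframework.cloud.stream.app:http-source-kafka:2.1.2.RELEASE

# RabbitMQ binder variant of the same app
maven://org.springframework.cloud.stream.app:http-source-rabbit:2.1.2.RELEASE
```

Because the binder is baked into the jar, you register the variant that matches the message broker your Data Flow installation uses.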

The following table shows the currently available stream applications:

| Source | Processor | Sink |
| --- | --- | --- |
| cdc-debezium | aggregator | cassandra |
| file | bridge | counter |
| ftp | counter | file |
| gemfire | filter | ftp |
| gemfire-cq | groovy-filter | gemfire |
| http | groovy-transform | hdfs |
| jdbc | grpc | jdbc |
| jms | header-enricher | log |
| load-generator | httpclient | mongodb |
| loggregator | image-recognition | mqtt |
| mail | object-detection | pgcopy |
| mongodb | pmml | rabbit |
| mqtt | pose-estimation | redis-pubsub |
| rabbit | python-http | router |
| s3 | python-jython | s3 |
| sftp | scriptable-transform | sftp |
| sftp-dataflow | splitter | task-launcher-dataflow |
| syslog | tasklaunchrequest-transform | tcp |
| tcp | tcp-client | throughput |
| tcp-client | tensorflow | websocket |
| time | transform | |
| trigger | twitter-sentiment | |

Task Applications

The Spring team develops and maintains task and batch applications and publishes them to the Spring public Maven repository and to Docker Hub in accordance with a planned release schedule, normally following significant Spring Boot, Spring Cloud Task, or Spring Batch releases.

The currently available task and batch applications are as follows:


Bulk Registration of Stream Applications

Spring Cloud Data Flow supports bulk registration of applications through a standard properties file format. For convenience, we publish static properties files with application URIs (for either Maven or Docker) for all the out-of-the-box stream, task, and batch apps. You can use these files in Spring Cloud Data Flow to register all the application URIs in bulk. Alternatively, you can register applications individually or provide your own custom property file that contains only the required application URIs. Bulk registration is convenient for getting started with SCDF. To reduce clutter, we recommend maintaining a "focused" list of the desired application URIs in a custom property file.
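A minimal custom property file uses the same `<type>.<name>=<uri>` format as the published bulk-registration files. The entries and versions below are illustrative:

```
# Focused registration list: only the apps this team actually deploys
source.http=maven://org.springframework.cloud.stream.app:http-source-rabbit:2.1.2.RELEASE
processor.transform=maven://org.springframework.cloud.stream.app:transform-processor-rabbit:2.1.2.RELEASE
sink.log=maven://org.springframework.cloud.stream.app:log-sink-rabbit:2.1.2.RELEASE
```

You can then register only these applications from the shell with `app import --uri file:///path/to/my-apps.properties` (the file path here is a placeholder).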