Spring Cloud Data Flow lets users create, configure, and launch a simple single-step Spring Batch job application without writing any code. A single-step batch job is composed of one item reader and one item writer. An item reader reads data from a given type of input; an item writer performs the inverse operation, writing data out to a given type of output rather than reading it in.
The single-step batch job provides four types of readers:

- Flat File
- JDBC
- AMQP
- Kafka

Similarly, it offers four types of writers:

- Flat File
- JDBC
- AMQP
- Kafka
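Although no code is required, it helps to picture what the single step does at runtime. The sketch below is an illustrative Python analogue of Spring Batch's chunk-oriented model, not actual SCDF or Spring Batch code: the reader supplies items one at a time, and the writer receives them in chunks, with one commit per chunk (the chunk size is one of the properties you set when launching the job):

```python
# Illustrative sketch of the chunk-oriented model used by the
# single-step batch job: read items one at a time, write them
# out in chunks, committing once per chunk.

def run_single_step_job(reader, writer, chunk_size):
    """Read items until the reader is exhausted, writing in chunks."""
    chunk = []
    for item in reader:
        chunk.append(item)
        if len(chunk) == chunk_size:
            writer(chunk)   # one "commit" per full chunk
            chunk = []
    if chunk:               # flush the final partial chunk
        writer(chunk)

# Example: copy five records with a chunk size of 2.
written = []
run_single_step_job(iter(["a", "b", "c", "d", "e"]),
                    written.append, chunk_size=2)
# written is now [["a", "b"], ["c", "d"], ["e"]]
```

The real contract is defined by Spring Batch's `ItemReader` and `ItemWriter` interfaces; the UI steps below configure concrete implementations of those interfaces through properties.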
This topic describes how to configure a single-step batch job.
Register Single Step Batch Job Application
Begin by registering the single-step batch job application. In the Spring Cloud Data Flow UI, select the Applications option on the left side of the page. This opens the Applications page:
To register an application, select ADD APPLICATION(S). When the Add Application(s) page appears, select Register one or more applications.
Fill in the form as shown below, and click IMPORT APPLICATION(S):
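If you prefer scripting the setup, the same registration can be done from the Spring Cloud Data Flow shell with the `app register` command. The Maven coordinates below are a placeholder; substitute the URI of the single-step batch job artifact you actually imported:

```shell
dataflow:>app register --name singlestepbatchjob --type task --uri maven://io.spring:single-step-batch-job:1.0.0
```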
Create Task Definition
To create a task in the Spring Cloud Data Flow UI:
- Select Tasks from the left navigation bar.
- Select Create task(s). This opens a graphical editor that you can use to compose tasks. The initial canvas contains START and END nodes. The left side of the canvas lists the available task applications, including singlestepbatchjob, which was registered in the previous section.
- Drag the singlestepbatchjob task application to the canvas.
- Connect the task to the START and END nodes to complete the task definition.
- Click CREATE TASK. You will be prompted to name the task definition, which is the logical name for the runtime configuration that you want to deploy. In this case, use the same name as the task application:
- Click CREATE THE TASK. You will be taken to the main Tasks view.
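The same definition can also be created from the Data Flow shell. `task create` takes the definition name and the definition DSL, which here is just the name of the registered task application:

```shell
dataflow:>task create singlestepbatchjob --definition "singlestepbatchjob"
```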
Launch Single Step Batch Job Application
You can launch the single-step batch job from the Task UI.
To launch the task:
- Click the option control on the row of the singlestepbatchjob definition, and select the Launch option. This opens a form where you can add command-line arguments and deployment properties.
- Select the EDIT button under the Application Properties section of the Launch page.
- Click App Properties, and then select the spring.batch.job dropdown. Enter the following:
- Chunk Size: The number of records to process before committing a transaction
- Step Name: The name of the step associated with the job
- Job Name: The name of the job to be processed
- Populate the reader properties.
- Click Reader properties.
- Select the reader type (File, AMQP, JDBC, or Kafka).
- Select the properties dropdown that appears and populate the properties that specify how to read from the input resource.
- Populate the writer properties.
- Click Writer properties.
- Select the writer type (File, AMQP, JDBC, or Kafka).
- Select the properties dropdown that appears and populate the properties that specify how to write to the output resource.
- Click Launch the task. This runs the task on the Data Flow server's task platform and records a new task execution. You can track the execution progress using the Task Progress Indicator.
- When the task is complete, you can check the status of the job by selecting the Job executions option on the left side of the page.
- Select the execution ID of the job that you just launched. From there, you can review the status of the job execution.
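For repeatable launches, the form-filling steps above can be replaced by a single `task launch` from the Data Flow shell, passing the same settings as deployment properties with the `app.<application name>.` prefix. The property names below (`spring.batch.job.jobName`, `stepName`, and `chunkSize`) follow the single-step batch job conventions, but treat the specific values as illustrative:

```shell
dataflow:>task launch singlestepbatchjob --properties "app.singlestepbatchjob.spring.batch.job.jobName=job1,app.singlestepbatchjob.spring.batch.job.stepName=step1,app.singlestepbatchjob.spring.batch.job.chunkSize=5"
```

Reader and writer properties can be appended to the same `--properties` list using the prefixes shown in the UI dropdowns for the selected reader and writer types.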