The Mule event source triggers the Mule flow. Common event sources are listeners, such as an HTTP listener from Anypoint Connector for HTTP (HTTP Connector), a Scheduler component, or a connector operation that polls for new files.

Processors located upstream of the Batch Job component typically retrieve and, if necessary, prepare a message for the Batch Job component to consume. For example, an HTTP request operation might retrieve the data to process, and a DataWeave script in a Transform Message component might transform the data into a valid format for the Batch Job component to receive.

When the Batch Job component receives a message from an upstream processor in the flow, the Load and Dispatch phase begins. In this phase, the component prepares the input for processing as records, which includes creating a batch job instance in which processing takes place.

The Process phase begins when the batch job instance executes. All record processing takes place during this phase. Each Batch Step component contains one or more processors that act upon a record to transform, route, enrich, or modify data in the records.

A Batch Aggregator component is optional, and you can add only one to a Batch Step component. The initial processor within a Batch Aggregator must be able to accept an array of records as input. For example, you might configure a connector operation to pass processed records one by one to an external server.
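The structure described above can be sketched as a Mule 4 configuration. This is a minimal, illustrative example, not a complete application: the flow name, step name, scheduler frequency, and aggregator size are placeholder values, and real flows would replace the loggers with connector operations and DataWeave transformations.

```xml
<!-- Illustrative sketch of a flow containing a Batch Job; names and values are placeholders -->
<flow name="recordProcessingFlow">

  <!-- Event source: a Scheduler triggers the flow (an HTTP listener or
       polling connector operation could be used instead) -->
  <scheduler>
    <scheduling-strategy>
      <fixed-frequency frequency="60" timeUnit="SECONDS"/>
    </scheduling-strategy>
  </scheduler>

  <!-- Upstream processors would retrieve and prepare the message here,
       e.g. an HTTP request operation followed by a Transform Message component -->

  <!-- Load and Dispatch: the Batch Job component splits the input into
       records and creates a batch job instance -->
  <batch:job jobName="recordBatchJob">
    <batch:process-records>

      <!-- Process phase: each Batch Step acts on records one at a time -->
      <batch:step name="processStep">
        <logger message="#[payload]" doc:name="Per-record processor"/>

        <!-- Optional: at most one Batch Aggregator per Batch Step; its
             first processor must accept an array of records -->
        <batch:aggregator size="100">
          <logger message="#[sizeOf(payload)]" doc:name="Accepts an array of records"/>
        </batch:aggregator>
      </batch:step>

    </batch:process-records>
  </batch:job>
</flow>
```

With `size="100"`, the aggregator collects processed records into arrays of up to 100 before invoking its processors, whereas the processors inside the Batch Step itself receive one record per invocation.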