Stream Excel reads a local Excel file row by row, converting each row into a JSON structure, and triggers subpipelines to process each line. This component is recommended for situations in which large files need to be processed.

Take a look at the configuration parameters of the component:

  • File Name: determines the name of the local file to be read.

  • Sheet Name: name of the Excel sheet to be read.

  • Sheet Index: Excel sheet index to be read.

  • Use Sheet Index Instead Of Name: if activated, this option allows the sheet to be specified by its index instead of its name.

  • Max Fractional Digits: determines the number of fractional digits (precision) used when reading a numeric cell from the Excel file (default = 10).

  • Read Specific Columns As String: indicates which columns the component must read as strings instead of in their original format. Specify the desired columns as a comma-separated list (e.g.: A,B,X,AA).

  • Read All Columns As String: if selected, this option makes all columns be read as strings.

  • Parallel Execution Of Each Iteration: if selected, this option makes the file lines be processed in parallel.

  • Fail On Error: when activated, this parameter suspends the pipeline execution only if a severe error occurs in the iteration structure itself, preventing its completion. Activating “Fail On Error” has no connection with errors that occur in the components used to build the subpipelines (onProcess and onException).

  • Advanced: when selected, this option enables the definition of the advanced parameters below.

  • Skip: number of lines to be skipped before the file is read.

  • Limit: maximum number of lines to be read.
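Together, Skip and Limit define a window over the file's rows while preserving the row-by-row streaming behavior. The platform's internal implementation is not public; the sketch below is only a conceptual illustration in Python, and `windowed_rows` is a hypothetical helper:

```python
import itertools
from typing import Iterable, Iterator, Optional

def windowed_rows(rows: Iterable[dict], skip: int = 0,
                  limit: Optional[int] = None) -> Iterator[dict]:
    """Yield rows one at a time, skipping the first `skip` lines and
    reading at most `limit` lines, so the file is never fully loaded."""
    stop = None if limit is None else skip + limit
    yield from itertools.islice(rows, skip, stop)

# Example: with Skip = 2 and Limit = 3, only the 3rd to 5th lines are read.
rows = ({"line": i} for i in range(10))
print(list(windowed_rows(rows, skip=2, limit=3)))
# [{'line': 2}, {'line': 3}, {'line': 4}]
```

Because the helper is a generator, rows before the window are discarded and iteration stops as soon as the limit is reached.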

Stream Excel performs batch processing. To better understand the concept, click here.

IMPORTANT: Stream Excel can’t read files in the .xls format; only the .xlsx format is supported.
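When “Parallel Execution Of Each Iteration” is enabled, the lines are handed to the subpipeline concurrently instead of one after another. The platform's actual execution model is not public; the sketch below only illustrates the idea, with `on_process` standing in for the onProcess subpipeline:

```python
from concurrent.futures import ThreadPoolExecutor
from typing import Callable, Iterable, List

def process_rows(rows: Iterable[dict], on_process: Callable[[dict], dict],
                 parallel: bool = False) -> List[dict]:
    """Run the per-line handler sequentially or concurrently."""
    if parallel:
        # Concurrent execution; pool.map still returns results in row order.
        with ThreadPoolExecutor() as pool:
            return list(pool.map(on_process, rows))
    return [on_process(row) for row in rows]

rows = [{"line": i} for i in range(5)]
print(process_rows(rows, lambda r: {"success": True}, parallel=True))
```

Note that even with parallel execution the per-line results come back in the original row order, only their processing overlaps in time.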

Messages flow

Input

The component accepts any input message and can use it through Double Braces.

Output

The component returns a JSON with the total number of executions, the successful executions, and the executions with error.

  • without error

    {
      "total": 5,
      "success": 5,
      "failed": 0
    }

  • with error

    {
      "total": 5,
      "success": 3,
      "failed": 2
    }

  • total: total number of processed lines

  • success: total number of successfully processed lines

  • failed: total number of lines whose processing failed

IMPORTANT: for a line to be considered correctly processed, its subpipeline must return { "success": true } for that line.
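The output summary above can be derived by aggregating the per-line results, counting a line as successful only when its result contains "success": true. A minimal sketch (the per-line result format is assumed from the note above):

```python
from typing import List

def summarize(results: List[dict]) -> dict:
    """Build the component's output JSON from per-line subpipeline results."""
    success = sum(1 for r in results if r.get("success") is True)
    return {"total": len(results), "success": success,
            "failed": len(results) - success}

results = [{"success": True}, {"success": True}, {"success": True},
           {"success": False}, {"error": "invalid cell"}]
print(summarize(results))  # {'total': 5, 'success': 3, 'failed': 2}
```

A line whose result lacks the "success" key entirely (e.g. an error payload) is counted as failed, matching the "with error" sample above.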

The component throws an exception if the file doesn’t exist or can’t be read. In other error cases, a message containing the exception that occurred is produced at the output.

File manipulation inside a pipeline happens in a protected way: all files can be accessed through a temporary directory only, where each pipeline key gives access to its own set of files.
