The Zip File component compresses files in the Zip format.

Take a look at the configuration parameters of the component:

  • File Name: name of the file to be compressed.

  • Zip Operation: defines the type of operation (currently only "Compress" is supported).

  • Output File Name: name of the Zip file to be generated.

  • Custom Files Specification: valid for the MULTIPLE COMPRESS operation only. If the option is enabled, the files to be compressed can be passed dynamically; otherwise, they must be specified individually as key-value pairs.

  • Files: valid for the MULTIPLE COMPRESS operation only; defines the files to be compressed.

  • Fail On Error: if the option is enabled, a pipeline execution with an error will be interrupted; otherwise, the pipeline execution proceeds, but the result will show a false value for the “success” property.
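To illustrate, a MULTIPLE COMPRESS configuration with Custom Files Specification disabled might look like the sketch below. The field names mirror the parameters above, but the exact JSON shape is an assumption for illustration, not the component's literal configuration schema:

```json
{
  "zipOperation": "MULTIPLE COMPRESS",
  "outputFileName": "archive.zip",
  "customFilesSpecification": false,
  "files": {
    "file1": "data.csv",
    "file2": "processing/data2.csv"
  },
  "failOnError": true
}
```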

Messages flow

Input

The component accepts any input message and can reference its content through Double Braces expressions.
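For example, the File Name and Output File Name parameters could pick up values from the input message with Double Braces. The field paths (message.file, message.output) are hypothetical and depend on the actual shape of your input message:

```json
{
  "fileName": "{{ message.file }}",
  "outputFileName": "{{ message.output }}"
}
```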

Output

  • without error

{
  "fileName": "data.csv",
  "success": true
}

  • with error

{
  "success": false,
  "message": "File data.csv already exists.",
  "exception": "com.digibee.pipelineengine.exception.PipelineEngineRuntimeException"
}

Zip File in Action

Response

{
  "success": true,
  "outputFileName": "data.zip"
}

  • outputFileName: name of the file written

  • success: “true” if the operation was successfully executed; “false” if it failed

Response with error

{
  "exception": "java.io.FileNotFoundException: /tmp/pipeline-engine/3b3755ad-4256-429a-8898-2f7eea80f7db/data1.csv (No such file or directory)",
  "message": "Encountered an I/O error while executing ZipFileConnector",
  "success": false
}

  • success: “false” when the operation fails

  • message: description of the error

  • exception: information about the type of error that occurred

File manipulation in the pipeline

The pipeline has a temporary, local area for file manipulation, which is isolated and valid only during the execution of the flow.

Therefore, think of file access as happening in a virtual file system. File names may contain any valid characters and extensions, and may include a relative directory. For example:

  • data.csv

  • processing/data.csv

Any attempt to access absolute paths is blocked during the execution of the pipeline.
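The path rule above can be sketched as a simple check. This is an illustrative approximation of the virtual-file-system behavior, not the engine's actual implementation; the rejection of ".." traversal is an assumption beyond what the text states:

```python
from pathlib import PurePosixPath

def is_allowed_path(name: str) -> bool:
    """Return True if the file name is a relative path, as the pipeline requires.

    Illustrative approximation: absolute paths are blocked, while relative
    paths (optionally with subdirectories) are accepted.
    """
    path = PurePosixPath(name)
    # Absolute paths such as /tmp/data.csv are blocked by the pipeline.
    if path.is_absolute():
        return False
    # Assumption: ".." traversal out of the sandbox is also rejected.
    if ".." in path.parts:
        return False
    return True

print(is_allowed_path("data.csv"))             # True
print(is_allowed_path("processing/data.csv"))  # True
print(is_allowed_path("/tmp/data.csv"))        # False
```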
