HTTP File Trigger - Uploads

Learn what the trigger does and how to use it.

Written by Rodrigo Lara

The HTTP File Trigger sends large files (larger than 5 MB) to pipelines in a robust and efficient way over HTTP.

Take a look at the configuration parameters of the trigger:

  • Methods: defines the methods accepted by the pipeline (GET, PUT, POST, PATCH and DELETE).

  • Response Headers: headers to be returned by the endpoint when processing in the pipeline is complete. This parameter cannot be left empty and accepts Double Braces. Special characters should not be used in keys, due to possible failures in proxies and gateways.

  • Add Cross-Origin Resource Sharing (CORS) - CORS Headers: add the CORS headers to be returned by the endpoint when processing in the pipeline is complete. Cross-Origin Resource Sharing (CORS) is a mechanism that lets you tell the browser which origins are allowed to make requests. This parameter defines CORS specifically for the pipeline and its constraints. To configure globally rather than individually on each pipeline, see the CORS Global article.


    Important: Use a comma to separate multiple values in a header, with no space before or after the comma. Special characters should not be used in keys, due to possible failures in proxies and gateways.


  • Form Data Uploads: enables/disables the receipt of form data uploads (multipart/form-data).

  • Body Uploads: enables/disables the receipt of body uploads.

  • Body Upload Content Types: defines the content types allowed by the pipeline for body uploads.

  • Response Content Types: defines the response content types allowed by the pipeline. When configuring this response, you determine the response format.

  • Maximum Timeout: defines the maximum expiration time in milliseconds (default = 60000).

  • Maximum Request Size: defines the maximum size of the file in the upload request (in MB) (maximum = 100 MB).

  • External API: if activated, the option publishes the API in an external gateway.

  • Internal API: if activated, the option publishes the API in an internal gateway.

  • mTLS enabled API: If enabled, the API is published to a dedicated gateway for APIs with mTLS enabled by default. In this case, the access host will be different from the others. The pipeline can have both the External API and Internal API options enabled at the same time, but it is recommended to leave them inactive. This parameter does not support API Key and JWT. To use it in your realm, it is necessary to make a request via chat, and we will send you the necessary information to install this service.

  • API Key: if activated, the option requires an API Key to consume the API.

  • Token JWT: if activated, the option requires a JWT token to consume the API.

  • Additional API Routes: if activated, the option allows you to add new routes.

  • API Routes: custom routes.

  • Security (Legacy): if activated, the option determines the need to use a JWT token to access the exposed API.

  • Allow Redelivery Of Messages: if activated, the option allows messages to be redelivered in case of Pipeline Engine failure.
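The note about comma-separated header values can be illustrated with a short sketch. The helper below is hypothetical (not a platform API); it only demonstrates the required formatting of multiple values in one header:

```python
# Join multiple header values with commas and no surrounding spaces,
# as the trigger's header parameters require.
def join_header_values(values):
    # Strip stray whitespace from each value before joining.
    return ",".join(v.strip() for v in values)

allowed_methods = join_header_values(["GET", "POST ", " PUT"])
print(allowed_methods)  # GET,POST,PUT
```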

HTTP File Trigger in Action

Scenario 1: POST with "multipart/form-data" content type of a file

Let's say you want to send a file larger than 5 MB. You can call a pipeline endpoint configured with the HTTP File Trigger via POST with the "multipart/form-data" content type so that the file becomes available for the pipeline to process.

Check how to do that:

  1. Create a pipeline and configure its trigger as HTTP-File, enabling the Form Data Uploads option.

  2. Deploy the pipeline.

  3. Create an API Key and configure it so that it has access to the pipeline.

  4. Call the pipeline through this curl:

curl -d "@file_name" https://godigibee.io/pipeline/realm_name/v1/http-file-upload -v -H 'Content-Type: application/pdf' -H 'apikey: generated_token'

  • file_name: refers to a local file

  • realm_name: refers to the Realm where the pipeline is located

  • generated_token: refers to the token generated for the recently created API Key


The pipeline will receive an array files[] containing:

  • fileName

  • param

  • contentType

That way, the file can be referenced and accessed from the pipeline:

{
  "body": null,
  "form": {},
  "headers": {
    ...
  },
  "queryAndPath": {},
  "method": "POST",
  "contentType": "application/pdf",
  "path": "/pipeline/realm_name/v1/http-file-upload",
  "files": [
    {
      "fileName": "55acdc09-c0fc-4f6a-b3ee-f4199076b0c4",
      "param": "body",
      "contentType": "application/pdf"
    }
  ]
}

Scenario 2: POST with "multipart/form-data" content type of multiple files

Let's say you have multiple files, each larger than 5 MB. You can call a pipeline endpoint configured with the HTTP File Trigger via POST with the "multipart/form-data" content type so that the files become available for the pipeline to process.

To make it possible, all you have to do is follow these steps:

  1. Create a pipeline and configure its trigger as HTTP-File, enabling the Form Data Uploads option.

  2. Deploy the pipeline.

  3. Create an API Key and configure it so that it has access to the pipeline.

  4. Call the pipeline through this curl:

curl -F dgbfile1=@file_name1 -F dgbfile2=@file_name2 https://godigibee.io/pipeline/realm_name/v1/http-file-upload -v -H 'apikey: generated_token'

  • file_name1: refers to a local file

  • file_name2: refers to a local file

  • realm_name: refers to the Realm where the pipeline is located

  • generated_token: refers to the token generated for the recently created API Key
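If you prefer to build the multipart request in code rather than with curl, here is a minimal sketch using only the Python standard library. The field names dgbfile1/dgbfile2 mirror the curl example above; the endpoint, file contents, and the helper itself are illustrative:

```python
import uuid

def build_multipart(files):
    """Build a multipart/form-data body from {param: (filename, bytes)}."""
    boundary = uuid.uuid4().hex
    parts = []
    for param, (filename, data) in files.items():
        # Each part carries a Content-Disposition header naming the form field.
        header = (
            f"--{boundary}\r\n"
            f'Content-Disposition: form-data; name="{param}"; filename="{filename}"\r\n'
            "Content-Type: application/octet-stream\r\n\r\n"
        ).encode()
        parts.append(header + data + b"\r\n")
    body = b"".join(parts) + f"--{boundary}--\r\n".encode()
    content_type = f"multipart/form-data; boundary={boundary}"
    return body, content_type

body, content_type = build_multipart({
    "dgbfile1": ("file1", b"first file bytes"),
    "dgbfile2": ("file2", b"second file bytes"),
})
# body and content_type could then be sent with urllib.request to the
# pipeline endpoint, together with the 'apikey' header.
```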


The pipeline will receive an array files[] containing:

  • fileName

  • originalFileName

  • param

  • charset

  • contentLength

  • contentType

That way, the files can be referenced and accessed from the pipeline:


{
  "body": "",
  "form": {},
  "headers": {
    ...
  },
  "queryAndPath": {},
  "method": "POST",
  "contentType": "multipart/form-data; boundary=------------------------b3c985803b952f2c",
  "path": "/pipeline/realm_name/v1/http-file-upload",
  "files": [
    {
      "fileName": "96f3ecb2-1c72-4980-9f01-6f44cafc719f",
      "originalFileName": "file1",
      "param": "dgbfile1",
      "contentType": "application/octet-stream",
      "charset": "UTF-8",
      "contentLength": 5242880
    },
    {
      "fileName": "58fb844f-a1d1-4788-b9b4-30df4b69165e",
      "originalFileName": "file2",
      "param": "dgbfile2",
      "contentType": "application/octet-stream",
      "charset": "UTF-8",
      "contentLength": 5242880
    }
  ]
}
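As a sketch of how a pipeline step might consume the files[] array, the snippet below indexes the uploads by their form-data parameter name and totals their sizes. The payload mirrors the example above; the parsing code is illustrative, not a platform API:

```python
import json

# Trigger payload, truncated to the "files" part of the example above.
payload = json.loads("""
{
  "files": [
    {"fileName": "96f3ecb2-1c72-4980-9f01-6f44cafc719f", "originalFileName": "file1",
     "param": "dgbfile1", "contentType": "application/octet-stream",
     "charset": "UTF-8", "contentLength": 5242880},
    {"fileName": "58fb844f-a1d1-4788-b9b4-30df4b69165e", "originalFileName": "file2",
     "param": "dgbfile2", "contentType": "application/octet-stream",
     "charset": "UTF-8", "contentLength": 5242880}
  ]
}
""")

# Index the uploads by their form-data parameter name and total their sizes.
by_param = {f["param"]: f for f in payload["files"]}
total_bytes = sum(f["contentLength"] for f in payload["files"])
print(by_param["dgbfile1"]["originalFileName"], total_bytes)  # file1 10485760
```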



Scenario 3: POST with any content type and body with more than 5MB

Let's say you want to send a request body larger than 5 MB. You can call a pipeline endpoint configured with the HTTP File Trigger via POST with any content type so that the body is made available to the pipeline as a file.

All you have to do is:

  1. Create a pipeline and configure its trigger as HTTP-File, enabling the Body Uploads option.

  2. Deploy the pipeline.

  3. Create an API Key and configure it so that it has access to the pipeline.

  4. Call the pipeline through this curl:

curl -d '@file_name' https://godigibee.io/pipeline/realm_name/v1/http-file-upload -v -H 'apikey: generated_token'

  • file_name: refers to a local file

  • realm_name: refers to the Realm where the pipeline is located

  • generated_token: refers to the token generated for the recently created API Key


The pipeline will receive an array files[] containing:

  • fileName

  • param

  • charset

  • contentType

That way, the files can be referenced and accessed from the pipeline:

{
  "body": null,
  "form": {},
  "headers": {
    ...
  },
  "queryAndPath": {},
  "method": "POST",
  "contentType": "application/pdf",
  "path": "/pipeline/realm_name/v1/http-file-upload",
  "files": [
    {
      "fileName": "55acdc09-c0fc-4f6a-b3ee-f4199076b0c4",
      "param": "body",
      "contentType": "application/pdf"
    }
  ]
}


HTTP File Trigger Response

It's simple to define the pipeline response format. Add a Transformer to the end of the pipeline and define the response:

{
  "body": "<xml>Output 1</xml>",
  "code": 200,
  "Content-Type": "application/xml"
}


IMPORTANT: Content-Type must be one of the values defined in Response Content Types.
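The rule above can be sketched as a simple check. The allowed list below is an example configuration (what you would enter in Response Content Types), and the check itself is illustrative, not something the platform exposes:

```python
# Example Response Content Types configuration (assumed values).
allowed_response_types = ["application/json", "application/xml"]

# Response in the shape returned by the Transformer above.
response = {
    "body": "<xml>Output 1</xml>",
    "code": 200,
    "Content-Type": "application/xml",
}

# The response's Content-Type must be one of the configured values.
is_valid = response["Content-Type"] in allowed_response_types
print(is_valid)  # True
```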


See the article that explains how to make sure that the pipeline output will be a file.
