S3 Storage

Know the component and how to use it.

Written by Erick Rubiales

IMPORTANT: This documentation has been discontinued. Read the updated S3 Storage documentation on our new documentation portal.

S3 Storage connects to AWS S3 and performs the following operations on the storage: list, download, upload, delete or move.

Take a look at the configuration parameters of the component:

  • Account: account to be used by the component (mandatory). The account type must be BASIC, and it must specify the client ID and the secret key provided by the AWS console.

  • Operation: operation to be executed (list, download, upload, delete or move).

  • Region: region where the S3 is located.

  • Bucket Name: name of the Bucket S3.

  • Bucket Name - Move: for the MOVE operation only. Name of the bucket from which the file will be moved.

  • File Name: name of the local file to be downloaded or uploaded (it doesn't apply to the delete operation).

  • Remote File Name: name of the remote file in S3 to go through download, upload, list or delete.

  • Remote File Name - Move: for the MOVE operation only. New name of the remote file after being moved.

  • Remote Directory: remote directory in S3 to go through download, upload or delete.

  • Remote Directory - Move: for the MOVE operation only. Name of the remote directory whose file will be moved.

  • Generate Download Link: when selected, the option generates a public link for the file download.

  • Expiration Timestamp (in ms): link expiration time, in milliseconds. In this field you must provide the current timestamp plus the desired validity period. E.g.: CURRENT TIMESTAMP + 600000 (600000 ms = 10 minutes). If not informed, a default of 15 minutes after the current timestamp is assumed.

  • Fail On Error: if the option is enabled, the pipeline execution is interrupted in case of error; otherwise, the execution proceeds, but the result shows “success” as false.
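The Expiration Timestamp arithmetic above can be sketched in plain Python (illustrative only; the function name and default are this sketch's own, not part of the component):

```python
import time

def expiration_timestamp(ttl_ms: int = 600_000) -> int:
    """Return the current epoch time in milliseconds plus a validity period.

    600_000 ms = 10 minutes, matching the example in the text.
    """
    current_ms = int(time.time() * 1000)
    return current_ms + ttl_ms

# Value to provide in the Expiration Timestamp field for a 10-minute link:
link_expiry = expiration_timestamp()
```

If the field is left empty, the component assumes the equivalent of `expiration_timestamp(15 * 60 * 1000)`.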

IMPORTANT: files are manipulated inside a pipeline in a protected way. All files are accessed through a single temporary directory, in which each pipeline key gives access only to its own set of files.

Messages flow

Input

An input message is only necessary if the component has a field configured with Double Braces expressions. Otherwise, the component doesn't expect any specific input message; all you have to do is configure the fields shown for each selected operation.

Output

LIST operation scenario

{
  "success": true,
  "content": [
    {
      "bucketName": "digibee-amazon-s3-connector-test",
      "key": "list/test.csv",
      "size": 9,
      "lastModified": 1596139663000,
      "storageClass": "STANDARD",
      "owner": null,
      "etag": "59587d0fd956dee6905d423bfda2acaf"
    }
  ],
  "count": 1,
  "nextToken": "1kWwy…....."
}

  • success: if the call is successful, the result will be “true”; otherwise, it will be “false”

  • content: array containing file information

- bucketName: name of the bucket

- key: name of the directory + name of the file

- size: size of the file

- lastModified: date of the last file change

- storageClass: type of storage configured in S3

- owner: name of the file owner

- etag: entity tag, a hash generated by S3 for the file

  • count: number of returned objects

  • nextToken: when there are more objects to be listed, this property is returned so the remaining items can be paginated
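A later pipeline step would typically collect the keys from one LIST page and check nextToken to decide whether another LIST call is needed. A minimal sketch, using the sample output above (the truncated nextToken is replaced by a placeholder, since the real token is an opaque string):

```python
import json

# LIST output sample from the text, with a placeholder continuation token.
page = json.loads("""
{
  "success": true,
  "content": [
    {
      "bucketName": "digibee-amazon-s3-connector-test",
      "key": "list/test.csv",
      "size": 9,
      "lastModified": 1596139663000,
      "storageClass": "STANDARD",
      "owner": null,
      "etag": "59587d0fd956dee6905d423bfda2acaf"
    }
  ],
  "count": 1,
  "nextToken": "opaque-continuation-token"
}
""")

keys = [obj["key"] for obj in page["content"]]   # files on this page
has_more = page.get("nextToken") is not None     # another LIST call is needed
```

When has_more is true, the token would be fed into the next LIST call until no nextToken is returned.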

DOWNLOAD operation scenario

{
  "success": true,
  "fileName": "test.file",
  "remoteDirectory": "pagination_folder/",
  "remoteFileName": "c4b88b6b-83bb-42b0-9de6-0371389db585.csv",
  "bucketName": "digibee-amazon-s3-connector-test"
}

  • success: if the call is successful, the result will be “true”; otherwise, it will be “false”

  • fileName: name of the file downloaded into the pipeline directory

  • remoteDirectory: name of the S3 remote directory

  • remoteFileName: name of the remote file downloaded from S3

  • bucketName: name of the S3 bucket

UPLOAD operation scenario

{
  "success": true,
  "fileName": "test.file",
  "remoteDirectory": "pagination_folder/",
  "remoteFileName": "test.file",
  "urlGenerated": "https://digibee-amazon-s3-connector-test.s3.sa-east-1.amazonaws.com/pagination_folder/test.file?....",
  "bucketName": "digibee-amazon-s3-connector-test"
}

  • success: if the call is successful, the result will be “true”; otherwise, it will be “false”

  • fileName: name of the local file uploaded from the pipeline directory

  • remoteDirectory: name of the S3 remote directory

  • remoteFileName: name of the remote file uploaded to S3

  • bucketName: name of the S3 bucket

  • urlGenerated: download link of the file if the Generate Download Link option is enabled

MOVE operation scenario

{
  "success": true,
  "remoteDirectory": "pagination_folder/",
  "remoteFileName": "c4b88b6b-83bb-42b0-9de6-0371389db585.csv",
  "remoteFileNameMove": "abc.file",
  "remoteDirectoryMove": "list/",
  "bucketName": "digibee-amazon-s3-connector-test",
  "bucketNameMove": "digibee-amazon-s3-connector-test"
}

  • success: if the call is successful, the result will be “true”; otherwise, it will be “false”

  • remoteDirectory: name of the S3 remote directory

  • remoteFileName: original name of the remote file in S3 before the move

  • bucketName: name of the S3 bucket

  • bucketNameMove: name of the bucket of the moved file

  • remoteDirectoryMove: name of the remote directory of the moved file

  • remoteFileNameMove: new name of the remote file to be moved

DELETE operation scenario

{
  "success": true,
  "remoteDirectory": "list/",
  "remoteFileName": "abc.file",
  "bucketName": "digibee-amazon-s3-connector-test"
}

  • success: if the call is successful, the result will be “true”; otherwise, it will be “false”

  • remoteDirectory: name of the S3 remote directory

  • remoteFileName: name of the remote file deleted from S3

Output with error

{
  "success": false,
  "message": "Could no issue the operation: download in the AWS S3 Storage",
  "error": "com.amazonaws.services.s3.model.AmazonS3Exception: The specified key does not exist. (Service: Amazon S3; Status Code: 404; Error Code: NoSuchKey; Request ID: A21B8733BB9771DE; S3 Extended Request ID: 1zAtWB8gOvJKGKUBXdkWj7er8K6Ik6wUgdIUO1w41TsNo0b51B3MXrT4F4lADL+xI0Ojvf0e6z4=), S3 Extended Request ID: 1zAtWB8gOvJKGKUBXdkWj7er8K6Ik6wUgdIUO1w41TsNo0b51B3MXrT4F4lADL+xI0Ojvf0e6z4="
}

  • success: “false”, because there was an error in the execution

  • message: error message of the component

  • error: error message received from the S3 server
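With Fail On Error disabled, the pipeline keeps running after a failure, so a later step has to inspect "success" itself. A minimal sketch of that check (the function name and exception type are this sketch's own choice; the field names follow the sample outputs above):

```python
def check_result(output: dict) -> dict:
    """Raise if the S3 Storage step reported a failure.

    "success" and "message" follow the component's sample outputs;
    RuntimeError is an arbitrary choice for this illustration.
    """
    if not output.get("success", False):
        raise RuntimeError(output.get("message", "unknown S3 Storage error"))
    return output
```

A successful output passes through unchanged; a failed one stops this step with the component's error message.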
