
Bulk upload files to import using API

Written by Pieter Reitsma
Updated over 2 years ago

The Client File import functionality allows users of the KYC Client File to import clients, products and documents in bulk. In some cases this bulk may be so large that it is useful to automate the import. For this automation, users can use the API. This article describes step-by-step how to interact with the API to import data into the Client File.


Prerequisites

  • To perform the actions in the instructions, you require a valid authorization.
    This authorization must be provided in the header of every request to the API.

  • To be able to start the validation step of the task, the permission workflowPartyInputOrchestrationValidationStartedWrite is required. (For more on permissions: User groups and permissions )

One time setup: Get workflow details

Each tenant has a different ID per workflowType. To be able to create a new task for a workflow, you need this ID. In this case the workflow name is partyInputOrchestration.

Endpoint: GET https://api.blanco.cloud/bwfs/workflows

Response:

[
  {
    "id": "{workflowUUID}",
    "name": "Party Input Orchestration",
    "workflowType": "partyInputOrchestration"
  },
  ...
]

From the response we require the value of the "id" field of the workflow with workflowType "partyInputOrchestration"; this is the workflowUUID used in the steps below.
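The lookup above can be sketched as follows. This is a sketch, not part of the official documentation: the UUID value is invented, and AUTH_HEADER is a placeholder for your own authorization header. When no header is supplied, the snippet falls back to a canned sample response with the shape shown above, so the jq filter can be tried offline.

```shell
# Placeholder for your authorization header, e.g. "Authorization: Bearer <token>".
AUTH_HEADER="${AUTH_HEADER:-}"

if [ -n "$AUTH_HEADER" ]; then
  # Live lookup against the endpoint documented above.
  workflows=$(curl -s "https://api.blanco.cloud/bwfs/workflows" -H "$AUTH_HEADER")
else
  # Offline sample (invented UUID) with the documented response shape.
  workflows='[{"id":"0f1e2d3c-0000-1111-2222-333344445555","name":"Party Input Orchestration","workflowType":"partyInputOrchestration"}]'
fi

# Keep only the id of the partyInputOrchestration workflow; -r strips the quotes.
workflowUUID=$(echo "$workflows" | jq -r \
  '.[] | select(.workflowType=="partyInputOrchestration") | .id')
echo "$workflowUUID"
```

Since the workflowUUID is stable per tenant, it is worth caching this value rather than fetching it on every run, as the example script at the end of this article does.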

Step 1: Create an import task

Before we can process any files, we require an import task. This task acts as the container for the import process.

Endpoint: POST https://api.blanco.cloud/bwfs/processes

Body:

{
  "workflowId": "{workflowUUID}",
  "stepId": "{workflowUUID}#1",
  "assignee": null
}

Response:

{
  "workflowId": "{workflowUUID}",
  "status": "Open",
  "processId": "{processUUID}"
}

The field processId will contain the ID required to refer to the newly created task.
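Building the request body for this call can be sketched as below. The workflowUUID value is an invented placeholder; with a valid authorization header the commented-out curl call would then create the task.

```shell
# Invented placeholder; use the workflowUUID retrieved in the one-time setup.
workflowUUID="0f1e2d3c-0000-1111-2222-333344445555"

# Step 1 body: the stepId is the workflowUUID followed by "#1".
body=$(printf '{"workflowId":"%s","stepId":"%s#1","assignee":null}' \
  "$workflowUUID" "$workflowUUID")
echo "$body"

# With a valid $AUTH_HEADER, the task would be created like this (not run here):
# curl -s -X POST "https://api.blanco.cloud/bwfs/processes" \
#   -H "$AUTH_HEADER" -H "content-type: application/json" -d "$body"
```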

Step 2: Add files to import task

Now that we have the import task, we can start uploading files. These files can be either:

  • CSVs with party, product and connection data;

  • CSVs with details on the documents to be uploaded;

  • the documents themselves.

You can repeat the steps below as many times as required to upload all the files necessary for the import.

Step 2.1: Prepare upload endpoints

Endpoint: POST https://api.blanco.cloud/input-orchestrator/uploads/{processUUID}/preUpload

Body example:

{
  "files": [
    {"fileName": "Contract.pdf", "mimeType": "application/pdf"},
    ...
    {"fileName": "docs.csv", "mimeType": "text/csv"}
  ]
}

Response example:

The response is an array of endpoints that are ready to receive a file upload.

[
  {
    "id": "{uploadUUID_1}",
    "signedUploadUrl": "{signedURL_1}"
  },
  ...
  {
    "id": "{uploadUUID_N}",
    "signedUploadUrl": "{signedURL_N}"
  }
]
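Building the preUpload body for a directory of files can be sketched as follows. This is a sketch: the directory and file names are invented, and the CSV special-case mirrors the example script at the end of this article (which otherwise falls back to `file -b --mime-type`).

```shell
# Invented sample directory with two files, matching the body example above.
upload_dir=$(mktemp -d)
: > "$upload_dir/Contract.pdf"
: > "$upload_dir/docs.csv"

# Build one {"fileName":...,"mimeType":...} entry per file.
entries=""
for f in "$upload_dir"/*; do
  name=$(basename "$f")
  case "$name" in
    *.csv) mime="text/csv" ;;
    *.pdf) mime="application/pdf" ;;
    *)     mime="application/octet-stream" ;;  # fallback; `file -b --mime-type` is an alternative
  esac
  entries="$entries{\"fileName\":\"$name\",\"mimeType\":\"$mime\"},"
done

# Strip the trailing comma and wrap in the "files" array.
body="{\"files\":[${entries%,}]}"
echo "$body"

# Each signedUploadUrl in the response then receives its file (Step 2.2), e.g.:
# curl -T "$upload_dir/Contract.pdf" "$signedUploadUrl"   # note: no auth header

rm -r "$upload_dir"
```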

Step 2.2: Upload files

Now that the upload endpoints have been created, it is possible to upload the files.

Request: PUT {signedUploadUrl}

This request uploads the file itself, so make sure to provide the file as the body of the request.

For this request no authorization header should be provided.

Step 3: Start validation

After all files are uploaded, the next step of the import task can begin: the validation step. It is advised to wait at least a second between uploading files and starting validation, to prevent timing issues.

Request: POST https://api.blanco.cloud/bwfs/processes/{processUUID}/states

Body:

{
  "stepId": "{workflowUUID}#2",
  "assignee": null
}
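As a sketch (with invented placeholder UUIDs): the example script at the end of this article builds the stepId from the workflowUUID followed by "#2", while the processUUID only appears in the URL.

```shell
# Invented placeholders; substitute your own values.
workflowUUID="0f1e2d3c-0000-1111-2222-333344445555"
processUUID="9a8b7c6d-0000-1111-2222-333344445555"

# Step 3 body: workflow step 2 is the validation step.
body=$(printf '{"stepId":"%s#2","assignee":null}' "$workflowUUID")
echo "$body"

# With a valid $AUTH_HEADER, validation would be started like this (not run here):
# curl -s -X POST "https://api.blanco.cloud/bwfs/processes/$processUUID/states" \
#   -H "$AUTH_HEADER" -H "content-type: application/json" -d "$body"
```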

Step 4: Continue using the web interface

After validation, the next step is for a user to review the data. This step should be completed using the web application.

The URL for the import task will be:

https://backend.blanco.cloud/administration/workflow-import/{processUUID}/open

Example script

Below is a shell script that uploads one or more files using the KYC import functionality.

Note that the script will only work once the instructions in the "Action Required" block are followed.

#!/bin/bash

################################################
### Action Required ###
################################################
## Dependency
# This script uses the tool "jq", which should be installed on the system:
# https://stedolan.github.io/jq/download/
#
# Set the directory that contains the files you want to upload.
upload_dir=upload_files
# Set these variables to match your environment:
blanco_api_base_url="https://api.blanco.cloud"
blanco_portal_base_url="https://backend.blanco.cloud"
# Create an authentication header file, or overwrite this variable:
auth_header=$(cat auth-header)
################################################

# Start of script
mkdir -p .cache
cachefile_workflowUUID=.cache/workflowUUID
if [ -f "$cachefile_workflowUUID" ]; then
  echo -n "Retrieving workflowUUID from cache..."
  workflowUUID=$(cat "$cachefile_workflowUUID")
  echo "${workflowUUID}...done."
else
  # One time setup: Get workflow details
  echo -n "Retrieving workflowUUID from server..."
  workflows=$(curl -s -X GET "${blanco_api_base_url}/bwfs/workflows" -H "${auth_header}")
  workflowUUID=$(echo "$workflows" | jq -r '.[] | select(.workflowType=="partyInputOrchestration") | .id')
  echo "${workflowUUID}" > "$cachefile_workflowUUID"
  echo "${workflowUUID}...done."
fi

cachefile_processUUID=.cache/processUUID
if [ -f "$cachefile_processUUID" ]; then
  echo -n "Retrieving processId from cache..."
  processUUID=$(cat "$cachefile_processUUID")
  echo "${processUUID}...done."
else
  # Step 1: Create an import task
  echo -n "Creating import task..."
  body="{\"workflowId\":\"${workflowUUID}\", \"stepId\":\"${workflowUUID}#1\", \"assignee\":null}"
  response=$(curl -s -X POST "${blanco_api_base_url}/bwfs/processes" \
    -H "${auth_header}" \
    -H "content-type: application/json" \
    -d "${body}")
  processUUID=$(echo "$response" | jq -r .processId)
  echo "${processUUID}" > "$cachefile_processUUID"
  echo "${processUUID}...done."
fi

# Step 2: Add files to import task
for file in "${upload_dir}"/*; do
  # Step 2.1: Prepare upload endpoints
  fileName=$(basename "$file")
  echo -n "Uploading file ${fileName}..."
  if [ "${file##*.}" == "csv" ]; then
    echo -n "file is csv..."
    mimeType="text/csv"
  else
    mimeType=$(file -b --mime-type "${file}")
  fi
  body="{\"files\":[{\"fileName\":\"${fileName}\",\"mimeType\":\"${mimeType}\"}]}"
  response=$(curl -s -X POST "${blanco_api_base_url}/input-orchestrator/uploads/${processUUID}/preUpload" \
    -H "${auth_header}" \
    -H "content-type: application/json" \
    -d "${body}")
  signedUploadUrl=$(echo "${response}" | jq -r '.[].signedUploadUrl')

  # Step 2.2: Upload files
  curl -T "${file}" "${signedUploadUrl}"
  echo "done."
done

# Wait
echo -n "Waiting for uploads to finalise..."
sleep 2
echo "done."

# Step 3: Start validation
echo -n "Starting validation..."
body="{\"stepId\":\"${workflowUUID}#2\", \"assignee\":null}"
response=$(curl -s -X POST "${blanco_api_base_url}/bwfs/processes/${processUUID}/states" \
  -H "${auth_header}" \
  -H "content-type: application/json" \
  -d "${body}")
echo "done."

# Optional: If you want to create a new process (task) for each upload,
# remove the cached processUUID:
#rm -f "$cachefile_processUUID"

# Step 4: Continue using the web interface
echo "################################################"
echo "### Please continue the process in the web application:"
echo "### ${blanco_portal_base_url}/administration/workflow-import/${processUUID}/open"
echo "################################################"
