
Data Orchestrator REST API

Written by Michael Steckner
Updated over 3 months ago

Quick Summary: This document explains how to use the Netstock Data Orchestrator REST API: authentication, the basic flow between IA customers and the Data Orchestrator, response codes, and the available API endpoints.

Data Orchestrator REST API Overview

Data Orchestrator exposes a REST API that allows customers to push the data IA requires, rather than IA having to pull the data from the customer via an ERP integration.

The REST API is designed to give customers a seamless and secure way to send data and trigger operations over HTTPS. The API follows standard REST principles, using common HTTP methods such as GET, POST, PUT, and DELETE to perform operations on the data.


Authentication

The API employs OAuth 2.0 for authentication, ensuring secure and standardized access control. Authentication is handled through a third-party identity server, which manages user credentials and issues access tokens.

Currently, Netstock issues the credentials required to use the API.

To interact with the API, you must first obtain an access token from the identity server and include it in the Authorization header of your requests.

Key Features

  • Standard HTTP Methods: Use GET to retrieve data, POST to create resources, PUT to update existing resources, DELETE to remove resources, and more.

  • HTTPS Security: All API communication is encrypted over HTTPS to protect data in transit.

  • RESTful Design: Resources are organized in a predictable, hierarchical structure, with clear and consistent endpoints.

  • OAuth 2.0: Securely authenticate and authorize requests using access tokens from a third-party identity server.


Basic Data Flow

IA integrations with customers follow the process described below:

  • Data is either pushed or pulled into the Data Orchestrator

  • Data is stored in a staging database

  • Transform scripts are used to manipulate and mould the data into the format IA requires

  • Data extracted from the staging database via the transform scripts is sent to IA for further analysis and processing

Looking specifically at the REST API integration, the above can be achieved with the following steps (a code sketch follows this list):

  • Authenticate

  • Start/Open up a dataset import

  • Start/Open a model import for each model you wish to import data for

  • Send all model data

  • End/Close all model imports

  • Start dataset import processing
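
For illustration, the steps above could be scripted end to end as follows. This is a minimal sketch, assuming the Python requests library, a placeholder base URL and credentials supplied by Netstock, and a single "locations" model with illustrative records; error handling is omitted.

import requests

BASE = "https://<data-orchestrator-host>"   # placeholder host supplied by Netstock
CUSTOMER = "<customer identifier>"          # placeholder customer identifier

# 1. Authenticate (client_credentials grant, form-encoded)
token = requests.post(f"{BASE}/application/o/token/", data={
    "client_id": "<client_id>", "username": CUSTOMER, "password": "<password>",
    "grant_type": "client_credentials", "scope": "<scope>",
}).json()["access_token"]
headers = {"Authorization": f"Bearer {token}"}

# 2. Start/open a dataset import
dataset_id = requests.post(f"{BASE}/dataset/start", headers=headers,
                           json={"customer_identifier": CUSTOMER}).json()["data"]["dataset_import_id"]
common = {"customer_identifier": CUSTOMER, "dataset_import_id": dataset_id}

# 3. Start/open a model import for each model (locations shown here)
requests.post(f"{BASE}/model/start", headers=headers, json={**common, "model": "locations"})

# 4. Send all model data
locations = [{"location_code": "WH1", "description": "Main warehouse",
              "active": True, "group": "DEFAULT", "type": "WH"}]  # illustrative records
requests.post(f"{BASE}/import/locations", headers=headers, json={**common, "data": locations})

# 5. End/close the model import, stating how many records were submitted
requests.post(f"{BASE}/model/end", headers=headers,
              json={**common, "model": "locations", "total_lines_imported": len(locations)})

# 6. Start dataset import processing
requests.post(f"{BASE}/dataset/start_processing", headers=headers, json=common)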


Endpoints

An interactive version of all endpoints can be found at:


Response Overview

All responses have the following structure:

{
"api_call_status": boolean,
"message": string,
"data": {}
}

  • api_call_status: Represents the success state of the API call

  • message: Verbose description of the response

  • data: Additional data returned by the API, depending on the endpoint

The following are possible error responses for all API calls:

Code: 400 (Bad request)

Response:

{
"api_call_status": false,
"message": "An error occurred: Invalid payload detected ...",
"data": {}
}

Description: Validation failure, usually on the request body.

Code: 401 (Unauthorized)

Response:

{
"api_call_status": false,
"message": "Authorization error message",
"data": {}
}

Description: Problem with authorization:

  • No Authorization header

  • No Bearer token

  • Invalid token

Code: 403 (Forbidden)

Response:

{
"api_call_status": false,
"message": "Insufficient permissions",
"data": {}
}

Description: Issue with permissions. Contact your IA representative to rectify this.

Authentication

Description: Authenticates the user and provides the user with a JWT access token.

URL: /application/o/token/

Method: POST

Request:

Content-Type: application/x-www-form-urlencoded

Parameters:

client_id: provider unique identifier (provided by IA)

username: customer unique identifier - matching customer identifier in IA

password: customer unique and secure password

grant_type: client_credentials

scope: identity server scope (provided by IA)

Response:

Content-Type: application/json

Response:

{
"access_token": ”abc",
"token_type": "Bearer",
"scope": "test_scope",
"expires_in": 3600,
"id_token": "abc"
}

⚠️ The access_token needs to be added as a Bearer token in the Authorization header of every subsequent request.
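
As an illustration, the token request could look like this in Python. This is a minimal sketch assuming the requests library; the base URL and helper name are placeholders (the actual host is provided by Netstock).

import requests

BASE = "https://<data-orchestrator-host>"  # placeholder host supplied by Netstock

def get_access_token(client_id, username, password, scope):
    # Credentials are sent form-encoded (application/x-www-form-urlencoded).
    response = requests.post(f"{BASE}/application/o/token/", data={
        "client_id": client_id,
        "username": username,
        "password": password,
        "grant_type": "client_credentials",
        "scope": scope,
    })
    response.raise_for_status()
    return response.json()["access_token"]

# Every subsequent request must carry the token in the Authorization header, e.g.
# headers = {"Authorization": f"Bearer {token}"}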

Postman example:


Dataset Import

Start Dataset Import

Description: Starts/opens a new dataset import. There can only be one open dataset import per user at a time. Dataset imports must be closed/completed before attempting to start a new one. Each dataset import has a unique identifier that needs to be used in subsequent calls.

URL: /dataset/start

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": "{{unique customer identifier}}"
}

Response:

Code: 200

Content-Type: application/json

Response:

{
"api_call_status": true,
"message": "Dataset import started successfully",
"data": {
"dataset_import_id": "4444a44a-44a4-4444-4444-44a4a44aaa44"
}
}

⚠️ dataset_import_id needs to be included in subsequent calls to facilitate correct data allocation to a dataset import.
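
A minimal Python sketch of this call, assuming the requests library; the base URL and helper name are placeholders.

import requests

def start_dataset_import(base_url, token, customer_identifier):
    # Opens a dataset import and returns its unique identifier for use in subsequent calls.
    response = requests.post(
        f"{base_url}/dataset/start",
        headers={"Authorization": f"Bearer {token}"},
        json={"customer_identifier": customer_identifier},
    )
    response.raise_for_status()
    return response.json()["data"]["dataset_import_id"]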

Postman example:


Start Dataset Processing

Description: This call can be triggered as soon as all data has been sent to the Data Orchestrator, regardless of whether the data has completed importing or not. However, all models must have been explicitly ended via the /model/end call. Depending on the state of the dataset import, this call can have two possible flows:

  1. If the dataset import was previously ended/closed, then data reprocessing is triggered (the transform scripts run) provided all model data has been successfully imported.

  2. If the dataset import has not previously been ended/closed, then this marks the dataset import as “ready to process”, waits for all model data to be imported, and then runs the transform scripts if all data was successfully imported.

URL: /dataset/start_processing

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": "{{unique customer identifier}}",
"dataset_import_id": "{{unique dataset import identifier}}"
}

Response:

Code: 202

Content-Type: application/json

Response:

{
"api_call_status": true,
"message": "Dataset import completed and dataset processing started for 4444a44a-44a4-4444-4444-44a4a44aaa44",
"data": {}
}
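
A minimal Python sketch of this call, assuming the requests library; the base URL and helper name are placeholders, and a 202 response indicates the request was accepted.

import requests

def start_dataset_processing(base_url, token, customer_identifier, dataset_import_id):
    # Marks the dataset import as ready to process (or re-triggers the transforms
    # if the import was already ended/closed).
    response = requests.post(
        f"{base_url}/dataset/start_processing",
        headers={"Authorization": f"Bearer {token}"},
        json={
            "customer_identifier": customer_identifier,
            "dataset_import_id": dataset_import_id,
        },
    )
    response.raise_for_status()  # expects HTTP 202 on success
    return response.json()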

Postman example:


Dataset Status

Description: This call displays the status of the specified dataset import

URL: /dataset/status

Method: GET

Request:

Authorization: Bearer access_token

Content-Type: application/json

Parameters:

  "customer_identifier": "{{unique customer identifier}}",
"dataset_import_id": "{{unique dataset import identifier}}"

Response:

Code: 200

Content-Type: application/json

Response:

{
"api_call_status": boolean,
"message": "",
"data": {
"status": string,
"success": boolean,
"message": string
}
}

{
"api_call_status": boolean,
"model": [{
"model": string,
"started_at":string,
"ended_at": string,
"success": null,
"minimum_expected_lines": integer,
"actual_lines_imported": integer,
"marked_to_end": boolean
}]
}

Response data explained:

  • status: Represents the status of the dataset import. Possible statuses include:

    • Complete: Dataset Import has completed. All data has been imported and processed. The success flag describes whether the dataset import completed successfully or not

    • Copying_po_db: Dataset Import has completed successfully. All data has been imported and processed. A copy of the database is being made to support pushing files back to the provider

    • Transforming: All data has been received and stored. Processing in progress...

    • Importing_complete: All data has been received and stored. Processing not yet started. Trigger processing by calling the dataset/start_processing endpoint

    • Importing: Busy importing data. Check model/status endpoint to see which models are started/processing/completed.

    • Dataset_import_started: Ready to start importing data. Trigger model import by calling model/start endpoint.

  • success: Describes whether the operations above completed successfully or not

  • message: If the success flag is false, then message will contain all error logs indicating the possible cause of failure

  • model: List of all models that are currently started/open, as well as their state. See more details on the model fields under /model/status
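
For example, an integration could poll this endpoint until processing finishes. The sketch below assumes the requests library, that the parameters are passed as query parameters, and an arbitrary polling interval; the base URL and helper name are placeholders.

import time
import requests

def wait_for_dataset(base_url, token, customer_identifier, dataset_import_id, poll_seconds=60):
    # Polls /dataset/status until the import reaches the "Complete" status.
    headers = {"Authorization": f"Bearer {token}"}
    params = {"customer_identifier": customer_identifier,
              "dataset_import_id": dataset_import_id}
    while True:
        response = requests.get(f"{base_url}/dataset/status", headers=headers, params=params)
        response.raise_for_status()
        data = response.json()["data"]
        if data["status"].lower() == "complete":
            return data  # inspect data["success"] and data["message"] for the outcome
        time.sleep(poll_seconds)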

Postman example:


Dataset Status Legend

Description: This call displays a list of all possible dataset import statuses along with a verbose description of each

URL: /dataset/status_legend

Method: GET

Request:

Authorization: Bearer access_token

Content-Type: application/json

Response:

Code: 200

Content-Type: application/json

Response:

{
"api_call_status": boolean,
"message": string,
"data": {
"legend": [
{
"status": string,
"description": string
}
]
}
}

Postman Example:


Dataset Full Audit Log

Description: This call displays a full audit trail of operations and logs generated so far for the specified dataset import

URL: /dataset/full_audit_log

Method: GET

Request:

Authorization: Bearer access_token

Content-Type: application/json

Parameters:

  "customer_identifier": "{{unique customer identifier}}",
"dataset_import_id": "{{unique dataset import identifier}}"

Response:

Code: 200

Content-Type: application/json

Response:

{
"api_call_status": boolean,
"message": string,
"data": {
"logs": [
{
"timestamp": string,
"message": string,
"log_type": string
}
]
}
}

Postman Example:


Models

Model Start

Description: Starts/opens a new model for the dataset import. Only one model of a given type can be open at a time; however, multiple different model types can be open at the same time for data importing. A model import must be closed/completed before attempting to start a new one of the same type.

URL: /model/start

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"model": string
}

Request explained:

  • customer_identifier: unique customer identifier

  • dataset_import_id: unique dataset import identifier

  • model: the name of the model you wish to start/open. Possible models include:

    • locations

    • stocks

    • suppliers

    • master

    • groups

    • sales

    • po

    • co

    • transfers

    • pohist

    • bom

    • suppersessions

    • ut_abc: custom models need to start with ut_

Response:

Code: 200

Content-Type: application/json

Response:

{
"api_call_status": boolean,
"message": string,
"data": {}
}

Note: dataset_import_id needs to be included in subsequent calls to facilitate correct data allocation to a dataset import.
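
A minimal Python sketch of opening a model import, assuming the requests library; the base URL and helper name are placeholders.

import requests

def start_model_import(base_url, token, customer_identifier, dataset_import_id, model):
    # Opens a model import (e.g. "locations") under the given dataset import.
    response = requests.post(
        f"{base_url}/model/start",
        headers={"Authorization": f"Bearer {token}"},
        json={
            "customer_identifier": customer_identifier,
            "dataset_import_id": dataset_import_id,
            "model": model,
        },
    )
    response.raise_for_status()
    return response.json()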

Postman example:


Model Status

Description: This call displays the status of all model imports for the specified dataset import

URL: /model/status

Method: GET

Request:

Authorization: Bearer access_token

Content-Type: application/json

Parameters:

  "customer_identifier": "{{unique customer identifier}}",
"dataset_import_id": "{{unique dataset import identifier}}"

Response:

Code: 200

Content-Type: application/json

Response:

{
"api_call_status": boolean,
"message": "",
"data": {
"dataset_import_id": "e0bc8ded-2a8f-48c7-9dfd-c0d9ac3914ef",
"models": [
{
"model": string,
"started_at": date,
"ended_at": date,
"success": boolean,
"minimum_expected_lines": integer,
"actual_lines_imported": integer,
"marked_to_end": boolean
}
]
}
}

Response models explained:

  • models: list of all models that are currently started/open as well as their state

    • model: Model name

    • started_at: Date and time when model import was started/opened

    • ended_at: Date and time when model import was ended/closed. null if still open.

    • success: If all data has been imported, this flag indicates whether it imported successfully or not. null if still busy

    • minimum_expected_lines: This is set when a model import is ended/closed. It indicates the number of records that were sent to be imported

    • actual_lines_imported: This is the actual number of records imported to this point.

    • marked_to_end: This indicates whether the model import has been marked to end/close. Once this is set, no more data will be accepted. All data that has already been submitted will continue to import until done.

Postman example:


Model End

Description: This call can be triggered as soon as all data has been submitted for the specific model, regardless of whether the data has completed importing or not. The total number of records submitted needs to be sent in the payload. A successful call will set the marked_to_end flag to true and will monitor any data that is still busy importing. There is a timeout period within which the data needs to complete importing. If the expected record count does not match the actual record count after the timeout period, the model and dataset import will be marked as failed.

URL: /model/end

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"model": string,
"total_lines_imported": integer
}

Response:

Code: 202

Content-Type: application/json

Response:

{
"api_call_status": true,
"message": "Model import locations ended.",
"data": {}
}
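
A minimal Python sketch of closing a model import, assuming the requests library; the base URL and helper name are placeholders, and total_lines must match the number of records submitted via the corresponding /import endpoint.

import requests

def end_model_import(base_url, token, customer_identifier, dataset_import_id, model, total_lines):
    # Marks the model import to end; no further data will be accepted for this model.
    response = requests.post(
        f"{base_url}/model/end",
        headers={"Authorization": f"Bearer {token}"},
        json={
            "customer_identifier": customer_identifier,
            "dataset_import_id": dataset_import_id,
            "model": model,
            "total_lines_imported": total_lines,
        },
    )
    response.raise_for_status()  # expects HTTP 202 on success
    return response.json()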

Postman example:


Import

Below is a list of endpoints for possible models to import the actual data. All payloads per model need to conform to the standards defined in the Data Interface Requirements.
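
Model data can be sent in more than one call to an import endpoint while the model import is open (the model status tracks actual_lines_imported against the total declared in /model/end). The helper below is a sketch assuming the requests library; the batch size is an illustrative choice rather than a documented limit, and import_path should be the specific /import/... path documented for the model (it does not always match the model name used in /model/start).

import requests

def send_in_batches(base_url, token, customer_identifier, dataset_import_id,
                    import_path, records, batch_size=1000):
    # Posts records to the given /import/... endpoint in batches and returns the
    # total count, which can then be passed to /model/end as total_lines_imported.
    headers = {"Authorization": f"Bearer {token}"}
    total = 0
    for start in range(0, len(records), batch_size):
        batch = records[start:start + batch_size]
        response = requests.post(
            f"{base_url}{import_path}",   # e.g. "/import/locations"
            headers=headers,
            json={
                "customer_identifier": customer_identifier,
                "dataset_import_id": dataset_import_id,
                "data": batch,
            },
        )
        response.raise_for_status()
        total += len(batch)
    return total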

Locations

Description: This call facilitates the data import for locations

URL: /import/locations

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"data": [{
"location_code": string,
"description": string,
"active": boolean,
"group": string,
"type": string
}]
}


Suppliers

Description: This call facilitates the data import for suppliers

URL: /import/suppliers

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"data": [
{
"supplier_code": string,
"description": string,
"type": string,
"lead_time": integer
}
]
}


Master

Description: This call facilitates the data import for master items

URL: /import/master

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"data": [
{
"item_code": string,
"description": string,
"unique_identifier": string,
"uom": string,
"unit_volume": number,
"unit_weight": number,
"superseded_item_code": string,
"superseded_item_factor": string
}
]
}


Stock By Location

Description: This call facilitates the data import for stock by locations

URL: /import/stock_by_location

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"data": [
{
"item_code": string,
"location": string,
"inventory_unit_cost": number,
"purchase_unit_cost": number,
"stock_on_hand": integer,
"vendor_code": string,
"date_added": date,
"selling_price": number,
"allocated_stock": integer,
"supply_type": string,
"supply_code": string,
"purchase_uom": string,
"purchase_factor": number,
"purchase_currency_code": string,
"lead_time": integer",
"stocking_indicator": string,
"minimum_stock": integer",
"minimum_order_quantity": integer,
"order_multiple": integer",
"group_1": string
}
]
}


Groups

Description: This call facilitates the data import for groups

URL: /import/groups

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"data": [
{
"identifier": integer,
"value": string,
"description": string
}
]
}


Sales

Description: This call facilitates the data import for sales

URL: /import/sales

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"data": [
{
"item_code": string,
"location": string,
"order_date": date,
"invoice_date": date,
"sales_quantity": number,
"customer_code": string,
"cost_of_sales": number,
"sales_value": number,
"issues_quantity": integer,
"transaction_details": string
}
]
}


Sales Legacy

Description: This call facilitates the data import for sales

URL: /import/sales_legacy

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"data": [
{
"item_code": string,
"location": string,
"period": date,
"sales_quantity": integer,
"cost_of_sales": number,
"sales_value": number,
"issues_quantity": integer
}
]
}


Purchase Orders

Description: This call facilitates the data import for purchase orders

URL: /import/po

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"data": [
{
"item_code": string,
"location": string,
"supplier_code": string,
"order_number": string,
"line_number": string,
"order_date": date,
"order_quantity": integer,
"expected_arrival_date": date,
"outstanding_quantity": integer,
"order_type": string,
"purchase_unit_cost": number
}
]
}


Customer Orders

Description: This call facilitates the data import for customer orders

URL: /import/co

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"data": [
{
"item_code": string,
"location": string,
"customer_code": string,
"order_number": string,
"line_number": string,
"order_date": date,
"order_quantity": integer,
"requested_date": date,
"outstanding_quantity": integer
}
]
}


Transfers

Description: This call facilitates the data import for transfers

URL: /import/transfers

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"data": [
{
"item_code": string,
"location": string,
"customer_code": string,
"order_number": string,
"line_number": string,
"order_date": date,
"order_quantity": integer,
"requested_date": date,
"outstanding_quantity": integer
}
]
}


Purchase Order History

Description: This call facilitates the data import for purchase order history

URL: /import/po_hist

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"data": [
{
"item_code": string,
"location": string,
"supplier_code": string,
"order_number": string,
"line_number": string,
"order_date": date,
"order_quantity": integer,
"quantity_received": integer,
"date_of_receipt": date,
"expected_arrival_date": date,
"order_urgency": string,
"purchase_unit_cost": number
}
]
}


Bill of Materials

Description: This call facilitates the data import for bills of materials (BOMs)

URL: /import/bom

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:


{
"customer_identifier": "string",
"dataset_import_id": "string",
"data": [
{
"finished_good_item": string,
"finished_good_location": string,
"raw_material_item": string,
"raw_material_location": string,
"ratio": number
}
]
}


Supersessions

Description: This call facilitates the data import for supersessions

URL: /import/suppressions

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"data": [
{
"old_product_code": string,
"new_product_code": string,
"factor": number
}
]
}


Custom Data Models

Description: This call facilitates the data import for custom models

URL: /import/ut_abc

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string,
"dataset_import_id": string,
"file_name": string,
"data": [
{
"type": object
}
]
}


Files

Below is a list of operations available for Purchase orders, Work orders and Transfers

Pop Purchase Order

Description: This call will “pop” and display the oldest available purchase order.

URL: /file/pop_po

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string
}

Response:

Code: 202

Content-Type: application/json

Response:

{
"api_call_status": true,
"message": "Model import locations ended.",
"data": {
"file_name": "po_12.csv",
"lines": [
{
"order_number": "12",
"supplier": "OFFICEMAX",
"location_code": "WHOLESALE",
"order_date": "2025/11/03",
"username": "System Administrator",
"user_email_address": "admin@netstock.co",
"item_code": "CONPAPERST",
"item_description": "Letter Paper Ream Standard",
"uom": "EA",
"quantity": "99",
"unit_cost": "3.0",
"created_date": "2025/11/03",
"expected_receipt_date": "2025/11/03",
"minimum_order_quantity": "1.0",
"comments": null,
"export_location_code": "SalesDemo01",
"sales_location_code": "PRODWHOLE"
}
]
}
}
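
A minimal Python sketch of popping a purchase order, assuming the requests library; the base URL and helper name are placeholders.

import requests

def pop_purchase_order(base_url, token, customer_identifier):
    # Pops the oldest available purchase order; the response carries the file name
    # and its lines under "data".
    response = requests.post(
        f"{base_url}/file/pop_po",
        headers={"Authorization": f"Bearer {token}"},
        json={"customer_identifier": customer_identifier},
    )
    response.raise_for_status()
    return response.json()["data"]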

Postman example:


Pop Transfer Order

Description: This call will “pop” and display the oldest available transfer order.

URL: /file/pop_tr

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string
}

Response:

Code: 202

Content-Type: application/json

Response:

{
"api_call_status": true,
"message": "Model import locations ended.",
"data": {
"file_name": "tr_12.csv",
"lines": [
{
"order_number": "O123",
"transfer_from_code": "Supplier Name",
"transfer_to_code": "LOC 123",
"order_date": "2025/11/03",
"username": "Username",
"user_email_address": "email@address.com",
"item_code": "ITEM123",
"item_description": "Item Description",
"uom": "EACH",
"quantity": 5.0,
"unit_cost": 10.0,
"created_date": "2025/11/03",
"receipt_date": "2025/11/03",
"minimum_order_quantity": 2.0,
"comments": "Comments ...",
"export_location_code": "LOC234",
"sales_location_code": "LOC345"
}
]
}
}


Pop Work Order

Description: This call will “pop” and display the oldest available work order.

URL: /file/pop_wo

Method: POST

Request:

Authorization: Bearer access_token

Content-Type: application/json

Body:

{
"customer_identifier": string
}

Response:

Code: 202

Content-Type: application/json

{
"api_call_status": true,
"message": "Model import locations ended.",
"data": {
"file_name": "wo_12.csv",
"lines": [
{
"order_number": "O123",
"supplier": "Supplier Name",
"location_code": "LOC 123",
"order_date": "2025/11/03",
"username": "Username",
"user_email_address": "email@address.com",
"item_code": "ITEM123",
"item_description": "Item Description",
"uom": "EACH",
"quantity": 5.0,
"unit_cost": 10.0,
"created_date": "2025/11/03",
"due_date": "2025/11/03",
"minimum_quantity": 2.0,
"comments": "Comments ...",
"export_location_code": "LOC234",
"sales_location_code": "LOC345"
}
]
}
}
