
Send data to Databox via API

Learn how to send custom data to Databox via the Databox API to build metrics and dashboards from any data source.


Availability

Users, Editors, and Admins

All accounts

One or more features exclusive to select subscription plans



The tech stack of modern companies is increasingly complex, with data spread across countless tools, platforms, and internal systems. While Databox offers hundreds of native integrations, it’s not always possible—or practical—to cover every custom data source or proprietary system. Some organizations also prefer tighter control over how data flows into their reporting environment. The Databox API provides a flexible solution, allowing teams to programmatically send data from any source directly into Databox. As a result, all business-critical metrics can be consolidated in one place, enabling analysts, department leaders, and executives to monitor performance, uncover insights, and make informed decisions from a single, unified source of truth.

Send data to Databox via API

The Databox API (v1) is dataset-based, meaning you create containers for data and then ingest records into them.

Before you begin

Before authenticating your requests, you must first create an API key in Databox. This key is used to securely authorize all API calls and associate them with your account.

Follow the steps outlined on the Authentication page in the Databox API documentation to generate your key.
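
If you plan to script these calls rather than run them with curl, it helps to set the API key header once. Below is a minimal Python sketch using the requests library (the library choice is an assumption of this example, not a Databox requirement); the key value is a placeholder.

import requests

# Every Databox API call is authorized with the x-api-key header
API_KEY = "YOUR_API_KEY_HERE"  # the key generated in Databox

# Reuse one session so the header is attached to every request
session = requests.Session()
session.headers.update({"x-api-key": API_KEY})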

List available accounts

After generating your API key, the next step is to list the Databox accounts you have access to. You’ll need the account ID to create a new data source in the next step.

To retrieve your accounts, send a GET request to the /v1/accounts endpoint:

curl -i -X GET \
  https://api.databox.com/v1/accounts \
  -H 'x-api-key: YOUR_API_KEY_HERE'

A successful response returns a list of accounts associated with your API key:

{
  "requestId": "b0eac937-c25c-47a5-bb7e-552f6b860458",
  "status": "success",
  "accounts": [
    {
      "id": 123456,
      "name": "Acme Corp",
      "accountType": "organization"
    }
  ]
}

Use the appropriate account ID from this list when creating a data source to ensure the dataset is added to the correct Databox account.
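
The same lookup in Python, sketched with the requests library (the library choice and the printed fields are illustrative; the endpoint, header, and response shape are the ones shown above):

import requests

# List the accounts your API key can access and print their IDs
resp = requests.get(
    "https://api.databox.com/v1/accounts",
    headers={"x-api-key": "YOUR_API_KEY_HERE"},
)
resp.raise_for_status()
for account in resp.json()["accounts"]:
    print(account["id"], account["name"])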

Create a data source

Once you have your account ID, the next step is to create a data source. A data source serves as a logical container for the datasets you’ll send to Databox. You can think of it as the equivalent of an integration or connection within your Databox account.

Send a POST request to the /v1/data-sources endpoint:

curl -i -X POST \
  https://api.databox.com/v1/data-sources \
  -H 'Content-Type: application/json' \
  -H 'x-api-key: YOUR_API_KEY_HERE' \
  -d '{
    "accountId": 123456,
    "title": "ERP System",
    "timezone": "UTC"
  }'

A successful request returns the ID of your new data source:

{
  "requestId": "b0eac937-c25c-47a5-bb7e-552f6b860458",
  "status": "processing",
  "id": 4754489,
  "title": "ERP System",
  "created": "2025-10-01T12:00:00.000000Z",
  "timezone": "UTC",
  "key": "ingestion"
}

You’ll need the data source ID to create a dataset in the next step.

Note: To view all supported time zones, send a GET request to /v1/timezones. Use one of these values in the timezone field when creating your data source.
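
The following Python sketch (using the requests library) lists the supported time zones and then creates the data source with the example values from this article:

import requests

headers = {"x-api-key": "YOUR_API_KEY_HERE"}

# Optional: check which time zone identifiers are supported
timezones = requests.get("https://api.databox.com/v1/timezones", headers=headers)
timezones.raise_for_status()

# Create the data source in the account returned by /v1/accounts
resp = requests.post(
    "https://api.databox.com/v1/data-sources",
    headers=headers,
    json={"accountId": 123456, "title": "ERP System", "timezone": "UTC"},
)
resp.raise_for_status()
data_source_id = resp.json()["id"]  # keep this for the dataset step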

Create a dataset

Each dataset represents a table of data within a data source. When you create a dataset, define its structure, including the primary keys that are used to identify and update unique records.

Send a POST request to the /v1/datasets endpoint:

curl -i -X POST \
  https://api.databox.com/v1/datasets \
  -H 'Content-Type: application/json' \
  -H 'x-api-key: YOUR_API_KEY_HERE' \
  -d '{
    "title": "Transactions",
    "dataSourceId": 4754489,
    "primaryKeys": [
      "transactionId"
    ]
  }'

A successful response returns the dataset’s unique identifier:

{
  "requestId": "b0eac937-c25c-47a5-bb7e-552f6b860458",
  "status": "success",
  "id": "4e1219d8-7fa8-44b7-96c6-a8f2a9cfb0bf",
  "title": "Transactions",
  "created": "2025-10-01T12:00:00.000000Z"
}

You’ll use the dataset ID to send data to this dataset.
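
For reference, the same dataset creation step in Python (a sketch with the requests library; the data source ID is the example value returned above):

import requests

# Create a "Transactions" dataset under the data source created earlier
resp = requests.post(
    "https://api.databox.com/v1/datasets",
    headers={"x-api-key": "YOUR_API_KEY_HERE"},
    json={
        "title": "Transactions",
        "dataSourceId": 4754489,
        "primaryKeys": ["transactionId"],
    },
)
resp.raise_for_status()
dataset_id = resp.json()["id"]  # used in the ingestion endpoint path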

Tip: You can create multiple datasets under a single data source if you need to organize data by entity or domain (e.g., Customers, Orders, Campaigns).

Send data to a dataset

Once the dataset is created, you can begin sending data to it.

Send a POST request to the /v1/datasets/{datasetId}/data endpoint to ingest records:

curl -i -X POST \
  https://api.databox.com/v1/datasets/4e1219d8-7fa8-44b7-96c6-a8f2a9cfb0bf/data \
  -H 'Content-Type: application/json' \
  -H 'x-api-key: YOUR_API_KEY_HERE' \
  -d '{
    "records": [
      {
        "transactionId": "9826",
        "occurredAt": "2025-10-01T12:00:00.000000Z",
        "amount": 42.5,
        "tags": [
          "promo",
          "returning"
        ]
      },
      {
        "transactionId": "9827",
        "occurredAt": "2025-10-02T10:45:00.000000Z",
        "amount": 87.9,
        "tags": [
          "new",
          "discount"
        ]
      }
    ]
  }'

A successful ingestion returns an ingestion ID you can use to track status:

{
  "requestId": "b0eac937-c25c-47a5-bb7e-552f6b860458",
  "status": "success",
  "ingestionId": "8bfba187-84f4-41e2-9dd3-bee4bb884205",
  "message": "Data ingestion request accepted"
}

To check the ingestion status, send a GET request to the /v1/datasets/{datasetId}/ingestions/{ingestionId} endpoint:

curl -i -X GET \
  https://api.databox.com/v1/datasets/4e1219d8-7fa8-44b7-96c6-a8f2a9cfb0bf/ingestions/8bfba187-84f4-41e2-9dd3-bee4bb884205 \
  -H 'x-api-key: YOUR_API_KEY_HERE'
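
Both steps combined in Python (a sketch with the requests library; the dataset ID is the example value above, and re-sending a record with the same primary key value updates the existing record):

import requests

headers = {"x-api-key": "YOUR_API_KEY_HERE"}
dataset_id = "4e1219d8-7fa8-44b7-96c6-a8f2a9cfb0bf"  # example ID from this article

# Ingest one record; records sharing a primary key value update the existing row
resp = requests.post(
    f"https://api.databox.com/v1/datasets/{dataset_id}/data",
    headers=headers,
    json={
        "records": [
            {
                "transactionId": "9826",
                "occurredAt": "2025-10-01T12:00:00.000000Z",
                "amount": 42.5,
                "tags": ["promo", "returning"],
            }
        ]
    },
)
resp.raise_for_status()
ingestion_id = resp.json()["ingestionId"]

# Check the ingestion status
status = requests.get(
    f"https://api.databox.com/v1/datasets/{dataset_id}/ingestions/{ingestion_id}",
    headers=headers,
)
status.raise_for_status()
print(status.json())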

What happens next

Once your data is successfully ingested, the dataset and its fields become available in Databox for analysis and reporting. You can immediately start building custom metrics, visualizing trends, or combining your API data with metrics from other integrations.

Beyond visualization, Databox offers several tools to help you manage and enrich your datasets directly within the platform:

  • Add calculated columns — Create new columns derived from existing data using formulas, functions, and operators. This helps you calculate ratios, growth rates, and other custom KPIs without altering your source data.
  • Merge datasets — Combine multiple datasets into a single, unified dataset to consolidate related information. For example, merging sales and marketing data for cross-department reporting.
  • Export a dataset — Download your dataset as a CSV file for backup, further analysis, or sharing with other systems.

Once your dataset is configured, you can:

  • Use its fields in the metric builder to create visualizations and KPIs.
  • Set goals and forecasts based on your custom data.
  • Share Databoards across teams to keep everyone aligned around the same performance data.

Together, these options turn your API-fed datasets into a flexible, fully managed reporting layer inside Databox — ideal for ongoing performance tracking, analysis, and decision-making.

About the legacy Push API (v0)

Before the introduction of the dataset-based API (v1), Databox provided a simpler Push API (v0) for sending pre-aggregated metric data. It was designed for quick submissions, allowing users to push already calculated values — such as total revenue or number of leads — directly into Databox without the need to define datasets or schemas.

While v0 was easy to implement and suited basic reporting needs, it offered limited flexibility. It did not support raw data ingestion, record-level updates, or complex data structures — capabilities that are fully supported in v1.

The Push API (v0) is now deprecated. Although it continues to accept and process incoming data, new data sources can no longer be created using this version, and it is no longer maintained or updated.

The modern v1 API introduces a robust, dataset-based framework that supports structured data ingestion, validation, and integration with Databox’s datasets feature. It is the recommended option for all new and future implementations.

Frequently Asked Questions

Can I send real-time updates to Databox?

Yes. You can send data to Databox as frequently as needed, including real-time or event-based updates. However, API rate limits apply — refer to the Rate Limits documentation for details on request thresholds.

Keep in mind that while data is ingested immediately, metric updates in Databox still follow the configured data source sync frequency.
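
If you send data at a high frequency, it is worth handling rate-limit responses gracefully. The sketch below assumes the API signals rate limiting with a standard HTTP 429 status; confirm the exact behavior and thresholds in the Rate Limits documentation.

import time
import requests

def send_records(dataset_id, records, api_key, max_retries=5):
    # Sketch only: assumes a standard HTTP 429 response when rate limited;
    # check the Rate Limits documentation for the actual behavior.
    url = f"https://api.databox.com/v1/datasets/{dataset_id}/data"
    headers = {"x-api-key": api_key}
    for attempt in range(max_retries):
        resp = requests.post(url, headers=headers, json={"records": records})
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp.json()
        time.sleep(2 ** attempt)  # back off before retrying
    raise RuntimeError("Rate limit retries exhausted")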

Still need help?

Visit our community, send us an email, or start a chat in Databox.