StarbaseDB lets you both import data into and export data from your database in whichever way is most convenient for you. This guide walks you through both operations using the JSON format.
You can import data either by uploading a file directly (file size limits may apply) or by passing a JSON body in the API request.
URL Structure
To call the correct API route, use the following URL structure.
https://starbasedb.YOUR-IDENTIFIER.workers.dev/import/json/{YOUR_TABLE_NAME}
The host portion of the URL is where your database is accessible, followed by /import/json/, which routes your request to the application logic that handles importing JSON data. The last path component is the name of the table (e.g. users) the data should be appended to.
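If you are building this URL in code rather than typing it out, a minimal TypeScript sketch might look like the following (the identifier and table name below are placeholders for your own values):

// Sketch: build the JSON import URL for a given table.
// "YOUR-IDENTIFIER" and the table name are placeholders.
const baseUrl = "https://starbasedb.YOUR-IDENTIFIER.workers.dev";

function importJsonUrl(tableName: string): string {
  return `${baseUrl}/import/json/${encodeURIComponent(tableName)}`;
}

console.log(importJsonUrl("users"));
// => https://starbasedb.YOUR-IDENTIFIER.workers.dev/import/json/users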
Column Mapping
As you pass data into the API, you can also tell the system to map your JSON keys to the names your database columns use. For example, if your JSON has the key fname but your database calls that column first_name, you would tell the API to map fname → first_name so the data is imported into the table correctly.
Below is an example cURL:
curl --location 'https://starbasedb.YOUR-IDENTIFIER.workers.dev/import/json/users' \
--header 'Authorization: Bearer ABC123' \
--header 'Content-Type: application/json' \
--data '{
"data": [
{ "fname": "John", "lname": "Doe" },
{ "fname": "Jane", "lname": "Dover" }
],
"columnMapping": {
"fname": "first_name",
"lname": "last_name"
}
}'
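If you would rather call the endpoint from code than from cURL, the same request might look like this fetch-based TypeScript sketch (Node 18+ or a browser; the identifier, token, and table name are placeholders):

// Sketch: import rows with a column mapping using fetch.
// The identifier, token, and table name are placeholders for your own values.
const response = await fetch(
  "https://starbasedb.YOUR-IDENTIFIER.workers.dev/import/json/users",
  {
    method: "POST",
    headers: {
      Authorization: "Bearer ABC123",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      data: [
        { fname: "John", lname: "Doe" },
        { fname: "Jane", lname: "Dover" },
      ],
      // Map JSON keys to the column names your table actually uses.
      columnMapping: { fname: "first_name", lname: "last_name" },
    }),
  }
);

console.log(await response.json());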
Import from Body
If you prefer to send the JSON data you want appended to a table as a JSON object in the API request (instead of a file), you can do so with a data param. Below is an example.
curl --location 'https://starbasedb.YOUR-IDENTIFIER.workers.dev/import/json/users' \
--header 'Authorization: Bearer ABC123' \
--header 'Content-Type: application/json' \
--data '{
"data": [
{ "column1": "value1", "column2": "value2" },
{ "column1": "value3", "column2": "value4" }
]
}'
In the ideal situation where no entries fail, the response you receive from the API endpoint will look like this:
{
    "result": {
        "message": "Imported 2 out of 2 records successfully. 0 records failed.",
        "failedStatements": []
    }
}
A response in which some, or all, entries failed may look like the following:
{
    "result": {
        "message": "Imported 0 out of 2 records successfully. 2 records failed.",
        "failedStatements": [
            {
                "statement": "INSERT INTO users (database_column1, database_column2) VALUES (?, ?)",
                "error": "Unknown error"
            },
            {
                "statement": "INSERT INTO users (database_column1, database_column2) VALUES (?, ?)",
                "error": "Unknown error"
            }
        ]
    }
}
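Because partial failures are reported in the response body rather than as an error status, it can be useful to check failedStatements programmatically. Below is a minimal TypeScript sketch, assuming the response shape shown in the examples above:

// Sketch: report failed statements from an import response.
// The ImportResult shape mirrors the example responses above.
interface FailedStatement {
  statement: string;
  error: string;
}

interface ImportResult {
  result: {
    message: string;
    failedStatements: FailedStatement[];
  };
}

function reportImport(body: ImportResult): void {
  console.log(body.result.message);
  for (const failure of body.result.failedStatements) {
    console.error(`Failed: ${failure.statement} (${failure.error})`);
  }
}

// Example usage with a parsed response body:
// reportImport(await response.json() as ImportResult);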
Import from File
You can also append new data to an existing SQL table from a JSON file by attaching the file to your request instead of providing a data object in the body. Below is an example:
curl -X POST \
-H "Content-Type: multipart/form-data" \
-H "Authorization: Bearer ABC123" \
-F "file=@/path/to/your/jsonfile.json" \
-F 'columnMapping={"fname": "first_name", "lname": "last_name"}' \
https://starbasedb.YOUR-IDENTIFIER.workers.dev/import/json/users
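The same upload can be done from code. Below is a TypeScript sketch for Node 18+ (where fetch, FormData, and Blob are available globally); the file path, identifier, and token are placeholders:

import { readFile } from "node:fs/promises";

// Sketch: upload a JSON file with an optional column mapping.
// The file path, identifier, and token are placeholders for your own values.
const fileContents = await readFile("/path/to/your/jsonfile.json");

const form = new FormData();
form.append(
  "file",
  new Blob([fileContents], { type: "application/json" }),
  "jsonfile.json"
);
form.append(
  "columnMapping",
  JSON.stringify({ fname: "first_name", lname: "last_name" })
);

const res = await fetch(
  "https://starbasedb.YOUR-IDENTIFIER.workers.dev/import/json/users",
  {
    method: "POST",
    headers: { Authorization: "Bearer ABC123" },
    body: form, // fetch sets the multipart boundary automatically
  }
);

console.log(await res.json());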
Export Table Data
It is quite simple to export all of your table data to a JSON file. Run the cURL below in a command line tool such as Terminal, changing the last URL path component to the table you want to export from; StarbaseDB will aggregate the data and write it to a local output.json file in the directory where you run the command.
curl --location 'https://starbasedb.YOUR-IDENTIFIER.workers.dev/export/json/users' \
--header 'Authorization: Bearer ABC123' \
--output output.json
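If you prefer to script the export, the following TypeScript sketch fetches the same endpoint from Node 18+ and writes the response to a local output.json file (the identifier, token, and table name are placeholders):

import { writeFile } from "node:fs/promises";

// Sketch: export a table to output.json using fetch (Node 18+).
// The identifier, token, and table name are placeholders for your own values.
const res = await fetch(
  "https://starbasedb.YOUR-IDENTIFIER.workers.dev/export/json/users",
  { headers: { Authorization: "Bearer ABC123" } }
);

if (!res.ok) {
  throw new Error(`Export failed with status ${res.status}`);
}

await writeFile("output.json", Buffer.from(await res.arrayBuffer()));
console.log("Wrote output.json");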