If the data you pass does not match the table's schema, you can control what happens using the onSchemaError query parameter.
Updating the Schema
You can optionally update the table schema at the same time by providing a new schema in the JSON request body. If you do not provide a schema, the existing schema will be used. When using a CSV or TSV request body, you cannot pass a schema. If you need to update the schema, use the onSchemaError=updateSchema query parameter, or stash the CSV/TSV data and pass a JSON request body referencing the stash ID.
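As a sketch of what such a request can look like, the example below sends both a schema and rows in one JSON body. The base URL, the PUT method, and the token value are placeholders rather than part of this reference; the schema and rows fields are described under Body below.

// Sketch only: overwrite a table's schema and data in a single request.
// The base URL and HTTP method are placeholders for illustration.
const tableID = "2a1bad8b-cf7c-44437-b8c1-e3782df6";
const token = "<token>"; // your auth token

const response = await fetch(`https://api.example.com/tables/${tableID}`, {
  method: "PUT",
  headers: {
    Authorization: `Bearer ${token}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    // New schema for the table (optional; omit to keep the existing schema).
    schema: {
      columns: [
        { id: "fullName", displayName: "Full Name", type: "string" },
        { id: "totalAmount", displayName: "Total", type: "number" },
      ],
    },
    // New row data, keyed by column ID.
    rows: [{ fullName: "Alex Bard", totalAmount: 34.5 }],
  }),
});
if (!response.ok) throw new Error(`Overwrite failed: ${response.status}`);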
Examples
Clear table data
To clear all data from the table, pass an empty array in the rows field:
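The sketch below reuses the request shape from the example under Updating the Schema; only the body changes.

// Sketch: an empty rows array removes every existing row from the table.
const body = JSON.stringify({ rows: [] });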
Reset table data
To reset the table with new data, pass the new row objects in the rows field (being sure that the row object structure matches the table schema):
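Again only the body differs from the earlier sketch; the rows here match the example schema shown under Body below.

// Sketch: replace the table contents with these rows (keys are column IDs).
const body = JSON.stringify({
  rows: [
    { fullName: "Alex Bard", invoiceDate: "2024-07-29T14:04:15.561Z", totalAmount: 34.5, amountPaid: 0 },
    { fullName: "Alicia Hines", invoiceDate: "2023-06-15T10:30:00.000Z", totalAmount: 50.75, amountPaid: 20 },
  ],
});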
Reset table data from Stash
To reset a table with a larger amount of data, first stash the data using the Stash Data endpoint, and then use a $stashID reference in the rows field instead of providing the data inline:
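The sketch below is the same request with the rows field holding a stash reference instead of inline data. The exact shape of the reference and the stash ID value are assumptions for illustration; see the Stash Data endpoint for creating a stash.

// Sketch: reference previously stashed data instead of inline rows.
// The object shape and the stash ID below are assumptions for illustration.
const body = JSON.stringify({
  rows: { $stashID: "20240215-job32" },
});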
Batch update table rows
1. Call the Get Rows endpoint to fetch the current table data. Pass a reasonably high limit query parameter because you want to fetch all rows in as few requests as possible.
2. Modify the data as desired, and then stash the modified rows using the Stash Data endpoint.
3. If a continuation was returned in step 1, repeat steps 1-3, passing the continuation query parameter to Get Rows, until all rows have been fetched, modified, and stashed.
4. Finally, call this endpoint with the same stash ID used in step 2. This will overwrite the table with the updated data:
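The sketch below strings those steps together under the same placeholder assumptions as the earlier examples: the base URL, the Get Rows and Stash Data paths, the Get Rows response shape, and the way the ETag is obtained are illustrative, while the limit and continuation parameters, the If-Match header, and the stash reference are the parts documented on this page.

// Sketch of the batch-update flow; URLs, methods, and response shapes are
// placeholders for illustration only.
const tableID = "2a1bad8b-cf7c-44437-b8c1-e3782df6";
const token = "<token>";
const stashID = "20240215-job32"; // placeholder stash ID for this batch
const headers = {
  Authorization: `Bearer ${token}`,
  "Content-Type": "application/json",
};

let continuation: string | undefined;
let etag: string | null = null;

do {
  // Steps 1 and 3: fetch rows page by page, passing a high limit and, on
  // later iterations, the continuation returned by the previous request.
  const query = new URLSearchParams({ limit: "10000" });
  if (continuation) query.set("continuation", continuation);
  const page = await fetch(
    `https://api.example.com/tables/${tableID}/rows?${query}`,
    { headers },
  );
  etag = page.headers.get("ETag") ?? etag; // assumed source of the table version
  const data = await page.json(); // assumed shape: { rows: [...], continuation?: string }

  // Step 2: modify the rows as desired, then stash this batch.
  const modified = data.rows.map((row: Record<string, unknown>) => ({
    ...row,
    amountPaid: 0, // example modification
  }));
  await fetch(`https://api.example.com/stashes/${stashID}`, {
    method: "POST", // placeholder; see the Stash Data endpoint
    headers,
    body: JSON.stringify(modified),
  });

  continuation = data.continuation;
} while (continuation);

// Step 4: overwrite the table from the stash, using If-Match so the request
// fails if the table changed since step 1 (see Data Versioning).
await fetch(`https://api.example.com/tables/${tableID}`, {
  method: "PUT",
  headers: etag ? { ...headers, "If-Match": etag } : headers,
  body: JSON.stringify({ rows: { $stashID: stashID } }),
});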
You can use the If-Match header to ensure the table is not modified between steps 1 and 4. See the Data Versioning guide for more information.

Authorizations
Bearer authentication header of the form Bearer <token>, where <token> is your auth token.
Headers
If-Match: ETag of the current table version. If provided, the request will fail if the table has been updated since the given version. See Data Versioning.
Path Parameters
ID of the table, e.g., 2a1bad8b-cf7c-44437-b8c1-e3782df6
"2a1bad8b-cf7c-44437-b8c1-e3782df6"
Query Parameters
The action to take when the passed data does not match the table schema:

- abort: Abort the entire operation and return an error.
- dropColumns: Ignore the data that caused the error, and do not import those columns in the affected rows.
- updateSchema: Update the schema as needed to add any missing columns or widen the data types of existing columns, and then import the data.

Available options: abort, dropColumns, updateSchema

Example: "updateSchema"
Body
A collection of row objects conforming to the schema of the table, where keys are the column IDs and values are the column values:
[
{
"fullName": "Alex Bard",
"invoiceDate": "2024-07-29T14:04:15.561Z",
"totalAmount": 34.50,
"amountPaid": 0
},
{
"fullName": "Alicia Hines",
"invoiceDate": "2023-06-15T10:30:00.000Z",
"totalAmount": 50.75,
"amountPaid": 20
}
]
The schema of the table as a collection of column definitions.
{
"columns": [
{
"id": "fullName",
"displayName": "Full Name",
"type": "string"
},
{
"id": "invoiceDate",
"displayName": "Invoice Date",
"type": "dateTime"
},
{
"id": "totalAmount",
"displayName": "Total",
"type": "number"
},
{
"id": "amountPaid",
"displayName": "Paid",
"type": "number"
}
]
}