Endpoint
The endpoint URL uses the `analytex` environment. For EU data residency, use `analytex-eu` instead. See Environment Settings for your specific endpoint.
Headers
| Header | Value | Required |
|---|---|---|
| Content-Type | application/json | Yes |
| Accept | application/json, text/plain, */* | Yes |
| Authorization | Token {YOUR_API_KEY} | Yes |
Request Body
| Field | Type | Required | Description |
|---|---|---|---|
| users | array | Yes | Array of user objects |
| user_id | string | Yes | Unique identifier for the user |
| company_id | string | No | Unique identifier for the company (if applicable) |
| metadata | object | No | Key-value pairs describing the user (primitives only) |
JSON Payload Example
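The payload below is a minimal sketch assembled from the field table above; all identifier and metadata values are illustrative.

```json
{
  "users": [
    {
      "user_id": "u_12345",
      "company_id": "c_67890",
      "metadata": {
        "plan": "pro",
        "seats": 25,
        "active": true
      }
    },
    {
      "user_id": "u_12346",
      "metadata": {
        "plan": "free"
      }
    }
  ]
}
```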
Optimize Your Payload: Use the full 10,000 record limit to maximize processing efficiency. The example above shows the structure of individual records; you can include up to 10,000 such records in a single request.
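As a sketch of the full request, assuming Python's `requests` library (the URL is a placeholder; your actual endpoint comes from Environment Settings):

```python
import requests

API_KEY = "YOUR_API_KEY"
# Placeholder URL: substitute the analytex (or analytex-eu) endpoint
# from your Environment Settings.
URL = "https://analytex.example.com/bulk/users"

payload = {
    "users": [
        {"user_id": "u_12345", "metadata": {"plan": "pro"}},
    ]
}

# Headers match the table above. The json= argument serializes the
# payload and would set Content-Type itself; we list it explicitly
# to mirror the documented requirements.
response = requests.post(
    URL,
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json, text/plain, */*",
        "Authorization": f"Token {API_KEY}",
    },
    json=payload,
)
print(response.status_code, response.json())
```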
Response
A successful call returns a job object.
Monitor Your Job: Use the job ID from the response to track the status of your bulk update job. Remember that only one bulk update job (user or company) can run at a time.
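This page does not spell out the job object's fields, so the shape below is an assumption for illustration only; consult the job monitoring docs for the real schema.

```json
{
  "job_id": "bulk_update_7f3a2c",
  "status": "pending"
}
```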
File Upload (NDJSON)
For very large updates, you may upload NDJSON files containing user profiles.
Endpoint
The endpoint URL uses the `analytex` environment. For EU data residency, use `analytex-eu` instead. See Environment Settings for your specific endpoint.
Headers
| Header | Value | Required |
|---|---|---|
| Content-Type | multipart/form-data | Yes |
| Accept | application/json, text/plain, */* | Yes |
| Authorization | Token {YOUR_API_KEY} | Yes |
Request Body
Submit the file using multipart/form-data, with a key called `file` containing your NDJSON file.
Each line in the file should be a valid JSON object.
File size is limited to 50 MB. Each request can contain up to 10,000 users.
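A minimal upload sketch, again assuming `requests` and a placeholder URL; the multipart key name `file` is the only part this doc specifies:

```python
import json
import requests

API_KEY = "YOUR_API_KEY"
# Placeholder URL: substitute the analytex (or analytex-eu) endpoint
# from your Environment Settings.
URL = "https://analytex.example.com/bulk/users/upload"

# Build an NDJSON file: one JSON object per line, as required above.
users = [
    {"user_id": "u_1", "metadata": {"plan": "pro"}},
    {"user_id": "u_2", "metadata": {"plan": "free"}},
]
with open("users.ndjson", "w") as f:
    for user in users:
        f.write(json.dumps(user) + "\n")

# Submit under the multipart key "file". requests generates the
# multipart/form-data Content-Type (including the boundary) itself.
with open("users.ndjson", "rb") as f:
    response = requests.post(
        URL,
        headers={"Authorization": f"Token {API_KEY}"},
        files={"file": ("users.ndjson", f, "application/x-ndjson")},
    )
print(response.status_code, response.json())
```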
Rate Limits
The Bulk Updates API rate-limits bulk user profile updates and synchronizations to keep large-scale operations efficient and the system stable.
Rate Limit Details
- Job Limitation: The system processes only one bulk update job at a time (either user or company operations).
- User Processing: Users are processed at a rate of 1,800 per minute.
- JSON Payload: Up to 10,000 records per request.
- File Upload: Up to 50 MB per file.
Error Responses
When you exceed these limits, the API returns:
- 409 Conflict: Returned when attempting to create a new job while another is in progress.
- 413 Payload Too Large: Returned when a file exceeds 50 MB or a payload exceeds 10,000 records.
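One way to handle these responses client-side, sketched with the same assumed URL and library as above: wait and retry on 409, split the batch on 413.

```python
import time
import requests

API_KEY = "YOUR_API_KEY"
URL = "https://analytex.example.com/bulk/users"  # placeholder endpoint
HEADERS = {"Authorization": f"Token {API_KEY}"}

def submit_users(users, retries=5, delay=60):
    """Submit a batch; wait and retry on 409, split the batch on 413."""
    for _ in range(retries):
        response = requests.post(URL, headers=HEADERS, json={"users": users})
        if response.status_code == 409:
            # Another bulk update job is still in progress.
            time.sleep(delay)
            continue
        if response.status_code == 413 and len(users) > 1:
            # Payload too large: split in half and submit each part.
            mid = len(users) // 2
            return (submit_users(users[:mid], retries, delay) +
                    submit_users(users[mid:], retries, delay))
        response.raise_for_status()
        return [response.json()]
    raise RuntimeError("gave up waiting for the in-progress job to finish")
```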
Best Practices
- Check Job Status: Always verify no existing bulk update jobs are running before initiating a new one (a polling sketch follows this list).
- Optimize Batch Sizes: Use the full 10,000 record limit for JSON payloads to maximize efficiency.
- Monitor Processing: Track your user processing speed to stay within the 1,800 users per minute limit.
- Handle File Uploads: For large datasets, use file uploads (up to 50 MB) instead of JSON payloads.
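A polling sketch for the first practice. The status endpoint path and terminal status values are assumptions, since this page only references the job monitoring endpoints without defining them:

```python
import time
import requests

API_KEY = "YOUR_API_KEY"
HEADERS = {"Authorization": f"Token {API_KEY}"}
# Hypothetical status path: take the real one from the job
# monitoring docs.
STATUS_URL = "https://analytex.example.com/bulk/jobs/{job_id}"

def wait_for_job(job_id, poll_seconds=30):
    """Poll a bulk update job until it reaches an assumed terminal status."""
    while True:
        job = requests.get(STATUS_URL.format(job_id=job_id),
                           headers=HEADERS).json()
        # "completed" and "failed" are assumed terminal values.
        if job.get("status") in ("completed", "failed"):
            return job
        time.sleep(poll_seconds)
```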
Limitations
- File size up to 50 MB (for file uploads).
- JSON/NDJSON list up to 10,000 users per request.
- Only primitive types (string, number, boolean, null) are supported in metadata.
Best Practices
- Validate Your Data: Ensure each record includes the required identifier (`user_id`) and that metadata is formatted correctly (see the sketch after this list).
- Monitor Jobs: Always use the job monitoring endpoints to check the status of your bulk updates.
- Rate Limits and Retries: If you experience rate limits or timeouts, batch your requests and monitor job statuses before submitting more.
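A validation sketch for the first point: check for `user_id` and reject non-primitive metadata values before submitting. The primitive set mirrors the Limitations above; the helper itself is illustrative.

```python
# Primitive types allowed in metadata per the Limitations section:
# string, number, boolean, null.
PRIMITIVES = (str, int, float, bool, type(None))

def validate_user(record):
    """Return a list of problems with one user record (empty if valid)."""
    problems = []
    if not record.get("user_id"):
        problems.append("missing required field: user_id")
    for key, value in record.get("metadata", {}).items():
        if not isinstance(value, PRIMITIVES):
            problems.append(f"metadata[{key!r}] is not a primitive type")
    return problems

# Example: the list value should be flagged.
print(validate_user({"user_id": "u_1",
                     "metadata": {"plan": "pro", "tags": ["a", "b"]}}))
# -> ["metadata['tags'] is not a primitive type"]
```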
Troubleshooting
- Authentication Errors: Verify your API token and confirm it is sent in the `Authorization` header.
- Invalid Payload: Ensure your JSON/NDJSON is well-formed and all required fields are present.
- Job Failures: Use the job status endpoint to inspect error messages for failed records or processing issues.