The _bulk endpoint allows for efficient processing of multiple requests in a single operation. It supports streaming, parallel or sequential processing, and atomic execution.
Bulk requests can be streamed, so the server does not need to load the entire request into memory. Results, however, are buffered in memory until the stream completes, so large datasets can produce large responses. Consider breaking very large operations into smaller batches.
For processing more than 100 items, we recommend using streaming mode instead of increasing the bulk size limit.
To enable streaming, include one of the following content type headers in your HTTP request:

- For a script stream: `application/vnd.formance.ledger.api.v2.bulk+script-stream`
- For a JSON stream: `application/vnd.formance.ledger.api.v2.bulk+json-stream`
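As a sketch, a streaming request could look like the following curl call; the local address, port, and ledger name `default` are assumptions to adjust for your deployment:

```sh
# POST a script stream to the bulk endpoint
# (address, port, and ledger name are assumptions)
curl -X POST 'http://localhost:3068/v2/default/_bulk' \
  -H 'Content-Type: application/vnd.formance.ledger.api.v2.bulk+script-stream' \
  --data-binary @bulk-scripts.txt  # file containing //script ... //end blocks
```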
### Script stream format

For a script stream, each Numscript transaction must be wrapped with `//script` and `//end` delimiters.
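For example, a stream carrying two transactions might look like the following sketch; the Numscript bodies, asset, and account names are illustrative:

```
//script
send [USD/2 100] (
  source = @world
  destination = @users:alice
)
//end
//script
send [USD/2 50] (
  source = @users:alice
  destination = @users:bob
)
//end
```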
Idempotency keys prevent duplicate transactions when replaying bulk requests after failures. Each bulk element can specify its own key. For script streams, add `ik=<key>` to the script header.
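For example, with a hypothetical key `payment-42` (the key value and Numscript body are illustrative):

```
//script ik=payment-42
send [USD/2 100] (
  source = @world
  destination = @users:alice
)
//end
```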
The 100-item limit applies only to non-streaming requests. If your use case requires a larger limit, you can configure it using the `--bulk-max-size` flag or the `BULK_MAX_SIZE` environment variable:
```sh
ledger serve --bulk-max-size 1000
```
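The same limit can be set through the `BULK_MAX_SIZE` environment variable instead of the flag:

```sh
# equivalent to --bulk-max-size 1000
BULK_MAX_SIZE=1000 ledger serve
```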
Increasing the bulk size does not necessarily improve write performance. Test different values to find the optimal setting for your use case.