The Chain.io Flow environment runs on a highly scalable parallel processing architecture. This means that every time you or your integrations send data to a Chain.io Flow, each individual message is processed in real time.
This can create some unintended side effects if the systems you're working with are not quite as scalable.
For example, imagine dropping 1,000 purchase order files in an FTP folder for Chain.io to process and load via API calls to a supplier's ERP system. If you drop all 1,000 files at the same time, Flow will create 1,000 separate processing jobs that all run simultaneously. This may cause a problem in the downstream system if it cannot handle 1,000 simultaneous requests.
With throttling enabled, you can slow the flood of requests by setting, in the Flow configuration, the number of messages released into the Flow environment per minute. Once the limit is reached, Chain.io holds all further messages and polls every ten seconds; when the limit is no longer in effect, it releases more messages until the limit is reached again. Messages are queued for up to 24 hours and then discarded.
The diagram below shows an example with 4 messages and a flow limit of 2 messages per minute.
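The release behavior described above can be modeled with a minimal sketch. Note this is purely illustrative: `MessageThrottle`, `submit`, and `poll` are hypothetical names for this example, not part of Chain.io's API, and real timing details may differ.

```python
class MessageThrottle:
    """Illustrative model of the per-minute throttle described above:
    messages queue once the limit is reached and expire after 24 hours."""

    MAX_QUEUE_SECONDS = 24 * 60 * 60  # messages held longer than this are discarded

    def __init__(self, per_minute_limit):
        self.per_minute_limit = per_minute_limit
        self.queue = []               # (enqueued_at, message) pairs waiting for release
        self.released_this_minute = 0
        self.minute_start = 0.0

    def submit(self, message, now):
        """Accept a message into the queue (timestamps are seconds)."""
        self.queue.append((now, message))

    def poll(self, now):
        """Called every ten seconds: release queued messages up to the limit."""
        # Start a fresh release window at each new minute.
        if now - self.minute_start >= 60:
            self.minute_start = now
            self.released_this_minute = 0
        released, still_queued = [], []
        for enqueued_at, message in self.queue:
            if now - enqueued_at > self.MAX_QUEUE_SECONDS:
                continue  # discarded: queued for more than 24 hours
            if self.released_this_minute < self.per_minute_limit:
                released.append(message)
                self.released_this_minute += 1
            else:
                still_queued.append((enqueued_at, message))
        self.queue = still_queued
        return released
```

Running the diagram's scenario (4 messages, limit of 2 per minute) through this model: the first poll releases two messages, polls within the same minute release none, and the remaining two are released once the next minute begins.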
To enable throttling on a flow:
1. Under Advanced Options, enter an integer between 0 and 600 for Per Minute and/or Concurrent.
- If the field is blank, Chain.io applies its default throttle of 600 messages per minute. This is a soft limit; please contact support if you need it increased.
- A value of 0 causes all messages to queue until they time out and are discarded after 24 hours. You can use this to temporarily hold messages and release them when required. This is different from disabling a flow, which will not process any messages at the API or FTP boundary.
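The distinction between a 0 throttle and a disabled flow can be sketched as follows. This is an illustrative model only; `accept_message` and its parameters are hypothetical names invented for this example.

```python
def accept_message(flow_enabled, per_minute_throttle, queue):
    """Illustrative sketch: a throttle of 0 still accepts messages into the
    queue (they are simply never released and expire after 24 hours), while
    a disabled flow rejects messages at the API or FTP boundary."""
    if not flow_enabled:
        return "rejected"      # disabled flow: nothing crosses the boundary
    queue.append("message")    # accepted; per_minute_throttle of 0 just holds it
    return "queued"
```

In other words, throttling to 0 is a temporary hold you can later release, whereas disabling the flow stops intake entirely.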
Our throttling architecture is built for expansion, so if you need a different throttling mechanism for your system, please contact support. Our development team is available to discuss building custom algorithms to suit your needs.