I'm working on a project where I need to sync prices between my app and several Shopify stores. Each user can link their Shopify store, and when a product's price changes in my app, a request is made to that store to update the price via the Shopify API. However, each Shopify store has a rate limit, so we need to add a delay of some milliseconds between requests to the same store to avoid exceeding it. The idea is for the process to sync at least 10 stores in parallel.

I was considering using queues with RabbitMQ or Kafka to keep the rate limit under control while processing in parallel. The problem is that sync requests can arrive at any time, so a single queue might contain requests for different users' stores, making it impossible to control the rate limit for each store. What would be the best way to handle this efficiently and avoid "too many requests" errors? Thanks for your help!
Jese Leos
August 19, 2024
Verified user
From what I understand from the Shopify documentation, REST API rate limits are applied per combination of app and store. This means that hitting the rate limit on one store shouldn't affect the rate limits of another store, even when the calls come from the same source (app). You can therefore push to independent stores simultaneously, as long as you stay within each particular store's rate limit.

The queue approach you're considering sounds appropriate; however, if all requests share one queue and are dequeued as fast as they arrive, it defeats the purpose and you might still get blocked by the API rate limits. One way to avoid this is to introduce a slight delay between subsequent messages to the same store, which can be done using scheduled messages in RabbitMQ, for example.

Reference: Scheduling Messages with RabbitMQ
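To make the idea concrete, here is a minimal in-process sketch of per-store throttling: each store gets its own queue and worker, so delays are enforced independently per store while all stores run in parallel. This is an illustration of the pattern, not RabbitMQ itself; the store IDs, `MIN_INTERVAL`, and the placeholder for the actual Shopify API call are all assumptions for the example.

```python
import asyncio
import time
from collections import defaultdict

# Minimum gap between calls to the SAME store (tune to the store's rate limit).
MIN_INTERVAL = 0.5

async def store_worker(store_id, queue, results):
    # Drains this store's queue, waiting MIN_INTERVAL between requests.
    while True:
        item = await queue.get()
        if item is None:  # shutdown sentinel
            break
        # Placeholder for the real Shopify price-update call (assumption).
        results.append((store_id, item, time.monotonic()))
        await asyncio.sleep(MIN_INTERVAL)

async def sync_prices(updates):
    # updates: list of (store_id, payload) pairs arriving in any order.
    queues = defaultdict(asyncio.Queue)
    results = []
    workers = {}
    for store_id, payload in updates:
        if store_id not in workers:
            # First update for this store: spawn its dedicated worker.
            workers[store_id] = asyncio.create_task(
                store_worker(store_id, queues[store_id], results))
        queues[store_id].put_nowait(payload)
    # Tell every worker to shut down once its queue is drained.
    for q in queues.values():
        q.put_nowait(None)
    await asyncio.gather(*workers.values())
    return results
```

With a message broker, the same shape applies: route messages to a per-store queue (or use per-store delayed messages in RabbitMQ) so that one busy store never starves or throttles the others.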