
Nginx rate limits for third party services


API rate limits: the problem

To prevent abuse, many APIs limit the number of requests a client can make in a given amount of time. Here I’ll show you how to configure Nginx rate limits for third party services as well, so that our applications can connect to Nginx as a proxy instead of consuming the third party services directly.

In the example below we’ll be using the Reddit API, whose access rules allow only 30 requests/minute per client.

Complying with this sort of API rate limit at application level, while possible, can be quite complicated: you need to maintain some shared state across the various instances of the application so that the limits are not exceeded regardless of which instance is making requests at any given time. I’m a Ruby developer, so in the past I have used a gem called SlowWeb to comply with a third party API’s rate limits. Unfortunately this gem is no longer maintained (the last updates were 3 years ago), and in any case it doesn’t work on its own with multiple instances of the application, since it doesn’t share any state between them.
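To make the shared-state problem concrete, here is a minimal sketch in Ruby of a sliding-window limiter (my own illustration, not SlowWeb’s actual implementation). It keeps its state in memory, so two instances of the application would each allow themselves 30 requests/minute — which is exactly why app-level limiting needs shared storage to work correctly:

```ruby
# Minimal in-process sliding-window rate limiter (illustrative sketch).
class RateLimiter
  def initialize(limit:, period:)
    @limit = limit          # max requests allowed per window
    @period = period        # window length in seconds
    @timestamps = []        # in-memory state: NOT shared across processes
  end

  # Returns true if a request is allowed right now, false otherwise.
  def allow?(now = Time.now.to_f)
    # Drop timestamps that have fallen out of the window.
    @timestamps.reject! { |t| t <= now - @period }
    return false if @timestamps.size >= @limit
    @timestamps << now
    true
  end
end

limiter = RateLimiter.new(limit: 30, period: 60)
30.times { limiter.allow? }
limiter.allow? # => false (the per-minute budget is spent)
```

Because `@timestamps` lives inside one process, every extra instance multiplies the effective request rate against the third party API.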

Nginx rate limits

Wouldn’t it be cool if there was a way to comply with a third party API’s rate limits independently of our application, and without reinventing the wheel? Then there would be no need to maintain shared state across multiple instances of the application, since the rate limiting would be handled separately. There’s a simple answer to this: web servers. It is trivial to implement such a solution with a web server like Apache or Nginx.

I normally use Nginx, so I’ll give you a very simple example (for the Reddit API) with this web server. First, we need to add the following lines to Nginx’s main configuration:

http {
  ...

  # Use a constant key ($server_name) so the 30 r/m cap applies to all
  # clients combined, matching the third party API's global limit;
  # $binary_remote_addr would instead give each client IP its own 30 r/m.
  limit_req_zone $server_name zone=api_name:10m rate=30r/m;

  ...
}

Then we need to add the following lines to a virtual host that we’ll dedicate as a wrapper for the third party API:

server {
  listen 80;
  server_name your_url.ext;

  location / {
    limit_req zone=api_name burst=30;
    proxy_pass http://api_url.ext/;
  }
}
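By default Nginx rejects requests that exceed the burst with a 503 status. If you would rather signal rate limiting explicitly to your application, a variant of the virtual host above could use `limit_req_status` (available in Nginx 1.3.15 and later) — a sketch, with the same placeholder names:

```nginx
server {
  listen 80;
  server_name your_url.ext;

  location / {
    limit_req zone=api_name burst=30;
    # Return 429 Too Many Requests instead of the default 503
    # when a request is rejected (requires nginx >= 1.3.15).
    limit_req_status 429;
    proxy_pass http://api_url.ext/;
  }
}
```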

That’s it! Now you can just point your application at your custom URL and stop worrying about the API rate limits. How it works is very simple: Nginx uses the built-in limit_req module (ngx_http_limit_req_module, historically documented as HttpLimitReqModule) to limit the number of requests for a given key in a given amount of time. In the example above, we first define a ‘zone’ specifying that we want to limit requests to 30 per minute; then, in the virtual host, we let Nginx proxy all requests to the API’s URL with some “burstiness”: up to 30 excess requests are queued and delayed so that the rate actually sent to the third party API never exceeds what it allows. Another bit of configuration you may want to add to the Nginx virtual host is caching, but I usually prefer handling that at application level, for example with Redis.
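On the application side, all that changes is the base URL. A hypothetical Ruby client might look like the following (`REDDIT_PROXY_URL`, `your_url.ext` and the request path are placeholder names matching the config above, not real endpoints):

```ruby
require "uri"
require "net/http"

# Point the client at the Nginx proxy instead of the API directly;
# "your_url.ext" is the placeholder server_name from the config above.
REDDIT_BASE = ENV.fetch("REDDIT_PROXY_URL", "http://your_url.ext")

# Build a request URI relative to the proxy's base URL.
def reddit_uri(path)
  URI.join(REDDIT_BASE, path)
end

# Nginx transparently enforces the 30 r/m limit before forwarding:
# response = Net::HTTP.get_response(reddit_uri("/r/ruby/top.json"))
```

The application itself never needs to know the rate limit exists; slowed-down responses (or 503s past the burst) are the only visible effect.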

Know of other tricks to easily comply with API rate limits? Please let me know in the comments.

© Vito Botta