Rate Limiting in .NET with Redis
Steve Lorello
In this tutorial, we'll explore several approaches to implementing rate limiting in ASP.NET Core apps using Redis. We'll start with a basic fixed window implementation, move to a more precise sliding window approach, and finish with fully configurable rate limiting middleware.

#What is rate limiting?

Rate limiting comprises techniques used to regulate the number of requests a particular client can make against a networked service, capping the total number and/or the frequency of those requests. There are many reasons to add a rate limiter to your APIs: whether the abuse is intentional or accidental, a rate limiter can stop the invaders at the gate. Let's consider some scenarios where a rate limiter could save your bacon:
  • If you've ever worked at an API-based startup, you know that to get anywhere you need a "free" tier. A free tier gets potential customers to try your service and spread the word. But without limiting free-tier usage, you risk losing the few paid customers your startup has.
  • Programmatic integrations with your API can have bugs. Resource starvation isn't always caused by a malicious attack; these FFDoS (friendly-fire denial of service) incidents happen more often than you might imagine.
  • Finally, there are malicious players recruiting botnets on a daily basis to make API providers' lives miserable. Detecting and curtailing those attacks before they impact your users could mean the difference between life and death for your business.
Rate limiting relies on three particular pieces of information:
  1. Who's making the request: Identifying the source of abuse is the most important part of the equation. If the offending requests cannot be grouped and associated with a single entity you'll be fighting in the dark.
  2. What's the cost of the request: Not all requests are created equal. For example, a request that's bound to a single account's data can likely only cause localized havoc, while a request that spans multiple accounts or broad spans of time is much more expensive.
  3. What is their allotted quota: How many total requests and/or what's the rate of requests permitted for the user.

#Why Redis for rate limiting?

Redis is especially well positioned as a platform for implementing rate limiting, for several reasons:
  • Speed: The checks and calculations required by a rate limiting implementation will add to the total request-response times of your API. You want those operations to happen as fast as possible.
  • Centralization and distribution: Redis can seamlessly scale your single server/instance setup to hundreds of nodes without sacrificing performance or reliability.
  • The right abstractions: Redis provides optimized data structures to support several of the most common rate limiter implementations, and its built-in TTL (time-to-live) controls allow for efficient memory management. Counting things is a built-in feature in Redis and one of the many areas where Redis shines above the competition.

#Prerequisites

#Start Redis

Before we begin, start Redis. For this example, we'll use the Redis docker image:
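A typical invocation, assuming Docker is installed and the default port 6379 is free:

```shell
# Start Redis in the background, exposed on the default port
docker run -d -p 6379:6379 redis
```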

#Fixed window rate limiting with Redis and ASP.NET Core

The simplest approach to building a rate limiter is the "fixed window" implementation, in which we cap the maximum number of requests in a fixed window of time. For example, if the window size is 1 minute, we can "fix" it at the top of each minute: 12:00:00–12:00:59, 12:01:00–12:01:59, and so forth.
The procedure to implement a fixed window rate limiter is:
  1. Identify the requester: This might be an API key, a token, a user's name or id, or even an IP address.
  2. Find the current window: Use the current time to find the window. For example, with 1 minute windows at 3:15 PM, the window would be labeled "3:15".
  3. Find the request count: Look up the count under a key like route:apiKey:timeWindow.
  4. Increment the request count: Increment the counter for this window+user key.
  5. Rate limit if applicable: If the count exceeds the user's quota, deny the request; otherwise, allow it to proceed.
This simple implementation minimizes CPU and I/O utilization but comes with some limitations. It is possible to experience spikes near the edges of the window, since API users might send a burst of requests at the end of one window and the start of the next.

#Create the project

In your terminal, navigate to where you want the app to live, create the project, change into the FixedRateLimiter directory, and add the StackExchange.Redis package:
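Assuming the .NET SDK is installed, commands along these lines will do it:

```shell
dotnet new webapi -n FixedRateLimiter
cd FixedRateLimiter
dotnet add package StackExchange.Redis
```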
Open the project in your IDE and in the Controllers folder, add an API controller called RateLimitedController:
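A minimal skeleton for that controller might look like this (the namespace follows the default webapi template; the route attribute is an assumption):

```csharp
using Microsoft.AspNetCore.Mvc;

namespace FixedRateLimiter.Controllers
{
    [ApiController]
    [Route("api/[controller]")]
    public class RateLimitedController : ControllerBase
    {
        // Rate-limited actions will be added here
    }
}
```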

#Initialize the multiplexer

To use Redis, initialize an instance of the ConnectionMultiplexer from StackExchange.Redis. Go to the ConfigureServices method inside Startup.cs and add the following line:
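Assuming Redis is running locally on the default port, the registration could look like:

```csharp
services.AddSingleton(ConnectionMultiplexer.Connect("localhost"));
```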

#Inject the ConnectionMultiplexer

In RateLimitedController.cs, inject the ConnectionMultiplexer into the controller and pull out an IDatabase object:
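A sketch of that wiring, assuming the ConnectionMultiplexer was registered as a singleton:

```csharp
private readonly IDatabase _db;

public RateLimitedController(ConnectionMultiplexer mux)
{
    // IDatabase is the interface used to issue commands against Redis
    _db = mux.GetDatabase();
}
```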

#Add a simple route

We will add a POST request route that uses Basic auth. Each request expects a header of the form Authorization: Basic <base64encoded> where the base64 encoded value is a string of the form apiKey:apiSecret. This route will parse the key out of the header and return an OK result.
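A sketch of such a route (assuming `using System;` and `using System.Text;` at the top of the file; the action name Simple is an assumption):

```csharp
[HttpPost("simple")]
public IActionResult Simple([FromHeader] string authorization)
{
    // Header form: "Basic base64(apiKey:apiSecret)"
    var encoded = authorization.Split(' ')[1];
    var apiKeyAndSecret = Encoding.UTF8.GetString(Convert.FromBase64String(encoded));
    var apiKey = apiKeyAndSecret.Split(':')[0];
    return Ok();
}
```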
With that setup, run the project with dotnet run, and issue a POST request with apiKey foobar and password password:
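Assuming the app listens on localhost:5000 (adjust to whatever port dotnet run reports), curl can build the Basic auth header for you:

```shell
curl -X POST -u foobar:password -w "\n%{http_code}\n" http://localhost:5000/api/ratelimited/simple
```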
You should get a 200 OK response back.

#Fixed window Lua script

We are going to build a fixed window rate limiting script. Given the apiKey foobar hitting our route api/ratelimited/simple at 12:00:05 with a 60-second window and a limit of ten requests, we need to:
  1. Format a key from our info, e.g. route:apiKey:time-window — in our case, api/ratelimited/simple:foobar:12:00
  2. Increment the current value of that key
  3. Set the expiration for that key to 60 seconds
  4. If the current value is less than the max requests allowed, return 0 (not rate limited)
  5. Otherwise, return 1 (rate limited)
The issue we need to contend with is that this rate limiting requires atomicity for all our commands. Because of this, we will run everything on the server through a Lua script. StackExchange.Redis contains support for a more readable mode of scripting that lets you name arguments to your script:
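One version of the script in that named-argument style (the @-prefixed names are StackExchange.Redis placeholders, not raw Lua; the comparison direction is an assumption consistent with a cap of maxRequests per window):

```lua
local requests = redis.call('INCR', @key)
redis.call('EXPIRE', @key, @expiry)
if requests <= tonumber(@maxRequests) then
    return 0
else
    return 1
end
```

Because the key embeds the current minute, a fresh key (and therefore a fresh counter) is used for each window, and the EXPIRE keeps stale windows from accumulating in memory.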

#Loading the script

Add a new file Scripts.cs to the project with a static class called Scripts. This will contain the script string and a getter property to prepare the script for execution:
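A sketch of that file; LuaScript.Prepare parses the @-placeholders so the script can later be invoked with a plain parameter object:

```csharp
using StackExchange.Redis;

namespace FixedRateLimiter
{
    public static class Scripts
    {
        // Prepared form of the script, ready for ScriptEvaluateAsync
        public static LuaScript RateLimitScript => LuaScript.Prepare(RATE_LIMITER);

        private const string RATE_LIMITER = @"
            local requests = redis.call('INCR', @key)
            redis.call('EXPIRE', @key, @expiry)
            if requests <= tonumber(@maxRequests) then
                return 0
            else
                return 1
            end";
    }
}
```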

#Executing the script

Build the key, run the script, and check the result. Add the following just ahead of the return in the Simple method:
The complete Simple route should look like this:
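Putting it together, the route might read as follows (the 60-second expiry and 10-request cap are this example's assumptions):

```csharp
[HttpPost("simple")]
public async Task<IActionResult> Simple([FromHeader] string authorization)
{
    var encoded = authorization.Split(' ')[1];
    var apiKeyAndSecret = Encoding.UTF8.GetString(Convert.FromBase64String(encoded));
    var apiKey = apiKeyAndSecret.Split(':')[0];

    // Key the counter on route + API key + current minute
    var key = $"{Request.Path.Value}:{apiKey}:{DateTime.UtcNow:HH:mm}";
    var res = await _db.ScriptEvaluateAsync(Scripts.RateLimitScript,
        new { key = new RedisKey(key), expiry = 60, maxRequests = 10 });

    return (int)res == 1 ? new StatusCodeResult(429) : Ok();
}
```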

#Test the fixed window limiter

Start the server with dotnet run and try running:
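One way to fire 21 requests in a row (assuming bash and the default localhost:5000 endpoint):

```shell
for n in {1..21}; do
    curl -s -o /dev/null -w "%{http_code}\n" -X POST -u foobar:password \
        http://localhost:5000/api/ratelimited/simple
done
```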
You will see some requests return a 200 and at least one return a 429. How many of each depends on when you start sending the requests: they are time-boxed into single-minute windows, so if you cross into the next minute partway through the 21 requests, the counter resets. Expect somewhere between 10 and 20 OK results and between 1 and 11 429 results.

#Sliding window rate limiting with Redis and ASP.NET Core

#What is a sliding window rate limiter?

A sliding window rate limiter, unlike a fixed window, restricts requests for a discrete window prior to the current request under evaluation. For example, if you have a 10 req/minute rate limiter using a fixed window, you could encounter a case where the limiter allows 20 requests within a single minute: if the first 10 requests arrive at the end of one window and the next 10 at the start of the next, both buckets have capacity. A sliding window rate limiter considers all requests within the last 60 seconds regardless of window boundaries, so only 10 would make it through.
Using sorted sets and Lua scripts, implementing a sliding window rate limiter in Redis is straightforward.

#Create the project

In your terminal, create the project, change into the SlidingWindowRateLimiter directory, and add the StackExchange.Redis package:
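Assuming the .NET SDK is installed:

```shell
dotnet new webapi -n SlidingWindowRateLimiter
cd SlidingWindowRateLimiter
dotnet add package StackExchange.Redis
```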
Set up the controller, multiplexer, and route the same way as in the fixed window example. Don't forget to register the ConnectionMultiplexer in Startup.cs:
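As before, assuming a local Redis instance, add to ConfigureServices:

```csharp
services.AddSingleton(ConnectionMultiplexer.Connect("localhost"));
```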

#Sliding window Lua script

To implement this pattern we will:
  1. Create a key of the format route:apiKey that maps to a sorted set in Redis
  2. Check the current time, and remove entries that fall outside the window
  3. Check the cardinality of the sorted set
  4. If the cardinality is less than our limit, add a new member with a score of the current time in seconds and a member value of the current time in microseconds, then set the expiration and return 0
  5. If the cardinality is greater than or equal to our limit, return 1
Everything needs to happen atomically, which makes this a perfect place to use a Lua script prepared with the StackExchange.Redis scripting engine.
Create a Scripts.cs file to hold this script:
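A sketch of that file; the @-prefixed names are StackExchange.Redis placeholders, and the window length and request cap are passed in at execution time:

```csharp
using StackExchange.Redis;

namespace SlidingWindowRateLimiter
{
    public static class Scripts
    {
        public static LuaScript SlidingRateLimiterScript => LuaScript.Prepare(SLIDING_RATE_LIMITER);

        private const string SLIDING_RATE_LIMITER = @"
            local current_time = redis.call('TIME')
            local trim_time = tonumber(current_time[1]) - @window
            -- drop requests that fell out of the window
            redis.call('ZREMRANGEBYSCORE', @key, 0, trim_time)
            local request_count = redis.call('ZCARD', @key)
            if request_count < tonumber(@max_requests) then
                -- score: seconds; member: seconds..microseconds, unique per request
                redis.call('ZADD', @key, current_time[1], current_time[1] .. current_time[2])
                redis.call('EXPIRE', @key, @window)
                return 0
            end
            return 1";
    }
}
```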

#Update the controller

Back in the Sliding method, replace the return statement with the rate limiting check:
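The updated method might look like this (the action name Sliding, the 30-second window, and the 10-request cap are assumptions):

```csharp
[HttpPost("sliding")]
public async Task<IActionResult> Sliding([FromHeader] string authorization)
{
    var encoded = authorization.Split(' ')[1];
    var apiKeyAndSecret = Encoding.UTF8.GetString(Convert.FromBase64String(encoded));
    var apiKey = apiKeyAndSecret.Split(':')[0];

    // One sorted set per route + API key; the script trims and counts it
    var key = $"{Request.Path}:{apiKey}";
    var res = await _db.ScriptEvaluateAsync(Scripts.SlidingRateLimiterScript,
        new { key = new RedisKey(key), window = 30, max_requests = 10 });

    return (int)res == 1 ? new StatusCodeResult(429) : Ok();
}
```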

#Test the sliding window limiter

Start the server with dotnet run and try:
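For example, 20 requests in a row (assuming bash and the default localhost:5000 endpoint):

```shell
for n in {1..20}; do
    curl -s -o /dev/null -w "%{http_code}\n" -X POST -u foobar:password \
        http://localhost:5000/api/ratelimited/sliding
done
```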
You will see the first 10 requests return a 200 and the remaining 10 return a 429. If you wait and run the command again, you may see every other request go through, because the window slides every second and only the previous 30 seconds of requests are considered.

#Configurable rate limiting middleware

Let's consider the case where we have multiple endpoints to rate limit. It doesn't make sense to embed rate-limiting logic in each route. Instead, we can build middleware that intercepts requests and checks whether the request is rate-limited before forwarding it to the appropriate endpoint. With some configuration work, we can make this middleware handle a configurable set of limits.

#Create the project

In your terminal, create the project, change into the RateLimitingMiddleware directory, and add the StackExchange.Redis package:
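Assuming the .NET SDK is installed:

```shell
dotnet new webapi -n RateLimitingMiddleware
cd RateLimitingMiddleware
dotnet add package StackExchange.Redis
```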

#Create the configuration object

We'll define configuration objects of the following form in our application configuration:
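One possible shape for a single rule (names match the parameter list below; the values are illustrative):

```json
{
  "Path": "/api/limited",
  "Window": "30s",
  "MaxRequests": 5
}
```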
The parameters are:
  • Path: Literal path to be rate limited; if the path matches completely, it triggers a rate limit check.
  • PathRegex: Regex of paths to be rate limited; if the path matches, it triggers a rate limit check.
  • Window: The sliding window to rate limit on, matching the pattern ([0-9]+(s|m|d|h)).
  • MaxRequests: The maximum number of requests allowable over the period.

#Build the config object

Create a new class called RateLimitRule with the logic for rule matching and time parsing:
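A sketch of that class; the property names mirror the configuration parameters, and the helper member names are assumptions:

```csharp
using System.Text.RegularExpressions;

namespace RateLimitingMiddleware
{
    public class RateLimitRule
    {
        public string Path { get; set; }
        public string PathRegex { get; set; }
        public string Window { get; set; }
        public int MaxRequests { get; set; }

        // Window length in seconds, parsed from patterns like "30s", "15m", "1h", "2d"
        public int WindowSeconds
        {
            get
            {
                var match = Regex.Match(Window, "([0-9]+)(s|m|h|d)");
                var num = int.Parse(match.Groups[1].Value);
                return match.Groups[2].Value switch
                {
                    "m" => num * 60,
                    "h" => num * 3600,
                    "d" => num * 86400,
                    _ => num // "s"
                };
            }
        }

        // A rule applies when the literal path matches exactly,
        // or when the regex matches the request path
        public bool MatchPath(string path) =>
            Path != null ? Path == path : PathRegex != null && Regex.IsMatch(path, PathRegex);
    }
}
```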

#Writing the Lua script

We need a Lua script that considers all the rules applicable to a particular user on a specific endpoint. We'll use sorted sets to check the rate limits for each rule and user. On each request, for each applicable rule, it will:
  1. Check the current time
  2. Trim off entries that fall outside the window
  3. Check if another request violates the rule — if so, return 1
  4. For each applicable rule, add a new entry to the sorted set with the score of the current time in seconds and a member name of the current time in microseconds
  5. Return 0
The number of keys and arguments this script takes is not known ahead of time. Because of that, it's important that all the keys land on the same shard, so when we build the keys (of the form path_pattern:apiKey:window_size_seconds) we wrap the common part of the key, the apiKey, in braces: {apiKey}.
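A sketch of such a script, where ARGV[1] carries the number of rules and is followed by (window, max_requests) pairs, with one sorted-set key per rule in KEYS:

```lua
local current_time = redis.call('TIME')
local num_windows = tonumber(ARGV[1])
-- first pass: reject if any rule is already at its limit
for i = 2, num_windows * 2, 2 do
    local window = tonumber(ARGV[i])
    local max_requests = tonumber(ARGV[i + 1])
    local curr_key = KEYS[i / 2]
    local trim_time = tonumber(current_time[1]) - window
    redis.call('ZREMRANGEBYSCORE', curr_key, 0, trim_time)
    if redis.call('ZCARD', curr_key) >= max_requests then
        return 1
    end
end
-- second pass: record this request against every rule
for i = 2, num_windows * 2, 2 do
    local curr_key = KEYS[i / 2]
    redis.call('ZADD', curr_key, current_time[1], current_time[1] .. current_time[2])
    redis.call('EXPIRE', curr_key, tonumber(ARGV[i]))
end
return 0
```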

#Build the middleware

Add a new file SlidingWindowRateLimiter.cs with two classes:
In the SlidingWindowRateLimiter class, add the script and constructor:
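A sketch of the file, with the script held as a constant (the class and member names are assumptions):

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Configuration;
using StackExchange.Redis;

namespace RateLimitingMiddleware
{
    public class SlidingWindowRateLimiter
    {
        private readonly RequestDelegate _next;
        private readonly IDatabase _db;
        private readonly IConfiguration _config;

        // Multi-rule sliding window script: ARGV[1] = rule count,
        // then (window, max_requests) pairs; KEYS = one sorted set per rule
        private const string SLIDING_RATE_LIMITER = @"
            local current_time = redis.call('TIME')
            local num_windows = tonumber(ARGV[1])
            for i = 2, num_windows * 2, 2 do
                local window = tonumber(ARGV[i])
                local max_requests = tonumber(ARGV[i + 1])
                local curr_key = KEYS[i / 2]
                local trim_time = tonumber(current_time[1]) - window
                redis.call('ZREMRANGEBYSCORE', curr_key, 0, trim_time)
                if redis.call('ZCARD', curr_key) >= max_requests then
                    return 1
                end
            end
            for i = 2, num_windows * 2, 2 do
                local curr_key = KEYS[i / 2]
                redis.call('ZADD', curr_key, current_time[1], current_time[1] .. current_time[2])
                redis.call('EXPIRE', curr_key, tonumber(ARGV[i]))
            end
            return 0";

        public SlidingWindowRateLimiter(RequestDelegate next, ConnectionMultiplexer mux, IConfiguration config)
        {
            _next = next;
            _db = mux.GetDatabase();
            _config = config;
        }
    }

    // Extension class so the middleware can be registered fluently
    public static class SlidingWindowRateLimiterExtensions
    {
        public static IApplicationBuilder UseSlidingWindowRateLimiter(this IApplicationBuilder builder) =>
            builder.UseMiddleware<SlidingWindowRateLimiter>();
    }
}
```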

#Extract the API key

We use Basic auth, so we'll extract the username from the Basic auth header as the API key:
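A possible helper for that, assuming it lives in the middleware class and `using System; using System.Text;` at the top of the file (it returns null when no usable Basic header is present):

```csharp
private static string GetApiKey(HttpContext context)
{
    // Header form: "Basic base64(apiKey:apiSecret)"
    var header = context.Request.Headers["Authorization"].ToString();
    if (string.IsNullOrEmpty(header) || !header.StartsWith("Basic "))
        return null;
    var apiKeyAndSecret = Encoding.UTF8.GetString(Convert.FromBase64String(header.Split(' ')[1]));
    return apiKeyAndSecret.Split(':')[0];
}
```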

#Extract applicable rules

Pull out the RedisRateLimits configuration section, filter to rules matching the current path, group by window size and path key, and take the most restrictive rule for each group:
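One way to express that with LINQ (assuming the rules bind from the RedisRateLimits section into RateLimitRule objects; the method name is an assumption):

```csharp
private IEnumerable<RateLimitRule> GetApplicableRules(HttpContext context)
{
    var rules = _config.GetSection("RedisRateLimits").Get<RateLimitRule[]>()
                ?? Array.Empty<RateLimitRule>();
    return rules
        .Where(r => r.MatchPath(context.Request.Path))
        // keep only the most restrictive rule per window size + path key
        .GroupBy(r => new { r.WindowSeconds, PathKey = r.Path ?? r.PathRegex })
        .Select(g => g.OrderBy(r => r.MaxRequests).First())
        .ToArray();
}
```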

#Check limitation

Build the keys and arguments for the Lua script, evaluate it, and check the result:
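A sketch of that check, where a script result of 1 means throttle (the method name is an assumption):

```csharp
private async Task<bool> IsLimited(IEnumerable<RateLimitRule> rules, string apiKey)
{
    var keys = new List<RedisKey>();
    var args = new List<RedisValue> { rules.Count() };
    foreach (var rule in rules)
    {
        // brace the apiKey so every key hashes to the same shard
        keys.Add($"{rule.Path ?? rule.PathRegex}:{{{apiKey}}}:{rule.WindowSeconds}");
        args.Add(rule.WindowSeconds);
        args.Add(rule.MaxRequests);
    }
    var res = await _db.ScriptEvaluateAsync(SLIDING_RATE_LIMITER, keys.ToArray(), args.ToArray());
    return (int)res == 1;
}
```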

#Block or allow

In the InvokeAsync method, glue everything together. Parse the API key, check the rate limits, and either throttle or proceed:
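A plausible version, assuming helper methods named GetApiKey, GetApplicableRules, and IsLimited on the middleware (hypothetical names for this sketch):

```csharp
public async Task InvokeAsync(HttpContext httpContext)
{
    var apiKey = GetApiKey(httpContext);
    if (string.IsNullOrEmpty(apiKey))
    {
        httpContext.Response.StatusCode = 401; // no usable credentials
        return;
    }

    var applicableRules = GetApplicableRules(httpContext);
    if (await IsLimited(applicableRules, apiKey))
    {
        httpContext.Response.StatusCode = 429; // throttled
        return;
    }

    await _next(httpContext); // hand off to the rest of the pipeline
}
```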

#Build the controller

Under the Controllers folder, add a RateLimitedController with two routes:
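A sketch with route names chosen to line up with the configuration discussed in this section (both names are assumptions):

```csharp
using Microsoft.AspNetCore.Mvc;

namespace RateLimitingMiddleware.Controllers
{
    [ApiController]
    [Route("api")]
    public class RateLimitedController : ControllerBase
    {
        // matched by a literal "/api/limited" rule and an "^/api/*" regex rule
        [HttpGet("limited")]
        public IActionResult Limited() => Ok("I am limited");

        // matched only by the "^/api/*" regex rule
        [HttpGet("indirectly-limited")]
        public IActionResult IndirectlyLimited() => Ok("I am indirectly limited");
    }
}
```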

#Add middleware to the app

Open Startup.cs. In the ConfigureServices method, add:
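Assuming a local Redis instance:

```csharp
services.AddSingleton(ConnectionMultiplexer.Connect("localhost"));
```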
In the Configure method, add:
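Register the middleware ahead of the endpoints so it can short-circuit throttled requests:

```csharp
app.UseMiddleware<SlidingWindowRateLimiter>();
```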

#Configure the rate limits

In appsettings.json or appsettings.Development.json, add:
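A configuration consistent with the test walkthrough in this section (five requests per 30 seconds on /api/limited, ten per hour across all /api/* routes) might be:

```json
"RedisRateLimits": [
  { "Path": "/api/limited", "Window": "30s", "MaxRequests": 5 },
  { "PathRegex": "^/api/*", "Window": "1h", "MaxRequests": 10 }
]
```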

#Test the middleware

Run dotnet run and hit the endpoints. First, test the directly limited endpoint:
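For example, seven requests in a row (assuming bash and the default localhost:5000 endpoint):

```shell
for n in {1..7}; do
    curl -s -o /dev/null -w "%{http_code}\n" -u foobar:password \
        http://localhost:5000/api/limited
done
```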
This will send seven requests, two of which will be rejected. After that, test the indirectly limited endpoint:
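Then seven more against the other route:

```shell
for n in {1..7}; do
    curl -s -o /dev/null -w "%{http_code}\n" -u foobar:password \
        http://localhost:5000/api/indirectly-limited
done
```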
It should reject another two requests as throttled, since the /api/* regex rule applies to both endpoints and the earlier requests against /limited count toward the hourly quota.
