Rate Limiting, Throttling, and Debouncing are three distinct approaches to controlling function execution frequency. Each technique blocks executions differently, making them "lossy" - meaning some function calls will not execute when they are requested to run too frequently. Understanding when to use each approach is crucial for building performant and reliable applications. This guide will cover the Rate Limiting concepts of TanStack Pacer.
Note
TanStack Pacer is currently only a front-end library. These are utilities for client-side rate-limiting.
Rate Limiting is a technique that limits the rate at which a function can execute over a specific time window. It is particularly useful for scenarios where you want to prevent a function from being called too frequently, such as when handling API requests or other external service calls. It is the most naive approach, as it allows executions to happen in bursts until the quota is met.
Rate Limiting (limit: 3 calls per window)
Timeline: [1 second per tick]
                         Window 1                         |        Window 2
Calls:       ⬇️    ⬇️    ⬇️    ⬇️    ⬇️                               ⬇️    ⬇️
Executed:    ✅    ✅    ✅    ❌    ❌                               ✅    ✅
            [=== 3 allowed ===][=== blocked until window ends ===][=== new window =======]
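To make the burst behavior concrete, here is a minimal plain-TypeScript sketch of the idea (an illustration only, not Pacer's implementation): count calls in the current window, allow them until the limit is hit, then reject until a new window starts.

// Illustration only: a naive fixed-window rate limiter
function naiveRateLimit<TArgs extends Array<unknown>>(
  fn: (...args: TArgs) => void,
  limit: number,
  windowMs: number,
) {
  let windowStart = 0
  let count = 0

  return (...args: TArgs): boolean => {
    const now = Date.now()
    if (now - windowStart >= windowMs) {
      // The previous window has ended - start a new one
      windowStart = now
      count = 0
    }
    if (count >= limit) {
      return false // rejected: the quota for this window is already used up
    }
    count++
    fn(...args)
    return true // executed
  }
}

// Allows bursts of up to 3 calls, then rejects until the 1-second window resets
const limitedLog = naiveRateLimit(console.log, 3, 1000)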
Rate Limiting is particularly important when dealing with front-end operations that could accidentally overwhelm your back-end services or cause performance issues in the browser.
Common use cases include staying within quotas imposed by external APIs, preventing accidental request spam from rapid user interactions, and guarding against runaway loops that would otherwise hammer a back-end service.
Rate Limiting is the most naive approach to controlling function execution frequency. It is the least flexible and most restrictive of the three techniques. Consider using throttling or debouncing instead when you want executions to be spaced out more evenly.
Tip
For most use cases, you probably don't want rate limiting at all; consider throttling or debouncing instead.
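As a quick sketch of those alternatives using Pacer's debounce and throttle helpers (the fetchSearchResults and updateScrollPosition functions are illustrative placeholders, not part of Pacer):

import { debounce, throttle } from '@tanstack/pacer'

// Debounce: wait until calls stop for 500ms before executing (e.g. search-as-you-type)
const debouncedSearch = debounce(
  (query: string) => fetchSearchResults(query),
  { wait: 500 }
)

// Throttle: execute at most once every 1000ms, evenly spaced (e.g. scroll handlers)
const throttledScroll = throttle(
  () => updateScrollPosition(),
  { wait: 1000 }
)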
Rate Limiting's "lossy" nature also means that some executions will be rejected and lost, which is a problem if you need every call to eventually run. In that case, consider queueing instead: every call is queued up to be executed, but with a throttled delay that slows down the rate of execution (see the sketch below).
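As a rough, plain-TypeScript illustration of that idea (this is not Pacer's queueing API; queueWithDelay and saveToAPI are hypothetical names):

// A minimal sketch of "lossless" queueing with a throttled delay:
// every call eventually executes, but at most one per `delayMs`.
function queueWithDelay<TArgs extends Array<unknown>>(
  fn: (...args: TArgs) => void,
  delayMs: number,
) {
  const pending: Array<TArgs> = []
  let draining = false

  const drain = () => {
    const args = pending.shift()
    if (!args) {
      draining = false
      return
    }
    fn(...args)
    setTimeout(drain, delayMs) // space executions out instead of dropping them
  }

  return (...args: TArgs) => {
    pending.push(args)
    if (!draining) {
      draining = true
      drain()
    }
  }
}

// Usage: every call runs, in order, one per second
const queuedSave = queueWithDelay((id: string) => saveToAPI(id), 1000)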
TanStack Pacer provides both synchronous and asynchronous rate limiting through the RateLimiter and AsyncRateLimiter classes respectively (and their corresponding rateLimit and asyncRateLimit functions).
The rateLimit function is the simplest way to add rate limiting to any function. It's perfect for most use cases where you just need to enforce a simple limit.
import { rateLimit } from '@tanstack/pacer'

// Rate limit API calls to 5 per minute
const rateLimitedApi = rateLimit(
  (id: string) => fetchUserData(id),
  {
    limit: 5,
    window: 60 * 1000, // 1 minute in milliseconds
    onReject: (rateLimiter) => {
      console.log(`Rate limit exceeded. Try again in ${rateLimiter.getMsUntilNextWindow()}ms`)
    }
  }
)

// First 5 calls will execute immediately
rateLimitedApi('user-1') // ✅ Executes
rateLimitedApi('user-2') // ✅ Executes
rateLimitedApi('user-3') // ✅ Executes
rateLimitedApi('user-4') // ✅ Executes
rateLimitedApi('user-5') // ✅ Executes
rateLimitedApi('user-6') // ❌ Rejected until window resets
For more complex scenarios where you need additional control over the rate limiting behavior, you can use the RateLimiter class directly. This gives you access to additional methods and state information.
import { RateLimiter } from '@tanstack/pacer'

// Create a rate limiter instance
const limiter = new RateLimiter(
  (id: string) => fetchUserData(id),
  {
    limit: 5,
    window: 60 * 1000,
    onExecute: (rateLimiter) => {
      console.log('Function executed', rateLimiter.getExecutionCount())
    },
    onReject: (rateLimiter) => {
      console.log(`Rate limit exceeded. Try again in ${rateLimiter.getMsUntilNextWindow()}ms`)
    }
  }
)

// Get information about current state
console.log(limiter.getRemainingInWindow()) // Number of calls remaining in current window
console.log(limiter.getExecutionCount()) // Total number of successful executions
console.log(limiter.getRejectionCount()) // Total number of rejected executions

// Attempt to execute (returns a boolean indicating success)
limiter.maybeExecute('user-1')

// Update options dynamically
limiter.setOptions({ limit: 10 }) // Increase the limit

// Reset all counters and state
limiter.reset()
The RateLimiter class supports enabling/disabling via the enabled option. Using the setOptions method, you can enable/disable the rate limiter at any time:
const limiter = new RateLimiter(fn, {
  limit: 5,
  window: 1000,
  enabled: false // Disable by default
})

limiter.setOptions({ enabled: true }) // Enable at any time
If you are using a framework adapter where the rate limiter options are reactive, you can set the enabled option to a conditional value to enable or disable the rate limiter on the fly. If you are using the rateLimit function or the RateLimiter class directly, however, you must use the setOptions method to change the enabled option, because the options you pass in are only handed to the RateLimiter constructor once, when the instance is created.
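For example, with the React adapter (covered later in this guide), a sketch like the following keeps enabled tied to component state, assuming the adapter re-applies option changes reactively as described above; isEnabled and fetchUserData are illustrative placeholders:

import { useState } from 'react'
import { useRateLimiter } from '@tanstack/react-pacer'

function UserLoader() {
  // Hypothetical app state that controls whether rate-limited fetches may run
  const [isEnabled, setIsEnabled] = useState(false)

  // With reactive options, `enabled` follows component state - no setOptions call needed
  const limiter = useRateLimiter(
    (id: string) => fetchUserData(id),
    { limit: 5, window: 1000, enabled: isEnabled }
  )

  // ...render UI that calls limiter.maybeExecute(...) and setIsEnabled(...)
  return null
}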
Both the synchronous and asynchronous rate limiters support callback options to handle different aspects of the rate limiting lifecycle:
The synchronous RateLimiter supports the following callbacks:
const limiter = new RateLimiter(fn, {
  limit: 5,
  window: 1000,
  onExecute: (rateLimiter) => {
    // Called after each successful execution
    console.log('Function executed', rateLimiter.getExecutionCount())
  },
  onReject: (rateLimiter) => {
    // Called when an execution is rejected
    console.log(`Rate limit exceeded. Try again in ${rateLimiter.getMsUntilNextWindow()}ms`)
  }
})
The onExecute callback is called after each successful execution of the rate-limited function, while the onReject callback is called when an execution is rejected due to rate limiting. These callbacks are useful for tracking executions, updating UI state, or providing feedback to users.
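For instance, a small sketch that uses these callbacks to keep a user-facing status message up to date (saveDraft and statusMessage are illustrative names, not part of Pacer):

import { RateLimiter } from '@tanstack/pacer'

// Illustrative only: track feedback for the user via the lifecycle callbacks
let statusMessage = ''

const draftLimiter = new RateLimiter(
  (draft: string) => saveDraft(draft), // saveDraft is a hypothetical app function
  {
    limit: 3,
    window: 5000,
    onExecute: (rateLimiter) => {
      statusMessage = `Draft saved (${rateLimiter.getExecutionCount()} saves so far)`
    },
    onReject: (rateLimiter) => {
      const seconds = Math.ceil(rateLimiter.getMsUntilNextWindow() / 1000)
      statusMessage = `Too many saves - try again in ${seconds}s`
    }
  }
)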
The asynchronous AsyncRateLimiter supports additional callbacks for error handling:
import { AsyncRateLimiter } from '@tanstack/pacer'

const asyncLimiter = new AsyncRateLimiter(async (id) => {
  await saveToAPI(id)
}, {
  limit: 5,
  window: 1000,
  onExecute: (rateLimiter) => {
    // Called after each successful execution
    console.log('Async function executed', rateLimiter.getExecutionCount())
  },
  onReject: (rateLimiter) => {
    // Called when an execution is rejected
    console.log(`Rate limit exceeded. Try again in ${rateLimiter.getMsUntilNextWindow()}ms`)
  },
  onError: (error) => {
    // Called if the async function throws an error
    console.error('Async function failed:', error)
  }
})
The onExecute and onReject callbacks work the same way as in the synchronous rate limiter, while the onError callback allows you to handle errors gracefully without breaking the rate limiting chain. These callbacks are particularly useful for tracking execution counts, updating UI state, handling errors, and providing feedback to users.
Use the AsyncRateLimiter (or the asyncRateLimit function) when you are rate limiting asynchronous functions and need to await their results, know whether a given call actually executed, or handle errors thrown by the wrapped function:
import { asyncRateLimit } from '@tanstack/pacer'

const rateLimited = asyncRateLimit(
  async (id: string) => {
    const response = await fetch(`/api/data/${id}`)
    return response.json()
  },
  {
    limit: 5,
    window: 1000,
    onError: (error) => {
      console.error('API call failed:', error)
    }
  }
)

// Returns a Promise<boolean> - resolves to true if executed, false if rejected
const wasExecuted = await rateLimited('123')
The async version provides Promise-based execution tracking, error handling through the onError callback, proper cleanup of pending async operations, and an awaitable maybeExecute method.
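For instance, a sketch that awaits maybeExecute directly on an AsyncRateLimiter instance, assuming it resolves to the same executed/rejected boolean described above (saveToAPI is a hypothetical helper, as in the earlier example):

import { AsyncRateLimiter } from '@tanstack/pacer'

const saveLimiter = new AsyncRateLimiter(
  async (id: string) => {
    await saveToAPI(id) // hypothetical API helper
  },
  { limit: 5, window: 1000 }
)

// maybeExecute is awaitable, so you can react to whether the call actually ran
const executed = await saveLimiter.maybeExecute('user-1')
if (!executed) {
  console.log('Rejected - rate limit reached for this window')
}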
Each framework adapter provides hooks that build on top of the core rate limiting functionality to integrate with the framework's state management system. Hooks like createRateLimiter, useRateLimitedCallback, useRateLimitedState, or useRateLimitedValue are available, depending on the framework.
Here are some examples:
import { useState } from 'react'
import { useRateLimiter, useRateLimitedCallback, useRateLimitedValue } from '@tanstack/react-pacer'

// Low-level hook for full control
const limiter = useRateLimiter(
  (id: string) => fetchUserData(id),
  { limit: 5, window: 1000 }
)

// Simple callback hook for basic use cases
const handleFetch = useRateLimitedCallback(
  (id: string) => fetchUserData(id),
  { limit: 5, window: 1000 }
)

// State-based hook for reactive state management
const [instantState, setInstantState] = useState('')
const [rateLimitedState, setRateLimitedState] = useRateLimitedValue(
  instantState, // Value to rate limit
  { limit: 5, window: 1000 }
)
import { createRateLimiter, createRateLimitedSignal } from '@tanstack/solid-pacer'

// Low-level hook for full control
const limiter = createRateLimiter(
  (id: string) => fetchUserData(id),
  { limit: 5, window: 1000 }
)

// Signal-based hook for state management
const [value, setValue, valueLimiter] = createRateLimitedSignal('', {
  limit: 5,
  window: 1000,
  onExecute: (limiter) => {
    console.log('Total executions:', limiter.getExecutionCount())
  }
})