Modern web applications must prioritize performance—not only for the end-user experience but also for developer efficiency. In this post, I’ll walk you through how we implemented Redis caching in a real-world, multi-tenant ticketing SaaS platform, built entirely with Next.js. This project enables users to browse events, choose seats, and purchase tickets—handling high traffic, dynamic UI, and data consistency challenges.
The architecture was designed for both scalability and modularity:
- Built with the Next.js App Router
- Backend APIs defined under `app/api` using route segments
- Multiple isolated frontend panels: admin, venue, artist, and marketing
- Database powered by PostgreSQL + Prisma ORM
- Caching layer via Redis for memory-first access
- Frontend data management with React Query
This architecture ensures smooth handling of a complex event and seat management system while maintaining performance across various client roles.
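To make that concrete, the route segments and panels might be laid out roughly like this (folder names are illustrative, not the exact production tree):

```text
app/
  api/
    events/route.ts
    sections/route.ts
    seats/route.ts
  (admin)/
  (venue)/
  (artist)/
  (marketing)/
lib/
  redis.ts
  prisma.ts
```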
We used `ioredis` for Redis integration, with a Dockerized Redis container and a global utility wrapper:
```ts
// lib/redis.ts
import Redis from 'ioredis';

// Single shared client; falls back to the local Docker container in development.
export const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');
```
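For local development, the Dockerized Redis instance can be a single service in `docker-compose.yml`. A minimal sketch, where the image tag, port mapping, and volume name are assumptions rather than the project's actual config:

```yaml
services:
  redis:
    image: redis:7-alpine
    ports:
      - '6379:6379'
    volumes:
      - redis-data:/data

volumes:
  redis-data:
```

Point `REDIS_URL` at `redis://localhost:6379` and the wrapper above picks it up.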
A basic example: caching the `/events` endpoint with Redis:
```ts
// app/api/events/route.ts
import { redis } from '@/lib/redis';
import { prisma } from '@/lib/prisma';

export async function GET() {
  const cacheKey = 'events:all';

  // Serve from Redis when a fresh copy exists.
  const cached = await redis.get(cacheKey);
  if (cached) {
    return Response.json(JSON.parse(cached));
  }

  // Otherwise hit PostgreSQL and cache the result for 60 seconds.
  const events = await prisma.event.findMany();
  await redis.set(cacheKey, JSON.stringify(events), 'EX', 60);

  return Response.json(events);
}
```
This reduces database round-trips and significantly improves response times.
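Since the get/parse/set dance repeats across routes, it can be worth pulling into a small helper. The sketch below is my own abstraction (the `cacheAside` name and `lib/cache.ts` location are not part of the original project):

```ts
// lib/cache.ts — hypothetical cache-aside helper.
import { redis } from '@/lib/redis';

export async function cacheAside<T>(
  key: string,
  ttlSeconds: number,
  loader: () => Promise<T>
): Promise<T> {
  // Serve the cached copy if one exists.
  const cached = await redis.get(key);
  if (cached) return JSON.parse(cached) as T;

  // Otherwise load from the database and cache the result.
  const fresh = await loader();
  await redis.set(key, JSON.stringify(fresh), 'EX', ttlSeconds);
  return fresh;
}
```

With a helper like that, the `/events` handler collapses to `cacheAside('events:all', 60, () => prisma.event.findMany())`.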
Every route in the project is backed by a service class for business logic separation. Here’s how we integrated Redis into a gateway-level service:
```ts
// Inside the gateway-level service class.
public static listAllSections = async (
  eventSessionOrder: EventSessionOrderWithUserData
): Promise<Section[]> => {
  const cacheKey = `sections:${eventSessionOrder.eventSession.venue.venueSlug}`;

  const cached = await redis.get(cacheKey);
  if (cached) {
    return JSON.parse(cached);
  }

  const sections = await prisma.section.findMany({
    where: {
      venueSlug: eventSessionOrder.eventSession.venue.venueSlug,
      deletedAt: null,
    },
  });

  await redis.set(cacheKey, JSON.stringify(sections), 'EX', 60);
  return sections;
};
```
Each venue’s sections are cached for 60 seconds, providing a major performance boost during UI-heavy steps like seat selection.
Functions like `listAllSeats` involve user-specific logic. Seats booked by other users should show as unavailable, but if the current user already holds them, they should appear as selected. For such logic:

- Use a composite cache key like `seats:{eventSessionSlug}:{sectionId}`
- Keep a cached list of globally unavailable seats
- Merge that list with the user's own selections at request time
This hybrid approach prevents stale or incorrect seat states from being shown.
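To make the hybrid approach concrete, here is a rough sketch of how such a lookup could be structured. The Prisma models (`seatReservation`, `seat`) and field names are assumptions for illustration; the real `listAllSeats` differs:

```ts
import { redis } from '@/lib/redis';
import { prisma } from '@/lib/prisma';

type SeatView = { id: string; status: 'available' | 'unavailable' | 'selected' };

export async function listAllSeats(
  eventSessionSlug: string,
  sectionId: string,
  userId: string
): Promise<SeatView[]> {
  const cacheKey = `seats:${eventSessionSlug}:${sectionId}`;

  // 1. Globally unavailable seats are the same for everyone, so they are cacheable.
  let unavailable: string[];
  const cached = await redis.get(cacheKey);
  if (cached) {
    unavailable = JSON.parse(cached);
  } else {
    const reservations = await prisma.seatReservation.findMany({
      where: { eventSessionSlug, sectionId },
      select: { seatId: true },
    });
    unavailable = reservations.map((r) => r.seatId);
    await redis.set(cacheKey, JSON.stringify(unavailable), 'EX', 30);
  }

  // 2. The current user's own holds are never cached globally; merge them per request.
  const mine = await prisma.seatReservation.findMany({
    where: { eventSessionSlug, sectionId, userId },
    select: { seatId: true },
  });
  const mineSet = new Set(mine.map((m) => m.seatId));
  const unavailableSet = new Set(unavailable);

  // 3. Resolve each seat's status from the merged view.
  const seats = await prisma.seat.findMany({ where: { sectionId } });
  return seats.map(
    (seat): SeatView => ({
      id: seat.id,
      status: mineSet.has(seat.id)
        ? 'selected'
        : unavailableSet.has(seat.id)
        ? 'unavailable'
        : 'available',
    })
  );
}
```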
Here are real-world benchmarks from our production environment:
| Endpoint | No Cache (ms) | Redis Cache (ms) |
|---|---|---|
| /api/events | 420 | 47 |
| /api/seats | 580 | 63 |
| /api/sections | 310 | 35 |
| /api/sections/seats | 670 | 91 |
Massive reductions in latency, especially in session-intensive views like the seat selector.
We used `@tanstack/react-query` for frontend data fetching with caching and background sync support:
```ts
useQuery(['events'], fetchEvents, {
  staleTime: 60 * 1000, // treat the data as fresh for one minute
});
```
Combined with Redis, this gives users near-instant loading times and a smoother experience. SSR and SSG methods can also hydrate from the same Redis-backed APIs, ensuring consistency.
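As a sketch of that SSR path (React Query v5's `HydrationBoundary` API shown; `EventList` and `NEXT_PUBLIC_APP_URL` are assumptions, not the project's actual names), a server component can prefetch from the same Redis-backed endpoint and hand the data to the client:

```tsx
// app/events/page.tsx — hypothetical server component prefetching the cached API.
import { dehydrate, HydrationBoundary, QueryClient } from '@tanstack/react-query';
import { EventList } from './event-list'; // client component that calls useQuery(['events'], ...)

async function fetchEvents() {
  const res = await fetch(`${process.env.NEXT_PUBLIC_APP_URL}/api/events`);
  return res.json();
}

export default async function EventsPage() {
  const queryClient = new QueryClient();
  await queryClient.prefetchQuery({ queryKey: ['events'], queryFn: fetchEvents });

  // The dehydrated state seeds the client cache, so the list renders instantly.
  return (
    <HydrationBoundary state={dehydrate(queryClient)}>
      <EventList />
    </HydrationBoundary>
  );
}
```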
Keeping the cache fresh is as important as caching itself. After admin edits:
```ts
await redis.del('events:all');
await redis.del(`sections:${venueSlug}`);
```
This can be extended with hooks, cron jobs, or event-driven invalidation systems. For dynamic user-based data, shorter TTLs (15–30s) may suffice.
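In practice, the simplest place to wire this in is the mutation handler itself. A rough sketch, where the route path, params shape, and `venueSlug` field are hypothetical:

```ts
// app/api/admin/events/[id]/route.ts — illustrative admin update handler.
import { redis } from '@/lib/redis';
import { prisma } from '@/lib/prisma';

export async function PUT(
  request: Request,
  { params }: { params: { id: string } }
) {
  const payload = await request.json();

  const event = await prisma.event.update({
    where: { id: params.id },
    data: payload,
  });

  // Bust every cache entry the edit could have made stale.
  await redis.del('events:all');
  await redis.del(`sections:${event.venueSlug}`);

  return Response.json(event);
}
```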
This real-world Next.js SaaS project shows how Redis can transform performance—especially for read-heavy, user-sensitive systems like ticketing platforms. From API routes to services, and all the way to the UI, caching smartly saves resources, reduces latency, and delights users.
To implement Redis effectively:
- Choose your cache keys wisely
- Use TTLs to prevent stale data
- Avoid over-caching dynamic user-specific content
Have you added Redis to a large-scale Next.js project? Let me know how it went or what challenges you faced—I'd love to hear your experience!