How to Use NestJS Redis for Scalable Background Task Processing

When building scalable web applications, one common challenge is managing time-consuming operations without impacting the user experience. Tasks like sending emails, processing images, or generating reports can take time, so they’re better handled asynchronously. A powerful solution is using NestJS Redis for background task processing.

In this guide, we’ll show how to integrate NestJS Redis with Bull queues to create a production-ready system for handling background jobs. You’ll learn to configure Redis, register queues, build processors, and manage jobs efficiently using NestJS.

Why Use NestJS Redis for Background Tasks?

NestJS Redis offers several compelling benefits when implementing background task processing:

• Persists jobs: Tasks survive server restarts

• Supports multiple workers: Scale horizontally by adding more processors

• Provides atomic operations: Ensures data consistency

• Offers excellent performance: In-memory operations with optional persistence

• Has a mature ecosystem: Well-tested libraries like Bull for Node.js

Prerequisites and Setup for NestJS Redis Integration

Installing Dependencies

First, let’s set up our NestJS project with the required Redis and Bull dependencies:

# Create a new NestJS project

npm i -g @nestjs/cli

nest new redis-background-tasks

cd redis-background-tasks

# Install Bull and Redis packages

npm install @nestjs/bull bull redis

npm install @types/bull --save-dev

# Additional helpful packages

npm install @nestjs/schedule @nestjs/config

# Validation packages (used by the DTOs later in this guide)

npm install class-validator class-transformer

Redis Installation

You have several options for running Redis:

Option 1: Local Installation

You can run Redis locally or through Docker to power your NestJS Redis queues:

# macOS with Homebrew

brew install redis

redis-server

# Ubuntu/Debian

sudo apt-get install redis-server

sudo systemctl start redis

# Windows (using WSL or Redis for Windows)

Option 2: Docker

docker run -d -p 6379:6379 --name redis redis:7-alpine

Option 3: Docker Compose (recommended for development)

# docker-compose.yml

version: '3.8'

services:

  redis:

    image: redis:7-alpine

    ports:

      - "6379:6379"

    volumes:

      - redis_data:/data

    command: redis-server --appendonly yes

volumes:

  redis_data:
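Whichever option you choose, it's worth confirming Redis is reachable before wiring it into NestJS. A quick check (the container name redis matches the Docker command above):

# Local installation

redis-cli ping   # should reply PONG

# Docker

docker exec -it redis redis-cli ping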

Configuring Redis Connection in NestJS

Environment Configuration

Create a .env file to store your Redis connection details:

# .env

REDIS_HOST=localhost

REDIS_PORT=6379

REDIS_PASSWORD=

NODE_ENV=development

Application Module Setup

The key to integrating Redis with NestJS is configuring the BullModule in your main application module:

// src/app.module.ts

import { Module } from '@nestjs/common';

import { ConfigModule } from '@nestjs/config';

import { BullModule } from '@nestjs/bull';

import { ScheduleModule } from '@nestjs/schedule';




@Module({

  imports: [

    // Global configuration module

    ConfigModule.forRoot({

      isGlobal: true,

    }),

    

    // Enable scheduled jobs

    ScheduleModule.forRoot(),

    

    // Configure Bull with Redis connection

    BullModule.forRoot({

      redis: {

        host: process.env.REDIS_HOST || 'localhost',

        port: parseInt(process.env.REDIS_PORT || '6379', 10),

        password: process.env.REDIS_PASSWORD,

        // Optional: Additional Redis configuration

        maxRetriesPerRequest: 3,

        retryDelayOnFailover: 100,

        lazyConnect: true,

      },

    }),

  ],

})

export class AppModule {}

Creating Task Types and DTOs

Before we begin queuing jobs, we’ll define our task types and Data Transfer Objects (DTOs) using TypeScript. This structure makes it easier to process different job types within our NestJS Redis system.

// src/tasks/dto/create-task.dto.ts

import { IsString, IsOptional, IsObject, IsEnum } from 'class-validator';

export enum TaskType {

  EMAIL = 'email',

  IMAGE_PROCESSING = 'image_processing',

  REPORT_GENERATION = 'report_generation',

}

export class CreateTaskDto {

  @IsEnum(TaskType)

  type: TaskType;

  @IsString()

  title: string;

  @IsOptional()

  @IsString()

  description?: string;

  @IsOptional()

  @IsObject()

  payload?: any;

  @IsOptional()

  priority?: number;

  @IsOptional()

  delay?: number;

}
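These class-validator decorators only take effect once validation is enabled. Below is a minimal bootstrap sketch, assuming the default main.ts, that class-transformer is installed alongside class-validator, and that the application uses an "api" global prefix (the curl examples later in this guide assume that prefix):

// src/main.ts
import { NestFactory } from '@nestjs/core';
import { ValidationPipe } from '@nestjs/common';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);

  // Validate and transform incoming DTOs (class-validator + class-transformer)
  app.useGlobalPipes(new ValidationPipe({ whitelist: true, transform: true }));

  // Optional: prefix all routes with /api (the curl examples below assume this)
  app.setGlobalPrefix('api');

  await app.listen(3000);
}
bootstrap();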

Setting Up Redis-Backed Queues

To register queues in your NestJS Redis application, create a dedicated TasksModule and define each queue using the BullModule.registerQueue() method.

Queue Registration

Create a tasks module that registers multiple Redis-backed queues:

// src/tasks/tasks.module.ts

import { Module } from '@nestjs/common';

import { BullModule } from '@nestjs/bull';

import { TasksController } from './tasks.controller';

import { TasksService } from './tasks.service';

import { EmailProcessor } from './processors/email.processor';

import { ImageProcessor } from './processors/image.processor';

// Not shown in this guide; follows the same pattern as EmailProcessor

import { ReportProcessor } from './processors/report.processor';




@Module({

  imports: [

    // Email processing queue

    BullModule.registerQueue({

      name: 'email',

      defaultJobOptions: {

        removeOnComplete: 10,    // Keep 10 completed jobs

        removeOnFail: 5,         // Keep 5 failed jobs

        attempts: 3,             // Retry failed jobs 3 times

        backoff: {

          type: 'exponential',

          delay: 2000,

        },

      },

    }),

    

    // Image processing queue

    BullModule.registerQueue({

      name: 'image-processing',

      defaultJobOptions: {

        removeOnComplete: 10,

        removeOnFail: 5,

        attempts: 5,             // More retries for image processing

        backoff: {

          type: 'exponential',

          delay: 5000,

        },

      },

    }),

    

    // Report generation queue

    BullModule.registerQueue({

      name: 'report-generation',

      defaultJobOptions: {

        removeOnComplete: 5,     // Fewer completed jobs (reports are large)

        removeOnFail: 3,

        attempts: 2,             // Fewer retries (reports take longer)

      },

    }),

  ],

  controllers: [TasksController],

  providers: [TasksService, EmailProcessor, ImageProcessor, ReportProcessor],

  exports: [TasksService],

})

export class TasksModule {}
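TasksModule only takes effect once the root module imports it. A small addition to the AppModule configured earlier:

// src/app.module.ts (excerpt)
import { TasksModule } from './tasks/tasks.module';

@Module({
  imports: [
    // ...ConfigModule, ScheduleModule, and BullModule.forRoot() as shown earlier
    TasksModule,
  ],
})
export class AppModule {}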

Task Service Implementation

The service layer handles Redis queue operations:

// src/tasks/tasks.service.ts

import { Injectable, Logger } from '@nestjs/common';

import { InjectQueue } from '@nestjs/bull';

import { Queue } from 'bull';

import { CreateTaskDto, TaskType } from './dto/create-task.dto';




@Injectable()

export class TasksService {

  private readonly logger = new Logger(TasksService.name);




  constructor(

    @InjectQueue('email') private emailQueue: Queue,

    @InjectQueue('image-processing') private imageQueue: Queue,

    @InjectQueue('report-generation') private reportQueue: Queue,

  ) {}




  async createTask(createTaskDto: CreateTaskDto) {

    const { type, title, description, payload, priority = 0, delay = 0 } = createTaskDto;

    

    // Select the appropriate Redis queue based on task type

    let queue: Queue;

    const jobData = {

      title,

      description,

      payload,

      createdAt: new Date(),

    };




    switch (type) {

      case TaskType.EMAIL:

        queue = this.emailQueue;

        break;

      case TaskType.IMAGE_PROCESSING:

        queue = this.imageQueue;

        break;

      case TaskType.REPORT_GENERATION:

        queue = this.reportQueue;

        break;

      default:

        throw new Error(`Unknown task type: ${type}`);

    }




    // Add job to Redis queue with configuration

    const job = await queue.add(type, jobData, {

      priority,                    // Higher number = higher priority

      delay,                      // Delay execution (milliseconds)

      attempts: 3,                // Number of retry attempts

      backoff: {                  // Retry strategy

        type: 'exponential',

        delay: 2000,

      },

      removeOnComplete: 10,       // Cleanup policy

      removeOnFail: 5,

    });




    this.logger.log(`Created ${type} task with ID: ${job.id} in Redis`);

    

    return {

      id: job.id,

      type,

      title,

      status: 'queued',

      createdAt: new Date(),

    };

  }




  // Monitor Redis queue statistics

  async getQueueStats() {

    const emailStats = await this.getQueueStatus(this.emailQueue, 'email');

    const imageStats = await this.getQueueStatus(this.imageQueue, 'image-processing');

    const reportStats = await this.getQueueStatus(this.reportQueue, 'report-generation');




    return {

      email: emailStats,

      imageProcessing: imageStats,

      reportGeneration: reportStats,

      totalQueues: 3,

      redisConnection: 'active',

    };

  }




  private async getQueueStatus(queue: Queue, name: string) {

    const waiting = await queue.getWaiting();

    const active = await queue.getActive();

    const completed = await queue.getCompleted();

    const failed = await queue.getFailed();




    return {

      name,

      waiting: waiting.length,

      active: active.length,

      completed: completed.length,

      failed: failed.length,

      total: waiting.length + active.length + completed.length + failed.length,

    };

  }
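
  // The controller shown later exposes GET :queueName/:jobId and calls getJobDetails().
  // A minimal sketch of that method, assuming the three queues registered above:
  async getJobDetails(queueName: string, jobId: string) {
    const queues: Record<string, Queue> = {
      email: this.emailQueue,
      'image-processing': this.imageQueue,
      'report-generation': this.reportQueue,
    };

    const queue = queues[queueName];
    if (!queue) {
      throw new Error(`Unknown queue: ${queueName}`);
    }

    const job = await queue.getJob(jobId);
    if (!job) {
      return { id: jobId, status: 'not_found' };
    }

    return {
      id: job.id,
      name: job.name,
      state: await job.getState(),
      progress: job.progress(),
      attemptsMade: job.attemptsMade,
      failedReason: job.failedReason,
      data: job.data,
    };
  }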

}

Building Redis Job Processors

Now we’ll implement the actual job processors. Each processor listens to a specific Redis queue and handles the job data. This pattern is common in NestJS Redis applications using Bull.

Email Processor with Progress Tracking

// src/tasks/processors/email.processor.ts

import { Processor, Process } from '@nestjs/bull';

import { Logger } from '@nestjs/common';

import { Job } from 'bull';




@Processor('email')  // Links to Redis queue named 'email'

export class EmailProcessor {

  private readonly logger = new Logger(EmailProcessor.name);




  @Process('email')

  async handleEmailJob(job: Job) {

    this.logger.log(`Processing email job ${job.id} from Redis`);

    const { title, payload } = job.data;

    try {

      // Simulate email processing with progress updates stored in Redis

      await job.progress(25);

      await this.simulateWork(1000);

      

      await job.progress(50);

      this.logger.log(`Preparing email: ${title}`);

      await this.simulateWork(1000);

      await job.progress(75);

      this.logger.log(`Sending email to: ${payload?.recipient || 'default@example.com'}`);

      await this.simulateWork(1000);

      await job.progress(100);

      this.logger.log(`Email job ${job.id} completed successfully`);

      

      // Return result that will be stored in Redis

      return { 

        status: 'sent', 

        recipient: payload?.recipient || 'default@example.com',

        sentAt: new Date(),

        jobId: job.id,

      };

    } catch (error) {

      this.logger.error(`Email job ${job.id} failed:`, error);

      throw error; // Bull will handle retry logic based on queue configuration

    }

  }

  private simulateWork(ms: number): Promise<void> {

    return new Promise(resolve => setTimeout(resolve, ms));

  }

}
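Besides the @Process handler, @nestjs/bull provides queue event decorators such as @OnQueueActive, @OnQueueCompleted, and @OnQueueFailed for observing the job lifecycle. The sketch below is optional and illustrative; placing the listeners in a separate class is an assumption (the same decorators can sit on EmailProcessor itself), and any new class must be added to the module's providers array:

// src/tasks/processors/email.events.ts (illustrative)
import { Processor, OnQueueActive, OnQueueCompleted, OnQueueFailed } from '@nestjs/bull';
import { Logger } from '@nestjs/common';
import { Job } from 'bull';

@Processor('email')
export class EmailQueueEvents {
  private readonly logger = new Logger(EmailQueueEvents.name);

  @OnQueueActive()
  onActive(job: Job) {
    this.logger.log(`Email job ${job.id} started (attempt ${job.attemptsMade + 1})`);
  }

  @OnQueueCompleted()
  onCompleted(job: Job, result: any) {
    this.logger.log(`Email job ${job.id} completed: ${JSON.stringify(result)}`);
  }

  @OnQueueFailed()
  onFailed(job: Job, error: Error) {
    this.logger.error(`Email job ${job.id} failed: ${error.message}`);
  }
}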

Image Processing Processor

// src/tasks/processors/image.processor.ts

import { Processor, Process } from '@nestjs/bull';

import { Logger } from '@nestjs/common';

import { Job } from 'bull';


@Processor('image-processing')

export class ImageProcessor {

  private readonly logger = new Logger(ImageProcessor.name);


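  // Tip: @Process also accepts an options object, e.g.
  // @Process({ name: 'image_processing', concurrency: 3 })
  // to let this worker handle several image jobs in parallel.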
  @Process('image_processing')

  async handleImageProcessing(job: Job) {

    this.logger.log(`Processing image job ${job.id} from Redis queue`);
   

    const { title, payload } = job.data;
    

    try {

      // Simulate complex image processing with detailed progress

      await job.progress(10);

      this.logger.log(`Loading image: ${payload?.filename || 'default.jpg'}`);

      await this.simulateWork(2000);

      

      await job.progress(30);

      this.logger.log(`Resizing image...`);

      await this.simulateWork(3000);

      

      await job.progress(60);

      this.logger.log(`Applying filters...`);

      await this.simulateWork(2000);

      

      await job.progress(80);

      this.logger.log(`Optimizing image...`);

      await this.simulateWork(1500);

      

      await job.progress(100);

      this.logger.log(`Image processing job ${job.id} completed`);

      

      return { 

        status: 'processed', 

        originalFile: payload?.filename || 'default.jpg',

        processedFile: `processed_${payload?.filename || 'default.jpg'}`,

        processedAt: new Date(),

        optimizationRatio: '75%',

        jobId: job.id,

      };

    } catch (error) {

      this.logger.error(`Image processing job ${job.id} failed:`, error);

      throw error;

    }

  }


  private simulateWork(ms: number): Promise<void> {

    return new Promise(resolve => setTimeout(resolve, ms));

  }

}

Redis Queue Monitoring and Management

Queue Statistics API

Create endpoints to monitor your Redis queues:

// src/tasks/tasks.controller.ts

import { Controller, Post, Get, Body, Param } from '@nestjs/common';

import { TasksService } from './tasks.service';

import { CreateTaskDto } from './dto/create-task.dto';




@Controller('tasks')

export class TasksController {

  constructor(private readonly tasksService: TasksService) {}




  @Post()

  async createTask(@Body() createTaskDto: CreateTaskDto) {

    return this.tasksService.createTask(createTaskDto);

  }




  @Get('stats')

  async getQueueStats() {

    return this.tasksService.getQueueStats();

  }




  @Get(':queueName/:jobId')

  async getJobDetails(

    @Param('queueName') queueName: string,

    @Param('jobId') jobId: string,

  ) {

    return this.tasksService.getJobDetails(queueName, jobId);

  }

}


Advanced NestJS Redis Configuration for Production

Connection Pooling and Resilience

For production environments, enhance your Redis configuration:

// src/app.module.ts - Enhanced Redis configuration

BullModule.forRoot({

  redis: {

    host: process.env.REDIS_HOST || 'localhost',

    port: parseInt(process.env.REDIS_PORT || '6379', 10),

    password: process.env.REDIS_PASSWORD,

    

    // Connection pool settings

    maxRetriesPerRequest: 3,

    retryDelayOnFailover: 100,

    lazyConnect: true,

    keepAlive: 30000,

    

    // Prefer IPv4 for the connection

    family: 4,

    // Note: maxmemory-policy is a Redis *server* setting (see the production

    // Redis configuration later in this guide), not a client connection option

    

    // Cluster support (if using Redis Cluster)

    // enableReadyCheck: false,

    // maxRetriesPerRequest: null,

  },

  

  // Global default job options

  defaultJobOptions: {

    removeOnComplete: 100,

    removeOnFail: 50,

    attempts: 3,

    backoff: {

      type: 'exponential',

      delay: 2000,

    },

  },

}),
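Since the project already uses @nestjs/config, an alternative is to resolve the connection settings through ConfigService with BullModule.forRootAsync. A minimal sketch (ConfigModule and ConfigService come from '@nestjs/config'):

// src/app.module.ts (alternative, async configuration)
BullModule.forRootAsync({
  imports: [ConfigModule],
  inject: [ConfigService],
  useFactory: (config: ConfigService) => ({
    redis: {
      host: config.get<string>('REDIS_HOST', 'localhost'),
      port: parseInt(config.get<string>('REDIS_PORT', '6379'), 10),
      password: config.get<string>('REDIS_PASSWORD'),
    },
  }),
}),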

Scheduled Jobs Using NestJS Redis

Implement cron jobs that create background tasks:

// src/jobs/jobs.service.ts

import { Injectable, Logger } from '@nestjs/common';

import { Cron, CronExpression } from '@nestjs/schedule';

import { TasksService } from '../tasks/tasks.service';

import { TaskType } from '../tasks/dto/create-task.dto';




@Injectable()

export class JobsService {

  private readonly logger = new Logger(JobsService.name);




  constructor(private readonly tasksService: TasksService) {}




  // Scheduled job that adds tasks to Redis every 5 minutes

  @Cron(CronExpression.EVERY_5_MINUTES)

  async handleScheduledReports() {

    this.logger.log('Adding scheduled report to Redis queue...');

    

    await this.tasksService.createTask({

      type: TaskType.REPORT_GENERATION,

      title: 'Scheduled System Health Report',

      description: 'Automated system health report generation',

      payload: {

        type: 'system_health',

        automated: true,

        timestamp: new Date(),

      },

    });

  }

}
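For the cron job to run, JobsService must be registered in a module that imports TasksModule, and that module must itself be imported by AppModule. The file name below is an assumption following the project layout used in this guide:

// src/jobs/jobs.module.ts (assumed layout)
import { Module } from '@nestjs/common';
import { TasksModule } from '../tasks/tasks.module';
import { JobsService } from './jobs.service';

@Module({
  imports: [TasksModule],   // gives JobsService access to the exported TasksService
  providers: [JobsService],
})
export class JobsModule {}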

Testing Redis Integration

API Testing Examples

Test your Redis-backed background tasks:

# Create an email task

curl -X POST http://localhost:3000/api/tasks \

  -H "Content-Type: application/json" \

  -d '{

    "type": "email",

    "title": "Welcome Email",

    "description": "Send welcome email to new user",

    "payload": {

      "recipient": "user@example.com",

      "template": "welcome"

    },

    "priority": 5

  }'


# Check queue statistics

curl http://localhost:3000/api/tasks/stats


# Monitor job progress

curl http://localhost:3000/api/tasks/email/1

Redis CLI Monitoring

Monitor your queues directly in Redis:

# Connect to Redis CLI

redis-cli

# View all keys (fine for development; prefer SCAN over KEYS in production)

KEYS *

# Monitor queue activity

MONITOR

# Check the waiting-list length for the email queue (Bull stores it under the "wait" key)

LLEN "bull:email:wait"

# View job data

HGETALL "bull:email:1"

Best Practices for Redis Background Tasks

1. Queue Organization

•  Use separate queues for different task types

•  Configure appropriate cleanup policies

•  Set reasonable retry limits

2. Error Handling

•  Implement exponential backoff for retries

•  Log detailed error information

•  Set up dead letter queues for persistent failures (see the sketch after this list)

3. Performance Optimization

•  Monitor Redis memory usage

•  Use appropriate data expiration policies

•  Consider Redis persistence settings

4. Security

•  Use Redis AUTH (password)

•  Configure firewall rules

•  Consider Redis over TLS for production

5. Monitoring

•  Track queue lengths and processing times

•  Set up alerts for queue backlogs

•  Monitor the Redis server health
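
Bull has no built-in dead letter queue, so a common pattern is to watch for jobs that have exhausted their retries and copy them into a separate queue for later inspection. Below is a minimal sketch of that idea, assuming a 'dead-letter' queue registered with BullModule.registerQueue like the others:

// src/tasks/processors/dead-letter.handler.ts (illustrative)
import { Processor, OnQueueFailed, InjectQueue } from '@nestjs/bull';
import { Queue, Job } from 'bull';

@Processor('email')
export class EmailDeadLetterHandler {
  constructor(@InjectQueue('dead-letter') private deadLetterQueue: Queue) {}

  @OnQueueFailed()
  async onFailed(job: Job, error: Error) {
    // Only move the job once every configured attempt has been used up
    if (job.attemptsMade >= (job.opts.attempts ?? 1)) {
      await this.deadLetterQueue.add('failed-email', {
        originalJobId: job.id,
        data: job.data,
        failedReason: error.message,
      });
    }
  }
}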

Deployment Considerations

Production Redis Setup

# Redis configuration for production

maxmemory 2gb

maxmemory-policy allkeys-lru

save 900 1

save 300 10

save 60 10000

Docker Compose for Production

version: '3.8'

services:

  app:

    build: .

    ports:

      - "3000:3000"

    environment:

      - REDIS_HOST=redis

      - NODE_ENV=production

    depends_on:

      - redis

  redis:

    image: redis:7-alpine

    volumes:

      - redis_data:/data

      - ./redis.conf:/usr/local/etc/redis/redis.conf

    command: redis-server /usr/local/etc/redis/redis.conf

    ports:

      - "6379:6379"

volumes:

  redis_data:

Conclusion

Integrating Redis for background task processing in NestJS offers a robust and scalable way to handle time-consuming operations without blocking the main application thread. By combining Bull queues, Redis persistence, and NestJS’s modular architecture, you can efficiently manage everything from sending emails to processing images and generating reports. This setup enables high performance and reliability, making it ideal for modern web applications.

With features like job persistence, automatic retries, real-time monitoring, and horizontal scalability, NestJS Redis task queues provide the foundation for a production-ready background processing system. Follow the best practices outlined in this guide—such as configuring cleanup policies, monitoring queue health, and handling errors gracefully—to ensure long-term stability and responsiveness in your app.
