Complete Guide: Integrating Message Queue with Django Rest Framework

Message queues are a form of asynchronous service-to-service communication, most commonly used in serverless and microservices architectures. In data-intensive applications with heavy workloads, message queues keep the user experience fast by completing time-consuming tasks in the background, so users never have to wait for a task to finish.

A typical request-response architecture falls short when response times are unpredictable. Message queues are especially useful when you expect the volume of requests to your system to grow rapidly.

Message queues provide features such as persistence, routing and task management. A message queue sits behind a "broker" that facilitates message passing by exposing an interface other services can access. Through this interface, consumers pick up and process the messages created by producers.

Components of Message Queue

Fig. Components of a Message Queue (message flow in a microservice)

Let’s first discuss what a message broker is. According to Wikipedia, a message broker mediates communication between applications, minimizing the mutual awareness that applications should have of each other in order to exchange messages, effectively implementing decoupling.

In other words, a message broker is an intermediate layer through which applications communicate by reading and writing messages. This makes it possible to build decoupled applications that do not rely directly on each other.

The primary purpose of a message broker is to take incoming messages from applications and perform some action on them. For example, a message broker may be used to manage a message queue for multiple receivers, providing reliable storage, guaranteed message delivery and perhaps transaction management.

Redis is an open-source in-memory data store (simply put, a DBMS that keeps its data in main memory) that can function as a message broker, a database and a cache. According to its documentation, Redis supports a variety of abstract data structures, such as strings, lists, hashes, sets, sorted sets, HyperLogLogs, bitmaps, and geospatial indexes. Redis is fast and lightweight, which makes it a favourite of many developers around the globe.

There are several alternatives to Redis, but Redis stands out for being developer-friendly: because it works against the system’s main memory, it is fast and scalable, and it is easy to set up, use and deploy. Redis also provides an advanced in-memory key-value cache.
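As a quick illustration of Redis acting as a simple broker, the sketch below pushes messages onto a Redis list from a producer and pops them off in a consumer. It is a minimal example using the redis-py client and assumes a Redis server is running locally on the default port; later in this guide, Celery handles all of this (and much more) for us.

import redis

# Connect to a local Redis server (assumes redis-server is running on the default port)
client = redis.Redis(host='127.0.0.1', port=6379, decode_responses=True)

# Producer: push messages onto a list that acts as a simple queue
client.lpush('email_queue', 'alice@example.com', 'bob@example.com')

# Consumer: block until a message is available, then process it
queue_name, recipient = client.brpop('email_queue')
print(f'Would send a welcome email to {recipient}')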

Related read: Redis API Caching: Boost Performance of Your Django API

Celery

Celery is an asynchronous task queue based on distributed message passing. It is focused on real-time operation but supports scheduling as well.

Celery allows Python applications to quickly implement task queues for many workers.

Celery takes care of receiving tasks and assigning them appropriately to workers.

We are using Celery to achieve the following goals:

🔸 Define independent tasks

🔸 Listen to a message broker to get new task requests

🔸 Assign those requests to workers to complete the task

🔸 Monitor the progress and status of tasks and workers (a short monitoring sketch follows this list)
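For the monitoring goal in particular, Celery exposes an inspection API on the application object. The snippet below is a minimal sketch that assumes the Celery app configured later in this guide (project_name/celery.py); tools such as Flower build a full dashboard on top of the same API.

from project_name.celery import app  # the Celery app configured in Step 3 below

inspector = app.control.inspect()

# Each call returns a dictionary keyed by worker name (or None if no workers are running)
print(inspector.active())      # tasks currently being executed
print(inspector.registered())  # task names each worker knows about
print(inspector.scheduled())   # tasks reserved with an ETA or countdown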

Use Cases of Message Queue

Now you might wonder where a message queue can be used. Here is the answer:

Message queues are used for long-running processes and background jobs.

When a request takes significantly longer than usual to process, that is the perfect scenario for a message queue.

Data-intensive applications and web services handle many requests and cannot afford to lose any of them under any circumstances. When those requests are time-consuming to process, message queues come to the rescue. Typical scenarios include:

1. Image scaling

2. Sending bulk or large emails

3. Search engine indexing

4. File scanning

5. Video encoding

6. Delivering notifications

7. PDF processing

8. Calculations

Building A Background Email Sending Task Using Celery and Redis

Step-1:

Download and install Redis, then start the server with redis-server.
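You can confirm the server is up from the Redis CLI:

redis-cli ping

If the server is running, it replies with PONG.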

Step-2:

Install the Celery package in your Django application:

pip install celery
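Because Redis will be the broker, the Redis client library is needed as well. Installing Celery with the redis extra pulls it in:

pip install "celery[redis]"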

Step-3:

Configure Celery in your application by creating the file “project_name/project_name/celery.py”:

import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project_name.settings')

app = Celery('project_name')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.

# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.

app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')

Step-4:

To ensure the Celery app is loaded when Django starts, import it in the project’s __init__.py file (project_name/project_name/__init__.py):

from .celery import app as celery_app

__all__ = ('celery_app',)

Step-5:

Assign Redis as the message broker for Celery in settings.py:

# Configuration for Celery with Redis as the broker

CELERY_BROKER_URL = 'redis://127.0.0.1:6379'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
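Optionally, if you also want Celery to store task results (for example, to check later whether a task succeeded), you can point the result backend at the same Redis instance. This is not required for the fire-and-forget email task in this guide:

CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379'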

Step-6:

Start the Celery worker from the project root (the same directory as manage.py):

celery -A project_name worker -l info

Celery is now running and ready to pick up tasks.

Step-7:

Now create your task in tasks.py inside your app (app.autodiscover_tasks() looks for a module named tasks in each installed Django app):

from celery import shared_task
from django.conf import settings
from django.core.mail import send_mail


@shared_task
def send_bulk_email(recipient):
    # Runs in the Celery worker, outside the request/response cycle
    from_email = settings.EMAIL_HOST_USER
    sub = "welcome to mindbowser"
    msg = "thank you for registering with us"
    recipients = recipient
    send_mail(sub, msg, from_email, recipients, fail_silently=False)

Step-8:

Now you can run this task asynchronously with Celery from anywhere in the application:

from .tasks import send_bulk_email

send_bulk_email.delay(recipient)
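delay() is a shortcut for Celery’s apply_async(), which also accepts execution options. For example, the call below schedules the task to run 60 seconds later using the standard countdown option (shown here only as an illustration):

send_bulk_email.apply_async(args=[recipient], countdown=60)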

Next, create a class-based view that accepts a CSV file and sends an email to every address in it in the background:

import csv
import io

from rest_framework.generics import GenericAPIView
from rest_framework.permissions import AllowAny
from rest_framework.response import Response

from .serializers import CsvSerializer   # assumed location of the serializer
from .throttles import EmailThrottle     # assumed location of the throttle class
from .tasks import send_bulk_email


class SendBulkMail(GenericAPIView):
    """
    This class is used to send bulk mail
    """
    serializer_class = CsvSerializer
    permission_classes = [AllowAny]
    throttle_classes = [EmailThrottle, ]

    def post(self, request):
        # Read the uploaded CSV file and collect every address in it
        data = request.FILES['csv'].read().decode('utf-8')
        recipient = []
        string = io.StringIO(data)
        for row in csv.reader(string):
            recipient.extend(row)
        # Queue the email task; the request returns immediately
        send_bulk_email.delay(recipient)
        return Response({
            "status_code": 200,
            "error": None,
            "data": {},
            "message": "Email sent successfully"
        })

Creating an Endpoint for the Class-Based View

from django.urls import path

from .views import SendBulkMail

urlpatterns = [
    path('sendmail/', SendBulkMail.as_view(), name='sendmail'),
]
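To try the endpoint end to end, upload a CSV file containing the recipient addresses. The snippet below is a small usage sketch using the requests library, assuming the development server is running on localhost:8000 and a recipients.csv file exists in the current directory:

import requests

# Each row of the CSV should contain one or more email addresses
with open('recipients.csv', 'rb') as f:
    response = requests.post(
        'http://localhost:8000/sendmail/',
        files={'csv': f},  # matches request.FILES['csv'] in the view
    )

print(response.status_code, response.json())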

Conclusion

In conclusion, incorporating a message queue built on Celery and Redis with Django Rest Framework provides significant benefits when building scalable and efficient applications. By leveraging message queues, developers can achieve asynchronous service-to-service communication, allowing for faster response times and an improved user experience. Message queues also facilitate the decoupling of applications, enabling them to communicate without relying on each other directly.
