Fire and Forget Tasks With Django and Celery

Django is without a doubt my favorite web framework, if not my favorite Python module. However, I feel Django is lacking in asynchrony, and 'fire and forget' tasks in particular. Imagine you have a blocking task you would like to call from a view, middleware, or signal, but the task is not relevant to returning the view in any way. One example could be sending an email alert when a user logs in. The alert matters, but sending the email should not interfere with how quickly you render a response. The two functionalities should be independent of each other. That's where Celery comes in. Unfortunately, Celery can have its own downsides as well. You are going to need to set up a message broker, which will cost you additional memory. Celery can also use a database as a message broker, but I do not recommend this. For this guide I am going to be using Redis as the message broker. Let's begin by setting up our Celery configuration with Django, assuming a standard layout: a project package named myproject and an app named home.
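Before reaching for Celery, it helps to see the shape of the pattern. Here is a minimal in-process sketch using Python's concurrent.futures; this is a stand-in to illustrate the idea, not a substitute for Celery (it dies with the process, and offers no broker, retries, or multiple workers):

```python
from concurrent.futures import ThreadPoolExecutor

# A small background pool; work submitted here never blocks the caller.
executor = ThreadPoolExecutor(max_workers=2)

def send_alert(message):
    # Imagine a slow, blocking operation such as sending an email.
    return f"sent: {message}"

def handle_login(username):
    # Fire and forget: submit the slow task, then return immediately.
    future = executor.submit(send_alert, f"{username} logged in")
    return f"welcome {username}", future

response, future = handle_login("alice")
print(response)         # the "view" response is ready right away
print(future.result())  # the alert completes in the background
```

Celery gives you the same decoupling, but with the work running in a separate worker process that survives independently of the web server.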


First, create a file named celery.py in the myproject directory, alongside settings.py.


from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
app = Celery('myproject', broker='redis://localhost:6379/0')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

Allow our celery_app to start when Django starts by adding the following to myproject/__init__.py.


from __future__ import absolute_import, unicode_literals

from .celery import app as celery_app

__all__ = ('celery_app',)

Create our tasks file, home/tasks.py. This is where our Celery tasks will live. Any task can be called in async fashion with <TASK>.apply_async.


# Create your tasks here
from __future__ import absolute_import, unicode_literals
from celery import shared_task
from django.core.mail import send_mail


@shared_task
def send_mail_noblock(subject, body, from_email, to_emails):
    # Wraps Django's send_mail so it can run on a Celery worker.
    send_mail(subject, body, from_email, to_emails, fail_silently=False)
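Note the calling convention: apply_async takes a tuple of positional arguments and a dict of keyword arguments, rather than the arguments themselves. As a rough local illustration of that shape (apply_async_stub and record_mail are hypothetical stand-ins, not part of Celery):

```python
def apply_async_stub(func, args=(), kwargs=None):
    # Mimics the (args, kwargs) signature of Celery's apply_async;
    # a real worker would execute func in a separate process.
    kwargs = kwargs or {}
    return func(*args, **kwargs)

def record_mail(subject, body, from_email, to_emails):
    # Stand-in for a mail-sending task.
    return (subject, to_emails)

result = apply_async_stub(
    record_mail,
    ('Login Alert', 'hello', 'noreply@example.com', ['user@example.com']),
    {},
)
print(result)  # → ('Login Alert', ['user@example.com'])
```

The same (args, kwargs) pair is what we pass to the real apply_async below.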

Finally, we can connect our signal dispatcher with our receiver, in a module that is imported when the app loads (for example from your AppConfig.ready() method, so the receiver is registered). Now login_email_handler will run in the background every time a user logs in. You could fill in the body with more useful information like a timestamp, IP address, etc.


from django.contrib.auth.signals import user_logged_in
from django.dispatch import receiver
from .tasks import send_mail_noblock


@receiver(user_logged_in)
def login_email_handler(sender, user, request, **kwargs):
    # Subject, body, and from-address below are placeholders; adjust to taste.
    send_mail_noblock.apply_async(
        ('Login Alert', 'Your account just logged in.',
         'noreply@example.com', [user.email]),
        {},
        ignore_result=True,
    )
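With everything wired up, start a worker so queued tasks actually execute. This assumes Redis is already listening on localhost:6379; the flags shown are one reasonable choice, not the only one:

```shell
# -A points Celery at the app defined in myproject/celery.py;
# -l sets the log level.
celery -A myproject worker -l info
```

Leave the worker running alongside your Django server; logins will now queue the email task to it instead of blocking the response.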