Fire and Forget Tasks With Django and Celery


Django is without a doubt my favorite web framework, if not my favorite Python module. However, I feel Django is lacking in asynchrony, and in 'fire and forget' tasks in particular. Imagine you have a blocking task you would like to call from a view, middleware, or signal, but the task is not relevant to returning the view in any way. One example could be sending an email alert when a user logs in. The alert matters, but sending an email should not affect how quickly you render a response. The two pieces of functionality should be independent of each other. That's where Celery comes in. Unfortunately, Celery has its own downsides as well. You are going to need to set up a message broker, which adds to your memory footprint. Celery can also use a database as a message broker, but I do not recommend this. For this guide I am going to use Redis as the message broker. Let's begin by setting up our Celery configuration with Django, assuming we have a project structured like the one below.

myproject/
  home/
    __init__.py
    apps.py
    tasks.py
    signals.py
  myproject/
    __init__.py
    celery.py
    settings.py

First, create a celery.py file in the inner myproject directory (the one that contains settings.py).

myproject/myproject/celery.py

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
app = Celery('myproject', broker='redis://localhost:6379/0')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
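Because of the namespace='CELERY' argument above, any Celery setting can live in settings.py with a CELERY_ prefix. A minimal sketch of what that might look like (the values here are assumptions you should adapt; the broker URL duplicates the one passed to Celery() above, so you could drop the broker= argument and rely on this setting instead):

```python
# myproject/myproject/settings.py (fragment)

# With namespace='CELERY', Celery strips the prefix and reads these
# as broker_url, task_ignore_result, etc.
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_TASK_IGNORE_RESULT = True  # fire and forget; no result backend needed
```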

Next, make sure the Celery app is loaded whenever Django starts by importing it in the project's __init__.py.

myproject/myproject/__init__.py

from __future__ import absolute_import, unicode_literals

from .celery import app as celery_app

__all__ = ('celery_app',)

Create our tasks file. This is where our Celery tasks will live. Any task can be called asynchronously with <TASK>.apply_async.

myproject/<APP_NAME>/tasks.py

# Create your tasks here
from __future__ import absolute_import, unicode_literals
from django.core.mail import send_mail
from myproject.celery import app

@app.task
def send_mail_noblock(subject, body, from_email, to_emails):
    send_mail(subject, body, from_email, to_emails, fail_silently=False)
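Before any task can actually execute, Redis and a Celery worker must be running. Assuming Redis listens on its default port, both can be started from the project root roughly like this (a sketch using the standard Redis and Celery command-line interfaces; adapt to your own process manager):

```shell
# start redis in the background (or use your system's service manager)
redis-server --daemonize yes

# start a celery worker for the project
celery -A myproject worker --loglevel=info
```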

Finally, we can connect our signal dispatcher with our receiver. Now, every time a user logs in, login_email_handler will queue send_mail_noblock to run in the background. You could fill in the body with more useful information like a timestamp, IP address, etc.

myproject/home/signals.py

from django.contrib.auth.signals import user_logged_in
from django.dispatch import receiver
from .tasks import send_mail_noblock


@receiver(user_logged_in)
def login_email_handler(sender, user, request, **kwargs):
    send_mail_noblock.apply_async((<SUBJECT>,
                                   <BODY>,
                                   <FROM>,
                                   <TO>), ignore_result=True)
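One last wiring detail: the receiver above is only connected if signals.py actually gets imported. A common pattern is to import it in the app's AppConfig.ready(), in the apps.py shown in the project layout at the top. A minimal sketch (the HomeConfig class name is an assumption; match whatever your apps.py already defines):

```python
# myproject/home/apps.py
from django.apps import AppConfig


class HomeConfig(AppConfig):
    name = 'home'

    def ready(self):
        # Importing the module runs the @receiver decorators,
        # which connects login_email_handler to user_logged_in.
        from . import signals  # noqa: F401
```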