
Asynchronous Task Processing in Django with Celery and AWS SQS

Use AWS Simple Queue Service (SQS) as a fully managed message broker for your Django Celery tasks and run background jobs reliably.



[Diagram: request flow from Django through Celery to AWS SQS and back to a worker]

Long-running operations, like sending emails to thousands of users or processing large files, hurt performance when they run inside the Django request/response cycle. Celery is the de facto task queue for Python, and pairing it with AWS SQS gives you a zero-maintenance message broker.


Why AWS SQS?

While Redis and RabbitMQ are popular Celery brokers, both require you to provision, patch, and scale your own infrastructure. SQS is fully managed, scales automatically, and requires no upfront server provisioning.


Step 1: Install Dependencies

You'll need Celery plus its SQS transport extras (the kombu SQS support, which depends on boto3). Quote the extras so your shell doesn't interpret the brackets:

pip install "celery[sqs]" boto3

Step 2: Configure Celery in Django

Create a celery.py file next to your settings.py:

import os

from celery import Celery

# Set the default Django settings module before the app is created
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')

# Read all CELERY_-prefixed settings from settings.py
app.config_from_object('django.conf:settings', namespace='CELERY')
# Discover tasks.py modules in every installed Django app
app.autodiscover_tasks()
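Per the standard Celery-with-Django layout, the project's __init__.py should also import this app so that it is loaded when Django starts and @shared_task binds to it:

```python
# myproject/__init__.py
# Ensure the Celery app is loaded when Django starts, so that
# @shared_task decorators use this app instance.
from .celery import app as celery_app

__all__ = ('celery_app',)
```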

Step 3: Update Django Settings

In your settings.py, configure the SQS broker connection:

# AWS SQS Configuration
# In production, prefer environment variables or an IAM role over
# hardcoding credentials in settings.py.
AWS_ACCESS_KEY_ID = 'your-access-key'
AWS_SECRET_ACCESS_KEY = 'your-secret-key'

# Celery Configuration
# The credentials are embedded in the broker URL, so they must be
# URL-quoted (secret keys can contain characters such as '/').
from kombu.utils.url import safequote

CELERY_BROKER_URL = f"sqs://{safequote(AWS_ACCESS_KEY_ID)}:{safequote(AWS_SECRET_ACCESS_KEY)}@"
CELERY_BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-1',
    'polling_interval': 1,       # seconds between SQS polls
    'visibility_timeout': 3600,  # should exceed your longest task's runtime
    'queue_name_prefix': 'django-',
}
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
# Note: SQS cannot store task results; configure a result backend
# (e.g. django-celery-results) if you need to read return values.
CELERY_RESULT_SERIALIZER = 'json'
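If your IAM policy doesn't grant sqs:CreateQueue, you can point the transport at an existing queue with the predefined_queues transport option instead of letting Celery create one. A sketch (the account ID and queue URL below are placeholders):

```python
# settings.py (alternative) — bind Celery's default 'celery' queue to an
# existing SQS queue. The URL here is an illustrative placeholder.
CELERY_BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-1',
    'predefined_queues': {
        'celery': {
            'url': 'https://sqs.us-east-1.amazonaws.com/123456789012/celery',
        },
    },
}
CELERY_TASK_DEFAULT_QUEUE = 'celery'
```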

Step 4: Write a Task

Inside any Django app (e.g., tasks.py):

from celery import shared_task
import time

@shared_task
def send_marketing_email_task(user_id):
    # Simulate an expensive operation
    time.sleep(5)
    return f"Email sent to user {user_id}"
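To enqueue the task, call .delay() from a view, or apply_async() when you need options such as delayed execution. The view below is an illustrative sketch, not part of the project scaffolding above:

```python
# views.py — hypothetical view that queues the task and returns immediately
from django.http import JsonResponse

from .tasks import send_marketing_email_task

def start_campaign(request):
    # .delay() pushes a message to SQS and returns without waiting
    result = send_marketing_email_task.delay(request.user.id)

    # apply_async() accepts scheduling options, e.g. run in 60 seconds:
    # send_marketing_email_task.apply_async(args=[request.user.id], countdown=60)

    return JsonResponse({'task_id': result.id})
```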

Step 5: Start the Celery Worker

Run the Celery worker locally to begin consuming SQS messages:

celery -A myproject worker --loglevel=info
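A couple of worker flags worth knowing (the values are examples). Note that -Q takes the Celery-side queue name; the queue_name_prefix from settings is applied by the transport, so the SQS queue itself will be named django-celery:

```shell
# Run 4 worker processes and consume only the default 'celery' queue
celery -A myproject worker --loglevel=info --concurrency=4 -Q celery
```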

When you trigger send_marketing_email_task.delay(user.id) in your Django view, a message is immediately pushed to AWS SQS. The worker picks it up on its next poll (every second, per the polling_interval above) and processes it in the background, so the view can respond without waiting. One caveat: the SQS transport does not support Celery's remote control commands or event monitoring.
