Celery is an asynchronous task queue/job queue based on distributed message passing. It lets you run tasks in the background, distributed across one or more worker processes. Here is a simple guide to help you get started with Celery in Python.
Step 1: Install Celery
You can install Celery using pip:
pip install celery
Step 2: Set Up a Message Broker
Celery requires a message broker to send and receive messages. The most commonly used brokers are:
- RabbitMQ
- Redis
For this tutorial, we’ll use Redis as the message broker. You can install Redis on your system and run it locally, or you can use a cloud-based service like Redis Cloud.
Note that pip installs only the Python Redis client; the Redis server itself is installed separately (for example, via your system's package manager). To install the client:
pip install redis
Make sure that the Redis server is running. You can start it with the following command if it’s installed locally:
redis-server
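Before wiring up Celery, it can help to confirm the broker is reachable from Python. A minimal sketch, assuming the redis client installed above and a local server on the default port (check_redis.py is just a hypothetical helper name):
# check_redis.py
import redis

# Connect to the local Redis server and send a PING
r = redis.Redis(host='localhost', port=6379, db=0)
print(r.ping())  # True if the server is up and reachable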
Step 3: Create the Celery Application
You will create a Python script that configures the Celery application and defines a task. Let’s create a simple tasks.py file.
# tasks.py
from celery import Celery

# Create a Celery instance and configure it to use Redis as the broker
app = Celery('my_tasks', broker='redis://localhost:6379/0')

# Define a simple task
@app.task
def add(x, y):
    return x + y
Here:
- 'my_tasks' is the name of the Celery application.
- 'redis://localhost:6379/0' is the Redis URL, where localhost refers to your Redis server running locally, 6379 is the default Redis port, and 0 is the Redis database index.
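If you prefer to keep settings out of the constructor, the same configuration can be applied after the app is created via app.conf. A minimal, equivalent sketch (the result_backend line anticipates Step 8 and is optional here):
from celery import Celery

app = Celery('my_tasks')
# Equivalent to passing broker=... (and backend=...) to the constructor
app.conf.update(
    broker_url='redis://localhost:6379/0',
    result_backend='redis://localhost:6379/0',
)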
Step 4: Start the Celery Worker
To start the Celery worker, run the following command in the terminal from the directory where tasks.py is located:
celery -A tasks worker --loglevel=info
This will start the Celery worker that will listen for incoming tasks.
Step 5: Send a Task to Celery
You can now call the Celery task from another Python script, such as main.py. This script will send a task to the worker.
# main.py
from tasks import add
# Send the 'add' task to the worker asynchronously
result = add.apply_async((4, 6))
# Print the task result; result.get() blocks until the worker finishes.
# Note: retrieving results requires a result backend (see Step 8).
print('Task Result:', result.get())
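apply_async gives you control over execution options (countdown, queue, and so on), but for the common case Celery also offers the delay() shorthand. A small sketch, assuming a result backend is configured as in Step 8, showing a non-blocking status check and a bounded wait:
from tasks import add

result = add.delay(4, 6)       # shorthand for add.apply_async((4, 6))
print(result.ready())          # False until the worker has finished
print(result.get(timeout=10))  # wait at most 10 seconds, then raise TimeoutError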
Step 6: Running the Application
- Start the Celery worker in one terminal window:
celery -A tasks worker --loglevel=info
- In another terminal window, run your main.py script:
python main.py
You should see the worker process handling the task and the result printed in the terminal.
Step 7: Advanced Usage – Task Scheduling and Periodic Tasks
Celery also allows you to schedule tasks to run periodically. To schedule periodic tasks, you can use Celery Beat.
- Celery Beat ships with Celery itself, so no separate package is needed. If you haven’t already, install Celery together with its Redis dependencies:
pip install "celery[redis]"
- Add a periodic task to your tasks.py (a cron-style variant is sketched after this list):
from celery import Celery
from celery.schedules import crontab

app = Celery('my_tasks', broker='redis://localhost:6379/0')

@app.task
def hello_world():
    print("Hello, World!")

# Schedule the task to run every 5 seconds
app.conf.beat_schedule = {
    'say-hello-every-5-seconds': {
        'task': 'tasks.hello_world',
        'schedule': 5.0,
    },
}
- To start the Celery Beat scheduler along with the worker, run:
celery -A tasks worker --loglevel=info --beat
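Embedding Beat in the worker with --beat is convenient for development; you can also run the scheduler as its own process alongside one or more workers:
celery -A tasks beat --loglevel=info
The schedule does not have to be a plain interval, either. The crontab import above lets you express cron-style schedules; for example, this entry (a sketch, added after the beat_schedule definition) would run hello_world every Monday at 7:30 a.m.:
# Cron-style schedule: every Monday at 7:30 a.m.
app.conf.beat_schedule['say-hello-monday-morning'] = {
    'task': 'tasks.hello_world',
    'schedule': crontab(hour=7, minute=30, day_of_week=1),
}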
Step 8: Task Result Backend (Optional)
If you want to store task results (so you can inspect them later), you can configure a backend. For example, you could use Redis as a backend:
# tasks.py
app = Celery('my_tasks', broker='redis://localhost:6379/0', backend='redis://localhost:6379/0')
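With a backend in place, results can also be looked up later by task id, even from a different process. A minimal sketch using AsyncResult (it sends a fresh task just to have an id to look up):
from celery.result import AsyncResult
from tasks import add, app

task_id = add.delay(4, 6).id         # send a task and keep its id
res = AsyncResult(task_id, app=app)  # rebuild a result handle from the id
print(res.status)                    # e.g. 'PENDING', then 'SUCCESS'
print(res.get(timeout=10))           # the task's return value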
Conclusion
That’s it! You now have a basic setup to handle asynchronous tasks with Celery. You can expand this to handle more complex workflows, add retries, or configure different backends and brokers depending on your needs.
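For instance, retries can be configured directly on the task decorator. A hedged sketch (flaky_fetch and its simulated failure are made up for illustration):
import random
from tasks import app  # reuse the app defined in tasks.py

@app.task(bind=True, max_retries=3, default_retry_delay=5)
def flaky_fetch(self):
    try:
        if random.random() < 0.5:  # simulate an intermittent failure
            raise ConnectionError('upstream unavailable')
        return 'ok'
    except ConnectionError as exc:
        # Re-queue this task; Celery gives up after max_retries attempts
        raise self.retry(exc=exc)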