Demo application for Background Processing with RabbitMQ, Python & Flask

Dhruva Dave
3 min read · Dec 27, 2020

Background processing is a standard way of improving the performance and response times of your web applications.

In past years, I have used multi-threading and cron jobs to run background tasks, but my favorite mechanism is to use async message queues.

What Is Docker?

Docker helps developers build lightweight and portable software containers that simplify application development, testing, and deployment. Its goal is to ease the creation, deployment, and delivery of an application using so-called containers. A container lets a developer or sysadmin bundle an application with all the components it needs (libraries and other resources) and deliver it as a single, independent package.

Install Docker by following the official documentation: https://docs.docker.com/engine/install/

What Is RabbitMQ?

RabbitMQ is a fairly popular asynchronous message broker that can handle millions of messages. It originally implemented the Advanced Message Queuing Protocol (AMQP) but has been extended to support the Streaming Text Oriented Messaging Protocol (STOMP), Message Queuing Telemetry Transport (MQTT), and other protocols. It is written in Erlang. Here, we will set it up using Docker.

You can use many other services for background processing, such as Amazon SQS and Google Cloud Tasks. Redis-based solutions like Python-RQ are also available. Redis is an open-source in-memory data store that can function as a message broker, database, and cache.

Now let’s run RabbitMQ using the official management image from Docker Hub.

docker run -d -p 5672:5672 -p 15672:15672 rabbitmq:3-management
f556d47a1014d09eedf1a4cd6d4b76b6ecf41aed80226a8a0c449889afef414d

After this, you should be able to connect to http://localhost:15672 and see the RabbitMQ management console. Use guest as both the username and password to log in.

RabbitMQ management console

Now we will set up a Flask server as the Producer and a Worker process to consume messages. Each gets its own folder and Dockerfile.

Flask Server as a Producer

For this demo, create a server folder containing an app.py with a simple /create-job/<msg> route. The route pushes a task to the RabbitMQ server; a background worker then receives the message, and you can do whatever you want with it.
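Here is a minimal sketch of what server/app.py could look like, using the pika client. The queue name demo_queue and the host name rabbitmq (the docker-compose service name used later) are my assumptions for illustration, not values from the original gist:

# server/app.py : a minimal sketch of the producer.
# The queue name "demo_queue" and host "rabbitmq" are assumptions.
import pika
from flask import Flask

app = Flask(__name__)

QUEUE_NAME = "demo_queue"   # assumed queue name
RABBITMQ_HOST = "rabbitmq"  # assumed docker-compose service name

def publish(message):
    """Open a connection, declare the queue, and push a single message."""
    connection = pika.BlockingConnection(
        pika.ConnectionParameters(host=RABBITMQ_HOST))
    channel = connection.channel()
    channel.queue_declare(queue=QUEUE_NAME, durable=True)
    channel.basic_publish(exchange="", routing_key=QUEUE_NAME, body=message)
    connection.close()

@app.route("/create-job/<msg>")
def create_job(msg):
    publish(msg)
    return f"Sent: {msg}\n"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)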

We also add a Dockerfile under the same server folder.
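A minimal server/Dockerfile sketch could look like this; the base image and the requirements.txt file (listing flask and pika) are assumptions for illustration:

# server/Dockerfile : a minimal sketch.
# The base image and requirements.txt (flask, pika) are assumptions.
FROM python:3.8-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
EXPOSE 5000
CMD ["python", "app.py"]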

Flask Server as a Worker

The worker process is the main background process. It receives messages from the queue and executes some code based on each message. Most of the interesting work happens in the callback() function, which is invoked when a new message arrives.

For the worker, create a worker folder and add an app.py that receives messages, along with a Dockerfile.
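A sketch of worker/app.py, again assuming the demo_queue queue and rabbitmq host from the producer sketch; the print statements are written to mirror the log output shown later:

# worker/app.py : a minimal sketch of the consumer.
# The queue name and host must match the producer's; both are assumptions.
import pika

QUEUE_NAME = "demo_queue"
RABBITMQ_HOST = "rabbitmq"

def callback(ch, method, properties, body):
    """Invoked for every message pulled off the queue."""
    print(f"Received {body.decode()}")
    # ... do the actual background work with the message here ...
    print("Done")
    ch.basic_ack(delivery_tag=method.delivery_tag)

print("Connecting to server ...")
connection = pika.BlockingConnection(
    pika.ConnectionParameters(host=RABBITMQ_HOST))
channel = connection.channel()
channel.queue_declare(queue=QUEUE_NAME, durable=True)
channel.basic_consume(queue=QUEUE_NAME, on_message_callback=callback)
print("Waiting for messages...")
channel.start_consuming()

The worker/Dockerfile can be essentially the same as the server's, minus the exposed port:

# worker/Dockerfile : a minimal sketch, mirroring the server's.
FROM python:3.8-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
CMD ["python", "app.py"]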

Docker Compose

For our system to work, we need all three processes (the RabbitMQ server, the Flask server, and the worker process) to run together. For a local development environment, it's very convenient to use docker-compose to orchestrate this, as shown below.

Now create docker-compose.yml in the root folder.
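A minimal docker-compose.yml sketch; the service names rabbitmq, server, and worker line up with the container names that appear in the output below, while everything else is an assumption:

# docker-compose.yml : a minimal sketch.
# Service names match the container names in the output below; the rest is assumed.
version: "3"
services:
  rabbitmq:
    image: rabbitmq:3-management
    ports:
      - "5672:5672"
      - "15672:15672"
  server:
    build: ./server
    ports:
      - "5000:5000"
    depends_on:
      - rabbitmq
  worker:
    build: ./worker
    depends_on:
      - rabbitmq

Note that depends_on only waits for the RabbitMQ container to start, not for the broker to be ready, so in practice the worker may need a small connection retry loop.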

The directory tree for this demo application will look like this:

Tree for demo_app
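Based on the folders and files described above (including the assumed requirements.txt from the Dockerfile sketches), that tree looks roughly like this:

demo_app
├── docker-compose.yml
├── server
│   ├── Dockerfile
│   ├── app.py
│   └── requirements.txt
└── worker
    ├── Dockerfile
    ├── app.py
    └── requirements.txt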

Use the following command to run docker-compose and start all the processes:

docker-compose up -d
Starting demo_app_rabbitmq_1 ... done
Starting demo_app_worker_1 ... done
Starting demo_app_server_1 ... done

Run Jobs

To see all of this in action, just hit the /create-job/demo_msg or /create-job/hii endpoint on your localhost, and you will see the messages flowing through.

curl localhost:5000/create-job/hii
Sent: hii

Now you can confirm that the Worker received the message by checking its logs:

docker logs demo_app_worker_1
Connecting to server ...
Waiting for messages...
Received hii
Done

I hope this helps you understand the basics of background processing with a message queue :)

Happy Coding !!!
