RabbitMQ is a popular message broker that facilitates the exchange of data between applications. However, as with any system, it’s important to have visibility into the logs generated by RabbitMQ to identify issues and ensure smooth operation. In this blog post, we’ll walk you through the process of shipping RabbitMQ logs to Elasticsearch, a distributed search and analytics engine. By centralising and analysing RabbitMQ logs with Elasticsearch, you can gain valuable insights into your system and easily troubleshoot any issues that arise.
Logs processing system architecture
To build this architecture, we're going to set up four components in our system, each with its own role:
- A logs publisher.
- A RabbitMQ server with a queue to publish data to and receive data from.
- A Logstash pipeline to process data from the RabbitMQ queue.
- An Elasticsearch index to store the processed logs.
1. Logs Publisher
Logs can come from any software. It can be from a web server (Apache, Nginx), a monitoring system, an operating system, a web or mobile application, and so on. The logs give information about the working history of any software.
If you don't have a publisher yet, you can use my simple example here: https://github.com/baoanh194/rabbitmq-simple-publisher-consumer
The logs publisher will be publishing the logs to a RabbitMQ queue.
Instead of going through a very long RabbitMQ installation, we’re going to go with a RabbitMQ Docker instance to make things simple. You can find your preferred operating system here: https://docs.docker.com/engine/install/
Next, start a RabbitMQ container by running the following command:
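A minimal sketch, assuming the official `rabbitmq:3-management` image and the default ports:

```shell
# Run RabbitMQ in the background with the management plugin enabled.
# Port 5672 is the AMQP port for publishers/consumers;
# port 15672 serves the management console.
docker run -d --name rabbitmq \
  -p 5672:5672 \
  -p 15672:15672 \
  rabbitmq:3-management
```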
This command starts a RabbitMQ container with the management plugin enabled. Once it is running, you can access the RabbitMQ management console by going to http://localhost:15672/ in your web browser. By default, the username and password are both guest.
Go and check this link to install and configure Elasticsearch: https://www.elastic.co/guide/en/elasticsearch/reference/current/install-elasticsearch.html
To store RabbitMQ data for visualisation in Kibana, you need to start an Elasticsearch container. You can do this by running the following command (I’m using Docker to set up Elasticsearch):
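A single-node setup is enough for this walkthrough. The image tag below is an example; pick whichever version you want from Elastic's registry:

```shell
# Run a single-node Elasticsearch container (the version tag is an example).
# Port 9200 is the HTTP API that Logstash and curl will talk to.
docker run -d --name elasticsearch \
  -p 9200:9200 \
  -e "discovery.type=single-node" \
  docker.elastic.co/elasticsearch/elasticsearch:8.7.1
```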
When you start Elasticsearch for the first time, some security configuration is required.
If you haven’t installed or worked with Logstash before, don’t worry. Have a look at the Elastic docs: https://www.elastic.co/guide/en/logstash/current/installing-logstash.html
It’s very detailed and easy to read.
For me, I installed Logstash on macOS with Homebrew:
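The install boils down to a single command:

```shell
# Install Logstash from Homebrew.
# (Elastic also publishes its own tap:
#  `brew tap elastic/tap && brew install elastic/tap/logstash-full`.)
brew install logstash
```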
Once Logstash is installed on your machine, let’s create the Pipeline to process data.
Paste the code below to your pipelines.conf file:
(Put new config file under: /opt/homebrew/etc/logstash)
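A minimal pipeline sketch. The queue name (`logs`), credentials, and index name are assumptions; adjust them to match your RabbitMQ and Elasticsearch setup:

```
input {
  rabbitmq {
    host     => "localhost"
    port     => 5672
    queue    => "logs"          # must match the queue your publisher sends to
    user     => "guest"
    password => "guest"
  }
}

output {
  elasticsearch {
    hosts    => ["https://localhost:9200"]
    index    => "rabbitmq-logs-%{+YYYY.MM.dd}"
    user     => "elastic"
    password => "<your-elastic-password>"
    # The default Docker setup uses a self-signed certificate.
    ssl_certificate_verification => false
  }
}
```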
Run your pipeline with Logstash:
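Assuming the Homebrew config location mentioned above, that looks like:

```shell
# Start Logstash with the pipeline config
# (the path assumes the Homebrew layout).
logstash -f /opt/homebrew/etc/logstash/pipelines.conf
```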
If your RabbitMQ Docker instance is running and your Logstash pipeline is configured correctly, Logstash will log that the pipeline has started and is consuming from the queue.
Let’s ship some logs
Now everything is ready. Go to the logs publisher's root folder and run the send.js script.
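Assuming the example publisher linked earlier, that's just:

```shell
# Publish a test log message to the RabbitMQ queue.
node send.js
```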
You can check the data is sent to Elastic:
curl -k -u elastic https://localhost:9200/_search?pretty
If everything goes well, the response will list the log documents that were indexed from RabbitMQ.
Configure Kibana to Visualise RabbitMQ Data
Additionally, you can configure Kibana to visualise the RabbitMQ data on Elastic. By configuring Kibana, you can create visualisations such as charts, graphs, and tables that make it easy to understand the data and identify trends or anomalies. For example, you could create a chart that shows the number of messages processed by RabbitMQ over time, or a table that shows the top senders and receivers of messages.
Kibana also allows you to build dashboards, which are collections of visualisations and other user interface elements arranged on a single screen. Dashboards can be shared with others in your organization, making it easier for team members to collaborate and troubleshoot issues. You can refer to this link for how to set up Kibana: https://www.elastic.co/pdf/introduction-to-logging-with-the-elk-stack
In summary, shipping RabbitMQ logs to Elasticsearch offers benefits such as centralized log storage, quick search and analysis, and improved system troubleshooting. By following the steps outlined in this blog post, you can set up a system to handle large volumes of logs and gain real-time insights into your messaging system. Whether you’re running a small or large RabbitMQ instance, shipping logs to Elasticsearch can help you optimise and scale your system.