Rails Logs

Rails logging is a mess, at least the default setup is. I’ve even seen gems spewing garbage into the log file with no predefined format. The good thing is that you can easily fix and customize it. In fact, you can customize it so much that you can have this:

What’s in these supposedly awesome logs?

  • Super-fast search backend
  • A UI that is much cooler than what the single screenshot shows
  • Custom fields such as username, plus custom Rails instrumentation events
  • Consolidation of logs across multiple servers
  • Email notifications on events such as exceptions

How it’s done?

The slick UI with super-fast search is made possible by combining:

  • Logstasher Gem - A Rails gem I wrote. It makes Rails generate logstash-compatible logs with minimal overhead
  • Logstash - An open source log collector
  • Kibana - A ruby/rack application for visualizing and searching the logstash index

Logstasher Gem

I have already written much about it in the README on GitHub. Please check it out for usage and installation instructions.
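
For reference, enabling the gem is a one-liner in the environment config, and custom fields go in an initializer. A minimal sketch based on the README (the `current_user` helper is a hypothetical example; wire in your own auth):

```ruby
# config/environments/production.rb
MyApp::Application.configure do
  # Emit logstash-compatible JSON events alongside the normal Rails log
  config.logstasher.enabled = true
end

# config/initializers/logstasher.rb
if LogStasher.enabled
  LogStasher.add_custom_fields do |fields|
    # `current_user` is a hypothetical helper; adapt to your auth setup
    fields[:user] = current_user && current_user.username
  end
end
```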


Logstash

Now this is the core of the logging system. It collects and stores the logs. It has a very clean separation of concerns, basically:

  • Inputs - The source for logs. There are tons of options for input sources. We’ll be using file input for local logs and lumberjack for remote logs.
  • Filters - These are transformations you would like to apply to the input log. Used for sanitizing and appending any extra information. Since logstasher already sends sanitized logs, we won’t need any filters, thus saving on some processing.
  • Outputs - Again we have many options for outputs. We will use the email and elastic-search outputs.
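
Putting the three sections together, a minimal config for this setup might look like the sketch below. This is logstash 1.x syntax; the log path and the email addresses are placeholders, and the exact email-output options vary a bit between 1.x releases, so check the docs for your version:

```
input {
  file {
    type  => "rails"
    path  => "/var/www/myapp/log/logstash_production.log"  # hypothetical path
    codec => "json"
  }
}

# No filter section needed: logstasher already emits clean JSON events

output {
  # Embedded elasticsearch instance, no separate install required
  elasticsearch { embedded => true }

  # Mail out events tagged as exceptions
  email {
    tags    => ["exception"]
    to      => "ops@example.com"
    from    => "logstash@example.com"
    subject => "Exception in %{@source_host}"
  }
}
```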

See the full list of input, filter and output options here

Setting up logstash

Once logstash is set up, you need to configure the input source. If the Rails app is on the same machine, you can skip the lumberjack section.

For output we’re using the embedded elastic-search, so no extra setup is needed.


Lumberjack

If your app resides on a remote server, or there are multiple app servers, you need to ship the logs to logstash. Lumberjack is a small, very fast utility that does exactly that with minimal memory usage. It also encrypts the logs in transit, so it is secure as well. It simply monitors your log file for changes and, on any update, sends the new lines to logstash.
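
On the logstash side, receiving lumberjack traffic means adding a `lumberjack` input. A sketch (the port number and certificate paths are placeholders; the certificate and key are the ones generated in the SSL section below):

```
input {
  lumberjack {
    port            => 5043
    ssl_certificate => "/etc/ssl/logstash.crt"
    ssl_key         => "/etc/ssl/logstash.key"
  }
}
```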

Setting up Lumberjack

  • I found compiling lumberjack to be a bit of a pain on SUSE (yes, unfortunately I have to use SUSE). You can have a go yourself by grabbing the source; you will need to install the Go language to compile it. Luckily I found rpm and deb packages here. Install the deb with `dpkg -i debfilename` or the rpm with `rpm -i rpmfilename`
  • Set it up as a service - init.d script
  • Set up the config file - sample config. Create the config file at /etc/default/lumberjack
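
For illustration, the defaults file read by the init.d script could look something like this. Every value and flag name below is a hypothetical placeholder from memory of the lumberjack CLI of that era; trust the sample config linked above over this sketch:

```
# /etc/default/lumberjack -- read by the init.d script
# All values below are hypothetical placeholders; adjust to your setup
LUMBERJACK_OPTIONS="--host logs.example.com --port 5043 \
  --ssl-ca-path /etc/ssl/logstash.crt \
  /var/www/myapp/log/logstash_production.log"
```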

It is a good idea to restart lumberjack when you deploy or rotate logs.
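
If you deploy with Capistrano, a hook along these lines keeps lumberjack in sync with rotated log files. This is a Capistrano 2-style sketch with hypothetical task names, assuming lumberjack runs as the init.d service set up above:

```ruby
# config/deploy.rb -- hypothetical Capistrano 2 hook
namespace :lumberjack do
  desc "Restart lumberjack so it picks up the freshly rotated log file"
  task :restart, :roles => :app do
    run "#{sudo} /etc/init.d/lumberjack restart"
  end
end

# Bounce lumberjack every time the app is restarted on deploy
after "deploy:restart", "lumberjack:restart"
```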

There is a new protocol for lumberjack v2. I will play with it and update the documentation on that.

Key and SSL Certificate for lumberjack

You’ll notice that key and certificate files are required by both logstash and lumberjack. They are needed only to encrypt the log shipping. All the commands I used to generate them are consolidated here
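
The gist boils down to generating a self-signed pair with plain openssl. A sketch (the CN and validity period are assumptions; the CN should match your logstash host):

```shell
# Generate an unencrypted 2048-bit key and a self-signed certificate,
# valid for one year, for the logstash/lumberjack TLS handshake
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout logstash.key -out logstash.crt \
  -days 365 -subj "/CN=logstash.example.com"
```

Point logstash’s `ssl_certificate`/`ssl_key` options at these files, and give lumberjack the certificate as its CA so it trusts the connection.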


Kibana

This is the easiest part. Just see here

Once you have everything set up, you should be able to see logs in the Kibana UI. Play around to see the powerful search features. You can also write custom queries, e.g. to get all requests from a given IP that raised an exception, put this query in the search field - `@fields.ip:"" AND @tags:"exception"`
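
A few more query patterns in the same Lucene syntax, built on fields logstasher emits by default (`status`, `duration`) plus the hypothetical `user` custom field from earlier; adjust the names if you’ve customized your payload:

```
@fields.status:500                        # all responses with HTTP status 500
@fields.duration:[1000 TO *]              # requests slower than one second
@fields.user:"bob" AND @tags:"exception"  # exceptions hit by a specific user
```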

