Turn your raw logs into meaningful and measurable data with Vizion.ai

Mark Mayfield

Developer Advocate at Panzura

There are countless reasons why you should be monitoring and analyzing your logs:

  • Production Monitoring and Debugging
  • Security
  • Tracking Your Site’s/Platform’s Visitors
  • Rogue Automated Robots
  • HTTP Errors
  • Resource Usage

But how do you go about extracting this raw data in a way that is measurable and visually appealing? Is it possible to derive meaningful insights from countless lines of logs?

Yes, it is! With the help of the Vizion.ai support team, we take the hassle out of mapping all the data for you and provide an easy-to-use solution that tracks and monitors the information most crucial to your enterprise.

The following insights shine a light on some of the tools and methods we employ to make the transformation from raw logs to stunning visuals as seamless as possible.

For the following tasks, we will assume that the logs are already being ingested into a Vizion.ai account; we just need to parse them in a way that allows us to build visualizations. To find out more about the process of connecting your data to Vizion.ai, please check out our developer forum at https://vizion.ai/forum, where we provide step-by-step details on how to set this up.
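In broad strokes (the full walkthrough and your account's endpoint details live in the forum), shipping logs comes down to pointing a Filebeat input at your log files. The path below is only an example:

    filebeat.inputs:
      - type: log
        paths:
          - /var/log/apache2/access.log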

Step 1 – Create the custom patterns that will break out and individualize the data from your logs.

  • To do this, we will use Grok to debug a sample log message from the data set. Grok is a pattern-matching syntax used to parse arbitrary text and give it structure. It is good for parsing syslog, Apache and other web server logs, MySQL logs, and in general any log format written for human consumption. The resource we will be using for Grok debugging is https://grokdebug.herokuapp.com/, and a sample pattern is sketched below.
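For illustration, here is a hypothetical Apache2 access log line carrying a leading WAF IP (the scenario handled in Step 2), followed by a Grok pattern that breaks it into named fields. The field names (waf_ip, client_ip, and so on) are our own illustrative choices, not required names:

    203.0.113.10 198.51.100.24 - - [12/Mar/2020:10:15:32 +0000] "GET /index.html HTTP/1.1" 200 2326

    %{IP:waf_ip} %{IP:client_ip} - - \[%{HTTPDATE:timestamp}\] "%{WORD:method} %{URIPATHPARAM:request} HTTP/%{NUMBER:http_version}" %{NUMBER:response_code} %{NUMBER:bytes}

Pasting the log line and the pattern into the Grok debugger should return a JSON object with one key per named capture, which confirms the pattern works before it goes into a pipeline.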



Step 2 – Create a pipeline that will use the custom patterns as a filter for the log data.

  • To do this, we will use the Dev Tools section found in your Vizion.ai Kibana. Here is a screenshot of a custom pipeline created to handle Apache2 access logs that have an extra WAF IP address that needs to be filtered out. You can see that this pipeline uses multiple ‘Processors’ that further parse and structure the data. The processors used here are Grok, GeoIP, Set, and User-Agent; a sketch of an equivalent pipeline definition follows.
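As a rough sketch of what such a pipeline definition looks like when created from Dev Tools (the pipeline name "test" is the one referenced in Step 3; the clientip and agent fields assume the standard COMBINEDAPACHELOG captures, and the set processor's field and value are illustrative):

    PUT _ingest/pipeline/test
    {
      "description": "Parse Apache2 access logs that carry a leading WAF IP",
      "processors": [
        {
          "grok": {
            "field": "message",
            "patterns": ["%{IP:waf_ip} %{COMBINEDAPACHELOG}"]
          }
        },
        {
          "geoip": {
            "field": "clientip"
          }
        },
        {
          "set": {
            "field": "log_type",
            "value": "apache2-access"
          }
        },
        {
          "user_agent": {
            "field": "agent"
          }
        }
      ]
    }

Running this in Dev Tools creates (or overwrites) the pipeline, and GET _ingest/pipeline/test can be used to verify it afterward.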



Step 3 – Connect the Filebeat that is shipping the logs to Vizion.ai with the newly created pipeline.

  • Add the pipeline in the Elasticsearch output section of filebeat.yml. This pipeline was named “test” when we created it in the Kibana Dev Tools, as in the snippet below.
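A minimal sketch of the relevant filebeat.yml section; the hosts, username, and password values are placeholders for the connection details from your Vizion.ai account:

    output.elasticsearch:
      # Placeholder endpoint and credentials from your Vizion.ai account
      hosts: ["https://<your-deployment-endpoint>:9243"]
      username: "<your-username>"
      password: "<your-password>"
      # Route every event through the ingest pipeline created in Step 2
      pipeline: "test"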



Step 4 – Restart Filebeat on the local machine.

  • Logs should now be shipping into your Vizion.ai account through the custom pipeline. You can see in the example below that the logs have been parsed into their proper mappings.

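If Filebeat runs as a systemd service (assuming a systemd-based Linux host), the restart is typically:

    sudo systemctl restart filebeat

Once events flow again, a parsed document in Kibana shows individual fields instead of a single raw message string. The values below are purely illustrative of the shape you should see:

    {
      "waf_ip": "203.0.113.10",
      "clientip": "198.51.100.24",
      "verb": "GET",
      "request": "/index.html",
      "response": "200",
      "geoip": { "country_iso_code": "US" },
      "user_agent": { "name": "Chrome" }
    }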

There are a few more tweaks to be made behind the scenes, but with the logging now parsed correctly, we can start to build out custom dashboards that display the data in a visually pleasing way.

Summary
Vizion.ai empowers you to convert your raw logs into meaningful insights. Get started today for free, and contact us with any questions.
