
Automotive logging on Linux with Fluent Bit, DLT and EB solys

February 27, 2019, written by Torsten Mosis, torsten.mosis@systemticks.de


This blog post describes how three open source projects can be combined to build a log & trace solution for Linux-based automotive software systems:

  • Fluent Bit for collecting data and logs from Linux OS and its infrastructure

  • Automotive DLT for application logging and routing

  • EB solys for log data analysis

Fluent Bit

Fluent Bit is a log processor and forwarder which allows you to collect runtime data and logs from different sources, unify them and send them to multiple destinations. It has its roots in web-service and container environments, but since it relies on standard Linux facilities it can also be used in other domains where Linux is extensively used, such as automotive or IoT.

Automotive DLT

Automotive DLT is a log and trace interface, based on the standardized protocol specified in the AUTOSAR standard 4.0. It is the de facto framework for application logging in Linux-based automotive architectures like GENIVI, Adaptive AUTOSAR or Automotive Grade Linux.

EB solys

EB solys offers functionality for filtering, searching, aggregating and correlating logs and runtime data across different data sources in a single place. It is a customizable framework and construction kit for building tools to identify and localize functional and non-functional defects in complex software projects.

Data Flow

How do Fluent Bit, DLT and EB solys interplay with each other?

With so-called input plugins Fluent Bit can be configured to retrieve valuable Linux system data, such as CPU load, memory consumption, kernel messages, systemd journal messages and much more.

These logs can be redirected from the standard output into DLT, which then consolidates the data received from Fluent Bit with the logs originating from the applications.

In the end, EB solys connects to DLT via TCP/IP to prepare and visualize the collected log data in an appropriate manner.

[Figure: FDS stack]

Download and install

Let’s walk through the build and installation process of the separate tools in order to set up the entire logging stack and get it running.

Fluent Bit

You can either download a stable version of the sources as a tarball from https://fluentbit.io/download or clone the latest sources from GitHub: https://github.com/fluent/fluent-bit.

Either way, once you have downloaded the source code, simply follow the instructions for building and installing Fluent Bit: https://docs.fluentbit.io/manual/installation/build_install

We then have plenty of possibilities to configure Fluent Bit in terms of collecting, filtering and routing data.

In a simple example we want to

  • measure the memory consumption

  • measure the disk I/O activities

  • forward the collected data to stdout

  • and represent the information as JSON structures

Configuration items can either be passed to Fluent Bit as command line parameters or be stored in a separate configuration file. We are going to use the latter option.
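As a side note, the configuration file may optionally begin with a [SERVICE] section that controls global behavior, such as the flush interval and the log verbosity. A minimal sketch with illustrative values (not required for this example):

```
[SERVICE]
    Flush        1
    Log_Level    info
```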

[INPUT]
    Name         mem
    Tag          memory

[INPUT]
    Name         disk
    Tag          disk

[FILTER]
    Name  modify
    Match memory
    Add source memory

[FILTER]
    Name  modify
    Match disk
    Add source disk

[OUTPUT]
    Name     stdout
    Format   json_lines
    Match    *
We don’t go into the details of the configuration facilities, hoping that the configuration of this use-case is simple enough to be self-explanatory. In case you want to dig into the details right now, jump to the Getting Started page and learn more about configuring input plug-ins, parsers, filters and output plug-ins.

Store the configuration in a file named example.conf and start Fluent Bit with:

fluent-bit -c example.conf

You should then see something like this in your console:

{"date":1551184184.000511, "Mem.total":4046276, "Mem.used":1015060, "Mem.free":3031216, "Swap.total":0, "Swap.used":0, "Swap.free":0, "source":"memory"}
{"date":1551184185.000031, "Mem.total":4046276, "Mem.used":1015060, "Mem.free":3031216, "Swap.total":0, "Swap.used":0, "Swap.free":0, "source":"memory"}
{"date":1551184186.000640, "Mem.total":4046276, "Mem.used":1015060, "Mem.free":3031216, "Swap.total":0, "Swap.used":0, "Swap.free":0, "source":"memory"}
{"date":1551184187.000116, "Mem.total":4046276, "Mem.used":1015060, "Mem.free":3031216, "Swap.total":0, "Swap.used":0, "Swap.free":0, "source":"memory"}
{"date":1551184185.000172, "read_size":0, "write_size":0, "source":"disk"}
{"date":1551184186.000824, "read_size":0, "write_size":0, "source":"disk"}
{"date":1551184187.000377, "read_size":0, "write_size":0, "source":"disk"}

Cool, Fluent Bit is up and running!
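Before wiring the stream into DLT, the JSON lines can already be sliced with ordinary shell tools. For instance, watching only the disk records is a simple grep away (a sketch; demonstrated here on one captured sample line rather than the live stream):

```shell
# Filter the live stream down to the disk records:
#   fluent-bit -c example.conf | grep --line-buffered '"source":"disk"'
# The same filter, demonstrated on one captured sample line:
echo '{"date":1551184185.000172, "read_size":0, "write_size":0, "source":"disk"}' \
  | grep '"source":"disk"'
```

The --line-buffered flag keeps grep from batching its output, so the records appear as they arrive.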


We can now deal with the installation of DLT.

First of all, DLT is a standard, a specification, not an implementation. Fortunately, the GENIVI Alliance hosts and maintains an implementation, called GENIVI DLT.

DLT is primarily used for application logging and offers the corresponding APIs for it. But there is also a channel for feeding log data from other sources into DLT. This is done by a tool called dlt-adaptor-stdin.

We can clone the latest sources from here: https://github.com/GENIVI/dlt-daemon and follow the instructions from the Build and Install section.

We need to take care to activate the option WITH_DLT_ADAPTOR when calling cmake, otherwise dlt-adaptor-stdin will not be built:

mkdir build
cd build
cmake -DWITH_DLT_ADAPTOR=ON ..
make
sudo make install

We can now start the dlt-daemon without any further configuration:

dlt-daemon

The dlt-daemon is now awaiting incoming logs, either application logs or logs from other sources provided via dlt-adaptor-stdin.

Redirect Fluent Bit output into DLT input

Now, let’s do the obvious and redirect the output stream of Fluent Bit into the input stream of DLT.

fluent-bit -c example.conf | dlt-adaptor-stdin -a FLBT -c MON

Optionally, we can specify an application ID with '-a' and a context ID with '-c'. This will help us later to find our collected data in the overall DLT logs.

We choose FLBT (for Fluent Bit) as application ID and MON (for Monitoring) as context ID.

Done. Fluent Bit logs are redirected into DLT.

Log Data Analysis with EB solys

Finally, we install EB solys, which allows us to analyze and visualize the logs we collect with Fluent Bit and route through DLT.

EB solys is an Eclipse RCP application and runs under Windows, Linux and macOS. Prerequisites on your host machine are Java 8 and Maven >= 3.3.1.

Clone or download the sources:

git clone https://github.com/Elektrobit/eb-solys

Build the application with Maven:

cd src/com.elektrobit.ebrace.releng.ui.ecl.aggregator
mvn clean verify

Go, grab a coffee. The very first build can take a while.

The product for your platform can then be found here:

cd src/com.elektrobit.ebrace.releng.ui.ecl.product/target/products

Unzip the product and execute the EB solys binary.

EB solys has built-in DLT support that allows us to connect to a running dlt-daemon. After launching EB solys, just add a new connection and give it a name. You also need to enter the IP address of the machine where your dlt-daemon is running, together with its port. The default port is 3490.

Be sure to select DLT daemon as source, because EB solys can alternatively be connected to its own log collector, called EB solys agent.

[Figure: DLT connection]

When we click connect, we can see all our DLT traces, including the memory and disk I/O traces supplied by Fluent Bit, easily recognizable by the application ID (FLBT) and context ID (MON).

[Figure: EB solys DLT 1]

Our logs collected by Fluent Bit are now shown textually as JSON strings in a table, together with all other application logs. However, this representation is not well suited for further analysis.
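The raw numbers can of course also be pulled out of the JSON lines ad hoc with standard tools, for example extracting the Mem.used series for a quick look (a sketch using grep and sed; demonstrated here on one captured sample line rather than the live stream):

```shell
# Extract the Mem.used value from each memory record of the live stream:
#   fluent-bit -c example.conf | grep --line-buffered '"source":"memory"' \
#     | sed -u 's/.*"Mem.used":\([0-9]*\).*/\1/'
# The same extraction, demonstrated on one captured sample line:
echo '{"date":1551184184.000511, "Mem.total":4046276, "Mem.used":1015060, "source":"memory"}' \
  | sed 's/.*"Mem.used":\([0-9]*\).*/\1/'
# prints: 1015060
```

This works for a quick inspection, but it is exactly the kind of manual plumbing a dedicated analysis tool should take off your hands.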

This is where EB solys can contribute substantially. EB solys provides different hooks that allow post-processing of the collected runtime data for the purpose of aggregation, correlation and visualization.

A more natural way of representing memory and disk activity data would definitely be showing them in a line chart, like this:

[Figure: Mem.used line chart]

Explaining how to achieve this with the EB solys built-in capabilities would go beyond the scope of this blog post.

But that is a good reason to follow up with another post. Keep watching, I will continue on that.