How to set up Logstash internal logging in Docker

Today I’d like to show you how to make the Logstash Docker container write its operational logs to a file inside the container. I’m writing about this specifically because the official Logstash documentation is a bit vague, and unless you know how logging works in Java (the language the ELK stack is written in) with the third-party library Log4j2, you might struggle with this issue like I did.

Why would you need to log the operation of Logstash?

Some might ask: why would I need logging for Logstash? It’s a tool I use to ship my other logs into Elasticsearch, right? Logging is useful for many reasons. In my case, I wanted to be able to easily debug possible problems, and logs would let me reproduce issues with little effort, using the input data from the pipeline. Since I am using a custom Logstash plugin that I wrote in Ruby, another reason for logging was to verify that no data gets lost while being processed by Logstash, by writing the output to the log. In other cases, you may use logging to monitor Logstash’s availability and liveness, to detect and prevent security issues, to analyze data flow, and more.
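As an illustration of that last use case, here is a minimal sketch of how a custom Ruby filter plugin can write the events it processes to the Logstash log. The class name, plugin name and message are hypothetical (not my actual plugin); the logger itself is the one every Logstash plugin inherits from its base class, and whatever you send to it ends up in the appenders configured in log4j2.properties:

```ruby
# Sketch of a hypothetical custom filter plugin; the names are illustrative.
require "logstash/filters/base"

class LogStash::Filters::MyAudit < LogStash::Filters::Base
  config_name "my_audit"

  def filter(event)
    # Write the processed event to the Logstash log so it can later be
    # inspected and used to reproduce problems with the original input data.
    logger.info("processed event", :event => event.to_hash)
    filter_matched(event)
  end
end
```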

Okay, okay, give us the solution…

No more small talk. To begin with, let’s look at the official documentation at this page. It mentions two ways of configuring logging: through the Logstash API and through the Log4j2 configuration. We are interested in the latter, since any configuration made through the API is not persistent and must be repeated every time Logstash is started or restarted (including after crashes, outages and updates, which all require a restart).

To set this up, we need to change the log4j2.properties file, found under /usr/share/logstash/config. Below I will show the original contents of this file and the modifications that I made for my logging purposes. You may adjust this configuration based on what you want to log, using the official Log4j2 documentation at this page.

# Original configuration
status = error
name = LogstashPropertiesConfig

appender.console.type = Console
appender.console.name = plain_console
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = [%d{ISO8601}][%-5p][%-25c]%notEmpty{[%X{pipeline.id}]} %m%n

appender.json_console.type = Console
appender.json_console.name = json_console
appender.json_console.layout.type = JSONLayout
appender.json_console.layout.compact = true
appender.json_console.layout.eventEol = true

rootLogger.level = ${sys:ls.log.level}
rootLogger.appenderRef.console.ref = ${sys:ls.log.format}_console
# Modified configuration
status = error
name = LogstashPropertiesConfig

appender.console.type = Console
appender.console.name = plain_console
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = [%d{ISO8601}][%-5p][%-25c]%notEmpty{[%X{pipeline.id}]} %m%n

appender.json_console.type = Console
appender.json_console.name = json_console
appender.json_console.layout.type = JSONLayout
appender.json_console.layout.compact = true
appender.json_console.layout.eventEol = true

appender.rolling.type = RollingFile
appender.rolling.name = plain_rolling
appender.rolling.fileName = ${sys:ls.logs}/logstash-${sys:ls.log.format}.log
appender.rolling.filePattern = ${sys:ls.logs}/logstash-${sys:ls.log.format}-%d{yyyy-MM-dd}.log
appender.rolling.policies.type = Policies
appender.rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.rolling.policies.time.interval = 1
appender.rolling.policies.time.modulate = true
appender.rolling.layout.type = PatternLayout
appender.rolling.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %-.10000m%n

appender.json_rolling.type = RollingFile
appender.json_rolling.name = json_rolling
appender.json_rolling.fileName = ${sys:ls.logs}/logstash-${sys:ls.log.format}.log
appender.json_rolling.filePattern = ${sys:ls.logs}/logstash-${sys:ls.log.format}-%d{yyyy-MM-dd}.log
appender.json_rolling.policies.type = Policies
appender.json_rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.json_rolling.policies.time.interval = 1
appender.json_rolling.policies.time.modulate = true
appender.json_rolling.layout.type = JSONLayout
appender.json_rolling.layout.compact = true
appender.json_rolling.layout.eventEol = true

rootLogger.level = ${sys:ls.log.level}
rootLogger.appenderRef.console.ref = ${sys:ls.log.format}_console
rootLogger.appenderRef.rolling.ref = ${sys:ls.log.format}_rolling
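To make these changes survive container recreation, you can bind-mount the modified properties file into the container (and, if you want the log files on the host, the logs folder too). A minimal docker-compose sketch, assuming this file layout; the image tag and host-side paths are assumptions you should adapt to your setup:

```yaml
# Sketch only: image tag and host paths are illustrative.
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:8.13.0
    volumes:
      # Override the default log4j2.properties with the modified one.
      - ./log4j2.properties:/usr/share/logstash/config/log4j2.properties:ro
      # Expose the log files on the host for inspection.
      - ./logs:/usr/share/logstash/logs
```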

After making these changes, you can select the log format and the log path in your logstash.yml file, located in the same folder as the properties file.

http.host: '0.0.0.0'
xpack.monitoring.elasticsearch.hosts:
[
'<your-elastic-search-server-address-1>:9200',
'<your-elastic-search-server-address-2>:9200',
]
path.logs: '/usr/share/logstash/logs'
log.format: 'json'

I’ve set the log folder to be under the Logstash home folder, to avoid any write-permission problems.
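To check that the rolling appender is actually working, you can list the logs folder inside the running container; the container name here is a placeholder, so substitute your own:

```
docker exec <container-name> ls /usr/share/logstash/logs
docker exec <container-name> tail -n 5 /usr/share/logstash/logs/logstash-json.log
```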

Wait, so what did we just do?

By default, the Logstash Docker container is configured to output logs only to the console (terminal). In other words, even if you configure a logs folder path and a log format, you will not get any files. This may be obvious to some people, but I struggled for a long time until I found that I also had to define a file appender in log4j2.properties and register it on the rootLogger to get output in my logs folder. For those of you who were in the same boat, I hope this article helped you save some time. Cheers and good luck!
