Structured logging refers to writing log messages in a structured, machine-readable format, typically as key-value pairs, XML, or JSON. Structured logs are easy to parse, search, and analyze with tools such as Elasticsearch, Splunk, or Kibana. This improves observability and streamlines log analysis in modern large-scale and distributed applications.
1. Built-in Support for Structured Logging
Since version 3.4, Spring Boot has native support for structured logging, enabled with simple properties in the application.properties file:
- logging.structured.format.console: prints structured logs to the console
- logging.structured.format.file: writes structured logs to the log file
1.1. Basic Properties
# Supported values: ecs, gelf or logstash (the sample output below uses ecs)
logging.structured.format.console=ecs
logging.structured.format.file=ecs
logging.file.name=${java.io.tmpdir}/app.log
These properties take one of the following values:
- ecs: Elastic Common Schema for structuring logs and event data in the Elastic Stack.
- gelf: Graylog Extended Log Format for centralized logging with Graylog.
- logstash: Logstash format for parsing, transforming, and forwarding logs to various destinations like Elasticsearch or databases.
Once the property is added to the application.properties file, we can check the generated logs:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class StructuredLoggingApplication implements CommandLineRunner {

  static final Logger log = LoggerFactory.getLogger(StructuredLoggingApplication.class);

  public static void main(String[] args) {
    SpringApplication.run(StructuredLoggingApplication.class, args);
  }

  @Override
  public void run(String... args) throws Exception {
    log.trace("Trace log");
    log.debug("Debug log");
    log.info("Info log");
    log.warn("Hey, This is a warning!");
    log.error("Oops! We have an Error. OK");
  }
}
Verify the generated logs:
{"@timestamp":"2024-11-18T11:03:47.651968600Z","log.level":"INFO","process.pid":19592,"process.thread.name":"main","service.name":"structured-logging","log.logger":"com.howtodoinjava.demo.StructuredLoggingApplication","message":"Info log","ecs.version":"8.11"}
{"@timestamp":"2024-11-18T11:03:47.652943600Z","log.level":"WARN","process.pid":19592,"process.thread.name":"main","service.name":"structured-logging","log.logger":"com.howtodoinjava.demo.StructuredLoggingApplication","message":"Hey, This is a warning!","ecs.version":"8.11"}
{"@timestamp":"2024-11-18T11:03:47.652943600Z","log.level":"ERROR","process.pid":19592,"process.thread.name":"main","service.name":"structured-logging","log.logger":"com.howtodoinjava.demo.StructuredLoggingApplication","message":"Oops! We have an Error. OK","ecs.version":"8.11"}
1.2. Additional Pre-defined Properties
Apart from enabling the basic structured format, we can use the following built-in properties to add specific contextual data to the logs. These properties are highly useful in multi-node deployments, such as Kubernetes clusters.
In the following code snippet, we have added the properties and sample values.
# ecs properties
logging.structured.ecs.service.name=MyService
logging.structured.ecs.service.version=1
logging.structured.ecs.service.environment=Production
logging.structured.ecs.service.node-name=Primary
# gelf properties
logging.structured.gelf.host=MyService
logging.structured.gelf.service.version=1
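In a multi-node deployment, these values would typically be injected per environment rather than hard-coded. As a minimal sketch, assuming the environment variables below exist (the variable names are illustrative), Spring's property placeholders with defaults can resolve them:
# Resolve service metadata from environment variables (variable names are examples)
logging.structured.ecs.service.name=${SERVICE_NAME:structured-logging}
logging.structured.ecs.service.version=${SERVICE_VERSION:1}
logging.structured.ecs.service.node-name=${HOSTNAME:unknown-node}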
In the case of logstash (or any JSON-based format), we can add, rename, and remove fields based on the logging requirements using the following properties:

| Property | Description |
|---|---|
| logging.structured.json.include, logging.structured.json.exclude | Include or exclude specific paths from the JSON |
| logging.structured.json.rename | Renames a specific member in the JSON |
| logging.structured.json.add | Adds additional members to the JSON |

For example, the following configuration adds, excludes, and renames fields:
logging.structured.format.console=logstash
# logstash log fields
logging.structured.json.add.host=MyService
logging.structured.json.add.version=1
logging.structured.json.exclude=level_value
logging.structured.json.rename.logger_name=logger_class
After adding these properties, we can verify the logs:
{
"@timestamp":"2024-11-18T16:49:22.924893+05:30",
"@version":"1",
"message":"Info log",
"logger_class":"com.howtodoinjava.demo.StructuredLoggingApplication",
"thread_name":"main",
"level":"INFO",
"host":"MyService",
"version":"1"
}
2. Adding MDC or Key-value Pairs
The built-in properties are suitable for adding static information such as the service name and version. But if we have to enrich the log event with information available only at runtime, we can add it to the MDC or as key-value pairs using the fluent logging API. Such information is automatically added to the structured log event.
// 1 - Adding MDC values
MDC.put("MyKey1", "MyValue1");
MDC.put("MyKey2", "MyValue2");
log.info("Info log");
MDC.remove("MyKey1");
MDC.remove("MyKey2");

// 2 - Adding key-value pairs using the fluent API
log.atInfo()
    .setMessage("Info log")
    .addKeyValue("MyKey1", "MyValue1")
    .addKeyValue("MyKey2", "MyValue2")
    .log();
Verify the generated logs to check if the information is added correctly:
{
"@timestamp":"2024-11-18T16:49:22.924893+05:30",
"@version":"1",
"message":"Info log",
...
"MyKey1":"MyValue1",
"MyKey2":"MyValue2"
}
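In a real application, MDC values are typically populated per request rather than inline. The following is a minimal sketch of a servlet filter that puts a request id into the MDC for the duration of each request; the filter name, MDC key, and header name are illustrative, not part of Spring Boot's structured-logging support:
import jakarta.servlet.FilterChain;
import jakarta.servlet.ServletException;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.UUID;
import org.slf4j.MDC;
import org.springframework.stereotype.Component;
import org.springframework.web.filter.OncePerRequestFilter;

@Component
public class RequestIdFilter extends OncePerRequestFilter {

  @Override
  protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response,
      FilterChain filterChain) throws ServletException, IOException {

    // Reuse an incoming correlation id if present, otherwise generate one
    String requestId = request.getHeader("X-Request-Id");
    if (requestId == null || requestId.isBlank()) {
      requestId = UUID.randomUUID().toString();
    }

    MDC.put("requestId", requestId);
    try {
      filterChain.doFilter(request, response);
    } finally {
      // Always clean up so the value does not leak to other requests handled by the same thread
      MDC.remove("requestId");
    }
  }
}
With this in place, every structured log event written while handling the request carries the requestId field.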
3. Custom Log Formats
The default logging properties, even with the basic customizations shown above, may not satisfy the diverse needs of enterprise applications. For example, a logging system may require the logs in a CSV pattern.
Spring Boot's StructuredLogFormatter interface allows formatting a log event into the desired structured log message. The output does not always need to be JSON; it can be CSV, XML, or a plain string.
The following implementation of StructuredLogFormatter returns the log event as a CSV record. It also adds any dynamic information added using either MDC or key-value pairs.
import ch.qos.logback.classic.spi.ILoggingEvent;
import org.springframework.boot.logging.structured.StructuredLogFormatter;

public class CustomLogFormatter implements StructuredLogFormatter<ILoggingEvent> {

  @Override
  public String format(ILoggingEvent event) {

    // Use a StringBuilder for efficient string concatenation
    StringBuilder logBuilder = new StringBuilder();

    // Add basic log details as comma-separated values
    logBuilder.append(event.getTimeStamp())
        .append(",").append(event.getLoggerName())
        .append(",").append(event.getLevel())
        .append(",\"").append(event.getFormattedMessage()).append("\"");

    // Append key-value pairs added through the fluent API
    if (event.getKeyValuePairs() != null) {
      event.getKeyValuePairs().forEach(pair ->
          logBuilder.append(",").append(pair.key).append("=").append(pair.value));
    }

    // Append MDC entries
    if (event.getMDCPropertyMap() != null) {
      event.getMDCPropertyMap().forEach((key, value) ->
          logBuilder.append(",").append(key).append("=").append(value));
    }

    // Add a newline at the end of the log record
    logBuilder.append("\n");
    return logBuilder.toString();
  }
}
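To activate the custom formatter, Spring Boot lets us set the format property to the fully qualified class name of the StructuredLogFormatter implementation instead of a built-in format name (the package below matches this example):
logging.structured.format.console=com.howtodoinjava.demo.CustomLogFormatter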
The output will be a list of CSV records:
1731930400398,com.howtodoinjava.demo.StructuredLoggingApplication,INFO,"Info log"
1731930400398,com.howtodoinjava.demo.StructuredLoggingApplication,WARN,"Hey, This is a warning!"
1731930400398,com.howtodoinjava.demo.StructuredLoggingApplication,ERROR,"Oops! We have an Error. OK"
4. External Log Encoder
If, for some reason, we are not able to use Spring Boot's native support for structured logging, we can implement it using a tool-specific encoder. For example, Logstash-style structured logs can be generated using the net.logstash.logback.encoder.LogstashEncoder class, which is part of the 'logstash-logback-encoder' dependency.
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>8.0</version>
</dependency>
To log information in a structured way, we need to plug the encoder into the application's Logback configuration (logback.xml or logback-spring.xml):
<configuration>
    <appender name="jsonConsoleAppender" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>
    <root level="INFO">
        <appender-ref ref="jsonConsoleAppender"/>
    </root>
</configuration>
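The encoder itself is configurable. For instance, LogstashEncoder supports adding static custom fields to every log event directly in the appender configuration; the field values below are illustrative:
<encoder class="net.logstash.logback.encoder.LogstashEncoder">
    <!-- Static JSON members added to every log event -->
    <customFields>{"appname":"structured-logging","environment":"production"}</customFields>
</encoder>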
You can follow the logstash-logback-encoder GitHub guide for detailed information on fine-tuning and further customizing the fields added and formatted by the encoder.
5. Summary
Structured logging is an essential aspect of enterprise applications deployed into clustered environments. Unlike unstructured logs, such as plain text files, structured logs are easier to parse, analyze, and incorporate into generated reports.
Since version 3.4, Spring Boot provides native support for structured logging in the most common and popular formats. It also provides the tools to add or remove information from log events and to create custom formats.
Happy Learning !!