Using microservices, we have been able to overcome many legacy problems, and they allow us to create stable distributed applications with the desired control over code, team size, maintenance, release cycles, cloud enablement, etc. But microservices have also introduced challenges in other areas, e.g. distributed log management, the ability to view the logs of a full transaction spanning many services, and distributed debugging in general.
The challenge is that microservices are isolated from each other and do not share a common database or log files. As the number of microservices increases and we enable cloud deployment with automated continuous integration tools, some provision for debugging the components when a problem occurs becomes essential.
Thanks to the open source movement, we already have a bundle of tools which can do the magic if used properly together. One such popular set of tools is Elasticsearch, Logstash and Kibana – together referred to as the ELK stack. They are used for searching, analyzing, and visualizing log data in real time.
In this ELK stack tutorial, learn to integrate the ELK stack into a microservices ecosystem.
Table of Contents
1. ELK Stack
2. ELK Configuration
3. Create Microservice
4. Logstash Configuration
5. Kibana Configuration
6. Verify ELK Stack
7. Summary
1. What is ELK Stack
- Elasticsearch is a distributed, JSON-based search and analytics engine designed for horizontal scalability, maximum reliability, and easy management.
- Logstash is a dynamic data collection pipeline with an extensible plugin ecosystem and strong Elasticsearch synergy.
- Kibana provides visualization of the data through a UI.
1.1. ELK Stack Architecture
Logstash processes the application log files based on the filter criteria we set and sends those logs to Elasticsearch. Through Kibana, we view and analyze those logs when required.

2. ELK Stack Configuration
All three tools are JVM-based. Before installing them, please verify that the JDK has been properly configured: a standard JDK 1.8 installation with JAVA_HOME and PATH already set up.
2.1. Elasticsearch
- Download the latest version of Elasticsearch from its download page and unzip it into any folder.
- Run bin\elasticsearch.bat from the command prompt.
- By default, it starts at http://localhost:9200
2.2. Kibana
- Download the latest distribution from the download page and unzip it into any folder.
- Open config/kibana.yml in an editor and set elasticsearch.url to point at your Elasticsearch instance. In our case, as we use the local instance, just uncomment elasticsearch.url: "http://localhost:9200"
- Run bin\kibana.bat from the command prompt.
- Once started successfully, Kibana runs on default port 5601, and the Kibana UI is available at http://localhost:5601
2.3. Logstash
- Download the latest distribution from the download page and unzip it into any folder.
- Create a file logstash.conf as per the configuration instructions. We will come back to this point during the actual demo for the exact configuration.
- Now run bin/logstash -f logstash.conf to start Logstash.
The ELK stack is now up and running. Next, we need to create a few microservices and point Logstash to the API log path.
3. ELK stack example – Create Microservice
3.1. Create Spring Boot Project
Let’s create an application using Spring Boot for faster development time. Follow these steps to start the service.
3.2. Add REST Endpoints
Add one RestController class which will expose a few endpoints like /elk, /elkdemo and /exception. We are going to test a few log statements only, so feel free to add/modify the logs as per your choice.
package com.example.howtodoinjava.elkexamplespringboot;

import java.io.PrintWriter;
import java.io.StringWriter;
import java.util.Date;

import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.core.ParameterizedTypeReference;
import org.springframework.http.HttpMethod;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;

@SpringBootApplication
public class ElkExampleSpringBootApplication {

    public static void main(String[] args) {
        SpringApplication.run(ElkExampleSpringBootApplication.class, args);
    }
}

@RestController
class ELKController {

    private static final Logger LOG = Logger.getLogger(ELKController.class.getName());

    @Autowired
    RestTemplate restTemplate;

    @Bean
    RestTemplate restTemplate() {
        return new RestTemplate();
    }

    @RequestMapping(value = "/elkdemo")
    public String helloWorld() {
        String response = "Hello user ! " + new Date();
        LOG.log(Level.INFO, "/elkdemo - > " + response);
        return response;
    }

    @RequestMapping(value = "/elk")
    public String helloWorld1() {
        String response = restTemplate.exchange("http://localhost:8080/elkdemo", HttpMethod.GET, null,
                new ParameterizedTypeReference<String>() {
                }).getBody();
        LOG.log(Level.INFO, "/elk - > " + response);
        try {
            String exceptionrsp = restTemplate.exchange("http://localhost:8080/exception", HttpMethod.GET, null,
                    new ParameterizedTypeReference<String>() {
                    }).getBody();
            LOG.log(Level.INFO, "/elk trying to print exception - > " + exceptionrsp);
            response = response + " === " + exceptionrsp;
        } catch (Exception e) {
            // The exception should not reach here. Swallowing it is really bad practice :)
            // but it keeps the demo endpoint responsive.
        }
        return response;
    }

    @RequestMapping(value = "/exception")
    public String exception() {
        String rsp = "";
        try {
            int i = 1 / 0; // deliberately throws ArithmeticException
        } catch (Exception e) {
            e.printStackTrace();
            LOG.error(e);

            // Capture the stack trace as a string so it ends up in the log file
            StringWriter sw = new StringWriter();
            PrintWriter pw = new PrintWriter(sw);
            e.printStackTrace(pw);
            String sStackTrace = sw.toString();
            LOG.error("Exception As String :: - > " + sStackTrace);
            rsp = sStackTrace;
        }
        return rsp;
    }
}
3.3. Configure Spring boot Logging
Open application.properties under the resources folder and add the below configuration entries.
logging.file=elk-example.log
spring.application.name=elk-example
Read More: Spring Boot Logging Example
3.4. Verify Microservice Generated Logs
Do a final maven build using mvn clean install, start the application using the command java -jar target\elk-example-spring-boot-0.0.1-SNAPSHOT.jar, and test it by browsing http://localhost:8080/elk.
Don’t be alarmed by the big stack trace on the screen; it is generated intentionally to see how ELK handles exception messages.
Go to the application root directory and verify that the log file, i.e. elk-example.log, has been created. Then visit the endpoints a couple of times and verify that logs are being appended to the log file.
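If you want to sanity-check the log output programmatically, a small helper like the one below can print the tail of the file. This class is a hypothetical convenience, not part of the example project, and the relative path assumes the application was started from the project root:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;

public class LogTail {

    // Returns the last n lines of a log file.
    // Reads the whole file, which is fine for a small demo log.
    static List<String> tail(Path logFile, int n) throws IOException {
        List<String> lines = Files.readAllLines(logFile);
        return lines.subList(Math.max(0, lines.size() - n), lines.size());
    }

    public static void main(String[] args) throws IOException {
        // Path is assumed relative to the application's working directory
        Path log = Paths.get("elk-example.log");
        if (Files.exists(log)) {
            tail(log, 5).forEach(System.out::println);
        } else {
            System.out.println("Log file not created yet - hit an endpoint first.");
        }
    }
}
```

If the file is missing, either no endpoint has been hit yet or the application was started from a different working directory.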
4. Logstash Configuration
We need to create a Logstash configuration file so that it listens to the log file and pushes the log messages to Elasticsearch. Here is the Logstash configuration used in the example; please change the log path as per your setup.
input {
  file {
    type => "java"
    path => "F:/Study/eclipse_workspace_mars/elk-example-spring-boot/elk-example.log"
    codec => multiline {
      pattern => "^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}.*"
      negate => "true"
      what => "previous"
    }
  }
}

filter {
  # If a log line contains a tab character followed by 'at', tag the entry as a stacktrace
  if [message] =~ "\tat" {
    grok {
      match => ["message", "^(\tat)"]
      add_tag => ["stacktrace"]
    }
  }

  grok {
    match => [ "message",
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}) %{LOGLEVEL:level} %{NUMBER:pid} --- \[(?<thread>[A-Za-z0-9-]+)\] [A-Za-z0-9.]*\.(?<class>[A-Za-z0-9#_]+)\s*:\s+(?<logmessage>.*)",
               "message",
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}) %{LOGLEVEL:level} %{NUMBER:pid} --- .+? :\s+(?<logmessage>.*)" ]
  }

  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}

output {
  stdout { codec => rubydebug }

  # Sending properly parsed log events to elasticsearch
  elasticsearch { hosts => ["localhost:9200"] }
}
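The grok patterns above can look opaque. As a rough illustration, here is a simplified Java regex equivalent of the second grok pattern (the real pattern also extracts thread and class names, which are omitted here for brevity; the sample line mimics the default Spring Boot log format):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class GrokSketch {

    // Simplified Java equivalent of the second grok pattern:
    // timestamp, level, pid, and the trailing log message.
    static final Pattern LOG_LINE = Pattern.compile(
        "(?<timestamp>\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2}\\.\\d{3})\\s+"
      + "(?<level>[A-Z]+)\\s+(?<pid>\\d+)\\s+---\\s+.+?:\\s+(?<logmessage>.*)");

    public static void main(String[] args) {
        // An illustrative line in the default Spring Boot log format
        String line = "2017-08-30 10:01:22.123  INFO 12345 --- "
                    + "[nio-8080-exec-1] c.e.ELKController : /elkdemo - > Hello user !";
        Matcher m = LOG_LINE.matcher(line);
        if (m.matches()) {
            System.out.println(m.group("timestamp"));  // 2017-08-30 10:01:22.123
            System.out.println(m.group("level"));      // INFO
            System.out.println(m.group("logmessage")); // /elkdemo - > Hello user !
        }
    }
}
```

The multiline codec complements this: any line that does not start with such a timestamp (e.g. a stack trace line) is folded into the previous event, so a full exception arrives in Elasticsearch as a single log document.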
5. Kibana Configuration
Before viewing the logs in Kibana, we need to configure the index patterns. We can configure logstash-* as the default pattern. We can always change the index name on the Logstash side and configure it in Kibana; for simplicity, we will work with the default configuration.
The index pattern management page will look like below. With this configuration, we point Kibana to the Elasticsearch index(es) of your choice. Logstash creates indices with the name pattern logstash-YYYY.MM.DD.
We can do all this configuration in the Kibana console at http://localhost:5601/app/kibana by going to the Management link in the left panel.
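If the logstash-* pattern does not appear, Elasticsearch has probably not received any events yet. A quick way to verify, e.g. from Kibana's Dev Tools console, is to list the indices with the standard Elasticsearch cat API:

```
GET _cat/indices?v
```

Once Logstash has shipped at least one event, the listing should include a daily index following the logstash-YYYY.MM.DD naming pattern mentioned above.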

6. Verify ELK Stack
Now that all the components are up and running, let’s verify the whole ecosystem.
Go to the application and hit the endpoints a couple of times so that logs get generated, then go to the Kibana console and see that the logs are properly stacked in Kibana, with lots of extra built-in features, e.g. filtering, different graphs, etc.
Here is the view of generated logs in Kibana.


7. ELK Stack Tutorial – Summary
In this ELK example, we learned to configure the ELK stack and saw how we can point our application log files to ELK, and view and analyze the logs in Kibana. I suggest you play with the configurations and share your learnings with us, e.g.:
- Instead of having Logstash listen to our log files, we can use a Logback configuration with a TCP appender to send logs to a remote Logstash instance over TCP.
- We can point Logstash to multiple log files.
- We can use more sophisticated filters in the Logstash configuration file to do more as per our needs.
- We can use a remote ELK cluster to point our log files to, or push logs into; this is essentially required when all the applications are deployed in the cloud.
- Create different index patterns in Logstash.
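As a sketch of the first point: with the logstash-logback-encoder library on the classpath, a Logback configuration like the one below ships JSON-encoded log events straight to Logstash over TCP. The destination host and port are assumptions for illustration and must match a tcp input in your Logstash pipeline:

```xml
<configuration>
  <!-- Ships log events as JSON over TCP; requires the
       net.logstash.logback:logstash-logback-encoder dependency -->
  <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <!-- hypothetical host:port; match it to your Logstash tcp input -->
    <destination>localhost:5000</destination>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>
  <root level="INFO">
    <appender-ref ref="LOGSTASH"/>
  </root>
</configuration>
```

On the Logstash side, the matching input would then be something like input { tcp { port => 5000 codec => json_lines } } instead of the file input used in this example.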
Drop me your questions in comments section.
Happy Learning !!
Chandu
Hi All,
Please provide information about license
Difference between the Elastic License (which includes the full set of free features) vs
the Apache 2.0 license (the open source version) OSS packages for Elasticsearch, Kibana and Logstash.
Heena
Hi,
I am using 7.4.2 versions for ELK and java 8 but on logstash set up i am getting below error:
D:\New folder\logstash-7.4.2\bin>logstash -f logstash.conf
Error: Could not find or load main class folder\logstash-7.4.2\logstash-core\lib\jars\animal-sniffer-annotations-1.14.jar;
logstash.conf has below content:
input { stdin { } }
output {
elasticsearch { hosts => ["localhost:9200"] }
stdout { codec => rubydebug }
}
Can you please help me out?
Thanks
sDesai
do
logstash -f "file-path/name";
Nitin Gadekar
How do i configure logstash output to elasticsearch with secure url https://logstash-server considering I have the crt and key file for tls. [tls]
vikas
hi nice one also please share about how to parse json logs .. and also to stack the entire json in an event.
Suman Dhar
Can you elaborate the logstash.conf file content? It looks complex.
Sachin S Chavan
Below is the error when I run the .conf file with logstash -f logconfig.conf
C:\Users\Priyesh.Chourasia\Desktop\ElkOld\logstash-6.5.2\bin>logstash -f logconfig.conf
Sending Logstash logs to C:/Users/Priyesh.Chourasia/Desktop/ElkOld/logstash-6.5.2/logs which is now configured via log4j2.properties
[2019-03-20T17:07:59,402][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"C:/Users/Priyesh.Chourasia/Desktop/ElkOld/logstash-6.5.2/data/queue"}
[2019-03-20T17:07:59,415][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"C:/Users/Priyesh.Chourasia/Desktop/ElkOld/logstash-6.5.2/data/dead_letter_queue"}
[2019-03-20T17:07:59,515][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-03-20T17:07:59,567][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.5.2"}
[2019-03-20T17:07:59,597][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"16a82f1b-11cd-4380-9928-77ff7321e06a", :path=>"C:/Users/Priyesh.Chourasia/Desktop/ElkOld/logstash-6.5.2/data/uuid"}
[2019-03-20T17:08:00,407][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::NoSuchMethodError", :message=>"org.apache.commons.codec.binary.Hex.encodeHexString([B)Ljava/lang/String;", :backtrace=>["org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:124)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$3$0$initialize.call(AbstractPipelineExt$INVOKER$i$3$0$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodThree.call(JavaMethod.java:1186)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:743)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:983)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuperSplatArgs(IRRuntimeHelpers.java:974)", "org.jruby.ir.targets.InstanceSuperInvokeSite.invoke(InstanceSuperInvokeSite.java:39)", "C_3a_.Users.Priyesh_dot_Chourasia.Desktop.ElkOld.logstash_minus_6_dot_5_dot_2.logstash_minu
…
…
[2019-03-20T17:08:00,462][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<LogStash::Error: Don't know how to handle `Java::JavaLang::NoSuchMethodError` for `PipelineAction::Create`>, :backtrace=>["org/logstash/execution/ConvergeResultExt.java:103:in `create'", "org/logstash/execution/ConvergeResultExt.java:34:in `add'", "C:/Users/Priyesh.Chourasia/Desktop/ElkOld/logstash-6.5.2/logstash-core/lib/logstash/agent.rb:329:in `block in converge_state'"]}
[2019-03-20T17:08:00,550][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
please suggest solution for the above issue.
Lokesh Gupta
Try finding which version common-codec jar file has this method and use that version.
NoSuchMethodError”, :message=>”org.apache.commons.codec.binary.Hex.encodeHexString([B)
Sachin S Chavan
Lokesh, below is the jar in my C:\Users\Priyesh.Chourasia\Desktop\ElkOld\logstash-6.5.2\logstash-core\lib\jars
commons-codec-1.11, which has the below method. I’m confused which version is conflicting; I checked from 1.6 to 1.12 and every jar has the same method below. Could you suggest, please?
public static String encodeHexString(byte[] data)
{
return new String(encodeHex(data));
}
Sachin S Chavan
you are saving the file as logstash.conf and the type is text; it’s not a configuration file. I’m getting an error when I run the .conf file
Raja
Not able to proceed with Step 5. Kibana Configuration.
Not getting/Not able to create the logstash-* index pattern.
Britto
Refer this URL:
https://www.youtube.com/watch?v=O5ou6lBwWYw
to get this worked
neeraj bali
path => "F:/Study/eclipse_workspace_mars/elk-example-spring-boot/elk-example.log"
this path is wrong; replace "/" with "\". it will work
also
add an index, and search for it in Kibana:
# Sending properly parsed log events to elasticsearch
elasticsearch {
  hosts => ["localhost:9200"]
  index => "syslog-%{+YYYY.MM.dd}"
}
Alberto Navarro
I’d suggest you to go further with this using JSON instead of plaintext, so you can get rid of grok (it doesn’t scale). Take a look to these articles: https://looking4q.blogspot.com/2018/09/level-up-logs-and-elk-introduction.html
Raja Praveen Katta
Hi I have downloaded your source code and did what ever you told. Ran spring boot jar, Kibana,Elastic Search and Logstash. In the Kibana, there is no default logstash-*. So I created in dev tools like this.
put logstash?pretty and ran the command , it created index as logstash but not a default.
What should I do next to import logs to Kibana
Sumit Ranjan
Great tutorial. Simple and easy to use.
Thank you very much:)
DEEPAK PANDEY
Here we are giving path => “F:/Study/eclipse_workspace_mars/elk-example-spring-boot/elk-example.log” in logstash config.
For each microservice we have different files for logging then Can we provide different file {} for different microservicde log files ? Please suggest
DDet
Hello,
needs to be