Tuesday, 23 August 2016

Set up a simple ELK on AWS in under 10 minutes!

So here I am... back at it again after a good long break! For this tutorial, I'm going to show you how quick and simple it is to set up a basic working ELK stack on AWS!

ELK? Not heard of it? Well, if you haven't heard of it or tried it out yet, I suggest you do! ELK is actually an acronym for ElasticSearch, Logstash and Kibana. Fancy names, but trust me, together they form a really powerful log analysis stack.
  • ElasticSearch: Built on top of Apache Lucene, ElasticSearch is the engine behind ELK that performs real-time data extraction and analysis on structured as well as unstructured data. To know more, read HERE
  • Logstash: Logstash is a tool that can ingest logs, process them and forward them to another system such as ElasticSearch. Logstash comes with a huge supply of inputs, filters, codecs and outputs that can be used to consume virtually any type of log, from web server logs to syslogs to error logs. To know more, read HERE
  • Kibana: Kibana is a visualization tool that can be used to spot trends and patterns, and to read and interpret your log data. To know more, read HERE (a small end-to-end example of how the three fit together follows this list)
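
To make the trio concrete, here is a rough sketch of what a single syslog line ends up looking like once Logstash has read it and shipped it to ElasticSearch as a JSON document, ready for Kibana to query and chart. The field names are illustrative of what Logstash's file input typically produces, not an exact dump from this setup:

{
    "message"    : "Aug 23 10:15:01 ip-172-31-20-5 CRON[1234]: (root) CMD (run-parts /etc/cron.hourly)",
    "@version"   : "1",
    "@timestamp" : "2016-08-23T10:15:01.000Z",
    "type"       : "syslog",
    "host"       : "ip-172-31-20-5",
    "path"       : "/var/log/syslog"
}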

In this tutorial, I'll be walking you through the simple steps to get started with a single-node ELK setup on AWS infrastructure.

Source: https://www.cartoonstock.com/cartoonview.asp?catref=jhen301

To get started, you will need an AWS account. You can sign up for AWS and get one year of services for free* (T&Cs apply, of course!)

For this particular tutorial, I'm using an Ubuntu AMI and running this setup on a t2.medium instance (CPU: 2 -- RAM: 4 GB -- HDD: 20 GB).
Make sure the instance that you launch has a Public IP attached to it. Log in to the instance using the username "ubuntu" and your key pair file (PEM/PPK) for authentication.
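
If you are connecting from a Linux or Mac terminal rather than PuTTY, a typical login looks like the following (the key file name and the IP address are placeholders for your own values):

# ssh -i my-elk-key.pem ubuntu@<PUBLIC_IP>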
Once you are in, update the packages using the following command:
# sudo apt-get update

Next, install Java. You can either go for the Oracle-supplied Java or the OpenJDK version.
# sudo apt-get install openjdk-7-jre-headless

Once installed, check the java version using the following command:
# java -version
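
The exact build number will vary with the patch level of the package, but the output should look roughly like this:

java version "1.7.0_xx"
OpenJDK Runtime Environment (IcedTea ...)
OpenJDK 64-Bit Server VM (build ..., mixed mode)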

With the basics out of the way, we now go ahead and install ElasticSearch first. To do so, type in the following command to import ElasticSearch's public GPG key:
# wget -qO - https://packages.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add - 

Next, we add the ElasticSearch repository to the sources list:

# echo "deb http://packages.elastic.co/elasticsearch/1.7/debian stable main" | sudo tee -a /etc/apt/sources.list.d/elasticsearch.list

NOTE: You can replace the 1.7 in the repository URL with 2.x to obtain the latest version of ElasticSearch.

Finally, run the update command again:
# sudo apt-get update

Next, install the ElasticSearch package using the following command:
# sudo apt-get install elasticsearch

With the installation complete, start the service using the following command:
# sudo service elasticsearch restart

Test the install by querying localhost on port 9200. You should see output similar to the one shown below:
# curl localhost:9200
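
If ElasticSearch is up, you will get back a small JSON document roughly like the one below (the node name is randomly generated and the version number will match whichever release you installed):

{
    "status" : 200,
    "name" : "Some Random Node Name",
    "cluster_name" : "elasticsearch",
    "version" : {
        "number" : "1.7.x",
        ...
    },
    "tagline" : "You Know, for Search"
}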

Enable ElasticSearch to start on boot using the following command:

# sudo update-rc.d elasticsearch defaults 95 10

We now move to Logstash. Installing Logstash is a simple and straightforward process as well. First, we add the Logstash repository to the sources list:
# echo "deb http://packages.elastic.co/logstash/1.5/debian stable main" | sudo tee -a /etc/apt/sources.list

NOTE: You can substitute 1.5 with 2.1 to install the latest Logstash packages.

Make sure you update the packages on the system using the update command:
# sudo apt-get update

Install the Logstash package:
# sudo apt-get install logstash

Enable Logstash to start on boot using the following command:
# sudo update-rc.d logstash defaults 97 8

For this tutorial, we will use Logstash to forward the instance's syslogs to ELK. 

The Logstash configuration file defines our Logstash pipeline.

Here's a snippet of the configuration pipeline:

input {
}
filter {
}
output {
}

NOTE: The filter section is optional.
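
For example, if you wanted Logstash to break each syslog line into structured fields (program name, PID, the actual message and so on) instead of storing it as one long string, a filter block along the following lines could be added to the same file. This is entirely optional for this tutorial; it uses the grok and date plugins that ship with Logstash, and the pattern shown is a commonly used one for standard syslog lines:

filter {
    if [type] == "syslog" {
        grok {
            match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
        }
        date {
            match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
        }
    }
}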

# sudo vi /etc/logstash/conf.d/10-syslog.conf

input {
    file {
        type => "syslog"
        path => [ "/var/log/messages", "/var/log/*.log" ]
    }
}
output {
    stdout {
        codec => rubydebug
    }
    elasticsearch {
        host => "<PRIVATE_IP>" # Provide the Private IP address of your ELK instance here
    }
}
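
Before restarting the service, it is worth sanity-checking the file. Assuming the apt package installed Logstash under /opt/logstash (its default location), the following should confirm that the configuration is valid; the exact invocation can differ slightly between Logstash versions. Also note that if you opted for Logstash 2.x earlier, the elasticsearch output expects a hosts array (for example hosts => ["<PRIVATE_IP>"]) instead of host:

# /opt/logstash/bin/logstash agent --configtest -f /etc/logstash/conf.d/10-syslog.conf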

Restart the Logstash service for the configurations to take effect:
# sudo service logstash restart 
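
Once Logstash has restarted and shipped a few events, you can confirm that they are actually landing in ElasticSearch by listing its indices on the ELK instance. A daily index named something like logstash-2016.08.23 should show up:

# curl 'localhost:9200/_cat/indices?v'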

We now move on to the final part of the ELK setup, and that is the installation and configuration of Kibana. To do so, we first download the Kibana package using the following command:
# wget https://download.elastic.co/kibana/kibana/kibana-4.1.1-linux-x64.tar.gz

You will also need to download Kibana's init script. This makes starting and stopping Kibana as a service much easier:

# wget https://raw.githubusercontent.com/akabdog/scripts/master/kibana4_init -O kibana4

Extract Kibana's contents to a folder using the following commands:
# sudo tar -xzf kibana-4.1.1-linux-x64.tar.gz

# sudo mkdir -p /opt/kibana

# sudo mv kibana-4.1.1-linux-x64/* /opt/kibana
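
Kibana also needs to know where ElasticSearch is listening. For a single-node setup like this one, the default of http://localhost:9200 is already correct, so no change is needed. If you ever split Kibana and ElasticSearch across instances though, the setting to edit (called elasticsearch_url in the Kibana 4.1 config file) lives in Kibana's YAML config:

# sudo vi /opt/kibana/config/kibana.yml

elasticsearch_url: "http://localhost:9200"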

Finally, move Kibana's service file into place and enable it to start on boot using the following commands:
# sudo mv kibana4 /etc/init.d/

# sudo chmod +x /etc/init.d/kibana4

# sudo update-rc.d kibana4 defaults 96 9

# sudo service kibana4 start
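
Before switching to the browser, you can quickly check that the service came up and is listening on its port. The init script we downloaded should support the usual status action, and any HTTP response from the second command means the UI is being served:

# sudo service kibana4 status

# curl -I localhost:5601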

Kibana will start on port 5601. To test the install, point your browser to your ELK instance's public IP address on port 5601. You should see Kibana's initial configuration screen as shown below:

Here, you need to configure the default index pattern that Kibana will query in ElasticSearch. Select the "Index contains time-based events" checkbox as shown, type "logstash-*" into the Index name or pattern field, and set the Time-field name value to "@timestamp".
Click the "Create" button to complete the setup.

You should see logs from your ELK instance popping up in Kibana's "Discover" tab as shown below:

There you have it! A simple ELK stack on a single EC2 instance!
Coming up next: configuring ELK clients and setting up ELK at production scale, so stay tuned for much more!!