How to build an ELK log system based on Docker

Background requirements:

As the business grows, the number of servers increases, and with it the volume of access logs, application logs, and error logs. Operations staff can no longer manage the logs effectively, developers have to log in to servers to troubleshoot problems, and whenever someone needs data from the logs, operations has to analyze them on the server by hand.

ELK Introduction:

ELK is the abbreviation of three pieces of open source software: Elasticsearch, Logstash, and Kibana. With the addition of the Beats tools, the original ELK Stack has been renamed the Elastic Stack. Beats are lightweight log collection agents that consume few resources and are well suited to collecting logs on servers and shipping them to Logstash; the official documentation also recommends them.

Building steps:

After the above description, you should have a rough idea that building this platform takes at least three of the four components (Filebeat is optional).

  • Kibana for display
  • Elasticsearch for storage and retrieval
  • Logstash for filtering
  • Filebeat for collecting logs

This article assumes that you already have a Docker environment and have basic experience using Docker.

Pull the images (keeping all versions aligned avoids many pitfalls):

docker pull kibana:6.8.2
docker pull elasticsearch:6.8.2
docker pull mobz/elasticsearch-head:5 # a plugin for es
docker pull logstash:6.8.2
docker pull docker.elastic.co/beats/filebeat:6.8.2 # If it is very slow, you can try to set up a proxy or change the docker image source

To set up a Docker proxy on Linux, you can subscribe to a proxy with a tool such as ClashX and then point Docker at it. On Docker Desktop you can change this directly in the settings; on a headless host, configure it via the Docker daemon's configuration file.

Also give Docker more memory to avoid lags.

Build ES:

docker run -d -p 9200:9200 -p 9300:9300 --name elasticsearch -e "discovery.type=single-node" elasticsearch:6.8.2

Access localhost:9200 in a browser (curl works too). If the following result appears, the startup was successful:

[screenshot: JSON response from Elasticsearch showing version 6.8.2]

If an error occurs, check the error log.
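For reference, the root endpoint returns JSON roughly shaped like the following (abridged; the node name and UUIDs on your machine will differ):

```json
{
  "name" : "node-1",
  "cluster_name" : "docker-cluster",
  "version" : {
    "number" : "6.8.2",
    "lucene_version" : "7.7.0"
  },
  "tagline" : "You Know, for Search"
}
```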

Create ES-HEAD:

docker run -d -p 9100:9100 docker.io/mobz/elasticsearch-head:5

Then visit localhost:9100 and you will see the following:

[screenshot: the es-head web interface]

You may find that es-head cannot connect to es, because es does not have cross-origin requests (CORS) enabled. Enter the es container, find elasticsearch.yml in the config directory, and add the following to it:

http.cors.enabled: true
http.cors.allow-origin: "*"

Then restart es and es-head can connect to es.

Open Kibana:

docker run -d -p 5601:5601 --link elasticsearch -e ELASTICSEARCH_URL=http://elasticsearch:9200 kibana:6.8.2

Keep http://elasticsearch:9200 as it is. The --link flag above adds the elasticsearch container's IP address to the kibana container's hosts file, so the es service can be reached directly by that name.
After the container is started successfully, you should be able to see the following information in es-head:

[screenshot: es-head showing the .kibana index]

Visiting localhost:5601 will result in the following:

[screenshot: the Kibana welcome page]

So far, kibana is running successfully and es is running as well. The next step is the log collection services.

Build filebeat and logstash:

First of all, these two components need some configuration files, which we will store in the same folder:

mkdir elktest # in the home directory, i.e. the ~/elktest path
cd elktest
touch filebeat.yml
touch logstash.conf

Write the configuration files:

vim filebeat.yml
filebeat.prospectors:
- paths:
    - /home/elk/logs/user/a.log
  multiline:
      pattern: ^\d{4}
      negate: true
      match: after
  fields:
    doc_type: user
- paths:
    - /home/elk/logs/service/a.log
  multiline:
      pattern: ^\d{4}
      negate: true
      match: after
  fields:
    doc_type: service
output.logstash: # output address
  hosts: ["logstash:5044"]
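The multiline settings above merge continuation lines (such as stack traces) into the event that started them: lines that do NOT begin with four digits (pattern ^\d{4} with negate: true) are appended after the previous line (match: after). A rough Python sketch of that grouping rule, for illustration only:

```python
import re

PATTERN = re.compile(r"^\d{4}")  # same pattern as in filebeat.yml

def group_multiline(lines):
    """Mimic Filebeat's multiline merging: non-matching lines
    (negate: true) are appended to the previous event (match: after)."""
    events = []
    for line in lines:
        if PATTERN.match(line) or not events:
            events.append(line)           # a new event starts with 4 digits
        else:
            events[-1] += "\n" + line     # continuation line, e.g. a traceback
    return events

log = [
    "2019-08-01 12:00:00 ERROR something broke",
    "Traceback (most recent call last):",
    '  File "app.py", line 1',
    "2019-08-01 12:00:01 INFO recovered",
]
print(len(group_multiline(log)))  # → 2
```

So the error line and its two-line traceback arrive in Elasticsearch as a single event instead of three.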
vim logstash.conf
input {
  beats {
    port => "5044"
  }
}
filter {
  json {
    source => "message"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
    index => "%{[fields][doc_type]}-%{+YYYY.MM.dd}"
  }
}
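The index template %{[fields][doc_type]}-%{+YYYY.MM.dd} produces one index per doc_type per day, which is what you will later see in es-head. A small sketch of how the names resolve (illustrative only, not Logstash code):

```python
from datetime import date

def index_name(doc_type, day):
    # mirrors "%{[fields][doc_type]}-%{+YYYY.MM.dd}" from logstash.conf
    return "{}-{:%Y.%m.%d}".format(doc_type, day)

print(index_name("user", date(2019, 8, 1)))     # → user-2019.08.01
print(index_name("service", date(2019, 8, 1)))  # → service-2019.08.01
```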

You should still be in the elktest directory. Since we don't have a real log source, this tutorial creates the logs by hand. Create a log folder:

mkdir -p logdir/user logdir/service # subdirectories must match the volume mounts below

Write some log lines yourself, whatever you like, but don't change the file or folder names, as they are already mapped in the configuration files. When you're done, your directory structure should look like this:

elktest/
├── filebeat.yml
├── logstash.conf
└── logdir/
    ├── user/a.log
    └── service/a.log
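Each entry should start with a four-digit year so that the multiline pattern in filebeat.yml treats it as a new event. A hypothetical script that seeds both files (paths match the volume mounts used in this tutorial):

```python
import os

# seed sample logs for both doc_types; run from inside ~/elktest
for sub in ("user", "service"):
    d = os.path.join("logdir", sub)
    os.makedirs(d, exist_ok=True)
    with open(os.path.join(d, "a.log"), "w") as f:
        f.write("2019-08-01 12:00:00 INFO hello from {}\n".format(sub))
        f.write("2019-08-01 12:00:01 ERROR oops in {}\n".format(sub))
```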

Create a container:

docker run -it --name logstash --link elasticsearch -d -p 5044:5044 -v ~/elktest/logstash.conf:/usr/share/logstash/pipeline/logstash.conf logstash:6.8.2

docker run --name filebeat --link logstash -d -v ~/elktest/filebeat.yml:/usr/share/filebeat/filebeat.yml -v ~/elktest/logdir/user/:/home/elk/logs/user/ -v ~/elktest/logdir/service/:/home/elk/logs/service/ docker.elastic.co/beats/filebeat:6.8.2
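As an alternative to running the four containers individually, the whole stack could be described in one docker-compose.yml. This is an untested sketch based on the commands above, not part of the original tutorial; Compose's default network gives the same name resolution that --link provides:

```yaml
version: "2"
services:
  elasticsearch:
    image: elasticsearch:6.8.2
    environment:
      - discovery.type=single-node
    ports: ["9200:9200", "9300:9300"]
  kibana:
    image: kibana:6.8.2
    environment:
      - ELASTICSEARCH_URL=http://elasticsearch:9200
    ports: ["5601:5601"]
    depends_on: [elasticsearch]
  logstash:
    image: logstash:6.8.2
    volumes:
      - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf
    ports: ["5044:5044"]
    depends_on: [elasticsearch]
  filebeat:
    image: docker.elastic.co/beats/filebeat:6.8.2
    volumes:
      - ./filebeat.yml:/usr/share/filebeat/filebeat.yml
      - ./logdir/user/:/home/elk/logs/user/
      - ./logdir/service/:/home/elk/logs/service/
    depends_on: [logstash]
```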

At this point your es-head should look like this:

[screenshot: es-head showing the user-* and service-* indices]

If this is not the case, check whether the containers are running and whether they logged any errors. You can also ping elasticsearch from inside the logstash container and ping logstash from inside the filebeat container to verify that the network links between them work.

Display in Kibana. Create an index pattern:

[screenshot: Kibana index pattern creation page]

After creating it, open it and add the fields you want to view. The content is in the message field (which is exactly the content of a.log):

[screenshot: Kibana Discover view showing the message field]

So far, our ELK stack has been built. Friends who got it working can treat themselves to a chicken drumstick at dinner!

This concludes the detailed walkthrough of building an ELK log system based on Docker. For more information about building an ELK log system with Docker, see the other related articles on 123WORDPRESS.COM!

