
1. Problem Description
We need to ship log4j logs from several servers (e.g., a Web Server) to a single ELK server, since company server resources are tight (^_^)
2. We need Filebeat
What is Filebeat?
Filebeat is used to ship events: it reads log files on a server and forwards them over a socket to a remote ELK stack.
It can send to Logstash, or directly to Elasticsearch.
3. Here we show how to ship to a remote Logstash, after which Elasticsearch feeds the data to Kibana for display
3.1. First, install Filebeat on your local test machine
Download it here: https://www.elastic.co/downloads/beats/filebeat
3.2. Next, configure your filebeat.yml
filebeat.prospectors:
- input_type: log
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /Users/KG/Documents/logs/t-server/*.log

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["xx.xx.xx.xx:5000"]
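The `paths` entries above are glob patterns. As a rough illustration only (this is not Filebeat's actual matcher), Java's built-in `PathMatcher` with `glob:` syntax behaves the same way for this pattern: `*` matches any file name segment but does not cross directory separators, so only `.log` files directly inside `t-server/` are picked up.

```java
import java.nio.file.FileSystems;
import java.nio.file.PathMatcher;
import java.nio.file.Paths;

public class GlobCheck {
    // Returns true when the given path matches the glob pattern,
    // using the same single-directory-level semantics as '*' in Filebeat's paths.
    static boolean matches(String glob, String path) {
        PathMatcher m = FileSystems.getDefault().getPathMatcher("glob:" + glob);
        return m.matches(Paths.get(path));
    }

    public static void main(String[] args) {
        String glob = "/Users/KG/Documents/logs/t-server/*.log";
        // app.log sits directly under t-server/ and ends in .log, so it matches.
        System.out.println(matches(glob, "/Users/KG/Documents/logs/t-server/app.log"));
        // A file in a subdirectory is NOT matched, because '*' does not cross '/'.
        System.out.println(matches(glob, "/Users/KG/Documents/logs/t-server/sub/app.log"));
    }
}
```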
3.3. Start Filebeat
First fix the config file's ownership:
chown root:root filebeat.yml
Then start it:
sudo ./filebeat -e -c filebeat.yml &
3.4. Configure the remote Logstash and start it
log4j_filebeat.conf
input {
  beats {
    port => 5000
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "localhost:9200"
    index => "t-server-%{+YYYY.MM.dd}"
    document_type => "log4j_type"
    user => your-username
    password => your-password
  }
}
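The `index => "t-server-%{+YYYY.MM.dd}"` setting makes Logstash write each day's events into a fresh index. As a sketch of that date expansion (using `SimpleDateFormat` to approximate Logstash's Joda-style `%{+...}` pattern, not Logstash's own code):

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class IndexName {
    // Expands a prefix plus a yyyy.MM.dd date suffix, mirroring what
    // Logstash produces for index => "t-server-%{+YYYY.MM.dd}".
    static String indexFor(String prefix, Date day) {
        return prefix + "-" + new SimpleDateFormat("yyyy.MM.dd").format(day);
    }

    public static void main(String[] args) {
        // For today's date this prints something like "t-server-2017.03.27".
        System.out.println(indexFor("t-server", new Date()));
    }
}
```

Because the index name changes daily, Kibana index patterns are usually defined with a wildcard such as `t-server-*`.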
Start it:
./bin/logstash -f config/log4j_filebeat.conf &
3.5. Java client logging configuration and program
log4j.properties
### Root logger ###
log4j.rootLogger = debug,stdout,D

### Console output ###
log4j.appender.stdout = org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target = System.out
log4j.appender.stdout.layout = org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern = [%-5p] %d{yyyy-MM-dd HH:mm:ss,SSS} method:%l%n%m%n

### Write DEBUG and above to /Users/KG/Documents/logs/t-server/app.log ###
log4j.appender.D = org.apache.log4j.DailyRollingFileAppender
log4j.appender.D.File = /Users/KG/Documents/logs/t-server/app.log
log4j.appender.D.Append = true
log4j.appender.D.Threshold = DEBUG
log4j.appender.D.layout = org.apache.log4j.PatternLayout
log4j.appender.D.layout.ConversionPattern = %-d{yyyy-MM-dd HH:mm:ss} [ %t:%r ] - [ %p ] %m%n
Java API
package org.genesis.arena.elk;

import org.apache.log4j.Logger;

/**
 * Created by KG on 17/3/27.
 */
public class ElkLog4jTest {
    private static final Logger logger = Logger.getLogger(ElkLog4jTest.class);

    public static void main(String[] args) throws Exception {
        logger.debug("Latest log entry!!");
    }
}
In Logstash you should see output like the following:
And in Kibana the result looks like this:
Similarly, we can start another Logstash background process on a different port.
The Logstash config file:
log4j_filebeat2.conf
input {
  beats {
    port => 5001
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "localhost:9200"
    index => "t-yxc-finance-%{+YYYY.MM.dd}"
    document_type => "log4j_type"
    user => your-username
    password => your-password
  }
}
Start it:
./bin/logstash -f config/log4j_filebeat2.conf &
filebeat.yml
filebeat.prospectors:
# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector specific configurations.
- input_type: log
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /Users/KG/Documents/logs/t-yxc-finance/*.log

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["xx.xx.xx.xx:5001"]
Client configuration file and code:
### Root logger ###
log4j.rootLogger = debug,stdout,D

### Console output ###
log4j.appender.stdout = org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target = System.out
log4j.appender.stdout.layout = org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern = [%-5p] %d{yyyy-MM-dd HH:mm:ss,SSS} method:%l%n%m%n

### Write DEBUG and above to /Users/KG/Documents/logs/t-yxc-finance/app.log ###
log4j.appender.D = org.apache.log4j.DailyRollingFileAppender
log4j.appender.D.File = /Users/KG/Documents/logs/t-yxc-finance/app.log
log4j.appender.D.Append = true
log4j.appender.D.Threshold = DEBUG
log4j.appender.D.layout = org.apache.log4j.PatternLayout
log4j.appender.D.layout.ConversionPattern = %-d{yyyy-MM-dd HH:mm:ss} [ %t:%r ] - [ %p ] %m%n
package org.genesis.arena.elk;

import org.apache.log4j.Logger;

/**
 * Created by KG on 17/3/27.
 */
public class ElkLog4jTest {
    private static final Logger logger = Logger.getLogger(ElkLog4jTest.class);

    public static void main(String[] args) throws Exception {
        logger.debug("Latest log from the other server!!");
    }
}

The output looks like this: