Getting Logstash to automatically detect and read changes to log files

For the past week I have been working on a centralized logging solution with the ELK Stack.

The environment I am currently running is basically four containers: Elasticsearch, Logstash, Kibana and Logspout. I am using the file input with a volume shared with my local machine to collect the logs for Logstash.

The Logstash configuration file I am currently using looks like this (logstash.sample.conf):

input {
  file {
    path => "/share-logs/lumen.log"
    stat_interval => 1
    type => applumen
    #sincedb_path => "/share-logs/lumen.log.sincedb"
    sincedb_path => "/dev/null"
  }
  file {
    path => "/share-logs/laravel.log"
    type => applaravel
    #sincedb_path => "/share-logs/laravel.log.sincedb"
    stat_interval => 1
    sincedb_path => "/dev/null"
  }
  file {
    path => "/share-logs/server.log"
    type => appjava
    #sincedb_path => "/share-logs/server.log.sincedb"
    sincedb_path => "/dev/null"
  }
  file {
    path => "/share-logs/access.log"
    type => apacheaccess
    #sincedb_path => "/share-logs/access.log.sincedb"
    stat_interval => 1
    sincedb_path => "/dev/null"
  }
  file {
    path => "/share-logs/error.log"
    type => apacheerror
    #sincedb_path => "/share-logs/error.log.sincedb"
    stat_interval => 1
    sincedb_path => "/dev/null"
  }
  tcp {
    port => 5000
    type => syslog
  }
  udp {
    port => 5000
    type => syslog
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "(?:%{TIMESTAMP_ISO8601:timestamp}|-) +(?:%{HOSTNAME:containerId}|-) +(?:%{NOTSPACE:containerName}|-) +(?:%{WORD:processId}|-) %{GREEDYDATA:logMsg}" }
      #Still needs to parse with multi-line features
    }
  }
  if [type] == "applumen" {
    grok {
      match => { "message" => "%{SYSLOG5424SD:timestamp} %{WORD:application}.%{WORD:logType}:%{GREEDYDATA:logMsg}" }
    }
    multiline {
      pattern => "(Stack trace:)|(^#.+)|(^\"\")|( thrown+)|(^\s)"
      what => "previous"
    }
  }
  if [type] == "applaravel" {
    grok {
      match => { "message" => "%{SYSLOG5424SD:timestamp} %{WORD:application}.%{WORD:logType}:%{GREEDYDATA:logMsg}" }
    }
    multiline {
      pattern => "(Stack trace:)|(^#.+)|(^\"\")|( thrown+)|(^\s)|(Next )|(^\n)"
      what => "previous"
    }
    multiline {
      pattern => "(^\n)"
      negate => "true"
      what => "previous"
    }
  }
  if [type] == "appjava" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:logType} +%{NOTSPACE:service} %{GREEDYDATA:logMsg}" }
    }
    multiline {
      pattern => "(default task-)|( at)|(user.)"
      what => "previous"
    }
    multiline {
      pattern => "(^\n)"
      negate => "true"
      what => "previous"
    }
  }
  if [type] == "apacheaccess" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG:logMsg}" }
    }
  }
  if [type] == "apacheerror" {
    grok {
      match => { "message" => "%{GREEDYDATA:logMsg}" }
    }
  }
}

output {
  elasticsearch {
    host => "elasticsearch"
  }
  stdout {
    codec => rubydebug
  }
}
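One detail worth calling out in that configuration: sincedb_path is how the file input remembers how far into each file it has read, and pointing it at /dev/null discards that position whenever Logstash restarts. A minimal input block to illustrate the idea; the path is hypothetical and the start_position line is my own addition, not part of the configuration above:

input {
  file {
    # Hypothetical path, for illustration only
    path => "/share-logs/example.log"
    type => example

    # /dev/null throws away the stored read position, so after a restart
    # Logstash treats the file as one it has never seen before.
    sincedb_path => "/dev/null"

    # start_position is an assumption on my part (it is not in my config above);
    # it only applies to files without an existing sincedb entry and makes
    # Logstash read them from the beginning instead of tailing from the end.
    start_position => "beginning"

    # Check the file for new content every second, as the other inputs do.
    stat_interval => 1
  }
}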

Right now, if I manually change a line in any of the log files, or if the application appends a line to any of the files mentioned, Logstash detects the change and sends it to Elasticsearch immediately.

The problem is that server.log (the Java application) is not being tailed like the others, even though its configuration is almost identical.
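For comparison, one visible difference is that the server.log input is the only file input above that does not set stat_interval. Here is a sketch of that block with the same one-second stat_interval the other inputs use; whether this omission is actually the cause is only a guess on my part:

file {
  path => "/share-logs/server.log"
  type => appjava
  #sincedb_path => "/share-logs/server.log.sincedb"
  sincedb_path => "/dev/null"
  # The only change from my original block: poll the file every second,
  # matching the other file inputs. This is an assumption, not a confirmed fix.
  stat_interval => 1
}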

What am I doing wrong here?