This article looks at how to handle entries being duplicated into Elasticsearch while a Logstash instance is running. It should be a useful reference for anyone hitting the same problem.

Problem Description

I have been trying to send logs from Logstash to Elasticsearch. Suppose I am running a Logstash instance, and while it is running I make a change to the file the instance is monitoring. Then all the logs that were previously saved in Elasticsearch are saved again, so duplicates are formed.

Also, when the Logstash instance is shut down and restarted, the logs get duplicated in Elasticsearch.

How do I counter this problem? How do I send only the newly added entries in the file from Logstash to Elasticsearch? My Logstash command is the following: bin/logstash -f logstash-complex.conf

And the configuration file is this:

input {
  file {
    path => "/home/amith/Desktop/logstash-1.4.2/accesslog1"
  }
}

filter {
  if [path] =~ "access" {
    mutate {
      replace => { "type" => "apache_access" }
    }
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    host  => "localhost"
    index => "feb9"
  }
  stdout { codec => rubydebug }
}

Recommended Answer

As you mentioned in your question, the logs get duplicated after a restart. So it is probably because the .sincedb file has been deleted; that is the file in which the file input records how far into each monitored file it has already read. Have a look at the sincedb_path option in the file input documentation. Try specifying sincedb_path and start_position explicitly. For example:

input {
  file {
    path           => "/home/amith/Desktop/logstash-1.4.2/accesslog1"
    start_position => "end"
    sincedb_path   => "/home/amith/Desktop/sincedb"
  }
}
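
With sincedb_path pointing at a persistent file, Logstash remembers its read offset across restarts and will not re-ship lines it has already processed. If duplicates can still slip through (for example, after the sincedb file itself is lost), a commonly suggested complement is to give every event a deterministic document ID, so a re-processed line overwrites the existing Elasticsearch document instead of creating a second copy. The following is only a sketch, assuming Logstash 1.5+ where the fingerprint filter and the [@metadata] field are available; the key value is an arbitrary placeholder:

filter {
  # Hash the raw line; identical lines always yield the same fingerprint
  fingerprint {
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "SHA1"
    key    => "dedup-key"    # any static string; SHA1 here is an HMAC
  }
}

output {
  elasticsearch {
    host        => "localhost"
    index       => "feb9"
    # Indexing with a fixed ID turns a duplicate insert into an overwrite
    document_id => "%{[@metadata][fingerprint]}"
  }
}

The trade-off is that two genuinely identical log lines collapse into a single document, so widen source to several fields if that distinction matters for your data.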

That concludes this article on entries being duplicated into Elasticsearch while a Logstash instance is running. We hope the recommended answer above is helpful.
