
Problem Description

I am using Elasticsearch with Logstash to visualize a data set in Kibana.

I created a config file as per the specifications, have Elasticsearch and Kibana up and running, and then loaded the config file. I get the message below when loading finishes. I know the data has not been loaded, because otherwise I would have seen the data sets in the prompt. I also searched for the index in the Kibana dashboard, but it does not show up.

Below is the output I get:

Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to D:/AMS/Softwares/logstash-7.4.2/logs which is now configured via log4j2.properties
[2019-11-25T14:05:08,406][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-11-25T14:05:08,429][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.4.2"}
[2019-11-25T14:05:10,623][INFO ][org.reflections.Reflections] Reflections took 42 ms to scan 1 urls, producing 20 keys and 40 values
[2019-11-25T14:05:11,458][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"Imports", id=>"7195e24081a8419104011573b541ec87e57c49b74d413cad474911f90ee68a82", hosts=>[//localhost], document_type=>"Imports20162017", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_9e38ae5e-1762-4e76-9a15-bf2e71e368a8", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2019-11-25T14:05:12,021][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-11-25T14:05:12,260][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-11-25T14:05:12,322][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2019-11-25T14:05:12,327][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-11-25T14:05:12,375][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2019-11-25T14:05:12,450][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2019-11-25T14:05:12,549][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-11-25T14:05:12,565][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2019-11-25T14:05:12,572][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x3f750db6 run>"}
[2019-11-25T14:05:15,423][ERROR][logstash.javapipeline    ][main] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<ArgumentError: File paths must be absolute, relative path specified: D:\\AMS\\Docs\\ELK Related\\Data Set\\import-and-export-by-india\\PC_Import_2016_2017.csv>, :backtrace=>["D:/AMS/Softwares/logstash-7.4.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.11/lib/logstash/inputs/file.rb:269:in `block in register'", "org/jruby/RubyArray.java:1800:in `each'", "D:/AMS/Softwares/logstash-7.4.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.1.11/lib/logstash/inputs/file.rb:267:in `register'", "D:/AMS/Softwares/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:195:in `block in register_plugins'", "org/jruby/RubyArray.java:1800:in `each'", "D:/AMS/Softwares/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:194:in `register_plugins'", "D:/AMS/Softwares/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:296:in `start_inputs'", "D:/AMS/Softwares/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:252:in `start_workers'", "D:/AMS/Softwares/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:149:in `run'", "D:/AMS/Softwares/logstash-7.4.2/logstash-core/lib/logstash/java_pipeline.rb:108:in `block in start'"], :thread=>"#<Thread:0x3f750db6 run>"}
[2019-11-25T14:05:15,452][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2019-11-25T14:05:15,848][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-11-25T14:05:20,768][INFO ][logstash.runner          ] Logstash shut down.

Below is the Logstash config file I am using:

input {
  file {
    path => "D:\\AMS\\Docs\\ELK Related\\Data Set\\import-and-export-by-india\\PC_Import_2016_2017.csv"
    start_position => "beginning"
    sincedb_path => "nul"
  }
}

filter {
  csv {
    separator => ","
    columns => [ "pc_code", "pc_description", "unit", "country_code", "country_name", "quantity", "value" ]
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "Imports"
    document_type => "Imports20162017"
  }
}
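For reference, the csv filter above splits each line on commas and assigns the listed column names to the resulting fields. A rough Python equivalent of that mapping (the sample row is invented for illustration, it is not from my data set):

```python
import csv
import io

# Column names taken from the csv filter in the config above.
columns = ["pc_code", "pc_description", "unit", "country_code",
           "country_name", "quantity", "value"]

# A made-up sample line in the same shape the config expects.
sample = "01,Live animals,KGS,IN,India,1000,50000\n"

# csv.reader splits on the separator; zip pairs values with column names,
# which is essentially what the Logstash csv filter does per event.
row = next(csv.reader(io.StringIO(sample)))
event = dict(zip(columns, row))
print(event["country_name"])  # India
```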

Am I doing something wrong?

Solved

The issue was the backslashes used in the file path. I used forward slashes instead and it worked fine.

Accepted Answer

The problem is this error from the log:

ArgumentError: File paths must be absolute, relative path specified: D:\\AMS\\Docs\\ELK Related\\Data Set\\import-and-export-by-india\\PC_Import_2016_2017.csv

The Logstash file input expects forward slashes in paths, even on Windows, so a backslash path is not recognized as absolute. You need to get rid of the backslashes in your path and use forward slashes instead, like this:

input {
  file {
    path => "D:/AMS/Docs/ELK Related/Data Set/import-and-export-by-india/PC_Import_2016_2017.csv"
    start_position => "beginning"
    sincedb_path => "nul"
  }
}
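If you generate config files programmatically, the conversion can be done in code as well. A minimal Python sketch (the helper name and example path are illustrative, not part of the original setup):

```python
from pathlib import PureWindowsPath

def to_logstash_path(win_path: str) -> str:
    # PureWindowsPath parses backslash paths without touching the filesystem;
    # as_posix() renders the same path with forward slashes, which is the
    # form the Logstash file input accepts.
    return PureWindowsPath(win_path).as_posix()

print(to_logstash_path(r"D:\AMS\Docs\file.csv"))  # D:/AMS/Docs/file.csv
```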
