Logstash starts but does not create indexes

I am trying to create an index in Elasticsearch from a CSV file. The configuration is below.

input {
  file {
    path => "C:\Users\soumdash\Desktop\Accounts.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}

filter {
  csv{
     separator => ","
     columns => ["Country_code","Account_number","User_ID","Date","Time"]
  }
  mutate {convert => ["Account_number","integer"]}
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "accounts"
  }
  stdout {}
}

I start Logstash and can see from the console that it is running and the pipeline has been created, but I do not see the corresponding index in Kibana.
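A quick way to rule out Kibana itself is to query Elasticsearch directly for the index. This is just a check, assuming Elasticsearch is reachable at localhost:9200 as in the config above:

curl "http://localhost:9200/_cat/indices/accounts?v"

If nothing is returned, the index was never created and the problem is on the Logstash side rather than in Kibana.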

C:\Users\soumdash\Desktop\logstash-7.2.0\bin>logstash -f logstash-account.conf
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to C:/Users/soumdash/Desktop/logstash-7.2.0/logs which is now configured via log4j2.properties
[2019-07-26T14:01:27,662][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-07-26T14:01:27,711][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.2.0"}
[2019-07-26T14:01:42,181][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"accounts", id=>"b54e1c07198cf188279cb051e01c9fe6118db48fe2ce76739dc2ace82e02c078", hosts=>[//localhost:9200], document_type=>"ERC_Acoounts", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_57f41853-7ddf-48e5-a5e4-316d94c83a0f", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2019-07-26T14:01:46,248][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-07-26T14:01:46,752][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-07-26T14:01:46,852][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-07-26T14:01:46,862][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-07-26T14:01:46,910][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-07-26T14:01:47,046][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-07-26T14:01:47,205][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-07-26T14:01:47,236][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2019-07-26T14:01:47,236][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x26c630b8 run>"}
[2019-07-26T14:01:52,105][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"main"}
[2019-07-26T14:01:52,232][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-07-26T14:01:52,249][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2019-07-26T14:01:53,290][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

I have checked and tried several other answers to the same problem, such as Logstash creates a pipeline but the index is not created and Logstash not creating an index in Elasticsearch, but without success.

Can anyone help? I am using ELK 7.2.


person Soumya Saraswata Dash    schedule 26.07.2019    source
comment
I also don't see any output from the stdout {} in the logs. Is your file being read at all?   -  person Nishant    schedule 26.07.2019
comment
How can I confirm that? Sorry, I'm new to ELK. I added rubydebug with a file output: output { elasticsearch { hosts => ["localhost:9200"] index => "accounts" } file { path => "C:\Users\soumdash\Desktop\temp" codec => rubydebug } } But still nothing.   -  person Soumya Saraswata Dash    schedule 26.07.2019


Answers (1)


Can you use rubydebug inside stdout, just to make sure your file is actually being read?
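A minimal sketch of what that suggestion would look like, assuming the rest of the pipeline stays unchanged:

output {
  # print every event to the console so we can see whether the file input emits anything
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "accounts"
  }
}

If nothing is printed to the console, the file input is not producing events at all, and the problem is before the output stage.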

person diabolique    schedule 26.07.2019
comment
I added it. I even tried writing the output to a file: output { elasticsearch { hosts => ["localhost:9200"] index => "accounts" } file { path => "C:\Users\soumdash\Desktop\temp" codec => rubydebug } } but nothing changed - person Soumya Saraswata Dash; 26.07.2019
comment
Can you remove the sincedb_path setting and try again? - person diabolique; 29.07.2019
comment
It's still the same after removing sincedb_path - person Soumya Saraswata Dash; 08.08.2019
comment
For the path in the input section you need to replace the backslash \ with a forward slash / - person diabolique; 08.08.2019
comment
And put sincedb_path => "NUL" back - person diabolique; 08.08.2019
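Putting the last two comments together, the input block would look roughly like this (a sketch, keeping the same file location and the Windows NUL sincedb):

input {
  file {
    # forward slashes instead of backslashes in the Windows path
    path => "C:/Users/soumdash/Desktop/Accounts.csv"
    start_position => "beginning"
    # NUL is the Windows equivalent of /dev/null, so no sincedb file is kept
    sincedb_path => "NUL"
  }
}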