
I am using Elasticsearch 7.1.1 and Logstash 7.1.1, and I am trying to upload a log file to Elasticsearch using a grok filter.

The pipeline starts, but no data is uploaded.

Here is my config file.

    input {
      file {
        path => "/home/i-exceed.com/pankaj.kumar/elk/logfiles/bankvisit-dialog-service/bankvisit-dialog-service.10-jun.log"
        start_position => "beginning"
        sincedb_path => "/dev/null"
      }
    }
    filter {
      grok {
        match => { "message" => "\[%{GREEDYDATA:logLabel}\] %{GREEDYDATA:date} %{GREEDYDATA:time} \[%{GREEDYDATA:threadId}\] \[%{GREEDYDATA:transactionId}\]%{GREEDYDATA:message}" }
      }
    }
    output {
      elasticsearch {
        hosts => "localhost"
        index => "bankvisit"
        document_type => "bankvisitlog"
      }
    }
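
A quick way to check whether events are reaching the pipeline at all is to temporarily replace the elasticsearch output with a stdout output using the rubydebug codec (a standard Logstash debugging step; this fragment is a sketch and not part of the original config):

    # Temporary debugging output: prints each parsed event to the console
    output {
      stdout { codec => rubydebug }
    }

If nothing is printed, the file input is not picking up the log file (for example because of sincedb state or file permissions); if events are printed, the problem lies in the output stage.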

Here is the console output; it keeps looping on these messages:

    [2019-06-14T14:08:02,767][DEBUG][org.logstash.config.ir.CompiledPipeline] Compiled output P[output-elasticsearch{"hosts"=>"localhost", "index"=>"bankvisit", "document_type"=>"bankvisitlog"}|[str]pipeline:21:5: elasticsearch { hosts => "localhost" index => "bankvisit" document_type => "bankvisitlog" } ] into org.logstash.config.ir.compiler.ComputeStepSyntaxElement@3a1579c8
    [2019-06-14T14:08:02,975][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
    [2019-06-14T14:08:02,979][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
    [2019-06-14T14:08:07,131][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
    [2019-06-14T14:08:08,004][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
    [2019-06-14T14:08:08,005][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}

1 Answer


For sincedb_path, use nul instead of /dev/null, as shown below (note the single 'l' in nul):

sincedb_path => "nul" 
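
For context, nul and /dev/null play the same role here: a sincedb path that discards the read-position state, so the file is re-read from the beginning on every run. Which one applies depends on the operating system Logstash runs on (a sketch of the two variants, not taken from the original answer):

    # On Windows, the null device is NUL (note the single "l"):
    sincedb_path => "nul"

    # On Linux/macOS, the null device is /dev/null:
    sincedb_path => "/dev/null"

Using the wrong one simply creates a regular file with that name, in which case Logstash persists the sincedb state and will not re-read a file it has already processed.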
