Stream data to amazon elasticsearch using logstash?

So I spun up a two-instance Amazon Elasticsearch cluster and installed the logstash-output-amazon_es plugin. This is my Logstash configuration file:
input {
  file {
    path => "/Users/user/Desktop/user/logs/*"
  }
}

filter {
  grok {
    match => {
      "message" => "%{COMMONAPACHELOG} %{QS}%{QS}"
    }
  }

  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    locale => en
  }

  useragent {
    source => "agent"
    target => "useragent"
  }
}

output {
  amazon_es {
    hosts => ["foo.us-east-1.es.amazonaws.com"]
    region => "us-east-1"
    index => "apache_elk_example"
    template => "./apache_template.json"
    template_name => "apache_elk_example"
    template_overwrite => true
  }
}

Now I am running this from my terminal:
/usr/local/opt/logstash/bin/logstash -f apache_logstash.conf

I get the error:
Failed to install template: undefined method `credentials' for nil:NilClass {:level=>:error}

I think I have got something completely wrong. Basically I just want to feed some dummy log input to my Amazon Elasticsearch cluster through Logstash. How should I proceed?

Edit: The storage type is Instance and the access policy is set to be accessible to all.
Edit:

output {
  elasticsearch {
    hosts => ["foo.us-east-1.es.amazonaws.com"]
    ssl => true
    index => "apache_elk_example"
    template => "./apache_template.json"
    template_name => "apache_elk_example"
    template_overwrite => true
  }
}

Solutions/Answers:

Solution 1:

You need to provide the following two parameters:

  • aws_access_key_id and
  • aws_secret_access_key

Even though they are described as optional parameters, a comment in the plugin's code makes it clear that they are currently required:

aws_access_key_id and aws_secret_access_key are currently needed for this plugin to work right. Subsequent versions will have the credential resolution logic as follows:
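For illustration, the output block from the question might look like this with static credentials added (the key values below are placeholders, not real credentials):

```
output {
  amazon_es {
    hosts => ["foo.us-east-1.es.amazonaws.com"]
    region => "us-east-1"
    # Placeholder credentials -- substitute an IAM user's actual keys
    aws_access_key_id => "YOUR_ACCESS_KEY_ID"
    aws_secret_access_key => "YOUR_SECRET_ACCESS_KEY"
    index => "apache_elk_example"
    template => "./apache_template.json"
    template_name => "apache_elk_example"
    template_overwrite => true
  }
}
```

The IAM user whose keys are supplied needs permission to perform es:ESHttp* actions against the domain.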

Solution 2:

I also faced the same problem, and I solved it by specifying the port after the hostname.
This occurs because hosts => ["foo.us-east-1.es.amazonaws.com"] points to foo.us-east-1.es.amazonaws.com:9200, which is not the port AWS Elasticsearch listens on. Changing the hostname to foo.us-east-1.es.amazonaws.com:80 solves the problem.

Solution 3:

I was able to run Logstash with AWS Elasticsearch without access keys by configuring the policy in the ES service instead.

It worked without the keys when starting Logstash manually, but when Logstash runs as a service the plugin doesn't work:

https://github.com/awslabs/logstash-output-amazon_es/issues/34
