Issue in dumping Data in AWS Elasticsearch using Logstash

Irtiza
2 min read · Jul 26, 2020
https://www.zenitk.com/import-from-csv-to-elasticsearch-with-logstash

Overview

This story is not about dumping CSV data into AWS Elasticsearch using Logstash, because there are already a lot of good articles on that; I personally like the one linked above.

Instead, it is about an issue I faced while doing the above operation.

Issue

I deployed an Elasticsearch instance on AWS that was publicly accessible. I was able to perform requests using cURL. But when I tried to dump the data using Logstash with this configuration:
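The original configuration screenshot is not available here, so below is a minimal sketch of what such a CSV-to-Elasticsearch Logstash pipeline typically looks like. The file path, column names, index name, and endpoint are all placeholders, not the author's actual values:

```conf
# Hypothetical sketch — paths, columns, index, and host are placeholders.
input {
  file {
    path => "/path/to/data.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # re-read the file on every run
  }
}

filter {
  csv {
    separator => ","
    columns => ["field1", "field2"]
  }
}

output {
  elasticsearch {
    hosts => ["https://<elasticsearch-url>:443"]
    index => "my-index"
  }
}
```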

I was getting this error:

elasticsearch - Failed to install template. {:message=>"Got response code '401' contacting Elasticsearch at URL '<elasticsearch-url>:443/_xpack'"

Looking at the URL, I noticed that Logstash was appending the /_xpack subpath.

After doing a little bit of research about X-Pack, I found out that:

X-Pack is an Elastic Stack extension that provides security, alerting, monitoring, reporting, machine learning, and many other capabilities.

Although I hadn't specified this path anywhere in the URL, Logstash was somehow appending /_xpack to it.

Solution

After doing a bit of research, I found out that the AWS Elasticsearch service installs X-Pack, and that ILM (Index Lifecycle Management) is part of the X-Pack feature set.

The data I was dumping into Elasticsearch was not confidential, so I didn't need this feature. I disabled ILM by setting this parameter in the elasticsearch output block:

ilm_enabled => false

The final Logstash configuration file looks like this:
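The final configuration screenshot is also unavailable, but based on the fix described above, the change amounts to adding `ilm_enabled => false` to the elasticsearch output. A hedged sketch (placeholders as before):

```conf
# Hypothetical sketch — only the elasticsearch output changes.
output {
  elasticsearch {
    hosts => ["https://<elasticsearch-url>:443"]
    index => "my-index"
    ilm_enabled => false   # skip the X-Pack/ILM check that caused the 401
  }
}
```

With ILM disabled, Logstash no longer contacts the /_xpack endpoint on startup, so the 401 response disappears.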

After this change, everything worked!

Final Thoughts

I hope you liked this story. Please give feedback about anything that can be improved or that I have missed. Thank you :)
