Export Data from Elasticsearch to CSV
A few days ago I experienced a truly horrible situation. Someone from our DevOps team accidentally deleted my primary Postgres DB table. Boommm!!

Fortunately, my secondary DB was Elasticsearch, and all the data was populated there. That saved us from 7–10 hours of system downtime, because this is one of our core services and nothing works without this data. Now I needed to repopulate my Postgres DB from the Elasticsearch data to get the two back in sync. My team lead asked me to scrape all the data from Elasticsearch and store it in a CSV file. At first I planned to write a Python script that would query all the data and write it to a CSV file.
But that is not an effective solution, and Elasticsearch has a limitation: a normal search query can only return 10,000 results at a time (the default index.max_result_window).
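To make that limitation concrete, here is a minimal sketch of the naive approach, assuming the official elasticsearch Python client; the index name, field handling, and output path are hypothetical placeholders:

```python
# Naive export sketch — hypothetical index name and output path.
import csv

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# from + size cannot exceed 10,000 by default (index.max_result_window),
# so a plain search like this cannot page through a larger index.
resp = es.search(index="my-index", query={"match_all": {}}, size=10_000)

hits = resp["hits"]["hits"]
if hits:
    # Use the first document's fields as the CSV header.
    fieldnames = list(hits[0]["_source"].keys())
    with open("export.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames, extrasaction="ignore")
        writer.writeheader()
        for hit in hits:
            writer.writerow(hit["_source"])
```

This works for small indices, but anything beyond 10,000 documents needs scrolling or search_after, which is exactly the bookkeeping I wanted to avoid.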

Finally, I decided to export the data using Logstash. What the hell is Logstash?
“Logstash is an open source, server-side data processing pipeline that ingests data from a multitude of sources simultaneously, transforms it, and then sends it to your favorite “stash.” (Ours is Elasticsearch, naturally.)”
Cool, right? To use Logstash I needed to install two Logstash plugins alongside Logstash itself. To install Logstash follow this…
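For reference, the two plugins an Elasticsearch-to-CSV pipeline typically needs are logstash-input-elasticsearch and logstash-output-csv — my assumption of the pair meant here. Both can be installed with Logstash's bundled plugin manager:

```
bin/logstash-plugin install logstash-input-elasticsearch
bin/logstash-plugin install logstash-output-csv
```

With the plugins in place, a minimal pipeline config might look like this sketch — the host, index name, field list, and output path are all placeholders to adapt:

```
# es-to-csv.conf — hypothetical host, index, fields, and output path.
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my-index"
    # Match everything; the input scrolls through the index, so it is
    # not capped at the 10,000-hit search window.
    query => '{ "query": { "match_all": {} } }'
  }
}
output {
  csv {
    # Document fields to write, in column order.
    fields => ["id", "name", "created_at"]
    path => "/tmp/es-export.csv"
  }
}
```

Running `bin/logstash -f es-to-csv.conf` then streams every matching document into the CSV file.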