
Export Data from ElasticSearch to CSV

Shaon Shaonty
3 min read · Dec 27, 2018

A few days ago I experienced a horrible situation: someone from our DevOps team deleted my primary Postgres DB table by mistake. Boom!


Fortunately, my secondary DB was Elasticsearch, and all the data was populated there. That saved us from 7–10 hours of downtime, because this is one of our core services and nothing works without that data. Now I needed to repopulate my Postgres DB from the Elasticsearch data to bring them back in sync. My team lead asked me to scrape all the data out of Elasticsearch and store it in a CSV file. At first I planned to write a Python script that queries all the data and writes it to a CSV file.
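That naive script might have looked roughly like the sketch below (the host, index name, and column names are placeholders, not from the original story). The search call is shown as a comment, since it needs a live cluster; the CSV-flattening part runs on its own:

```python
import csv
import io

# A plain search like this is what I had in mind first. It needs a running
# Elasticsearch instance (host and index name here are hypothetical):
#
#   from elasticsearch import Elasticsearch
#   es = Elasticsearch("http://localhost:9200")
#   resp = es.search(index="my-index",
#                    body={"query": {"match_all": {}}},
#                    size=10000)
#   hits = resp["hits"]["hits"]

def hits_to_csv(hits, fieldnames):
    """Flatten the _source of each Elasticsearch hit into CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, extrasaction="ignore")
    writer.writeheader()
    for hit in hits:
        writer.writerow(hit.get("_source", {}))
    return buf.getvalue()

# Hand-made hits standing in for a real search response:
sample_hits = [
    {"_source": {"id": 1, "name": "alice"}},
    {"_source": {"id": 2, "name": "bob"}},
]
print(hits_to_csv(sample_hits, ["id", "name"]))
```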

But that is not an effective solution, and Elasticsearch has a limitation anyway: a plain search request is capped at 10,000 hits by default (the `index.max_result_window` setting), so you cannot retrieve 10k-plus documents in one query.


Finally, I decided to export the data using Logstash. What the hell is Logstash?

“Logstash is an open source, server-side data processing pipeline that ingests data from a multitude of sources simultaneously, transforms it, and then sends it to your favorite “stash.” (Ours is Elasticsearch, naturally.)”

Cool, right? To use Logstash I needed to install two Logstash plugins alongside Logstash itself. To install Logstash, follow this
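The two plugins in question are, in the standard recipe for this job, the `elasticsearch` input plugin and the `csv` output plugin (an assumption on my part, since they are the usual pairing for an ES-to-CSV export). A minimal pipeline config along those lines might look like this; the host, index name, fields, and output path are all placeholders:

```
input {
  elasticsearch {
    hosts => ["localhost:9200"]                     # hypothetical host
    index => "my-index"                             # hypothetical index
    query => '{ "query": { "match_all": {} } }'     # export everything
  }
}

output {
  csv {
    fields => ["id", "name", "created_at"]          # hypothetical columns
    path   => "/tmp/export.csv"
  }
}
```

The `csv` output plugin does not ship with Logstash by default, so it would typically be installed with `bin/logstash-plugin install logstash-output-csv` before running the pipeline.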



Written by Shaon Shaonty

Data Scientist | Software Engineer | Computer Science Graduate @TU Dresden
