
Redship-ES – Redis-Elasticsearch Shipper

Ever found yourself in a situation where you needed to store data in Redis over a long period, in an environment where key collisions could happen?

Recently I’ve been dealing a lot with Redis, storing large arrays of metadata over short periods for analysis purposes.

However, Redis is of course not the right tool for keeping keys around for a longer time, say 30 days, because in my scenario keys can overlap: each key is a fragment of a uint32 representing milliseconds, “calculated” by modulo, so the same key value eventually comes around again.
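To make the collision problem concrete, here is a minimal sketch. The modulus and key format are assumptions for illustration only; the post doesn’t specify the actual fragment size or prefix:

    import time

    # Hypothetical modulus: keys repeat once the retention window
    # exceeds MODULUS milliseconds, so long retention guarantees collisions.
    MODULUS = 2**32  # assumption; the real fragment size isn't given in the post

    def make_key(timestamp_ms: int) -> str:
        """Derive a Redis key from a millisecond timestamp by modulo."""
        return f"meta:{timestamp_ms % MODULUS}"

    now_ms = int(time.time() * 1000)
    later_ms = now_ms + MODULUS  # one full wrap-around later
    assert make_key(now_ms) == make_key(later_ms)  # same key, different data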

The cool kid on the block

I love Elasticsearch. It doesn’t matter whether it’s used for logging or application data. Elasticsearch is simple, scales well and doesn’t bring the horrible performance and scaling downsides of typical SQL databases like MySQL or Postgres. Have you ever run multi-master MySQL replication? You really don’t want to.

Redship?

To ship Redis keys, where every key holds a JSON string, to Elasticsearch, I wrote a Python application. It constantly pulls keys from Redis, parses the JSON, adds some fields (such as GeoIP details), rebuilds the JSON string, pushes it to ES and deletes the keys afterwards. Simple, isn’t it?
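A minimal sketch of that loop is below. The key pattern, index name, GeoLite2 database path and document field names are purely illustrative assumptions; the actual Redship-ES code may look different:

    import json

    import geoip2.database
    import redis
    from elasticsearch import Elasticsearch

    r = redis.Redis(host="localhost", port=6379)
    es = Elasticsearch("http://localhost:9200")
    geo = geoip2.database.Reader("GeoLite2-City.mmdb")  # assumed GeoIP source

    def ship_once(pattern: str = "meta:*", index: str = "redship") -> None:
        """Pull matching keys, enrich the JSON, index into ES, then delete."""
        for key in r.scan_iter(match=pattern):
            raw = r.get(key)
            if raw is None:
                continue  # key expired or was deleted in the meantime
            doc = json.loads(raw)
            # Enrich with GeoIP details; the field names here are assumptions.
            ip = doc.get("client_ip")
            if ip:
                city = geo.city(ip)
                doc["geo"] = {"country": city.country.iso_code, "city": city.city.name}
            es.index(index=index, document=doc)
            r.delete(key)  # only remove the key once it has been shipped

Running ship_once() in a loop (or on a timer) gives the “constantly pulls” behaviour described above.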

Afterwards, the data can be queried with, for example, the official ES PHP library.
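The post mentions the PHP client; for consistency with the shipper sketch above, here is an equivalent query sketch using the official Python client instead, assuming the hypothetical index and field names from the previous example:

    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")

    # Fetch shipped documents from a given country; "redship" and
    # "geo.country" are the assumed names from the shipper sketch.
    resp = es.search(
        index="redship",
        query={"term": {"geo.country": "DE"}},
        size=10,
    )
    for hit in resp["hits"]["hits"]:
        print(hit["_source"])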
