And the Best Halloween Costume goes to ...

Halloween seems to sneak up on at least some of us every year. With Meltwater offices in 55 locations across the world, keeping track of the regional customs for a given occasion can be tricky (see the Geography of Halloween, for example). We are therefore fortunate to have local culture ambassadors who keep their national customs alive and show us how it is done.

So without further ado, this year the prize for the best Halloween costume goes to Agile Coach Jeff from our Raleigh office. Make sure you read the tasks on the Kanban board to get the full story! We love it, Jeff!

The Full Stack Fest Experience 2019

In September I traveled to the small town of Sitges, just outside Barcelona, to learn more about the future of technology. Full Stack Fest is a yearly, single-track conference covering a broad range of tech topics (frontend, backend, testing, new languages, etc.). I was there to absorb some of the knowledge and inspiration from the curated list of speakers. In this post I will go over my favorite talks.

Load-driven Shard Distribution in Elasticsearch - Story of an Internship

Since July 2019 I have been an intern at Meltwater in Budapest, working in the Foundation team, which focuses on developer productivity. It has been a truly valuable experience to solve challenging real-life problems that have an impact on the everyday lives of our developers.

In this blog post, I will share my experience as an intern at Meltwater and discuss the details of the project I have been working on.

Zalando & Meltwater Kick Off Knowledge Sharing Around Managing Internal Software Delivery Platforms

In late August 2019, Meltwater had the pleasure of hosting Zalando in our Berlin office for a knowledge-sharing session about applying product management to internal platforms that improve software delivery performance.

In this post we will explain how this meeting came to be, give a sneak peek into the topics we covered, and look at what upcoming iterations of this exchange could look like.

Enriching 450M Docs Daily With a Boring Stream Processor

For our fairhair.ai platform we enrich over 450 million documents per day, such as news articles and social posts, using a dependency tree of more than 20 NLP syntactic and semantic enrichment tasks. We ingest these documents as a continuous stream of data and guarantee delivery of enriched documents within 5 minutes of ingestion.

This technical feat required tight collaboration between two specialised teams: data science and platform engineering. Enabling both teams to work together efficiently around a common workflow execution engine was another problem we needed to solve. Hopefully that description has fully piqued your interest, because our solution (Benthos) is totally boring.
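To make the shape of such a pipeline a little more concrete before you read the full post, here is a minimal, hypothetical sketch in Python of enriching a document stream with a small dependency tree of tasks run in topological order. The task names and graph are invented for illustration only; the actual platform uses Benthos as the workflow execution engine, not this code.

```python
# Illustrative sketch: a toy dependency tree of enrichment tasks applied to a
# stream of documents. Task names and logic are hypothetical, not taken from
# the post; the real system runs its workflow with Benthos.
from graphlib import TopologicalSorter

# Hypothetical enrichment steps: each takes a document dict and returns it enriched.
def detect_language(doc):
    doc["language"] = "en"  # placeholder enrichment
    return doc

def tokenize(doc):
    doc["tokens"] = doc["text"].split()
    return doc

def extract_entities(doc):
    doc["entities"] = [t for t in doc["tokens"] if t.istitle()]
    return doc

TASKS = {
    "detect_language": detect_language,
    "tokenize": tokenize,
    "extract_entities": extract_entities,
}

# Dependency tree: each task lists the tasks whose output it needs first.
DEPENDENCIES = {
    "detect_language": set(),
    "tokenize": {"detect_language"},
    "extract_entities": {"tokenize"},
}

def enrich(doc):
    """Run every enrichment task in an order that respects the dependency tree."""
    for task_name in TopologicalSorter(DEPENDENCIES).static_order():
        doc = TASKS[task_name](doc)
    return doc

def process_stream(stream):
    """Consume a continuous stream of documents and yield enriched documents."""
    for doc in stream:
        yield enrich(doc)

if __name__ == "__main__":
    incoming = [{"text": "Meltwater enriches News Articles"}]
    for enriched in process_stream(incoming):
        print(enriched)
```

In the real system the interesting work is exactly what this sketch hides: keeping 20+ tasks, two teams, and a 5-minute delivery guarantee manageable, which is where a boring, declarative stream processor earns its keep.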