To provide structured data from the Web to our customers, our team maintains a web crawler. The system downloads and processes 600K URLs per hour, generating a huge volume of logs. Storing and processing those logs costs a lot!
In this blog post, Róbert from our Crawler team explains how we modified our architecture to save 40k USD/year on logging.