Apache Storm is one of the Big Data tools designed for real-time processing of data. This post will explain Storm through a simplified analogy to the human digestive system.
At Betfair our read services are hit with billions of requests per day, and they are not evenly distributed. These requests arrive in huge spikes of traffic during key sporting events, putting our customer-facing services under pressure for sustained periods throughout the day. We build our systems to cater for this demand, keeping true to our latency SLAs while operating without downtime. Unlike comparable trading platforms used in the financial world, we don’t have the option of closing trading at 5pm – sporting events occur around the clock, every day.
When we talk about read services, we are referring to anything that is presented to customers in real time – either through the API or via our online channels. Most notable are our price read services, which were the first to move to the streaming model. If you are not familiar with financial trading, price read services present ‘ticks’ on a market to our customers – billions of them. Ticks are price/volume pairings for a given selection on a market. See below.
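To make the idea concrete, a tick can be modelled as a small value object pairing a price with the volume available at that price for a selection on a market. This is an illustrative sketch only – the field names and types below are assumptions, not Betfair's actual data model:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Tick:
    """A price/volume pairing for a selection on a market (illustrative model)."""
    market_id: str     # hypothetical market identifier
    selection_id: str  # hypothetical selection identifier
    price: float       # decimal odds, e.g. 2.5
    volume: float      # amount available to trade at that price

# Example: 120.0 available at odds of 2.5 on selection "1" in market "1.234"
tick = Tick(market_id="1.234", selection_id="1", price=2.5, volume=120.0)
print(tick.price, tick.volume)
```

A streaming price service would emit a continuous sequence of such pairings per market, which is what makes the request volumes described above so large.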
Last Thursday (14th May) saw the inaugural “Meet Betfair” event take place at Betfair Towers, London. The evening featured a series of lightning talks from developers working on core Exchange technologies.
Topics ranged from a history of our architectural evolution to how continuous delivery pipelines are helping cut manual processes out of our production deployments. There was even time for a quick maths lesson on how probability is used to calculate odds and “cross match” bets.