The client is a leading stock broking company providing services such as stock broking, financial products distribution, wealth management, and investment banking. They serve customers ranging from retail and institutional investors to corporates and High Net-worth Individuals (HNIs).
Stock trading and algo trading require reliable historical market data for technical analysis. The volume of live market data received in the Indian capital market is around 10 GB per day across instruments, including equity, derivatives, commodity, and currency.
Our client was looking for a technology partner who could conceptualize and build a highly scalable and robust architecture in a cost-effective manner. The solution should allow their platform (web and mobile app) as well as API and SDK users (retail algo traders and external partners) to consume OHLC candle data exposed via both API and WebSocket.
Oneture put in place a team of engineers, including a Solution Architect and backend developers, to evaluate the limitations of the existing system and re-architect it into a cloud-based, scalable platform that caters to the client's needs.
The client's existing system had the following limitations:
The new architecture addressed each of the points listed above, making the system capable of providing OHLC candle data aggregated in real time across all channels, for intervals including 1 second, 1 minute, 5 minutes, 30 minutes, and 1 day, via both API and WebSocket.
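The core of such real-time candle generation is folding a stream of ticks into per-interval OHLCV records. A minimal sketch of that logic is below; the tick tuple layout `(timestamp, price, quantity)` and function names are illustrative assumptions, not the client's actual implementation.

```python
def bucket_start(ts, interval_sec):
    """Align a tick timestamp (in seconds) to the start of its candle interval."""
    return ts - (ts % interval_sec)

def aggregate_ticks(ticks, interval_sec=60):
    """Fold (timestamp, price, quantity) ticks into OHLCV candles per interval."""
    candles = {}
    for ts, price, qty in ticks:
        start = bucket_start(ts, interval_sec)
        c = candles.get(start)
        if c is None:
            # First tick of the interval opens the candle.
            candles[start] = {"open": price, "high": price, "low": price,
                              "close": price, "volume": qty}
        else:
            # Later ticks update high/low, overwrite close, accumulate volume.
            c["high"] = max(c["high"], price)
            c["low"] = min(c["low"], price)
            c["close"] = price
            c["volume"] += qty
    return candles

ticks = [(0, 100.0, 10), (30, 101.5, 5), (59, 99.0, 20), (60, 99.5, 7)]
candles = aggregate_ticks(ticks, interval_sec=60)
# candles[0]  -> open 100.0, high 101.5, low 99.0, close 99.0, volume 35
# candles[60] -> a new 1-minute candle opened at 99.5
```

The same fold runs once per configured interval (1 s, 1 min, 5 min, ...), each with its own `interval_sec`.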
The new architecture proposed by us (inspired by our data reference solution) consists of a Kafka-based system that listens to the market feeds of the various exchanges received via UDP ports, aggregates the feeds per interval using OHLCV logic, and pushes the aggregated OHLCV candle data to AWS ElastiCache (Redis) for in-memory caching. The cached data is migrated to AWS RDS at the end of day. The API queries Redis or RDS depending on the period for which data is requested: intraday data is fetched from Redis, while past-dated historical data is queried from RDS.
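The Redis-versus-RDS routing described above boils down to a date comparison in the API layer. The sketch below, assuming hypothetical names (`candle_source` and its return labels are not from the source), shows the decision rule in isolation:

```python
from datetime import date

def candle_source(query_date: date, today: date) -> str:
    """Route a candle request: intraday goes to the Redis cache,
    past-dated history goes to Postgres on RDS."""
    return "redis" if query_date >= today else "rds"

# e.g. with today = 2024-06-10 (illustrative dates):
candle_source(date(2024, 6, 10), date(2024, 6, 10))  # -> "redis" (intraday)
candle_source(date(2024, 6, 7), date(2024, 6, 10))   # -> "rds" (historical)
```

Keeping this rule in one pure function lets both the REST API and the WebSocket handler share the same routing logic.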
Key components of the application:
| Category | Components |
|---|---|
| Development Technologies | Python, Kafka, Redis, PostgreSQL, Flask |
| AWS Products and Services | AWS EC2, AWS MSK, AWS RDS for PostgreSQL, AWS ElastiCache, AWS Elastic Load Balancer, AWS VPC |