The financial crisis exposed several pitfalls in the overall framework for capitalising trading activities, and regulators across the world have come up with stringent rules on both the quality and quantity of capital that banks need to set aside to cover adverse scenarios.
There has been a lot of talk about big data technologies such as Hadoop, Spark, and Kafka over the past couple of years in the financial services industry, especially in Risk, Finance and Compliance departments. With the latest regulations and the scarcity of capital, it is clear that a streaming-based solution which can compute the impact of a trade on capital before the trade is made (a what-if calculation) would be a great asset.
Let us look at near-term regulations such as FRTB (Fundamental Review of the Trading Book) and CCAR (Comprehensive Capital Analysis and Review) and see how some of the latest streaming technologies can help achieve compliance with them.
Some of the high-level changes to be implemented as part of FRTB:
1. A revised trading book/banking book boundary.
2. Move from Value-at-Risk (VaR) at the 99% confidence level to Expected Shortfall (ES) at the 97.5% confidence level.
3. Stressed VaR is eliminated; only the stressed metric (Stressed Expected Shortfall, or SES) will be used.
4. Internal models to be replaced by standard models for securitised products.
5. The Incremental Risk Charge (IRC) is replaced by Incremental Default Risk (IDR), and the Comprehensive Risk Measure (CRM) disappears completely.
6. A comprehensive incorporation of the risk of market illiquidity
7. Model approval and testing to be performed at the trading desk level.
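Change 2 above is worth a concrete illustration. VaR at 99% answers "what loss is exceeded on only 1% of days", while ES at 97.5% averages the losses in the worst 2.5% of days, so it also captures how bad the tail is. A minimal sketch, using a simulated one-year daily P&L series (all figures are made up for illustration):

```python
import random

def value_at_risk(pnl, level):
    """VaR at the given confidence level: the loss threshold exceeded
    on only (1 - level) of days. Losses are negative P&L."""
    losses = sorted(-p for p in pnl)            # losses, ascending
    idx = int(level * len(losses))              # quantile index
    return losses[min(idx, len(losses) - 1)]

def expected_shortfall(pnl, level):
    """ES at the given confidence level: the average loss in the tail
    beyond the quantile, so it reflects tail severity, not just a cutoff."""
    losses = sorted(-p for p in pnl)
    idx = int(level * len(losses))
    tail = losses[idx:]
    return sum(tail) / len(tail)

random.seed(42)
pnl = [random.gauss(0, 1_000_000) for _ in range(250)]  # one trading year

print(f"VaR 99%:  {value_at_risk(pnl, 0.99):,.0f}")
print(f"ES 97.5%: {expected_shortfall(pnl, 0.975):,.0f}")
```

For a normal distribution the two numbers come out close, which is exactly why the Basel Committee chose 97.5% ES as the replacement for 99% VaR: similar capital levels in the benign case, but better sensitivity to fat tails.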
A high-level overview of CCAR (Comprehensive Capital Analysis and Review):
As part of CCAR, bank holding companies (BHCs) in the US with consolidated assets of $50 billion or more are required to submit annual capital plans to the Federal Reserve for review. A BHC's capital plan must include detailed descriptions of its internal processes for assessing capital adequacy, the policies governing capital actions, and its planned capital actions over a nine-quarter planning horizon. A substantial amount of data goes on behind the scenes to support these capital plan submission reports, which need to be filed monthly, quarterly and annually (FR Y-14M, FR Y-14Q and FR Y-14A respectively), covering a wide range of stress scenarios.
It is evident from both regulations that there is a real need for large-scale data storage and near-real-time data processing to cope with ever-growing regulatory demands. Gone are the days when risk and capital impact calculations could be run as overnight batch jobs; stream/real-time processing is becoming mainstream.
Some key features required for stream processing in financial risk calculations:
· Exactly-once processing semantics (avoiding both data loss and duplication)
· Event-time windowing (using the source event timestamp rather than arrival time)
· High Performance & Low Latency
· Back pressure support
· Fault tolerance and scalability
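The first two features above can be sketched without any framework at all. The toy aggregator below assigns trades to tumbling windows by their source (event) timestamp, so late arrivals still land in the correct window, and drops duplicate trade ids so a replayed message from the queue is not double counted. The 60-second window size and the trade tuples are hypothetical:

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # hypothetical tumbling-window size

def window_start(event_ts):
    """Assign an event to a tumbling window based on its *source*
    timestamp, not the time it arrived at the processor."""
    return event_ts - (event_ts % WINDOW_SECONDS)

def aggregate(events):
    """Sum notionals per window, skipping duplicate trade ids so a
    redelivered message is not double counted (exactly-once effect)."""
    seen = set()
    totals = defaultdict(float)
    for trade_id, event_ts, notional in events:
        if trade_id in seen:       # duplicate delivery from the queue
            continue
        seen.add(trade_id)
        totals[window_start(event_ts)] += notional
    return dict(totals)

events = [
    ("T1", 10, 1_000_000.0),
    ("T2", 70, 2_000_000.0),
    ("T1", 10, 1_000_000.0),   # duplicate delivery, ignored
    ("T3", 30, 500_000.0),     # arrives late but belongs to window 0
]
print(aggregate(events))  # {0: 1500000.0, 60: 2000000.0}
```

In a real deployment the deduplication state and window state would have to be checkpointed for fault tolerance, which is precisely what Spark and Flink provide out of the box.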
The latest open-source stream processing projects such as Apache Spark Streaming and Apache Flink, backed by a messaging layer like Kafka, seem to tick all the boxes. The primary difference is that Spark is a batch-based processing engine which supports streaming through the notion of micro-batches, whilst Flink is a pure stream processing engine that is also gaining a lot of momentum in the open-source community.
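The micro-batch versus pure-streaming distinction can be made concrete with two toy generators: the first groups the input into small batches and runs one computation per batch (Spark-style), while the second updates state and emits a result on every single event (Flink-style). The batch size and exposure values are illustrative only:

```python
import itertools

def micro_batch(stream, batch_size):
    """Spark-style: slice the stream into micro-batches and run a
    batch computation (here, a sum) over each one."""
    it = iter(stream)
    while True:
        batch = list(itertools.islice(it, batch_size))
        if not batch:
            return
        yield sum(batch)            # one result per micro-batch

def per_event(stream):
    """Flink-style: update running state and emit on every event."""
    running = 0.0
    for value in stream:
        running += value
        yield running               # one result per event

exposures = [1.0, 2.0, 3.0, 4.0]
print(list(micro_batch(exposures, 2)))  # [3.0, 7.0]
print(list(per_event(exposures)))       # [1.0, 3.0, 6.0, 10.0]
```

The trade-off this illustrates: micro-batching amortises per-record overhead at the cost of latency (no result until the batch closes), whereas per-event processing gives the lowest latency per update.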
Data sources such as front office trading systems and market data providers can feed into big data messaging queues, to be picked up by Spark Streaming / Apache Flink applications which perform the calculations and write the results to a long-term store like HDFS. With easy horizontal scaling, running additional scenarios or calculations is just a matter of adding a few more commodity servers.
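Putting the pieces together, a pre-trade what-if check could look like the sketch below: consume a candidate trade from the queue, recompute the portfolio's expected shortfall with and without it, and report the capital impact. Everything here is a stand-in for illustration, and with only four scenarios the 97.5% ES degenerates to the single worst loss; in practice the queue would be a Kafka topic, the consumer a Spark/Flink job, and the sink HDFS:

```python
import queue

def portfolio_es(pnl_vectors, level=0.975):
    """Expected shortfall of the summed per-scenario P&L vectors."""
    total = [sum(col) for col in zip(*pnl_vectors)]   # net P&L per scenario
    losses = sorted(-p for p in total)
    tail = losses[int(level * len(losses)):]
    return sum(tail) / len(tail)

def what_if(portfolio, candidate, level=0.975):
    """Pre-trade check: ES impact of adding `candidate` to the book."""
    return portfolio_es(portfolio + [candidate], level) - portfolio_es(portfolio, level)

# In-memory queue standing in for the messaging layer.
trades = queue.Queue()
trades.put(("T42", [-10.0, 5.0, -3.0, 8.0]))  # hypothetical scenario P&L vector

portfolio = [[-20.0, 10.0, -5.0, 15.0],   # existing positions, one P&L
             [5.0, -8.0, 2.0, -1.0]]      # vector per position

while not trades.empty():
    trade_id, scenario_pnl = trades.get()
    print(trade_id, round(what_if(portfolio, scenario_pnl), 2))
```

Because each what-if evaluation is independent, this is exactly the kind of workload that scales horizontally: more scenarios or more candidate trades simply means more partitions and more worker nodes.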