The ever-changing nature of financial regulations is putting increased pressure on IT resources, making it critical that business stakeholders have control over their data and reporting processes.
A perfect example of this is the Comprehensive Capital Analysis and Review (CCAR), which mandates that large U.S. banks undergo a series of stress tests to prove they have sufficient capital in the event of a severe economic downturn. CCAR is a data-intensive regulation, requiring banks to submit numerous reports, each featuring hundreds of data elements that must be accurate, complete, and properly formatted.
What makes CCAR even more challenging for banks is that the U.S. Federal Reserve regularly changes the rules for the stress tests, often a day or two before the reporting period begins. In addition, some CCAR reporting cycles have moved from quarterly to monthly, putting further pressure on banks to react and respond quickly.
I have recently worked on CCAR projects that underscore the challenges business stakeholders face when trying to keep pace with this regulation. In one case, after reaching out to IT with information on a new data domain that required reporting, the business owner was informed that it would be at least eight to twelve months before IT could scope the project and have the data ready for reporting. This was unacceptable for the business, as its reporting deadlines were far more imminent.
During our engagement, Trillium was able to demonstrate to this institution how our CCAR solution could eliminate this problem, empowering business owners to accelerate compliance while minimizing risk and resources. By leveraging our "business rule" platform and in-depth knowledge of CCAR edit checks, Trillium was able to accurately scope data requirements, efficiently generate CCAR submission compliance reports, and provide business stakeholders with dashboards that gave ongoing insight into data defects and key compliance indicators.
The solution is designed to provide "self-service" capabilities to the business, enabling users to efficiently access new data sources and modify data edit checks as the Fed releases updates. This approach allows business owners to minimize their reliance on IT, cutting turnaround time, improving efficiency, and giving them confidence that they can attest to the accuracy of their data.
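To make the "edit check" concept concrete, here is a minimal, purely illustrative sketch (not Trillium's actual platform, and the field names and thresholds are hypothetical): validation rules are kept as data rather than hard-coded, so a business user can add or adjust a rule when the Fed revises a report, without changing the engine itself.

```python
# Illustrative sketch only -- shows the general idea of declarative "edit
# checks": validation rules stored as data (here, simple dicts) so they can
# be updated without modifying the engine code. Field names are invented.

EDIT_CHECKS = [
    {
        "field": "tier1_capital",
        "check": lambda v: isinstance(v, (int, float)) and v >= 0,
        "message": "Tier 1 capital must be a non-negative number",
    },
    {
        "field": "rssd_id",
        "check": lambda v: isinstance(v, str) and v.isdigit() and len(v) <= 10,
        "message": "RSSD ID must be numeric, up to 10 digits",
    },
]

def run_edit_checks(row, rules=EDIT_CHECKS):
    """Return a list of defect messages for one submission row."""
    defects = []
    for rule in rules:
        value = row.get(rule["field"])
        if not rule["check"](value):
            defects.append(f"{rule['field']}: {rule['message']}")
    return defects

# A row with a bad capital figure produces one defect message.
sample = {"tier1_capital": -5, "rssd_id": "123456"}
print(run_edit_checks(sample))
```

Because the rules live in a list that can be loaded from a file or a UI, adding a check for a new data domain becomes a configuration change rather than an IT project, which is the essence of the self-service approach described above.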
Vice President, Strategic Consulting, Trillium Software
Jon has over 12 years' experience in information management and data analysis, working for both global consultancies and software vendors with many international clients. Initially working within direct marketing operations, helping clients build a single customer view and optimise customer analytics, Jon developed a detailed understanding of managing disparate data systems. Most recently, Jon has focused on data quality management, advising clients on planning their data quality strategy and building business case justifications for data management initiatives such as Data Governance and MDM. Jon has held senior consulting positions at both HP Consulting and Deloitte.