Compliance with BCBS 239 standard

Discover how we helped a Hungarian bank comply with the BCBS 239 risk data requirements

Standard number 239, issued by the Basel Committee on Banking Supervision (BCBS 239), prescribes requirements concerning risk data aggregation and risk reporting. Its principles comprehensively address all major subject areas: from governance, data architecture and IT infrastructure, through the accuracy, integrity, completeness, timeliness and adaptability of risk data aggregation, to the accuracy, comprehensiveness, clarity, frequency and distribution of risk reporting. Our client trusted us to provide business and IT consulting that would enable them to adhere to six of the fourteen principles of BCBS 239.

The challenges – data architecture, IT infrastructure, accuracy, integrity, completeness, timeliness of risk data aggregation

The BCBS 239 principles selected for focus were those that define data architecture, IT infrastructure, and the accuracy, integrity, completeness and timeliness of risk data aggregation.

Though these principles may seem to represent separate aspects of the requirements, after thorough analysis we distilled them into a single statement that top management could adopt far more readily, with a holistic view: all risk-related data throughout the organization must be collected

  • comprehensively (that is, covering all risk areas, all customer segments, and all products and services in all departments),
  • via automated and audited business and IT processes,
  • in a well-structured data model
  • residing in a central repository
  • accessible, under controlled authorization, to various data consumers across the entire bank.

As the bank had already had a data warehouse (DWH) in production for quite a few years, integrating the data model into it may have seemed the obvious choice. However, the DWH had heavily governed change processes, and the broad community of its consumers would have been affected. After evaluating the forecast overhead of pushing the required changes through the DWH, the proposed solution was to build a separate risk data hub that fully complies with the requirements derived from the principles in focus. At a later stage, once the data in the risk data hub had been audited and proved to be highly accurate and reliable, the DWH would switch over to the same data feeding processes as the risk data hub.

To make the audit trail complete, not only the processes that feed the data hub, but also the processes by which data consumers perform analyses and produce reports from it, had to be assessed and documented in order to quantify the probability of data corruption at each data transfer or processing step.

Our contribution – audited dataflow from source to risk data hub

The risk management domain consultant and the two data management subject matter experts assigned by Advocate committed themselves to accomplishing the following goals:

  • Compile a business data catalog – a business requirements specification document (BRS) – that includes a detailed assessment of each data item that needs to be included in risk data aggregation.
  • Derive a technical-level catalog – a set of system requirements specification documents (SRS) – that specifies, based on the BRS, which IT systems the required data resides in and in what structure, together with a mapping that links the data items described in the SRS to those specified in the BRS.
  • Design the logical data model (LDM) of the risk data hub that can accommodate the data collected.
  • Design a physical data model (PDM) of the risk data hub that can accommodate the data collected.
  • Design the interfaces between the systems involved in the data feed and specify them in a set of SRS documents – one for each source system, as well as for the intermediate systems between the sources and the data hub.
  • Compile a document that describes in detail the successive steps of the processes that transfer data from their sources to the risk data hub. This document contains an assessment of possible data corruption at each step in the process, whether it is an interface that simply transfers data or one that changes the structure or content of the data.
  • Perform an analysis of data transfer/processing dependencies and bottlenecks in order to optimize and parallelize the feed process framework – thus providing for timely (next-business-day) availability of risk data to its consumers.
  • Create a study with recommendations on how the risk of data corruption at individual transfer or processing steps can be mitigated.
  • Support the system analysts and development teams during the implementation of the above.
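The dependency and bottleneck analysis mentioned above can be illustrated with a minimal sketch: grouping feed steps into "waves" such that every step depends only on steps in earlier waves, so all steps within a wave can run in parallel. The step names and dependencies below are hypothetical, chosen purely for illustration:

```python
from collections import defaultdict

def schedule_in_waves(dependencies):
    """Group feed steps into waves: each step depends only on steps in
    earlier waves, so steps within one wave can run in parallel."""
    # dependencies: dict mapping each step to the set of steps it needs
    steps = set(dependencies)
    for deps in dependencies.values():
        steps |= set(deps)
    indegree = {s: 0 for s in steps}
    dependents = defaultdict(list)
    for step, deps in dependencies.items():
        for d in deps:
            indegree[step] += 1
            dependents[d].append(step)
    wave = [s for s in steps if indegree[s] == 0]
    waves = []
    while wave:
        waves.append(sorted(wave))
        nxt = []
        for s in wave:
            for t in dependents[s]:
                indegree[t] -= 1
                if indegree[t] == 0:
                    nxt.append(t)
        wave = nxt
    return waves

# Hypothetical feed steps for illustration
deps = {
    "load_loans":    set(),
    "load_deposits": set(),
    "stage_risk":    {"load_loans", "load_deposits"},
    "aggregate":     {"stage_risk"},
}
print(schedule_in_waves(deps))
# → [['load_deposits', 'load_loans'], ['stage_risk'], ['aggregate']]
```

The number of waves is the length of the longest dependency chain, so it directly exposes the bottleneck that determines whether next-business-day availability is achievable.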

What has been achieved – audited risk data aggregation, analysis and reporting

After concluding the design and implementation of the changes specified in the SRS documents, it became easy to show, for any piece of risk data, which system (or systems) it came from and through what successive transfer and processing steps, with a quantified reliability – inversely proportional to the probability of data corruption – assigned to each step as well as to the whole chain of steps leading from the sources to the reported data. Clearly, any data processing trail inherits the worst reliability, that is, the highest probability of data corruption found among the steps that constitute it.
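The worst-step rule stated above can be sketched in a few lines: the reliability of a trail is the minimum of its steps' reliabilities, and the minimizing step is the one to flag for data quality work. The step names and reliability figures here are invented for illustration only:

```python
def chain_reliability(steps):
    """Given (step_name, reliability) pairs along a data processing trail,
    return the trail's reliability and the weakest step.
    Per the worst-step rule, the trail inherits the minimum reliability."""
    name, rel = min(steps, key=lambda s: s[1])
    return rel, name

# Hypothetical trail from a source system to the risk data hub
trail = [
    ("core_banking_extract", 0.999),
    ("file_transfer",        0.9999),
    ("etl_restructure",      0.995),   # structure-changing step: riskier
    ("hub_load",             0.9999),
]
rel, weakest = chain_reliability(trail)
print(rel, weakest)
# → 0.995 etl_restructure
```

This is exactly the view the data quality team used: the step with the minimum reliability is where improvement effort pays off first.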

Risk managers and the department responsible for regulatory reporting were enabled to use aggregated risk data from a single audited source, available the next business day, with a known data reliability.

The team responsible for data quality management was given a clear picture of where to focus its efforts: on improving the transfer and processing steps that bring down the reliability of the data consumed.

Related Case Studies

Compliance with EMIR Trade Reporting Obligation

The EMIR Trade Reporting obligation mandated by ESMA, together with the related regulations, posed a great challenge for the whole sector. Find out how Advocate consultants ensured that a Hungarian bank's Treasury complied with the new obligations.

2019-07-13

