Solvency II data remediation agenda
Over the last 12 months, regulators have repeatedly highlighted data management as an area where the insurance industry still has more to do, relative to other areas of Solvency II.
For many insurers, including those with relatively mature Solvency II data workstreams, the biggest challenges are yet to come if they are to achieve compliance.
Data governance — moving from design to implementation
Few organizations have implemented the data governance frameworks they designed over the last 12 to 18 months. Establishing a framework is a potentially complex and lengthy process.
Implementation of a data ownership model can be equally challenging. If not properly defined, the use of broad terms such as data owner or data steward can cause confusion, especially where data frequently changes hands and passes through multiple systems and processes.
Insurers must consider all facets of data ownership, for example, who creates or produces data, who receives or consumes data and who is able to change or update data.
It is important that individual roles and responsibilities are clear, and that those individuals receive appropriate support and training to make the roll-out a success.
Data quality assessments — the importance of business rules
When responding to the challenge around data quality, many organizations have commissioned some form of data profiling to identify potential data issues. The risk is that insurers adopt such an approach without first defining a comprehensive set of business rules.
In data management, business rules describe the expected characteristics of the data, for example:
A date of birth field may be required to contain exactly six numeric characters. The format of the field must be DDMMYY, and the date specified must be at least 18 years prior to the date entered in the policy inception field.
A simple profiling routine can then assess whether these rules have been broken for an entire population of policy records or for a sample group. More complex rules could potentially draw upon census data and past trends in policyholder age to assess the validity of the data entered.
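A rule of this kind translates directly into a simple check routine. The sketch below is illustrative only: the function name, message wording and the century-pivot handling of two-digit years are assumptions, not part of any standard, and a production profiling tool would externalize the rules rather than hard-code them.

```python
from datetime import datetime

def check_date_of_birth(dob, inception):
    """Return a list of broken business rules for one record (empty = clean)."""
    # Rule 1: the field must be exactly six numeric characters (DDMMYY).
    if len(dob) != 6 or not dob.isdigit():
        return ["DOB must be six numeric characters (DDMMYY)"]
    # Rule 2: the field must parse as a real DDMMYY date.
    try:
        born = datetime.strptime(dob, "%d%m%y")
    except ValueError:
        return ["DOB is not a valid DDMMYY date"]
    incep = datetime.strptime(inception, "%d%m%y")
    # Two-digit years are ambiguous; pivot apparent future births back a
    # century (an illustrative assumption -- real systems need an agreed pivot).
    if born > incep:
        born = born.replace(year=born.year - 100)
    # Rule 3: the policyholder must be at least 18 at policy inception.
    age = incep.year - born.year - ((incep.month, incep.day) < (born.month, born.day))
    if age < 18:
        return ["policyholder under 18 at policy inception"]
    return []

# A minimal profiling pass over a sample population of (DOB, inception) pairs.
sample = [("150380", "010620"), ("1503", "010620"), ("150310", "010615")]
broken = {rec: why for rec in sample if (why := check_date_of_birth(*rec))}
```

Running the same checks over the full policy book, rather than a sample, then gives a record-level view of which rules fail and how often, which is exactly the evidence a remediation plan needs.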
Without business rules, data profiling can highlight potential issues based on statistical analysis, or generic criteria that are assumed to be unacceptable. Although this can provide early warning of major data integrity issues or confirm long suspected problems, such profiling affords only limited insight into the true quality of data.
It is important that organizations develop business rules that facilitate a comprehensive assessment of the completeness and accuracy of their data by engaging with subject matter experts throughout the business.
Once rules are defined and agreed on by business stakeholders, data profiling methodologies and tools can provide in-depth, insightful data assessments in a highly efficient and repeatable manner.
Remediation — the hidden cost
Insurers and local regulators generally accept that the current state of data quality will likely fall short of Solvency II minimum requirements.
Most Solvency II programs have focused on understanding the current-state data landscape:
- mapping data lineage from model back to source
- identifying existing control points
- baselining the current quality of key datasets
Relatively few programs have identified key deficiencies across this landscape or developed a robust remediation plan to improve their data quality before the end of 2012.
For most insurers, the extent of the data remediation effort required to achieve compliance, and the associated cost and timeframes, remain largely unknown. It is imperative to start planning as soon as possible.
The extent of technological change required to support minimum Solvency II may only become clear once a full remediation analysis is completed and business requirements have been crystallized.
By initiating IT change before a remediation plan is in place, organizations run a greater risk either of going too far, missing the deadline and overspending, or of not going far enough and falling short of the minimum requirements.
Solvency II data remediation priorities
In our view, Solvency II data remediation priorities fall into three broad categories:
- Direct data profiling and cleansing
- Process and control remediation
- Sourcing additional data
These categories are summarized in our Solvency II data remediation agenda.
Internal model validation — focus on the data
Asset, liability and financial data often represent a significant area of change for firms developing, and applying for approval of, an internal model. To gain internal model approval, firms must demonstrate that the model meets further governance requirements and standards of statistical quality.
Firms will need to demonstrate the processes and controls on data flows for both internal and external data. In particular, Article 121 on Statistical Quality Standards requires that all data used for the internal model shall be accurate, complete and appropriate — and this will need to be evaluated as part of the validation of the internal model, regardless of whether the data is sourced internally or externally.
To meet the Use Test, it will be necessary to provide evidence that the firm's governance system has an understanding of the underlying data limitations of its internal model.
Many organizations are performing initial validation of their internal model this year. We recommend that insurers include a robust data validation exercise, including both inputs and outputs, as part of this process.
It is important to recognize that insurers are not all starting from the same place in terms of their technology infrastructure and, therefore, it is difficult to set common industry benchmarks in terms of a minimum level of IT spend for Solvency II. Some programs are making little or no change to current IT infrastructure, while others are investing significant sums in new IT to support Solvency II.