13 minute read 22 Apr 2022

How can banks leverage ever-increasing reporting burdens to create a truly data-driven business model?

Authors
Casper van Hilten

EY Netherlands, Manager Financial Services Consulting

Casper is a manager in the EY CFO Consulting Financial Services team, focusing on data strategy, regulatory transformation and digital finance.

Igor Djukic

EY Netherlands, Senior Manager Financial Services Consulting

Igor is a Senior Manager in the EY CFO Consulting Financial Services team, focusing on data strategy, data management, regulatory reporting, and digital finance.

Given the challenges banks are facing from external factors such as regulatory requirements and technological developments, we envisage three main focus areas for developing a truly data-driven business model.

In brief:

  • Formulating an explicit data strategy is essential as banks are uniquely challenged to integrate defensive and offensive data capabilities
  • In order to execute a data strategy, it is of paramount importance to invest in the right platforms and lay a future-proof data foundation while maintaining flexibility and agility
  • People and culture cannot be neglected in any transformation process while becoming a data-driven bank

In our previous article, we explored how the trifecta of regulatory, technological and competitive pressures is uniquely challenging banks’ business models. Increasing reporting burdens lead to rising expectations of existing finance and risk organizations, while technological developments open the door to transformation towards more data-driven business models. As competitive pressure mounts from peers and digital-native banks to utilize the tools of the data revolution, we suggest banks focus on the following three areas.

  1. Integrate your reporting strategy with your overall data strategy and governance

The importance of a data strategy

Only two decades ago, data and reporting were mostly regarded as a tedious finance and IT exercise, not adding much direct value to the bottom line of many banks. However, with the advent of the big data revolution, data is becoming a strategic asset and factor of production for many banks. According to a recent study, 83% of leading financial services firms say that data is their most valuable asset, even though only 16% consider themselves “excellent” at extracting value from their data. According to the study, Data & Analytics technologies could potentially provide banks with up to one trillion euros in annual revenue.

A first step to unlock that value is to formulate and design a data strategy. Broadly speaking, data strategies can be divided into offensive and defensive strategies. In this dichotomy, defensive strategies generally aim to achieve security, quality and compliance, while the objective of offensive strategies is to improve one’s competitive position. A key challenge here is the trade-off between control and flexibility in terms of how you use your data. If your bank currently has a low maturity in data analytics & management capabilities, it might make sense to focus on a more defensive data strategy. This ensures the appropriate organization and foundation is built so that a defensive prerequisite such as high-quality reporting is attained. Given the strict data governance, quality and reporting requirements to which banks are subjected, they are in a unique position to at least formulate strong and coherent defensive strategic objectives.

As competitive pressures mount and a solid data foundation is established, a pivot can be made to a more offensive strategy, in which this foundation can be leveraged to provide customers with high value insights and further personalized product offerings. Without the articulation of a strategy, initiatives will remain fragmented and the overall attitude towards data and reporting highly reactive. This in turn leads organizations to accumulate technical debt, as components of data infrastructure are modernized while no overall strategy exists for phasing out legacy systems.

Integrated data governance

Data governance, according to DAMA, takes its place at the heart of your company’s data management organization and should be thought of as supportive of your overall data strategy. Banks are in a unique position to use regulatory standards as a lever to build best-in-class data management and governance practices since they are already obliged to do so by their supervisors. For example, BCBS 239 requires banks to have strong governance arrangements in place for their risk data aggregation and reporting practices. These standards are also gaining traction beyond the banking industry as best practices are being established and shared across industry lines.

Taking a holistic approach to your data management and governance creates several advantages. First, management may become “regulatory fatigued” as significant investments and expensive programs need to be established to become compliant. Turning that built-up knowledge and those established frameworks to more offensive use can reinvigorate the organization’s efforts in these areas.

Second, a reporting data lifecycle can be highly conducive to an analytics data lifecycle, as much data has already been defined, cleaned and transformed into meaningful information products. Typically, the reporting data lifecycle starts with data definition, continues with data collection and transformation, and ends with data exploration and reporting for regulatory purposes. If this process is supported by a highly integrated data dictionary, ensuring a common vocabulary and understanding of data elements across domains, this facilitates the use of the data for more strategic purposes as well.

By effectively creating a Single Source of Truth according to the define once principle, an integrated data dictionary not only facilitates regulatory compliance and data quality, but also enhances business understanding and thereby data-driven decision making. This Single Source of Truth can consequently be supplemented with domain-specific knowledge at the level of data consumption, effectively creating Multiple Versions of the Truth which are still consistent with the enterprise-wide understanding of the data. Banks are well positioned to use this approach as the required data controls and governance necessary are already in place or are being implemented.
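To make the idea concrete, the “define once” principle with domain-specific enrichment can be sketched as a small lookup mechanism. This is purely illustrative; all element names, definitions and domains below are hypothetical, not drawn from any bank’s actual dictionary.

```python
# Illustrative sketch of an integrated data dictionary built on the
# "define once" principle: one canonical entry per data element, with
# domain-specific overlays layered on top at the point of consumption.

CANONICAL = {
    "exposure_amount": {
        "definition": "Outstanding on-balance exposure per counterparty",
        "unit": "EUR",
        "owner": "Group Finance",
    },
}

# Overlays enrich, but never redefine, the canonical entry - giving
# "multiple versions of the truth" that stay consistent with the core.
DOMAIN_OVERLAYS = {
    "risk": {"exposure_amount": {"usage": "Input to RWA calculation"}},
    "marketing": {"exposure_amount": {"usage": "Customer value segmentation"}},
}

def lookup(term, domain=None):
    """Resolve a term to its canonical entry, enriched for a domain."""
    entry = dict(CANONICAL[term])  # copy: the single source of truth stays intact
    if domain:
        entry.update(DOMAIN_OVERLAYS.get(domain, {}).get(term, {}))
    return entry
```

Because every consumer resolves terms through the same canonical entry, the risk and marketing views can differ in usage context while remaining consistent on definition, unit and ownership.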

  2. Invest in a future-proof data foundation that facilitates further integration, scale and data use

Re-evaluate your current technology stack and data architecture

Many traditional financial institutions are dealing with huge quantities of data that they are ill-equipped to take full advantage of. In our experience, most banks cannot even access, let alone use and analyze, all the data they acquire. Legacy IT and data architectures, in some cases built decades ago, are rendered obsolete by today’s requirements.

Legacy technology is costing financial institutions in two major ways. First is the increased cost of running the bank on outdated technology, leading to rising operating expenditures: more is being demanded of these systems than they were originally designed for. Moreover, tinkering with existing systems and the corresponding proliferation of end-user computing for reporting purposes results in more and more technical debt being accumulated. This in turn makes the transition to a more future-proof architecture even more difficult.

Second is the opportunity cost of forfeited income from innovative use cases. This is the consequence of investing in keeping legacy architecture alive, instead of building the bank that customers want and regulators demand. In this sense, legacy systems are inhibiting institutions from growing and keeping up with digitally native FinTechs.

Given these challenges and your defined data strategy, it is imperative to critically examine your current tech stack and data architecture. Is it future-proof, and does it enable or inhibit your strategic objectives? The cornerstone of most modern data architectures is the data lake or data mesh. We see two considerations that play a major role here as far as financial institutions are concerned.

First is zero data latency, which refers to the ability to use and process real-time data. This includes the capability to use both batch and real-time data processing. The ultimate objective here is to be able to provide customized products and services on demand and in real time. But a data architecture with high (real-time) availability also enables data consumers to create tailor-made and automated reports for both management and regulatory reporting, which are often time-critical.
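The difference between the two processing modes can be sketched with a toy aggregation: a batch job sees the full dataset at once, while a streaming consumer emits a running result as each event arrives. The event structure here is a hypothetical example, not a real reporting feed.

```python
# Toy illustration of batch versus (simulated) real-time processing
# of the same events; the "amount" field is a hypothetical example.

def batch_total(events):
    """Batch mode: process the full set of events in one pass."""
    return sum(e["amount"] for e in events)

def streaming_totals(events):
    """Streaming mode: emit a running total as each event arrives."""
    total = 0
    for e in events:
        total += e["amount"]
        yield total

events = [{"amount": 100}, {"amount": 250}, {"amount": 50}]
print(batch_total(events))             # -> 400
print(list(streaming_totals(events)))  # -> [100, 350, 400]
```

Both modes converge on the same final figure; the streaming variant simply makes intermediate results available as soon as the data lands, which is what enables time-critical reporting and on-demand insights.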

Second is increased data consolidation. As information is historically stored in different departments and legacy systems, data is often not readily accessible to end users, and complex data exchange processes need to be established in order to comply with reporting regulations. This is further complicated by scattered data management and a lack of semantic integration of different data concepts. A data lake as the cornerstone of your data architecture enables organizations to break down data silos and create a Single Source of Truth for all data necessary for consumption.

Making all source data available according to pre-defined definitions and data governance principles enables the consumption of data from different domains through a single platform. At the same time, a platform of this nature enables end users to create specific views and tailor-made data products at the consumption side of the lake. A key prerequisite here is the right technological know-how and a well-functioning data governance organization. Further to the last point, the focus should be on data quality, which is a key consideration on the consumption side, especially as it relates to regulatory reporting.
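A minimal sketch of such consumption-side quality control is a set of per-field validation rules applied to records as they are drawn from the lake. The rule set and field names below are hypothetical examples, not a prescribed standard.

```python
# Minimal sketch of consumption-side data quality checks on records
# drawn from a data lake. Fields and rules are hypothetical examples.

RULES = {
    # Each rule returns True when the field value is acceptable.
    "customer_id": lambda v: v is not None and str(v).strip() != "",
    "balance_eur": lambda v: isinstance(v, (int, float)),
}

def validate(record):
    """Return the list of fields in `record` that fail a quality rule."""
    return [field for field, rule in RULES.items()
            if not rule(record.get(field))]

record = {"customer_id": "C-001", "balance_eur": "n/a"}
print(validate(record))  # -> ['balance_eur']
```

In practice such rules would be derived from the integrated data dictionary, so that the same definitions driving regulatory reporting also gate the data products consumed by analytics users.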

Modular data architecture design to increase agility

As change accelerates, competitive forces increase and regulatory demands are adjusted more frequently, agility becomes one of the key capabilities of the future data-driven bank. From a data and technology perspective, this means building your foundation in a highly modular fashion as opposed to using pre-integrated tools and solutions. A modular data architecture is composed of several components that can be connected to or disconnected from each other; the key is that parts of the architecture can be added or removed without affecting the overall system.

A modular design can be achieved by using concepts emerging from DevOps which increase code portability. One example is container technology (such as Docker), which isolates software from its environment and removes dependencies on specific providers, helping to prevent vendor lock-in. Most vendors these days support container technology, making it easier than ever to move applications to alternative (cloud) vendors if necessary. Analytics solutions such as Amazon SageMaker and Kubeflow build on these concepts; they can connect to a large variety of backend solutions and databases and enable the design of highly modular systems.
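As a minimal illustration of how containerization decouples an application from its environment, a Dockerfile like the following packages a service together with its dependencies so it runs identically on-premise or at any cloud provider. The image tag, file names and entrypoint are hypothetical assumptions for the sketch.

```dockerfile
# Hypothetical sketch: packaging a reporting service as a container.
FROM python:3.11-slim

WORKDIR /app

# Install the service's own dependencies into the image, so the host
# environment does not need to provide them.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# The container carries everything it needs; moving it between
# environments does not tie it to a specific vendor's stack.
ENTRYPOINT ["python", "reporting_service.py"]
```

Because the container image is self-contained, swapping the underlying host, orchestrator or cloud provider becomes a deployment decision rather than a re-engineering effort, which is what makes the architecture modular.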

From on-premise to cloud

One of the most disruptive forces for the modern data architecture is the move towards cloud solutions, which provide the opportunity to scale applications and storage rapidly as data capture grows with the business. The large Cloud Service Providers (CSPs) – Amazon, Google and Microsoft – have truly revolutionized the way organizations use and run their digital infrastructure and applications at scale. Specifically, cloud offers on-demand storage and on-demand computing and processing power, scaling both up (vertical scaling) and out (horizontal scaling) as needed. In addition to providing the flexibility needed in a rapidly-changing world, the move to cloud can lead to significant cost savings, as cloud-based platforms usually run on a variable cost model: upfront costs are negligible compared to installing extensive and expensive on-premise systems, and no costly overhead is needed for on-premise capacity that might never be used.

Using a hybrid cloud (combining on-premise and cloud solutions) or multi-cloud (using several cloud providers) also provides the additional benefit of redundancy, enabling the high availability of data needed for time-sensitive reporting and insights. On the flip side, regulators are putting special emphasis on certain aspects of using Cloud Service Providers, such as detailed security considerations. For example, the EBA has published stringent outsourcing guidelines for credit institutions in the EU. These have been supplemented by specific recommendations for institutions considering the use of Cloud Service Providers, including the right of authorities to audit, the right of physical access to the premises of the CSP, and guidance on the security and data systems used. According to the EBA, institutions should adopt a risk-based approach in this respect and implement adequate controls and measures, such as encryption technology, to ensure the security of the overall architecture.

Critically examine vendors and vendor alliances

Beyond security considerations, the EBA believes institutions should also have specific contingency plans and exit strategies in place for the outsourced parts of their data infrastructure. This emphasizes the need for a coherent and flexible outsourcing and vendor selection strategy when creating a future-proof data foundation. Agility is again an important consideration in this respect, as is preventing vendor lock-in. Vendor lock-in occurs when the cost or effort of switching platforms outweighs the benefit of doing so. Institutions should have clear exit strategies for their outsourcing partners and avoid being locked into one service provider’s technology stack. Fortunately, modern architecture strategies such as containerization (leading to loosely-coupled applications) and multi-cloud enable institutions to maintain flexibility and retain some leverage over their vendors.

  3. Develop the skillsets and culture needed for a truly data-driven model

While building the technical data foundation is critical to become truly data driven, many companies overlook the importance of developing the right skillsets and culture to accommodate such a model. Especially in non-digitally native institutions, it can be difficult to obtain the appropriate amount of buy-in and enthusiasm from existing employees and management.

Several steps need to be taken to create buy-in from the organization. First, it is important not to “impose” a data culture or strategy that emphasizes the use of data for data’s sake: the business objective and value-creating nature of better decisions and processes should always be at the forefront. Second, the “tone at the top” needs to be unambiguous, with a clear commitment from senior management to becoming truly data driven. This requires top management to articulate a clear vision and understanding of the required capabilities and to communicate these coherently throughout the organization. Third, the organization needs “data missionaries” who bridge the gap between the business and technology teams and are able to educate and create enthusiasm for the use of data. Lastly, it is important to define the different roles and skillsets needed within the organization and consequently consider whether to fill those roles with external talent. If so, it is important to focus not only on culture-fit but also on culture-add throughout the hiring process: what does a candidate bring in terms of skillset, mindset and experience that complements your organization and helps build a data-driven culture? Overall, the creation of a data culture is a key driver in becoming data driven, and if done successfully can infuse enthusiasm and energy around data within the organization.

Conclusion

As reporting obligations and related data requirements increase, banks are in a unique position to use these as a launchpad for developing a truly data-driven business model. The transition to a data-driven business model is vital given competitive pressures and technological developments that have enabled the rise of digitally-native banks.

In order to achieve this, banks would benefit from developing a holistic strategy and vision around data, for which internal and external reporting requirements could be a useful accelerator. Once a coherent strategy has been laid out, it is crucial to develop the right data foundation catering to the pre-defined strategic objectives. Finally, the true value of newly-acquired data assets and technology can only be unlocked once the right skillsets and culture around data are well developed within the organization.

Summary

Banks are in a unique position to develop a truly data-driven business model. Indeed, adapting is becoming vital given the rise of digitally-native banks.

To get there, banks need to develop a holistic strategy and vision around data, build the right data foundation to support it, and cultivate the right skillsets and culture around data.
