The cost of good data is high; the cost of bad data is ruinous

Never has it been more important for banks – especially those with assets of over $10 billion – to implement rigorous standards for data quality, governance and control. Failure to do so will have a whole host of negative consequences, writes Eugene Ludwig.

Last month I talked about the evolving technology landscape and its impact on banking, outlining my intention to cover various technology-related topics in future opinion pieces. I begin this series with data and data management, both because it is a critical area for a successful tech stack implementation – it must be done right for other projects to work well, or at all – and because it is a weak point in many banking organizations.

Data quality, management, control and use are becoming increasingly important for banks. At its core, a bank is a balance sheet – a collection of assets and liabilities. Unlike retailers, wholesalers or professional services firms, banks own few physical assets. Without solid data, you are not a solid bank. If you take away a bank’s general ledger, it can no longer track assets and liabilities – and is essentially bankrupt.

Soon, data quality and management will be critical to remaining a highly competitive, best-in-class financial institution and receiving positive regulatory ratings. For larger institutions, the future of high-quality data management programs is now. Almost every technology tool, especially those powered by artificial intelligence, relies on good data. But for too many banks, their data programs are seriously lacking.

In 2013 the Bank for International Settlements published its Principles for effective risk data aggregation and risk reporting, which apply to the 31 global systemically important banks, and it has issued regular progress reports since then. The most recent report, from November 2023, shows that only two of the 31 banks are fully compliant. This reflects, at least in part, the reality that as the economy and technology change, there will always be work needed to bring new products and services into compliance. Leading banks may be working with fintech partners on innovative new business models, exploring generative AI, or positioning themselves for cryptocurrencies – all of which involve leveraging new data that must be accurate, protected and integrated into their data management frameworks. Moreover, the technology itself is becoming more powerful, so the stakes keep rising.

Of course, the strategic value of data management goes beyond regulatory compliance, but the two are closely linked. The recent failures of Synapse and Silicon Valley Bank underscore the need for robust data management to mitigate risks arising from complex business models, rapid data growth and data velocity. Events in the capital markets, such as the Archegos collapse, have increased the focus on more effective management of counterparty risk in investment banking and trading.

Data quality, architecture, management and interoperability pose significant challenges for banks. Banks collect vast amounts of data from customers, suppliers, markets and internal sources, but much of this data remains siloed and is captured in ways that limit its usability. Manual data entry exacerbates these problems because it requires extensive “cleaning” to make the data usable. While data collection, storage and processing practices vary from institution to institution, the general problem of dirty and/or idiosyncratic data is pervasive. These shortcomings limit how usable the data is within individual business units and across the bank as a whole.

This is a difficult problem to tackle properly. Banks face significant challenges in achieving flawless data quality and management. Even smaller banks with limited business lines struggle with manual data entry and with inconsistent data capture and manipulation across systems and vendors. As banks grow and diversify, ensuring data consistency across business lines becomes increasingly complex. Are Gene Ludwig and Eugene Ludwig the same person? What about ABC Advisory, LLC, which is owned and controlled by Gene Ludwig?
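To make that question concrete, here is a minimal, illustrative sketch (in Python, not drawn from the original piece) of the kind of record-matching check a bank’s data team might run to decide whether two customer records refer to the same party. The nickname table, legal-suffix list and similarity threshold are assumptions for illustration only; a production entity-resolution program would also weigh addresses, tax identifiers and ownership links.

```python
from difflib import SequenceMatcher

# Assumed, illustrative lookup tables -- not from the original article.
NICKNAMES = {"gene": "eugene", "bill": "william", "bob": "robert"}
LEGAL_SUFFIXES = {"llc", "inc", "ltd", "corp", "co"}

def normalize(name: str) -> str:
    """Lower-case, drop punctuation and legal suffixes, expand known nicknames."""
    tokens = [t.strip(".,") for t in name.lower().split()]
    tokens = [NICKNAMES.get(t, t) for t in tokens if t not in LEGAL_SUFFIXES]
    return " ".join(tokens)

def likely_same_party(a: str, b: str, threshold: float = 0.85) -> bool:
    """Fuzzy comparison of normalized names. A real entity-resolution pipeline
    would also compare addresses, tax IDs and ownership links before deciding."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

print(likely_same_party("Gene Ludwig", "Eugene Ludwig"))           # True
print(likely_same_party("ABC Advisory, LLC", "ABC Advisory LLC"))  # True
```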

Acquiring another institution—which may have solved problems differently or not at all—adds additional complexity to data management. While siloed operations can mask these problems, consolidated data is needed to achieve enterprise-wide efficiencies and cost savings.

In the past, some banks could operate with low-quality data programs because bank personnel supplemented important information with knowledge of their specific silos and of the people or environments that mattered to them. Today, with larger banks, increased competition from fintechs and a growing reliance on data-driven technology, that is less and less the case.

For many banks facing margin erosion, competition, and high data management costs, prioritizing data management and control may seem like a luxury. However, this can prove to be a false economy. First, banks must capture and manage sensitive customer data, which is why thoughtful data management is critical to cybersecurity. In addition, effective data management is essential to defend against charges of compliance violations and to identify new trends that may threaten the health of the bank, from customer trends to issues related to balance sheet composition.

While the benefits of high-quality data are often recognized, cost pressures often lead to suboptimal workarounds and solutions. This approach can be counterproductive due to the competitive environment, the complexity of the bank’s business model and/or the size and complexity of the bank itself. In addition, regulators increasingly expect banks, especially larger institutions, to have robust data management programs in place.

It is easy to express the importance of a high-quality data architecture, but actually building that architecture is difficult.

Attracting top talent and encouraging collaboration across business units requires senior management commitment, backed by the Board and by budget. Achieving a bank-wide standard means reconciling existing definitions, and trade-offs for the good of the whole have historically been hard to achieve. But establishing a baseline, setting goals and measuring continuously are critical to progress.

First, a competent Chief Data Officer who has both vision and execution skills is critical. Second, it is important to develop a roadmap and start the journey, even if the path to a world-class data program is long. Delays only exacerbate the challenges as new programs, products and data complexities emerge.

Ultimately, a robust data management program is critical to the long-term success of banks, especially those with assets exceeding $10 billion.
