Here is a Data Approach that is Helping Financial Services Firms do More with Less

Banks are under siege, and it looks like this is the new normal: many new entrants are emerging and focusing on the most lucrative areas of banking. Banks are also under pressure from regulators; the requirements are not new, but they keep growing and are layered on top of everything else. Most of the people I know in the industry feel overworked at an unprecedented level and are being asked to do more and more with less and less.

This is certainly the case with data: more data is needed and data is increasingly seen as strategic, yet there is growing pressure to reduce costs and increase efficiency.

Typically, stakeholders from finance, risk, fraud and operations are siloed and have little interest in collaborating, especially when it comes to data. Yet they all care about the same data elements and activities: payment transactions, liquidity-relevant events, information about the parties and counterparties involved in transactions, a historical view of behaviour, and a perspective on which figures reconcile against the general ledger (GL). Each of these is a challenge to define, capture and store, yet financial services organizations have typically built these solutions separately. Data semantics, storage, movement and history are all duplicated, which is expensive and very brittle when anything changes. Some firms have realized that it is time to take a new look at how this is handled.

The approach that has evolved recently separates the view of the data from its storage: each stakeholder gets a unique lens on the data, while the storage, movement and history are centralized. The semantics are interpreted at view time rather than hard-coded into the model.
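To make this concrete, here is a minimal sketch in Python of the view-time-semantics idea. All the names here (RawEvent, EVENT_STORE, finance_view, fraud_view) are hypothetical, not from any particular platform: events are captured once in a schema-light store, and each stakeholder's meaning is assigned as a view when the data is read, not when it is stored.

```python
from dataclasses import dataclass
from datetime import date

# One shared, schema-light store: raw payment events are captured once,
# with no stakeholder-specific semantics baked into the model.
@dataclass
class RawEvent:
    event_id: str
    booked_on: date
    amount: float
    currency: str
    party: str
    counterparty: str
    attributes: dict  # loosely typed payload; meaning is assigned at read time

# Stands in for the centralized storage layer (in practice, a Big Data platform).
EVENT_STORE: list[RawEvent] = []

# Each stakeholder defines its own "lens": a view function that interprets
# the same stored events according to its own semantics.
def finance_view(events):
    """Finance lens: amounts aggregated per currency, for GL reconciliation."""
    totals = {}
    for e in events:
        totals[e.currency] = totals.get(e.currency, 0.0) + e.amount
    return totals

def fraud_view(events):
    """Fraud lens: the same events, read as behaviour per counterparty."""
    by_counterparty = {}
    for e in events:
        by_counterparty.setdefault(e.counterparty, []).append(e)
    return by_counterparty
```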

The technology helping to enable this is often referred to as "Big Data", but the real story is not the technology itself. It is how it enables the separation of semantics from storage, which lets loosely connected or even disconnected groups work from the same platform without having to agree on the low-level details of data definitions.
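Continuing the hypothetical sketch above, two groups can then work from the same stored events without first agreeing on each other's field definitions:

```python
# Capture a transaction once, into the shared store.
EVENT_STORE.append(RawEvent(
    event_id="t-001", booked_on=date(2014, 2, 1), amount=250.0,
    currency="EUR", party="acme-bank", counterparty="globex",
    attributes={"channel": "swift"},
))

# Finance and fraud each apply their own lens to the same events;
# neither needed the other to sign off on a shared schema up front.
print(finance_view(EVENT_STORE))        # {'EUR': 250.0}
print(fraud_view(EVENT_STORE).keys())   # dict_keys(['globex'])
```

The design point is that adding a new stakeholder means adding a new view function, not building another copy of the storage, movement and history.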

This approach can help save as much as three quarters of the data costs associated with finance, risk, fraud and operations, and even more once the cost of change is considered. What it requires is an open mind and a willingness to adopt a new model. Since spend in this area can be massive for larger institutions, the potential savings run to tens or hundreds of millions. Given these financial incentives, I expect these approaches to continue to be adopted by banks that are serious about efficiency.


Original Post: http://www.corebankingblog.com/2014/02/dataapproach/
