I spoke to a really frustrated “Client Reporting Data Manager” at the FSO “Investment Management Industry Transformation and Outsourcing Strategies Forum” in London on April 20th last.
Their issue was that their institutional client reporting team spent more time fixing up masses of data prior to publication than they did actually reporting to clients.
I have referred to this concept on many occasions as “just-in-time” data management. The just-in-time data management operating model can be a disaster, and I would not recommend it as a modus operandi.
So how do you go about getting out of the state of “data damnation”?
First of all you need to drop the operations hat and don the sales hat – you clearly have an issue, and you are going to have to get buy-in, both top-down and bottom-up, that the issue should be addressed.
Next question: how do I go about getting buy-in that there is a problem that needs to be solved? Before you start talking about your problem you need to build a business case – don’t waste valuable C-level time bringing a problem to the table without bringing the solution. Remember that at C-level many of the actors are not aware there is an issue. Using the duck pond analogy: what they see is a duck swimming gracefully across the pond, i.e. they believe that the company’s client-facing data is of good quality – timely, accurate and consistent. What they do not realize or see is that beneath the surface the duck’s legs are paddling furiously, i.e. the process of producing high-quality data is enormously manual, non-systematic, high-risk and resource-intensive.
1. Build a solid business case that highlights the upsides of moving away from the ‘just-in-time’ model to a model structured around governance, decentralized ownership, accountability, oversight and transparency. Examples of upside sells are:
- Better client-facing data will mean happier, “stickier” clients. Your sales/distribution network will place greater trust in your data, and you will reduce the risk of outflows, loss of mandates etc. caused by poor-quality data reaching your clients. Identify clients/mandates you have lost due to poor service or bad data quality, and identify the exact financial cost to your company.
- Identify the potential upside in new mandates and inflows from being recognized in the market as a brand with excellent, high-quality data.
- Identify how your own team’s output will improve – get specific about the activities you will be able to devote more time to once you are no longer chasing your tail, fixing data at the last minute.
2. Outline the risks that will be mitigated by moving to the new target model – you need to don the insurance salesperson’s hat here. You should cover the following:
- Identify the cost of the accident which is waiting to happen
- Identify the probability of the accident happening if no action is taken
- Put an actual value on the damage to your brand and reputation. What would it cost, from a marketing perspective, to dampen the negative PR resulting from the accident? Some would argue your brand and reputation are priceless – that is because the PR cost to put things right often runs into millions or tens of millions of dollars. Consider the impact on your AUM base – note the recent $400 million outflow from AXA Rosenberg due to negative news, reported on FUNDfire on April 29th 2010: “AXA Rosenberg has been fired from a $400 million enhanced large-cap equity mandate by the Florida State Board of Administration...”
- Put a value on the cost of a fine from the regulator – remember that fines now commonly run to seven figures.
- What impact would a regulator fine have on your brand?
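The risk argument above boils down to a simple expected-loss calculation: the probability of the accident multiplied by its all-in cost (PR clean-up, regulatory fine, lost mandates). A minimal sketch in Python, where every figure is a purely illustrative assumption and not taken from any real case:

```python
def expected_annual_loss(incident_cost, annual_probability):
    """Expected yearly loss from a data incident: cost x probability."""
    return incident_cost * annual_probability

# Hypothetical inputs for illustration only -- substitute your own estimates:
pr_cleanup_cost = 5_000_000            # marketing/PR spend to repair the brand
regulator_fine = 2_000_000             # a seven-figure regulatory fine
lost_mandate_revenue = 400_000_000 * 0.01  # e.g. fee revenue lost on a $400m mandate at a 1% fee

total_incident_cost = pr_cleanup_cost + regulator_fine + lost_mandate_revenue

# Assume a 10% chance per year of the accident happening if no action is taken:
loss = expected_annual_loss(total_incident_cost, annual_probability=0.10)
print(f"Expected annual loss: ${loss:,.0f}")
```

Even with deliberately conservative assumptions like these, the expected annual loss gives the C-level audience a concrete number to weigh against the cost of fixing the operating model.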
3. Outline the costs that will be saved and include:
- How many FTEs will be reduced / re-allocated as a result of your new operating model?
- How will your vendor relationships change? Outline how much simpler it will be to switch particular vendors once you have a clean data interface. Vendors who supply services such as client reporting, automated fact sheets, micro-sites and compliance typically have deeply embedded relationships that are difficult to shift – they know this and charge a premium as a result.
If you do not have a strong data governance organization permeating your company, set about introducing one – this really does require strong “C-level” leadership and drive – many companies adopt the ‘Chief Data Officer’ role, or Data Tzar, while others employ a broader steering committee approach where senior data stewards oversee the data governance at a company level. Each approach has its own merits and typically the organization’s culture will determine the best fit.
Identify data stewards who will take ownership of data at its ‘origination’, i.e. at the earliest point in your structure – where the data enters your structure or is created within it. This is the bottom-up aspect of the ‘sales process’. It will be a thankless, fruitless task if you have not executed the top-down sales process first.
I will follow up soon with a post that deals with what the target operating model for client-facing data should look like…
As an aside, at the same FSO event, I was the moderator on the “Thought Leadership: Best Practices for Data Management, Performance Measurement and Client Reporting” panel.
The background theme to the panel discussion centered on the rapid technological advancements and evolving operational initiatives that have brought into focus the importance of centralized data management. These changes also highlight the need to translate mundane data into meaningful strategies and analysis to enhance client reporting. The panelists’ goal was to debate the pressures of effective data management and the role of shared industry data utilities in the financial services sector. The discussion was also to focus on the latest technological advancements that support valuable data management, improved client reporting and servicing and a sound performance measurement framework.
The specific topics discussed were:
- Drivers for re-architecting data management post the financial crisis