I was at the TSAM (UK) 2010 event held on March 9th in London and was lucky enough to get myself onto two of the panel discussions in the data management stream.
The following is a synopsis of discussion one – “How to get the message across internally that investment in data management should be done”. The discussion was broadly themed around the following topics:
- Getting buy-in from the business to generate value – where and how?
- How to get action plans signed off and accepted through the ranks
- The impact of data quality on exposure to risk, client satisfaction, costs and audit overhead
- Considerations around outsourcing / off-shoring to create a data management utility, and the cost factor
- What are the implications of delivering poor quality data to the market?
The panelists were:
- Hans Lux, Enterprise Data Architect, UBS Global Asset Management
- Shannon Walker, IT Architect, Deutsche Bank
- Ronan Brennan, Chief Technology Officer, MoneyMate
- Colin Close, President, Netik
- Gerard Walsh, Head of Change Management, Web and CRM, Schroders
- Danielle Newland, Product Manager, Data Management, Eagle Investment Systems
- Abbey Shasore, Chief Executive Officer, Factbook
The discussion centered on how you should go about getting buy-in from the business for investment in data quality management. Some of the key points made were as follows.
One panelist’s view was that you have to prove you are getting value for the business.
It is challenging, as you have to get funding to fix the problem, and a lot of the time more people are simply “thrown” at it.
“C-Level” do not necessarily care how much time people are spending on this area – they are more concerned with whether it is happening.
Clear viewpoints were expressed that, to assist the selling process, you need to provide metrics to support the buy-in request (see the sketch after this list), e.g.
- How many adjustments do you make to your reports each month?
- How can you report to a client on, e.g., “what is my exposure to x?” (where x is a troubled company)?
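To make the first of those metrics tangible, here is a minimal sketch of how you might count manual report adjustments per month from an adjustments log. It is purely illustrative – the file name, CSV layout and “date” column are assumptions, not any particular system’s format.

```python
import csv
from collections import Counter

def adjustments_per_month(path):
    """Count manual report adjustments per month.

    Assumes a CSV log with one row per adjustment and a 'date'
    column in YYYY-MM-DD format -- both hypothetical conventions.
    """
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["date"][:7]] += 1  # bucket by YYYY-MM
    return dict(sorted(counts.items()))

# e.g. adjustments_per_month("adjustments.csv")
# -> {'2010-01': 143, '2010-02': 167}  (invented numbers)
```

Even a crude number like this gives you exactly what the panel was asking for: month-on-month evidence of how much manual patching is really going on.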
Senior management often do not realize just how much work goes into data cleansing.
Also, sometimes people in the middle office are the hardest to convince – they are used to current practices and “it’s the way we’ve always done it”. People “in the trenches” can be convinced more easily, as they know exactly what is involved and how much pain they go through to get their data to market.
Another panelist was of the opinion that data management is often not the main project; the main project will typically be around outsourcing or client reporting. The difficulty is sometimes building the case and showing that data management is a necessity. The “audit argument” can be your best friend here – being able to demonstrate audit trails for all of your data points.
My own view here was that the likelihood of getting buy-in is directly correlated with how well data governance is already managed within the organization. If the organization does not have an existing governance structure, be it a data czar regime or committee-led, then it is unlikely that data management and data quality are high on the C-level agenda, and this will make life harder.
My point here was that if a culture of ownership and accountability for data quality does not already exist, then that is in fact your first challenge, and you need to get the message across about the relative advantages that strong data governance delivers.
Additionally, I tried to make the point that there is no point selling just a positive or a negative story – you need a well-balanced argument that quantifies how it will drive costs down or make the business more efficient, balanced with the upside stories of client retention, satisfaction and inflows, and counterbalanced with the risk-mitigation scare stories – or, as Colin Close eloquently referred to them, “the accident that has not happened yet”.
Another panelist’s view was that, ideally, projects should not be positioned as data management – if you go to your COO and say there is a problem with our data, they will respond “what’s wrong with it, and why haven’t you fixed it already?” – which, to be honest, is not very far from many people’s reality. The key is to demonstrate that you will either generate more revenue or reduce costs – or preferably both!
There was a question from the floor: “how do I get my Finance Director to sign off an investment of half a million dollars in a problem they don’t recognize?”. This generated some stimulating debate along the lines of:
- It’s back to generating revenue, attracting more customers or else reducing costs.
- Data management is a “secret” strategy – it might be perceived as a “nice to have” – always bring it back to costs, performance etc.
- Vendors must prove value and benefits achieved – and – demonstrate real ROI.
In summary, though, the panelists’ views were fairly clear – ensure you have a very clear ROI and a real business case.
To whatever extent possible, deliver real-world cost-benefits – be subjective if you have to – but do not oversell on fear; if your case is built on clear, quantifiable measures, the proposal will sell itself.
Next the discussion moved on to considerations around offshoring and outsourcing, and particularly how each could impact data management.
Again the panel had a clear and common viewpoint – data ownership, accountability and transparency are all key aspects you must get right before you engage.
Don’t try to push your existing issues over the proverbial “fence” – this was also a key element of a later talk presented by Invesco.
Gerard from Schroders made an interesting aside at this stage which is worth sharing: “what piece of data is never wrong?”
It is a really excellent point, and it goes back to ownership – find the person responsible for each piece of data, make sure they are accountable, and make sure that their ownership is transparent, i.e. track and measure quality, and ensure MIS is centralized and visible to all players, albeit at different levels of ‘depth’.
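As a rough sketch of what that could look like in practice – the field names and figures here are invented, purely for illustration – a centralized quality scorecard might roll error rates up by accountable owner:

```python
from dataclasses import dataclass

@dataclass
class QualityRecord:
    field: str      # the data point, e.g. a coupon rate
    owner: str      # the accountable person or team
    checked: int    # values validated this period
    failures: int   # values that failed validation

    @property
    def error_rate(self) -> float:
        return self.failures / self.checked if self.checked else 0.0

def scorecard(records):
    """Print one centralized, transparent view: quality by owner."""
    for r in sorted(records, key=lambda r: r.error_rate, reverse=True):
        print(f"{r.owner:<16} {r.field:<14} {r.error_rate:6.1%}")

scorecard([
    QualityRecord("coupon_rate", "fixed-income", 1200, 18),
    QualityRecord("nav", "fund-accounting", 900, 2),
])
```

The same underlying numbers can then be surfaced at different levels of depth – detail for the trenches, trends for the C-level.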
Another panelist thought that, when outsourcing, the client must have a very clear picture of what they want to do and where they want to go.
One of the other speakers held the view that you cannot completely outsource data management, as the client needs to be heavily involved in all parts of the process – but you can outsource parts of it.
My own viewpoint here was that if you are looking to outsource or offshore aspects of the data management process, it must be done in a “with-source” model – a MoneyMate-ism we use to describe our own ‘outsource’ model, which is not truly outsourced but very much partner-oriented. My view is that your outsourcer must actually be working with you in a partnership-oriented relationship – it cannot be supplier-client – it must be equal, with shared risks and rewards. Cost should never be the core driver in a partnership, but cost control obviously should be!
In my own experience, certain things really help in getting “with-source” to work:
- A partnership approach as opposed to client-supplier
- Service Level Agreements should not be a fixed schedule in a contract. They need to be treated as working documents, reviewed and amended at least quarterly
- Data dictionaries should be defined as the first step in the BA discovery phase to mitigate miscommunication risks (a minimal illustration of such a dictionary follows this list)
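As a minimal illustration of that last point – the attribute set and rules shown here are invented, not a standard – even a small machine-readable dictionary agreed up front removes a lot of ambiguity between the two sides:

```python
# A minimal data dictionary sketch. The attributes (type, description,
# owner, validation) are illustrative -- agree your own with your partner.
DATA_DICTIONARY = {
    "isin": {
        "type": "string",
        "description": "ISO 6166 instrument identifier",
        "owner": "security-master team",
        "validation": "12 chars: 2-letter country code, 9 alphanumerics, check digit",
    },
    "nav": {
        "type": "decimal",
        "description": "Net asset value per share, in fund base currency",
        "owner": "fund accounting",
        "validation": "> 0; flag day-on-day moves over 5% for review",
    },
}
```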
One of the panelists had an interesting point here – “Trust is good, control is better.”
Another’s view was – “if you outsource a bad process, you will be even worse off.”
There were also discussions on the impact of data quality on exposure to risk, client satisfaction and overhead.
Again the viewpoints were fairly consistent – in summary:
- Risk: fairly obvious answers here were that reputational damage was the key risk, the financial world is built on reputation and you should take whatever reasonable means possible to prevent tarnishing your brand. Clearly there are also financial risks, be they penalties from regulators, loss of major clients, or outflows.
- Client service: good data means better trust; bad data leads to lack of trust, which will damage client relationships and lead to loss of clients and outflows
- Overhead: there are really clear overhead benefits to be achieved, be that direct cost savings, resource refocus or process efficiency. Obviously eliminating manual, error-prone processes was the key benefit, but audit overhead costs should also be driven down.
To round off, the moderator asked for the top three ways to get buy-in for investment in this area – naturally not everyone had the same top three, but the following were recurring themes:
- Present a case with quantifiable upsides and cost savings – ensure the cost benefits are clear and tangible
- Promote the benefits of governance, (de-centralized) ownership, accountability, (centralized) oversight and transparency
- Mitigation of serious risk – get across the message about the accident that has not happened yet. Use real-world case studies – do not ignore potential exposure to risk.
Other points made were:
Data management needs to be looked at at an enterprise level. It is a strategic play, not just a business-level or departmental one.
Vendors should sell pain, sell gain and take advantage of opportunities. Don’t just sell negatives – look at ROI and quantify it.
Front, middle and back office don’t understand each other and don’t work together. Organizations need to build up the ethos of “we’re all in the same lifeboat trying to get to the same shore!”.
Initiatives like this are COO-level, and COOs are the people who need to be convinced!