Data Lessons Part 4: Data Oil and Seized Up Engines

I like to use the metaphor of data and oil a lot. The analogies are many. For example, just as a mechanical engine needs clean oil to function efficiently, the sales engine in an asset management firm – the distribution function – needs clean data to function efficiently. If you feed the distribution function with erroneous, poor-quality data, that function will seize up, in the same way a mechanical engine seizes up when running on dirty oil.

There is also another oil and data analogy that makes a huge amount of sense, and it relates to refinement. Oil in its unrefined state, while of value, is not as valuable as its refined derivatives, gasoline or petrol. In the same way, data in a refined state is far more valuable to a firm. The effort a firm puts into its data refinement process pays dividends downstream when that data is put to use in the investment management process, distribution and regulatory reporting.

Data is a valuable resource in an asset management firm – it is the raw material in the investment management process, it is the oil in the distribution engine, and it is the canary in the coalmine that is regulatory reporting. Firms that value their data, that set great store by their data architecture, and that manage their data debt in a strategic manner are firms that have the edge on their competitors.

Finally, to round off the analogies – the rise of cyber incidents and the related spillage of sensitive data into the public domain is likely to garner as much news coverage as a physical oil spill.

Data is the only mechanism we have to describe our product offerings. We need to refine our data, treat it with care, and secure it appropriately.

Data Debt and Silo Heaven – A Marriage Made in Hell


The prevalence of data silos in a firm is directly linked to the level of data debt being built up within the firm’s data architecture. Good data architecture and design naturally leads to low levels of data debt.

Data debt can be thought of as the outstanding work that needs to be done before the overall data architecture can be considered sound. Just like interest on a financial loan, unpaid “data debt” keeps accumulating interest. The accumulating debt makes it ever more difficult and expensive to deliver on the change that the business demands as it evolves, and as scale becomes an absolute necessity.
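To make the interest analogy concrete, here is a minimal sketch in Python of how the cost of unwinding a tactical silo might compound the longer the work is deferred. The base remediation cost and the quarterly “interest” rate are purely illustrative assumptions, not measurements.

```python
# Illustrative sketch only: data debt treated like compound interest.
# The base cost and quarterly rate below are hypothetical assumptions.

base_cost = 100_000       # cost to remediate the silo today (hypothetical)
quarterly_rate = 0.08     # extra cost accrued per quarter of deferral (hypothetical)

for quarters_deferred in (0, 4, 8, 12):
    cost = base_cost * (1 + quarterly_rate) ** quarters_deferred
    print(f"Deferred {quarters_deferred:>2} quarters -> remediation cost ~ {cost:,.0f}")
```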

The production of data debt (e.g. tactical data silos) is often unavoidable, given a firm’s need to meet immediate requirements on regulatory projects and to keep business projects on target. While data debt is not necessarily a bad thing, the unchecked accumulation of debt (the growth of silos) can lead to long-term challenges in the data architecture, advancing to spiralling maintenance costs and inflexibility.

The growth in tactical data silos in an asset management firm can be attributed to many different drivers, e.g.:

  • Required response to demands from distribution
  • Required response to regulatory reporting change
  • The need to evolve the investment management process in line with new products and strategies

Well-managed firms succeed by recognising the point of inflection where a more strategic decision-making process is required to unravel tactical silos and prevent the otherwise inevitable chaos from ensuing. These silos are replaced with a strategic solution that better positions the firm for a future response to distribution, regulation and product evolution.

Recognising the point of inflection and then taking action is not an easy task for a firm, but ignoring the tactical silos can create more problems than the effort of unravelling them would. A stitch in time saves nine!

Next week’s blog will tackle the issue of seized-up engines.

The Dirty Data Theory


The “broken window theory” argues that residential areas with broken windows will suffer a higher level of serious crime than neighbourhoods without. The theory, put forward initially by social scientists James Q. Wilson and George L. Kelling, states that keeping urban environments in a well-ordered condition may stop further vandalism. The social theory was adopted by the NYPD and inspired many controversial new policies aimed at fixing “the broken window” before there was further deterioration in certain urban environments.

In other words, broken windows came to be an indicator of bigger problems in a neighbourhood. Whether the broken window theory is a sound one or not is irrelevant; what really matters is that people made inferences and decisions based upon it – a typical human behavioural trait. The old adage of “don’t judge a book by its cover” relates to that specific tendency.

I wrote before about my own related dirty data theory, which puts forward the premise that public displays of poor data quality lead to inferences of below-standard investment management processes. That article also pointed to situations where regulators view marketing materials as the canary in the regulatory coalmine when they are running firm exams.

To what extent can one make a valid inference from the quality of client- or public-facing data to the quality of the investment management process? I believe it is a difficult hypothesis to prove; on the other hand, whether the hypothesis can be formally proved is not as important as the likelihood that others will make those inferences anyway.

Have you ever run a hiring process and reviewed CVs or conducted interviews? Have you inferred anything from poor spelling or grammar in CVs? The sloppy appearance of a candidate at an interview? Their timekeeping for the interview? These are natural, often innate, inferences that we make from our own observations.

We should not be surprised that consultants tend to run their selection searches close to quarter end to weed out managers that struggle to update that consultant’s database in a timely manner.

We should not be shocked that egregious errors in marketing materials are used by regulators as pointers to poor data management and governance which may prompt a deeper exam process.

Whether we like it or not, if we are putting forward sloppy, inaccurate, inconsistent, incomplete or untimely data in the public domain – be that marketing materials, consultant database updates or regulatory reports – then we cannot be surprised if this leads to inferences of:

  • An inability of the firm to report to a high standard in the future
  • A deeper malaise in the firm’s operating model – for example, a poor or sub-standard investment management process

Dirty data is a window into a firm’s investment management process. It is in a firm’s interest to make sure the data and the window are clean, highly polished and without a scratch.

Thanks for reading – next week’s blog will focus on data debt and silo heaven.

5 Realities of data management


I had the opportunity last quarter to spend a lot of time on my soapbox, and to listen to senior executives sharing their strategic views on success. In March, at our breakfast briefing in NYC, I listened to our four panellists share their views on the challenges of supporting the distribution function in asset management with high-quality data, and on how to deal with the rise in demand from client-driven regulatory reporting.

Here are a few nuggets from Q1 of ’16 that I would like to share:

  1. Only a strategic approach with C-level commitment will succeed:

You can have all the top-down and bottom-up approaches squared away, but if your initiative does not have C-level support – and, even more importantly, C-level commitment and engagement – then you’re doomed to failure. Note there is a very big difference between having someone’s “support” in principle and having their fully committed engagement in practice. Think about how the chicken and the pig fare when it comes to your breakfast of eggs and sausage: the chicken is supporting the engagement; the pig is fully committed!

  2. Investor appetite for data is increasing and will continue to do so:

Both retail and institutional clients are demanding more insight and transparency. Ultimately this translates into more data. From a retail perspective, web and factsheet demands are growing. In the new connected world, digital strategies are pushing asset managers to deliver a better web presence to an investor set that is increasingly intermediated by platforms and advisors. On the institutional front, the distribution channels and consultants are demanding more data, in tighter timeframes. The depth and breadth of portfolio reporting is increasing, and the once acceptable T+30, T+60 and T+90 embargo periods are under huge pressure. Investors who are subject to regulatory oversight are driving a huge element of the demand for more timely and broader portfolio reporting, as are the needs arising from more rigorous risk management oversight.

  3. Data capability is a reflection of your firm:

We’re all familiar with the adage ‘don’t judge a book by its cover’, but no matter how often we hear this, the natural human inclination is to come to a conclusion very quickly based on initial appearances. For an asset manager, data is the book cover! If your data delivery is tardy and the quality is not up to par, then do not be surprised if prospective clients, consultants and distributors have a poor impression of your firm’s investment management capability. After all, if you cannot manage your data, it is reasonable to assume you’ll struggle to manage money.

  4. All executives should have the appropriate level of access to trusted data at their fingertips:

If your firm is not looking at analytic and big data technology that will bring better, more insightful data to your executives, then you’re on the outside looking in. There is exponential growth in the number of firms looking to blend their product, client, flows and distribution data into big data technology (e.g. Cassandra or Hadoop). These firms then look to layer visual analytic technology on top (e.g. Tableau, Logi or Birst) to deliver better intelligence to the C-level suite. The problem is that at the start of some of these projects the priority is getting the data from wherever it can be found; what will really make the project successful is ensuring that the data entering the analytic layer comes from trusted, approved sources. While it is great to have highly visual data views that will deliver market insight and ultimately drive decisions on the firm’s distribution strategy, bear in mind that these reports MUST have a solid data foundation or their value is highly questionable.
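As a minimal sketch of that last point – gating the analytics layer on trusted, approved sources – the Python below splits incoming records into trusted and quarantined sets. The source names, record fields and sample values are hypothetical; in practice they would map to whatever golden-source systems the firm actually runs.

```python
# Minimal sketch: only records from approved, trusted sources flow through
# to the analytics layer. Source names and record fields are hypothetical.

APPROVED_SOURCES = {"product_master", "crm_gold_copy", "flows_warehouse"}

def gate_for_analytics(records):
    """Split incoming records into trusted and quarantined sets."""
    trusted, quarantined = [], []
    for record in records:
        if record.get("source") in APPROVED_SOURCES:
            trusted.append(record)
        else:
            quarantined.append(record)  # held back for review, never drives decisions
    return trusted, quarantined

incoming = [
    {"source": "product_master", "fund": "Global Equity Fund", "aum": 1_250_000_000},
    {"source": "analyst_spreadsheet", "fund": "Global Equity Fund", "aum": 1_100_000_000},
]

trusted, quarantined = gate_for_analytics(incoming)
print(f"{len(trusted)} trusted record(s), {len(quarantined)} quarantined")
```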

  5. A data governance model must have context of usage and full accountability:

Governance in the industry is starting to mature, which is a hugely encouraging sign. Now when you hear firms talking about governance, instead of the old debate that centred on data quality management and stewardship, the broader governance picture is coming into play. The discussions I am hearing these days are more likely to centre on ownership of data and appropriate, contextually correct usage of the data. A key development is also the growth in ‘public-facing data governance’ committees and steering groups. In firms with mature governance, these teams have stakeholders from product, compliance, legal, performance, operations and front-office distribution, ensuring all the correct people are involved in decisions about data being used in client-, distributor- and regulator-facing channels.

The tasks outlined above might seem daunting for asset managers. To some firms the work required might equate to climbing Everest; however, the resulting operating model is worth the difficult implementation in order to achieve long-term success.


Living in Data Quality Denial

I have spoken many times about the concept of DQD (Data Quality Denial), about how pervasive a problem it is, and about how difficult it is to root out.

The problem, as I see it, starts where you have technology business owners that are somewhat disconnected from the realities of how and why the business is consuming the data.

Technology data owners are oftentimes very focused on the mechanics and syntax of the data – the bits that can easily be related to and understood. In fact, technology data stewards are essential to the data quality management (DQM) process in terms of the Level 1 DQM sweep. Where the technology stewardship team often struggles is in understanding the business context of the data. However, if the business has been proactive in ensuring subject matter expertise is embedded in the process, then this is not an issue.

The problem becomes compounded when the business stakeholders take a “hands-off” approach to the broader DQM process and leave the technology stewards to get on with it. The most common quip from the business is “This is our data, about our own products – of course the data quality is good”. Yet when the data quality is questioned, one often finds the business owners quite comfortable sitting on the fence lobbing stones at the technology stewards’ greenhouse (their DQM efforts!), without proactively engaging with the broader data quality management effort.

So who is in denial?

  1. The technology stewards are in denial that there is a problem – as far as they are concerned (or aware) the data is good quality.
  2. The business stakeholders (it is a stretch to call them owners if they have ceded ownership) are in denial that they are part of the problem.

How do you move past this?

  1. The business absolutely must believe it is part of the ownership equation, and not just an “interested bystander”.
  2. Technology should never be comfortable with a dynamic where they are the only element of the ownership equation – you cannot own something in its entirety unless you understand it entirely.
  3. As you can see from the diagram below, your DQM needs an N-level approach to the stewardship function, where the focus at Level 1 is on syntax and data mechanics, while as you progress towards Level N the focus becomes more semantic/context driven and the stewards come from the portfolio/product/SME side of the house (a minimal code sketch of this tiered approach follows the diagram).

Data Quality Denial is a condition worth removing in order to find the opportunities that strong data governance can bring.

[Diagram: opmodel3 – N-level DQM stewardship operating model]
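To make the tiered idea concrete, here is a minimal sketch in Python, assuming illustrative field names, a sample record and a plausibility rule that are not taken from any real rulebook. Level 1 stands in for the syntax and mechanics a technology steward would own, while Level 2 stands in for the business-context (semantic) checks that need SME ownership.

```python
# Minimal sketch of an N-level DQM check. Field names, the sample record and
# the plausibility rule are illustrative assumptions, not a real rulebook.

def level_1_syntax_checks(record):
    """Technology stewardship: formats, types, mandatory fields."""
    issues = []
    if not record.get("isin") or len(record["isin"]) != 12:
        issues.append("ISIN missing or not 12 characters")
    if not isinstance(record.get("nav"), (int, float)):
        issues.append("NAV is not numeric")
    return issues

def level_2_context_checks(record):
    """Business/SME stewardship: does the value make sense for this product?"""
    issues = []
    if record.get("asset_class") == "Money Market" and record.get("duration", 0) > 1.0:
        issues.append("Duration implausibly high for a money market fund")
    return issues

record = {"isin": "XX0000000001", "nav": 101.37, "asset_class": "Money Market", "duration": 4.2}

for level, check in enumerate((level_1_syntax_checks, level_2_context_checks), start=1):
    for issue in check(record):
        print(f"Level {level} issue: {issue}")
```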

 

6 Things to look out for in 2016


  1. Regulation:

As you would expect regulation is still the number one topic for 2016.

In Europe we have the reality of Solvency II: the demands for portfolio transparency and look-through are finally live. We are also seeing the imminent arrival of PRIIPs, which will bring a new risk management regime to non-UCITS/AIF products, and Key Investor Documents at the point of sale. Added to this, the looming behemoth that is MiFID II lurks in the background, with the fund community waiting to see if the regulator will blink and delay the regulation.

Meanwhile, in the US, a whole different set of regulatory change is about to touch down, with liquidity management and disclosure/transparency being the key trends. Everyone is looking at the new N-CEN and N-PORT reporting requirements and the SEC rulings on liquidity management. Add to this the Department of Labor’s introduction of the Fiduciary Rule for the protection of investors, and the proposed SEC rulings on derivatives management seem very far away.

  2. Fee pressure:

The pressure on both asset and wealth management fees, as well as on the overall fee model and structure, is relentless – there is no sign of this easing. On the asset management side we see continued pressure being applied to both passive and active fees. The SEC focus on “distribution in guise” payments, together with the transparency being delivered in Europe via MiFID, is driving firms to reconsider their fee and charging models, as well as to examine very carefully the service providers and distributors they use. In addition, cheaper ‘robo-advice’ services are having an impact in the wealth management world and are forcing the large wealth management houses to re-assess their client segmentation and applicable fee structures.

  3. New products and product range:

We expect the hot topics from 2015 to continue into 2016 – and we will be hearing much more about smart / strategic / fundamental beta. While many are saying that some of the new SEC rulings will put a brake on the growth of Liquid Alts, I suspect their popularity will continue, albeit maybe in a different guise. Hybrid fixed income is topical, as is the re-emergence of appetite for active – genuinely active, that is! Another key trend we see is the migration from fund to managed account structures – in particular where the client is subject to onerous transparency requests and the scale of the investment makes sense. Breadth and depth of product range is something we also expect to hear a lot about, with some managers struggling to handle the transition from a low to a high(er) interest rate regime due to a lack of availability of the “next” hot product or strategy. Regulation in the pension and insurance markets is also driving product change in the asset management space, with managers positioning themselves with new products on the ALM (Asset Liability Management), LDI (Liability Driven Investment), captive and collateralised loan fronts.

  4. Big analytics, data management & the cloud:

It looks like data management as-a-service and data management in the cloud will finally become acceptable to mainstream buyers of data management services and technology. Expect also to hear more about ‘big analytics’. With big data having now made the jump to the mainstream, we are seeing huge levels of activity around the mining/analytic engine projects that are required to make sense of the sales, client and product data. IBOR – last year’s hot new project – will continue to grow in popularity in 2016. Another 2016 talking point will be RegHub / Regulatory Middle Office type projects, where firms attempt to leverage their investment in regulatory reporting by exploiting the overlap in reporting data sets.

  5. Cyber/data protection:

With some very public examples of the threat from cyber-attacks being realised in 2015, expect a heightened focus on security in 2016 and serious growth in INFOSEC activity to counteract the threats posed. Changes in data protection and privacy legislation, particularly in Europe, will lead to some serious inward analysis on where European data is located and processed within global firms.

  6. Global operating models:

The rush to global operating models will continue, as firms look to leverage scale from their regional businesses and drive more efficiency into their day-to-day operations. Firms will maintain three core operating centres – in Asia, Europe and the US – in a follow-the-sun strategy. In Asia the popularity of Singapore and Hong Kong remains unchallenged, while in Europe the dominance of London may come under threat if fears of Brexit start to materialise; meanwhile, in the Americas, the financial hub cities of New York, Toronto and Boston will remain strong.