Interesting article on FTF (Financial Technologies Forum LLC) about FATCA: "FATCA Kills the Data Silo?"
We all talk about how important data quality has become, how important it is to deliver transparent, high-quality information to our customers, and how that's been driven by regulation and by changing investors. However, having been at a number of events recently and talked to customers and prospects about data management, I think that the stakeholders in data management projects have changed – it used to be technology, now it's predominantly the business.
The drivers for these initiatives have moved beyond improving operational efficiencies – now it’s about improving your client service and your customer experience by sending out high quality data, and it’s about how you use that data to promote your messages as well.
It used to be all about getting the data in one place, and the processes were all manual – in many organizations the processes are now automated, so they can get the data faster and have time to analyze it and use it for marketing. Wouldn't it be great to link your sales data to your fund data so that when you have news about one of your funds, you can push it out to the sales force so they have immediate access to that information for their customers … or push it out to your marketing department so they can immediately execute a targeted campaign to a particular group of prospects. You could really add value to your organization's sales processes by leveraging the information in your product data… connecting it to your advisor and customer data … and then tying it all back together with your books and records data flowing from TA.
Many asset managers have empowered their sales teams with iPads so that they have access to all the latest product information … anywhere, anytime. At NICSA's recent conference in Miami, it was revealed that 76% of advisers share content online (up from 67% in 2010). This includes performance information, white papers, commentaries, etc. – it underlines the importance of being able to provide that information, and of ensuring that it is always accurate.
There is no point having all these silos of business intelligence in the distribution front-office if you cannot leverage them – make the most of your data!
I love using analogies and metaphors… I used to talk a lot about the C-level's perception of data management… equating it to a duck paddling across a pond… what the C-level doesn't see is the duck's legs paddling furiously to get from one side to the other… the C-level doesn't see the immense and often chaotic manual processes and the time spent on getting fact sheets, client reports and sales decks out to market.
Everyone is talking about data … and everyone has a different perspective. Who really cares about data? Well, today it is a hot topic in the front-office, specifically with the distribution team.
It’s really important for the sales and distribution teams to have timely, accurate, consistent data appearing in fact sheets, RFPs, presentations, client reports, sales decks and on their websites.
Ultimately, data is the oil in the sales and distribution engine … good data helps them to sell their products… and enables the process of communication with clients and prospective investors to run smoothly.
Good data will help them deliver excellent client service, retain their customers and gain new clients… and of course, good data will ensure that the company reputation is upheld.
Even more important is 'agile data' – the pipeline feeding distribution with product data and market intelligence has to be adaptable, scalable and capable of reacting to new product launches, changes in distribution channels and market regulations. The demand for more strategies is leading to more products being added to the arsenal, and the demand for greater transparency in reporting is leading to more data points per product. This two-dimensional demand on the breadth and depth of information means the data quality management processes, compliance and governance functions all have to be agile enough to meet the demands of distribution.
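To make this concrete, here is a minimal sketch (my own illustration in Python; the field names and thresholds are invented, not a real schema) of the kind of rule-driven validation that keeps a product data pipeline agile: adding a new data point is one more rule, and adding a new product is just one more record through the same pipe.

```python
# Illustrative sketch only: rule-driven product data validation.
# Field names and thresholds are hypothetical, not a real schema.

RULES = {
    "nav":            lambda v: v is not None and v > 0,
    "ongoing_charge": lambda v: v is not None and 0 <= v < 0.10,  # 0-10%
    "benchmark":      lambda v: isinstance(v, str) and len(v) > 0,
}

def validate(product: dict) -> list:
    """Return the list of failed checks for one product record."""
    return [
        f"{product.get('id', '?')}: bad or missing '{field}'"
        for field, check in RULES.items()
        if not check(product.get(field))
    ]

# Greater depth (more data points) = more entries in RULES;
# greater breadth (more products) = more records through the same pipe.
products = [
    {"id": "FUND-A", "nav": 104.2, "ongoing_charge": 0.0075, "benchmark": "MSCI World"},
    {"id": "FUND-B", "nav": -1.0, "ongoing_charge": None, "benchmark": ""},
]
for p in products:
    for issue in validate(p):
        print(issue)
```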
But, on the other hand, bad data will muddy the waters… pouring dirty data into your sales engine will, over time, lead to it seizing up completely! The consequences of getting it wrong are that you'll have inaccurate, inconsistent data in the public domain … you'll be at risk of getting in trouble with the regulator, potentially being exposed to fines and, worse still, bad press – your reputation will suffer and you'll likely lose business as a result. That really doesn't help the sales engine run smoothly at all … outflows, loss of business, poor client service…. it could all make the sales engine seize up.
Ever wondered how you can improve client services? I would argue that easy access to timely, accurate and ultimately reliable information about your products, i.e. their investments, delivered through an effective data governance programme, is a key enabler of service excellence. Arguably, the main differentiators for investors in terms of client services are the timeliness and quality of investment reporting, coupled with a responsive and assured service that they can rely on if they wish to enquire about their investments. In the age of transparency, there is no room left for complacency in these areas.
Timely and Reliable Investment Reports
Strong data governance coupled with effective stewardship enables shorter reporting cycles, providing your clients with their investment reports earlier and exceeding their expectations for up-to-date information on their investment portfolio. However, timeliness of delivery will not do it alone. It has to go hand in hand with reliable data. The validation process in a data governance programme ensures that you get it right the first time, which in turn saves time by removing the iterative process of checking, correcting and rechecking reports. Clients do not just expect to receive reporting on time. The content has to be complete, accurate and consistent for it to deliver value.
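As a rough illustration of "right the first time" (a hedged sketch, not any particular vendor's workflow; the checks and field names are invented), the validation step becomes a single gate: a report is only released once every check passes, replacing the check-correct-recheck loop.

```python
# Hypothetical publication gate: a client report ships only when
# completeness, accuracy and consistency checks all pass.

from dataclasses import dataclass, field

@dataclass
class Report:
    client: str
    data: dict
    issues: list = field(default_factory=list)

def completeness(r):
    return all(r.data.get(k) is not None for k in ("nav", "performance", "holdings"))

def accuracy(r):
    return r.data.get("nav", 0) > 0            # stand-in for real tolerance checks

def consistency(r):
    # e.g. the figure in the report must match the figure on the website
    return r.data.get("performance") == r.data.get("performance_on_web")

CHECKS = {"completeness": completeness, "accuracy": accuracy, "consistency": consistency}

def gate(report: Report) -> bool:
    """Run every check; the report is released only if all pass."""
    report.issues = [name for name, check in CHECKS.items() if not check(report)]
    return not report.issues

r = Report("Acme Pension", {"nav": 101.3, "performance": 0.042,
                            "performance_on_web": 0.042, "holdings": ["..."]})
print("release" if gate(r) else f"hold: {r.issues}")
```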
Finalising reports earlier also provides your client service team with more time to focus on adding value when delivering the information – analysing the data and preparing to review the investment report with the client, or pre-empting their questions.
Assured and Responsive Client Service
In the current environment and under increased regulatory scrutiny, asset management firms are adopting a fiduciary mind-set and striving to be as transparent to their clients as possible. Therefore, your customer service team's ability to navigate your product data and have timely and accurate information at their fingertips is critical to your success. The changing regulatory landscape requires customer service staff to ensure they are prepared to address any question or concern that an investor may raise in a responsive and knowledgeable manner – thereby conveying confidence, building trust and making the investor feel that they are in good hands.
A strong data governance system will empower client services to achieve these high standards by building their own confidence in the information that they source internally, by providing them with the most up-to-date data and by allowing them to quickly identify the owner of specific data points, so that investor enquiries can be routed to the right source of expertise within your organisation – helping them to get the answer right the first time.
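A minimal sketch of that ownership lookup (in Python; the data points, teams and addresses are invented for illustration): every data point maps to a named owner, so an enquiry can be routed straight to the right expertise.

```python
# Hypothetical data dictionary: each data point has a named owner,
# so an investor enquiry is routed straight to the right team.

OWNERSHIP = {
    "nav":            ("Fund Accounting", "fund.accounting@example.com"),
    "performance":    ("Performance Team", "performance@example.com"),
    "ongoing_charge": ("Product Control", "product.control@example.com"),
}

DEFAULT_OWNER = ("Data Governance", "governance@example.com")

def route_enquiry(data_point: str) -> str:
    team, contact = OWNERSHIP.get(data_point, DEFAULT_OWNER)
    return f"Route question on '{data_point}' to {team} ({contact})"

print(route_enquiry("performance"))
print(route_enquiry("dividend_yield"))  # unknown points fall back to governance
```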
If you cannot demonstrate you are in control of your product data, how can you claim you are in control of your investment management process?
One of the key decision criteria in any RFP/mandate selection process is the analysis of the investment management processes of the firms vying for the business. While demonstration of consistent past performance per unit of risk is a key determining factor in the process, it is very rarely used as the final decision criterion; more often than not it is only used to create a short list. Once the short list has been created, it comes down to deep analysis of the investment management process – specific attention is now being placed on the balances and controls applied to the investment management process. After all, the new post-2008 investment era is highly risk averse, and there is now heightened concern and scepticism around returns achieved in fast and loose environments.
With such intense focus now on the controls in place within the investment management process, many firms are actively investing in promoting the governance they have in place. The result is often a "dressed up" process that will not pass muster under a deep analytical audit. The placement of billions of dollars of investment is not taken lightly, and any attempt at "dressing up" the process will be seen for exactly what it is.
Product data that is communicated externally is a key facet of any due diligence and any failure to demonstrate appropriate balance and controls to ensure information being pushed into the public/investor domain is accurate, consistent and timely will be noted.
Investor demands for greater breadth and depth of the information being communicated, along with increasing demands for more frequent updates, are leading to an exponential increase in the balances and controls that need to be in place to support the product information communication processes. This in turn is leading to serious headaches for compliance, as they need to be sure the governance and applicable stewardship is fit for purpose.
As a result, firms that are unable to demonstrate they are in control of their client reporting process are very unlikely to be able to demonstrate they are in full control of the investment management process. Of course, there are firms with excellent investment management processes, but you cannot infer from this that their client reporting and product data management governance and stewardship meet the same benchmark. The inferences tend to work on the negative side: if the due diligence finds poor governance in one area of the company, the impression will be that the same lack of balance and controls permeates the firm.
So beware what you push into the public domain – the information you communicate about your own products needs to be consistent across the web, client reports, marketing decks, factsheets and RFP responses; it needs to be accurate; and it needs to be timely. There are firms with excellent, demonstrable risk-adjusted returns that are not winning mandates because they cannot provide demonstrable reassurance that they are in control of core activities within their firm. Remember: governance is the talk, stewardship is the walk.
This in turn is leading the heads of distribution and sales in asset management firms to demand more reliable and trustworthy data from operations. They have recognised that high-quality information about their products can be used as a differentiator in winning new business, and that it positions them to deliver best-in-class client service, which leads to higher levels of customer retention.
This data is directly used in the monthly and quarterly production cycles that serve their clients with regular updates on investments and power up the sales engines and related materials in the go-to-market side of the business.
But, surely investors are only interested in the risk-adjusted performance? Why would the quality of information in point-of-sale documentation or reporting influence an investor?
The reality is that investors do not, and should not, use past performance as the sole criterion in their decision process any more – so many other factors are important. The same applies to distribution channels for funds – fund providers need to differentiate themselves from the pack.
So clearly the distribution channels want good products to sell, but they need good materials (and good information) to help them make their products stand out from the crowd.
They not only want good sales support materials, though; they want them on time, and ideally, they want them before their competitors have theirs. They want to wow the investor with the breadth, depth and timeliness of the information. They want to ensure that whatever they present matches 100% what the investor will find on the web. They want to use the latest technologies to deliver the information to the client – support for a touch-screen tablet is the new must-have request from the field sales teams.
So, having a good product is a given. Having smart and exciting ways of delivering point-of-sale information to the potential investor is a given. But the best product in the world, and the sexiest of tablets, will be useless if the content you are delivering is late, limited or just plain bad.
Investment decisions are built on trust, trust in the advisor, trust in the brand of the provider, and trust in the material being presented.
Trust in the product is built by providing clear, deep, transparent information on the product at the point of sale – so one- or two-page fact sheets that are two months old do not cut the mustard.
Trust in the information being communicated is the foundation on which the investor will build their impressions – it is their window on to the organisations they are doing business with (or considering doing business with).
The investor wants an appropriate mix of qualitative and quantitative information – too much text and not enough stats makes it look like you're hiding something; too many stats and not enough text makes it look like you have a lightweight analysis team.
The investor wants first-class qualitative analysis of the market segment / strategy that the fund is targeting – they want to understand the product and market risks at play. They want quantitative and technical analysis that opens the lid on where the performance and risk of the fund are being generated, and they want to understand how this breaks down when compared to peer groups, external category averages and the stated benchmark.
Something very few asset managers have embarked upon is providing advice on which products from the same provider currently have a correlation coefficient that would lower the overall risk of a portfolio while maintaining overall target performance – think about how Amazon.com markets books that are related to each other.
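A back-of-the-envelope sketch of that idea (in Python with pandas; the fund names and return series are invented): compute the pairwise correlations of a provider's own funds and surface the least-correlated ones as "pairs well with" suggestions.

```python
# Hypothetical "pairs well with" screen: rank a provider's own funds
# by how weakly they correlate with the fund the client already holds.

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Invented monthly returns for four funds from the same provider.
returns = pd.DataFrame({
    "Global Equity":   rng.normal(0.006, 0.04, 36),
    "Euro Bond":       rng.normal(0.002, 0.01, 36),
    "Commodities":     rng.normal(0.004, 0.05, 36),
    "Absolute Return": rng.normal(0.003, 0.02, 36),
})

def pairs_well_with(held_fund: str, top_n: int = 2) -> pd.Series:
    """Return the provider's funds least correlated with the held fund."""
    corr = returns.corr()[held_fund].drop(held_fund)
    return corr.sort_values().head(top_n)   # lowest correlation first

print(pairs_well_with("Global Equity"))
```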
Finally, clear, unambiguous presentation of the fees/charges for the product builds confidence and supports the trustworthiness of the advisor, provider and product alike.
To summarise: by sorting out the "plumbing", i.e. the flows and quality controls around product information from various internal and external sources, sales and distribution can leverage this reliable and trusted data to accelerate new customer acquisition and increase customer retention rates.
I have noticed a definite trend over the last few years with respect to the convergence of the retail and institutional worlds within asset management firms.
It is not just a convergence of the product and service offerings, but also an internal alignment of the teams responsible for each business line.
The operating models at play 2-3 years ago had these teams run on separate lines; now firms are aligning their internal structures along functional roles as opposed to business lines, in turn blurring the line between retail and institutional.
So what is happening out there? What are the drivers? What is causal? What are the symptoms?
There are several key drivers that I see in play:
1. There is board and shareholder pressure to build leaner operating models that scale better and deal with financial market changes in a more flexible and predictable manner. This is borne out of the major flux we have seen in the financial markets since the end of 2008 and the renewed focus on operating costs.
2. There is a growing level of investment savviness amongst retail investors, in particular within the key market segment that has a high level of disposable income. These investors are demanding greater depth and breadth of information on their portfolios, thus driving the retail (product-focussed) reporting model ever closer to the client-focussed reporting model of the institutional market.
3. Institutional clients are demanding glossier client reporting artefacts – something the retail side of the business is generally more adept at producing. This, combined with demands from the institutional sales teams and channels for product-like factsheet documentation for the various strategies and composites being marketed, is a key driver in aligning the output production teams internally.
The result of these drivers is that, internally, the business lines are being remodelled and combined such that the retail (product) reporting structures become a by-product of the more bespoke, client-focussed institutional lines.
The retail investor is also being offered increasingly complex products: synthetic ETFs, Absolute Return funds, Long/Short strategies and SMA/WRAPs.
In turn, retail investors are demanding increasingly complex statements and monthly factsheets – note the increase in retail asset managers offering detailed equity and fixed income attribution reports, both at product and account level.
Asset management firms have been quick to grasp the obvious efficiencies available by viewing the product side of the company as just another institutional client – thus enabling them to unleash the power of their considerable investments in client reporting solutions to tailor them for the retail line of business.
Another factor driving consolidation of the systems that service both lines of business is the focus on building an investment product master to deliver a formal data quality management framework, supporting the considerable desire to produce better-quality data and content in a more timely and efficient manner.
So in the future, we should expect to see more, not less, convergence of the business lines. Clearly, the two lines of business will always have clear demarcation lines in terms of level of service, reporting, fee structures and distribution, but the back- and middle-office teams and services that serve the business lines will see continued consolidation to leverage the obvious efficiencies and quality improvements being demanded by investors and shareholders alike.
There has been a building murmur of conversation of late in the asset management community about client service, specifically with regard to the impact of data management, of all things, on it. It is fair to say that, given the regulators' continued defence of the investor and their insistence on the fair treatment of customers, the necessity to communicate timely, accurate and consistent information to existing and prospective clients is growing by the day. This, combined with the end investor's increasing demand for a broader range of more up-to-date and frequently refreshed data, means that today's asset managers need to sort out their information "plumbing" or face being left behind by their competitors (and their customers).
Three years ago, buy-side firms were looking to embark on data management projects to improve efficiency and remove silos and manual processes. While these drivers are still valid, more and more data management projects today are driven by a desire to improve the quality of information delivered to the front-office. In fact, a recent asset management survey confirmed this as the number one operational focus for most buy-side firms. In addition, asset managers looking to achieve best-in-class client service, or to break into new customer segments and/or markets, commonly recognise the value of a solid product master as the base platform that can be leveraged to achieve all of these strategic goals.
A product master puts an asset manager in control of the information about their funds and accounts. Once all of the data controls are in place to centralise and clean the product information, the product master can be leveraged across the enterprise to ensure that all consumers of the information (internal and external) can have full confidence in the timeliness, accuracy, and consistency of the data they are viewing. It also provides auditors and regulators with the evidence that the asset manager has recognised the importance of this data and has put systematic controls in place to address it.
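As a very rough sketch of the concept (hypothetical structure, not any vendor's schema): a product master is essentially one governed "golden record" per fund, with validation on write and an audit trail recording who changed what, and when.

```python
# Hypothetical product master: one golden record per fund,
# with validation on write and a systematic audit trail.

from datetime import datetime, timezone

class ProductMaster:
    def __init__(self):
        self.records = {}      # fund_id -> golden record
        self.audit_log = []    # who changed what, when

    def update(self, fund_id: str, field: str, value, user: str):
        # Illustrative control: reject obviously bad data at the door.
        if field == "nav" and (value is None or value <= 0):
            raise ValueError(f"rejected: invalid NAV {value!r} for {fund_id}")
        record = self.records.setdefault(fund_id, {})
        self.audit_log.append((datetime.now(timezone.utc), user, fund_id,
                               field, record.get(field), value))
        record[field] = value

    def get(self, fund_id: str) -> dict:
        """Single source every consumer (web, factsheet, RFP) reads from."""
        return self.records.get(fund_id, {})

pm = ProductMaster()
pm.update("FUND-A", "nav", 104.2, user="fund.accounting")
print(pm.get("FUND-A"), len(pm.audit_log), "audit entries")
```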
While the concept of a product master may be relatively new… it is gaining momentum and we’re hearing more and more about it in the press, at events, and directly from the industry. Watch this space!
I think that surveys are a fascinating insight into people’s views on what’s happening in the marketplace. We try to conduct a survey at least once a year on current trends in data management. Last year, our survey focused on what people’s challenges around data management were and this year, it was all about regulation. We decided to focus on regulation because as far as I’m concerned it seems to be all people are talking about – they’re talking about KIID, Dodd Frank, FINRA, fines, compliance and many are not really sure about how new regulation is going to impact them or what lies ahead.
The global regulatory community has come under a lot of flak since the 2008 market implosion. In many regions, the regulator has been disbanded, restructured or at the very least forced to report to government-led investigative committees. The charge levelled is that they failed to maintain a stable market by not having a clear view of the systemic market risks that were at play. The industry itself also had a role to play in the demise of the previous boom – employees were incentivized to take on risk without the appropriate checks and balances to measure and mitigate exposures. Additionally, operating models did not keep up with the rapid change of the industry landscape – leaving behind a legacy of manual, error-prone processes and key knowledge data sets that were poorly maintained. In doing so, the industry and regulators to a large extent left the public to carry the can.
So it is not without reason that the media and investment community alike are keenly interested in the regulatory backlash that can be expected in response to the financial crisis. Regulation was always a driver within the industry, but more recently its prominence has increased because the fines are getting bigger, and the reputational damage is all the greater for the increased coverage being delivered by the media. Understandably, the regulators are now focused very firmly on managing market stability and ensuring the industry is treating investors fairly.
From the "know your product and client" perspective, the key focus is that investors should be sold the most suitable product for their particular situation. The regulator (and I refer to them in general terms here) is looking for accuracy and timeliness – and for consistency across all public communication of data, be that printed fact sheets, micro-sites or institutional client reports. Sales and marketing material for investment products is coming under increased scrutiny – the regulators are increasingly treating such material as disclosure of material fact, whereas in times gone by, firms were not held to account to the levels they are today vis-a-vis the information disclosed in such documents.
Anyway, that's some background … I will post a blog on what the individual regulations are about in each region, but on to a little more about our survey. We usually conduct our surveys at industry events such as NICSA, the ICI General Membership Meeting and TSAM in Europe, and also online. We presented the results in a webcast on April 13th and also in a press release which you can access here.
The objectives of the survey were:
• to gauge industry insight on regulation
• to learn how prepared these organizations are for the impending changes
• to understand how it will affect firms' data management processes and strategies into the future.
The drivers for the survey should be obvious to us all – many organizations are actively assessing their target operating model with a view to adapting themselves for upcoming regulation; on top of this you have the rising spectre of reputational damage due to increased media scrutiny, and the fact that data management is now a standard item on the agenda for corporate risk assessment.
66% of the survey respondents were from North America (primarily USA) and 33% from Europe.
The first question on the survey was: "Would you say that the product data you distribute to the market is: always accurate and timely; usually accurate and timely; rarely accurate and timely; or, you don't know?" I've put together a quick pie-chart of the results below:
What was interesting from the responses was that just under one quarter of respondents indicated that their product data was always accurate and timely, while nearly two-thirds indicated their data was only usually accurate and timely. What can we take from this? Well, it seems that most of the time asset managers' product data in the public domain is, in the main, accurate and timely; but, for two-thirds of asset managers, there are periodic issues with getting their data to market in a timely or accurate manner. This really highlights something we already know – getting your product data into the public domain with a high degree of confidence in its accuracy is not a simple problem to solve, even though this is your own information. The predominance of manual processes that deliver data from the back and middle office to the front office is probably the core reason we see this lack of confidence. Anywhere you have manual processes, you get a lack of repeatability, an inability to create a systematic audit trail and a breakdown in confidence in the ability of the machine to operate under full load. What is clear is that regulators do not want to see such environments – specifically, we have heard from some of our own clients that recent SEC exams focussed heavily on the ability to demonstrate repeatable and auditable processes where data was being pushed into the public domain and facing off to client investors.
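To illustrate the contrast (a hedged sketch with invented data; not a real production system): the moment a publication step is scripted rather than manual, repeatability and an audit trail come almost for free – every run is logged, fingerprinted and reproducible.

```python
# Hypothetical scripted publication step: deterministic, repeatable
# and self-auditing - the properties manual processes lack.

import hashlib
import json
import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

def publish(product_data: dict, destination: str) -> str:
    """Publish a data set and log a fingerprint so the run is auditable."""
    payload = json.dumps(product_data, sort_keys=True)          # deterministic
    fingerprint = hashlib.sha256(payload.encode()).hexdigest()[:12]
    logging.info("published %s to %s (fingerprint %s)",
                 product_data.get("id"), destination, fingerprint)
    return fingerprint

fp1 = publish({"id": "FUND-A", "nav": 104.2}, "factsheet")
fp2 = publish({"id": "FUND-A", "nav": 104.2}, "website")
assert fp1 == fp2   # same input, same output: repeatable by construction
```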
The next question was: "In the US and North America, which of the following regulatory discussions is your firm most concerned with? The Dodd-Frank Act, the Point of Sale fund facts regulation in Canada, the reforms of the money markets, the 12b-1 reforms or recent FINRA interjections?"
The response was hardly surprising – just under three-quarters of all respondents indicated that Dodd-Frank was at the forefront of their firm's concerns when it came to discussions on regulation. It would have been surprising if the result were any lower – Dodd-Frank is a behemoth of an act, the impacts and true force of which are not yet fully realized. I was surprised that FINRA featured among only a fifth of respondents' concerns – in particular when you consider the number of cases it has taken on recently. Even though the SEC's reforms of the US money markets are to an extent yesterday's news, it was interesting to note that nearly a quarter of all respondents indicated that these reforms were still a discussion point of concern within their firm. The fact that a third of respondents are concerned with the ongoing trials and tribulations connected to 12b-1 fees is not a surprise either – anything that impacts the commercial model around distribution and broker compensation means a lot of upheaval in existing operating models. Finally, the fact that only 8% of respondents are concerned with the POS Fund Facts regulation in Canada should be balanced against the knowledge that the majority of respondents were US firms with limited exposure to Canadian investors.
This blog is getting very long…. so I will keep the results of the rest of the survey for the next post!
I recently participated in a panel discussion at Osney Media's Client Reporting Conference in London, chaired by Peter Bambrough, a management consultant at Citisoft. The topic for the panel was "Data Management: The Critical Issues", and on the panel I was joined by:
- Philip Keeler, Head of Operations IT, Hermes Fund Investors Ltd
- Bob Simon, Senior Director of Business Development, CorrectNet
The first question that was presented to the panel was “Why do data management projects go wrong?”
My own viewpoint here is that the projects I have seen fail nearly all came down to a lack of clear data governance and stewardship, and generally poor communication. For a data management project to work, there has to be a common understanding of the issues at play. Communication is key here, especially between the middle and back office – all teams have to be speaking the same language. Another issue that data management projects face is a lack of understanding from senior management. As Philip Keeler of Hermes said, "there needs to be a holistic view within the company, senior managers need to be aware of the issues and the implementation processes involved".
Also, when implementing a data management project it is important to break the project down into manageable chunks and set realistic deadlines and achievable goals; this in turn will reduce the risk and make the project less likely to fail.
Some of the interesting points that came out of the discussion with respect to running successful data management projects were:
- A silo approach is only helpful if you have a complete view of the landscape
- A warehouse approach to everything is always going to lead to failure, as warehouses take too long to implement and the landscape invariably changes before the project finishes
- A better model may be to have data warehouses feeding data hubs, from which business-unit 'fit-for-purpose' data marts are published (see the sketch after this list)
- Communication both top-down and bottom-up is critical
- Senior management buy-in to the project is essential
- Multi-tiered stewardship
- Governance and stewardship operating hand-in-hand
- Clear understanding of current cost exposure versus the new target state
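Here is the sketch promised above – a toy illustration (in Python, with invented sources and fields) of the warehouse → hub → data mart pattern: the hub consolidates records from several warehouses, and each business unit pulls only its fit-for-purpose slice.

```python
# Hypothetical warehouse -> hub -> data mart flow: the hub consolidates
# records from several warehouses; each mart publishes only the slice
# its business unit needs.

WAREHOUSES = {
    "accounting":  [{"id": "FUND-A", "nav": 104.2}],
    "performance": [{"id": "FUND-A", "ytd_return": 0.042, "benchmark": "MSCI World"}],
}

def build_hub(warehouses: dict) -> dict:
    """Merge per-source records into one consolidated record per fund."""
    hub = {}
    for records in warehouses.values():
        for rec in records:
            hub.setdefault(rec["id"], {}).update(rec)
    return hub

def data_mart(hub: dict, fields: tuple) -> list:
    """A fit-for-purpose view: just the fields one business unit needs."""
    return [{k: rec.get(k) for k in ("id", *fields)} for rec in hub.values()]

hub = build_hub(WAREHOUSES)
print(data_mart(hub, fields=("nav",)))                     # e.g. client reporting mart
print(data_mart(hub, fields=("ytd_return", "benchmark"))) # e.g. sales/marketing mart
```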
Another question put to the panel related to getting the right people to work with the data – who are the right people? We know that it is not a job for marketing departments, or indeed asset managers. Organizations need to avoid the "just in time" data management operating model, where a team of client reporting or marketing execs scrub and cleanse the data just prior to publication. This is a critical job and the right people need to be in place to ensure that it is carried out correctly. So who are the right people for the role? It was agreed unanimously that you need to adopt a multi-tiered approach to stewardship – you need stewards operating at the data source level (data analysts) who are comfortable dealing with low-level, source-oriented quality issues; you need product specialists who are comfortable looking at the data from a product/strategy perspective; and you need business analysts working in the front-line teams (client reporting / sales / marketing) who are comfortable looking at the data from a reporting/presentation perspective.
The general consensus among the panel was that communication, understanding the data issues and ensuring the correct people are managing the data are all important elements in the fight against data management issues.