It’s 4:01pm and a customer of a bank is standing outside the branch, knocking on the heavy glass door. The staff inside seem oblivious to her presence as they tidy up for the day. But the customer is there for a scheduled appointment with the bank’s financial planner.
It may seem inconceivable, but this meeting has been scheduled for outside the branch’s opening hours. The meeting won’t be happening today, and if this customer is typical, she might not come back. She may try another branch at a later date, but there’s a high risk she will abandon the idea of seeing a financial planner altogether.
Welcome to the financial planning customer experience, circa 2016.
The results are in for one of the largest ‘mystery-shopping’ exercises ever undertaken in financial planning, and for some financial planning licensees, the facts are ugly indeed. They paint a picture of planners failing to engage potential clients on first meeting and getting even basic things wrong, and of licensees struggling to deliver consistent service and customer experience across their advice networks.
On the other hand, the research finds that some businesses achieve a high level of engagement and deliver a strong customer experience consistently across all of the practices in the network.
The problem for the industry overall is that a customer’s experience is still very much down to luck, and which branch, practice or individual adviser they happen to contact first. The range of scores across a number of key measurements should be of concern to some licensees.
The client experience research is based on about 400 mystery shops, co-ordinated and organised by the research and consulting group CoreData. The company’s director of Australian financial services, Sean Allen, says all of the shoppers used in the research have genuine advice needs, but none of them have been trained on what to ask or what to look for before heading into the field.
“We select them based on the fact that they have an advice need – self-identified – but otherwise we do not brief them,” he says. “They go out as close to normal shoppers as possible.”
Shoppers are not experts
Allen says the shoppers are not experts in compliance, and do not know, for example, what a financial planner must do or when, from a compliance perspective. If a financial planner fails to provide a financial services guide (FSG) during a meeting, the client won’t necessarily know to ask for one, and the planner will be scored poorly on that aspect of engagement.
“If you have trained shoppers then they tend to look for things, whereas if they are untrained they just experience what they experience,” Allen says.
He says the analysis measures a range of aspects of how financial planners interact with would-be clients. Some may seem minor, but they all add up to paint a complete picture of the contemporary client experience.
The issues can range from how many times the telephone rings before it’s answered, to how long it takes to set up an initial meeting (and whether the meeting is scheduled for when the planner’s office is actually open); whether the initial meeting is followed up and how long it takes to do that; and how long it takes an adviser to produce a statement of advice, present it to a client and move to implementation.
“The experience starts from picking up the phone, or from applying for a meeting online,” Allen says.
And it can be little things that add up to make a big difference – like at least trying to personalise emails, not sending them from a “do not reply” address, and not burying the message under disclaimers that are wordier than the email’s actually useful information.
He says the leaders of financial planning licensees are often alarmed to learn that the standard of service and client engagement offered by their branches or practices is objectively poor and, more worrying still, wildly inconsistent from location to location.
Tough enough to maintain consistency
He says it’s difficult enough to maintain consistency even when the licensee controls all aspects of the client interaction; it is hardest of all for licensees with franchised or self-employed practices.
CoreData aggregates the results of its mystery shopping research into what it calls its ACQUIRE index – an acronym for assurances, compliance, quality, understanding, intention, reaction and environment (see Table 1: ACQUIRE categories).
The assurances measure looks at “that initial information exchange that builds confidence in the client’s mind that this planner has the capability to help me”, Allen says. “Compliance looks at how the ‘compliance’ elements of advice are explained, not just as a regulatory requirement but, importantly, how they may support, benefit and protect the client within the advice transaction.
“Quality measures whether there is a strong link between what the client’s needs are and the planner’s ability to articulate how they may meet those needs. Understanding is ‘it’s all about me, the client, and the planner knows me, understands me’. Intention measures whether the client will use this planner or keep looking – do they need a second opinion, and has the adviser convinced them to act?
“Reaction is essentially a catch-all measuring the client’s emotive response to the advice process. And in Environment we look at how easy it was to make an appointment, how many calls were needed, were agreed appointments changed, through to how professional the planner and their office was.”
The average score across the industry on CoreData’s ACQUIRE Index is 76.3 – a strong result, but not brilliant, Allen says.
“Obviously in all things like this, in all benchmarks and all measures, there’s room for improvement across the seven elements of the ACQUIRE process,” he says. “We use real clients to go through the shopping exercise – and this is really important – and they are not trained. They are real clients. It’s just completely about their experience, and the adviser’s ability to influence them to act.
“For a percentage of clients, the first value-add that the adviser gives them is just education. That education at that point in time may not take them down an advice route or a product-outcome route. It’s just education about their current status and their current position.
“We think businesses still have some way to go to climb into the 80s.”
It’s the range of scores that matters most
Allen says that for some businesses the absolute score is not the issue, it’s the range of scores between the best and the worst experiences measured across the advice network.
“We have seen some businesses where they are disappointed in the range of results rather than the overall average score,” he says. “For a franchise-type business, you want to be able to close that range. You want to be able to provide an experience to a client that’s consistent, no matter which avenue they come through. So if it’s a bank, for example, you want to have a consistency of experience whether the client comes through Pitt Street or through the mall in Brisbane.”
Allen says some factors need to be taken into account that might undermine the overall client experience but which are no real fault of the adviser concerned. For example, if a client does not fit an adviser’s preferred segment, then it is less likely that the adviser will make follow-up calls. And they’re also unlikely to refer the client to another firm.
“I don’t think we’ve come across one instance where the client has been referred to another adviser,” he says.
In such cases, not getting the client on board is a deliberate decision by the adviser, not necessarily a failure of the engagement process.
Even so, it takes “a certain resilience” on the part of a client to approach another adviser after their first experience has been a poor one.
Mystery shopping: a worthwhile concept
NAB Wealth advice partnerships general manager Ross Barnwell says mystery shopping is a worthwhile concept, and for NAB it has led to further work to help its advice practices improve the experience their clients have.
“Through several different measures you have real clients going out there, going through the advice experience, and unlike someone who’s been skilled around financial services they are looking at it through purely fresh eyes,” Barnwell says. “So the positive of that is that you get candid feedback in regards to what the engagement process was like, through the whole value chain – what worked well, and what didn’t necessarily work well. And they are real people, with real needs looking to be addressed, and some of them can end up as real clients, and some don’t.
“But that process of finding feedback, across the [NAB-owned] licensees, has definitely helped us fine-tune some of our engagement with advisers to make sure that where we can we get a better client engagement experience.”
Barnwell says that when a licensee provides the advice business with the right context for why it’s important that clients have great experiences, then “together we can have a lot of influence”.
He says that, having used the shadow shop process as a starting point, NAB has rolled out programs for its advice practices that enable them to do further in-depth client surveys.
“Again, it goes across the broader value chain, talking about everything from the digital engagement to their front-office staff to how did they feel the technical knowledge of the group was; what was the communication like, and the like,” Barnwell says. “Through that exercise, which we started as a run-off from the shadow shop, we’ve been able to get great client insight in regards to feedback of what consumers really think and what they value – and often, importantly, what they do not, or what may be missing.”
Each practice also receives a Net Promoter Score, Barnwell explains, and together with the client experience work it’s then possible to “re-engineer the business to make sure that they’re addressing what clients are wanting”.
He stresses that each advice practice is different, and has different strengths and weaknesses.
“There are some common themes, but at the same time we are dealing with SMEs and everyone will do certain things tailored to their particular business model,” he says.
The shadow shop happens annually and provides NAB with high-level feedback, while some practices have gone further with the additional survey work: some have “repeated it within a year, some do it on a six-monthly basis”.
“We’ve found though that the more practices that have done it, as they have shared their findings and insights with others in the licensees, we’re getting more take-up,” he says.
“It helps us to be able to tailor our offer. And again, some things you think might add a lot of value, don’t; and there are gaps in things where you think that’s not much value – but it could be perceived as valuable by that adviser.”