Quality financial advice traditionally rests on the abilities of a financial adviser and the support of their licensed dealer group. The rapid rise of automated (or robo) advice presents a new paradigm.

It replaces the financial adviser with an algorithm and, if quality advice is to be ensured, necessarily raises the level of oversight required by the licensed dealer group.

That is the central implication of a recent Australian Securities and Investments Commission (ASIC) consultation paper, Regulating digital financial product advice, which sets out a range of proposed new standards governing automated advice.

Those standards also flag the potential dangers for dealer groups and licensees exploring new automated advice offerings.

The regulator’s view

Financial advisers must meet a number of stringent training and competency standards, and that number is likely to rise under proposed new legislation.

At least one robo-advice employee, responsible for significant day-to-day advice decisions, must also hold those same financial adviser qualifications, according to the corporate regulator’s proposed guidelines.

ASIC’s recommendations about monitoring and testing algorithms are likely to present a much greater challenge.

Firms must hold documentation outlining the purpose, scope and design of their algorithms, and must test their robustness on an ongoing basis, with results recorded in test documentation. They must also establish processes to manage and secure algorithm changes, and be able to control, monitor and reconstruct any changes made to an algorithm over the preceding seven years.

Among other changes, ongoing samples of digital advice must also be reviewed by a human adviser to ensure that they are compliant.

Responsibility cannot be transferred to a computer

ASIC’s focus on algorithms highlights the fact that there is no such thing as ‘pure’ automated advice – people ultimately remain responsible for the quality of financial advice. It is a responsibility that cannot be transferred to a computer.

To remain compliant with the law, automated advice services will need to ensure that they clearly define the type of financial situations their advice algorithms can handle.

This means placing strict caveats around the type of questions asked and advice provided to clients, typically via a range of triage points. Users who hit these points are shown warnings highlighting where the scope of advice is limited.

Unfortunately, this also means users may end up stonewalled at the end of the advice process when their real-world and more complex financial problems cannot be solved.

They either receive a message which prevents them from further using the automated advice tool or are directed to contact a traditional face-to-face adviser – solutions which fail to meet their original needs.

Build or outsource?

Many of the automated advice offerings in the market today are relatively simple, providing single-issue advice on narrow topics such as asset allocation.

Firms wanting to advise a greater number of clients with broader needs through their automated advice services will need to construct more holistic algorithms that take into account more complex situations.

But as ASIC makes clear, this entails a significantly higher level of risk, and a correspondingly greater effort to ensure the advice remains appropriate.

The added complexity and compliance risks associated with holistic, goals-based algorithms and complex modern retirement products require extensive maintenance and testing. This can be prohibitively costly for individual advice providers. Outsourcing such functions – with appropriate internal oversight – is likely to be a key efficiency driver in a low-margin, automated advice environment.

The key question robo providers need to answer is what value they add for customers. Does it lie in their proprietary algorithms, or in the communication, engagement and customer experience they deliver?