24 June, 2016
What you need to know
- ASIC recently released Consultation Paper 254 and a draft Regulatory Guide on digital financial advice, and is likely to impose new regulatory standards on testing the algorithms behind automated advice systems.
- External Dispute Resolution schemes are likely to require the same standard of care, fairness, and consistency in financial advice whether the advice is delivered through a natural person or an automated system.
- Transparency in the nature of the service provided, the limitations or assumptions behind any advice, and in complaint handling is seen to be even more important in automated or digital advice models.
Robo and digital advice
The Australian Securities and Investments Commission (ASIC) recently released a consultation paper on digital advice, 'Consultation Paper 254 Regulating digital financial advice' (CP254), together with a draft Regulatory Guide, '000 Providing digital financial product advice to retail clients' (the draft RG). These documents are discussed below.
CP254 suggests that only 20% of Australian adults receive financial advice. A key aspect of the recent fintech phenomenon is the opportunity to make services that were previously inaccessible, or too expensive, available to a broader range of users, including retail clients.
Unsurprisingly, digital automated financial product advisory systems (‘robo advisors’) are becoming more prominent in the financial services industry in Australia and throughout the world.
What is robo advice?
Digital automated financial product advice has been available in Australia for a number of years, but has become mainstream over the past two years.
Digital advice is delivered by a system which asks clients a set of questions about their personal, financial and lifestyle objectives and circumstances. Less sophisticated systems provide ‘calculator’-style outcomes, and may be focused on lead generation. More sophisticated systems use algorithms to process the client’s answers, form a risk profile, and deliver financial advice and an investment strategy. Some systems go further, offering portfolio analysis, order processing and execution.
Interfaces are becoming increasingly intuitive, easy to understand, accessible at any time and on mobile devices, and are relatively cheap. They seek to put the power in the user’s hands, without the need for a human advisor.
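By way of illustration only, the kind of logic described above might, in a highly simplified form, look like the following Python sketch. The questions, scoring, thresholds and model portfolios are invented for this example and do not reflect any particular provider's methodology or any regulatory standard:

```python
# Illustrative sketch only: a deliberately simplified mapping from questionnaire
# answers to a risk profile and a model allocation. All questions, scores and
# thresholds are invented for this example.

from dataclasses import dataclass

@dataclass
class ClientAnswers:
    age: int
    investment_horizon_years: int
    loss_tolerance: int          # self-assessed, 1 (low) to 5 (high)
    has_emergency_savings: bool

def risk_profile(answers: ClientAnswers) -> str:
    """Score the answers and map the total to a coarse risk profile."""
    if not answers.has_emergency_savings:
        # Example of a suitability filter: the system should recognise clients
        # for whom the automated advice model is not appropriate.
        return "not suitable - refer to a human adviser"
    score = answers.loss_tolerance
    score += 2 if answers.investment_horizon_years >= 10 else 0
    score += 1 if answers.age < 40 else 0
    if score >= 7:
        return "growth"
    if score >= 4:
        return "balanced"
    return "conservative"

MODEL_PORTFOLIOS = {
    "conservative": {"cash": 0.40, "bonds": 0.40, "equities": 0.20},
    "balanced":     {"cash": 0.20, "bonds": 0.30, "equities": 0.50},
    "growth":       {"cash": 0.05, "bonds": 0.15, "equities": 0.80},
}

if __name__ == "__main__":
    client = ClientAnswers(age=32, investment_horizon_years=15,
                           loss_tolerance=4, has_emergency_savings=True)
    profile = risk_profile(client)
    print(profile, MODEL_PORTFOLIOS.get(profile, "no model portfolio"))
```

In practice providers use far richer questionnaires and methodologies; the point of the sketch is simply that the quality of the advice depends entirely on the questions asked, the scoring assumptions and the suitability filters built into the system.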
That sounds great… doesn’t it?
Arguments for digital financial advice are strong, especially for non-complex advice, or ongoing monitoring of investments. Shifting to digital advice creates cost benefits, ease of use and accessibility, and gives the investor more control.
On the other hand, automated advice carries potentially significant risks. The client relies on the system the provider has developed, including the assumptions and methodologies which underpin it. Key questions include: does the system ask the right questions; does the client understand the questions well enough to answer them accurately; does the system identify clients who are not suitable candidates for the advice; do the algorithms operate as intended; and are the assumptions in the methodology relevant? There is also an expectation that the provider monitors the system and works with the client to resolve any issues.
If one of these elements breaks down, the client may suffer detriment. Because the process is digital and repetitive, any problem is likely to be replicated in high volumes. If clients suffer detriment on a large scale, the advisory product, and/or the entire industry, could suffer reputational damage.
How does ASIC propose to regulate digital financial advice?
Digital advisors are required to adhere to the existing Australian financial services laws.
On 21 March 2016, ASIC released CP254 and the accompanying draft RG for public comment. Submissions could be made publicly or anonymously before the submission period closed on 16 May 2016. The issues posing the greatest challenge for ASIC are how to apply the organisational competence obligations to an automated advice model, and what standard should be required for monitoring and testing the underpinning algorithms.
CP254 asks 11 questions which address the adequacy and usefulness of the proposed regulatory guidance, the skills and competency standards of the responsible manager(s), and the methods, and resulting cost, of providing rigour and assurance around the algorithms and advice.
CP254 proposes that at least one responsible manager holds the requisite skills and competence required under ASIC’s standards, even if a machine produces the actual advice. This is clearly appropriate, given that there should be a responsible manager competent to monitor the advice generated by the system.
The draft RG sets out that ASIC expects any digital advisor to maintain a strong change management system, and be able to reproduce any algorithm used in the past seven years, show testing logs, and monitor algorithms.
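By way of illustration only, the following Python sketch shows one simple form of record keeping that could support expectations of this kind: versioning each algorithm release so the exact code can be reproduced later, and retaining a dated testing log. The structure and field names are assumptions made for the example, not requirements prescribed by ASIC:

```python
# A minimal sketch only: record keeping to support reproducible algorithm
# versions, testing logs and ongoing monitoring. Field names and structure are
# illustrative assumptions, not ASIC-prescribed requirements.

import hashlib
import json
from datetime import datetime, timezone

def register_release(registry: list, source_code: str, version: str, approver: str) -> dict:
    """Record an algorithm release with a hash so the exact code can be verified later."""
    release = {
        "version": version,
        "approved_by": approver,
        "released_at": datetime.now(timezone.utc).isoformat(),
        "source_sha256": hashlib.sha256(source_code.encode()).hexdigest(),
        "test_log": [],
    }
    registry.append(release)
    return release

def log_test_run(release: dict, test_name: str, passed: bool, detail: str = "") -> None:
    """Append a dated test result to the release's testing log."""
    release["test_log"].append({
        "test": test_name,
        "passed": passed,
        "detail": detail,
        "run_at": datetime.now(timezone.utc).isoformat(),
    })

if __name__ == "__main__":
    registry = []
    algorithm_source = "def advise(answers): ..."  # stand-in for the real algorithm's source code
    release = register_release(registry, algorithm_source,
                               version="2016.06.1", approver="responsible manager")
    log_test_run(release, "risk_profile_boundary_cases", passed=True)
    print(json.dumps(registry, indent=2))
```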
Further, ASIC has recognised the risk of cyber attacks, and expects providers to maintain high standards of security consistent with known digital standards.
The ‘best interests’ obligations will still apply to automated advice, and if a client is not a suitable candidate for the advice, they should be informed and the advisory relationship ceased.
Financial calculators
In the early days, digital advice more commonly took the form of a financial calculator model. ASIC issued Class Order [CO 05/112] permitting Australian Financial Services (AFS) Licence holders to offer a financial advice calculator without being an authorised advisor, provided that certain conditions are met. [CO 05/112] is also subject to a consultation paper, and is likely to be remade with minor edits.
How will the External Dispute Resolution schemes approach digital financial advice?
Dr June Smith, Lead Ombudsman in the Investment and Advice division of the Financial Ombudsman Service Australia (FOS), discussed the FOS approach to robo advice at the 2016 ASIC Forum.
FOS expects many of its 14,000 members to offer automated advice, so is preparing internally. In automated advice cases and when identifying systemic issues, FOS will continue to follow the law, and also apply its focus on fairness and consistency.
This focus on 'fairness' includes:
- fairness of service provision;
- fairness of conduct and dealings; and
- fair treatment of all clients.
FOS considers that a key risk of automated advice is clarity of service provision. This involves clarifying the upfront relationship, the client’s expectations, and the limitations of the advice. It is important that clients are made aware of those limitations, and that clients for whom the advice model or its assumptions are not appropriate are assessed as unsuitable for the automated advice.
Dr Smith indicated that accountability is still important. FOS will not accept a claim that poor advice ‘was the algorithm’s fault’, and makes the point that there is still a human (the client) involved, so digital advice is not a ‘humanless’ process.
Impact of automation
US banks are adopting automated advice rapidly. New products appeal to millennials, permit low cost investing, and open the financial products market to a new segment of retail clients who would not otherwise invest.
Since 2015, Australian banks have openly discussed offering digital automated advisory services to clients, either through an online banking platform or for a low fee. Midwinter’s Robo-Advice Survey1 estimates that 90% of users will be generation Y or Z clients, making robo advice an attractive proposition for any advisory institution. It is logical to expect widespread adoption of automated advice across banks and other advisory institutions.
Proponents of digital advice claim that automation is more reliable than human advisors. Automation does provide consistency within the parameters set by the system, but it has limitations of its own. An error replicated consistently has the potential to create high volume systemic breaches. Replacing human advisors with robo advisors in banks will not eliminate the risks attached to the advice model – close monitoring is still required to ensure that advice is suitable for clients and that the algorithm is performing as required. Automated advice may only be appropriate in non-complex advisory scenarios.
Automated advice will place a heavy obligation on the institution's compliance functions to monitor algorithms, monitor the quality and applicability of the advice, and actively engage in remediation programs to address shortfalls in quality as systems become more sophisticated.
The Australian Prudential Regulation Authority (APRA) commented in Australian Prudential Practice Guide 223 that Authorised Deposit-taking Institutions (ADIs) should not rely on automated system vendors to carry out model validation. Doing so would present a clear conflict of interest. APRA would expect the ADI to develop its own in-house expertise as appropriate. This could see existing banking roles take on more data analytics work, and banks increase their pool of data scientists. Robo offerings may also create a greater reliance on professional consulting firms.
How are European regulators responding to automated advice?
Europe is increasingly seeing more sophisticated automated advice and ordering systems. Some compare and contrast investment instruments; others are moving into the insurance and pension markets.
Through the Joint Committee of the European Supervisory Authorities, the European Banking Authority, the European Securities and Markets Authority and the European Insurance and Occupational Pensions Authority published a consultation paper on 5 December 2015 to collect market feedback on a range of questions about automated or digital financial advice.
Some key risks flagged in the consultation paper include:
- Consumers making incorrect decisions based on a lack of information and reduced opportunity to fill gaps or ask for clarification.
- Consumers receiving unsuitable advice because they do not understand how the algorithms use their answers.
- Internal tool bias.
- Risk of the tool’s algorithm being hacked, and clients suffering loss.
- Volatility caused by ‘herding risk’ with clients being pushed in the same direction.
Interestingly, the paper seeks to define automated advice rather than establish methods of control or risk mitigation, other than asking for comments on whether clients should be given the option to also speak with a person.
Until further guidance is issued, existing quality standards remain. All advisors (human and automated) are required to apply ‘know your customer’ and to establish an investment profile of the customer. The advisor must have a ‘reasonable basis’ on which to make an advisory recommendation.
United Kingdom Financial Advice Market Review
In August 2015, the United Kingdom (UK) Government and the UK Financial Conduct Authority launched the 'Financial Advice Market Review' (FAMR). The FAMR investigated issues with the financial advice market from a mass retail perspective, focusing on accessibility, affordability, and consumer protection. The resulting report, published on 14 March 2016, recommended possible improvements to the system.
One interesting suggestion (recommendation 3) was to consult on new guidance to support advisory firms offering services that help consumers make their own investment decisions without giving a personal recommendation. If this recommendation is implemented, automated guidance systems could become common.
An observation from Ashurst's UK colleagues is that the UK regulator is noticeably cooperative with the industry, and is working with business to achieve a positive market-wide outcome. The industry expects further guidance on automated advice and the use of technology. Until then, regulatory material is 'media / technology neutral', so the rules that apply to financial advice apply regardless of the technology or media used. This is not dissimilar to the Australian regulator's treatment of different types of technology or media.
The future of digital and automated advice in Australia
Automated advice is here to stay, and is likely to continue to develop into a more sophisticated offering, probably combining with money management tools, behavioural tools, big data, and becoming part of our everyday internet banking offerings.
Advisors considering offering automated advice models in Australia must expect to fit within the current regulatory framework, and must focus on:
- ensuring algorithms are regularly monitored and tested (a simple illustration follows this list);
- ensuring assumptions are appropriate for the client base, and are clearly communicated to the client;
- being transparent about the service’s capabilities and limitations;
- being transparent about incentives, benefits, and any bias towards aligned products; and
- having clear procedures for monitoring advice and remediating clients who may have suffered as a result of the automation.
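By way of illustration only, the following Python sketch shows one simple approach to the first of these points: re-running a fixed panel of test client profiles against the production algorithm and flagging any divergence from the outcomes approved at release. The profiles, expected outcomes and stand-in algorithm are invented for the example:

```python
# Illustrative only: a basic monitoring check that compares current algorithm
# outputs against an approved baseline for a fixed panel of test profiles.
# All names and values are invented for this example.

def current_algorithm(profile: dict) -> str:
    """Stand-in for the production advice algorithm."""
    return "growth" if profile["loss_tolerance"] >= 4 else "balanced"

APPROVED_BASELINE = [
    ({"name": "young accumulator", "loss_tolerance": 5}, "growth"),
    ({"name": "cautious saver", "loss_tolerance": 2}, "balanced"),
]

def run_monitoring_check() -> list:
    """Return any discrepancies between current outputs and the approved baseline."""
    discrepancies = []
    for profile, expected in APPROVED_BASELINE:
        actual = current_algorithm(profile)
        if actual != expected:
            discrepancies.append((profile["name"], expected, actual))
    return discrepancies

if __name__ == "__main__":
    issues = run_monitoring_check()
    if issues:
        # In practice this would trigger escalation and, if clients were
        # affected, an assessment of whether remediation is required.
        print("Review required:", issues)
    else:
        print("All baseline profiles match approved outcomes.")
```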
One would expect ASIC to take care when issuing AFS licences to automated advice providers. Participants are expected to have a strong grasp of Australia’s regulatory requirements, a strong compliance culture, and a commitment to technical quality. This could be seen by some as a potential barrier to entry, but it is seen as essential to manage risks and maintain market integrity. ASIC has clearly stated on numerous occasions that its strategic priorities include maintaining investor and financial consumer trust and confidence, and fair markets.
1 Midwinter’s Robo-Advice Survey, http://www.midwinter.com.au/midwinters-robo-advice-survey-results/.
For further information, please contact:
Jonathan Gordon, Partner, Ashurst
jonathan.gordon@ashurst.com