Using AI to help plan your finances? Here’s what ChatGPT gets wrong

By Iona Bain


It’s the em dash, apparently. That extra-long line you might have noticed in social media posts, blogs and emails – and it could be a giveaway that ChatGPT has entered the chat.

This distinctive punctuation mark is a favourite of the world’s most popular AI chatbot. Its sudden appearance in everyday writing has sparked suspicions (and a rising feeling of awkwardness among those of us who genuinely do use it!).

Maybe all those heartfelt LinkedIn posts about what the death of a family parrot can teach us about leadership aren’t quite what they seem…

Spotting more serious signs of chatbot influence isn’t always so easy, especially when it comes to our finances. New research from Fidelity International suggests that 25% of Gen Z and millennials are using AI to learn about investing.

Yet ChatGPT may be getting up to one in three financial questions wrong. That’s according to broker analysis site Investing In The Web, which asked 100 personal finance questions such as ‘How do I save for my child’s education?’ and ‘What are the pros and cons of investing in gold?’.

A panel of experts reviewed the responses and found 65% were accurate, while 29% were incomplete or misleading and 6% were flat-out wrong.


And it’s not just ChatGPT. Many Google searches show an AI-generated ‘overview’ at the top of the results page. A study by financial services digital agency Indulge found a quarter of these summaries for common finance queries were inaccurate.

Ironically, Indulge used ChatGPT’s latest model to fact-check each Google overview. Phase two of the study will involve human experts weighing in.

Paul Wood, the director overseeing this research, is not impressed. ‘Anything less than 100 per cent accuracy is, in my view, a failure of the system,’ he says.

So why is generative AI often wide of the mark? It depends entirely on the prompts it is given and the data it is trained on, both of which can be flawed or outdated. It rarely shows its workings or sources.

And, to put it bluntly, ChatGPT is designed to sound polished and plausible. Too often it resembles a smooth-talking chancer trying to blag their way through a job interview. To be fair, humans don’t have a spotless record here, either. The Financial Ombudsman received 1,459 complaints about financial advisers last year; mis-selling and unsuitable advice made up the largest category, and 57% of those complaints were upheld. That’s a tiny proportion of the hundreds of thousands of complaints it receives about the wider financial industry, but still.

For most people, professional advice simply isn’t accessible. According to a poll by asset-management giant Schroders, three quarters of advisers won’t take on clients with less than £50,000 to invest. That’s because advisers typically charge a percentage fee, and smaller pots aren’t worth their while.


Meanwhile, banks and pension providers can’t offer straightforward guidance about your money because they’re not regulated to give advice. So is it any wonder AI is stepping in?

The financial sector knows it has to catch up. The Financial Conduct Authority is changing the rules to allow more firms to offer ‘targeted support’, sometimes via AI. For example, it wants pension funds to be able to warn a customer who is drawing down money from their nest egg too quickly, and investors to be told if cheaper funds are available.

A senior figure at a major financial firm recently told me about a customer who held their pension and bank account with it. When they tried to cash in their retirement pot, staff spotted regular gambling activity on their statements. Instead of waving it through, the firm urged the customer to seek help.

Some financial advisers are automating admin tasks to cut costs and serve more clients, including those with less money. Octopus Money blends AI-generated suggestions – via a proprietary algorithm – with human money-coaches.

Other tools, such as specialised chatbots, can analyse your finances and tell you where you’re going right – or wrong. Take Cleo – it offers two tones: ‘hype mode’ praises your good behaviour while ‘roast mode’ gives you a playful telling-off and might say ‘here are the companies that are bleeding you dry’.

Apparently, most of Cleo’s seven million users prefer roast mode. Maybe we all know deep down that financial tough love can go a long way.

Which brings us back to ChatGPT, infamous for telling you your ideas are brilliant. To avoid its pitfalls, give it as much detail as possible in your prompt. Always ask for sources and remember that its answers may not be current or relevant to the UK. Check privacy settings if you’re concerned about data being used to train future models.


And most importantly, don’t treat its advice as gospel. Specialist financial AI could be a game-changer. But right now? I’m not sure I want the robot equivalent of Del Boy handling my investments – do you?
