Using ChatGPT to support your work as an IFA? 5 crucial things to know first

AI is the new normal – there are no two ways about it. Workplaces and individuals alike are using AI tools to support their learning and improve efficiency.

As an IFA, you may already be using ChatGPT or a similar large language model (LLM) to support your work. Our LinkedIn survey suggests that 63% of advisers are already using LLMs.

The benefits of using LLMs at work are evident – but there are also risks and downsides you simply can’t ignore.

Here are five things IFAs need to know before using ChatGPT at work.

1. ChatGPT can help you manage “boring” admin

There are plenty of LLMs with similar functionality to ChatGPT – but for simplicity, let’s stick to ChatGPT, which is by far the most popular programme.

This software can:

  • Hold a “conversation” with you, helping you to organise complicated information
  • Provide interesting sources of information for you to explore a topic in more depth
  • Create graphs, tables, and spreadsheets (with some limitations)
  • Help you draft emails and other communications.

For many IFAs, this kind of software may be the shortcut you need to re-establish work-life balance.

As an example, you may be trying to post more on social media, especially LinkedIn, but it always slips your mind. You could ask ChatGPT to create an easy-to-follow content schedule and help you come up with post ideas.

2. ChatGPT could make your comms more relatable and empathetic

A huge part of your job is being a confidant for those who need advice and support. On top of providing data-driven recommendations (with the help of adviser-focused software, paraplanners, and compliance experts), you’re expected to listen carefully and remain empathetic at all times.

The thing is, you only see your clients once or twice a year. In the meantime, you may be responding to worried emails and need help warming up your tone of voice.

Let’s use the Autumn Budget as an example.

You receive an email from a concerned client: “I’m really worried about what Reeves is going to do. I read online that I should take my tax-free cash before the Budget just in case.”

You write out a quick reply: “We don’t know what Reeves is going to do yet. You might regret taking your cash if she doesn’t change anything related to pensions.”

But on rereading, you don’t feel it’s empathetic enough. You prompt ChatGPT for help: “Can you make this email more reassuring?”

ChatGPT suggests: “It’s still unclear what, if anything, Rachel Reeves might change around pensions. For the moment, staying put is the wisest move – and we’ll keep you informed as soon as anything changes.”

In just a few seconds, your original email becomes warmer and more reassuring for your client.

3. LLMs can “hallucinate” – or in other words, get things wrong

It’s important to remember that while ChatGPT seems “human” and can hold a nuanced “conversation” with you, it is a data-processing programme with no real understanding of whether the information it produces is true or false.

With this in mind, if you are asking ChatGPT to analyse complex financial data or help you understand a new topic at work, the information it offers may not be accurate. In fact, it’s very common for these models to get facts wrong – New Scientist reported in May 2025 that when summarising known facts about people, OpenAI’s o3 model hallucinated 33% of the time, while o4-mini hallucinated 48% of the time.

OpenAI has since released GPT-5, a more powerful version of the model, but TechRadar still reports that 1.8% of its answers contain “grounded hallucinations”.

Fact-checking is still crucial. Just like any other form of research you do, you should be able to back up what ChatGPT tells you with a reputable source.

4. There are significant GDPR risks involved in using ChatGPT

One big no-no is entering information clients give you into ChatGPT for analysis – particularly if it is sensitive personal data.

ChatGPT itself recommends the following:

  • Don’t enter any identifiable or sensitive data (names, contact details, case info, client data).
  • Anonymise or pseudonymise examples before sharing.
  • Implement a usage policy for staff outlining what’s acceptable to input.

Remember that LLMs are a very new form of technology. OpenAI has faced legal scrutiny in Europe over its GDPR compliance for some time – so it’s worth erring on the side of caution as an adviser.

5. OpenAI and similar companies are losing money, and a market correction could burst the bubble

Nobody knows what the future holds for ChatGPT and similar programmes.

Even OpenAI CEO Sam Altman has warned that a market correction is on the horizon due to the rapid inflation of AI company valuations worldwide, Nasdaq reports. As investors plough more cash into AI, many of the big players are still operating at a loss – Reuters reports that OpenAI was $13.5 billion in the red in the first half of 2025.

There is no crystal ball that spells out the future of LLMs, especially with regard to their role in financial services. At this stage, it may be wise to integrate these tools gradually, rather than becoming over-reliant on them.

Benefit from a full package of tech support with Corbel Partners

Our in-house tech stack and expert support team can help you build technology into your practice in a way that suits you. Pick and choose how you want to integrate technology as an adviser, and we’ll help you make it happen.

To find out more about joining our network, email hello@corbelpartners.co.uk or call 01925 637891.

Please note

This article is for general information only and does not constitute advice. All information is correct at the time of writing and is subject to change in the future.

