Not long ago, I was at the FintechLIVE conference, and everyone was still talking about AI.
You couldn’t move for CTOs and fintech founders at pains to tell you how fast they intend to implement “agentic intelligence” and “hyper-personalisation.”
But sitting there, one thought kept circling in my head:
If everyone’s automating connection… who’s actually building it?
Because in the financial sector, trust isn’t a feature. It’s the whole product.
And yet, most fintechs are now trying to scale it through lines of code.
When personalisation becomes impersonation
We’re at an interesting moment.
Technology lets us talk to millions as if we know them, but the more we automate, the more people notice when it’s not quite right.
You’ve probably seen it yourself:
• A “personalised” app notification that gets your name right but your situation wrong.
• A chatbot that confidently explains mortgage options to a student with no income.
• Or that wealth platform that keeps sending retirees reminders about “maximising your career growth.”
It’s the uncanny valley of marketing – close enough to feel human, but not close enough to feel real.
Automation bias: when smart tools make dumb mistakes
There’s a psychological term for this: automation bias. It’s our tendency to trust the system just because it’s automated.
Marketers fall into this all the time. We assume that because something’s data-driven, it must be right.
But AI doesn’t understand context. It knows what someone did, not why.
And in finance, the why matters.
The paradox of personalisation
Here’s where it gets tricky. You need automation to scale, but the more you automate, the less human it feels.
It’s not a simple trade-off. It’s a balance.
The tech should handle the heavy lifting. The human side should hold the meaning.
It’s rare that a fintech brand gets it right, but when it does, it stands out.
Take Starling Bank’s “Make Money Equal” campaign.
They analysed 600 stock images used in financial articles and showed how they spoke differently to men and women. Men were often portrayed as the “investors”, whilst women were portrayed as the “savers.”
Instead of using AI to fix the problem, they used it to see the problem. Then they brought the human insight: let’s make the language of money fair.
That’s personalisation done properly: it understood something real.
What trust looks like in practice
Yes, digital access is growing at pace. But EY’s research in 2023 found that more than half of consumers are still deeply concerned about identity theft, fraud and data misuse.
So, when fintechs over-automate, they risk eroding trust before they’ve even built it.
Personalisation works best when it feels like permission, not intrusion.
And that means putting the human filter back in place, checking what your algorithms are saying and asking:
“Would this still sound right if a person said it?”
Where the opportunity lies
This isn’t a call to go backwards. It’s a call to use automation differently.
Let the robots crunch the numbers.
But let humans craft the empathy.
Because in the end, what clients in finance really want isn’t just a tailored message.
It’s proof that someone understands what matters to them.
If you think your personalisation is powerful but your trust metrics aren’t moving, you might be sending out the wrong messages.
I’ve been creating something in the background that can help with that.
Speak soon,
Dan