As digital scams surge, who’s responsible?

This is an audio transcript of the Behind the Money podcast episode: ‘As digital scams surge, who’s responsible?’

Michela Tindera
Hey there everybody, it’s Michela. To close out the year, we’re bringing you something a little different. Back at the beginning of the month, I was in London for the FT’s Global Banking Summit. While I was there, I hosted a discussion on consumer protection in the digital age — from phishing attempts to the emotional manipulation of romance scams. The so-called social engineering tactics that scammers are using are becoming increasingly sophisticated, and all of that is raising serious liability questions. So who should be doing more? The banks? The regulators? Social media companies? For this panel, I had a group of four great guests from around the industry. You’ll be hearing from Carolina Garces. She’s the global head of financial crime risk at Citi. You’ll also hear from the founder and CEO of Open Banking Excellence, Helen Child. Also, my colleague Elliot Smither, who is production manager and chief sub editor at the FT. And finally, Ethan Salathiel, who’s a director at KPMG, leading on fraud transformation.

So now we’re gonna play an abridged version of that conversation for you right here. Enjoy!

[MUSIC PLAYING]

The pace at which fraudsters are developing new ways to scam people is extremely fast. I should say that I’ve mentioned to several people that I’m moderating this panel today, and pretty much every person I’ve mentioned this to says, oh, I know someone who got scammed recently. So Elliot, you and I were talking about this before, and I’m wondering if you might be able to tell us just one anecdote that you heard recently about a situation like this just to kind of set the scene.

Elliot Smither
Yeah, of course. I was actually hosting a panel at the Wealth Management Summit the other day, another FT Live event, and there was a panel — actually, I think the next host, Martin Arnold, was hosting it. But there was this guy who was a former cyber criminal, turned kind of government adviser now. And he showed this video that was posted on the dark web, and it was a live deepfake video of . . . There was this lady who was talking to what she thought was a retired Asian dentist, but in real time, you could see behind this dentist was a west African criminal gang, and he was talking and it was going live in real time through the voice of this Asian guy. And this lady was completely fooled by it. Apparently she lost about $150,000, and the scary thing was, this wasn’t just an example of it, this was an ad on the dark web for criminal gangs to go and purchase a monthly subscription to this software.

So the point is, now this stuff is out there. In the past, you had to be a digital wizard or a tech guru to do it. Now, you can just buy it off the shelf and run it from your phone. So I think the scams are maybe the same as they’ve always been — romance scams, or you know, phishing scams or whatever. But the tech advances have been staggering, which means that, you know, suddenly anyone can get involved and I don’t think the penalties are that great. So, you know, it’s just getting scarier and scarier.

Michela Tindera
Well, so that kind of brings me to my first real question here, which is: the surge in scams, phishing attacks, digital fraud, it’s raising serious liability questions, so should banks, regulators, fintechs, should they be doing more to protect consumers? Ethan, if you wanna start us off. 

Ethan Salathiel
Yeah, absolutely. Well, I think, you know, just an acknowledgment that fraud represents 40 per cent of all crime in the UK, so it’s huge. And as Elliot’s already referenced, we’re seeing these emerging schemes where fraud is being purchased as a service, and these organised criminal groups are really organising themselves like business models, which is really quite frightening, with an entire hierarchy and structure. And it’s gonna take us as a financial services industry to actually act like a network to bring down that criminal network of activity. So that’s, you know, my first key point.

I think, in terms of what more can be done there, there’s obviously always more we can do, but a lot of good work is being done. And I think we just need to pause and reflect on that as well. We’ve got the introduction of the Payment Systems Regulator’s mandatory reimbursement scheme, which is now making, you know, banks reimburse victims of fraud on a 50/50 liability basis between the sending and receiving bank. We’ve got the introduction of new pieces of legislation to hold organisations more to account. And we’ve got increasing, you know, awareness, where the financial services system is working with the Stop Scams Alliance. It was BBC Safe Scams Week last week. And, you know, we saw programmes in terms of protecting Britain, like Rip-Off Britain, so a lot is being done.

But I think for me, it’s about acknowledging the ecosystem in its entirety, and actually working from an end-to-end perspective. 70 per cent of these scams originate online. So it’s not just incumbent on the financial services industry; we need to also try and tackle fraud at source, where it stems from, and actually tackle it there so that it’s not always, you know, at the end of the lifecycle. And of course, you know, it’s in financial services where the fraudsters cash out.

Michela Tindera
Yeah. So in short, the answer is yes. More should be done.

Ethan Salathiel
(Michela laughs) . . . should be done, yeah. Always more to be done. 

Carolina Garces
I’m in agreement with what Ethan is saying because, first and foremost, fraud prevention and detection is not a point-in-time strategy. It’s a continuum. We need to keep improving, and, as banks, we are doing that. I appreciate the perspective from the UK. However, fraud is borderless. And I welcome your input and reflection on the ecosystem and how we need to tackle it from the end-to-end perspective.

We banks only look at one part of the puzzle, right? And I do believe that we are focusing on the regulated segment, the one that possesses transparency and already has a cadence and certain control infrastructure, but scams stemming from social engineering are in the trillions. And if I’m not mistaken, Chainalysis issued an estimate of $10bn in crypto scam revenues in ’24 alone, right? What’s happening is that all of these transactions, and the genesis, the starting point of frauds, is happening away from that regulated ecosystem. And that’s where we also need to be looking.

I think that digital literacy is not where it should be, right? And there are many players that need to become more aware and more active in this space. There are payment systems outside of that regulation, and there are market and industry participants outside that segment. And I think that those numbers corroborate that statement, so it’s the ecosystem as a whole.

Michela Tindera
Yeah. Well, Helen, I mean, what role do you think banks, regulators, fintechs play in tackling scams, and what needs to change to improve that? 

Helen Child
Well, I think for all of us it’s unequivocally a yes to your question.

Michela Tindera
Yes. 

Helen Child
To answer that, I’d go back to Carolina’s point about it being an ecosystem play. So I’ve been involved in open banking since the get-go, since we launched it in the UK — and the UK was the first to launch it. And I would like us to be the first in the world to put some systemic risk management in place and lead the world in that systemic risk management. So to answer your question, what does that mean for the banks? Well, if you look at it as concentric circles, you’ve got the bank in the centre of it, the regulated entity. Then around that, you’ve got, in the UK, circa 300 regulated open banking entities. And then beyond that, it’s unregulated — and it’s the wild west. It really, really is. So the regulated entities will connect to entities that aren’t regulated, and they will connect to other entities. So you get third, fourth, and nth degrees.

Now, everybody will come back to the bank, so it’s creating an unsustainable liability model for the banks, because if you have a problem right out there in the wild west, you trust your bank. You’re gonna come right back through that ecosystem to your bank. Now, that isn’t encouraging the bank to invest more and more into the ecosystem for the ecosystem to grow. I’m not saying they’re not investing, but to continue to invest at a sustainable rate, the cost of compliance for those fintechs, the regulated fintechs, is growing and growing and growing. And yet, once you get beyond those regulated borders, we’re not monitoring the health of the ecosystem, so there isn’t an accreditation process in place. It’s like an Abta. When you book your holiday, you go for an Abta-accredited supplier, a travel agent. You know, there isn’t an accreditation programme. There isn’t any real-time risk monitoring, and there isn’t any insurance-backed warranty. There’s no underwriting, there’s no insurance for the banks and the fintechs to underwrite that model, OK.

So two years ago, I started working with the Treasury. We got something in the Mansion House speech, which I’m really, really proud of, which is why I’m announcing it now. And I wanted to shift the dial in the industry that I’d helped create, and I wanted to do that by making the liability more sustainable for the banks, first and foremost, because they will invest more into it, and by giving the fintechs more protection to encourage more innovation. And through OBE’s global advisory work on open banking and open finance, I’m now advising a company called Invela. And we are working across the pond, so in the States and in Canada and in the UK, with the banks, across all the regulators, to introduce exactly these systemic governance controls, if you will, in the marketplace that will help us to continue to grow. Because if . . . One more point on this. If you look at it, I’m normally on stage saying, open banking, open finance, go over the hill, you know. The growth trajectory is like this. Of course it is. The flip side of that, as we all know in this industry, is risk. So we need to have an adult conversation in open banking, open finance, you know, about managing the risk, and that’s what we’re doing. We’re stepping up as the regulators have asked us to do. So yes, you know, there’s a problem, but we are, you know, coming to the market with some solutions. Does that answer your question?

Michela Tindera
Yes. I think so. So keeping that in mind, I mean, what are some of the challenges that you see in getting the industry to collaborate? It sounds like more collaboration is needed. What are the challenges to that, and to getting the industry to invest more in proactive fraud prevention measures, that sort of thing?

Carolina Garces
Can I just say something? The reality is that fraudsters these days are also investing the time to build trust to scam people. So there are other players hidden out there, also earning the trust of the consumer so that they can trick them. So that responsibility, that obligation, is no longer sitting just with banks. To answer your question as to the challenges that we are seeing, I want to respond from a global bank perspective.

Again, the UK has made tremendous efforts, and I think it is leading the industry in its efforts to tackle and prevent fraud. But we operate in an international environment. And one of the challenges that we are seeing, together with the evolving nature of fraud and the use of technology, is the rapid change of regulation. And with those changes, we also see a lot of asymmetry. When you finish implementing a regulation, which is obviously challenging and costly, and you’re trying to reconcile how it will play out vis-à-vis your international customers, then you have the next trend, right?

Then the other thing that we are seeing in fraud is that the vast majority are happening through authorised mechanisms, so the client is authorising that. So your detection infrastructure needs to change and needs to adapt to that trend. At the same time, when you assess what you’re seeing in these new typologies, you realise that you’re dealing with atypical data, right? In former times, you just looked at who the client is, money moving from A to B, from B to C, right? But now you need to take into account other data factors, like geolocations, IP addresses, telephone numbers, email addresses, looking at networks, looking at timing and certain patterns, behavioural patterns. And all of that is creating a more complex environment. So our traditional investigators that were specialists in a particular crime, they need to become more like generalists, fin crime generalists.

And another important point, and I think that Helen alluded to this briefly, is the need to balance security and customer experience. We often have this challenge in that we don’t want to introduce friction to the system. We are now faced with the challenge of processing transactions instantaneously, so they have to be good, cheap, transparent, and secure, right? And just one added challenge that I’ll put out there, and again, we’ve been referring to it already, is the fact that transactions are now initiated through different platforms, applications, mobile phones, and all of that together is what we need to contend with. Do we want to? Do we have to? Yes, we have no choice, but we need to pull all of that together and make progress.

Michela Tindera
Yeah. I mean, thinking about it that way, I mean, what role should regulators play in all of this? You mentioned that the UK has been getting more involved, but maybe thinking on this from a more global scale, Elliot, what are your thoughts? 

Elliot Smither
I think straight away, you need to have banks that have 24/7 fraud monitoring. I think that’s something that maybe not all have, but I think it needs to be put in place that that is expected. And I think a lot of the problems I’ve heard about happened at onboarding or just after that process. I think you need to have multiple layers of defence. It’s not anymore just biometric data or particular documents. There should be different levels that you have to go through to kind of sort it out. And I’m sure we’ll get on to it, but I think there’s an awful lot of AI out there that can be used to help fight this. The criminals are certainly using it, so I think perhaps regulators can sort of point people in the direction of how this could best be used, what’s safe to use and how you can work together.

And I think mainly getting people to collaborate, because you’ve got banks, you’ve got fintechs, you’ve got tech companies. And they don’t necessarily all want to pull in the same direction. I think the banks are involved certainly at one stage, and regulators are getting banks, certainly in the UK, to pay up and cover up to £85,000 worth of losses, you know, that’s on the banks. But this stuff originates on social media, a lot of it does. So I think there has to be a way to kind of get everyone singing from the same sheet.

Carolina Garces
I think at some point, banks have been seen as the gatekeepers of the payments system. I would like to see social media and marketplaces also play a role in gatekeeping. Social engineering, as I said at the very beginning, has become the entry point for fraud, and marketplaces the conduit, or the vector, that facilitates the selling of fraudulent products. And I think we have to accept the fact that today it’s not just hacking of systems, they’re hacking people. And I think they have a responsibility to also monitor that.

I do believe their algorithms are very clever, and I think whenever I see all the things that pop up in my feeds — because I also use that, and my 80-year-old mother is also a victim of the algorithm — they’re very clever. And they know what I like, and surely they’ve figured out that I’m a dog lover, and that I’m also interested in regulations. So it’s a weird combination of things that I see in my feed, but I think that they could be equally clever about looking at what’s being posted on there. Fraud as a service, right, that we were referring to earlier. That’s something that their algorithms can also look into, right? And that 24/7 detection capability, I see no reason why it could not be implemented there.

But it is about recognising that the movement of funds is not exclusive anymore to just regulated financial institutions. You have these rewards or tips or stars or rubies or whatever, that have become a similar model and infrastructure to a payment service provider. It’s a payment intermediary, and we need to start recognising that. Wallets today look like a bank account, but we are not calling them that. We continue to focus on regulating institutions. What about regulating the activity? And if we took that approach, I think that there is a lot that can be done here.

Michela Tindera
Yeah. 

Carolina Garces
Sorry, I get a lot of cases.

Michela Tindera
(Laughter) Yeah!

Carolina Garces
I get animated.

Michela Tindera
Well we’ve touched on this a little bit, but just to dig more into AI and how that’s playing a role in this discussion here. AI is advancing rapidly, and scammers are becoming more sophisticated in how they use it to scam people. So Carolina, how are you thinking about this at Citi?

Carolina Garces
Well, AI is the poison, but it’s also the antidote, isn’t it? We fight AI with AI, right? We are just all in that race to find the new toy, the new technology, the new capability. How to identify those deepfakes, how to recognise the pattern that the client uses to move their mouse to instruct their transactions, how they use their mobile phone. But at the same time, I know that there is somebody else watching on the other side how we improve with that. So I think, to the point made earlier by Elliot and Helen about this collaboration, I do think that that also needs to be 24/7, right.

We need to be at the other side of the telephone or an email and see ourselves as united in these efforts to try to improve things, share those typologies that we are seeing and learn from them and try to be faster. But I think that 20 years down the line, we’ll still be here talking about the new typologies and the new methods and new technology that we could use to detect scams. But yeah, it’s the machine against the machine.

Michela Tindera
Yeah. Helen, what do you think?

Helen Child
The AI one is a fascinating one, but I asked the guys at Invela that are building the trust platform what they’re doing in AI, ’cause they wanted AI, they wanted to learn, because it’s moving so quickly. So it’s a great opportunity to learn. And they started talking to me about trust and governance, and that sort of seemed to get it on to home territory for me anyway. I know about trust, I know about governance, and I said, OK, tell me more. Let me. And they said it’s . . . I’m gonna just read from my notes, ’cause I’m not an AI expert. They said that everything they do at Invela is grounded in evidence and that every AI output references live data. And then the second point that they told me, which I found really reassuring, is this human-in-the-loop concept. And I wondered if we shouldn’t all be talking more about human in the loop, because that would seem to me to build more trust into what we all need to do. The . . . Not the poison part, but the antidote part, I think it’s a great — I use that internally now — but I think if we talked more about the human in the loop, that would help to accelerate good use of AI. I mean, we talked about, yeah . . . 

Michela Tindera
What does that mean exactly? Human in the loop?

Helen Child
OK. Let me read my notes, OK (laughter), because I’m not an AI expert: anomalous and critical decisions automatically escalated for human review. OK. And this, Invela’s trust and governance, is built by some ex-Google guys and some very senior execs in the industry. And what they’ve said is that you need this human in the loop, ie, you know, AI first, but then get humans to monitor and to review it.

And then also, there’s a way of AI marking its own homework as well. And that’s an integral part of trust and governance. But then what they also said was that it goes back to something that we all know in this room; it’s also about data segregation and good data management principles. So I think AI sort of goes across all of our business, and we know a lot of this stuff anyway. We know about trust and governance models, we know about data aggregation, we know about four eyes back in the day. You know, you’d have four people looking at every document before you sent that transfer back in the days of Apacs. And that’s, you know, Apacs 30 and 40. That dates me. You know, so we know a lot of this. It’s just, I think, bringing those good principles into practice now with the advantages that AI can bring. Wouldn’t you agree?

Elliot Smither
Yeah. I mean, I cover a lot of wealth management in my job. Wealth management, private banking, and you’re talking about the human in the loop. Traditionally, particularly at that high net worth end, it’s been a people business, a relationship business, and they’ve always talked about that. But you go to the wealth management summits now, you talk to the private banks, they’re all getting increasingly involved in AI and increasingly involved in this stuff because they’re trying to move further down the wealth trend. Either get the next generation, get their business, or get the mass affluent business, and you simply have to use AI and stuff like that for that. But they’ll all be talking about how the way you can target that and serve those people with that sort of relationship in mind is, again, getting to know your customers, but maybe through a digital lens, like they can programme it. So they get to know how these people invest and move money, and flags will be raised immediately if they start doing stuff out of character. So that is definitely a way that if you . . . But obviously it’s only as good as the data that’s being put in at the right-hand side, but there’s no doubt that AI has to be a part of the solution, because you can’t do this at scale, and you can’t do it at speed and 24 hours without it.

Helen Child
And to your point about regulation, you know, our regulator, the FCA, has asked the industry to step up. Well, if you look at an open data economy, which is where we’re going, that by default is, you know, open across many, many regulated environments. So we have about 150 regulators. You cannot get a piece of regulation that monitors and polices all of that. So we need to, as an industry, step up, and bring some solutions to the market rather than waiting for the regulators, who will never, ever be ahead of the game in this market. They can, you know, to the point you were making, they can certainly advise and signpost us into what best practice looks like. But we really do need the industry to step up.

Michela Tindera
Ethan, I wanted to ask you, so you are working across many businesses. What are you seeing that is working in the AI space?

Ethan Salathiel
We are seeing . . . Well, I mean, it’s AI versus AI, as we’ve heard. And a lot of this is about real-time detection, and there’s a lot of technology vendors at the event today. It’s about real-time, you know, speed of application. We’re seeing machine learning deployed in a lot of our banking systems, to Elliot’s point, to detect patterns, and then the AI sort of overlaying that to give the context, to make decisions quickly off the back of those patterns that are, you know, anomalous against sort of what’s expected from that particular user, so it is an exciting space.

We’re also seeing, in certain areas of the world where we’ve got trust frameworks starting, where the tech vendors are actually sharing kind of consortium-based data and risk signals back with the financial services sector to actually do better and calibrate and fine-tune those rules and overlay those behavioural analytics, which is so important. It’s no longer the case that I need to know your name, address, to know who you are, because there’s so much rich data in actually the way that you interact with your devices, whether you’re left-handed, you’re right-handed, you know, your swipe patterns and speed at which you go through applications and payment journeys. So that behavioural analytics layer is just so crucial.

And that’s where a lot of the focus is, but I think that the point that we should also make, which is a hot topic at the moment, is the use of agentic AI, which is essentially an AI agent, which is increasingly being deployed to improve efficiency. And as we’ve talked about a number of times, with fraud as a service, fraud as a business model, you know, the fraudsters are using Copilot and efficiency tools to do things quickly and defraud their victims at pace. And we need to equally, you know, use those efficiency gains ourselves to, you know, summarise cases for an agent or an operator, cases that would otherwise take years to investigate, to actually bring data in from disparate systems into one central place, so it’s orchestrated and it’s easier to make decisions in our sector. So yeah, it’s an exciting time for AI generally.

Michela Tindera
Yeah. And just to, you know, focus a little bit more on the individual, because when a person is scammed out of money, it can be a very traumatic event. Should banks be responsible for helping that person? I mean, obviously the person wants their money back, but is there even more that banks should be doing? What do you think?

Ethan Salathiel
Well, I think, you know, I am really glad you asked this question, Michela, because we often focus on the financial impact of fraud, but we often forget that there’s a huge psychological impact as well on these victims. And that, you know, is a case of the blame, the shame, the anxiety that is carried forward and is long lasting, actually, beyond the financial loss experienced. In some of my client work, I’ve had the pleasure of working with Ayleen Charlotte, who’s one of the victims featured in the Tinder Swindler documentary. I don’t know if any of you have seen that, but she was defrauded to the tune of €150,000 over a long duration by her fraudster. And there was a huge amount of emotional manipulation involved in that fraud and the way it transpired. And I think it’s incumbent on the ecosystem to think about ways to support people emotionally through that journey.

In Ayleen’s words, she said her bank could have intervened more quickly. And some of the care and support in helping her, you know, get a refund and reclaim some of her loss, you know, could have been better. And we are seeing some good practice: you know, a number of clients I work with, they’ve got ‘breaking the spell’ teams in call centres, which actually intercept the fraud live and actually help to coach the victim to see whether they’re actually a victim of fraud. And, you know, trying to get them to see through, you know, the sort of spell that they’re under from their fraudster.

So we are seeing some good stuff like that happen, but always, you know, more could be done. And I think it’s about, you know, education and awareness, making sure that it’s signposted for victims, you know, where they can gain additional support. And obviously that whole ecosystem approach that Carolina referenced earlier. Thinking about not just the financial services industry, but more broadly.

Helen Child
One of the big things the banks could do, because ultimately the liability model is unsustainable for them, is going back to this accreditation process, knowing who you are dealing with as we grow and we have a far more open ecosystem. To have an accreditation process, I just think, is common sense. It really, really is. You wouldn’t book a holiday without knowing, because we’ve been through all that trauma in the holiday industry. So, you know, as we expand, as rapidly as we are expanding, there are some tools, really, you know, with which we can start monitoring the ecosystem far better. A lot of the ecosystem outside of those regulated boundaries, you know, to your point, is entirely unregulated. It’s unmonitored, and, you know, there are some good actors, but there’s also some, you know, bad actors, and, you know, it’s a bit of a wild west out there. So I think everybody needs to step up to the mark.

Ethan Salathiel
Yeah. And just in terms of consumer protection more generally as well. And I think it’s important — you mentioned social engineering earlier — you know, that’s where fraudsters are doing their research before they target a victim. And actually the banking sector can do the same and, you know, flip that on its head: actually investigate, you know, how to communicate well with particular victims and look at the data, you know, look at the cohort segmentation of your customer base, make sure the messaging is tailored to those specific audiences. So we know students are more, you know, potential victims of money muling activity, and elderly people perhaps more on the investment scam side of the spectrum. So actually personalising those messages, using AI to help you personalise those communications, can really help sort of up the ante here.

Carolina Garces
I would just add one point, because the customers are also corporates, small, medium, or large ones, but it doesn’t really matter how big they may be. Some of them are still lacking in their data security, which is usually . . . It’s also a common starting point for fraud, and I think that there is a lot that can be done in terms of educating both corporates and individuals about that basic security. Simple things, brought into tangible terms: how you control your data, how to keep it safe, changing passwords, and just basic things of that sort. But I also think that if we think of the individual victim, right, of these social engineering scams, if we were showing how these scams work, you know, there are fantastic podcasts that bring these to life, and it often makes you think, would I have fallen victim to this type of scam? And you’re thinking, if somebody had explained these scams and the way they operate . . . It’s just, they build that trust, they give you results. And when the bank asks you, are you sure you know this person? Is this a person that you know and trust? Think before you send your money. And they’ll say, yes, of course. I’ve been chatting to this individual for 10 months. The answer to that binary question is yes, but they don’t really.

Elliot Smither
And I think a key thing here as well — I think you mentioned earlier that 40 per cent of crime in the UK is now this kind of financial stuff. But there’s also evidence that the vast majority goes unreported, so it could be way, way higher than that. And I think the only way to really combat that is to encourage everyone to report everything. And there might be reasons not to — it could be embarrassment, it might be small amounts. But you surely can’t build up a robust defence unless you really know what you’re dealing with. So I think you have to make it easier. And, as you say, educate people. Encourage them to report every little thing and, yeah, I think that’s the way to fight it.

Michela Tindera
All right. Well I think on that note, we can wrap up. I wanna thank my panellists again for joining me in this discussion, and thanks to the audience for listening. Really appreciate it. Thank you.

[APPLAUSE]

Behind the Money is hosted by me, Michela Tindera. Fact-checking by Simon Greaves. Sound design and mixing by Sam Giovinco. Original music is by Hannis Brown. Topher Forhecz is the FT’s acting co-head of audio. Thanks for listening. We’ll be back next week with my colleague Martin Wolf’s thoughts on what to expect for the economy in 2026. See you then.