Smashing Security podcast #466: Meta sees everything, Copy Fail, and a deepfake gets hired
2026-05-06 · Author: grahamcluley.com

GRAHAM CLULEY

So these employees, they're drawing little boxes around things to say, this is a flower pot, this is a traffic cone, this is a coffee tender, whatever it may be.

And yeah, but this, this is the reality. This trillion-dollar industry of AI, the actual truth is there's an awful lot of people who are having to do this.

Smashing Security, Episode 466. Meta sees everything, copy fail, and a deepfake gets hired with Graham Cluley and special guest Paul Ducklin.

Hello, hello, and welcome to Smashing Security, Episode 466. My name's Graham Cluley.

PAUL DUCKLIN

And my name is Paul Ducklin.

GRAHAM CLULEY

Hello, Paul. Good to have you back on again.

PAUL DUCKLIN

Hello, Graham. I always forget that when you do that pregnant pause, I'm supposed to give my name, even though I've done it many times before, I make the blunder every time.

GRAHAM CLULEY

You're not the only one. Sometimes I have to whisper, "Say who you are," and sometimes the whisper is edited out afterwards. Yeah.

PAUL DUCKLIN

So, oh no, if it gets that bad with me in any future episode, I'm happy for you to leave the stage whisper in because that will focus my mind for next time.

GRAHAM CLULEY

How has the world of cybersecurity been since we last spoke?

PAUL DUCKLIN

Well, it has been a lot more of the same, hasn't it? More AI panics, more bugs, more patches, more social engineering attacks, more of everything.

GRAHAM CLULEY

It's always been that way, hasn't it? It's always been heading in that direction. You can't ever say, "Oh, it's been really, really quiet.

This whole cybersecurity business seems to be shutting down. Maybe we'll have to find ourselves a new job."

PAUL DUCKLIN

We'll get into fishmongery. Well, phish with a PH would come after us, I'm sure.

GRAHAM CLULEY

Well, before we kick off, let's thank this week's wonderful sponsors, Action1, ESET, and Vanta. We'll be hearing more about them later on in the podcast.

This week on Smashing Security, we're not going to be talking about how a Brazilian firm that protects businesses from DDoS attacks was found to be helping a botnet that launched massive DDoS attacks.

You'll hear no discussion of how Microsoft Defender started mistakenly detecting DigiCert root certificates as a Windows Trojan horse.

And we won't even mention how hackers managed to steal source code from cybersecurity giant Trellix. So, Duck, what are you going to be talking about this week?

PAUL DUCKLIN

Graham, I am going to be talking about the latest BWAIN.

And when I say "BWAIN," that is a Bug With An Impressive Name.

And if you've been looking at the IT media at all in the last week or two, you will probably have seen the name "CopyFail". And I'm going to be talking about another reason why you should probably steer well clear of Meta's smart glasses.

GRAHAM CLULEY

Plus, we'll be talking about the danger of deepfakes with Jake Moore of ESET, who actually tricked a company into giving his deepfake a job offer after an online interview.

All this and much more coming up in this episode of Smashing Security. Well, we've got time now to chat about one of the sponsors of this week's show, Action1.

Now then, if you are a systems administrator managing endpoints every day, you've probably postponed patching at least once, not because you forgot, but because you didn't feel like gambling with uptime.

JOE

Meanwhile, the backlog grows, vulnerabilities pile up and patching stays stuck in manual mode.

GRAHAM CLULEY

Well, Action1 fixes that. Action1 is a cloud-native patch management platform for Windows, macOS, Linux, and third-party apps, all from one place. No VPN needed.

JOE

Curious how easy it is to start with Action1? Well, you can use it on your first 200 endpoints for free forever with no functional limits.

GRAHAM CLULEY

First 200 endpoints for free forever? That's bonkers. Incredible, Joe.

JOE

So if you're looking to automate patching at scale and get weeks, even months of your time back, go to smashingsecurity.com/action1 and sign up for patching that just works.

GRAHAM CLULEY

That's right. It's not a free trial in disguise. There's no credit card required. There are no hidden limits. All you have to do is visit smashingsecurity.com/action1 and get started today.

And thanks to Action1 for supporting the show. So, Duck, I need to make an apology. Not specifically to you; to the listeners, I think.

Back in February, I had a bit of a rant with James Ball about the smart glasses made by Meta, and I explained how Mark Zuckerberg's company was planning to turn on facial recognition within the glasses and how it had been reported that there was this secret internal memo inside the company that said—

PAUL DUCKLIN

I thought you were going to say inside the glasses, which would be more exciting.

GRAHAM CLULEY

No memo inside the glasses. They had this secret memo where they said, "Look, there's going to be a bit of aggro when we launch this.

You know, people who care about privacy and stuff are going to object." And so they said, "The best thing that we can do is time this to be when there's some other political chaotic event going on, distracting all those civil liberties people."

PAUL DUCKLIN

That is much, much, much more a question of deep thought and planning than, "Hey, let's just do it late on a Friday afternoon." Yes. Let's do it when World War III breaks out.

No one will notice.

GRAHAM CLULEY

Anyway, the reason why I need to apologize about this is because I know some listeners may have been put off Meta Smart Glasses because of that.

Yeah. So I'm hoping to address that right now by convincing everybody else listening that they also should not buy Meta Smart Glasses.

So there's another reason, as if what we discussed 11 episodes ago weren't bad enough. We're back with another alarming story emerging about these AI smart glasses, which are being sold on the high street; I was in an airport recently, and there they were in a duty-free store.

They've been sold all around the world.

PAUL DUCKLIN

Are they getting bought though? Do people really like these things?

GRAHAM CLULEY

Millions of them have been sold already, Duck.

Something like 7 million or something.

People are going around with these Buddy Holly-style glasses, and of course, these are geeky kind of glasses, but with cameras and a microphone and an AI voice assistant tucked into the frames.

And if you hear what Meta are saying about them, one of the central messages they push out and they want you to remember, and this is the quote they give people, is they are "designed for privacy controlled by you."

PAUL DUCKLIN

Designed for privacy. So let me get that right.

These are spectacles that you wear that record the entire world around you, upload stuff to the cloud even when you forget that this is happening, and that somehow improves your privacy and everybody else's.

I'd love to hear how that works, Graham. Do tell.

GRAHAM CLULEY

So their message is that you are in control of your data and your content. This is what they claim.

So two Swedish newspapers, Svenska Dagbladet and Göteborgs-Posten — apologies to anyone Swedish for the mangling I've done of your beautiful language — they were working with a freelance journalist in Kenya.

PAUL DUCKLIN

Ooh, when you mention Kenya, I've got a suspicion that I know where this is going and why.

GRAHAM CLULEY

Well, you may well be right.

PAUL DUCKLIN

Does it involve something beginning with S?

GRAHAM CLULEY

It does. And so let's explain what's going on. So when you wear your fancy Meta glasses and you say, "Hey Meta, what is this thing that I'm looking at?" the AI helpfully responds.

It tells you, right? And you may well ask yourself, who actually sees that footage that you've just taken? Who analyzes it?

And the answer is that it's 1,000-odd people sitting in front of computer screens in Nairobi. In Kenya, there is a firm called— Duck, do you want to fill in the blank?

PAUL DUCKLIN

I don't know how you say it. I presume it's Sama.

GRAHAM CLULEY

Yeah, I don't know if it's Sama or Sama. S-A-M-A. Which one should we go for today? I don't know what it is. Sama. Let's try Sama.

So, they are the same Sama that Meta hired previously to moderate Facebook posts.

Now, as was well documented, that earlier contract ended in lawsuits and former employees of Sama describing frankly the horror and the psychological trauma that they had to endure from being made to watch what is uploaded to the internet by Facebook users in their Facebook posts.

And they were doing this 10-odd hours a day, for not very much money. And Sama later said that they regretted, you know, taking on that work for Facebook.

PAUL DUCKLIN

Yes, who would have thought that paying people a modest income to watch videos of people getting beaten up, mutilated, beheaded, perhaps even worse, is an illustrious career to offer to people in the developing world, eh?

GRAHAM CLULEY

It's ghastly, isn't it? Anyway, Meta then hired Sama again, but they gave them a different job.

Their new job was to look at the videos coming out of their smart glasses and label up what was in them so that Meta's AI could learn the difference between a flowerpot and a traffic cone.

PAUL DUCKLIN

But don't these glasses just come on willy-nilly? Most people just run them all the time.

GRAHAM CLULEY

I imagine that would drain the battery and so forth.

I think normally you have to say, "Hey, Meta," a bit like, "Hey, Google," or, "Hey, Siri." You have to ask them to do something, but maybe there's an option to have them permanently on.

PAUL DUCKLIN

Do they detect whether you're actually wearing them or if they're just on the side stand or in the bathroom? And do they work if they're just lying around innocently somewhere else?

GRAHAM CLULEY

It appears that you can just leave them lying around somewhere and they will carry on recording.

So they don't do a sort of biometric nose print to know that they are balanced upon your nozzle. Nothing like that.

PAUL DUCKLIN

You'd think that would be pretty trivial.

So you got them recording, you take them off, you put them down somewhere in a coffee shop, and then you wander off to the toilet or to the counter or whatever, they just carry on recording surreptitiously while they're on the table?

GRAHAM CLULEY

They carry on recording. Oh, great.

PAUL DUCKLIN

Well, that definitely improves your privacy, Graham.

GRAHAM CLULEY

Well, okay.

So these employees, as I said, they're drawing little boxes around things to say, this is a flowerpot, this is a traffic cone, this is a coffee tender, whatever it may be.

But this is the reality, Duck. This is the reality of this trillion-dollar AI industry.

The actual truth is there's this unglamorous manual labour which is critical to its foundations.

You know, there's an awful lot of people who are having to do this to train the AI, if AI even exists at all, to do these things.

PAUL DUCKLIN

Yes. The last time I was on this podcast, we talked about— was it Your AI Slot Balls Stop Me? Where it was fake AI that actually was done by humans in the background.

But it pretended to be AI and its behavior was surprisingly AI-like. So maybe that website had more going for it than you might think.

GRAHAM CLULEY

It was actually quite a fun game, wasn't it? It was quite addictive pretending to be an AI.

PAUL DUCKLIN

But it does raise certain questions, doesn't it?

GRAHAM CLULEY

Yes, it does seem that we come across this problem all the time. It's actually humans behind these things. So the reporters interviewed more than 30 Sama workers in secret.

Because these people had signed NDAs, they were terrified of losing their jobs. And what it turned out was that they weren't labelling flowerpots and chihuahuas.

As one of them put it: "We see everything from living rooms to naked bodies. We see everything." And what does everything mean? It means everything.

So they said, "We see footage of people on the loo." Footage of people undressing, footage of people's bank cards, footage of people watching pornography while wearing the glasses.

PAUL DUCKLIN

I'd thought of all of that stuff about catching you without your underpants on.

But what I hadn't thought of, which is really obvious when you think about it, is when you get out your credit card to pay for something online.

Yes, you will unavoidably look straight at it. And modern credit cards have the numbers writ large so you can't get them wrong.

GRAHAM CLULEY

Or if you've got your glasses on, if they're recording, obviously, and you go into your password manager and some website is saying, no, you have to type in your password, you can't cut and paste it.

And you say, well, show me what the password is then. Then that password is being beamed through as well, isn't it? Absolutely.

PAUL DUCKLIN

Or if you've got one of those things that says, check that your passport has at least 6 months before it expires, and you get it out and open it up. Yes, it does. And you close it.

You've just given away all the data that somebody would need to read out the NFC chip.

GRAHAM CLULEY

So, as we were just discussing, one Sama worker, he described a video he saw where a man had taken his glasses off, put them on the bedside table, walked out the room.

Shortly afterwards, his wife walked in, started getting changed, and the glasses were still recording. She had no idea. But it's not just the video footage.

There's also transcripts of what is being said by people wearing the glasses.

PAUL DUCKLIN

That's good for your privacy, isn't it?

GRAHAM CLULEY

Another great thing for your privacy. Thank you, Mark Zuckerberg.

Workers reviewing these conversations said they heard chats about crimes and all kinds of what was ominously described as "dark things."

PAUL DUCKLIN

You'd also basically hear people unknowingly violating things like NDAs, right? By, say, discussing work, thinking, oh well, I won't tell this to anyone else.

Meanwhile, it's going into the cloud so that some system can record what they said, and someone in Kenya gets to listen to it in case it got the words wrong.

GRAHAM CLULEY

Wow. So, this isn't good.

PAUL DUCKLIN

You know what, Graham? I reckon I might not buy these things. I might just skip it.

GRAHAM CLULEY

Okay, I've convinced one person. All right, let's work on the rest of you now, you listeners.

PAUL DUCKLIN

Thanks for the warning.

GRAHAM CLULEY

So, Meta told the BBC that footage stays on your device unless you choose to share it with the AI. There's clear user consent, they say. They say faces are blurred.

It's all very above board, says Meta. So the reporters looked at that one by one to see if that was true, and they found that the blurring fails.

According to workers at Sama and former employees at Meta, faces which are meant to be obscured aren't, particularly in low light, like for instance, a bedroom, or when the camera's moving quickly, which of course, if you have glasses on your face and you turn your head, your camera is moving quickly.

So the blurring is not reliable. Also, Meta says that it's only when you choose to share your data.

So these reporters in Sweden, they bought a pair of glasses themselves and they went through the setup and the app asked, would you like to share extra data with Meta to help improve the products?

And they said no.

PAUL DUCKLIN

Oh, "extra" data. Yes, I wondered if that word "extra" was carrying quite a lot. It's like cookies. Do you want no cookies? Okay, then we'll only use the ones we really think are important.

GRAHAM CLULEY

So they tried to use the glasses offline. The AI part of the glasses wouldn't work; it wouldn't even start. The AI required a constant connection to Meta's servers.

So you can opt out of sharing data all you like, but every single time you try and use that "Hey Meta" feature, your audio and video are being beamed off to a data centre and may well end up in Nairobi being analysed by a person.

PAUL DUCKLIN

Yes, because there's absolutely no way you could get enough computing power into a pair of spectacles to do cloud-level environment-ruining AI comprehension.

GRAHAM CLULEY

Yeah, these things aren't good. These things aren't good.

Oh, and finally, the journalist went round to about 10 opticians in Stockholm and Gothenburg who actually sell the glasses, and they asked them, how is the data handled?

And one said, "Nothing is shared with Meta. That was a big concern for me as well, but you have full control." Not true. Another said, "Everything stays locally in the app." Not true.

So these are the people selling the product in high street shops in countries that take privacy fairly seriously.

This has become big news and you'd think Meta might apologise 'cause the ICO and others, they're all sharpening their knives over this.

And what Meta actually did last week was they sacked everybody. They terminated the contract with Sama. 1,108 employees in Nairobi have been fired.

The very people who blew the whistle have been kicked out the door. So—

PAUL DUCKLIN

Is that under guise of breaking their NDA or something?

GRAHAM CLULEY

Well, who knows?

PAUL DUCKLIN

Or is it just one of those things they go, oh, sorry, the contract's over. So we haven't exactly sacked you. It's just that your job became redundant.

GRAHAM CLULEY

Yeah, I think so. So Meta says that Sama did not meet their standards. So they've gone to someone else, presumably. It seems the real reason is that the workers spoke to journalists.

PAUL DUCKLIN

They didn't meet their standards. Oh, you mean, if they'd just kept schtum... Yeah.

And this whole thing about facial blurring, it just occurred to me, Graham, that that is a little bit of a red herring if you think about it.

Because if your video has ever caught you, say, looking at your passport or looking at your password manager or looking at your credit card, who you are is pretty obvious.

And then if there's anything that looks like a household background, it's pretty obvious who the faces in the picture are most likely to be. So blurring them doesn't really help.

It could easily be pieced together.

GRAHAM CLULEY

So if you've been using these glasses and used their AI features, if you've looked at a bank statement, if you've looked in the mirror, if you've done something a bit romantic while wearing them, there is a non-zero chance that footage has been seen on a screen by a human.

And there's nothing you can do about that now. You can stop using the AI features going forward, but that's about it.

But worse still, and I'm sure this is the sort of thing that upsets you as well as me, Duck, is that if you see somebody else wearing them, someone you didn't agree to be filmed by, you're probably right to be paranoid that maybe your picture or what you're doing is being uploaded there as well.

PAUL DUCKLIN

And although in a public place, certainly in the UK, you don't have much, you have some, but not much expectation of privacy.

People are allowed to film in public places if they want to take photos.

This is a little bit different where somebody is recording their point of view and the things that are going on in their vicinity essentially continuously, and then possibly even without realizing it themselves, are kind of sharing that information with someone else.

It's not even enough to trust the person who's wearing the glasses and go, well, they're a good friend, they wouldn't do me down, 'cause they might be doing you down without even knowing.

GRAHAM CLULEY

Horrible. So, I hope this apology has been acceptable, because I know there were some of you who hadn't been put off buying Meta glasses. Hopefully now all of you are.

And now a word from our sponsor.

PAUL DUCKLIN

Let me guess which company it's not.

GRAHAM CLULEY

We've just got a moment to thank one of this episode's sponsors, ESET.

Now, there's no shortage of cybersecurity vendors claiming to be the best, of course, but ESET is one of the few that's been proving it for 30 years.

Research has always been at the core of what ESET does.

Their threat intelligence teams are actively tracking APT groups and ransomware affiliates and publishing findings that the security community actually reads and references.

That's not a marketing line. That's 30 years of doing the work. And here's what makes it interesting.

Three decades of research means that ESET has built up global telemetry that most vendors simply don't have access to.

They combine that telemetry with AI-native technology and human expertise, and that's what powers both their products and their MDR service.

Real intelligence behind the protection, not just pattern matching. 110 million users worldwide trust ESET with their endpoints, cloud, email, and mobile devices.

That number doesn't happen by accident. So why don't you check them out right now? Go to smashingsecurity.com/ESET. That's smashingsecurity.com/ESET.

And thanks to ESET for supporting the show. Duck, what have you got for us this week?

PAUL DUCKLIN

Graham, I would like to talk about what I call a BWAIN, a bug with an impressive name.

Listeners will probably remember things like Heartbleed and Lucky Thirteen and POODLE: amazing bug names that stick in the mind because somebody decided, hey, I'm so proud of myself for finding this bug and disclosing it that I don't just want to be a name in someone's bug report 3 months later.

I want a website, a special domain, a logo. And in one famous case, there was even a theme tune.

And so some bugs maybe get a notoriety or a degree of coverage that's perhaps bigger than they really deserve. Not that it's bad to know about bugs.

The problem is that if you're focusing on something which is important but not critical, what are you losing sight of? Because there's one that's caught the media's attention.

And the one I want to talk about right now is called CopyFail.

And it is all over the IT media and it's getting posted and reposted and copy-pasted all over social networks, I guess, because clicks.

GRAHAM CLULEY

So, CopyFail: how did it get the name, first of all? Because that obviously is the most important thing of all.

PAUL DUCKLIN

Well, the name is not entirely unreasonable.

It is a feature in the Linux kernel that aims to speed up certain operations by avoiding copying memory back and forth between the kernel and a program.
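The bug itself is in kernel code, but the general idea Paul describes, moving data without bouncing it through a userspace buffer, has user-visible cousins. Here is a minimal sketch using Python's `os.sendfile`, chosen by us purely as an illustration of copy avoidance, not the CopyFail code path itself: the kernel shunts bytes from a file straight into a socket, with no `read()` into the program.

```python
# Illustration only (NOT the CopyFail code): the kernel feature at issue
# avoids copying data between kernel space and user space. os.sendfile()
# shows the same zero-copy idea from userland: the kernel moves bytes
# from a file into a socket without them passing through a Python buffer.
import os
import socket
import tempfile

payload = b"hello zero-copy world"

with tempfile.NamedTemporaryFile() as src:
    src.write(payload)
    src.flush()

    a, b = socket.socketpair()  # a cheap in-process "network" pipe
    with open(src.name, "rb") as fin:
        # Kernel-side transfer: no read() into a userspace buffer needed.
        sent = os.sendfile(a.fileno(), fin.fileno(), 0, len(payload))
    a.close()

    data = b.recv(1 << 16)
    b.close()

print(sent, data)
```

On Linux, sendfile(2) performs the whole transfer in kernel space, which is exactly the kind of copying the kernel feature in question is trying to avoid.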

GRAHAM CLULEY

Oh, so it's nothing to do with the clipboard, for instance?

PAUL DUCKLIN

No, it isn't. So don't confuse it with ClickFix, which does involve the clipboard. The "copy" here is just a generic term for copying memory.

And in fact, the whole reason for this bug is trying to avoid copying things between blocks of memory as much as possible. And fail just means it didn't work.

So that's the researchers' justification for choosing the name.

My suspicion is that the nice thing about that name, rather than any other, is that firstly, there is a top-level domain, .fail. (Yes, really.)

And the domain name "copy" was not taken. And as you say, with "copy", people might think, hey, there's a clipboard involved, all of that stuff.

So I suspect that they chose the name because it's not unreasonable, but they were able to get the domain.

So I think it's a bit like, you know how US lawmakers love to give their laws fancy names that you can remember. And you think, hey, what's the chance of that?

So, you know, they have a law and say, oh, this is the Fandango Act. And then you think, Fandango, what on earth is that?

And then when you hear it, it's some weird concoction like, I don't know, I'm trying to make one up as I go along, Federal Arrangement Never to Disparage Aerospace Navigation by Geographic Obstinacy or something.

And you think—

GRAHAM CLULEY

Round of applause, round of applause for that.

PAUL DUCKLIN

TikTok. Well done. I may have missed some letters out. And you think, okay, I get the idea.

So I think in this case, they maybe figured, hey, we can get the domain name, we can get the website, we can promote this. Because, for better or for worse, the reason they want to talk up this particular bug, CopyFail, is that the principle of the bug was uncovered by a human researcher, who then used this company's AI bug-hunting tool to go looking for code examples in the kernel where this very kind of bug might have happened.

Okay. And they found several, and this was the worst of them.

So clearly, if not their primary goal, at least their major secondary goal is to talk up their bug hunting statistical analysis tool.

GRAHAM CLULEY

Fair enough. It does strike me that AI is at the very heart of every fail, isn't it? It is at the centre of the word fail. So, I mean, it's— so they found this bug, right?

And without getting into the really nerdy detail of how it works, what does it do? What is it?

PAUL DUCKLIN

That's a great question, because it gets to the heart of whether this is really as big a story as you might imagine if you just scroll through LinkedIn and see people posting that, of all the cybersecurity bugs you're going to get, this is the new disaster area.

So as you've mentioned before on the podcast, Graham, there's a thing called CVSS, which I think is Common Vulnerability Scoring System, which is a kind of mark out of 10 for how bad the bug is.

So Log4Shell, remember that about 5 years ago? That was a 10.0 critical.

Like, deal with it now because somebody could go to your website, put in their name into a form on the webpage, then add some weird characters and bingo, they just told your web server, hey, go and download this Java program and run it.

Game over. So that's pretty obviously a clear and present danger. Remember WannaCry, the EternalBlue vulnerability stolen from the NSA in the US?

Well, it had been patched for a couple of months beforehand. Lots of people didn't patch.

That virus went everywhere, spreading automatically and scrambling people's files. That only got an 8.8 High. And this bug is at 7.8 High.
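For reference, those marks out of 10 map onto named severity bands. A small sketch of the CVSS v3.1 qualitative scale, applied to the scores quoted above:

```python
# Map a CVSS v3.x base score to its qualitative severity band,
# per the FIRST CVSS v3.1 specification.
def cvss_severity(score: float) -> str:
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"

# The three scores mentioned in this discussion:
for name, score in [("Log4Shell", 10.0), ("EternalBlue", 8.8), ("CopyFail", 7.8)]:
    print(f"{name}: {score} -> {cvss_severity(score)}")
```

Which is why both EternalBlue and CopyFail land in the same "High" band despite their very different real-world impact: the number alone doesn't tell the whole story.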

So you could think, why is this considered such a big catastrophe if not merely for the fact that it's a bit of a marketing vehicle, maybe slightly dubiously, you might say, for the company that found and reported it?

So as bugs go, the worst bugs are normally the ones that are called RCE, remote code execution, like the Log4Shell one.

It means someone on the other side of the world can do something to your computer, typically by sending it a network packet or typing something in on your website, and bingo, they implant malware on your computer just like that.

They execute a program without permission remotely. Clearly, that's super dangerous, but this is not one of those.

It's not at that level that somebody can take over your web server or your cloud backup service or the webmail service you provide to your customers or your financial system or your online store.

It's not like they can use this bug to go in and just break in in the first place.

GRAHAM CLULEY

And this bug is in Linux boxes, isn't it? It's not going to impact people who have got Windows.

PAUL DUCKLIN

I think you need to be a little bit careful about saying that. Because more of the world runs on Linux, you might say, than runs on Windows these days.

So although it doesn't affect your laptop, it might affect the systems that your laptop connects to.

Which include AI servers, backup services, webmail hosts, all of that stuff, which mostly do, not all, but very many do run on Linux.

So the good news is that this is not a 10.0 critical game over. So it probably shouldn't have got quite the coverage that it did.

It is what's known as an EOP or elevation of privilege.

In other words, it's the kind of bug that if someone gets in, but they only managed to break in as Graham Cluley or guest, they can then use this second bug to basically boost their access and get admin level, or in Linux terms, root power.

So that is bad because, as we know, particularly ransomware crooks love to chain exploits together.

They break in, and they can ransomware someone's computer or servers without having root access, because usually you, the user, have access to all your files, and those are what you care about.

However, it's much more devastating if they can get admin-level access first. So you do often get remote code execution bugs chained with EOPs, elevation of privilege.

So I don't want to underplay this bug, but it doesn't mean that someone can break into your computer.

And as you say, because it's Linux-specific, it relies on you running the Linux kernel. It's not directly going to affect anybody who's got a Mac or who is running Windows.

GRAHAM CLULEY

So if I'm a sysadmin listening to this, is there anything that I should be doing about this, or is there anything which I shouldn't be doing?

Even if you're saying it's not the end of the world as we know it, there'll be many people who will think, well, I still don't want this problem residing on my systems.

PAUL DUCKLIN

No, particularly if you run a Linux server where you provide multiple users with access.

So maybe someone's got a server and they say, "Hey, Graham, would you like a login on it?" You go, "Yeah, that would be very handy.

I might run a web server." And they say to me, "Duck, would you like a login?" And I say, "Oh yes, I'd love to use it for my backups." Now, what if either of us could promote ourselves to root access and therefore read each other's files?

That would not be good. So you definitely want to stop this. Now, this bug has been patched. So it's a great reminder to everybody who runs Linux systems: patch early, patch often.

The new fixed kernel that no longer has this buffer leak is immune to this particular exploit.
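For sysadmins wanting to check where they stand, one approach is to compare the running kernel against the fixed version listed in your distro's security advisory. A hedged sketch; the `FIXED` version below is a placeholder of ours, not the real fixed kernel version for this bug, which you should look up in your vendor's advisory:

```python
# Minimal sketch: compare the running kernel version with a patched
# version. FIXED is a PLACEHOLDER, not the real fixed version for this
# bug; consult your distro's security advisory for the actual number.
import platform

def parse(ver: str) -> tuple:
    # "6.8.0-45-generic" -> (6, 8, 0): keep the leading dotted numbers.
    core = ver.split("-")[0]
    return tuple(int(p) for p in core.split(".") if p.isdigit())

FIXED = (6, 99, 0)  # placeholder, NOT taken from this episode

running = parse(platform.release())
print("running kernel:", running, "| patched:", running >= FIXED)
```

Tuple comparison gives the right ordering here because Python compares version components element by element, so (6, 8, 0) < (6, 99, 0).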

GRAHAM CLULEY

Well, we've got time right now to chat about one of our sponsors this week, Vanta.

JOE

Oh yes, my favourite.

GRAHAM CLULEY

What do they do again? They stop you running your entire security program out of a spreadsheet, Joe.

JOE

That seems aimed at me personally, Graham.

GRAHAM CLULEY

Well, it is a little bit, yes.

But you know how most companies have to prove they're secure to customers or auditors and regulators, and the whole thing involves chasing down evidence, filling in questionnaires and forms, updating the same spreadsheet cells over and over again.

JOE

It sounds utterly soul-destroying.

GRAHAM CLULEY

Yeah. Well, Vanta automates all of that. Automates it, how? Well, their trust management platform keeps a continuous eye on your systems.

It pulls everything into one place and keeps you audit ready around the clock.

So no more staring at the ceiling at 2 AM wondering whether you've got the right controls in place or whether one of your suppliers has been breached.

JOE

The stuff of nightmares.

GRAHAM CLULEY

Yeah, it would be, wouldn't it?

But this Vanta solution uses AI as well, and it's the useful kind, flagging risks, collecting evidence, slotting into the tools your team already uses so you move faster, scale without the headaches, and perhaps actually get some sleep.

Go to vanta.com/smashing to find out more.

JOE

That's vanta.com/smashing. And thanks to Vanta for supporting the show.

GRAHAM CLULEY

And welcome back. It's time now for our favourite part of the show, the part of the show that we like to call Pick of the Week.

PAUL DUCKLIN

Pick of the Week.

GRAHAM CLULEY

Pick of the Week is the part of the show where everyone chooses something they like.

Could be a funny story, a book that they've read, a TV show, a movie, a record, a podcast, a website, or an app, whatever they wish.

It doesn't have to be security related necessarily. Well, my pick of the week this week is not security related.

My pick of the week this week is a radio program which I heard, a lovely little radio documentary.

You can go and find it on BBC Sounds, and it's just 20 minutes long, and it's about a man called Arthur Hailey who came from Luton in Bedfordshire.

And do you know what Arthur Hailey's claim to fame is?

PAUL DUCKLIN

Is this a fictional story, or is it actually a sort of documentary thing?

GRAHAM CLULEY

This is a true story. Really? Yes.

PAUL DUCKLIN

I know about Bill Haley and the Comets, but I don't know about Arthur Hailey.

GRAHAM CLULEY

Well, it does hail from the sky.

PAUL DUCKLIN

Oh, well done, Graham.

GRAHAM CLULEY

That's the same time as Bill Haley and the Comets were rocking around the clock.

Because Arthur Hailey was on an aeroplane back in the 1950s. He lived in Canada at the time, although he'd been born in Luton, and on this rather long journey he wondered to himself, "I wonder what would happen if the pilots became ill?

Would I be able to land this plane?" he thought to himself. So this got him thinking, and he wrote a little play.

And in 1956, it was shown on CBC, the Canadian Broadcasting Corporation, as a TV drama called Flight Into Danger about exactly that: what happens when the pilots on a plane become sick from food poisoning or whatever, and one of the passengers has to land it, guided down by someone on the ground in air traffic control.

PAUL DUCKLIN

I've seen a film like that, but it was not a documentary, Graham.

GRAHAM CLULEY

No, what you're thinking of is the hit comedy movie Airplane!, of course, with Leslie Nielsen, as well as other films over the years. What's your vector, Victor? That one. What was it?

"Surely, you're joking," or something. Anyway, but yes.

PAUL DUCKLIN

Yes, "Don't call me Shirley." All of that, yes.

GRAHAM CLULEY

So Arthur Hailey wrote this as a straight play. Yes, I'm thinking it sounds like the kind of thing that would be quite thoughtful.

Apparently, it was an absolute sensation on Canadian television. They didn't have to have very many sets. It could be done cheaply.

And at the time, these were all broadcast live, right?

They would just act in front of the cameras, and all you needed was a set of the man down in air traffic control and over on the other side of the studio, the cockpit.

And apparently, people went crazy for this. And the BBC, they decided to do their own version, and then they got hold of the Canadian version. They said, "You know what?

This Canadian version is so good. We're going to show the Canadian version instead on BBC." And again, it was a sensation. People have said they'd never seen anything like it.

What I listened to was a documentary all about Flight Into Danger. And even though I've never seen it, I was absolutely fascinated. A lovely little documentary.

PAUL DUCKLIN

So this is the real story behind the— Yes. I think the word fatuous applies perfectly well to the film Airplane!

GRAHAM CLULEY

They interviewed people who played, for instance, the air stewardess. Wow, she's now 93 years old, and they were interviewing her about her memories of all of this.

Lovely documentary.

It also helped the career of people like Sydney Newman, who came to the British Isles and launched TV shows like The Avengers and Doctor Who after the success of this Canadian programme as well.

I'd recommend it. Go and check it out. I'll put links in the show notes. Lovely 20-minute-long documentary. I just thought, this is really great, and I wanted to share it with people.

So, there you are. That was the origin of the movie Airplane!, and it sounds like it's really good. Maybe someone recorded it. Maybe there is a version online which I can go and check out.

PAUL DUCKLIN

That would be good, wouldn't it? Although an awful lot of old film and video footage has been lost. It has, yes.

I know the BBC used to just reuse the videotapes because they were so expensive. Just record over a programme they'd made, no matter what a hit it had been.

GRAHAM CLULEY

It's an absolute cultural tragedy that so much has been destroyed.

And that's why I love things like the Internet Archive, which has been preserving old websites, keeping all this stuff because it is our culture.

You know, we don't want that to have all been erased as well.

PAUL DUCKLIN

Well, did you see lately there's an appeal from the European Broadcasting Union asking people to look around and see if they have any recordings, in any condition whatsoever, of various early Eurovision song contests?

Because apparently several years are just missing, and they're desperately trying to reconstruct the history of Eurovision from the very first one.

They're probably in an attic somewhere, but how would you ever know?

GRAHAM CLULEY

It'd be some masochist who's kept a copy of them. Duck, what's your pick of the week?

PAUL DUCKLIN

My pick of the week, well, I had a pick of the week. I've suddenly got an extra one, if that's okay.

So when you started talking about the Meta glasses fiasco, the reason I knew about Sama was because of an excellent book that I read quite some time ago, which my wife bought for me, called Code Dependent, with the subtitle How AI is Changing Our Lives, by a British journalist who's originally from India called Madhumita Murgia.

I think she's a journalist with the FT.

And she actually went to Nairobi and met with several of the people who worked on various parts of Sama: people who'd seen the terrible stuff, people who'd sat there for hours looking for traffic cones, and everyone in between.

And that's just one of the many chapters that she digs into. It comes highly recommended.

But the one that I actually declared in advance to you that I was going to mention, I will now mention. I just happened to reread this.

It's a book that came out towards the end of last century, so it's very slightly dated by now. But it's interesting to see how much of it has not dated.

And it is a book called The Code Book, subtitled The Secret History of Codes and Codebreaking, by a well-known British science popularizer and journalist called Simon Singh.

And it's an excellent review of some key stuff in cryptography, the strengths and the weaknesses.

It's entirely coincidental that I was reading this book when the whole Copy Fail thing came out, and it turned out that, ironically, a bug in some cryptographic code that was supposed to make the world a safer place ended up potentially opening up your computer to anybody who could log in as root.

That was a coincidence. But the book is full of reminders.

In fact, there's even a chapter called Le Chiffre Indéchiffrable, the Undecipherable Cipher, which is a cipher called the Vigenère cipher, which was thought to be unbeatable because it was just so clever.

And eventually, Charles Babbage, I think, was the first person who came up with a way to defeat it.

But he's not remembered for that, even though the cipher's quite hard to crack and his trick broke it very reliably.

He discovered it just before the Crimean War with Russia back in the mid-19th century.

So the theory is that he was told to keep quiet about it, and he never got any credit for cracking it.

The credit went to someone else several years later, and his cracking it was kept quiet because I suppose the British were afraid that the Russians might stop using it if they knew that it had been cracked.

So it's a fascinating book full of those stories about how what seems impenetrable at first sight can either be cracked by a bit of cleverness or sidestepped completely by a bit of manipulation, bribery, corruption, social engineering, or what have you.
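For anyone curious what that "undecipherable" cipher actually does, here is a minimal Python sketch of the repeating-key scheme Duck describes. The function name and interface are our own illustration, not anything from the book.

```python
def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    """Shift each letter of `text` by the matching letter of a repeating key.

    The key letter's position in the alphabet (A=0 .. Z=25) is the shift;
    decryption simply applies the shift in reverse. Non-letters pass through
    unchanged and do not consume a key letter.
    """
    shifts = [ord(k) - ord("A") for k in key.upper()]
    out, i = [], 0  # i tracks our position within the repeating key
    for ch in text.upper():
        if not ch.isalpha():
            out.append(ch)
            continue
        s = shifts[i % len(shifts)]
        if decrypt:
            s = -s
        out.append(chr((ord(ch) - ord("A") + s) % 26 + ord("A")))
        i += 1
    return "".join(out)

# The classic textbook example: plaintext ATTACKATDAWN, key LEMON.
print(vigenere("ATTACKATDAWN", "LEMON"))  # LXFOPVEFRNHR
```

Babbage's insight, roughly, was that repeated words in the plaintext which happen to line up with the same part of the key produce repeated runs in the ciphertext, and the spacing between those runs betrays the key length; that is the very weakness the chapter walks through.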

Well, I can recommend the book as well.

GRAHAM CLULEY

I read it many years ago, and I also read another book of Simon Singh's, Fermat's Last Theorem, as well. An interesting story.

PAUL DUCKLIN

That's particularly interesting if you live in Oxford, because not too far from me is the Sir Andrew Wiles Mathematical Building.

And Sir Andrew Wiles got his knighthood because he's the guy who finally, after centuries, proved Fermat's Last Theorem in hundreds of pages.

He made it his life's work from apparently when he was a kid, and he just focused on that. And he got there in the end.

GRAHAM CLULEY

Terrific. Well, great picks of the week. Right, well, we've got a bit of time now to have a featured interview. We're going to chat to Jake Moore, the global security advisor at ESET.

Hello, Jake. Great to have you on the show.

JAKE MOORE

Hello, Graham. Lovely to be here.

GRAHAM CLULEY

Well, you have been experimenting lately with this old deepfake nonsense, haven't you? Tell me about this face swap experiment which you did.

JAKE MOORE

Yeah, so I love deepfakes. I love to play with the latest tools that criminals might be using and test them for good, of course, so people can learn from it.

But there are so many incredible tools out there. And I saw this incredible headline that said that a company had hired a North Korean cybercriminal by accident.

And I was so intrigued by that. I thought, I wonder if that's actually possible. And so a couple of years ago, I started playing around with face swapping technology.

And it wouldn't work very well. It just looked so fake.

And so recently I've been able to do it and I thought, I wonder if I could actually fool someone for the greater good of education.

And I got through a round of interviews by multiple people as someone else. And it was a lot of fun.

GRAHAM CLULEY

Hang on, this is interesting. Okay, so who did you target when you did this?

JAKE MOORE

So I was looking around for companies that would allow me to do this, 'cause I wanted permission from the top.

And I got permission from a CEO that I know, and he said this would be a great experiment for us as well. You know, good to see how our hiring process holds up.

And so I got all of the things together, CV, I even made a passport, and all the extras that might be looked into.

And the biggest thing that I had to get around was actually the nerves. I haven't been for an interview for a long time.

And now I was able to use the software through a virtual camera, throwing myself into this Teams interview and seeing two people staring at me.

And I was so nervously looking back at them, thinking, surely they're gonna spot that I'm a deepfake.

GRAHAM CLULEY

But it continued. So are we talking about a high-end studio rig here, or is this something you could do on a gaming laptop in your bedroom?

JAKE MOORE

Yeah, you can pretty much do this on any laptop. Well, I say that; you do need a fast GPU, and the quicker the computer, the better the camera quality you get.

So in the experiment that I was testing, I found that the default camera that it came with was too good.

In fact, I had to turn down the resolution and blur the background to make it look like there could be some, say, audio sync issues and other problems as well.

Also, I had this kill switch to just kill the virtual camera and throw in this green-screen fuzziness that I'd added in the background, just to give me that get-out clause.

GRAHAM CLULEY

So weren't there any tells that a sharp-eyed interviewer might have spotted, or is it really genuinely good enough that the human eye can't catch it?

JAKE MOORE

Well, I think what we are really looking at here is just trust. People don't expect it.

A lot of people have heard of this, but I do find it tends to be in our industry that we know this is happening.

When you go to, say, HR, who are the people on the front line of this, they don't necessarily know about this yet, or at least not at this scale, or even think it's gonna happen to them.

And so you throw out a good backstory. I made up this story that I was a teacher for 14 years and I just wanted to go into the IT profession, and I'd had two years doing a bit of IT admin, and I was starting to go into sales.

The story added up, and by the time that I was chatting to them, they're probably not looking for things like, oh, the head moved slightly differently, or maybe there was an audio issue.

This is online remote interviewing. Things can go wrong, but they overlook that because effectively they believe it.

GRAHAM CLULEY

So I have to ask, did they offer you the job?

JAKE MOORE

What happened? Well, the first interview was with HR and someone from IT. I very nervously got through that, but then I did get a second interview.

So I knew that I'd fooled them. In my mind, that was the end of the experiment, but I thought, well, I'm here now. I might as well go through it. They gave me a task.

They asked me to create a presentation. I got ChatGPT to create it beautifully, poetically.

ChatGPT said, hey, if you're making a presentation for this company, do you want us to make it in their brand colors as well? I thought, yeah, what a great way of doing this.

So I made this presentation. They loved it. The next interview, it was one of the same guys and then another guy from IT. That did worry me.

Now I've got two probably very clever guys looking at me, and I thought they might even know about this. And they didn't. We were 45 minutes into the call.

We finished it all nicely, and a couple of days later I did get an email that said congratulations. This was a £38,000 job. Wow.

GRAHAM CLULEY

What happened when you told them?

JAKE MOORE

Well, I went straight to the CEO to say, you will not believe this. He was so excited to hear what happened. And I said, look, I'm gonna turn the job down.

Obviously, I'm just gonna say another job's come up, but I said I don't think we should tell them.

And he said, surely we have to; we need to get you in and tell everyone what's happened. And this is the educational piece that you were talking about.

And I said, well, I feel so bad because what if they then feel silly? What if they feel fooled into this? Anyone could fall for this.

You know, this is not a way of catching people out.

And if you think how phishing exercises have changed, some people feel very bad about it if they've clicked on a link for whatever reason.

I didn't want to use it on that level, especially as AI ironically came up in the first interview and we talked at length about how AI is everywhere.

And it was kind of funny, but at the same time, I don't want to put a lot of heat on those people. So I asked them if I could use the story. I do put it in a presentation.

I blur the faces, I change the voices, and I use it as an educational piece. I've been doing it a lot with HR people as well, just so they're aware of it.

But no, I just happily turned down the job and looked elsewhere.

GRAHAM CLULEY

You weren't tempted to carry on deepfaking yourself for the rest of your natural life?

JAKE MOORE

Funnily enough, I did go and tell my Chief Product Officer. We always chat every so often about what we're up to, and he did ask about this.

And I said, you'll never guess what I've done. And he was all ears, but he's very difficult to impress. And so I did go and tell him the story.

And at the end of it, he went, well, that's not that impressive.

All you've done is you've used some software to put someone else's face (and by the way, this is an AI face that doesn't exist) on top of yours, and you used your voice, and you've just got the gift of the gab.

You probably gave it a go. And I said, what, does that not impress you?

He said, you know what, Jake, if you really wanted to impress me, why don't you do this whole interview as a woman?

And then you'd have to change your hair, your voice, and your body as well to go with it. And I said, hold that thought.

And so for the next five months, I created another new persona and I had to start the whole process again.

GRAHAM CLULEY

And was that successful?

JAKE MOORE

It was a lot harder because by now it was January and I was applying for jobs. I tried everywhere, but it seems that everyone's applying for jobs in January, right?

I was very nervous about going for these job interviews because, even though I had tested and tested the technology, I was always worried about a live situation. As everyone knows, you don't want to be doing live demos.

And when you change your voice through software, you can't tell what they're hearing. So things like that would panic me.

But I did get a few interviews as this lady, and there's lots more to go with this story, but I don't want to finish by telling everyone how it ends, because actually I'm starting to give this talk at a lot of conferences all over the country this year.

GRAHAM CLULEY

So if anyone wants to find out what happened next, go and see him on the speaking circuit.

This is a serious problem, of course, because as you said at the beginning, we have seen North Korean IT workers infiltrating Western companies, and that's a real problem, isn't it?

I mean, what is their endgame when they're doing that?

JAKE MOORE

Yeah, so it's so creative. I do take my hat off to them. I genuinely am fascinated with crime.

I always have been, being in the police force and now with ESET looking into what criminals are using. So I'm just so fascinated with it.

It's something that was never on anyone's radar, and yet if anyone is to infiltrate a company, why not get straight into the company itself and penetrate it from within?

If they're able to be sent a laptop, there are lots of remote jobs out there.

I speak to big businesses who genuinely have this problem, who say, well, we've got contractors all around the world.

We've got to send out these laptops to them, and once they are on their laptop, they can do so much more than with their remote attacks.

So it is a huge worry, but on the flip side, there are these big companies saying, well, what can we do about it?

We can't interview them all in the UK or wherever we might have a base because they might be anywhere in the world.

GRAHAM CLULEY

So what should companies be doing about it? What are the practical things they can do?

Or are there any practical tells which they can see if they are on a call and they think it's suspicious?

JAKE MOORE

Yeah, so there have been a few viral situations where people have, say, not wanted to cover their face with their hand.

There've been some great videos like that, but it's difficult to, say, come up with one simple thing like, oh, get the interviewee to cover their face with their hand or look like they're waving in front of it, because it'll only be a year or so before that can be circumvented.

Actually, the software that I use, as soon as it sees a hand near the face, it goes back a couple of seconds and freezes to when it was just your mugshot, and it looks like you've got connection issues.

Last year when I did that, the software would fall apart and it would show the true face. So I don't like to say that that's the way of preparing for it.

So it's adding other verification methods, like speaking to someone in their country, say a third party, who can come and actually meet them.

Meeting people in real life is still so, so vital. But I know we've got HR having their own problems because it's so busy out there. They need to cut those corners where they can.

And unfortunately, that's where cybercriminals like to take advantage.

GRAHAM CLULEY

Is there more that the big video call platforms should be doing? The Zooms, the Microsoft Teams, the Google Meets, are they keeping pace?

JAKE MOORE

Yeah. So this has been a big thorn in their side for a few years. There's a company co-founded by Sam Altman (of course, he's everywhere) that has just signed a deal with Zoom and Tinder to try and help verify people. I think it's far better to go down the verification route.

This might be through a process of verifying, through your phones, who you both are on the call.

I would say that's better than actually using deepfake technology in real time, because as the technology improves, we've got this major problem where we might be able to detect something now, and then just a few months later, it'll actually come back and say, no, that's a genuine video.

We don't see this as showing any evidence of AI. And so we really need to use more of those verification tools. There are a few others that are also trying to do it.

There's nothing that I've seen for Teams at the moment, but hopefully there will be something that we can all use. But I think it's that testing phase at the moment.

But as the technology improves, we do just see this issue probably expanding.

GRAHAM CLULEY

So I've just had a thought, and I'm not saying this couldn't be circumvented, but I wonder whether those video call platforms could detect the use of a virtual camera.

Because the way in which this works, right, is you've got a webcam in front of you looking at you.

You would then have a piece of software which munges that video of you to look like the deepfaked version, which is what it then sends to the video platform, right?

If they were able to spot that their input was actually a virtual camera rather than an actual camera (again, I know this could possibly be subverted), maybe that would cut out some of this.
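To make that idea concrete, here is a rough Python sketch of the name-checking side of it. The marker list and function name are hypothetical illustrations (a real platform would inspect the driver itself, not just the label), and as the conversation notes, an attacker can rename or emulate a "real" device, so this is a weak signal at best.

```python
# Device-name substrings commonly reported by popular virtual-camera
# drivers. This list is illustrative, not exhaustive.
KNOWN_VIRTUAL_MARKERS = (
    "obs virtual camera",
    "obs-camera",
    "manycam",
    "snap camera",
    "xsplit vcam",
    "virtual",  # many drivers honestly advertise themselves this way
)

def looks_like_virtual_camera(device_name: str) -> bool:
    """Flag a capture device whose reported name matches a known virtual driver."""
    name = device_name.lower()
    return any(marker in name for marker in KNOWN_VIRTUAL_MARKERS)

print(looks_like_virtual_camera("OBS Virtual Camera"))  # True
print(looks_like_virtual_camera("FaceTime HD Camera"))  # False
```

A platform using a check like this would treat a match as a prompt for extra verification, not as proof of fraud, since plenty of legitimate users run virtual cameras for backgrounds and overlays.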

JAKE MOORE

Yeah. And I was speaking to a company only yesterday about this, and they do block virtual cameras through their own platform for that reason.

But we ended up chatting about how that can probably be circumvented, because it's just one of those extras.

It's really difficult to say, use just this one method at the moment, because if you sing too loud about one detection or security method, everyone then says, well, that must be it.

And before you know it, it's been bypassed, and then that's given the advantage to the criminals again.

GRAHAM CLULEY

So Jake, if you were a criminal (which you're not, for the record), what's the next AI-enabled scam that you'd be worried about seeing in 2026?

Is there a piece of security advice that's suddenly relevant again because of all of this, like meeting people in person or making a phone call to a number you already trust?

What is it? What are you afraid of?

JAKE MOORE

Yeah, it's sad that we are having to go back to older methods of verifying because I'm a big lover of technology as most of the listeners are going to be as well.

And we all want to use AI for efficiency and speeding up our processes.

But every time we add another tool to our wonderful technology toolkit, it can also become a way for criminals to take advantage.

So I really do think it's so powerful to use those extra platforms to verify who people are.

A good old-fashioned phone call, meeting people in real life, never hiring someone just remotely.

If you really can, try and get a third party; it'll cost you a lot less to hire that third party to go and meet up with whoever they are in whichever country they might be.

But really being able to spot something, having those spider senses of just knowing that something might be up.

The more people we can give that special tool to, the safer we actually become. And we've got so many people that still don't know about the technology.

And I think we have got a lot of the basics right, but we still haven't made a lot of people aware that the technology is rapidly moving on.

GRAHAM CLULEY

There's also this risk that we overcorrect, right?

We end up rejecting legitimate candidates for jobs, you know, people who have unusual accents you're not familiar with, or poor lighting, or cheap webcams.

Because you may begin to think, oh, well, they could be a deepfake, therefore we're not gonna take them forward.

JAKE MOORE

Yeah, but that would be where I would start to go and meet them. Yeah, it's all okay to have the first or even second interview like that.

Ironically, in one of the interviews that I went with, the very first introductory call was actually an AI avatar that I was having to speak to.

GRAHAM CLULEY

Oh, for goodness sake.

JAKE MOORE

Yeah, I was AI speaking to another AI.

GRAHAM CLULEY

I would refuse to work for them, Jake. I would refuse.

JAKE MOORE

When it first came up, I thought, well, this is weird. What's going on? Is she real? I was like, wow, they're double-bluffing me. If anything, this is actually very impressive.

They knew this was gonna be happening. But no, it was really interesting.

So their whole process was, can someone sit through an introductory call where they get to learn about the company?

At the end, I had a questionnaire to fill out, and I had to answer questions that checked whether I'd been listening through the first half-hour of the call.

I passed them 'cause I had been, and then get to go to the next interview to meet a real person.

GRAHAM CLULEY

So yeah, it's all change out there. It is. It should be said, AI can be used for defensive purposes to improve the security of your company as well. It's not all a threat, is it?

JAKE MOORE

Yeah, AI is fantastic. It's so good at being used in, say, vulnerability finders. We've been using AI and machine learning in our products at ESET for many, many years.

It's such a fantastic vulnerability finder in itself. Of course, it's going to use the latest technology. Effectively, it's fighting fire with fire. We've now got AI fighting AI.

And we'll continually use that AI technology, particularly in our ESET products, for the greater good, stopping those very clear attacks and even the very sophisticated ones before they go and harm those devices.

GRAHAM CLULEY

Well, Jake, it's been great chatting to you today and finding out all the mischief you get up to with deepfakes. I'm glad I'm not one of your colleagues being pranked by you.

I'd be terrified. If people want to find out more about ESET, they can go and check out your products and services at smashingsecurity.com/eset.

And thanks very much, Jake, for joining us on the show.

JAKE MOORE

Well, thank you very much as well, Graham.

GRAHAM CLULEY

It's been great. Terrific stuff. And that just about wraps up the show for this week. Thanks to our guest, Paul Ducklin. Thank you, Paul.

I'm sure lots of our listeners would love to find out what you're up to and follow you online. What's the best way for them to do that?

PAUL DUCKLIN

The easiest way to see who I am and what I do is to go to my website, pducklin.com, or just search for Paul Ducklin on the various social medias; I kind of feel I need to be on those places.

So if you would like a great presenter, a great writer, a great webinar creator, or a podcast editor, please get hold of me.

GRAHAM CLULEY

Ducklin without a G, I should probably point out. That's correct. And of course, we're on social media as well.

You can find me, Graham Cluley, on LinkedIn and all the other usual places. Or follow Smashing Security on Reddit, Bluesky, and Mastodon.

And don't forget to ensure you never miss another episode. Follow Smashing Security in your favorite podcast apps such as Apple Podcasts, Spotify, and Pocket Casts.

For episode show notes, sponsorship info, guest lists, and the entire back catalog of 466-odd episodes, check out smashingsecurity.com. Until next time, cheerio, bye-bye.

Bye everyone.

You've been listening to Smashing Security with me, Graham Cluley, and huge thanks, of course, to Duck for joining us this week, and to this episode's sponsors, ESET, Vanta, and Action1, and also to the following fine folks who we are raising a glass to, who include Chumbucket.

That's a gloriously unhinged name, but no further comment to make on that. Mikkel Goldschmidt sounds like a Scandinavian jeweller.

Chris Pestle, Ashley Woodhall, Johan V, keeping his surname strictly classified. MJ Erasmus, maybe they kill mice for a living. James S, another initial-only last name.

This podcast is practically a witness protection program. Someone here called Satan's Burgers, who we've got lots of questions for, starting with, can we see the menu please?

Alwin, Brian Jansen. Thank you all.

Everybody who's actually signed up for our Patreon, we really appreciate it from the bottom of our hearts and also from the bottom of our chum bucket, whatever that is.

So those are all just a few members of Smashing Security Plus, which means that they get episodes ad-free earlier than the general public, and they can have their names pulled out at random to be mocked at the end of the show.

If you'd like to join Smashing Security Plus, just head over to smashingsecurity.com/plus for all of the details. And you can become a patron.

But you can also support the show in plenty of other ways which don't cost a penny. You can like and subscribe. You can leave us a 5-star review wherever you listen as well.

You can tell your friends about the show. That's a really good one, actually. I like that. Go and tell people, go and spread the word because every little bit helps.

It makes all the effort worthwhile. Well, I hope you have enjoyed this week's show and you will be with us again for next week's show. Until then, cheerio, bye-bye.

