Deepfake Coworkers? Cybercriminals Are Infiltrating Work Calls—and You Might Be Letting Them In

By James Smith, Head of Offensive Security at Bridewell

The era of remote and hybrid work has led to a startling increase in video conferencing. Whether you’re hopping on Zoom, Teams, or Google Meet, it’s easy to assume you know who’s on the other end of the call.  

However, a growing number of cybercriminals are exploiting this trust by sneaking into work meetings, impersonating colleagues, and using deepfake technology to blend in. In a recent survey, 74 percent of Critical National Infrastructure organisations reported that deepfakes are a significant concern. 

The rise of the ‘fake colleague’ 

The rise of generative AI has transformed the way people work and create, but it doesn’t come without risk. 

In one of the most significant reported losses, an employee at a multinational firm was duped into wiring $25 million to cybercriminals. The employee attended a video call on which everyone looked and sounded like familiar coworkers and bosses. In reality, every other participant was a deepfake: an AI-generated imitation of a real person, used to manipulate the employee into making the transfer. 

This wasn’t an isolated incident, either. Recently, Microsoft confirmed that Russian hackers had posed as IT staff on Teams calls to gain access to company systems. Employees thought they were talking to internal tech support. Instead, they unknowingly helped install malware. 

If it sounds like something from a sci-fi movie, that’s because, until recently, it was. But rapidly advancing, widely available technology is making it possible, and employees are proving susceptible to it. 

Why this matters to you 

When a scam like this hits, the damage isn’t just financial; it can also come back on you. If you were the one who let the scammer in, accidentally shared sensitive data, or approved a fraudulent request, you could be held accountable, even if you didn’t realise what was happening. 

Cybercriminals might be targeting your employer, but you may be the entry point. A scam that starts in your video call can end in a data breach, a ransomware attack, or a PR disaster.  

How to protect your employer (and your career) 

You don’t need to be a cybersecurity expert to stay safe. A few small changes in how you handle video calls can make a big difference: 

  1. Slow down and verify. 
    The first rule for avoiding a scam is to slow down. Cybercriminals use social engineering to create a sense of urgency and pressure you into making decisions quickly. If someone on a call is asking you to transfer money, reset passwords, or share sensitive details, don’t rush. Instead, hang up and contact them through a different channel, such as a phone call or Slack, to confirm it’s really them. 
  2. Stay alert to unusual behaviour. 
    Deepfakes can be convincing, but they’re not perfect. If a co-worker’s voice sounds slightly off or their camera seems strangely blurry, take note. Other signs that something is amiss include unnatural blinking or speech that is out of sync with lip movements.
  3. Discuss AI policies with your colleagues. 
    AI can be useful for a myriad of tasks, but it’s essential for workplaces to have detailed guidelines on its use. Zoom, for example, has announced plans to roll out custom AI avatars that can attend work meetings as a realistic clone of you. Advances like this could be dangerous if colleagues aren’t fully aware they’re being used.  
  4. Don’t let just anyone into the room. 
    If you’re hosting a meeting, enable waiting rooms or lobbies so you can approve who joins. If you’ve been invited to a meeting, double-check the invite to make sure it comes from someone you know.  
  5. Question ‘IT’ help that seems off. 
    If someone appears in a meeting claiming to be from IT and starts asking you to install software or grant them access, be cautious. Check with your IT department to confirm its official procedure for making changes to your device before you act.