Scammers have been exploiting deepfake technology to impersonate job candidates during interviews for remote positions, according to the FBI.
The agency has recently seen an increase in the number of complaints about the scam, the FBI said in a public advisory on Tuesday. Fraudsters have been using both deepfakes and personal identifying information stolen from victims to dupe employers into hiring them for remote jobs.
Deepfakes involve using AI-powered programs to create realistic but phony media of a person. In the video realm, the technology can be used to swap a celebrity's face onto someone else's body. On the audio front, the programs can clone a person's voice, which can then be manipulated to say whatever the creator likes.
The technology is already being used in YouTube videos to entertaining effect. However, the FBI’s advisory shows deepfakes are also fueling identity theft schemes. "Complaints report the use of voice spoofing, or potentially voice deepfakes, during online interviews of the potential applicants," the FBI says.
The scammers have been using the technology to apply for remote or work-from-home jobs at IT companies. The FBI didn't clearly state the scammers' end goal, but the agency noted, "some reported positions include access to customer PII (personal identifying information), financial data, corporate IT databases and/or proprietary information."
Such info could help scammers steal valuable details from companies and commit other identity fraud schemes. But in some good news, the FBI says there's a way employers can detect the deepfakery. To secure the jobs, scammers have been participating in video interviews with prospective employers. However, the FBI noted that the AI-based technology can still show flaws when the scammer is speaking.
"The actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking," the agency said. "At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually."
Topics Artificial Intelligence
Copyright © 2025 Global Hot Topic Analysis