Improving Technologies Make Impersonation Scams More Effective

Most of us have received phishing emails in our inbox and smishing messages on our phones impersonating people or companies we trust. According to the US Federal Trade Commission (FTC), consumers lost $1.1 billion to these types of social engineering scams in 2023. That is three times more than in 2020, with strong growth expected now that artificial intelligence (AI) technologies can be used to make phishing communications more convincing. 

We have often been able to spot these impersonations by noticing non-standard language in sentences, grammatical errors, or messages that don’t seem to apply to the situation. But as AI tools improve, scammers can use them to rapidly create very convincing messages that lack the tell-tale signs we’ve become accustomed to spotting. It is even possible to use so-called deepfake tools to create convincing audio and video of someone, using only short clips of the real person speaking.

Here are other clues to help identify impersonation scams:  

  • Verify the source. Check the address the email was sent from, and if a suspicious email arrives in your GatorMail inbox, double-check whether the message is flagged in red as [External Email]. On your phone, smishing messages often appear to come from an email address rather than a phone number. 
  • Check with the sender. Impersonation scams often try to give you the impression that the real person being impersonated is unavailable, which is why they need you to quickly take some action on their behalf. It doesn’t hurt to call or message the real person to verify; if they confirm they never sent the message, it was a scam. Do not hit ‘reply’ or ‘redial.’ Instead, look up the person in your contacts, or find a reliable contact for a company independently (such as calling the phone number on the back of your credit card if you get a text purportedly from your bank). 
  • Agree on a way to authenticate. It’s a good idea to agree ahead of time on a way to verify communications, such as a ‘code word’ that family members can use if they are really in trouble.  
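The first clue above, checking the address an email was sent from, can be automated in a simple way. The following is a minimal sketch in Python using only the standard library; the trusted-domain list and the sample addresses are hypothetical, chosen for illustration, and a lookalike domain is included to show how spoofed senders slip past a casual glance.

```python
from email.utils import parseaddr

# Hypothetical allow-list of domains you consider trusted.
TRUSTED_DOMAINS = {"ufl.edu"}

def sender_domain(from_header: str) -> str:
    """Extract the domain part of an email From: header."""
    _, addr = parseaddr(from_header)
    return addr.rsplit("@", 1)[-1].lower() if "@" in addr else ""

def looks_external(from_header: str) -> bool:
    """True if the sender's domain is not on the trusted list."""
    return sender_domain(from_header) not in TRUSTED_DOMAINS

print(looks_external('"IT Help Desk" <helpdesk@ufl.edu>'))         # False
print(looks_external('"IT Help Desk" <helpdesk@uf1-support.com>')) # True (lookalike domain)
```

Note that a check like this only inspects the display header, which scammers can forge; it is a first filter, not a substitute for the independent verification steps described above.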

If you are targeted by an impersonation scam, report the fraud to the FTC. This helps federal investigators stop scammers before they reach more people. 

For more information on impersonation scams, visit https://security.ufl.edu/learn-security/.

Identifying Deepfake Videos

Misleading content online becomes more sophisticated with each technological advance. One type of “fake news” becoming more prominent across all social channels is the deepfake, a video that has been modified to make the subject appear to be doing or saying something they did not.

Deepfake videos are made to fool viewers for a variety of reasons, including political agendas, financial gain, embarrassing a person or group, or blackmail. Public figures can be made to say things they never said, inciting viewers or followers to think a certain way and take action based on misinformation. A viral deepfake video supposedly of Tom Cruise has more than a million views. Here’s a breakdown by the video’s creator on how he used AI to construct the video: DeepTomCruise TikTok Breakdown.

It is possible to identify some deepfake videos by noticing changes in skin tone, jerky facial movements, or lip movements that do not match the dialogue. But as the technology improves, these clues will become even harder to spot. If you have concerns about the authenticity of a video purporting to be from UF, please contact the department posting the video or send your concern to the UFIT Help Desk.