The FBI offers tips for combating AI-powered fraud attempts

The FBI warns that fraudsters are increasingly using artificial intelligence to improve the quality and effectiveness of their online scams, ranging from romance and investment scams to recruitment schemes.

“The FBI is warning the public that criminals are using generative artificial intelligence (AI) to commit fraud on a larger scale, increasing the credibility of their schemes,” the PSA says.

“Generative AI reduces the amount of time and effort criminals have to spend to deceive their targets.”

The PSA presents several examples of AI-powered fraud campaigns and lists the topics and lures scammers commonly use, with the goal of raising public awareness.

The agency has also provided advice on how to identify and prevent these scams.

Common schemes

Generative AI tools are perfectly legal and help people create content. However, the FBI warns they can be misused to facilitate crimes such as fraud and extortion.

This potentially malicious activity spans AI-generated text, images, audio, cloned voices, and videos.

Some of the most common schemes the agency has recently uncovered include:

  1. Using AI-generated text, images and videos to create realistic social media profiles for social engineering, spear phishing, romance scams and investment fraud schemes.
  2. Using AI-generated videos, images, and text to impersonate law enforcement, executives, or other authority figures in real-time communications and solicit payments or information.
  3. AI-generated text, images and videos are used in promotional materials and websites to recruit victims into fraudulent investment schemes, including cryptocurrency scams.
  4. Creating fake pornographic images or videos of victims or public figures to extort money.
  5. Creating realistic images or videos of natural disasters or conflicts to raise funds for fake charities.

Artificial intelligence has been used extensively for over a year to create cryptocurrency scams that include deepfake videos of popular celebrities like Elon Musk.

Deepfake crypto scam on TikTok
Source: BleepingComputer

Recently, Google Mandiant reported that North Korean IT workers have used artificial intelligence to create personas and images that make them appear to be non-North Korean nationals, helping them gain employment at organizations around the world.

Once hired, these individuals are used to generate revenue for the North Korean regime, conduct cyber espionage, or even attempt to deploy information-stealing malware on corporate networks.

The FBI’s advice

Although generative AI tools can increase the credibility of scams to a level that makes them very difficult to distinguish from reality, the FBI still suggests some measures that can be helpful in most situations.

These are summarized as follows:

  • Create a secret word or phrase with the family to verify identity.
  • Look for subtle imperfections in images/videos (e.g. distorted hands, irregular faces, strange shadows, or unrealistic movements).
  • Listen for unnatural tone or word choices during calls to detect AI-generated voice cloning.
  • Limit public content featuring your image or voice; make social media accounts private and restrict followers to trusted people.
  • Screen callers by hanging up, researching the organization they claim to represent, and calling back at an official number.
  • Never share confidential information with strangers online or over the phone.
  • Avoid sending money, gift cards, or cryptocurrencies to unverified people.

If you suspect that you are being contacted by scammers or have fallen victim to a fraud scheme, it is recommended that you report it to the FBI's Internet Crime Complaint Center (IC3).

When submitting your report, include all information about the person who contacted you, as well as financial transactions and interaction details.
