Artificial intelligence is generating big returns for crooks, big losses for the rest of us
Artificial intelligence is transforming industries, but it's also giving scammers powerful new tools to steal money. Experts warn that AI-driven scams are becoming more frequent and harder to detect.
In a KUTV report, Salt Lake City business owner Terence Mills shared how his employees often receive emails that look like they're from him, saying, "There's virtually no way for them to tell it's fake."
Sgt. Jeff Plank of Utah's State Bureau of Investigation said cybercrime "is a huge problem," with money stolen by cybercriminals quadrupling in the last five years. Utah residents are especially at risk, losing more money to scams than the national average.
Cybersecurity expert Perry Carpenter described AI scams as "fraud made easy," allowing even inexperienced scammers to send thousands of convincing emails or texts. "They can now create deception that was unthinkable five years ago," he said.
Phishing season is always open
The FBI notes that phishing remains the most reported type of cybercrime, with incidents growing annually. In 2022 alone, losses reported to the FBI's Internet Crime Complaint Center topped $10 billion, underscoring the financial impact of cyber scams.
Scammers are even using AI to mimic voices, making it sound like a loved one is calling for help. "You'd think, 'I'm talking to my mom.' But you're not," Mills warned.
While AI could also be used to fight scams, Mills suggests that companies like Google or Apple might develop tools to protect users. Carpenter, however, is skeptical: "They haven't even solved email phishing yet."
Carpenter advises focusing on self-protection, reminding people that all scams share the same goal: "Money or minds."
Most frequent cyber scams
New scams are being dreamed up all the time, so no list can be exhaustive, but here are some of the most frequent examples:
Phishing Emails and Messages
- Scammers send deceptive emails or messages posing as legitimate entities like banks, employers, or government agencies.
- Victims are often directed to fake websites or asked to provide sensitive information like passwords or credit card numbers.
Malware and Ransomware
- Clicking on malicious links or downloading infected files installs malware on devices.
- Ransomware locks users out of their systems until a ransom is paid, often in cryptocurrency.
AI-Driven Scams
- Using artificial intelligence, scammers create realistic deepfake videos, voice mimics, or automated messages to impersonate trusted individuals or entities.
- These can include fake calls or emails that sound convincingly real.
Social Media and Impersonation
- Scammers impersonate friends, family, or colleagues on social platforms, asking for money or personal information.
- They may also create fake profiles to lure victims into scams.
Online Shopping Fraud
- Fake e-commerce websites advertise goods at low prices but either never deliver the products or steal payment details.
Investment and Cryptocurrency Scams
- Fraudulent investment schemes promise high returns, often involving fake cryptocurrency platforms or Ponzi schemes.
Tech Support Scams
- Victims receive unsolicited calls or pop-up warnings claiming their devices are infected, prompting them to pay for unnecessary or fake tech support.
How to protect yourself
- Verify Requests: Always double-check the source of emails, texts, or calls, especially if they request sensitive information.
- Avoid Clicking Unknown Links: Don't click on suspicious links or download unsolicited attachments.
- Use Security Tools: Install antivirus software, enable firewalls, and use secure passwords.
- Educate Yourself: Stay updated on common scam tactics and red flags.
- Report Suspicious Activity: If you suspect a scam, report it to authorities or organizations like the FTC, FBI, or Anti-Phishing Working Group.
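For readers curious what "verifying a link" can look like in practice, here is a minimal sketch of a few red-flag heuristics, written in Python. The domain list and thresholds are illustrative assumptions, not a real detection system; genuine phishing defenses use far more signals.

```python
import re
from urllib.parse import urlparse

# Illustrative list of top-level domains often seen on throwaway scam
# sites; this is an assumption for the example, not an authoritative list.
SUSPICIOUS_TLDS = {"zip", "top", "xyz"}

def looks_suspicious(url: str) -> bool:
    """Flag a URL if it trips any of a few simple phishing heuristics."""
    host = urlparse(url).hostname or ""
    # Red flag 1: a raw IP address instead of a domain name.
    if re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host):
        return True
    # Red flag 2: lookalike hosts that chain words with hyphens,
    # e.g. "paypal-secure-login-update.xyz".
    if host.count("-") >= 2:
        return True
    # Red flag 3: an uncommon top-level domain from the list above.
    if host.rsplit(".", 1)[-1] in SUSPICIOUS_TLDS:
        return True
    return False

print(looks_suspicious("http://192.168.4.7/verify"))                    # True
print(looks_suspicious("https://paypal-secure-login-update.xyz/account"))  # True
print(looks_suspicious("https://www.irs.gov/refunds"))                  # False
```

A check like this can only catch crude fakes; it says nothing about a well-crafted lookalike domain, which is why the advice above still comes first: verify the request through a channel you already trust.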
By staying vigilant and informed, individuals can significantly reduce their risk of falling victim to cyber scams.
Photo Credit: Consumer Affairs News Department Images
Posted: 2024-11-18 02:25:01