Beware: many ChatGPT extensions and apps could be malware

ChatGPT fever has overtaken the internet, and rightly so since it’s such a powerful new tool. Unfortunately, the most sought-after content is often fertile ground for hackers and scammers.

In a recent video, cybersecurity-focused YouTuber John Hammond warned that many ChatGPT extensions and apps could contain malware. It’s a valid point, and we should all use caution when installing desktop browser add-ons and mobile apps.

OpenAI's ChatGPT blog post is open on a computer monitor, taken from a high angle.
Photo by Alan Truly

When you visit a website such as ChatGPT directly, you know who has access to the information you provide. OpenAI is a known quantity that most people respect, even if some are concerned about how quickly OpenAI is pushing its updates out to the public.

The privacy policies of browser extensions and apps vary dramatically, however. Even more alarming, regardless of the privacy claims, you might not recognize the developer or know whether they are trustworthy. It’s easy to claim your information will not be shared or sold, but who will enforce that policy?

Hammond notes that the risk goes deeper than the information you voluntarily provide to the extension or app. Hackers have ways of bypassing security features, particularly through software you've installed on your device.

Citing a recent Guardio report on a fake ChatGPT Chrome extension, Hammond explains that the extension contained malware that used a backdoor to access Facebook account information.

By stealing numerous Facebook accounts, the malware created bots that made advertisements promoting the extension. The ads drove traffic to the extension, generating more bots that posted more ads.

The goal of the self-replicating malware was to collect user information to sell on the dark web. Google took down the extension, but another soon popped up, and the battle against malware is seemingly unending.

Most ChatGPT Extensions Are Just Malware

The critical takeaway from the video is to be careful with every browser extension you install. Any software that resides on your computer has greater access than a webpage. The same is true of mobile apps.

If you want to use ChatGPT, you can do so from OpenAI’s website. GPT-4, the OpenAI technology behind ChatGPT, powers Bing Chat and is available in a tab on any Bing Search. Bing Chat is also available on your phone via the Bing app or Edge browser.

If you still want to use a browser extension or app that adds extra capabilities or makes the advanced AI more convenient, proceed with caution. Check the privacy policy, read reviews, and learn more about the developer before trusting that your data will be secure and private.
