
Malware cleverly weaponizes Discord to steal game currency from Roblox players

A flaw in Discord’s application programming interface (API) has allowed nefarious individuals to steal login credentials for Roblox, one of the first MMOs to support VR. From there, the Robux in-game currency can be funneled into a separate account and cashed out. Although exploitable only through traditional phishing tactics, the flaw does raise concerns about integrating popular applications with games that have real-money stores.

Discord is a popular chat application often used by gamers, handling group VoIP (Voice over Internet Protocol) conversations and various other social functions. While popular with Roblox players, it has proven problematic of late because its API allows user-created code and applications to act through it. As a result, malware on an infected system can use the tool to steal Roblox account information in a money-making scheme.

The attack first requires that a system be infected with malware. Trend Micro found one such program masquerading as a cheat app on a forum. That malware sinks its hooks into Discord and waits for the user to play Roblox. When they do, it steals their account cookie and sends that file over Discord to a specified channel.
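What makes Discord attractive as an exfiltration channel is how simple its documented webhook endpoint is: any process that knows a webhook URL can post a message to the associated channel with a single JSON POST request. A minimal sketch of that mechanism (the webhook URL below is a hypothetical placeholder, not a real endpoint, and the message is plain text rather than a stolen cookie):

```python
import json
from urllib import request


def build_discord_webhook_post(webhook_url: str, content: str) -> request.Request:
    """Build a POST request matching Discord's webhook message schema:
    a JSON body with a "content" field, sent to the webhook URL."""
    payload = json.dumps({"content": content}).encode("utf-8")
    return request.Request(
        webhook_url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Placeholder URL -- a real webhook looks like
# https://discord.com/api/webhooks/<id>/<token>
req = build_discord_webhook_post(
    "https://discord.com/api/webhooks/ID/TOKEN", "hello channel"
)
# Calling request.urlopen(req) would deliver the message to the channel.
```

Because the traffic is ordinary HTTPS to discord.com, such posts blend in with legitimate chat activity, which is part of why malware authors favor the route.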

Those behind the attack can use that cookie to log in to the victim’s Roblox account and drain all of their in-game Robux, which can then be taken out of the game and turned into actual cash.

Some variants of the malware even steal login details persistently, making it difficult to contain the damage with a password change. The best way to prevent such attacks in the first place is to be very wary of unofficial applications that claim to help you cheat in multiplayer games. Beyond being unfair to other players, they put your own system at risk of infection.

Trend Micro also recommends running a reputable anti-malware program and keeping it, along with your operating system, up to date. Be careful about sharing credentials online as well, though in this case the malware steals them automatically.

It’s important not to trust any chat app too much. As Trend Micro’s other research shows, hackers have increasingly leveraged the APIs of many VoIP platforms in recent years as their usage has grown.

Jon Martindale