Virtual currency, real crime: FBI alleges theft of millions of dollars in FIFA coins

More and more games today have their own forms of in-game currency, whether it’s loot gained by killing monsters or virtual dollars earned by racking up wins in sports titles. That currency can be in high demand even if it can’t be spent in the real world, because it can determine how far a player can advance in a game and is often closely tied to bragging rights.

In-game currency can also be sold for real dollars, meaning that some titles attract real criminals. FIFA Ultimate Team is just such a game, and the FBI has been investigating the theft of millions of virtual coins, as Hot For Security reports.

One alleged perpetrator, Anthony Clark, was arrested by the FBI in Fort Worth, Texas, where property and bank accounts worth almost $3 million were seized. Clark is allegedly a member of a team of hackers who wrote an exploit that “mined” the Electronic Arts system managing the currency to “earn” FIFA coins. The team could then move those coins through black markets in China and Europe that specialize in acquiring the currency and selling it to real players.

Along with Clark, Nicholas Castellucci, Ricky Miller, and Eaton Zveare are accused by the FBI in an unsealed indictment of being members of the hacking organization RANE Development. That organization is further associated with Xbox Underground, a gang of hackers that has pilfered software and technical specifications from a number of game-related companies, including Microsoft, Epic, Activision, and Valve.

The FIFA currency hack was accomplished using a gaming console that had been modified to enable the exploit, and that console may have been acquired through the Xbox Underground organization. The theft of virtual currency can cost gaming companies significant revenue, and it can deflate the value of the legitimate virtual currency that players work to earn themselves. Stealing virtual currency is not, therefore, just a virtual crime.

Mark Coppock