
Documents suggest the FBI paid some Best Buy Geek Squad employees to act as informants

For some time, the Electronic Frontier Foundation (EFF) has been investigating an alleged relationship between the FBI and Best Buy’s Geek Squad repair service. According to the EFF, the FBI has been working with Geek Squad employees to gain access to incriminating information present on the PCs of Best Buy customers. Thanks to a Freedom of Information Act (FOIA) request and lawsuit, the EFF has uncovered new information it says supports its allegations.

The information implies a relationship between the FBI and employees of Best Buy’s Geek Squad that could be even “cozier” than the EFF originally suspected, according to a recent blog post by the EFF. A number of documents released under the FOIA request, which can be viewed here and here, describe a very close relationship in which the FBI allegedly paid Geek Squad staff as informants, a practice that could potentially violate PC owners’ Fourth Amendment rights.

In one example, the EFF believes that Geek Squad technicians were paid by the FBI to act as informants, and that such payments were made as part of the investigation into Dr. Mark Rettenmaier, who was charged with possession of child pornography. The image in question in that case was apparently discovered by the Geek Squad in unallocated space on Rettenmaier’s hard drive, a discovery that usually requires forensic software and wouldn’t normally result from typical data recovery work. Ultimately, the judge in Rettenmaier’s case threw out this evidence due to “false and misleading statements” by the FBI, as the Los Angeles Times reports, and the case was dismissed.

According to the EFF, the new evidence indicates that the FBI’s alleged payments to Geek Squad employees to dig into customer hard drives for incriminating evidence represent a potential Fourth Amendment violation. Some documents indicate that the FBI is notified only when illegal material is discovered as a normal part of a data recovery process; going beyond that process to search for incriminating evidence would usually require a warrant.

Apparently, FBI agents would visit the Best Buy repair facility in Kentucky to view images and other information discovered by a Geek Squad technician. If there was evidence of a crime, the FBI agent would remove the equipment and ship it to the FBI office near the customer’s location, and a warrant would sometimes then be obtained to investigate further. The important distinction is whether the Geek Squad employee discovered the incriminating evidence while performing the contracted services or, perhaps induced by an FBI bounty, went beyond those services to “actively sweep for suspicious content.”

For its part, Best Buy disputes the EFF’s allegations of a formal relationship between the company and the FBI. In a statement to ZDNet, Best Buy said:

“As a company, we have not sought or received training from law enforcement in how to search for child pornography. Our policies prohibit employees from doing anything other than what is necessary to solve the customer’s problem. In the wake of these allegations, we have redoubled our efforts to train employees on what to do — and not do — in these circumstances. We have learned that four employees may have received payment after turning over alleged child pornography to the FBI. Any decision to accept payment was in very poor judgment and inconsistent with our training and policies. Three of these employees are no longer with the company and the fourth has been reprimanded and reassigned.”

The EFF will continue to seek access to information that the FBI has so far withheld despite the EFF’s FOIA lawsuit, filed in 2017. Open questions include whether the FBI has similar relationships with other companies, and whether it has procedures or training in place governing how its agents gather information from computer repair services.

Mark Coppock