
Thumb drives and porn: How Osama bin Laden communicated from his lair

No phone and no internet might make Jack a dull boy, but Osama found a way around the disconnect: through pornography and couriers.

According to an Associated Press report, government officials have learned how Osama bin Laden conveyed email messages to his followers worldwide despite having no internet access. He would type up messages and save them to USB thumb drives. A trusted courier would then copy each message and send it out from an innocuous Internet cafe, far away from the compound. The sneakernet also worked in reverse, with the courier carrying incoming messages and other data back to Osama.

Close to 100 flash drives as well as 10 hard drives have been found in bin Laden's Pakistan hideout. The trove of data the Navy SEALs uncovered contains thousands of email messages and hundreds of email addresses, which U.S. officials hope will lead them to other al-Qaeda figures in the network. So far, the information gleaned from the thumb drives does not point to any new terror attacks against the United States.

According to Reuters and ABC, the couriers may have been bringing back more than just digital handshakes and terror plots. Apparently, the compound held an extensive cache of electronically recorded pornography, kept in a wooden box. Osama bin Laden lived in the compound with his son and two other men who served as his couriers, but U.S. officials say they have no way of knowing whom the porn belonged to.

According to an article on MSNBC, government officials are wondering whether there might be a connection between the pornography found in the possession of other al-Qaeda operatives and the porn found in Osama's compound.

The article said, "U.S. officials had pursued a probe into whether al-Qaeda was using special software that would allow the email transmission of porn photos implanted with hidden messages that could be deciphered by recipients with the right code. 'We thought this was the way that messages were being transmitted,' said one official."
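The technique being described is classic image steganography: hiding data in the least significant bits of pixel values, where the change is invisible to the naked eye. The actual software under investigation, if it existed, has never been publicly detailed, but a minimal sketch of the generic LSB approach looks something like this (using Python's Pillow library; the function names and the length-prefix framing here are illustrative assumptions, not anything from the probe):

```python
# Illustrative least-significant-bit (LSB) steganography. This is a generic
# sketch of the technique class described above, NOT a reconstruction of any
# actual al-Qaeda software, which was never publicly detailed.
from PIL import Image

def embed(image_path: str, message: str, out_path: str) -> None:
    """Hide a UTF-8 message in the low bit of each pixel's red channel."""
    img = Image.open(image_path).convert("RGB")
    pixels = list(img.getdata())
    data = message.encode("utf-8")
    # Length-prefix the payload so the extractor knows where to stop.
    payload = len(data).to_bytes(4, "big") + data
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("message too long for this image")
    for i, bit in enumerate(bits):
        r, g, b = pixels[i]
        pixels[i] = ((r & ~1) | bit, g, b)  # overwrite the low bit of red
    img.putdata(pixels)
    img.save(out_path, "PNG")  # must be lossless; JPEG would destroy the bits

def extract(image_path: str) -> str:
    """Recover a message hidden by embed()."""
    pixels = list(Image.open(image_path).convert("RGB").getdata())
    bits = [r & 1 for r, _, _ in pixels]

    def read_bytes(start: int, count: int) -> bytes:
        chunk = bits[start * 8:(start + count) * 8]
        return bytes(
            sum(bit << (7 - j) for j, bit in enumerate(chunk[k:k + 8]))
            for k in range(0, len(chunk), 8)
        )

    length = int.from_bytes(read_bytes(0, 4), "big")
    return read_bytes(4, length).decode("utf-8")
```

A recipient who knows the scheme simply runs extract() on the received photo to recover the hidden text. The detail worth noticing is why the photo must travel in a lossless format: recompressing it as a JPEG rewrites exactly the low-order bits the message lives in.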

Jeff Hughes
Former Digital Trends Contributor
I'm a SF Bay Area-based writer/ninja who loves anything geek, tech, comic, social media or gaming-related.