
Sony’s USB Micro Vault Tiny Drives Ship


Blink and you might lose it: Sony Electronics has begun shipping its Micro Vault Tiny “byte-sized” USB 2.0 flash drives in capacities from 256 MB up to 4 GB. And when they say “tiny,” they seem to mean it: the little darlings measure just one inch long, roughly one-half inch wide, and as thick as a quarter. But they’re available in “striking” colors, so maybe you’ll spend less time on all fours under your desk wondering where the Micro Vault might have landed.

“In line with our Micro Vault concept, the Tiny is designed to deliver a user-friendly storage solution that appeals to fashion-forward consumers ranging from trend-setting students to style-conscious professionals,” said Mike Lucas, director of marketing for Sony Electronics’ Media and Application Solutions Division.

The Micro Vault Tiny is available now in 256 MB to 2 GB flavors, with a 4 GB unit expected to reach market later this year. Retail prices line up with the Micro Vault Classic line of USB storage devices, ranging from $29.99 to $199.99 (that top price will be for the 4 GB model; the 2 GB edition is $109.99).

The Tinies also come with a clip-on carrying case that can be attached to a phone case, purse strap, key chain, or other item as a fashion accessory (perhaps an accessory that screams “Steal my data! Please! But it’s Tiny, so you might have trouble finding it!”). The drives are also equipped with Virtual Expander, a Windows utility that aims to effectively increase the Micro Vault’s storage capacity through software compression. Sony says the Micro Vault Tinies are compatible with Windows 98/2000/ME/XP as well as Mac OS 9 and higher, but we believe Virtual Expander is Windows 2000/XP-only.
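
Sony hasn’t published how Virtual Expander works internally, but utilities of this kind generally follow the same pattern: compress data on its way to the drive and decompress it on the way back, so the same physical capacity holds more data. As a rough sketch of that write-side idea only (not Sony’s actual tool), here’s a minimal Python example; the file name and mount path are hypothetical.

```python
# Rough illustration: compress files on their way to the drive so the same
# physical capacity holds more data. All paths below are hypothetical.
import gzip
import os
import shutil

def copy_compressed(src_path: str, drive_dir: str) -> None:
    """Gzip-compress src_path onto the drive and report the space saved."""
    dest_path = os.path.join(drive_dir, os.path.basename(src_path) + ".gz")
    with open(src_path, "rb") as src, gzip.open(dest_path, "wb") as dest:
        shutil.copyfileobj(src, dest)
    original = os.path.getsize(src_path)
    stored = os.path.getsize(dest_path)
    print(f"{src_path}: {original} -> {stored} bytes "
          f"({stored / original:.0%} of original size)")

if __name__ == "__main__":
    # Hypothetical example: a local document copied to a mounted Micro Vault.
    copy_compressed("report.txt", "/Volumes/MICROVAULT")
```

How much “extra” capacity an approach like this buys depends entirely on the content: plain text and documents compress well, while already-compressed formats such as JPEG or MP3 barely shrink at all.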

Geoff Duncan