
LaCie, WD Debut Storage at Apple Expo

Iomega isn’t the only company trotting out Mac-friendly storage products at this week’s Apple Expo in Paris: competitors LaCie and WD also want Apple fans to check out their latest offerings, a two-disk RAID array and a Mac-formatted My Book Studio Edition, respectively.

For the expo, LaCie announced it is shipping its 2big Triple, a two-disk RAID device available in 1, 1.5, and 2 TB capacities that features three connection interfaces: FireWire 800, FireWire 400, and USB 2.0. The array targets graphic design and media production professionals who need a combination of high capacity and high reliability: the drive offers a Safe RAID 1 mode, Big or Fast RAID 0 modes, and a JBOD (Just a Bunch of Disks) mode that treats each disk as a separate volume. The drives are pre-formatted for use with Mac OS X, but the driver-free device can also be used with Windows systems, and it ships with EMC’s well-respected Retrospect backup software. Expect to see it in October for $399.
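
For readers unfamiliar with the terminology, here is a minimal, hypothetical Python sketch of how those modes differ in where data lands on the two disks; the function and block layout are illustrative assumptions for explanation only, not LaCie's actual firmware or on-disk format.

```python
# Conceptual sketch only -- models where a logical block goes on two disks
# under each mode. Purely illustrative; not LaCie's implementation.

def write_block(mode: str, index: int, data: bytes, disk_a: dict, disk_b: dict) -> None:
    """Write one logical block under a given mode (hypothetical model)."""
    if mode == "SAFE":      # RAID 1: every block is mirrored to both disks
        disk_a[index] = data
        disk_b[index] = data
    elif mode == "FAST":    # RAID 0: blocks are striped alternately across the disks
        (disk_a if index % 2 == 0 else disk_b)[index // 2] = data
    elif mode == "JBOD":    # each disk is its own volume; write to one of them
        disk_a[index] = data
    else:
        raise ValueError(f"unknown mode: {mode}")

# Example: mirrored writes survive the loss of either disk, while striped
# writes split the data (and the I/O load) between the two disks.
a, b = {}, {}
write_block("SAFE", 0, b"project file chunk", a, b)
write_block("FAST", 1, b"video frame", a, b)
```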

WD is courting Mac users with a new My Book Studio Edition, which is pre-formatted with the HFS+ Journaled file system for use with Mac OS X computers and sports four interfaces: USB 2.0, FireWire 400, FireWire 800, and eSATA. The external drive is available in capacities from 320 GB to 1 TB and features a quiet, fanless design and an "elegant silver finish" intended to complement today’s Macs. The drives are available now at prices ranging from $199.99 to $399.99, depending on capacity.
