Transcend Offers 32 GB ExpressCard SSD
If you’ve been considering the benefits of a flash-based solid-state drive (SSD)—no moving parts, low power requirements, fast startup and wake times—but haven’t been sure about installing one as a drop-in replacement for an existing hard drive, then Transcend Information may have the solution for you in the form of its 32 GB ExpressCard/34 SSD.

The drive connects directly to ExpressCard-equipped notebook computers and offers full support for Windows Vista and ReadyBoost (it can also be used with Mac OS X 10.4 or newer, or Linux kernel 2.4 or later). Plus, the drive comes with a USB 2.0 adapter, so users can plug it into a desktop system or other computer as a USB storage device. The drive weighs just 19 grams and can operate at temperatures all the way from freezing (32°F) to way-too-hot (158°F).

The 32 GB ExpressCard/34 SSD is priced at $509 before taxes; Transcend also offers 8 GB and 16 GB versions for $153 and $281, respectively.

Geoff Duncan
Former Digital Trends Contributor