
Sony Vaio Fit 11A batteries may go blammo, company warns users


Stop using that Vaio! No, really. Stop using it. Now.

Sony’s support team issued a warning Friday morning to users of the convertible Vaio Fit 11A, noting that batteries in the computer (provided by a third party, the company was quick to note) are liable to overheat, resulting in burns.

The company issued a statement on its eSupport site with a title that says it all: “Request to Stop Using VAIO Fit 11A/Flip Personal Computer.”

“If you have one or more of the VAIO Fit11A/Flip PC model listed below, please immediately discontinue use, shut down and unplug the PC. We are currently identifying affected PCs by serial numbers and developing a program to repair or replace the affected PCs at no charge, or to refund the purchase price for the affected PCs,” the document notes.

More: Read our review of the Vaio Fit 11A

Sony said it expects to post an updated announcement within two weeks.

But please, if you’ve got a Vaio Fit 11A with the serial number SVF11N13CXS, put down the laptop and back away slowly.

The company has not yet decided whether to issue a full-on recall of the affected computers, a move that would be unlikely to have an impact on Sony stock, explained Takashi Aoki, a fund manager at Mizuho Asset Management Co.

“The impact on Sony stock price would be limited” from a recall, Aoki said, according to Bloomberg. “Sony already decided to spin off the Vaio business.”

Sony is in the midst of a restructuring that includes the planned sale of its Vaio business to Japan Industrial Partners Inc. and the split of its TV manufacturing unit into a separate business this summer.

Laptop batteries are less stable than consumers would hope, unfortunately, and Sony has gone through this type of recall before — notably a 2006 incident that led to the recall of well over 7 million batteries. Yowza.

Jeremy Kaplan