
Sony exiting PC business with a bang as VAIO Flip laptops get recalled due to fire, burn hazards

If you own or use a Sony VAIO Flip PC laptop, stop using it right now, because a U.S. government agency has just announced a recall of these computers.

The problem stems from the laptops’ lithium-ion batteries, which pose fire and burn hazards, according to a recall notice published by the U.S. Consumer Product Safety Commission (CPSC). The batteries were manufactured by Panasonic, the notice says. Here’s how the agency describes the problem:

“The computers’ lithium-ion batteries can overheat, posing fire and burn hazards. Sony is aware of four incidents, which occurred in Asia, of computers overheating, resulting in units smoking, catching on fire and melting. No injuries have been reported.”

This recall is specific to the Sony VAIO Flip PC with model number “SVF11N13CXS.” The model number appears on a black label on the underside of the screen, the notice says. The CPSC also says that these notebooks were on sale only briefly – between February and April of this year – and that only “about 680” units were sold.

The CPSC recommends that owners of Sony VAIO Flip PCs shut down and unplug these laptops immediately, then contact Sony, which will arrange a free inspection of the notebook and either a free repair or a full refund. Sony’s toll-free number is 866-702-7669. Alternatively, you can get help via the Product Support page on Sony’s official site.

Unfortunately for Sony, this isn’t the first time one of its PCs has been recalled this year. The batteries in its VAIO Fit 11A laptops also posed burn risks and were recalled as a result.

2014 will be Sony’s last year in the PC business; the company announced earlier this year that it is selling off its VAIO division.

Konrad Krawczyk
Former Digital Trends Contributor