
Toshiba Recalls 10,000 More Sony Batteries

Re-igniting concerns over Sony-manufactured batteries used in notebook computers, Toshiba has expanded its recall of battery packs Sony made for its notebook computers. Toshiba is recalling the batteries due to fire risk; the company says only 5,100 of the batteries are potentially defective, but it is recalling some 10,000 battery packs to be sure all the affected cells are exchanged. The new recall impacts models in Toshiba’s Satellite A100, Satellite A105, and Tecra A7 lines. Toshiba is replacing the battery packs free of charge.

Overall, Toshiba’s addition of 10,000 batteries to the overall recall of Sony-made notebook batteries is just a drop in the bucket: last year, computer makers and Sony itself recalled nearly 10 million notebook batteries due to a manufacturing defect which could cause the batteries to overheat and catch fire. The total recall has cost Sony over $400 million to date.

Toshiba maintains a complete list of models impacted by the battery recall, and urges customers to participate in the recall program. Customers with recalled batteries are urged to stop using them immediately and power their computers with AC adapters until replacements arrive.

Geoff Duncan
Former Digital Trends Contributor