Huawei wins injunction in Motorola/Nokia Siemens deal

China’s Huawei has won a preliminary injunction that blocks technology giant Motorola from transferring particular mobile phone networking technology to Nokia Siemens in a $1.2 billion deal that’s been percolating for the better part of a year. Back in July 2010, Motorola announced a plan to sell its mobile phone networking gear business to Nokia Siemens; however, after months of back and forth, China’s Huawei sued to block the deal, claiming the sale would illegally transfer Huawei technology to Nokia Siemens. The preliminary injunction bars Motorola from transferring any confidential information to Nokia Siemens until the dispute is resolved.

Although not well known in the United States, China’s Huawei is one of the world’s largest developers and manufacturers of telecommunications gear, with nearly $30 billion in sales in 2010. Huawei claims it developed technologies widely deployed by Motorola in GSM and CDMA switching systems, as well as GSM technologies used in a number of Motorola handsets. Huawei has repeatedly indicated that it has no desire to block the Motorola/Nokia Siemens deal; it simply does not want its technology transferred to Nokia Siemens. The companies have been unable to come to an agreement on how to handle the issue.

Last July, Motorola also sued Huawei, claiming Huawei had obtained confidential Motorola information via a reseller. Huawei has denied the allegations.

Geoff Duncan
Former Digital Trends Contributor