LG Files DVD Patent Suit Against Quanta

South Korea’s LG Electronics has filed suit against Taiwan’s Quanta, alleging Quanta infringes on four patents related to LG’s DVD technology. LG is seeking monetary compensation as well as an injunction preventing Quanta from selling infringing products in the United States.

“LG’s proprietorship of DVD technologies is widely recognized throughout industry and the unlicensed use of our intellectual property is not acceptable under any circumstances,” said Jeong Hwan Lee, executive VP and head of LG Electronics Intellectual Property Center, in a statement.

The suit was filed in the U.S. District Court for the Western District of Wisconsin on July 3. An injunction barring Quanta from bringing allegedly infringing products into the United States could have wide-reaching consequences: Quanta is the world’s largest contract manufacturer of computer systems, building notebooks and other systems for companies like Sony, Hewlett-Packard, and Apple; it is also manufacturing the XO notebook for the One Laptop Per Child project. LG claims to hold roughly 5,000 international patents related to DVD technologies and says those patents are critical to the company’s business.

This is not the first time the South Korean electronics giant has sued Quanta over patent infringement: back in 2000, LG sued Quanta in the U.S. District Court for the Northern District of California, alleging the Taiwanese manufacturer infringed on LG patents related to personal computer technology. The district court found that Quanta had not infringed on LG patents, but the U.S. Court of Appeals overturned that decision, and a final ruling is expected in the first half of 2008.

Geoff Duncan
Former Digital Trends Contributor