
EBay Hit with $3.8 Bln Online Payments Suit

Online auction giant eBay just can’t seem to catch a break: XPRT Venture has filed a lawsuit in Delaware seeking a whopping $3.8 billion in damages, claiming the company misappropriated trade secrets drawn from six XPRT patents and built them into its own payment systems. The suit targets PayPal, Bill Me Later, StubHub, and Shopping.com, along with parent company eBay.

The dispute dates all the way back to September 2001, when the patents’ inventors, George Likourezos and Michael A. Scaturro, shared the patent-pending information with eBay under assurances of confidentiality. The suit alleges eBay then broke that agreement and used techniques covered in the patents in its own online payment systems, and that rather than acknowledge XPRT’s prior art, eBay went ahead and applied for patents of its own.

“XPRT claims that eBay’s upper management knew, or should have known, that the unauthorized use of Inventors’ confidential and proprietary material ran the risk of patent infringement if XPRT’s patent applications issued as patents,” said Kelley Drye & Warren, the law firm representing XPRT, in a statement.

The suit contains nine counts: infringement claims on six XPRT patents, along with claims of conversion, misappropriation of trade secrets, and unjust enrichment.

The lawsuit lands at a particularly awkward time for Meg Whitman, who was eBay’s CEO at the time of the alleged infringement. Whitman is currently mounting a campaign for governor of California.
