PayPal terminates partnership with Alibaba


Online payment giant PayPal has announced it will be ending its year-long partnership with AliExpress, the wholesale goods service of China’s Alibaba that is aimed primarily at businesses and corporations. PayPal has given no reason for ending the relationship, but industry reports suggest the company was unhappy with the amount of goods AliExpress was retailing directly to consumers rather than to businesses.

“PayPal remains deeply committed to China by providing Chinese merchants access to a safer and easier way to sell to millions of our users all over the world,” wrote PayPal Asia Pacific’s director of communications Dickson Seow, in a statement. “You’ll continue to see new initiatives, products, and services from PayPal in China that help Chinese merchants grow their businesses by selling goods and services to our global buyers.”

PayPal says its overall payment volume in “Greater China” exceeded $4.4 billion during 2010, a 44 percent increase over 2009.

AliExpress is part of Alibaba.com, which is in turn part of the Alibaba Group. American Internet giant Yahoo has a 43 percent stake in the Alibaba Group, which has simultaneously become one of the company’s most valuable assets and a source of friction. Currently, Alibaba and Yahoo are working to resolve a dispute over transitioning Alibaba’s own online payment service, AliPay, into a separate corporate entity licensed to handle online payments in China. Yahoo claims it was caught by surprise when Alibaba spun off AliPay, and is working with Alibaba to ensure it is “appropriately compensated.”

PayPal announced its partnership with AliExpress a little over a year ago.

Geoff Duncan
Former Digital Trends Contributor