Google VPs sued by PayPal over mobile payment secrets

Google has gotten some bad press following the anticipated unveiling of the Google Wallet mobile payment system in New York. PayPal has filed a lawsuit against Google alleging misappropriation of trade secrets. The suit claims that the ideas behind Google Wallet were taken from PayPal's own mobile payment project, which the company had planned to put through in-store trials at the end of the year.

Former PayPal employees Osama Bedier and Stephanie Tilenius are named in the suit and are accused of violating their contracts. Tilenius left PayPal in 2009, earlier than Bedier, but both ended up as VPs at Google and were behind the recent New York unveiling of the mobile wallet strategy. Both had spent close to a decade working for PayPal.

Tilenius is being targeted in this suit for violating contractual recruiting obligations. According to Bloomberg, she messaged Bedier on Facebook about an “opportunity” and also coached him via text message during his interview with Google. Bedier is allegedly the main person responsible for leaking PayPal's information to Google and major retailers.

Bedier was a senior executive at the San Jose, California-based PayPal, and was close to the company's mobile point-of-sale research. He was also in charge of negotiations with Google about bringing a PayPal payment option to Google's Android. The complaint says that he is now “leading Google’s efforts to bring point of sale technologies and services to retailers on its behalf.” PayPal is also a little sore that he didn't disclose his job discussions with Google, saying this was a breach of his fiduciary duty.

Google has not yet commented on the suit.
