
Zuckerberg travels to China, learns Mandarin

Facebook has somewhere between 550 million and 600 million users, but Mark Zuckerberg is seeing red. The social networking CEO is visiting China, one of the largest countries where Facebook has no presence. Shanghaiist reports that Time’s Person of the Year toured the headquarters of China’s largest search engine, Baidu. The woman you see on the right is Priscilla Chan, his girlfriend.

Zuckerberg’s interest in China is well known. “How can you connect the whole world if you leave out 1.6 billion people?” he asked in October. China is one of the fastest-growing markets in the world for, well, just about everything. Baidu has 300 million users and would be an essential ally if Zuckerberg ever hopes to run China’s top social network. Currently, Facebook is banned in China, along with many other social sites like Twitter and YouTube.

The biggest hurdle to Facebook operating in China will likely be Zuckerberg’s willingness to cater to the Chinese government and censor the site. Google had a big tussle with China last year after the Chinese government demanded that it censor its search results and covertly tried to hack its systems.

Still, it’s not all business for Zuckerberg. According to Gawker, he has been learning Mandarin for some time and plans to visit a Tibetan temple in Beijing today with Chan. Hopefully he’ll have some fun while he’s there.

How long until China caves and lets Facebook into its walled garden Internet? Can a social network like Facebook really operate under extreme censorship?

Jeffrey Van Camp
Former Digital Trends Contributor
As DT's Deputy Editor, Jeff helps oversee editorial operations at Digital Trends. Previously, he ran the site's…