
Parents call their baby ‘Like’ after Facebook button

When parents come to name new additions to their family, many choose the name of a grandparent, or perhaps buy a book of names and spend time reading through it, taking note of the various meanings behind each one. Others turn to the Internet, where countless websites list hundreds of thousands of suggestions – like John or Nancy.

According to a BBC report translated from Israeli newspaper Maariv, Lior and Vardit Adler, who live in Hod Hasharon in Israel, also turned to the Internet for inspiration, but certainly not in the usual way. The Facebook fans have named their third child Like, after the button on the social networking site that allows users to let others know about things they recommend or enjoy – or like. “So is that a boy’s name or a girl’s name?” you may ask. Apparently, in this case at least, it’s a girl’s name.

The report says that parents Lior and Vardit wanted a name that was “modern and innovative.” They certainly appear to have succeeded with that wish. Like’s two sisters also have modern and innovative names – one is called Dvash (Hebrew for honey), and the other, Pie (as in steak and kidney – her parents like cooking, you see).

You won’t be surprised to learn that when Like was born, one of the first things her father did was to announce her arrival on Facebook. “When I posted her picture and name on Facebook, I got 40 ‘likes’,” he said. “Considering that I have only a little more than 100 friends on the network that’s a lot.”

This isn’t the first time the social networking site has been used to name a child. Back in February, a man in Egypt named his child Facebook in honor of the site’s role in the revolution that took place there.

We’re guessing it’s only a matter of time before some YouTube fans name their new baby Upload.

Trevor Mogg
Contributing Editor
Not so many moons ago, Trevor moved from one tea-loving island nation that drives on the left (Britain) to another (Japan)…