
Facebook ends relationship with Breakup Notifier app

We were not tracking the relationship between Facebook and the Breakup Notifier app, but if there were a Facebook Banned App Notifier, it would have emailed us to say “It’s Complicated.”

According to the Breakup Notifier’s official website, Facebook has blocked Breakup Notifier, a new app that lets users track the relationship status of Facebook friends. When a friend (or former lover) publicly changes his or her relationship status, the app emails you the new status. When we first covered it, we jokingly called it a stalker’s dream come true. Well, Facebook may have taken those criticisms seriously. The app amassed 3.7 million users in its first three days of release, but was unexpectedly shut down by Facebook on Wednesday.

Explaining the suspension, Facebook gave the following statement: “In this particular case, Breakup Notifier triggered one of our automated systems. We’re currently looking into the issue and have reached out to the developer.”

“I built Breakup Notifier as a bit of a joke this weekend, and it just took off. We’re working with Facebook right now to get it reinstated,” creator Dan Loewenherz, a 24-year-old computer programmer, told The Globe and Mail in an e-mail.

Though the app doesn’t appear to violate any of Facebook’s policies, perhaps its sheer creepiness will keep it off the site. Sadly, 3.7 million people are going to have to go back to the tried-and-true method of Facebook stalking…they’ll just have to keep hitting the refresh button. How cruel.

Jeffrey Van Camp