
BitTorrent releases privacy-centric Bleep alpha app to the public

Are you concerned about your privacy when chatting on the Internet? If so, you’re in luck.

BitTorrent just released the alpha version of its Bleep messaging app to the public. BitTorrent claims that Bleep keeps any conversations you have with people while using it completely safe from prying eyes.


Here’s what sets Bleep apart from other chat clients. According to BitTorrent, most messaging apps rely on centralized servers to shuttle messages back and forth between people, and the company says those servers are quite susceptible to snoops.

Bleep, on the other hand, connects users peer to peer. In fact, BitTorrent claims that even it can’t see what you and your buddies talk about when using the app.


“Our big idea was to apply distributed technology to conversations,” BitTorrent says. “That means no servers required. This enables people using Bleep to make a direct, decentralized connection to someone they trust. Bleep offers the freedom to communicate without the risk of metadata being exposed.”
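To make the contrast concrete, here is a minimal, purely illustrative sketch in Python (not BitTorrent’s actual protocol; the class names, the pre-shared secret, and the toy XOR cipher are assumptions for the demo). It contrasts a centralized relay, which can log every plaintext message that passes through it, with a direct peer-to-peer exchange in which only ciphertext ever crosses the wire.

# Illustrative sketch only: a toy relay server vs. a toy end-to-end
# encrypted peer connection. The XOR "cipher" exists solely to keep the
# example self-contained; real clients use authenticated encryption.

import hashlib
from itertools import cycle


def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with a keystream derived from the key (demo only)."""
    keystream = hashlib.sha256(key).digest()
    return bytes(b ^ k for b, k in zip(plaintext, cycle(keystream)))


toy_decrypt = toy_encrypt  # XOR is its own inverse


class CentralRelay:
    """A chat server that forwards messages between users.

    Because every message passes through it, its operator (or anyone who
    compromises it) can retain both the contents and the metadata.
    """

    def __init__(self):
        self.log = []  # everything the server could keep about you

    def forward(self, sender: str, recipient: str, plaintext: bytes) -> bytes:
        self.log.append((sender, recipient, plaintext))
        return plaintext


class DirectPeer:
    """One end of a direct, end-to-end encrypted conversation."""

    def __init__(self, name: str, shared_secret: bytes):
        self.name = name
        self.shared_secret = shared_secret

    def send(self, plaintext: str) -> bytes:
        return toy_encrypt(self.shared_secret, plaintext.encode())

    def receive(self, ciphertext: bytes) -> str:
        return toy_decrypt(self.shared_secret, ciphertext).decode()


if __name__ == "__main__":
    # Centralized model: the relay sees the message in the clear.
    relay = CentralRelay()
    relay.forward("alice", "bob", b"meet at noon")
    print("relay log:", relay.log)

    # Peer-to-peer model: only ciphertext is ever on the wire.
    secret = b"key agreed out of band"
    alice, bob = DirectPeer("alice", secret), DirectPeer("bob", secret)
    wire = alice.send("meet at noon")
    print("on the wire:", wire)            # unreadable to an eavesdropper
    print("bob reads:", bob.receive(wire))

In a real client the peers would derive their keys through an authenticated key exchange rather than a pre-shared secret, but the point stands: with no server in the middle, there is no central place where conversations and their metadata pile up.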

BitTorrent Bleep includes the ability to chat via text or voice. Though Bleep was initially available only on Windows and only by invite, BitTorrent has now launched versions of the messenger for Mac and Android as well. There’s no version available for iOS yet, though.

Here are a few things you should know about Bleep in its current form. You can only send texts and make calls to people who are also online in the app. Unless you have an unlimited data plan, BitTorrent recommends switching Bleep to Wi-Fi only while it works out the kinks related to battery and data usage. BitTorrent posted a full list of known issues with the Bleep alpha here.

Since it’s still in the alpha stage, expect to come across some bugs if you plan to use Bleep.

If you’re interested in checking out Bleep, just keep in mind that prying eyes are everywhere. Don’t say we didn’t warn you.

Konrad Krawczyk
Former Digital Trends Contributor
Konrad covers desktops, laptops, tablets, sports tech and subjects in between for Digital Trends. Prior to joining DT, he…