Looking for a little bit of help from Google Helpouts? You’ll have until April 20

Even though it has been around for only a little over a year, Google’s Helpouts service is going the way of the dodo. Google announced it will shutter the service later this year.

Launched back in 2013, Helpouts was intended to get people the help they needed through qualified experts. These experts, called Providers, would offer instructional video chats, which could be recorded for viewing at a later date if both parties agreed to it. Google took steps to make sure its Providers knew their stuff, though the potential to get advice from a total hack certainly existed.

Based on Google’s language in the email, Helpouts is being shut down because not enough people used it. Even so, there were warning signs of the service’s impending doom: Google ran into monetization issues, and the company simply didn’t update the service with new features and bug fixes.

Google says you’ll have until April 20 of this year to take advantage of Helpouts. Afterward, you can download your Helpouts history through Google Takeout, the company’s data export service. Takeout for your Helpouts history is only available until November 1 of this year, though, so if you want your tutorials, you’d best download them quickly. In the future, users will just have to find out how to do stuff the old-fashioned way: on YouTube.

You can check out the full text of the email Google sent us below:

Google Helpouts
Williams Pelegrin
Former Digital Trends Contributor