
Gmail will be restored for everyone very soon, says Google


If you are one of the 0.02 percent of Gmail users who logged into your email account on Sunday and found that all of your mail was missing, Google says it’s “very sorry” and will have the problem fixed as soon as possible. In a blog post, Ben Treynor, Google’s vice president of engineering and site reliability ‘czar’, apologized to users affected by a bug that crept in when Google updated its storage software. The bug deleted all online copies of Gmail data Google had for as many as 30,000 users. Luckily, the search giant keeps a spare backup of all its data offline.

“To protect your information from these unusual bugs, we also back it up to tape,” said Treynor. “Since the tapes are offline, they’re protected from such software bugs. But restoring data from them also takes longer than transferring your requests to another data center, which is why it’s taken us hours to get the email back instead of milliseconds.”

The bug was first reported on Sunday, when a number of alarmed users posted on the official Gmail forums. Initially, Google claimed that about 0.29 percent of users were affected by the bug, but it has since revised that figure to 0.08 percent and finally to 0.02 percent. There are about 150 million Gmail users, so a bit of quick math tells us that roughly 30,000 people are suffering from email loss at this time.
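The back-of-envelope arithmetic checks out; a quick sketch using the article's own estimates (the user count is Google's rough figure, not an exact number):

```python
# Sanity check: 0.02 percent of roughly 150 million Gmail users.
total_users = 150_000_000          # article's estimate of the Gmail user base
affected_fraction = 0.02 / 100     # Google's final figure: 0.02 percent

affected_users = total_users * affected_fraction
print(int(affected_users))  # about 30,000 users
```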

Stay strong, affected Gmailers. If your emails aren’t already back, they’ll be arriving back in your inbox shortly.

Jeffrey Van Camp
Former Digital Trends Contributor