Mozy drops unlimited storage options, blames terabyte-gobbling ‘power users’

Apparently, the cloud is running out of room. Online backup service Mozy will no longer be offering unlimited storage plans for its MozyHome customers – and if you’re one of those users, it’s your own fault.

“Consumers have the capacity to generate more data than was ever possible when these unlimited plans came out. Very large users of these types of media can generate multiple terabytes. But three out of four users still fit comfortably within our 50 gigabytes,” VP of product marketing Russ Stockdale told PC Mag this morning.

Mozy’s customers have become so content-creation happy that the company will also be increasing prices. Its entry-level account will tap out at 50GB for $5.99 a month, and the bump up will get you 125GB for $9.99 a month. That 10 bucks will cover data for up to three computers, though, where it previously covered only two PCs. You can also purchase an additional 20GB a month for $2. “We’ve seen the multi-computer phenomenon on the per-household level and we increasingly see it on the per-user level. We’re trying to bring the incremental cost of [backup on] those machines down,” Stockdale said.

Demand for backup storage increased 50 percent this past year, and while Mozy claims that most users stay well within 50GB, there are “power users” crowding the cloud. These types (about 10 percent of Mozy’s customers) are storing things like high-def video files, high-res photos, and massive music catalogs. According to Stockdale, you can blame these guys: “The great majority of customers are growing at manageable levels, while the heaviest users bring up the average for the entire group.”

Existing customers can keep their current plans until their contracts are up, but for everyone else the new system starts today.

Molly McHugh
Former Digital Trends Contributor