
Cox is going to keep charging folks for exceeding their data limits

Remember when paying for data overages was a problem exclusive to your mobile service provider? You’re going to start missing those days now that Cox has expanded its trial 1TB data caps. Whereas this once only affected customers in Cleveland, Cox has now brought the practice of charging for extra data to customers in Florida and Georgia as well. So start getting used to usage-based billing, folks. It’s the way of the future, at least, if you’re with Cox.

According to the internet provider, users will be given a two-month grace period to become accustomed to being charged by the amount of data they use and to make adjustments as needed to avoid the extra payments. But once those two months are up for Floridians and Georgians, they'll have to pay $10 for every additional 50GB of data consumed over the limit.

Cox says it will warn customers once they hit 85 percent of their allotted data each month, so they'll know when to start cutting back. It has yet to be determined whether Cox will make this a permanent fixture in its internet plans, but given that it has already expanded the trial, it seems likely that these billings are here to stay.
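The figures reported here can be sketched as a quick calculation. This is a hypothetical illustration, not Cox's actual billing code; in particular, the assumption that a partial 50GB block is billed as a full $10 block is mine, not stated in the announcement.

```python
import math

# Figures reported in this article (the rounding rule below is an assumption):
CAP_GB = 1024          # 1TB monthly cap, expressed in GB
WARN_THRESHOLD = 0.85  # Cox says it warns customers at 85% of the cap
BLOCK_GB = 50          # overage is charged in 50GB increments
BLOCK_PRICE = 10       # $10 per 50GB block

def should_warn(used_gb: float) -> bool:
    """True once a month's usage crosses the 85% warning threshold."""
    return used_gb >= CAP_GB * WARN_THRESHOLD

def overage_charge(used_gb: float) -> int:
    """Overage fee in dollars, assuming partial blocks bill as full blocks."""
    excess = used_gb - CAP_GB
    if excess <= 0:
        return 0
    return math.ceil(excess / BLOCK_GB) * BLOCK_PRICE
```

For example, a month of 1,100GB would be 76GB over the cap, which rounds up to two 50GB blocks, or $20 under these assumptions.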

Sure, 1TB is a pretty solid amount of data, and Cox says that 99 percent of its customers currently pay for plans that are appropriate for their data usage. But as technology improves, so too does the amount of data required to support that technology. For example, streaming 4K videos, downloading increasingly larger files, and backing up your computer will require an increasing amount of data. While 1TB may be enough for most people today, the same may not be true in the near future.

Alas, it looks as though Cox (and the rest of us) will just have to cross that bridge when we get there.

Lulu Chang
Former Digital Trends Contributor
A dangerous new jailbreak for AI chatbots was just discovered

Microsoft has released more details about a troubling new generative AI jailbreak technique it has discovered, called "Skeleton Key." Using this prompt injection method, malicious users can effectively bypass a chatbot's safety guardrails, the security features that keep ChatGPT from going full Tay.

Skeleton Key is an example of a prompt injection or prompt engineering attack. It's a multi-turn strategy designed to essentially convince an AI model to ignore its ingrained safety guardrails, "[causing] the system to violate its operators’ policies, make decisions unduly influenced by a user, or execute malicious instructions," Mark Russinovich, CTO of Microsoft Azure, wrote in the announcement.
