
Wal-Mart launches low-cost wireless plan; data still pricey

Despite the plethora of wireless plans already vying for our dollars, Wal-Mart has launched a new plan targeting low-to-middle-income families, the core customer base of its stores. Powered by T-Mobile and sold only at Wal-Mart, the Family Mobile service offers unlimited voice and text messaging for $45 a month for the first line. Additional lines cost $25 each.

For a family of four, that works out to $120 a month for unlimited voice and text messaging, cheaper than what T-Mobile itself charges and significantly less than comparable plans from the other major carriers.
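To put the arithmetic in one place, here is a minimal Python sketch of the monthly bill; the function name and defaults are ours, built only from the $45-first-line, $25-per-extra-line pricing above:

def family_mobile_bill(lines: int, first_line: float = 45.00,
                       extra_line: float = 25.00) -> float:
    """Monthly voice/text cost: $45 for the first line, $25 for each extra."""
    if lines < 1:
        raise ValueError("an account needs at least one line")
    return first_line + extra_line * (lines - 1)

print(family_mobile_bill(4))  # 120.0 -- the family-of-four figure quoted above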

Lowering voice costs is hardly a bad idea, since wireless plans can get pretty pricey: AT&T and Verizon each charge $99.99 a month for unlimited nationwide voice, and Sprint charges the same for unlimited voice and data. If you are looking for a family plan, the numbers climb even higher.

This is not Wal-Mart’s first foray into affordable wireless pricing. The retail superstore already stocks Straight Talk, which runs on Verizon’s network, and Common Cents, which uses Sprint Nextel’s. Straight Talk is also $45 a month, but it does not support extra lines or family plans.

The catch comes when data is added to the mix. AT&T and Verizon have $149.99 plans that offer unlimited voice and data. Wal-Mart chose the prepaid route for Family Mobile accounts, preloading a free 100MB WebPak Internet access package upon activation. The package is shared among all the lines on the account, and unused data rolls over into the next month. To get more, however, customers will have to buy additional WebPak cards at $40 a gigabyte. Ouch.
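For a sense of what that rate means in practice, here is a rough Python sketch of a month of shared WebPak data; the card denomination is our assumption (the article gives only the $40-per-gigabyte rate, not how the cards are sized):

import math

CARD_PRICE_USD = 40.00   # the $40-a-gigabyte rate quoted above
CARD_MB = 1024           # assumption: WebPak cards come in whole 1GB increments

def month_of_data(balance_mb: float, usage_mb: float) -> tuple[float, float]:
    """One month of shared data: draw usage from the prepaid balance, buy
    whole cards to cover any shortfall, and roll the remainder over."""
    shortfall = max(0.0, usage_mb - balance_mb)
    cards = math.ceil(shortfall / CARD_MB)
    new_balance = balance_mb + cards * CARD_MB - usage_mb
    return cards * CARD_PRICE_USD, new_balance

# Starting from the free 100MB preloaded at activation, a family that
# burns through 1GB in its first month is already $40 out of pocket.
spent, leftover = month_of_data(balance_mb=100, usage_mb=1024)
print(spent, round(leftover))  # 40.0 100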

Customers can sign up for Family Mobile right inside Wal-Mart and pick a phone from various manufacturers, including Samsung, Motorola, and Nokia. Phones start at $35, and customers can upgrade at any time. That’s a far cry from the $160 or so consumers often shell out for a new phone, or from being restricted in when they can upgrade. We would like to see what the phone selection is like before jumping on board, though.

Fahmida Y. Rashid