
AT&T announces 4G USB modem for free under contract

Completely free under contract, the USBConnect Force 4G modem connects to your home computer via a USB port and provides 4G download speeds from AT&T. Without a contract, the device retails for $89.99. Under the two-year contract agreement, customers pay $50 a month for 5GB of data, and each gigabyte over that cap adds an extra $10 to the month's bill. Without the agreement, prepaid options include a day of access (100MB) for $15, seven days of access (300MB) for $30, or 30 days of access (1GB) for $50. Access runs out when the customer reaches either the data or the time limit. Other fees include a $36 activation fee and a contract cancellation fee of up to $325.
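For a sense of how that contract pricing adds up, here is a minimal Python sketch. The dollar figures come from the article above, but the contract_monthly_cost function, the round-up treatment of partial gigabytes, and the prepaid table are illustrative assumptions rather than anything AT&T publishes.

```python
import math

# Rough illustration of the contract pricing described above; the numbers come
# from the article, but this helper is hypothetical, not an AT&T tool.

def contract_monthly_cost(gb_used):
    """$50 for 5GB on the two-year contract, plus $10 per extra GB (assumed rounded up)."""
    base, cap, overage_rate = 50.0, 5.0, 10.0
    overage_gb = max(0.0, gb_used - cap)
    return base + math.ceil(overage_gb) * overage_rate

# Prepaid passes: price, data allowance (GB), duration (days); access ends at
# whichever limit is reached first.
PREPAID_PASSES = {
    "1 day / 100MB":  (15.0, 0.1, 1),
    "7 days / 300MB": (30.0, 0.3, 7),
    "30 days / 1GB":  (50.0, 1.0, 30),
}

if __name__ == "__main__":
    for usage in (3, 5, 7):
        print(f"{usage}GB on contract: ${contract_monthly_cost(usage):.2f}")
    # e.g. 7GB -> $50 + 2 x $10 = $70 (the one-time $36 activation fee not included)
```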

Beyond 4G speeds, the modem has a microSD memory card slot for expanded storage of up to 32GB, and it provides free access to all 24,000 AT&T Hot Spots. The device uses AT&T’s HSPA/HSPA+ network for 4G speeds and falls back to EDGE/GPRS when slower connections are all that’s available. The 4G USB modem will be available on July 17 in AT&T stores nationwide.

The USBConnect Force 4G modem enters a competitive mobile-internet market. For instance, Sprint offers unlimited 4G access with the purchase of its U600 USB 3G/4G modem; that plan costs $59.99 a month and caps 3G data at 5GB per month. Verizon Wireless offers the 4G LTE USB Modem 551L for an under-contract price of $19.99, with data plans of 5GB for $50 a month or 10GB for $80 a month. Verizon also charges an additional $10 per gigabyte of overage each month, but both plans include free access to Verizon’s Wi-Fi hotspots. Check out our complete guide to 4G service for more information.
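To see where Verizon's two tiers break even, here's a small sketch along the same lines. Only the prices are taken from the figures above; the plan_cost helper and the round-up assumption for partial gigabytes are illustrative.

```python
import math

# Illustrative comparison of the two Verizon data plans described above;
# the helper is hypothetical, not part of any carrier tool.

def plan_cost(gb_used, base_price, included_gb, overage_per_gb=10.0):
    """Monthly cost for a capped plan with per-GB overage (partial GB rounded up)."""
    overage_gb = max(0.0, gb_used - included_gb)
    return base_price + math.ceil(overage_gb) * overage_per_gb

if __name__ == "__main__":
    for usage in (5, 8, 12):
        five_gb = plan_cost(usage, 50.0, 5.0)
        ten_gb = plan_cost(usage, 80.0, 10.0)
        print(f"{usage}GB used: 5GB plan ${five_gb:.0f}, 10GB plan ${ten_gb:.0f}")
    # At 8GB the two plans cost the same ($80); beyond that the 10GB plan is cheaper.
```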

Mike Flacy