Walmart begins testing online grocery delivery service

It may be one of the ideas that nearly killed Priceline.com, but Walmart thinks it can get you to grocery shop on the Web. The worldwide retailer has begun testing grocery delivery in California in response to Amazon’s recent tests, the NY Times reports. Calling its service “Walmart To Go,” the company now offers those in the San Jose area home delivery of groceries and other items.

Walmart’s new plan is likely a response to Amazon’s Seattle testing of AmazonTote, a new delivery service that, if combined with Amazon’s other Seattle project, AmazonFresh, would deliver groceries and other items to customers’ doorsteps in a reusable “tote” bag. Customers can leave the bag out and have it refilled the next time an order is fulfilled.

The grocery service started on Saturday. Curiously, Walmart To Go also appears to use tote bags. Groceries are delivered the next day in temperature-controlled trucks. However, portion options for fresh items are currently limited: you can’t, for example, order two oranges; you have to order an entire bag. Still, it’s an interesting idea, and a natural step up from the retailer’s “Pick Up Today” service, which lets customers order items online and pick them up, bundled and bagged, at the store.

But is there any money in this? It costs a lot to truck merchandise around, and the companies that have tried grocery delivery before haven’t been terribly successful at it outside of select areas. There’s a reason the milkman disappeared in this country: people like going to the store and buying groceries. We’ll have to see how well they enjoy having them delivered.

More on Walmart and Amazon’s crazy produce plans as they develop.

Jeffrey Van Camp