
Amazon introduces doorstep delivery with AmazonTote

Amazon may have just introduced another way to avoid human interaction. According to The Financial Times, the online retailer has introduced a free delivery service to its Seattle customers. The service is being called AmazonTote and, from the looks of it, there are no strings attached.

The site promises that this is a “free weekly delivery service” available without the hassle or commitment of “subscriptions, minimum delivery sizes, or fees.” Everything you order before 2 a.m. arrives in a reusable cloth bag (or tote, get it?) by 10 the next morning. And in the spirit of minimal contact, you can simply leave the bag on your porch to be picked up and later refilled with new purchases.

According to VentureBeat, there are already some misconceptions about AmazonTote. Mashable reported that the online vendor is internally testing grocery delivery in Seattle and pairing AmazonTote with AmazonFresh. Perishable grocery items may not be eligible for the free delivery service, though with both features currently exclusive to Seattle neighborhoods, that remains uncertain.

Cutting out shipping costs in the company’s headquarters city should benefit both Amazon and its customers. Taking the postal-service middleman out of Seattle deliveries would save the company some money, not to mention how it could affect returns. We reported earlier this year that Amazon was looking into an online system to automatically manage its returns and exchanges in an attempt to cut down on its own financial investment in the hassle. AmazonTote simply encourages users to leave returns in the bag on their doorstep.

We’ll keep an eye on whether AmazonTote does indeed include groceries from AmazonFresh (be forewarned, those are some spendy foodstuffs: nearly $2.59 for an avocado?!) and, more importantly, whether it’s moving outside the Seattle area.
