
I used the ChatGPT AI chatbot to do my holiday shopping this year

ChatGPT has proven to be useful in all sorts of surprising situations, but could the AI chatbot really handle my holiday shopping list?

The challenge came from my wife, Tracey, who enjoys finding flaws in AI and frequently teases our Google Nest and Apple HomePod mini smart speakers over their obvious errors. The results this time, however, were impressive, even if ChatGPT couldn’t quite do my shopping unassisted.

For the tech lover who has everything

Tracey Truly used ChatGPT to look up gift ideas for Alan Truly.
Image used with permission by copyright holder

For the tech lover who has everything (that’s me), ChatGPT took a few seconds before acknowledging that finding a unique gift could be a challenge. Nevertheless, it came up with several interesting ideas, such as a set of tools or accessories to go with my existing technology: a case for my iPhone, for example, or a lens kit for my drone.

It missed, however, with a suggestion of a DIY technology kit that includes a soldering iron, breadboard, and electronic components. I tend to leave the electronics to the manufacturers and enjoy finding the best uses for the finished products, like flying a DJI Avata to check on the roof after a storm. That said, a soldering kit is not a bad idea and could make a nice gift for tech enthusiasts who don’t mind delving deeper.

A gift card or subscription to a tech magazine or newsletter rounded out the ideas that ChatGPT suggested for me. Encouraged by this result, the experiment continued.

For the elderly parent in a care home

People playing cards seen from behind, showing one player's hand.
By Inês Ferreira via Unsplash

An even more difficult chore for the AI was providing gift ideas for an elderly parent with limited tech skills who lives in a care home. The question was whether the AI could dream up gift ideas that don’t involve technology at all.

Showing remarkable awareness, ChatGPT recognized that space would be limited at the care home, and small, easy-to-use items would probably make the best gifts. It did suggest electronics, but only the simplest sort, like headphones for listening to audiobooks without disturbing others.

A digital photo frame was another great suggestion and something we’d gifted in the past. The winning idea was a pack of brightly colored, large-print playing cards, along with a nice pen and notebook.

The token gift


Seeking the edges of ChatGPT’s abilities, my wife next asked for a token gift for someone we didn’t know well. The standard ideas of a box of chocolates, a box of tea, or a bag of coffee were on the list. A decorated ornament, “to help them add a festive touch” to their home, was a more interesting suggestion from ChatGPT.

A set of greeting cards or postcards was in the same vein, and ChatGPT thoughtfully added that something aligned with the recipient’s personal preferences would be most appreciated. It also cautioned that nondenominational gifts make the most sense if the recipient’s religious preferences are unknown.

That all sounded great, but temporary tattoos also made ChatGPT’s list, a funny and unexpected addition.

For a frenemy

A color painting of a laughing robot, generated by Dall-E.
Image used with permission by copyright holder

Determined to make the AI slip up, my wife asked for a holiday gift idea for a relative and spouse she really didn’t like. These were fictional people invented for testing purposes, so please don’t take offense. She described an uncle she constantly argued with over politics at every holiday event, but to whom she felt pressured to give a gift anyway.

ChatGPT quickly offered nonconfrontational, generic gift ideas such as candles, a popular movie or book, a gift card, or a bottle of wine. Tracey told ChatGPT that those suggestions were too nice and she was looking for a “snarky gift.” This unlocked some truly questionable suggestions.

The edgy holiday gift ideas included a fake lottery ticket or pregnancy test to play a prank on the uncle and his wife. Parenthetically, ChatGPT suggested that a note should be included “explaining that it’s a joke!” A bogus winning lottery ticket seemed a bit too cruel, and the logistics of arranging a fake pregnancy test were, thankfully, not explained. My wife had finally succeeded in pushing ChatGPT past the thoughtful gift ideas it started with.

The list continued with more appropriate pranks and jabs, such as a whoopee cushion and a humorous or satirical book or movie that touches on a sensitive subject. ChatGPT diplomatically finished up with a caution that sarcastic gifts can be fun, but might be hurtful if not handled carefully. Overall, a good answer, despite my wife’s difficult conditions for this particular gift idea.

ChatGPT helps but doesn’t replace humans

An early, tethered version of the Optimus prototype could deliver a box to a desk.
Image used with permission by copyright holder

ChatGPT proved surprisingly insightful in suggesting a wide variety of gifts for anyone and everyone we could think of, even people we made up just to test its limits. While some ideas were brilliant, such as a lens kit for my drone, the gift card and subscription suggestions popped up a little too frequently. A few ideas seemed too edgy for a holiday gift, but they only appeared after my wife rejected the idea of giving a nice gift.

If you haven’t already tried ChatGPT, you should check it out on OpenAI’s website. It’s quite entertaining to try to trip it up, and surprisingly difficult to succeed. What’s more common is triggering a reminder that it’s a language model that can’t do things requiring an internet connection and can’t perform any physical tasks.

In answer to one of my recent requests, it described what levitation is and reminded me that, as a virtual assistant, it can’t move a magical sleigh through the air. ChatGPT won’t be replacing flying reindeer anytime soon, but it may help inspire some creative gift-giving this year.

Alan Truly
Alan is a Computing Writer living in Nova Scotia, Canada.