Here’s why ChatGPT requires a phone number to use

Getting started with ChatGPT is simple: it's free to sign up for, and you can either use your own email address or log in with your Google or Microsoft account. But you may have noticed that ChatGPT also asks for your phone number.

Given all the concern over whether ChatGPT is safe to use, that may have thrown you for a loop and raised an obvious question: why does ChatGPT need your phone number?

[Image: A MacBook Pro on a desk with ChatGPT's website showing on its display. Hatice Baran / Unsplash]

Why ChatGPT requires a phone number

Like many services you sign up for online, ChatGPT requires some identification. First comes email verification, which asks you to click a link sent to your inbox. From there, the sign-up flow asks for your name and birth date, followed by a phone number.

This works like two-factor authentication, and it's common these days, especially since it's so easy to create a disposable email address. Note that the phone number doesn't need to be unique: more than one email address can be connected to the same number. OpenAI may use phone numbers to limit account creation, but I've set up two email addresses with a single phone number without issue.

In this case, asking for a phone number is really about ensuring that you're a real person and that your account isn't easily hijacked. Note that ChatGPT doesn't require two-factor authentication every time you log in.

Is it safe to give ChatGPT your phone number?

You might be concerned about the idea of giving ChatGPT your phone number or other personal information. The service was temporarily banned in Italy, after all, over privacy concerns under Europe's GDPR (General Data Protection Regulation).

But those privacy concerns don't have to do with giving OpenAI your phone number or other personal information. The temporary ban in Italy had more to do with how the large language model was trained, which involved the collection of massive amounts of data. Regulators in Italy claimed there was no legal basis for such collection, and they also expressed concerns about the lack of age verification when signing up for ChatGPT.

As for the sign-up itself, phone numbers are required only for “security reasons,” and OpenAI's privacy policy promises not to use your personal information for other purposes.

You should never give ChatGPT personal information about yourself within your chats, however. Everything you send to ChatGPT is saved, even after you delete it from your chat history. That's why OpenAI explicitly discourages sharing personal information with the chatbot.

Can I use ChatGPT without a phone number?

Unfortunately, no. ChatGPT doesn't accept VoIP (Voice over Internet Protocol) numbers, which means you'll need an active mobile number to verify your account. The one exception is an app like Dingtone, which can generate a virtual phone number that works with accounts like ChatGPT.

There have been reports that if you're attempting to access ChatGPT from a country where it isn't available, phone number verification will block you from completing sign-up. Of course, a VPN can get you around the location restrictions, but you'll still need an active phone number to make it work.

If, for some reason, your phone number verification isn’t working, we’d recommend getting in touch with OpenAI’s support team to help resolve the issue.

Luke Larsen
Luke Larsen is the Senior editor of computing, managing all content covering laptops, monitors, PC hardware, Macs, and more.