The world’s introduction to Google Duplex — technology both impressive and a bit on the creepy side — featured a human-sounding robot having a conversation with a person who couldn’t even tell that they were talking to a robot. The demo during Google I/O 2018 freaked some people out, but it impressed our Mobile editor, Julian Chokkattu, who got a chance to try out Duplex recently.
Google Duplex is a big leap in the evolution of artificial intelligence (A.I.). No, it won’t lead to human-like robots that can do the laundry or go shopping like in the movie I, Robot (at least not anytime soon). It is, however, a huge step forward in A.I.’s ability to converse naturally with humans. So what is it, exactly?
When is Google Duplex coming out?
In October 2018, Google announced that Duplex functionality would start rolling out to Pixel phones in November 2018 on a city-by-city basis, starting with New York City. Since then, the technology has become far more widely available. In a blog post, the company announced that Duplex is available on all Pixel phones in as many as 43 U.S. states. Duplex has also grown beyond the confines of Google’s Pixel range: as of April 2019, you can use it on a wide range of Android and iOS devices in the U.S.
Google’s support site for Duplex lists which Android phones and iOS devices currently support the feature.
What exactly is Google Duplex?
“A long-standing goal of human-computer interaction has been to enable people to have a natural conversation with computers, as they would with each other,” wrote Google Principal Engineer Yaniv Leviathan and Vice President of Engineering Yossi Matias, in a May 2018 blog post announcing the technology.
For years, businesses have been trying to create a way for people to have conversations with computers. Almost every time we call a business, we encounter an automated phone system. We have virtual assistants on our phones and virtual assistant-powered speakers in our homes. But although these computer systems can be helpful, they have their shortcomings.
In a blog post, Google notes that one of the biggest problems with these systems is that the user has to adjust to the system, instead of the system adjusting to the user. Think about all of the times you have to repeat yourself when you’re on the phone with an automated system, or all of the times that a virtual assistant hears something different than what you actually said.
Google Duplex helps with these problems by allowing the computer to have a natural conversation with a human. The A.I. system adjusts to the person, instead of the person adjusting to the system, so the person can speak normally, just as they would to another human. Google Duplex also makes the computer sound like a human. It uses a natural tone, along with fillers like “um” and “uh,” just as a person would. During a conversation, the A.I. system can also handle interruptions and elaborate on its answers when needed.
At the center of Google Duplex is a recurrent neural network built using a machine learning platform called TensorFlow Extended (TFX). When the system makes a phone call, it is pretty much indistinguishable from a live human being. You can hear Google Duplex scheduling an appointment and holding a phone conversation below.
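Google hasn’t published the Duplex model itself, but for readers curious what “a recurrent neural network built with TensorFlow” looks like in practice, here is a minimal, hypothetical sketch of that kind of sequence model using TensorFlow’s Keras API. This is not Google’s actual Duplex model; the layer sizes, toy vocabulary, and dummy input below are illustrative assumptions only.

# A minimal sketch of a recurrent sequence model in TensorFlow/Keras.
# NOT Google's Duplex model -- sizes and vocabulary are assumptions for illustration.
import numpy as np
import tensorflow as tf

VOCAB_SIZE = 10000   # assumed token vocabulary for this sketch
EMBED_DIM = 128      # assumed embedding size
HIDDEN_UNITS = 256   # assumed recurrent state size

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),           # map tokens to vectors
    tf.keras.layers.LSTM(HIDDEN_UNITS, return_sequences=True),  # recurrent layer carries conversational context
    tf.keras.layers.Dense(VOCAB_SIZE, activation="softmax"),    # predict the next token at each step
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Run a forward pass on two toy "utterances" of 12 tokens each.
dummy_tokens = np.random.randint(0, VOCAB_SIZE, size=(2, 12))
predictions = model(dummy_tokens)
print(predictions.shape)  # (2, 12, 10000): next-token probabilities per position

The point of the sketch is simply that the model reads a sequence of tokens and, thanks to its recurrent layer, keeps track of what came earlier in the exchange, which is what lets a system like Duplex respond in context rather than treating each utterance in isolation.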
What can Google Duplex do for you?
The main thing Google Duplex will be able to do for you is handle some of your busy work. It can make calls on your behalf, schedule appointments, or call to check the hours of operation at a business, for instance. Now, it can’t make that uncomfortable break-up call for you, but it can reserve you a table at a participating restaurant or call and make you an appointment at a hair salon. For instance, if you tell Google Assistant you want to go to a specific restaurant at 7 p.m. on Friday, the system will call and make a reservation for you and then notify you when it’s confirmed.
A few potential applications for Google Duplex
Imagine calling the cable company and dealing with an automated system that sounds and operates exactly like a human — one that can actually help you. There would be no more annoying interactive voice response (IVR) systems telling you to “press 1 for billing questions or press 2 for technical issues.” Imagine if the IRS had this A.I. technology: during tax season, you wouldn’t have to wait an hour on hold for a representative, because you could ask the A.I. system your tax-related questions directly.
Professionals who regularly schedule clients, such as doctors and lawyers, could have the A.I. handle those bookings on their behalf. Small businesses stand to benefit as well: according to Google’s blog, 60 percent of small businesses that rely on customer bookings don’t have an online booking system.
Concerns about Google Duplex
Many people have expressed concerns about Google Duplex. Aside from the fact that it’s a bit creepy, some people are worried about privacy and security. Is it secure to have a computer calling businesses and speaking to live people on your behalf? Is it secure for the person on the other end of the line? Others have concerns about the potential impacts on advertising, and some people even worry about how quickly the A.I. is evolving. Google Assistant just came out a couple of years ago, and now it already sounds like an actual human on the phone.
Google addressed a couple of these issues during our recent demo of the service. At the beginning of the call, Google Assistant identifies itself and also notes that it’s recording the call. That disclosure might give a restaurant owner pause about continuing the conversation, at least until the service becomes more widely used.
Updated on April 3, 2019: Google Duplex is now available on a wide range of Android and iOS devices in the U.S.