
A.I. and Google News: The push to broaden perspectives


There is no editorial team at Google News, no building filled with hundreds of moderators monitoring the thousands of stories hitting the web every second to make sure the full story is presented. Instead, Google relies on artificial intelligence algorithms, along with partnerships with fact-checking organizations, to surface headlines from credible, authoritative sources.

“Humans are generating the content,” Trystan Upstill, Google News engineering and product lead, told Digital Trends. “We think of the whole app as a way to use artificial intelligence to bring forward the best in human intelligence. In a way, the A.I. is controlling this fire hose of human stuff going on.”

“We think of the whole app as a way to use artificial intelligence to bring forward the best in human intelligence.”

A.I. is a big part of the redesigned Google News app, which was recently announced at the annual Google I/O developer conference in Mountain View, California. The algorithms filter or demote stories after detecting the spread of misinformation, and they also understand terms and fragments of text coming through the news cycle, aligning them with fact checks from partner organizations.
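Google has not published how its matching pipeline works, but the idea of aligning a fragment of news text with a fact check can be illustrated with a deliberately simple sketch. The toy matcher below (all names and the threshold are hypothetical, not Google's) pairs a text fragment with the closest fact-check claim using Jaccard similarity over word sets:

```python
# Illustrative sketch only; Google's actual system is far more sophisticated.
# Pairs a news text fragment with the closest fact-check claim, if any,
# using Jaccard similarity over lowercased word sets.

def jaccard(a: set, b: set) -> float:
    """Similarity of two sets: size of intersection over size of union."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def match_fact_check(fragment: str, claims: list[str], threshold: float = 0.2):
    """Return the best-matching claim, or None if nothing clears the threshold."""
    frag_words = set(fragment.lower().split())
    best_claim, best_score = None, 0.0
    for claim in claims:
        score = jaccard(frag_words, set(claim.lower().split()))
        if score > best_score:
            best_claim, best_score = claim, score
    return best_claim if best_score >= threshold else None

claims = [
    "the new vaccine does not contain microchips",
    "city council approved the budget on tuesday",
]
print(match_fact_check("council approved the new budget tuesday", claims))
```

A production system would use learned text embeddings rather than word overlap, but the shape of the task is the same: score each candidate claim against the fragment and only surface a fact check when the match is confident enough.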

But one of the A.I.'s main tasks is to provide a full picture of major, nuanced stories through a feature called "Full Coverage." It's a small button that appears on stories; tapping it leads to related articles from a variety of publications, including ones you do not follow or may not like. The main section of Google News shows content tailored to you, but "Full Coverage" ignores your likes and dislikes: everyone sees the same information, pulled together by the A.I.

That includes modules for fact checks, frequently asked questions, a timeline of events, international coverage, and even tweets from primary sources. Everyone reading “Full Coverage” sees the same information, which Upstill said is crucial.

“The core premise we have is that in order to have a productive conversation about something, everyone basically needs to be able to see the same thing,” he said.

While the breadth of data the algorithms pull is impressive, it’s entirely on the user to click on the small “Full Coverage” button to read more perspectives on the topic at hand. It’s why the button features Google’s red, green, blue, and yellow colors — it stands out from a page that’s mostly black and white.

“Fundamentally, we’re trying to build tools that are easy, that people can use to develop their understanding,” Upstill said. “A part of the challenge for people to break out of their bubbles and echo chambers is that it’s just hard; it’s hard work, and we set out to make that easy.”

Pulling together a variety of sources has been part of Google News since its inception. The desktop service grew out of the 9/11 attacks in 2001, when people were scrambling to find as much information as they could about the tragic event.

“It came to the table with this idea that in terms of understanding a story, you shouldn’t read a single article,” Upstill said. “You should read a set of articles around that story to really position what you’re reading. That is a key message that resonates with people even today, in this age of people having increasingly polarized views.”

“You should read a set of articles around that story to really position what you’re reading.”

Google has been criticized for helping people stay in their bubbles. Search results are personalized based on location and previous searches, and people end up seeing what they want to see rather than the full picture. Upstill said Google isn’t in the business of censorship, and “in Search, if you come in and say ‘give me the fake news publication’ or type ‘fakenews.com,’” it will show up. But with Google News, Upstill said you shouldn’t find disreputable sources.

The new Google News app is currently rolling out on both Android and iOS, and the desktop redesign will go live early next week. Both will share the same features, but the desktop version will have a different format.

Julian Chokkattu
Former Digital Trends Contributor
Julian is the mobile and wearables editor at Digital Trends, covering smartphones, fitness trackers, smartwatches, and more…