
Google’s new AI solution will help make Android phones smarter

Google is always looking to improve its mobile services, and its latest effort turns to a technique called “federated learning.”

The method is now being tested by Google and represents a significant change in how machine-learning systems work on Android. Right now, user data is sent to the cloud to be processed there; federated learning instead downloads the machine-learning model to the device, trains it locally on the data the phone already has, then sends only a summary of the changes back to Google’s servers. The main difference is where the data lives: the raw user data never leaves the phone, and only the small model updates reach the cloud.
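To make that workflow concrete, here is a minimal sketch of one training round from a device’s point of view. The simple linear model, the function name local_update, and the data are all hypothetical stand-ins; Google has not published Gboard’s training pipeline in this form, so treat this as an illustration of the idea rather than the actual implementation.

```python
# Minimal sketch of one federated learning round on a single device.
# Everything here is illustrative; it is not Google's production code.
import numpy as np

def local_update(global_weights: np.ndarray,
                 features: np.ndarray,
                 labels: np.ndarray,
                 lr: float = 0.01,
                 epochs: int = 5) -> np.ndarray:
    """Train a simple linear model on-device and return only the weight delta."""
    w = global_weights.copy()
    for _ in range(epochs):
        preds = features @ w
        grad = features.T @ (preds - labels) / len(labels)
        w -= lr * grad
    # Only this summary of changes leaves the device, never the raw data.
    return w - global_weights

# Example: the device downloads the current model, trains on local data,
# and uploads a small update instead of the data itself.
rng = np.random.default_rng(0)
global_w = np.zeros(3)
X_local = rng.normal(size=(32, 3))      # stays on the phone
y_local = X_local @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=32)
delta = local_update(global_w, X_local, y_local)
print("update sent to server:", delta)
```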

The new method is being tested on Gboard, Google’s popular keyboard. According to Google, the data kept on the device includes things like the timing and context of keyboard suggestions. That data is then processed on the phone itself to build an update for the machine-learning model, which is later sent to Google’s servers.
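As a rough illustration, the kind of on-device interaction log described above might look something like the structure below. The fields are assumptions inferred from the article, not Google’s actual schema.

```python
# Hypothetical shape of the interaction log Gboard might keep on the device.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SuggestionEvent:
    timestamp: datetime   # when the suggestion was shown
    context: str          # the text typed so far
    suggestion: str       # the word Gboard proposed
    accepted: bool        # whether the user tapped it

# Events like these never leave the phone; they only feed the local
# training step that produces the model update.
```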

There are some issues associated with the new system. For example, Google notes that higher latency, slower connections, and an uneven distribution of data across devices can all affect how well the system works. To manage these issues, Google will use what it calls “federated averaging algorithms,” which help reduce the upload time of updates as well as how much energy the phone uses. These algorithms essentially compress the updates into smaller packages before they are uploaded, and uploads only take place when a phone is idle, charging, and connected to Wi-Fi.
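The server-side half of federated averaging is, at its core, a weighted average of the updates the devices send back. The sketch below shows that idea, along with a check that mirrors the upload conditions mentioned above; it is a simplified illustration under those assumptions, not Google’s production algorithm.

```python
# Simplified server-side step of federated averaging: combine client updates
# as a weighted average, weighted by how many local examples each device used.
import numpy as np

def federated_average(global_weights: np.ndarray,
                      client_deltas: list,
                      client_example_counts: list) -> np.ndarray:
    """Apply the weighted mean of client deltas to the global model."""
    total = sum(client_example_counts)
    weighted = sum((n / total) * d
                   for d, n in zip(client_deltas, client_example_counts))
    return global_weights + weighted

def should_upload(is_idle: bool, is_charging: bool, on_wifi: bool) -> bool:
    """Mirror the article's upload conditions: idle, charging, and on Wi-Fi."""
    return is_idle and is_charging and on_wifi

# Example: two devices report updates of different sizes; the device that
# trained on more examples contributes more to the new global model.
new_w = federated_average(np.zeros(3),
                          [np.array([0.2, -0.1, 0.0]), np.array([0.4, 0.0, 0.1])],
                          [100, 300])
print("new global weights:", new_w)
```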

There are some big advantages to federated learning. For example, Google notes that the method should help improve privacy, because Google never receives the raw data processed on the device, only the small update packages sent to its servers. Not only that, but users see improvements to the machine-learning model immediately, rather than having to wait for Google to push out an update.

Editors' Recommendations

Christian de Looper
Christian’s interest in technology began as a child in Australia, when he stumbled upon a computer at a garage sale that he…