Google Makes AI Learning Faster On Android, And Introduces A Sandbox For Privacy


Among the many projects Google has in mind and in the works, AI is always one of its priorities.

At its Google I/O 2021 event, the company revealed some of the AI projects it has in place for Android, describing how Google wants to make machine learning easier and more consistent for developers.

AI and its machine-learning technology are responsible for powering the many features Android owners use every day.

While the technology has indeed become more powerful and more advanced, Google said there are still challenges in deploying AI-powered features that need to be solved, including app bloat and performance issues.

There are also issues with feature availability because not every Android device has access to the same APIs or API versions.

Google wants to solve these issues, one step at a time.

To that end, Google announced Android’s updatable, fully integrated machine-learning inference stack, which provides a set of common components across all devices.

Together, this should bring a lot of benefits to app developers:

For example, developers are no longer required to bundle code for on-device inferencing in their own app. And because the machine-learning APIs are more integrated with Android, developers can improve the performance of their apps.

What's more, because the stack is decoupled from OS releases, Google can provide a consistent API across Android versions and regularly push updates to it.

To make this happen, Google is doing a few things.

First, Google and Qualcomm are working together to make neural network API updates easier on Android.

This project is built on top of a 2020 Qualcomm initiative, which created chips for Android phones that support upgradeable GPU drivers to optimize performance. This time, Qualcomm is doing a similar thing for on-device AI and machine learning.

It was revealed at Google I/O that both Google and Qualcomm plan to introduce the new model alongside the release of Android 12.

Traditionally, Android's neural network API drivers are updated along with major Android updates. But both Google and Qualcomm said that they can now roll out updates quickly using Google Play Services.

This is made possible because Google is moving the neural network API away from the core Android framework.

This means developers can use the same API even when two devices are running different Android versions. In other words, the updates should be compatible with older chipsets and multiple versions of Android.

With this initiative, Android could benefit from a performance boost, as if the phone had additional CPU cores, while using less power and generating less heat.

As an example, Qualcomm said that apps like Google Assistant and Google Maps that use on-device AI processing can benefit from the tweaks.

Second, Google said that TensorFlow Lite for Android is going to be preinstalled on all Android devices through Google Play Services. This way, developers are no longer required to bundle it with their own apps.
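In practice, this shifts the TensorFlow Lite runtime from a library bundled into each APK to one delivered by Google Play Services on the device. A hypothetical Gradle sketch of the change (the artifact coordinates below are illustrative assumptions; the actual names Google ships may differ):

```groovy
dependencies {
    // Before: TensorFlow Lite bundled into the APK, adding megabytes per app
    // implementation 'org.tensorflow:tensorflow-lite:2.x.y'

    // After: the runtime is provided by Google Play Services on the device
    // (coordinates are illustrative, not confirmed by the announcement)
    implementation 'com.google.android.gms:play-services-tflite-java:16.0.0'
}
```

The app's model files still ship with the app; only the shared inference runtime moves out of the APK.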

Google is also introducing “automatic acceleration” that takes a developer’s machine learning model into account and can check whether the model works better accelerated on the CPU, GPU, or other accelerators.
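Conceptually, automatic acceleration is a benchmark-and-pick step: run the model on each candidate backend, measure, and choose the fastest. Google did not publish the API details at announcement time, so the sketch below is a hypothetical illustration of the idea in plain Java, with stand-in workloads in place of real CPU/GPU/NNAPI delegates:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AccelerationPicker {
    // Hypothetical sketch: time one warm inference on each candidate
    // backend and return the name of the fastest. Real candidates would
    // be CPU, GPU, or other accelerator delegates.
    public static String pickFastest(Map<String, Runnable> backends) {
        String best = null;
        long bestNanos = Long.MAX_VALUE;
        for (Map.Entry<String, Runnable> e : backends.entrySet()) {
            long start = System.nanoTime();
            e.getValue().run();               // run the model once
            long elapsed = System.nanoTime() - start;
            if (elapsed < bestNanos) {
                bestNanos = elapsed;
                best = e.getKey();
            }
        }
        return best;
    }

    // Stand-in "inference" workload so the example is self-contained.
    static void busyWork(int iters) {
        long acc = 0;
        for (int i = 0; i < iters; i++) acc += i;
        if (acc == 42) System.out.print("");  // defeat dead-code elimination
    }

    public static void main(String[] args) {
        Map<String, Runnable> backends = new LinkedHashMap<>();
        backends.put("cpu", () -> busyWork(3_000_000));  // slower stand-in
        backends.put("gpu", () -> busyWork(1_000_000));  // faster stand-in
        System.out.println(pickFastest(backends));
    }
}
```

A production version would also weigh accuracy and power, not just latency, but the selection loop captures the core idea of checking where a given model runs best.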

The Privacy Sandbox

In similar news, Google said that it is also leveraging AI a bit more to help it push new privacy measures.

One of these is a new partition within Android for managing machine-learning data more securely.

“With Android’s private compute core, we’re able to introduce new features while still keeping your data safe, private, and local to your phone,” said Google executive Suzanne Frey.

With this Private Compute Core, Google creates a privileged space, or a sandbox, within the operating system. This approach is similar to the partitions used for passwords or sensitive biometric data.

But instead of holding credentials, the sandbox can hold data for use in machine learning, like the data used for the Smart Reply text message feature or the Now Playing feature for identifying songs.

While such a feature is not sensitive in itself, the data it works on is: it can draw on personal texts and real-time audio, for example. This is where the partition can make Android better at protecting that data, while still keeping it available for system-level functions.

“This means that all sensitive audio and language processing happens exclusively on your device and isolated from the network to preserve your privacy,” Google explained in a post announcing the feature.

Despite the name, the Android Private Compute Core does not exist in a separate hardware chip. This partition exists entirely in software.

Published: 
24/05/2021