Google used its I/O 2021 event to announce a new Android app shortcuts feature coming to Google Assistant. Specifically, it's for the Assistant found on devices such as Android phones and Chromebooks. And reports indicate the feature won't work only with Google's own apps. Dubbed "Suggestion Chips," it will give third-party apps the ability to surface action suggestions directly in the AI's interface.
What are Suggestion Chips in Google Assistant and how are they like app shortcuts?
Google's example use cases for Suggestion Chips are fairly straightforward. The company showcases the chips directly in the Google Assistant UI.
For clarity, they're placed next to the Google Lens icon below the "Hi, how can I help?" message that appears when the AI launches. Each chip takes the form of a button that pairs the shortcut action to be performed with the icon of the app providing it, such as the Reddit icon shown in the images below.
A tap on one of those shortcuts simply takes users to the represented content in the corresponding app. So for Reddit's "See What's Trending" chip, a tap opens the trending tab in the Reddit app. And with Android 12, developers will be able to take even more advantage of the feature, since the update lets them build an unlimited set of shortcuts via the ShortcutManager APIs. Previously, apps were limited to 10 shortcuts each.
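For developers, publishing one of these shortcuts at runtime goes through the shortcut APIs mentioned above. As a rough, non-authoritative sketch using the AndroidX ShortcutManagerCompat helper (the shortcut ID, label, and deep link URI below are placeholders, not Reddit's actual values):

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri
import androidx.core.content.pm.ShortcutInfoCompat
import androidx.core.content.pm.ShortcutManagerCompat

// Hypothetical example: push a dynamic shortcut at runtime.
// On Android 12, apps are no longer capped at a small fixed number
// of these shortcuts; older releases enforced a per-app limit.
fun pushTrendingShortcut(context: Context) {
    // Placeholder deep link into the app's "trending" content
    val intent = Intent(Intent.ACTION_VIEW, Uri.parse("example://trending"))

    // "trending" is an illustrative shortcut ID
    val shortcut = ShortcutInfoCompat.Builder(context, "trending")
        .setShortLabel("See What's Trending")
        .setIntent(intent)
        .build()

    // pushDynamicShortcut handles the device's shortcut limit automatically
    ShortcutManagerCompat.pushDynamicShortcut(context, shortcut)
}
```

This runs only inside an Android app, so treat it as an illustration of the API shape rather than a drop-in implementation.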
This is still in preview but builds on what’s already there
Of course, there's still plenty of work to be done before the new Suggestion Chips feature is ready. It's currently slated to launch first as a preview. But it builds on a shortcuts feature that's already in Google Assistant.
Added near the end of last year, Assistant shortcuts already allow apps to surface shortcuts that can be launched via Assistant. But those currently work only with voice controls, and users first need to set them up in Google Assistant settings.
The biggest differences are that Google will now surface shortcuts for apps based on users' usage patterns, and that third parties will be able to show their shortcuts directly in the UI. So the feature will be proactive right out of the gate. Any shortcuts defined in the requisite file for an app will be read and automatically shown in Assistant settings. Google says it will also surface app suggestions as shortcuts under the search bar when a user's search is relevant to the app's content and purpose.
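The "requisite file" here is the app's static shortcuts resource, typically res/xml/shortcuts.xml, referenced from the manifest. A minimal, hypothetical entry might look like the following (the package name, activity, ID, and string resources are placeholders, not any real app's values):

```xml
<!-- res/xml/shortcuts.xml: hypothetical example; IDs, strings, and
     the target activity are illustrative placeholders -->
<shortcuts xmlns:android="http://schemas.android.com/apk/res/android">
    <shortcut
        android:shortcutId="trending"
        android:enabled="true"
        android:icon="@drawable/ic_trending"
        android:shortcutShortLabel="@string/shortcut_trending_short">
        <intent
            android:action="android.intent.action.VIEW"
            android:targetPackage="com.example.app"
            android:targetClass="com.example.app.TrendingActivity" />
    </shortcut>
</shortcuts>
```

Shortcuts declared this way are what Assistant can read and list automatically in its settings, per the behavior described above.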
Finally, developers will also be given the option to include in-app shortcut suggestions, shown to users during day-to-day use of the app in question. That way, users can discover a new shortcut and adjust Assistant settings to show it in that UI as well. The ultimate goal is to help users discover and learn to use Assistant shortcuts, rather than limiting the feature to those who already know about them.