New Test Shows Google Assistant Is The Best Digital Voice Assistant


Loup Ventures pits popular AI assistants against one another in a test of wits each year, and this year’s smartphone-based test saw Google Assistant come out on top. This “digital IQ test” consists of the same 800 questions, posed the same way to each AI. Assistant beat the runner-up, Apple’s Siri, by almost 10% on correct answers, while Alexa scored only 79.8% on smartphones, putting it at the bottom of the list. In the understood-queries category, Google Assistant was the only one to fully understand every query, though Alexa actually edged out Siri there by 0.1%. Still, all three were within 1% of a perfect score in that category, essentially making it a moot point.

Google Assistant has topped the smartphone-based test across the board since 2017, consistently scoring the highest in both understood queries and correct answers. Siri has shown steady year-on-year growth in correct answers for the most part. Alexa, meanwhile, burst onto the scene for the 2018 test and improved its score by more than 10% this year. That steady growth, thanks in part to the massive amount of training Alexa receives as the most popular home smart speaker AI, suggests it could usurp Assistant’s crown in the future.

The story is mostly the same on the smart speaker side of things, though each AI took a hit in correct answers. Assistant and Siri both answer incorrectly 5% more often in smart speaker form, while your Amazon Echo or Echo Dot is actually 7% more likely than a smartphone-based Alexa installation to give you a wrong answer or say it doesn’t know what to tell you. That test showed small growth for Assistant, while the rest of the bunch added around 10% to their correct-answer scores year on year. As in the smartphone space, that means Assistant’s crown is in danger in the coming years.
The test itself is, by its nature, limited. There are a number of features to account for when choosing an AI assistant: how it handles queries, what it can remember about you, creature comforts like multi-user management, and, of course, third-party skills that can be added on later. With only 800 questions in the test and all of the AIs nearing 100% correctness, both the scale and the scope of the test will soon be too limited to be meaningful. Once the test is inevitably expanded, it will be anybody’s game.

As a consumer, your choice of AI, on both the speaker and smartphone side, ultimately comes down to personal preference. Many people use Alexa on every device they own just because they’re used to her. Others, such as this writer, may be hardcore Google Assistant users, perhaps having used the AI as far back as the Google Now days. Still others may be Xbox gamers, Windows power users, or otherwise inclined to stick with Cortana. Amazon may have made the Echo the most popular smart speaker by pushing Alexa’s capabilities and marketing hard early on, but it’s still anybody’s race, especially outside of smart speakers. Google Home recently got a few shiny new pieces of hardware in its family, for instance, while the ubiquity of Apple’s beloved iPhone will keep Siri in the race indefinitely, with or without HomePod’s help.