The arrival of the Amazon Echo in the U.S. on June 23, 2015 was notable because it was the first mass-market consumer device to respond to voice commands with general-purpose functions, such as playing music. More recently, on November 4, 2016, Google released a competing device known as Google Home.
At Stone Temple, we’ve been tracking the capabilities of voice-activated knowledge devices to respond to questions since we released the Knowledge Box Showdown in October of 2014. In today’s article, we’re going to publish the data on similar tests, with Amazon Echo and Google Home in head-to-head competition.
What We Tested: It’s very important to note that our test is NOT a test of all the capabilities of both devices. Our test focused on the ability to provide direct answers to specific user questions. In short, we’re only testing the ability of each device to provide factual answers to questions. The devices may use one of two sources for those answers:
- A knowledge database, such as Google’s Knowledge Graph
- Answers extracted from third-party websites (what Google calls “Featured Snippets”)
What We Didn’t Test: The devices have several other areas of functionality that we did NOT test. These include:
- Home Entertainment
- Personal Assistant
- Smart Home Controller
If you prefer to consume this post via video, you can check out my YouTube video here!
You can also see a much broader comparison of all the capabilities of Google Home and Amazon Echo here.
Structure of the Test
We collected a set of 5,000 different questions to ask each device. We then asked each device every question and recorded which of several possible categories each response fell into, including:
- If the device answered verbally
- Whether an answer was received from a database (like the Knowledge Graph)
- If an answer was sourced from a third-party source (“According to Wikipedia …”)
- How often the device did not understand the query
- When the device tried to respond to the query, but simply got it wrong
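The tallying process above can be sketched in code. This is a minimal illustration, not the actual scoring tool used in the test; the category names are hypothetical labels standing in for the categories listed above.

```python
from collections import Counter

# Hypothetical category labels for each query outcome
# (illustrative only -- not the study's actual tally sheet).
CATEGORIES = {
    "answered_db",            # answered from a knowledge database
    "answered_third_party",   # answered citing a third-party site
    "not_understood",         # device did not understand the query
    "wrong_answer",           # attempted an answer but got it wrong
}

def tally(responses):
    """Count how many query outcomes fall into each category."""
    counts = Counter()
    for category in responses:
        if category not in CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        counts[category] += 1
    return counts

# Example: four mock query outcomes for one device
results = tally([
    "answered_db",
    "answered_third_party",
    "not_understood",
    "answered_db",
])
print(results["answered_db"])  # 2
```

With per-category counts like these for each device, the comparisons below (attempt rate, error rate) reduce to simple ratios.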
For the first test result, let’s compare how often the two devices attempted to answer the questions asked:
Google Home attempts an answer to over 3X more questions than Amazon Echo.
So Google Home tries to answer far more questions, but of the questions each device responded to, how often was it correct?
As it turns out, there are a few different ways to be “wrong”:
- The query might have multiple possible answers, such as “how fast does a jaguar go”
- The device may have responded with a joke
- Or, it may simply get the answer flat out wrong
Let’s first take a look at that last category: of all the questions each device responded to, what percentage of the time were they wrong?
Amazon Echo answers questions incorrectly over 2x more often than Google Home.
The remaining queries were made up of situations where more than one answer was possible, or where the question was partially answered. For example, if someone asked “how old is George Washington,” and the device responded with the year he was born, we would count that as partially answered.
All in all, for this particular test, Google Home won handily. As noted above, there are several other important categories for which the devices can be analyzed and compared that weren’t included in this test.
However, with the backing of Google’s search capabilities, Google Home has the clear edge in the knowledge department. In particular, 2,228 of Google’s answers came in the form of “Featured Snippets,” which means that the results were sourced from a third-party website. 2,122 of the correct answers for Google Home were from this source (not all of the 2,228 featured snippets were 100% correct!). In contrast, only 556 of the Amazon Echo’s correct answers came from third-party websites, 527 of which were sourced from Wikipedia.
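The counts above imply some quick percentages (the figures come from the article; the rounding is mine):

```python
# Figures quoted above
google_snippet_answers = 2228   # Google Home answers sourced as featured snippets
google_snippet_correct = 2122   # of those, how many were correct
echo_third_party_correct = 556  # Echo correct answers from third-party sites
echo_wikipedia = 527            # of those, how many came from Wikipedia

# Share of Google Home's featured-snippet answers that were correct
print(round(100 * google_snippet_correct / google_snippet_answers, 1))  # 95.2

# Share of Echo's third-party correct answers that came from Wikipedia
print(round(100 * echo_wikipedia / echo_third_party_correct, 1))        # 94.8
```

In other words, roughly 95% of Google Home’s featured-snippet answers were correct, and nearly 95% of the Echo’s third-party answers leaned on a single source, Wikipedia.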