'Loup Ventures' Articles

Siri Answers 83% of Questions Correctly in Test, Beating Alexa But Trailing Google Assistant

In its annual test comparing Google Assistant, Siri, and Alexa on smartphones, Loup Ventures' Gene Munster found that Siri correctly answered 83 percent of questions, beating Alexa but trailing Google Assistant. Munster asked each digital assistant 800 questions during the test to compare how each one responded. Alexa answered 79.8 percent of questions correctly, while Google Assistant answered 92.9 percent correctly.

Siri has improved since last year. In the July 2018 test, Siri answered 79 percent of questions correctly, compared to 83 percent this time around. Alexa was at 61 percent last year and Google Assistant at 86 percent, so digital voice assistants have improved across the board.

This test covered smartphones specifically, comparing iPhones and Android devices. Munster says smartphones were isolated from smart speakers because, while the underlying technology is similar, "use cases vary." Siri was tested on an iPhone running iOS 12.4, Google Assistant on a Pixel XL, and Alexa in the iOS app.

The questions were based on five categories, and all assistants were asked the same 800 questions. Each question set was designed to "comprehensively test a digital assistant's ability and utility." Some of the sample questions across the categories:

Local – Where is the nearest coffee shop?
Commerce – Order me more paper towels.
Navigation – How do I get to Uptown on the bus?
Information – Who do

Siri on HomePod Asked 800 Questions and Answered 74% Correctly vs. Just 52% Earlier This Year

Apple analyst Gene Munster of Loup Ventures recently tested the accuracy of digital assistants on four smart speakers, asking Alexa, Siri, Google Assistant, and Cortana a series of 800 questions each on the Amazon Echo, HomePod, Google Home Mini, and Harman Kardon Invoke, respectively.

The results indicate that Siri on the HomePod correctly answered 74.6 percent of the questions, a dramatic improvement over the speaker's 52.3 percent success rate when Loup Ventures asked it a similar set of 782 questions in December 2017. Siri on the HomePod remained less accurate than Google Assistant on the Google Home, which correctly answered 87.9 percent of questions in the test. Meanwhile, Alexa on the Echo and Cortana on the Invoke trailed Siri on the HomePod, correctly answering 72.5 percent and 63.4 percent of questions, respectively.

Munster attributed the HomePod's improved accuracy to "the enabling of more domains in the past year," as a series of software updates in recent months has enabled the speaker to make and receive phone calls, schedule calendar events, set multiple timers, search for songs by lyrics, and more.

Methodology

Loup Ventures says it asked each smart speaker the same 800 questions, and each was graded on two metrics: whether the query was understood and whether a correct response was provided; a rough tally sketch appears after the category list below. The question set was designed to "comprehensively test a smart speaker's ability and utility" based on five categories:

Local – Where is the nearest coffee shop?
Commerce – Can you order me more paper towels?
Navigation – How do I get to uptown on the
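To make the two-metric grading concrete, here is a minimal Python sketch of how scores like these could be tallied from a log of graded responses. The log format, field names, and the score_results() helper are assumptions made purely for illustration; Loup Ventures has not published its scoring tooling.

```python
# Minimal sketch of a two-metric tally like the one described above.
# The data layout and helper name are assumptions, not Loup Ventures' code.
from collections import defaultdict

def score_results(graded_log):
    """graded_log: iterable of dicts, e.g.
    {"assistant": "Siri", "category": "Local",
     "understood": True, "correct": True}
    Returns {assistant: {"understood_pct": ..., "correct_pct": ...}}."""
    tallies = defaultdict(lambda: {"asked": 0, "understood": 0, "correct": 0})
    for entry in graded_log:
        t = tallies[entry["assistant"]]
        t["asked"] += 1
        t["understood"] += entry["understood"]  # bools count as 0/1
        # A query can be understood but still answered incorrectly,
        # which is why the two metrics are tracked separately.
        t["correct"] += entry["correct"]
    return {
        assistant: {
            "understood_pct": round(100 * t["understood"] / t["asked"], 1),
            "correct_pct": round(100 * t["correct"] / t["asked"], 1),
        }
        for assistant, t in tallies.items()
    }

# Example: one graded entry per question per speaker.
print(score_results([
    {"assistant": "Siri", "category": "Local",
     "understood": True, "correct": True},
    {"assistant": "Siri", "category": "Commerce",
     "understood": True, "correct": False},
]))
# -> {'Siri': {'understood_pct': 100.0, 'correct_pct': 50.0}}
```

For scale, Siri's 74.6 percent correct on the HomePod works out to roughly 597 of the 800 questions answered correctly.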