You’ve probably typed a question into Google and, instead of looking through the search results, accepted the answer Google provides at the top of the page.
The search mega-giant is upgrading its algorithm to fix serious problems of bias and error in those answers, many of which you may not have even noticed.
In the Beginning
Google Search has been around since 1998. It began answering people’s questions directly in 2012 using the now-familiar knowledge card format, drawing on sources like Wikipedia and the CIA World Factbook, but it could only answer a limited number of questions. That’s never good enough for Google.
Back when Google Glass was still a thing, Google’s top minds got together and decided that, since the company had access to millions of websites, it could create an algorithm to peruse those sites and cherry-pick answers from the available web copy.
Glass couldn’t scroll down web pages, but it could answer questions, and a more prominent answer section was born…and everyone lived happily ever after…not!
The Problem with Google Answer Snippets
When a question has a definitive answer, such as a number, the snippet is accurate. If you want to know the circumference of the Earth or how much a baby elephant weighs, you’ll get a correct answer. But if you ask a question that doesn’t have a straightforward factual answer, or that poses a moral or ethical quandary, the answer can reflect a bias or be flat-out incorrect.
One classic example dealt with the difficult subject of the legality of abortion. Imagine a person with no clear bias one way or the other who simply wants to learn more about the subject and the debate.
You type in “Should abortion be legal?” The question itself is innocent, but the way Google interpreted it pointed people to an answer biased toward the view that it should be legal. Simply rephrasing the question as “Should abortion be illegal?” pointed people to an answer biased toward the view that it should be illegal.
Moral quandaries aside, serving biased answers to people seeking a neutral overview is dangerous.
The other issue is that Google’s bots and spiders have no idea whether the answer they surface is actually correct. Google uses a complex algorithm to gauge a site’s relevance and authority when it assigns rankings in the search results and in the answer snippets.
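To see why authority and accuracy are different things, here is a minimal, purely illustrative sketch of link-based authority scoring in the spirit of PageRank. Google’s real ranking algorithm is far more complex and not public; the function name, the toy graph, and all parameters below are assumptions for illustration only. The point it demonstrates: a score computed from the link graph alone says nothing about whether a page’s content is factually correct.

```python
# Illustrative sketch only -- NOT Google's actual algorithm.
# A toy PageRank-style iteration: pages earn "authority" purely from
# who links to them, with no notion of factual accuracy at all.

def authority_scores(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    scores = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * scores[page] / len(targets)
                for t in targets:                  # pass score along each outlink
                    new[t] += share
        scores = new
    return scores

# Hypothetical link graph: several pages link to "encyclopedia".
graph = {
    "blog": ["encyclopedia"],
    "forum": ["encyclopedia"],
    "news": ["encyclopedia", "blog"],
    "encyclopedia": ["news"],
}
scores = authority_scores(graph)
print(max(scores, key=scores.get))  # prints "encyclopedia" in this toy graph
```

The most-linked page wins, whether its content is a vetted encyclopedia entry or a schoolkid’s homework, which is exactly the gap the next example exposes.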
But even if a site is an authority with lots of backlinks, that doesn’t mean everything on it is correct. For example, the Wall Street Journal recently highlighted some of the errors, including the answer to “Why are Komodo dragons endangered?”
In reality, they are not endangered. The answer pulled for that question not only claimed the animals were endangered, it was taken from a report by a Canadian elementary school student. The pre-teen crowd may have an encyclopedic knowledge of Pokémon, but their herpetological accuracy leaves something to be desired.
Other embarrassing answers included claims that former President Barack Obama was orchestrating a coup and that women are evil. Way to go, Google.
Fixing the Problem
Google recently stated that many of the so-called problem answers came from rare, fringe questions. According to Stone Temple, which tested the answer snippets, Google had a 97.4 percent accuracy rate; but given the billions of queries Google handles each day, who knows exactly how many “fringe” answers slip through the cracks.
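A 97.4 percent accuracy rate sounds impressive until you multiply it out. The back-of-envelope sketch below uses assumed round numbers: the daily query volume and the fraction of queries that actually show a snippet are both guesses for illustration, not official Google figures.

```python
# Back-of-envelope only. Both inputs below are assumptions, not Google data:
# Google doesn't publish exact query volumes, and not every query triggers
# an answer snippet.
accuracy = 0.974                       # Stone Temple's measured accuracy rate
daily_queries = 3_500_000_000          # assumption: ~3.5B searches per day
snippet_rate = 0.10                    # assumption: 10% of queries show a snippet

bad_snippets = daily_queries * snippet_rate * (1 - accuracy)
print(f"{bad_snippets:,.0f} potentially wrong snippets per day")
```

Even under these conservative guesses, a 2.6 percent error rate works out to millions of wrong answers a day, which is why “fringe” is doing a lot of work in Google’s defense.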
The search behemoth understands that its answers aren’t infallible, especially when it comes to ethical and non-traditional questions and questions with multiple answers. It’s working to improve the accuracy and remove the bias.
For example, for questions that may have more than one answer, such as “How do I set up my broadband modem?”, the right answer depends on the brand and type of modem, so Google will begin providing multiple options you can choose from to get the answer for your model.
Its other solution is to provide multiple answer snippets for people to choose from, which may help remove some of the inaccuracies and bias since there are multiple sources.
Will these fixes provide better answers not only for web searches but also for queries to virtual personal assistants such as Alexa and Siri? Only time will tell; in the meantime, take everything you see in an answer snippet with a grain of salt.