Today, when you run a Google search, you will notice that suggestions appear before you even finish typing your keyword. These automatic suggestions come from a feature called Google Instant, which the company introduced to make searching faster and easier and to give you a better search experience. Search Engine Land's editor-in-chief, Danny Sullivan, analyzes Google Instant's Autocomplete suggestions feature, how it works and what governs it, and offers a considerable amount of interesting insight.
Google is not the only search engine that offers suggestions, nor was it the first. Google's suggestions date back to 2004 as an experimental feature, and were formally added to the site in 2008 as “Google Suggest,” also known as “Autocomplete.” The feature got renewed attention in 2010 with the launch of Google Instant, which automatically displays results and updates them as you type. However, the feature has drawn scrutiny from many users and experts, especially over blocked suggestions.
Suggestions Based On Searches Done Previously By Real People
According to Google, the suggestions offered by Google Instant are based on previous searches done by real people. When it comes to ranking the results, popularity is key: if more people are searching for a certain keyword, it has a better chance of appearing as a suggestion. Google says other factors also determine what to show, but everything that is suggested comes from searches done by real people.
Suggestions May Vary By Location And Language Used
The suggestions offered by Google differ from place to place and with the language you use. The same keyword searched in New York could therefore yield different suggestions in Tokyo or London. To see this yourself, run a search using a keyword such as 'coupons' and note the suggestions you get. Then change your location settings (telling Google you are located somewhere else) and run the same search again. This regional variation is also why projects built on Autocomplete, such as Google Instant Alphabet or The United States of Autocomplete, can appear inaccurate.
Previously Searched Suggestions
Another thing you may have noticed are suggestions shown in purple. Those are keywords you have already searched for, just as purple links in the search results mean you have already visited them. This happens when the Google Web History feature is active, which lets you personalize your search. A little “Remove” option next to each purple suggestion lets you remove it. This purple display of suggestions arrived with Google Instant.
Ranking Of Suggestions
While ranking suggestions, Google might look at each search's popularity. However, sometimes, even less popular searches could be ranked at the top if Google considers them more relevant. Your personalized searches always come before the rest.
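The ranking behavior described above can be sketched as a toy scorer. Everything here is an assumption for illustration: the function names, the popularity table, and the rule "personalized first, then by popularity" are a simplification, not Google's actual algorithm.

```python
def rank_suggestions(candidates, popularity, personal_history):
    """Toy ranker: suggestions the user previously searched for come
    first, then the rest ordered by overall popularity. Illustrative only."""
    personal = [q for q in candidates if q in personal_history]
    others = [q for q in candidates if q not in personal_history]
    others.sort(key=lambda q: popularity.get(q, 0), reverse=True)
    return personal + others

# Hypothetical popularity counts and one previously searched query.
popularity = {"lady gaga": 900, "ladders": 120, "lakers": 700}
history = {"ladders"}
print(rank_suggestions(["lakers", "ladders", "lady gaga"], popularity, history))
# -> ['ladders', 'lady gaga', 'lakers']
```

Note how "ladders," though the least popular candidate, is promoted to the top purely because it is in the user's history, mirroring the claim that personalized searches come before the rest.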
Deduplicating And Spelling Corrections
Google also deduplicates suggestions and corrects spelling: if searches differ only by a spelling or punctuation mistake, they are consolidated, and the correctly written form is what appears in the final suggestion.
“For example, if some people are typing in “LadyGaga” as a single word, all those searches still influence “Lady Gaga” being suggested — and suggested as two words.
Similarly, words that should have punctuation can get consolidated. Type “ben and je…” and it will be “ben and jerry’s” that gets suggested, even if many people leave off the apostrophe.”
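The consolidation in the quoted examples can be imagined as normalizing query variants to a single counting key before tallying popularity. This is a minimal sketch under assumed rules (lowercasing, stripping apostrophes, collapsing spaces); Google's real normalization logic is not public.

```python
import re
from collections import Counter

def canonical_key(query):
    """Collapse spelling/punctuation variants into one counting key.
    The rules here are illustrative assumptions, not Google's."""
    q = query.lower()
    q = re.sub(r"['’]", "", q)   # "jerry's" -> "jerrys"
    q = re.sub(r"\s+", "", q)    # "lady gaga" -> "ladygaga"
    return q

# Variant spellings all count toward the same suggestion.
raw = ["Lady Gaga", "ladygaga", "LadyGaga", "ben and jerrys", "ben and jerry's"]
counts = Counter(canonical_key(q) for q in raw)
print(counts["ladygaga"])      # 3
print(counts["benandjerrys"])  # 2
```

Under this model, "LadyGaga" typed as one word still adds weight to the "Lady Gaga" suggestion, and the apostrophe-free "ben and jerrys" still boosts "ben and jerry's" — the display form shown can be the canonical, correctly punctuated variant.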
The Search's Freshness Counts
Google says its Autocomplete suggestions have a “freshness layer”: when a search suddenly becomes popular, it can appear as a suggestion even if it lacks long-term popularity.
“A good example of this was when actress Anna Paquin was married. “Anna Paquin wedding” started appearing as a suggestion just before her big day, Google says. That was useful to suggest, because many people were starting to search for that.”
The company does not reveal how quickly a search that suddenly gains popularity can start being suggested. However, suggestions have been seen appearing within hours of a search trend starting.
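One way to picture a "freshness layer" is as simple spike detection: compare a query's recent rate against its long-term baseline. The thresholds and function name below are invented for illustration; Google discloses nothing about its actual mechanism.

```python
def is_fresh_spike(recent_per_hour, baseline_per_hour, ratio=5.0, min_recent=100):
    """Flag a query as 'fresh' when its recent search volume far exceeds
    its long-term baseline. All thresholds are illustrative assumptions."""
    if recent_per_hour < min_recent:
        return False  # too little volume to trust the spike
    return recent_per_hour >= ratio * max(baseline_per_hour, 1)

# A query like "anna paquin wedding" right before the event:
print(is_fresh_spike(recent_per_hour=2000, baseline_per_hour=10))  # True
# A small uptick that doesn't clear the volume floor:
print(is_fresh_spike(recent_per_hour=50, baseline_per_hour=10))    # False
```

A scheme like this would let a query with near-zero historical popularity surface within hours, matching the "Anna Paquin wedding" example quoted above.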
Removal Of Suggestions
According to Google, some suggestions are removed for specific reasons. They are:
- Hate or violence related suggestions
- Personally identifiable information in suggestions
- Porn and adult-content related suggestions
- Legally mandated removals
- Piracy-related suggestions
The company employs automated filters that block suggestions violating its policies and guidelines. “For example, the filters work to keep things that seem like phone numbers and social security numbers from showing up.” Google said that since the filter isn't perfect, “some suggestions may get kicked over for a human review”.
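The quoted example (phone and social security numbers) can be approximated with simple pattern filters. The patterns below are US-centric assumptions for illustration only; real production filters would be far broader and more careful.

```python
import re

# Illustrative, US-centric patterns for suggestions that "look like" PII.
PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),                   # SSN-like: 123-45-6789
    re.compile(r"\b\(?\d{3}\)?[-.\s]\d{3}[-.\s]\d{4}\b"),   # phone-like
]

def filter_suggestions(suggestions):
    """Drop any suggestion matching a PII-like pattern."""
    return [s for s in suggestions
            if not any(p.search(s) for p in PII_PATTERNS)]

print(filter_suggestions([
    "john smith 123-45-6789",
    "pizza delivery 555-123-4567",
    "john smith biography",
]))
# -> ['john smith biography']
```

As the article notes, such pattern filters are imperfect — they both miss things and over-block — which is why borderline suggestions get escalated to human review.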
The Case Of Hate Speech And Protected Groups
That said, not everything hateful gets blocked from being suggested. For instance, “i hate my mom” and “i hate my dad” both come up if you type in “i hate my.” Similarly, “hate gl” brings up both “hate glee” and “hate glenn beck.”
Hate suggestions are removed when they target what Google calls a “protected” group. There is no formal definition of a 'protected group,' not even on the Help page for Autocomplete suggestions. However, a Help page for Google AdWords summarizes the groups the company has long considered protected, and it includes:
- race or ethnic origin
- national origin
- veteran status
- sexual orientation or gender identity
The 'color' category covers majority groups like whites as well, so even if you type “i hate white,” Autocomplete will not suggest “i hate whites,” and likewise typing “i hate black” will not bring up “i hate blacks.”
“However, in both cases, other hate references do get through (“i hate white girls” and “i hate black girls” both appear). This is where a human review may happen, if the reference is noticed.”
Legal Cases And Removals
Google also blocks some suggestions for legal reasons, as when the company lost two cases in France last year involving the Autocomplete feature. In the first, Google was ordered to stop the word “arnaque” (French for 'scam') from coming up as a suggestion alongside the name of a distance learning company. In the second, a French plaintiff sued Google and won a symbolic 1 euro in damages over the words “rapist” and “satanist” appearing next to his name in a Google search.
More news broke yesterday: Google lost another case, this time in Italy, again involving the suggestions feature. A man sued the company over the Italian words for 'conman' and 'fraud' appearing next to his name in Google search.
When asked by Danny, Google replied saying, “We are disappointed with the decision from the Court of Milan. We believe that Google should not be held liable for terms that appear in Autocomplete as these are predicted by computer algorithms based on searches from previous users, not by Google itself. We are currently reviewing our options.”
There are more such cases and controversies involving Autocomplete suggestions, such as the “levitra” and “cialis” case in the US, the “climategate” suggestion, “islam is,” and so on.
Nationalities and Religions
Autocomplete treats nationalities as protected groups but does not extend the same protection to religions. The protected groups list above is for AdWords, not Autocomplete, and this and similar distinctions can sound illogical. Google argues that “nationalities refer to individuals, religions do not. Our hate policy is designed to remove content aimed at specific groups of individuals.”
Obviously, people affected by negative suggestions would like them removed, but Google rarely does so. The company doesn't even offer a removal request form; a help page merely suggests leaving comments in its support forums.
Autocomplete And Piracy
Suggestions for some sites associated with piracy were removed, though not the sites themselves. Search terms the company considered piracy-related were also removed from suggestions earlier this year. The likely reason for this step is accusations from studios and networks that Google supports online piracy.
Fresh Fake Queries
Amid all this, there's the latest issue of fake queries. People can ask others to run particular searches, and once enough searches are done, the suggestions start appearing on Google.
“Brent Payne is probably one of the most notable examples of someone deliberately doing this “above the radar,” so to speak. He ran a series of experiments where he hired people on Mechanical Turk to do searches, which (until Google removed them) caused suggestions to appear:”
Danny suggests that Google and the other major search engines should do away with concepts like protected groups and simply remove negative suggestions across the board. “If there are negative things that people want to discover about a person, company or group, those will come out in the search results themselves, and mixed in with more context overall — good, bad or perhaps indifferent.”