Google is pulling back the curtain slightly on how it manages the autocomplete function on its search engine, offering a peek into the logic behind predictions, some of the policies that govern when a prediction will be removed, and more. In a Google blog post, search liaison Danny Sullivan also announced that the company will be expanding the types of autocomplete predictions it disallows in the weeks to come.
The post clarifies a function that has landed Google in controversy in recent years, when the public was outraged by certain predictions and accused the company of racism.
Autocomplete has long been the subject of legal disputes, with courts in Japan and Germany taking action against results some regard as unfair. Google has maintained that autocomplete results simply reflect the behavior of searchers, though it has had to introduce policies and manually manage results after repeated controversies.
Today’s blog post explains, with examples, how autocomplete works in Google Search on mobile, desktop and Chrome, lauding the benefits of the feature: it saves users “over 200 years of typing time per day” by reducing typing by around 25 percent.
Autocomplete displays predictions based on how Google thinks you “were likely to continue entering” the rest of the query. Google determines these predictions by looking at “the real searches that happen on Google and show common and trending ones relevant to the characters that are entered and also related to your location and previous searches”.
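Google’s actual ranking system is proprietary, but the basic idea described above can be illustrated with a toy sketch: match past queries against the characters typed so far and surface the most common ones. The query log and function below are purely hypothetical, for illustration only.

```python
from collections import Counter

# Hypothetical log of past searches (illustrative data, not real Google queries).
search_log = [
    "weather today", "weather tomorrow", "weather today",
    "web development", "weather radar", "web development",
]

def autocomplete(prefix, log, limit=3):
    """Return the most frequent past queries that start with the given prefix."""
    counts = Counter(q for q in log if q.startswith(prefix))
    return [query for query, _ in counts.most_common(limit)]

# Typing "we" surfaces the most common matching past queries first.
print(autocomplete("we", search_log))
```

A real system would also weight results by recency, trendiness, location and the user’s own search history, as the quoted description notes, rather than by raw frequency alone.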
Google will remove predictions when they specifically violate its guidelines:
- Violent predictions.
- Hateful predictions against people based on religion, race or other demographics.
- Predictions describing dangerous or harmful activity.
- Predictions closely related to piracy.
- Predictions subject to valid legal removal requests.
Expanding autocomplete removals
Google stated that in the weeks to come it will be expanding the types of predictions it removes from autocomplete, including broader hate- and violence-related removals. Google will remove predictions that are harmful towards people on the basis of ethnic origin, race, gender, disability, religion, age, nationality, sexual orientation, veteran status or gender identity. It will also remove predictions which are “perceived as hateful or prejudiced towards individuals and groups, without particular demographics”.
Occasionally, predictions that violate Google’s guidelines slip through. Google has acknowledged that its systems “aren’t perfect” and says it works quickly to remove offending predictions when alerted to them.
The company also provides a feedback process that lets users notify it of inappropriate predictions.