The second day of SMX East 2011 in New York kicked off with a keynote conversation with Eli Pariser, author of 'The Filter Bubble'. The book is built around the argument that search personalization leaves users unaware of viewpoints different from their own. The panelists included:
- Danny Sullivan, Editor-in-Chief, Search Engine Land
- Chris Sherman, Executive Editor, Search Engine Land
- Eli Pariser, Author, "The Filter Bubble"
Eli opened his speech by discussing the moral consequences of personalization. To illustrate his point, he quoted Mark Zuckerberg: “A squirrel dying in front of your house today might be more relevant to you than people dying in Africa.”
That is how Mark explained relevance, but the question Eli raised is: how does the internet know what is relevant for us? He then quoted Eric Schmidt: “It will be very hard for people to watch or consume something that has not in some sense been tailored for them.” In other words, internet users are surrounded by a sort of membrane of personal filters. The sad part is that you do not choose what gets into your filter bubble or what gets left out. People have no idea that material is being edited out for them.
He said he first noticed personalization at play through changes in his own Facebook account. He interacted more with people of a certain mindset, and one day he suddenly noticed that updates from his conservative friends were no longer appearing in his news feed.
He then ran an experiment, asking several friends to Google “Egypt” and send him a screenshot of the search results. The results all differed, because Google had personalized them to what it thought each person wanted to see.
He then went on to his “filter bubble” concept and said, “You don’t choose what gets in your filter bubble and, more importantly, you don’t know what’s been edited out. Personalization algorithms typically look at what you click first. On the Internet, code is the new gatekeeper. It may be showing us what we like, but it’s not showing us what matters.”
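Pariser's point that "personalization algorithms typically look at what you click first" can be made concrete with a toy sketch. This is an entirely hypothetical illustration of click-weighted re-ranking, not Google's actual algorithm; the function name, data, and scoring are all invented for the example:

```python
from collections import Counter

def personalize(results, click_history):
    """Toy illustration: re-rank results by how often the user
    previously clicked each source. Hypothetical, not Google's logic."""
    clicks = Counter(click_history)
    # Sources the user clicked more often float to the top.
    return sorted(results, key=lambda r: -clicks[r["source"]])

# Two users issue the same query but have different click histories.
results = [
    {"title": "Protests in Cairo", "source": "news"},
    {"title": "Vacation deals: Egypt", "source": "travel"},
]
news_reader = personalize(results, ["news", "news", "travel"])
traveler = personalize(results, ["travel", "travel"])
```

Under this sketch, the news reader sees the protest story first while the traveler sees vacation deals first for the exact same query, which is precisely the divergence Pariser's "Egypt" screenshot experiment exposed.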
He said that the algorithms and the code are making value judgments similar to those of the old gatekeepers: they decide what is essential and necessary for us. But how can they do that when they have no civic sense built in? The internet needs to show us what matters, not just what we like!
The bottom line: search results should show both what is happening five blocks away and what is happening 5,000 miles away.
He stressed that these algorithms must not rely on a narrow definition of relevance; instead, they should surface things that challenge us and really matter. We need to be connected to new ideas and ways of thinking, and we must get out of this personalized bubble.
The Short Speech Led To A Follow Up Discussion
After Eli's short yet thought-provoking speech, moderators Danny and Chris chatted with him.
Danny asked him about commonality between search engines, and Eli said that in some cases the results are the same, but for the majority of searches, they differ.
Danny then asked whether people actually want the same results, and Eli affirmed that they do. Chris followed up with Marissa Mayer's observation that personalization is very subtle and tied to the individual's personal history, asking whether that had changed. Eli answered that the algorithm is complex and no one fully knows what it does or why. He said, “Overall, I think Google undersells how significant it is. I don’t think Google has malicious motives in doing this. They genuinely think personalized search results will get people coming back to their search engine more. I think they also see this as a way to make it more difficult to deliberately game the results.”
Eli also noted that Yahoo personalizes your news headlines based on your profile, and that you never know whether you are inside the personalized bubble or not. He said, “If it was easier to see when and how these filters are being applied, and be able to turn them on and off, it would be easier to stop them from imposing themselves on you.”
Should Google be regulated?
Chris raised this question with respect to the engine personalizing results. Eli responded that the algorithm sets the direction for more than a billion people, yet there is no transparency about how it works. He said, “From a regulatory standpoint, I think there needs to be a reset about the rules on personalization because they were written in 1977.”
Takeaways From The Q&A Session
- Google takes a comparatively ethical approach to user data versus some other companies.
- Google knows a lot about you and can target you accordingly, based on inferences.
- Eli said: “Most people don’t think about the fact that we’re walking around the Internet with a big price tag on our shirts. We are.”
- Google tries to know you by the content you consume, among other things.
- People are unaware of personalization being done by Google.
- Eli said that Google could shed some light on how it goes about personalization without giving all its secrets away.
- People should be given tools to choose whether they want personalization or not!
These takeaways really got the attendees thinking about how much Google knows and how little the users do! Stay tuned to Page Traffic Buzz for more of these stimulating sessions from SMX East New York 2011, Day 2.