Jul 8, 2008 by Navneet Kaushal

For those who are unfamiliar with the name, Priyank Garg is the Director of the Yahoo! Search team, the team responsible for the functionality of Yahoo!'s Web search engine, including crawling, indexing, ranking, summarizing, and spelling functions, along with products for webmasters such as Site Explorer. Recently, Eric Enge of Stone Temple Consulting had the opportunity to interview Priyank Garg. In similar fashion, Eric Enge had also interviewed the head of Google's webspam team, Matt Cutts, last month, who likewise shared his views on link building. Below is the transcript of the interview with Priyank Garg, in its entirety.

Interview Transcript:

Eric Enge: Can you talk a little bit about the role that links play in Yahoo's ranking algorithms?

Priyank Garg: Sure. There's a lot of mythology in the industry sometimes around how links work. Links are a way for us to understand how Web sites and other people on the Web are recognizing other content that they have come across. The anchor text from that indicates context for the content that it's linking to, and we have used this information in our algorithms for many, many years to better address search queries as they come into the search engine.

So links are important, but anchor text is just as important. What we look for are links that would be naturally useful to users in context, and that add to their experience browsing the Web. Links of that nature are organic: they survive because users come across them and find them interesting. Those are the kinds of links that we are trying to recognize, identify, and attribute to the target content.

Eric Enge: Right. So part of what you are pointing at there is that relevance matters a lot. So getting a link from the bottom of a WordPress template that you create and distribute is completely irrelevant.

Priyank Garg: Exactly, that's the kind of thing we are trying to detect all the time. Irrelevant links at the bottom of a page, which will not be as valuable to a user, don't add to the quality of the user experience, so we don't count them in our ranking. All of those links might still be useful for crawl discovery, but they won't support the ranking. That's what we are constantly looking at in our algorithms. I can tell you one thing: over the last few years, as we have been building out our search engine and incorporating lots of data, the absolute percentage contribution of links and anchor text to our ranking algorithms has gone down somewhat.

New sources of data and new features that Yahoo! has built and developed have made our ranking algorithm better. Consequently, as a percentage contribution to our ranking algorithm, links have been going down over time. I believe that is somewhat attributable to people abusing links on the Web. As that happens, the net quality of links goes down, and the net contribution directly goes down too. However, we're still working hard to make sure all the high-quality links are effective in providing us information we need to search on queries.

Eric Enge: Right. I understand, but the links are still a very significant factor even now.

Priyank Garg: Yes. They continue to be a very significant factor.

I'm saying that site owners should think about their site in terms of all aspects of the user experience, and not obsess about links as the only thing that drives traffic to them. Links are a critical factor; good organic links, earned through great content and great value, will add to a site's visibility on search engines. But site owners can do a lot of things in parallel that will improve their visibility both in search engines and beyond search.

Eric Enge: Right, I understand. So just to step back a second to the NoIndex, NoFollow, and robots type stuff. The notion has been discussed in many circles on the Web of what people call link juice sculpting: using tools like the NoFollow attribute a little more explicitly to show which links you think are important versus which ones you don't. A classic example is a Web site where the contact us, about us, and legal disclaimer pages are linked to from every page of the site. What are your thoughts on that kind of sculpting?

Priyank Garg: It's interesting that this discussion is described in that context. A NoFollow tag creates an alternative state of attribution, but if you think about it, it's not very different from not linking to those pages. When you link to a page, you are saying something about it. When you don't link, that's also an implicit comment, either you didn't know about the page, or you didn't think it was useful.

So if you think about link juice sculpting, this targeting of link attribution existed even before the NoFollow tag, where you could link and you could not link to something. Now you have an intermediate stage such that:

  1. you can link without NoFollow
  2. you can link with NoFollow
  3. you can not link.

So, it's not something that is entirely out of the blue, it's just an intermediate stage that's created; and it's not anything terribly new. You should always make sure you link to content that's useful to users and if you link to the right content, that will work best.
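The three states Garg describes can be illustrated in plain HTML. This is a sketch; the URLs and anchor text are placeholder examples, not from the interview:

```html
<!-- 1. A plain link: passes anchor text and link attribution to the target -->
<a href="/products.html">Our product catalog</a>

<!-- 2. A link with the NoFollow attribute: still visible and clickable for
     users, but rel="nofollow" tells search engines not to treat it as an
     editorial endorsement of the target page -->
<a href="/legal-disclaimer.html" rel="nofollow">Legal disclaimer</a>

<!-- 3. The third state is simply not linking to the page at all -->
```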

One of the things Yahoo! has done is look for template structures inside sites so that we can recognize the boilerplate pages and understand what they are doing. As you can expect, a boilerplate page like a contact us or an about us page is not going to get a lot of anchor text from the Web outside of your site. So there is a natural targeting of links to your useful content.

We are also performing detection of templates within your site, and the feeling is that this information can help us better recognize the links that are valuable to users. We do that algorithmically, but one of the things we did around this time last year was launch the robots-NoContent tag, a tool webmasters can use to identify parts of their site that are not actually unique content for that page, or that may not be relevant for the indexing of the page.

If you have ads on a page, or navigation that's common to the whole site, you can take more control over our efforts to recognize templates by marking those sections with the robots-NoContent tag. That is a clear indicator to us that, as the webmaster who knows this content, you are telling us this part of the page is not its unique main content, and that we shouldn't recall this page for those terms.
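In markup, this works by adding the class value Yahoo! documented, `robots-nocontent`, to the elements that hold boilerplate. The page structure below is an illustrative sketch, not an example from the interview:

```html
<body>
  <!-- Sitewide navigation: identical on every page of the site, so it is
       marked with Yahoo!'s robots-nocontent class to exclude it from the
       terms this page is indexed for -->
  <div class="robots-nocontent">
    <a href="/">Home</a> | <a href="/about.html">About Us</a> |
    <a href="/contact.html">Contact Us</a>
  </div>

  <!-- The unique main content of the page is left unmarked -->
  <div>
    <h1>Widget 3000 Review</h1>
    <p>Our in-depth look at the Widget 3000...</p>
  </div>
</body>
```

Note that robots-NoContent was a Yahoo!-specific mechanism; other engines of the era did not honor it.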

That kind of mechanism is something we provide as a control for site owners to be more explicit about which parts of the page are boilerplate. But NoFollow links are not very different from not putting in the link at all, so I don't see this as very different in terms of the tools available to webmasters.

Over at WebmasterWorld, there are several posts regarding this interview. Here is a selection of notable posts:

"This is the most detailed disclosure I've seen from Yahoo – kudos to both Eric and Priyank for a very rich interview. I appreciated these comments about spam detection very much. Also, some nice information about paid links – Yahoo doesn't automatically discount paid links if the rest of the algo shows that those links add to the user experience."

"Very impressive interview.

This clears up a lot of the link questions with Yahoo and provides more insight into what factors are important.

Kudos to Eric and Priyank for a great interview."

"I found his answers on the duplicate content issue a bit disappointing. It seems they do not penalize sites for mass duplication/stealing of content, it only gets rated a bit lower in results (maybe).

I hate when search engines claim they support DMCA violation requests when we all know the system is broken unless you have tons of money to go after the culprits. I've filed numerous DMCA violations with both Google and Yahoo, only to have the spam site that was stealing my pages deny it. Google and Yahoo then simply believe them, and say I have to get a court judgment before they will believe me.

I filed a report documenting 6500 copied pages from my site. I included my original page and their stolen copy of my page. Their pages actually all had the same dates on them. Mine were spread over 6 years. You would think it would be easy to tell which was the original.

Well the site never got penalized in either search engine, and to this day some of our stolen content ranks them HIGHER than us for some queries.

If sites can get away with that kind of blatant theft and spamming, then the system is broken. Don't even publicize that you support DMCA. It is ridiculous."

Navneet Kaushal

Navneet Kaushal is the founder and CEO of PageTraffic, an SEO Agency in India with offices in Chicago, Mumbai and London. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet's expertise has established PageTraffic as one of the most awarded and successful search marketing agencies.