Super Session: Search Engines and Webmasters, aka The Search Engine Smackdown: PubCon Las Vegas, Day 3

Nov 14, 2008 | 1,974 views | by Navneet Kaushal

Blackhat SEO will continue to adopt increasingly illegal marketing methods, so SEOs must decide on their own risk tolerance. Google will keep communicating about its efforts and providing tools to help webmasters deal with blackhat SEO problems. All of these aspects were discussed in the day 3 sessions at PubCon.


  • Brett Tabke


  • Matt Cutts, Software Engineer, Google Inc.
  • Sean Suchter, VP, Yahoo! Search Technology Engineering, Yahoo!
  • Nathan Buggia, Live Search Webmaster Central, Lead Program Manager, Microsoft

Nathan Buggia, Lead Program Manager, Microsoft, talked about the state of Live Search and its importance to publishers. Delivering the best search results, simplifying key tasks and coming out with innovative business models are the themes of Live Search. He discussed some important factors of SEO while focusing on the strategies of his particular search engine:

Best search results:

It is all about relevance, and we have made a lot of progress here. We have been tracking this for more than four years and still find ourselves in the same ballpark as Yahoo! and Google: better on some queries, not up to the mark on others. Freshness and depth of content play a foremost role in this.

Specific improvements:

Improving crawling performance is necessary: servers should receive less load while the crawler does a more effective job. If resources are gzip-compressed, less bandwidth is required.
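The bandwidth point can be illustrated with a quick sketch using Python's standard library (the sample page below is made up): a typical repetitive HTML page shrinks dramatically under gzip.

```python
import gzip

# A made-up, repetitive HTML page standing in for real crawled markup.
page = ("<html><body>"
        + "<p>Sample paragraph served to the crawler.</p>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(page)

# The crawler transfers far fewer bytes for the same content.
print(len(page), "->", len(compressed))
```

Decompressing restores the page byte for byte, so the saving is pure transfer cost.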

Standardization of REP rules:

REP stands for the Robots Exclusion Protocol. When the rules are shared, it becomes easy for a publisher to specify his crawling policies once for all search engines. The major engines have already adopted this common set of rules.
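Shared REP semantics mean the same robots.txt is interpreted consistently by every crawler. A minimal sketch using Python's standard `urllib.robotparser` (the paths below are invented; note that this parser applies the first matching rule, so the Allow exception is listed before the broader Disallow):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: one Allow exception inside a disallowed directory.
rules = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "http://example.com/private/secret.html"))       # False
print(rp.can_fetch("*", "http://example.com/private/public-page.html"))  # True
print(rp.can_fetch("*", "http://example.com/index.html"))                # True
```

Any crawler honoring REP would make the same fetch/skip decisions from this file.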

Microsoft is investing in sitemaps, which can be hosted anywhere. On the one hand this gives publishers maximum flexibility; on the other, it helps the engine understand canonicalization issues.
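Hosting a sitemap on another host works through the `Sitemap:` directive of robots.txt: the robots.txt on the crawled site simply points at wherever the sitemap lives (all domains below are placeholders):

```text
# robots.txt on http://www.example.com/
User-agent: *
Sitemap: http://sitemaps.example-cdn.com/www-example-com/sitemap.xml
```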

Crawling capacity has also increased significantly. Beyond algorithmic improvements, best search results are just as much about providing tools, so Microsoft offers troubleshooting tips. At one point the team compiled a list of the top issues Live Search faced while crawling websites; the big ones turned out to be 404 errors, too many URL parameters, pages blocked by robots.txt and unsupported content types. Reporting and filtering for these was added later, and Microsoft will be launching its new malware feature next week.

Every page is scanned and examined for anything that spawns a malicious process. The critical pages are flagged, and users cannot click on them in Live Search. Publishers can easily find their own affected links in the tool, as well as a list of infected outbound links. Microsoft also provides tools around ranking, with information on static rank, dynamic ranking within sites, backlinks and penalties.

The adCenter Excel keyword research tool gives users access to an API that provides keyword data for Live Search, such as demographic and monetization information.

Simplify key tasks:

People come to search engines with navigational queries, or sometimes without knowing exactly what they want. Search engines now provide richer media alongside the blue links. It can also happen that deeper pages are related to the topic but not to the search experience. If you are a publisher, this widens the scope for reaching customers with specific content, which can be video or structured content like products, reviews and information about your website.

Innovation in the business model:

Innovation in the business model covers the Cashback/adCenter scenario as an answer to Google. Project Silk Road consolidates these efforts to increase engagement and traffic, and to surface insights into your website's performance.

Within this comes the Live Search API. A lot of API publishers were asked what they needed in an API, and the answer was that they wanted control over the results. New features were therefore included: you can reorder the results, skin results and ads to match your website or a particular application, and filter out any of the roughly 300 ad providers that do not make sense for you.

Some technical aspects of API were also changed to meet business needs like:

  • The query limit has been removed; queries are now unlimited.
  • You can adjust dynamic ranking by freshness, accuracy and so on.
  • Many previously locked-down content types, such as web, news, images, Encarta answers and spelling suggestions, are now accessible.
  • All the standard protocols were implemented: REST, JSON, RSS and SOAP, so developers can use the API from their own applications.
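As a sketch of what a REST/JSON call looked like, the snippet below only builds the request URL with Python's standard library. The endpoint and parameter names follow Live Search API conventions of that era as best recalled, so treat them as assumptions, and the AppId is a placeholder, not a real key:

```python
from urllib.parse import urlencode

# Endpoint and parameter names are assumptions modeled on the
# Live Search API of this period; the Appid value is a placeholder.
BASE = "http://api.search.live.net/json.aspx"

params = {
    "Appid": "YOUR_APP_ID",
    "query": "pubcon las vegas",
    "sources": "web news image",  # several source types in one request
    "web.count": 10,
}

url = BASE + "?" + urlencode(params)
print(url)
```

The JSON, RSS and SOAP flavors differed only in the endpoint and response format, not in this basic query structure.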

Sean Suchter, VP of Search Technology Engineering at Yahoo!, was the second speaker. He said Yahoo! is trying hard to get rid of the ten blue links. With three players dominating the market, neither site owners nor searchers can exert much influence, so Yahoo! is trying to address that.

A Search Assist feature is being developed to help users formulate the best possible search queries. Yahoo! is moving from a "to do" to a "done" state: reducing frustration, getting users straight to the answers, structuring information directly from the web and more.

One current example is music player integration, which shows "Play the web" in Yahoo! search.

Yahoo! is trying to create a community around search, much like PubCon, and plans to set up incentives for everyone, i.e. Yahoo! and end users, including ideas that come from outside the company. It is moving from a simple presentation to a more structured presentation that better fits the task the user is trying to accomplish. This benefits site owners, since users get right to the answers. For the incentives to work, traffic should increase in quality, not just in raw click-throughs, which in turn increases loyalty and engagement.
He showed a SERP illustrating many of these initiatives: rich media modules (video and headlines), deep links, and news federation.

The SearchMonkey ecosystem is a big hit, with People magazine, Wikipedia, Trulia, WebMD and other properties utilizing it.

BOSS is another big Yahoo! initiative: building an open search service. Becoming a principal search engine is not an easy task, as you need hardware, data and more. Therefore the stack is being opened up completely so people can interact with it by handling queries, crawling and using it directly. The ultimate goal is a high-quality search experience that is relevant, comprehensive, fresh and well presented.
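As a sketch of what "using it directly" meant for developers, the snippet below builds a BOSS-style web search URL with Python's standard library. The endpoint path and parameter names are recalled from the BOSS v1 era and should be treated as assumptions; the appid is a placeholder:

```python
from urllib.parse import quote, urlencode

# BOSS v1-style endpoint, assumed from memory; appid is a placeholder key.
query = "search engine smackdown"
base = "http://boss.yahooapis.com/ysearch/web/v1/" + quote(query)
url = base + "?" + urlencode({"appid": "YOUR_APP_ID",
                              "format": "json",
                              "count": 10})
print(url)
```

The point of the design was that reordering, filtering and re-presenting the returned results was explicitly allowed, unlike traditional search APIs.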

Matt Cutts was the third and last speaker of this much-awaited session. He spoke on the State of the Index, covering what happened in 2008 and what to expect in 2009.

Google Chrome is a wickedly fast browser, and Google Android is an open-source operating system which, according to Google, still needs some improvement. Other efforts, like machine translation and voice recognition, need to become more personalized, along with universal/blended search.

There are some really cool features, like the 2001 search index, voice chat in Gmail and the ability to track the flu through Google. But what Google has done for webmasters is less glamorous. It takes PDFs that are images and runs OCR on them. It crawls Flash better by pulling text out of the different transitions of Flash files.
Google has also improved its handling of keyword spam and gibberish, and provides extra tools to users. A few other things Google has done include:

  • Advanced segmentation in Google Analytics
  • Google can now reindex up to 10 pages within 24 hours
  • Webmaster APIs for hosters, plus GData
  • A translation gadget you can add to your website

On webmaster communication: Google has held three live chats, with 700 people dialing in to the most recent one. It is blogging more, including more videos and blogs in different languages. Now, if your site has a malware or spam problem, messages will be waiting for you in Webmaster Tools even before you register. Google also came out with a 30-page SEO 101 guide yesterday.

Looking at 2009 blackhat trends, it is certain that illegal hacking will become more common. Blackhats have now moved to DNS subdomain hijacking, where your site can be hacked without you even seeing a DNS resolver update.


Navneet Kaushal

Navneet Kaushal is the founder and CEO of PageTraffic, an SEO Agency in India with offices in Chicago, Mumbai and London. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet's expertise has established PageTraffic as one of the most awarded and successful search marketing agencies.