Day two of SMX New York 2011 saw an informative session on using the data collected from Google Webmaster Central and Bing Webmaster Tools. The panelists included:

Moderator:

  • Vanessa Fox, Contributing Editor, Search Engine Land

Q&A Moderator:

  • Erika Mamber, Vice President, Organic Traffic & SEO, Demand Media

Speakers:

  • Duane Forrester, Sr. Product Manager, Bing
  • Vanessa Fox, Contributing Editor, Search Engine Land
  • Tiffany Oberoi, Software Engineer, Google
  • Myron Rosmarin, President, Rosmarin Search Marketing, Inc.

The first speaker to take the stage was Tiffany Oberoi, Software Engineer at Google. Her guiding thought: “If Google can help webmasters make better sites, then Google can have better content in its index for its searchers.”

She gave a list of directions for making the data in Google Webmaster Tools useful:

  1. Look at the Message Center. Google tries to get a message to you when something unfortunate is happening with your site: hacking, malware, the status of reconsideration requests, guideline violations, security updates, and crawl errors. Make sure these messages are forwarded to your email address.
  2. Study your content. This shows what people come looking for and what content you actually offer them, so you know where to match your content to the search queries.
  3. Check the links to your site. Look at the anchor text and at who is linking to you; are they respectable sites? Also check the HTML suggestions under the Diagnostics section to learn about duplicate content issues, meta descriptions, title tags, and non-indexable content.
  4. Submit a Sitemap file to make things easier for visitors and for the bots. Be sure to define your geographic target and your preferred domain.
  5. Set the crawl rate: how often do you want the bots to come?
  6. Tell Google about your URL parameters, such as those on dynamic URLs.
  7. Use the Change of Address tool to move your site with Google's help.
  8. Double-check crawler access: test your robots.txt file, generate a new one, or remove URLs (see the sketch after this list).
  9. Use the Crawl Errors report to identify and fix crawl errors.
  10. See your site as Googlebot sees it.
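
As a reference for the crawler access step above, here is a minimal robots.txt sketch; the domain and the blocked path are placeholders, not examples from the session:

    # robots.txt — a minimal sketch; example.com and /private/ are placeholders
    User-agent: *
    Disallow: /private/

    # Tell crawlers where your Sitemap lives
    Sitemap: http://www.example.com/sitemap.xml

A file like this can be run through the robots.txt testing tool to confirm that Googlebot can still reach the pages you care about before the file goes live.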

Parting Tip:
+1 metrics in Google Webmaster Tools are not a ranking factor as of now.

The second speaker was Duane Forrester, Sr. Product Manager, Bing, who started with a live demo of Bing Webmaster Tools. He said that the dashboard is important: take notice of cliffs and be careful with slumps. For help with Bing Webmaster Tools, you can turn to its blogs, forums, support email, and help pages.

The Traffic Section:
He said that this section shows average impressions, average clicks, and so on, which tell you whether your performance for a keyword is improving. He emphasized the Crawl tab as the place to find the number of pages crawled and the crawl errors, and insisted on submitting a clean sitemap. He said, “Make sure your URLs are clean; there is a 1% threshold. You can control crawl settings based on time of day. There is also a check box to tell Bing if there are AJAX and HTML5 hashtag URLs on the site. You can block and remove URLs too.”
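
For context on that check box: assuming Forrester meant the “#!” (hashbang) convention from the AJAX crawling scheme, a crawler rewrites such URLs into an “_escaped_fragment_” form that the server can answer with static HTML. A hypothetical example:

    URL the browser sees:        http://www.example.com/#!page=products
    URL the crawler requests:    http://www.example.com/?_escaped_fragment_=page=products

The server's job is to return the fully rendered content at the rewritten URL, so that the AJAX page can be crawled and indexed.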

He wrapped up his session by noting that you can submit up to 10 URLs per day and up to 50 URLs per month directly into the Bing index, and they show up in the search results within seconds.

The third speaker, Myron Rosmarin, President, Rosmarin Search Marketing, Inc., took the floor to show how he uses this data. He started with a close look at webmaster tools in terms of their diagnostics, performance, and control panel features. He said, “A performance metric measures how well I am doing, with regard to the number of visits, keywords, etc.”

He said that he uses the diagnostics to determine whether something is wrong. He then spoke about the difference between reporting and analysis: “Reporting is pulling data and making it pretty. Analysis is making sense of that data.”

He then gave his five principles of analysis:

  1. Standalone numbers are useless; you must have something to compare them to, such as the data you collected six months back.
  2. Analyze the data at the right level of granularity. Drill down into the smaller things.
  3. Pull the data at appropriate time intervals. Weekly is a good interval for most sites.
  4. Download all the data you get and store it for future reference.
  5. Do not treat every discrepancy as a problem. Some things may look like a problem but are not.

The last speaker of the day was Vanessa Fox, who started by showing her Google Analytics data and matching it with an export of Webmaster Tools data. She then explained how the two together show impressions, click-through rates, and so on for clusters of keywords and categories of queries. She recommended organizing sitemaps by page template and then tracking indexing sitemap by sitemap (see the sketch below).
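
As a rough sketch of that recommendation, with hypothetical file names, you would split your URLs into one sitemap per page template and list them all in a sitemap index:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One child sitemap per page template; the names are placeholders -->
      <sitemap><loc>http://www.example.com/sitemap-products.xml</loc></sitemap>
      <sitemap><loc>http://www.example.com/sitemap-categories.xml</loc></sitemap>
      <sitemap><loc>http://www.example.com/sitemap-articles.xml</loc></sitemap>
    </sitemapindex>

Because webmaster tools report indexed counts per submitted sitemap, this split shows you which template's pages are falling out of the index.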

She also answered a question about Bing warning webmasters to keep sitemaps clean, saying it is not a hard warning. The larger the sitemap, the more room there is for variation. In any case, Bing does not send an error message that points out trust issues with your sitemap.

The session went on to more Q&A from there. This wraps up our coverage of SMX New York 2011, Day 2. Coverage of the third day is coming up in real time!

Author

Navneet Kaushal is the Editor-in-Chief of PageTraffic Buzz. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet is also the CEO of the SEO services company PageTraffic, one of the leading search marketing companies in Asia.