SES New York 2007 Day 4

Apr 14, 2007 | by Navneet Kaushal

Today we bring you the highlights of some key sessions from Day 4 of SES New York 2007.

Linking Strategies

The presenters in this segment are Justilien Gaspard, Greg Boser, and Jim Boykin. The session is moderated by Detlev Johnson who opens with a few words about the importance of quality link building.

The first presenter is Justilien Gaspard, who owns a link building business and is a known link building pro. Justilien emphasizes the general confusion regarding quality links. He goes on to enumerate some of the better methods of link building, such as directories.

Justilien advises getting links from old directories, as many such directories pre-date the search engines and have proved their worth over time. Such directories are an easy way to enter good neighborhoods.

Niche and vertical directories are under-explored and under-exploited, though they hold immense value; the simplest way to find them is through Google. Local and topical directories, such as your local Chamber of Commerce, are other good options.
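A few illustrative Google queries of the kind he describes can surface such niche directories (the topic keywords here are hypothetical examples, not from the session):

```
real estate intitle:directory
real estate inurl:directory
travel "suggest a site" directory
hiking "add url" directory
```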

He also goes on to explain what makes a good directory. Good directories:

  • Are human edited
  • Have a good number of backlinks and indexed pages
  • Offer static links
  • Do not use nofollow tags
  • Do not sell site-wide links

When submitting to directories, use natural and unique titles and descriptions.

Content is another great way of attracting links. Research keywords, and study your competitors to see their linking strategies. Find out which keywords attract the most links.

Blogs, wikis and forums are other ways of getting quality links. Blogs help in getting links and in promoting you as an expert. Finally, be proactive when it comes to promoting yourself and your content; social media sites and press releases are ways to do exactly that.

Next up on the podium is Jim Boykin, CEO of WeBuildPages. He points out a few strategies that are long dead:

  • Submitting to search engines is one.
  • Meta tags and on-page optimization without backlinks do no good.
  • The Google Toolbar is not a reliable backlink indicator; check your backlinks in Yahoo! instead.
  • Don't interlink your own sites. Google is a registrar now, remember.
  • Link exchanges are dead. They are too visible and don't work.
  • Buying high-PageRank links is of no use anymore either.

If none of these works, what does?

Neighborhoods. Search engines see them as more important than ever. Use the "similar pages" option (Google's related: operator) to see your neighborhood, and see where your competition is getting its links from. Also, link to .gov, .edu and other authoritative sites. Four more strategies that he emphasized are:

  • unique content
  • who links to you
  • who you link to and their neighborhood
  • links that are within the content itself

Greg Boser of WebGuerilla is the next speaker. He emphasizes the issue of TrustRank, which many do not understand: the same links will not be equally good for your site and for a competitor's. Domain age is also a determinant of which links are acceptable.

The TrustRank environment is getting more important. Linkbaiting, though it gets you many links, does not get contextually relevant ones. Domain trust is an area where Google needs to clarify its position in future: it is not infrequent to see a single site rank for varied topics, such as finance and real estate, where disparate hosted pages rank high because of the domain's value.

Link Baiting and Viral Success

The presenters in this session are Rand Fishkin, Cameron Olthius, Jennifer Laycock, and Chris Boggs. The session is moderated by Jeff Rohrs.

Rand Fishkin (you don't know him? off my blog!) is the first speaker of the session. He explains that link baiting is about getting content online that is worth sharing. Once the content is taken care of, the next step is to get it to the "linkerati": bloggers, web journalists and researchers. The linkerati are the most important factor when it comes to highlighting your content.
Though the links that you get through link baiting may not be contextually relevant, they help by raising the overall weight of the domain, much as Wikipedia's does. Rand gives a few examples of successful link baiting from the Drivl and SEOmoz sites.

Some linkbaiting strategies that he shares are tips, best-and-worst lists, humor, controversy, irony, teaching, interviews, product reviews, tools and comprehensive reviews, among others. He suggests some popular sites for linkbaiting, such as Digg, Reddit, Netscape, StumbleUpon and Del.icio.us; you can get hundreds of links by making the home page of any of these sites. He also mentions a number of helpful blogs: Techcrunch, Boingboing, Engadget, Lifehacker, Slashdot, Techmeme, Scobleizer, Daily Kos and The Huffington Post.

Cameron Olthius is the next speaker, and he talks of three kinds of linkbait: content pieces, mash-ups and widgets. While these help in improving rankings, you can also use them for reputation management.

You can monitor your online reputation at social media sites, blog search engines and comment trackers. You track it through URLs, company name, product name, keywords and publicly known figures.

Create profiles on social media sites, such as MySpace, YouTube and Wikipedia. These sites have great global authority and help you rank better.

Jennifer Laycock is the next speaker, elaborating on viral marketing. She says that while link bait is about getting more links, viral marketing is about branding and getting better conversions.

Viral marketing has a few advantages, such as low cost. There are no placement costs as in PPC. It also creates brand evangelists who promote the brand’s credibility. Viral marketing also has a rapid response rate.

The secret of a good viral campaign is the idea. Find out what your customers are passionate about, come up with something that nobody has tried before. Find out how it will benefit users and why they would like to spread it. Viral campaigns spread through thought leaders in the industry. Use existing networks to spread the message.
 
The last speaker of the session is Chris Boggs, who talks about leveraging the community. He talks about the link value being passed around in the search community, and advises finding a forum where a lot of people from the industry get together. This is a great place to pass along links.

Research your backlinks, log files and Technorati to see what people are talking about. He cites the "My 50 favorite blogging resources" list and savetoby.com as successful examples.

CSS, AJAX, Web 2.0 & Search Engines

The presenters in this session are Shari Thurow, Jim McFadyen, Dan Crow, Amit Kumar and Ryan Johnston. The moderator is Danny Sullivan.

Shari Thurow is the first to go. She explains CSS as an addition to HTML that allows webmasters to control presentation parameters such as fonts and link appearance. It is easier for search engines to read and decreases download time too. CSS enables easier control of on-page elements and differentiates between visited and non-visited links.

However, it has certain disadvantages too. For example, end users must have the fonts and typefaces you specify installed, and designers generally prefer fonts that are not installed on all computers. Hyperlinks controlled by CSS can dominate a page, making it look cluttered. CSS can also be used to hide text via hidden layers, although certain hidden layers, such as drop-down menus, are legitimate.
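As a minimal sketch of these points (the selectors and class names are illustrative, not from the session), the same CSS mechanics serve both the legitimate and the abusive cases:

```css
/* Differentiate visited from non-visited links */
a:link    { color: #0000cc; }
a:visited { color: #551a8b; }

/* Legitimate hidden layer: a drop-down menu revealed on hover */
ul.nav li .dropdown       { display: none; }
ul.nav li:hover .dropdown { display: block; }

/* Abusive hidden text of the kind Shari warns against */
.stuffed-keywords { display: none; } /* or text colored to match the background */
```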

Shari says that your robots.txt should not exclude the styles directory from search engine crawlers. CSS should not be used to exploit the search engines.
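In robots.txt terms, that advice amounts to leaving the stylesheet directory crawlable; a minimal sketch, assuming the stylesheets live in a hypothetical /styles/ directory:

```
User-agent: *
# Note: no "Disallow: /styles/" line -- search engines can fetch the CSS
# and verify that it is not being used to hide text.
Disallow: /admin/
```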

Ryan and Jim co-present on AJAX. First of all, it stands for Asynchronous JavaScript and XML. It includes many technologies and is not a programming language itself, and it does not require any downloads or installations. All A-grade browsers support AJAX; the search engines, however, do not, because they do not follow JavaScript. That is why content or links delivered by AJAX are not readable by search engines.

That is why every page on your site needs an HTML version to be indexed by search engines, and all links, likewise, should be in HTML. To test the search engine friendliness of your site, just turn JavaScript off and see if you can still navigate the site. AJAX also disrupts normal user behavior: there is no browsing history, no back button and no separate URLs for different pages, which allows spamming and cloaking to a great degree. The solution is to create unique page IDs using the # sign.
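A minimal sketch of both points, with hypothetical URLs and function names: the link below is plain HTML that crawlers and script-less visitors can follow, JavaScript progressively enhances it into an AJAX call, and each AJAX state gets its own unique page ID via the # sign:

```html
<!-- A real HTML link: still navigable with JavaScript turned off -->
<a href="/products.html" onclick="return loadSection('products');">Products</a>
<div id="content"></div>

<script type="text/javascript">
function loadSection(id) {
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById('content').innerHTML = xhr.responseText;
      window.location.hash = id; // unique page ID, e.g. /index.html#products
    }
  };
  xhr.open('GET', '/fragments/' + id + '.html', true);
  xhr.send(null);
  return false; // suppress normal navigation only when JavaScript is on
}
</script>
```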

They give two examples: one of very good AJAX use, Amazon, and one of very bad use of AJAX, Gucci.

The Gucci site is all images and AJAX; if you turn JavaScript off, you don't see anything, as there are no basic HTML pages. Amazon, on the other hand, works equally well without JavaScript.

The next speaker, Dan Crow of Google, says that Google is soon going to index JavaScript, AJAX, CSS and Flash. The search engine is interested in such technologies, so don't take anything for granted while building your website.

Amit Kumar of Yahoo! says that search engines will not stop webmasters from designing websites for users; it is up to the search engines to index new technologies. AJAX not allowing URLs to change is a problem for search engines, as they can't distinguish which exact URL is receiving an incoming link.


Navneet Kaushal

Navneet Kaushal is the founder and CEO of PageTraffic, an SEO Agency in India with offices in Chicago, Mumbai and London. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet's expertise has established PageTraffic as one of the most awarded and successful search marketing agencies.