Duplicate Content and Multiple Site Issues – SES San Francisco 2010 Day 3

Aug 20, 2010 | by Navneet Kaushal

The second session of Search Engine Strategies San Francisco 2010, Day 3, focused mainly on what duplicate content is and the issues related to duplicate content and multiple sites. The panelists were:


  • Adam Audette, President, AudetteMedia, Inc


  • Shari Thurow, Founder & SEO Director, Omni Marketing Interactive
  • Kathleen Pitcher, Senior Manager, Acquisitions Marketing, Pogo.com/Electronic Arts, Inc.
  • Michael Gray, Owner, Atlas Web Service

Shari Thurow, the first speaker of the session, explains what duplicate content is, why it poses a problem, how search engines determine duplicate content, and finally gives an overview of recommendations and solutions.

She further says that duplicate content doesn't mean an exact match; the search engines are looking for resemblance. A site can receive more traffic if duplicate content is managed well: by removing or managing duplicate content, a site can drive more visitors, searches and possibly conversions over the next few months. Shari also focused on the negative side of having duplicate content on a site. If a site delivers duplicate content, the page you care about can be filtered out because a duplicate version of it ranks instead, which means that your best-converting page may not be the one available to rank.

Shari further gives an example of a site where Google crawled and indexed the site's internal search results pages, causing those results pages to rank. The site's money pages didn't rank, leading to many problems. When they modified their robots.txt file to exclude the site-search results pages from crawling, the money pages soon began to rank and the problem was resolved. She also mentions a quick duplicate-content checklist, given below:

  • Boilerplate templates
  • Linkage properties (inbound and outbound links)
  • Host name resolution
  • Shingles (word sets)

Shari also talks about how to deal with duplicate content. Below are some of the ways:

1. Build a good site in the first place, with good information architecture, site navigation and page interlinking

  • Are URLs linked to consistently throughout the site or the site network?
  • Are the links labeled consistently?

2. Robots.txt

  • You can prevent your duplicate pages from even being crawled
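To make this concrete, here is a minimal robots.txt sketch along the lines of the fix Shari described, assuming the internal site-search results live under a hypothetical /search/ path:

  User-agent: *
  Disallow: /search/

In the example Shari gave, excluding the internal search results pages from crawling let the site's money pages rank again.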

3. Robots meta tag

  • If articles are shared across your network of sites, are you implementing NOINDEX, NOFOLLOW appropriately?
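A minimal sketch of the robots meta tag, placed in the <head> of a page (for instance a syndicated copy) that you do not want indexed or followed:

  <meta name="robots" content="noindex, nofollow" />

Unlike a robots.txt exclusion, the page must still be crawlable for the engine to see this tag.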

4. Canonical tag

  • Placed in the <head> of each duplicate page, pointing at the preferred URL, e.g. <link rel="canonical" href="http://www.example.com" />

5. Redirects (301)
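A minimal sketch of a 301 redirect, assuming an Apache server with mod_rewrite enabled, consolidating the bare domain onto the www host (example.com is a placeholder):

  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
  RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

A permanent redirect like this sends visitors and link value to a single preferred URL instead of splitting them across duplicates.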

6. NOFOLLOW attribute
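The attribute goes on individual links; a sketch, with a placeholder URL pointing at a duplicate printer-friendly page:

  <a href="http://www.example.com/print-version/" rel="nofollow">Printer-friendly version</a>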

7. Web search engine webmaster tools

8. Sitemap (XML)
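A minimal XML sitemap sketch listing only the preferred versions of your URLs (example.com and the paths are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/</loc>
    </url>
    <url>
      <loc>http://www.example.com/money-page/</loc>
    </url>
  </urlset>

Submitting only the canonical URLs in the sitemap is one more consistency clue for the engines, which echoes Shari's closing point.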

While concluding, she said that maintaining consistency is important: provide the search engines with clear clues about which pages you want to appear in search results, and they are much more likely to pick the right pages.

Next up is Kathleen, who begins by talking about the two types of duplicate content: naughty and nice. According to her, there are often legitimate reasons why a site might have duplicate content. For example, www.site.com, site.com and site.com/index.html can all serve the same page; printer-friendly pages and blog category and tag pages are another type; and syndicated content can also create duplicates. These are all legitimate reasons for having duplicate content, and you aren't meaning to do evil with it. Naughty duplicate content, on the other hand, is when you knowingly or unknowingly multiply your content, by putting it across different domains or stealing another well-ranked site's content, in order to inflate the amount of content you have and rank better.

Kathleen says that the consequences of publishing duplicate content are not that severe. You won't be completely blacklisted; rather, your site's visibility in the search engines will decrease for certain words, or you may see that a particular page you don't consider a good landing page gets more traffic than the real, user-friendly version of the page. This can affect your conversion rate.

Kathleen's five best practices for dealing with duplicate content:

1. Determine if you have it

  • Is the same info located in multiple places on your site?
  • Is every page valuable?
  • Naughty or nice?

2. Leverage resources

  • Talk to other departments in your company
  • Consult with your agency
  • Research industry sites
  • Review webmaster forums
  • Talk to industry peers

3. Be proactive

  • Write unique page content
  • Identify authority pages
  • Be aware of engine updates
  • Manage syndicated content

4. Manage syndicated content effectively

  • Allow ample time for your original content to be indexed before giving it to other sites
  • Require links back
  • Require condensed versions
  • Use generic Meta data

5. Don’t freak out

  • There is no specific penalty
  • There are legitimate reasons
  • Duplicate content may not be naughty
  • There are usually multiple solutions

Michael was the penultimate speaker of the session. He started by saying that sometimes, instead of avoiding duplicate content, it can be used as a weapon. According to Michael, if you syndicate your content, most of the time it will be republished whole and not modified at all. This can be used as an opportunity to get links: if your content gets published on a trusted source with more authority than your site, it's okay to let that site take "ownership" of the content just to get the value of the link back to your site.

Michael says that with some of your duplicated content you can modify the meta information, which allows the copies to rank differently for the same content.

Duplicate content can sometimes be used to gain visibility. Michael gives the example of Jason Calacanis of Mahalo, who writes articles designed to cause reactions. In one case he wrote an article and waited for people to reprint it, just to get links back to Mahalo from large news and authority sites.

Below are some of the reasons why Michael loves scrapers:

  • Many web scrapers search for keywords and leave the links in the content intact
  • This can be used to your advantage by linking to your own pages with focused, high-value keyword anchor text
  • These are mostly low-value links, so offset them with some mid- and high-level links, and make sure the scraped content carries links back to the original website
  • To earn different credit over time, change the anchor text, links and surrounding text every 3-4 months


Michael's closing recommendations:

  • Look for opportunities to syndicate your duplicate content to gain attention, exposure and links from trusted sites
  • Refine the copy so that it targets more keywords
  • Allow your blog and RSS feed to be syndicated, with self-referencing and keyword-focused links to commercial pages
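As an illustration of that last point, here is a hedged sketch of the kind of self-referencing footer that could be appended to each post in a syndicated feed (the URLs and anchor text are placeholders):

  <p>This article first appeared on <a href="http://www.example.com/original-post/">Example Site</a>.
  Read more about <a href="http://www.example.com/blue-widgets/">blue widgets</a>.</p>

When scrapers or syndication partners republish the feed verbatim, those links travel with the content and point back to the original pages.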

Navneet Kaushal

Navneet Kaushal is the founder and CEO of PageTraffic, an SEO Agency in India with offices in Chicago, Mumbai and London. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet's expertise has established PageTraffic as one of the most awarded and successful search marketing agencies.