The problem arises when a product belongs to more than one category, and a parameter must be passed in order to build the breadcrumb programmatically. Without breadcrumbs, the parameter would be unnecessary, and the user would presumably navigate back to the category page using the back button. But now you have a problem: if every product is in an average of three categories, you have three essentially duplicated pages for every product on the site.
The only difference on your pages is the breadcrumb:
Home > products > fortune cookies > frosted fortune cookie
Home > products > novelty cookies > frosted fortune cookie
This creates a particularly sticky problem. The benefits of friendly site navigation cannot be denied, but breadcrumbs can also cause duplicate content issues, a topic that we feel is largely ignored by the SEM community. In general, any tracking variable in a URL that does not effect a change in the page content creates duplicate content.
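A minimal sketch of the pattern may make this concrete. The URL scheme and function names below are hypothetical, but they show how a category parameter in the URL produces one distinct URL, and therefore one near-duplicate page, for every category a product belongs to:

```python
# Hypothetical illustration: a breadcrumb built from a category parameter
# in the URL. Each category a product appears in yields a distinct URL,
# and each resulting page differs only in its breadcrumb.

def build_breadcrumb(product, category):
    """Return the breadcrumb trail for a product viewed via a category."""
    return " > ".join(["Home", "products", category, product])

product = "frosted fortune cookie"
categories = ["fortune cookies", "novelty cookies"]

for category in categories:
    # The URLs differ only in the category parameter...
    url = f"/product?name={product}&category={category}"
    # ...and the pages differ only in the breadcrumb they display.
    print(url)
    print(build_breadcrumb(product, category))
```

With three categories per product on average, this loop would emit three near-identical pages per product, which is exactly the duplication described above.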
There are ways to cope with this issue. The following sections present the ways you can address the duplicate content issues associated with breadcrumbs, if you do want to address them.
Using One Primary Category and robots.txt or Meta-Exclusion
This approach involves designating one of the categories a product falls into as “primary,” which requires adding a field to the application’s database to indicate that status. This is the idea espoused by Dan Thies in his SEM book The Search Engine Marketing Kit as well. The upside is that it’s bulletproof, in that you will never be penalized by a search engine for having duplicated pages. But there are two downsides:
1. Very often the keywords from your products placed in multiple categories (in the title, perhaps under the breadcrumb, or in the “suggested” products) may yield unexpected rankings for what we call “permutation” keywords. Obviously, with this solution, you get only one of the permutations: the primary one.
Example: Assume a cake is in two categories: “birthday” and “celebration.” The resulting titles are “Super Cheesecake: Birthdays” and “Super Cheesecake: Celebration.” If the webmaster picks “birthday” as the primary category, a search engine will never see the other page that may rank better for the hypothetical less-competitive keywords “celebration cheesecake,” because that page is excluded via robots.txt or meta-exclusion.
2. Users may passively “penalize” you by linking to the non-primary page. A link to an excluded page has questionable link-value, arguably none for that page, but perhaps also none for the domain in general.
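The primary-category approach might be sketched as follows. The product record’s "primary" field is the assumed database addition described above, and the helper is hypothetical; the point is simply to emit a robots meta exclusion whenever a product is viewed through a non-primary category:

```python
# Hypothetical sketch: exclude non-primary category permutations with a
# robots meta tag. The "primary" field on the product record stands in
# for the database field described in the text.

def robots_meta(product, requested_category):
    """Return a robots meta tag for the page, or '' if none is needed."""
    if requested_category != product["primary"]:
        # Non-primary permutation: ask engines not to index this copy.
        return '<meta name="robots" content="noindex">'
    return ""  # Primary category page: indexable as usual.

cheesecake = {
    "name": "Super Cheesecake",
    "categories": ["birthday", "celebration"],
    "primary": "birthday",
}

print(robots_meta(cheesecake, "birthday"))     # primary: no exclusion
print(robots_meta(cheesecake, "celebration"))  # non-primary: excluded
```

A robots.txt-based exclusion would work the same way conceptually, except the non-primary URLs would be listed as Disallow patterns rather than tagged per page.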
Changing Up the Content on the Various Permutations
Done right, this can also work, but it must be done carefully. If done inadequately, it can, in the worst
case, result in a penalty. You could change the description of a hypothetical product on a per-category
basis, or add different related products to the page. As you may have guessed, the exact threshold for
how much unique content is required is elusive. Use this technique with caution.
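One way to differentiate the permutations, sketched here with entirely hypothetical data, is to vary the description and the related-product list by category. How much variation is “enough” is not documented by any search engine, so treat this strictly as an illustration of the idea:

```python
# Hypothetical sketch: vary page content per category permutation so the
# copies of a product page are less alike. All data below is invented.

DESCRIPTIONS = {
    "birthday":    "The perfect centerpiece for any birthday party.",
    "celebration": "Make any celebration sweeter with this cheesecake.",
}

RELATED = {
    "birthday":    ["birthday candles", "party hats"],
    "celebration": ["champagne truffles", "gift wrap"],
}

def render_product_page(name, category):
    """Assemble category-specific copy for one product permutation."""
    parts = [
        f"{name}: {category.title()}",           # title permutation
        DESCRIPTIONS[category],                  # per-category description
        "Related: " + ", ".join(RELATED[category]),
    ]
    return "\n".join(parts)

print(render_product_page("Super Cheesecake", "birthday"))
print(render_product_page("Super Cheesecake", "celebration"))
```

Because the description and related products differ, the two permutations share less text than breadcrumb-only variants, though whether that clears the (unknown) duplicate-content threshold remains a judgment call.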
Chapter 5: Duplicate Content