This is probably the most common approach employed by dynamic web sites at present, as evidenced by how frequently you meet URLs such as the following:
http://www.example.com/catalog.php?cat_id=1
http://www.example.com/catalog.php?cat_id=2&prod_id=3&ref_id=4
This approach is certainly the easiest and most straightforward when developing a dynamic site. However,
it is frequently sub-optimal from a search engine spider’s point of view. It also doesn’t provide relevant
keywords or a call to action to a human viewing the URL.
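On the server side, this approach requires nothing more than reading the query string parameters and building the page accordingly. The following is a minimal sketch of such a script; the catalog.php name and the cat_id and prod_id parameters come from the example URLs above, while the output itself is a hypothetical placeholder.

<?php
// Minimal sketch of a dynamic catalog script (hypothetical implementation);
// the cat_id and prod_id parameters come from the example URLs above.
$cat_id = isset($_GET['cat_id']) ? (int) $_GET['cat_id'] : 0;
$prod_id = isset($_GET['prod_id']) ? (int) $_GET['prod_id'] : 0;

if ($prod_id > 0) {
    // e.g. catalog.php?cat_id=2&prod_id=3 displays a single product page
    echo 'Product #' . $prod_id . ' in category #' . $cat_id;
} else {
    // e.g. catalog.php?cat_id=1 displays a category listing
    echo 'Products in category #' . $cat_id;
}
?>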
Some programmers also tend to use extra parameters freely, as shown in the second URL example. For example, if the ref_id parameter is used for some sort of tracking mechanism, and search engine friendliness is a priority, it should be removed. Lastly, any necessary duplicate content should be excluded from search engines’ view using a robots.txt file or a robots meta tag. This topic is discussed in Chapter 5.
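As a brief preview of those two mechanisms (Chapter 5 covers them in detail), the following sketch shows both forms; the /print/ path is a hypothetical example of a duplicate, print-friendly area of a site.

# robots.txt sketch: keep all spiders out of a duplicate print-friendly area
User-agent: *
Disallow: /print/

<!-- robots meta tag placed in the <head> of an individual duplicate page -->
<meta name="robots" content="noindex, follow" />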
Example #2: Numeric Rewritten URLs
An improved version of the previous example is a modified URL that removes the dynamic parameters and hides them in a static URL. This static URL is then mapped to a dynamic URL using an Apache module called mod_rewrite. The ref_id parameter previously alluded to is also not present, because those types of tracking parameters usually can and should be avoided:
http://www.example.com/Products/1/
http://www.example.com/Products/2/1/
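A mod_rewrite rule set that maps these static-looking URLs back to the dynamic script might look like the following sketch, placed in an .htaccess file. It assumes the catalog.php script from the earlier example, and it assumes the first path segment corresponds to cat_id and the second to prod_id.

# Hypothetical .htaccess sketch: translate numeric rewritten URLs into
# the equivalent dynamic catalog.php URLs behind the scenes
RewriteEngine On
# http://www.example.com/Products/1/ -> catalog.php?cat_id=1
RewriteRule ^Products/([0-9]+)/$ /catalog.php?cat_id=$1 [L]
# http://www.example.com/Products/2/1/ -> catalog.php?cat_id=2&prod_id=1
RewriteRule ^Products/([0-9]+)/([0-9]+)/$ /catalog.php?cat_id=$1&prod_id=$2 [L]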
The impact of numeric URL rewriting will likely be negligible on a page with one parameter, but on pages with two or more parameters, URL rewriting is desirable.
This form of URL is particularly well-suited to the adaptation of existing software. Retrofitting an
application for keyword-rich URLs, as discussed in the next section, may present additional difficulty
in implementation.
It is important to realize that rewriting a dynamic URL only obscures the parameters. It prevents a search engine from perceiving that a URL structure is problematic as a result of having many parameters. If underlying problems still exist on a web site, usually in the form of duplicate content, a search engine may still have difficulty indexing it effectively.
If URLs on your site are for the most part indexed properly, it may not be wise to restructure URLs. However, if you decide that you must, please also read Chapter 4, which teaches you how to make the transition smoother. Chapter 4 shows how to preserve link equity by properly redirecting old URLs to new URLs. Also, not all solutions to URL-based problems require restructuring URLs; as mentioned earlier, duplicate content can be excluded using the robots.txt file or the robots meta tag, which are discussed in Chapter 5.
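For illustration only (Chapter 4 discusses redirects properly), such a transition typically relies on 301 redirects from the old dynamic URLs to their rewritten equivalents. The following sketch assumes the catalog.php URLs and the /Products/ structure used in the examples in this section.

# Hypothetical .htaccess sketch: permanently redirect an old dynamic URL to
# its rewritten equivalent so existing links retain their equity.
RewriteEngine On
# Match the URL the client actually requested (THE_REQUEST), so this rule
# does not loop with the internal rewrites shown earlier.
RewriteCond %{THE_REQUEST} \s/catalog\.php\?cat_id=([0-9]+)\s
# The trailing "?" drops the original query string from the redirect target.
RewriteRule ^catalog\.php$ /Products/%1/? [R=301,L]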