Robots Deliver
We’re going to start with the basics of how the search engines work, and a major component of this is a robot, or spider, which is software that slurps up your site’s text and brings it back to be analyzed by a powerful central “engine.” This activity is referred to as crawling or spidering. There are lots of different metaphors for how robots work, but we think ants make the best one. Think of a search engine robot as an explorer ant, leaving the colony with one thought on its mind: Find food. In this case, the “food” is HTML text, preferably lots of it, and to find it, the ant needs to travel along easy, obstacle-free paths: HTML links. Following these paths, the ant (search engine robot), with insect-like single-mindedness, carries the food (text) back to its colony and stores it in its anthill (search engine database). Thousands and thousands of the little guys are exploring and gathering simultaneously all over the Internet. (See Figure 3.1 for a visual example.) If a path is absent or blocked, the ant gives up and goes somewhere else. If there’s no food, the ant brings nothing back.
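If you’re curious what that looks like in software, here is a minimal crawler sketch in Python: fetch a page, store its text, pull out its links, and follow them. The starting URL, the page limit, and the requests and BeautifulSoup libraries are our own illustrative choices, not a description of how any real search engine is built.

# Minimal, illustrative crawler sketch (not any real search engine's code).
# Assumes the third-party requests and beautifulsoup4 packages are installed.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(start_url, max_pages=10):
    """Follow HTML links from start_url, gathering page text along the way."""
    to_visit = [start_url]      # paths the "ant" still wants to explore
    database = {}               # the "anthill": URL -> page text
    while to_visit and len(database) < max_pages:
        url = to_visit.pop(0)
        if url in database:
            continue            # already gathered this page
        try:
            response = requests.get(url, timeout=5)
        except requests.RequestException:
            continue            # blocked path: give up and go somewhere else
        soup = BeautifulSoup(response.text, "html.parser")
        database[url] = soup.get_text(" ", strip=True)   # carry the "food" home
        for link in soup.find_all("a", href=True):       # new paths to follow
            to_visit.append(urljoin(url, link["href"]))
    return database

The key point of the sketch is the loop: no links, no paths to follow; no text, nothing to bring back.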
Figure 3.1  Search engine robots at work
So basically, when you think of a search engine, you really need to think of a
database that holds pieces of text that have been gathered from millions of sites all
over the web.
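To keep that picture concrete, here is a toy “database” of gathered text and a very naive keyword match, written as our own illustration with made-up URLs; real search engines index and rank results in far more elaborate ways.

def search(database, query):
    """Return the URLs whose stored text contains every word of the query."""
    words = query.lower().split()
    return [url for url, text in database.items()
            if all(word in text.lower() for word in words)]

# Toy collection of gathered text (hypothetical URLs, for illustration only).
database = {
    "http://example.com/candy": "Grape bubble gum and other retro sweets.",
    "http://example.com/cars": "Classic cars restored by hand.",
}
print(search(database, "grape bubble gum"))   # -> ['http://example.com/candy']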
What sets that engine in motion? A search. When a web surfer enters the term “grape bubble gum” into the search engine, all of the sites that might be relevant for that term are pulled from the database.