Many search engines rely on automated programs, known as robots or spiders, to build their databases. These programs continually crawl the web, following links from site to site and indexing the pages they find along the way. Other search engines employ people to compile their databases by hand. Each search engine has its own criteria for deciding how and what to add to its database: some index only the main page of each web site, while others index every page on a site.
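
The link-following behavior described above can be sketched as a breadth-first traversal of a page graph. This is only an illustrative toy: the in-memory `WEB` graph, its URLs, and the simple word-to-page index are all invented here, and a real spider would fetch live pages over HTTP instead.

```python
from collections import deque

# Hypothetical in-memory "web": page URL -> (page text, outgoing links).
WEB = {
    "/home":     ("Welcome to the site", ["/about", "/products"]),
    "/about":    ("About our company", ["/home"]),
    "/products": ("Product catalog", ["/widgets", "/home"]),
    "/widgets":  ("Widget details", []),
}

def crawl(start):
    """Breadth-first crawl: follow links and index each page's text once."""
    index = {}                  # word -> set of pages containing it
    queue = deque([start])
    seen = {start}
    while queue:
        page = queue.popleft()
        text, links = WEB[page]
        # "Index" the page: a simplified inverted index of its words.
        for word in text.lower().split():
            index.setdefault(word, set()).add(page)
        # Follow links to pages not yet visited.
        for link in links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

index = crawl("/home")
print(sorted(index.get("widget", set())))   # pages mentioning "widget"
```

A crawler that indexes only the main pages of sites, as some engines do, would simply stop after the start page instead of enqueueing the links it finds.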