Search engines send out robots (automated software) that read and index web sites. Robots are also called spiders, crawlers, bots, or automatic indexers; Google calls its robot Googlebot. A spider visits a web page, reads it, and then follows links to other pages within the site. The spider revisits most sites frequently to index changes and add new pages. When a user queries a search engine, the engine searches its own database for matching information.
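The visit-read-follow-links cycle above can be sketched as a simple breadth-first crawler. This is a minimal illustration, not how any real search engine is built; the `fetch` callable is an assumption that stands in for a real HTTP client, so the sketch can run against test fixtures too.

```python
from collections import deque
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collects the href targets of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, max_pages=100):
    """Breadth-first crawl: visit a page, record it, follow its links.

    `fetch` is any callable mapping a URL to HTML text (or None on
    failure) -- a hypothetical stand-in for the robot's downloader.
    """
    seen, queue, index = set(), deque([start_url]), {}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        html = fetch(url)
        if html is None:
            continue
        index[url] = html            # "read and index" the page
        parser = LinkParser()
        parser.feed(html)
        queue.extend(parser.links)   # follow links to other pages
    return index
```

Run against a dictionary of fake pages, `crawl("a", pages.get)` discovers every page reachable from the start page, just as a spider discovers new pages through links.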
We use more than 200 signals, including our patented PageRank™ algorithm, to examine the entire link structure of the web and determine which pages are most important. (Source: http://www.google.com/corporate/tech.html)
We then conduct hypertext-matching analysis to determine which pages are relevant to the specific search being conducted.
By combining overall importance and query-specific relevance, we're able to put the most relevant and reliable results first.

PageRank Technology:
PageRank reflects our view of the importance of web pages by considering more than 500 million variables and 2 billion terms.
Pages that we believe are important receive a higher PageRank and are more likely to appear at the top of the search results.
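The core idea behind PageRank can be illustrated with a short power-iteration sketch: a page is important if important pages link to it. This is only a toy model of the published algorithm; the `damping` value of 0.85 is the commonly cited default, and the dangling-page handling here is one simple convention, not Google's production behavior.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank by power iteration.

    `links` maps each page to the list of pages it links to.
    Each round, every page passes a `damping` fraction of its rank
    to the pages it links to; the rest is spread evenly, which keeps
    the total rank summing to 1.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank over all pages.
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank
```

On a tiny three-page web where "a" is linked to by both other pages, "a" ends up with the highest rank, matching the intuition that heavily linked pages float to the top of the results.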