Spiders begin by fetching data from a few web pages, then follow the links on those pages, and the links on those pages, and so on, until a very large portion of the web has been crawled. These pages are stored across millions of computers and amount to millions of gigabytes of data. The pages are sorted by their content and other factors and are kept track of in the index. Several algorithms and programs work together to understand each query and deliver the best possible results.
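The link-following process described above is essentially a breadth-first traversal of the web's link graph. A minimal sketch, using a tiny in-memory "web" (all page names here are hypothetical stand-ins for real URLs):

```python
from collections import deque

# A tiny simulated web: each page maps to the links it contains.
# Page names are hypothetical, for illustration only.
WEB = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com", "d.com"],
    "d.com": [],
}

def crawl(seeds):
    """Breadth-first crawl: fetch the seed pages, then follow their
    links, then the links on those pages, skipping pages already seen."""
    seen = set()
    queue = deque(seeds)
    order = []
    while queue:
        page = queue.popleft()
        if page in seen:
            continue
        seen.add(page)
        order.append(page)          # a real spider would fetch and index here
        for link in WEB.get(page, []):
            if link not in seen:
                queue.append(link)
    return order

print(crawl(["a.com"]))  # visits every page reachable from the seed
```

A real crawler also obeys robots.txt, rate-limits requests per host, and re-visits pages to keep the index fresh; this sketch only shows the traversal idea.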
Algorithms like autocomplete, spelling correction, synonym matching, and query understanding are used to work out what the user actually wants. Using this information, the most relevant pages are sorted based on over 200 factors. These factors include site and page quality, freshness of content, and more.
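Ranking by many factors can be pictured as a weighted scoring of signals per page. A minimal sketch, where the three signals and their weights are illustrative assumptions, not Google's actual factors:

```python
# Hypothetical ranking signals per page (real engines use hundreds).
pages = {
    "page_a": {"quality": 0.9, "freshness": 0.2, "relevance": 0.8},
    "page_b": {"quality": 0.6, "freshness": 0.9, "relevance": 0.9},
    "page_c": {"quality": 0.4, "freshness": 0.5, "relevance": 0.3},
}

# Assumed weights for each signal; the real weighting is proprietary.
WEIGHTS = {"quality": 0.3, "freshness": 0.2, "relevance": 0.5}

def score(signals):
    # Weighted sum of a page's signals.
    return sum(WEIGHTS[k] * v for k, v in signals.items())

# Sort pages from best to worst score.
ranked = sorted(pages, key=lambda p: score(pages[p]), reverse=True)
print(ranked)
```

The point is only the shape of the computation: each factor contributes to a combined score, and the results page is the sorted order of those scores.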
This blends the relevant images, videos, data, and personal content into a single unified search results page. After all these factors are evaluated and a refined list of search results is obtained, the results are displayed to the user. The next stage is fighting off spam. Site owners are notified if their site is marked as spam, and they can then fix the problem. And the best part is that all of this happens in just 1/8th of a second, and Google can support about 100 billion searches each month!