How Do Search Engines Index the Web?
Search engines rely on automated programs called spiders or bots (also known as search engine crawlers) to survey the web and build their data repository. Through these crawlers, a search engine fetches your website's content so its index stays up to date and free of spam.
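The crawl-then-index process described above can be sketched in miniature. This is an illustrative toy, not a real crawler: the pages and URLs below are hypothetical stand-ins for documents a crawler would actually download over HTTP, and the index here is a simple in-memory inverted index (keyword mapped to the URLs that contain it).

```python
from collections import defaultdict

# Hypothetical pages standing in for fetched web documents; a real
# crawler would download these over HTTP and follow their links.
PAGES = {
    "https://example.com/": "search engines crawl the web with spiders",
    "https://example.com/seo": "white hat seo keeps a site free of spam",
}

def build_index(pages):
    """Build an inverted index: keyword -> set of URLs containing it."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

index = build_index(PAGES)
print(sorted(index["seo"]))  # → ['https://example.com/seo']
```

Production systems add much more on top of this (link following, politeness rules, deduplication, ranking signals), but the core data structure is the same: a map from terms to the documents that mention them.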
When an end user searches for a particular keyword, the query is checked against the search engine's index, and the best-matching URLs are returned in the search results. To push your site toward the top of the search engine results pages, you need to apply accurate, white-hat SEO methods.
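The lookup step, matching a query against the index, can be sketched as follows. The index contents and URLs are hypothetical, and this toy uses simple AND semantics (a page must contain every query keyword); real engines layer ranking and relevance scoring on top of this retrieval step.

```python
# Toy inverted index (hypothetical data): keyword -> URLs containing it.
INDEX = {
    "white": {"https://example.com/seo"},
    "hat": {"https://example.com/seo", "https://example.com/spam"},
    "seo": {"https://example.com/seo"},
}

def search(index, query):
    """Intersect the URL sets for each query keyword (AND semantics)."""
    sets = [index.get(word, set()) for word in query.lower().split()]
    return set.intersection(*sets) if sets else set()

print(search(INDEX, "white hat seo"))  # → {'https://example.com/seo'}
```

Only pages containing all of the query's keywords survive the intersection, which is why pages whose content closely matches the full query tend to surface in the results.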