What are search engine robots

A search robot is a software agent of a search engine that scans the pages of a website and transmits the collected data back for inclusion in the engine's search databases. This process is called indexing, and the resulting databases are what the search engine draws on to form its results pages in response to a user's query.

In professional circles you will also come across other names: search bot, spider, or crawler.


Search bot features

Bots do not analyze the information they collect and do not evaluate the quality of a site or its pages. Their only task is to read data and add it to the search engine's existing database. In this respect they resemble a courier following a strictly defined route.

Search robots differ in purpose, that is, in the type of content they work with. For example, images are crawled by Googlebot-Image and YandexImages for their respective systems, while Googlebot-News and YandexNews are crawlers for news content.
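Specialized crawlers identify themselves through the User-Agent header of their HTTP requests, which is how a server can tell which bot is visiting. The following sketch is illustrative: the token substrings are real crawler User-Agent identifiers, but the helper function and its mapping are assumptions for this example, not part of any search engine's API.

```python
# Illustrative helper: classify a visitor by the crawler token in its
# User-Agent string. The tokens are real; the function is hypothetical.
CRAWLER_TOKENS = {
    "Googlebot-Image": "Google image crawler",
    "Googlebot-News": "Google news crawler",
    "YandexImages": "Yandex image crawler",
    "YandexNews": "Yandex news crawler",
    "Googlebot": "Google general crawler",   # checked after the specific tokens
    "YandexBot": "Yandex general crawler",
}

def identify_crawler(user_agent):
    """Return a human-readable crawler name, or None for ordinary visitors."""
    for token, name in CRAWLER_TOKENS.items():
        if token in user_agent:
            return name
    return None

ua = "Mozilla/5.0 (compatible; Googlebot-Image/1.0)"
print(identify_crawler(ua))  # Google image crawler
```

Because dictionaries preserve insertion order, the more specific tokens (Googlebot-Image) are matched before the generic ones (Googlebot).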


Indexing Procedure

The functioning of the search system in general terms consists of three main stages:

  • Crawling
    Web resources are not scanned spontaneously. The spiders' actions are programmed and performed in a sequence defined by the system. How attention is distributed, how many pages are scanned, and how quickly data is read all depend on many factors.
  • Indexing
    This is also performed by bots: the collected data is added to the search engine's database.
  • Forming the results page
    Based on the indexed information collected by the robots, the search engine selects links relevant to a query and orders them according to its ranking algorithms.

Index Management

Webmasters interact with search engine robots through the service files robots.txt and sitemap.xml. Using these files and special directives, an optimizer can open pages for indexing or hide them from it. Keep in mind that such instructions are purely advisory: spiders may ignore them partially or completely.
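A well-behaved crawler checks robots.txt before fetching a page. Python's standard library ships a parser for exactly this, which can be shown against a hypothetical robots.txt that blocks all bots from /admin/ while exempting Googlebot (the site and rules are invented for illustration):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: all bots are barred from /admin/,
# but a dedicated group allows Googlebot everywhere.
robots_txt = """\
User-agent: *
Disallow: /admin/

User-agent: Googlebot
Disallow:
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)  # normally rp.set_url(...) + rp.read() fetch it live

print(rp.can_fetch("YandexBot", "https://example.com/admin/panel"))  # False
print(rp.can_fetch("YandexBot", "https://example.com/blog/post"))    # True
print(rp.can_fetch("Googlebot", "https://example.com/admin/panel"))  # True
```

Note that a bot picks the most specific User-agent group that matches it, which is why Googlebot ignores the wildcard rules here; and, as the article says, compliance is voluntary, so these rules keep out only cooperative crawlers.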

You can also influence crawling with special tools designed to speed up the scanning and indexing of new or modified content. In Yandex, for example, you can use the "Crawling via Metrica counters" or "Recrawl pages" tools in Yandex.Webmaster.