Anyone who builds a site sooner or later faces an obvious question: how do you get your resource to appear in a search engine's results for a given query? The answer is simple: you just need to wait until the search engine's robot visits and indexes the pages of your site.
Once the pages of your site are in the search engine's index, they will appear in the SERP.
For most webmasters this is not a serious problem. Their sites rarely exceed a few hundred pages, and the search robot usually indexes the content correctly. Still, there are several important factors to keep in mind during SEO optimization.
Robots typically discover sites in one of two ways.
The first: the owner submits the site's address through a special form on the search engine's site. This notifies the search engine of a new, not-yet-indexed site, and in turn the robot visits the resource and indexes it.
For proper indexing it is enough to submit just the home page; the robot finds the rest through the site map. You upload the site map yourself. Many sites and programs on the Internet will generate one for you free of charge, in .xml or .html format.
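For illustration, a minimal XML site map follows the pattern below (the domain and date are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the robot to find -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```

A generator tool produces a file like this automatically from your site's link structure; you then place it in the site root and, optionally, reference it from robots.txt.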
The second: the robot finds your site on its own. You might wonder how it does that. When indexing a resource, the robot follows the links placed on it. So if some already-indexed site (one indexed by Yandex, for example) links to your site, the search robot will visit and index it within a short time. Experienced webmasters consider this approach much better for the site's organic search performance: get (or buy) a few links and wait for the robot.
Of course, you want to see your website in the search engine's results as soon as possible! Usually the robot indexes pages within two weeks, depending on the search engine's workload. Google's robot is the fastest at indexing: it can begin its crawl within a few hours of a manual submission.
Robots, like website developers, dislike convoluted code. For the most successful indexing, I suggest simplifying it as much as possible and sticking to the following rules:
- First, every page should be reachable in no more than 3 clicks from the home page. Where that is not possible, create a site map. It helps the robot navigate your "jungle".
- Second, be careful when working with scripts. Search engine spiders cannot always execute them. If your navigation relies on scripts, be sure to duplicate the links as plain HTML!
- Third, keep in mind that search engine spiders index up to about 200 KB of text per page. If you want the entire page to be indexed, keep it under 100 KB.
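As a rough sketch of how you might check a page against that size limit (the file name and the 100 KB threshold come from the text above; the helper itself is hypothetical):

```python
import os

def page_size_kb(path):
    """Return the size of a file on disk in kilobytes."""
    return os.path.getsize(path) / 1024

# Hypothetical usage:
# if page_size_kb("index.html") > 100:
#     print("Page may be too large to be indexed in full")
```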
Search engines look for a special file stored in the root of your hosting, called robots.txt. With it you can direct the search robot, for example allowing or disallowing the indexing of particular pages.
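For example, a robots.txt that lets all robots index everything except one directory (the directory name here is a made-up example) looks like this:

```
# Applies to all robots
User-agent: *
# Do not index anything under /private/
Disallow: /private/
```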
Unlike foreign search engines, Russian ones understand the <NOINDEX> tag, which lets you hide parts of a page from the robot. To rule out indexing of a page entirely, place the corresponding tag in the page's HEAD.
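A sketch of both techniques in a generic HTML page (the surrounding markup is illustrative):

```html
<head>
  <!-- Standard robots meta tag: excludes the entire page from indexing -->
  <meta name="robots" content="noindex">
</head>
<body>
  <p>This text is visible to robots that ignore the meta tag above.</p>
  <!-- Yandex-specific tag: hides only the wrapped fragment -->
  <noindex><p>This fragment will not be indexed by Yandex.</p></noindex>
</body>
```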
To keep outdated resources out of the search results, search engines constantly re-index sites, and the entries in their databases are continually updated. You should not be alarmed if the number of indexed pages suddenly changes.
Pay special attention to your choice of hosting. If your website often goes down, indexing suffers: the robot simply cannot reach the resource! So be careful when choosing a hosting service. And do not neglect external links; they help your site get re-indexed.
By analyzing the log files on your server, you can monitor how indexing is progressing.
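A minimal sketch of spotting robot visits in an access log: the idea is to match known robot user-agent strings. The function name and the exact log format are assumptions; adapt them to your server's configuration.

```python
def count_bot_hits(log_lines, bots=("Googlebot", "YandexBot", "bingbot")):
    """Return a dict mapping each robot name to the number of log lines
    whose user-agent string mentions it."""
    hits = {bot: 0 for bot in bots}
    for line in log_lines:
        for bot in bots:
            if bot in line:
                hits[bot] += 1
    return hits
```

Running it over your access log shows, for example, how often Googlebot or YandexBot came by, which is a quick way to confirm the robot actually reaches your site.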