
Glossary entry: Indexing

Definition: What does indexing mean?

"Indexing" refers to the method of understanding information. When people talk about analog indexing they usually mean encyclopedias or telephone directories, but in the world of the Internet, indexing refers to the inclusion of a document in the index of a search engine. Search engines like Google send out crawlers (bots) to search the Web and add relevant documents to the index.


What is an index?

The index is an ordered directory that lists the information gathered through indexing. In the analog world, an index can be found in reference works such as encyclopedias or telephone directories. A well-known digital index is the Google index, which contains all websites that Google has discovered and stored. Once a website has been discovered and stored by Google, it is considered indexed; this process is therefore called indexing. Only indexed web pages are displayed on the search engine results pages (SERPs); pages judged irrelevant are not included in the index and therefore cannot be found via Google search. For online marketing disciplines such as SEO (search engine optimization), the indexing of web pages is essential for optimizing a company's visibility.


How does Google index?

Compared to a static encyclopedia, the Google index is highly dynamic: new web pages are continuously added, existing ones change, and others are removed. Keeping the index up to date therefore requires a great deal of crawling capacity. Crawlers are "visitors" sent out by Google that jump from link to link and in this way work their way through the World Wide Web. This is how completely new web pages are discovered and indexed, removed web pages are dropped from the index, and changed web pages are re-evaluated. For search engine optimization (SEO), indexing is highly relevant, because only indexed websites can achieve good positions.
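
To illustrate the link-to-link principle, here is a minimal toy crawler sketched in Python using only the standard library. The start URL is a placeholder, and a real crawler such as Googlebot is vastly more sophisticated (it respects robots.txt, politeness limits, and much more):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href values of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    """Follow links breadth-first and 'index' every reachable page."""
    frontier = [start_url]   # URLs still to visit
    index = set()            # our toy 'search engine index'
    while frontier and len(index) < max_pages:
        url = frontier.pop(0)
        if url in index:
            continue
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue         # unreachable pages are simply skipped
        index.add(url)       # the page is now 'indexed'
        parser = LinkExtractor()
        parser.feed(html)
        # newly discovered links are queued for later visits
        frontier.extend(urljoin(url, link) for link in parser.links)
    return index

# Placeholder start URL, for illustration only:
print(crawl("https://example.com/"))
```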


Why are websites indexed?

Indexing is a method of making information accessible. Documents are collected and sorted on the basis of keywords. The resulting index is similar to a library: a source for a wide variety of information that can be queried there. The central requirement for an index is to deliver the right information quickly. Search engines such as Google, Yahoo, and Bing face the same requirement, because their users want to find suitable search results as quickly as possible. Indexing is therefore an essential method underpinning the functionality of a search engine.
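
The data structure that typically serves this kind of keyword lookup is an inverted index: a mapping from each word to the documents that contain it. A minimal sketch in Python, with invented sample documents:

```python
from collections import defaultdict

def build_inverted_index(documents):
    """Map every word to the set of document IDs containing it."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

def search(index, query):
    """Return the documents that contain every word of the query."""
    results = None
    for word in query.lower().split():
        docs = index.get(word, set())
        results = docs if results is None else results & docs
    return results or set()

# Invented sample 'web pages':
docs = {
    "page1": "indexing means adding documents to a search index",
    "page2": "crawlers discover new documents on the web",
    "page3": "a search index answers queries quickly",
}
idx = build_inverted_index(docs)
print(search(idx, "search index"))   # -> {'page1', 'page3'}
```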


How is a website indexed?

Google's crawlers (also called bots) jump from one link to the next, from page to page, and the information gathered along the way is sent to the index. At this point the algorithm comes into play, evaluating and sorting this information according to various ranking factors. The better a web page is linked from other web pages, the faster it can be discovered and indexed.
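
The observation that well-linked pages are found sooner can be illustrated with a crawl frontier that visits the most-referenced URLs first. This is a deliberate simplification with an invented link graph; Google's actual crawl scheduling is not public:

```python
from collections import Counter

def prioritize_frontier(discovered_links):
    """Order URLs so the most frequently linked ones are crawled first.

    discovered_links: (source_url, target_url) pairs collected while
    crawling. The more pages link to a target, the earlier it is
    visited - a simplified stand-in for real crawl scheduling.
    """
    inbound = Counter(target for _, target in discovered_links)
    return sorted(inbound, key=inbound.get, reverse=True)

# Invented link graph:
links = [
    ("a.example", "hub.example"),
    ("b.example", "hub.example"),
    ("c.example", "hub.example"),
    ("a.example", "rare.example"),
]
print(prioritize_frontier(links))   # -> ['hub.example', 'rare.example']
```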

It is also possible to submit a website manually and thus inform Google faster. The tool of choice is Google Search Console, which must be set up for the website in question. In Search Console's URL inspection, the URL of the page is entered and its status is queried. If the page is not yet indexed, indexing can be requested by clicking "Request indexing". However, a manual request does not mean the page will be included in the index; that decision still rests with Google.


Indexing and SEO

Search engine optimization (SEO) is a branch of online marketing that deals with optimizing websites for search engines. Indexing is crucial here: if a website is not indexed and therefore cannot be found in the search engine, all further SEO measures are useless. Webmasters can initiate the indexing process and take measures to ensure that the website is indexed and displayed in the SERPs. These measures include the following (short sketches follow the list):

  • Meta tags: meta tags are machine-readable information (e.g. for bots). With meta tags, webmasters give search engines instructions for crawling (reading) and indexing a website.
  • robots.txt: in robots.txt, webmasters specify which areas of a website may be crawled and which may not; individual pages, subdirectories, or even the entire domain can be blocked. Exclusion from the index itself is handled by the noindex directive in the meta robots tag. Sometimes pages drop out of the index unintentionally, for example when a noindex tag is not removed after the site goes live.
  • XML sitemap: the sitemap is a directory of all individual pages of a website. It makes it easier for crawlers to read the content and thus speeds up indexing. The sitemap can be submitted in several ways, for example directly through Google Search Console.
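
Three short Python sketches show how these signals look in practice. First, how a crawler might read the meta robots directive from a page's HTML (the sample HTML is invented):

```python
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    """Reads the content of a <meta name="robots"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.robots = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if (d.get("name") or "").lower() == "robots":
                self.robots = d.get("content")

html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
parser = MetaRobotsParser()
parser.feed(html)
print(parser.robots)   # -> noindex, follow
```

Second, robots.txt rules can be checked with Python's standard library (the rules below are an invented example):

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /internal/
Allow: /
"""
rp = RobotFileParser()
rp.parse(robots_txt.splitlines())
print(rp.can_fetch("*", "https://example.com/blog/"))       # -> True
print(rp.can_fetch("*", "https://example.com/internal/x"))  # -> False
```

Third, a minimal XML sitemap following the sitemaps.org protocol, generated here for two placeholder URLs:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in ["https://example.com/", "https://example.com/glossary/indexing/"]:
    url = SubElement(urlset, "url")
    SubElement(url, "loc").text = page   # the page's canonical URL
print(tostring(urlset, encoding="unicode"))
```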

However, these measures alone cannot secure a specific position in the search results (e.g. page 1). To achieve a top position for specific search terms (keywords), search engine optimization (SEO) is needed. SEO deals mainly with the areas of technology, content, web design, and backlinks (off-page SEO).

The content of a website helps crawlers evaluate its information as relevant for the index and index it. Acquiring links from other pages to a website is called backlink building and helps crawlers find the website faster and index its content. As an online marketing agency with a focus on SEO, we support companies with these challenges - feel free to contact us!
