# robots.txt
#
# This file is used to allow crawlers to index our site.
# It is NOT used for any other purpose, such as filtering
# impressions or clicks.

# Allow the whole site
User-agent: *
Disallow: /cover-download
Disallow: /cover-download/*
Disallow: /_files_media/hefte
Disallow: /_files_media/hefte/*

User-agent: bingbot
Crawl-delay: 4

User-agent: Slurp
Crawl-delay: 5

User-agent: Yandex
Crawl-delay: 5