# robots.txt
#
# This file is used to allow crawlers to index our site.
# It is NOT used for any other purpose, such as filtering
# impressions or clicks.

# Allow the entire site
User-agent: *
Disallow: /cover-download
Disallow: /cover-download/*
Disallow: /_files_media/hefte
Disallow: /_files_media/hefte/*

User-Agent: bingbot
Crawl-delay: 4

User-Agent: Slurp
Crawl-delay: 5

User-Agent: Yandex
Crawl-delay: 5
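
# The rules above can be sanity-checked locally. Below is a minimal sketch
# (not part of the robots.txt itself), assuming Python's standard
# urllib.robotparser and a hypothetical example.com host; neither is implied
# by this file.

from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /cover-download
Disallow: /cover-download/*
Disallow: /_files_media/hefte
Disallow: /_files_media/hefte/*

User-Agent: bingbot
Crawl-delay: 4
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Disallowed paths should report False, everything else True.
print(parser.can_fetch("*", "https://example.com/cover-download/123"))  # False
print(parser.can_fetch("*", "https://example.com/hefte/ausgabe-1"))     # True
print(parser.crawl_delay("bingbot"))                                    # 4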