# Blocks crawlers that are kind enough to obey robots
User-agent: AhrefsBot
User-agent: Applebot-Extended
User-agent: Arquivo-web-crawler
User-agent: BLEXBot
User-agent: BUbiNG
User-agent: Bloglines/3.1
User-agent: BrandVerity
User-agent: Bytespider
User-agent: CCBot
User-agent: ClaudeBot
User-agent: Crawlspace
User-agent: CrazyWebCrawler-Spider
User-agent: DOC
User-agent: Diffbot
User-agent: Domain Re-Animator Bot
User-agent: Download Ninja
User-agent: Exabot
User-agent: Fetch
User-agent: GPTBot
User-agent: Google-Extended
User-agent: HTTrack
User-agent: Jyxobot/1
User-agent: Linguee
User-agent: MSIECrawler
User-agent: MauiBot
User-agent: Meta-ExternalAgent
User-agent: Microsoft.URL.Control
User-agent: NPBot
User-agent: Offline Explorer
User-agent: PanguBot
User-agent: PetalBot
User-agent: SeekportBot
User-agent: SemrushBot
User-agent: SemrushBot-SA
User-agent: SiteSnagger
User-agent: Speedy
User-agent: Teleport
User-agent: TeleportPro
User-agent: Timpibot
User-agent: UbiCrawler
User-agent: Vegi
User-agent: WebCopier
User-agent: WebReaper
User-agent: WebStripper
User-agent: WebZIP
User-agent: Xenu
User-agent: Yandex
User-agent: YandexAdditional
User-agent: YandexAdditionalBot
User-agent: YandexBot
User-agent: Zao
User-agent: Zealbot
User-agent: ZyBORG
User-agent: archive.org_bot
User-agent: cityreview
User-agent: dotbot
User-agent: grub-client
User-agent: k2spider
User-agent: larbin
User-agent: libwww
User-agent: linko
User-agent: psbot
User-agent: rogerbot
User-agent: sitecheck.internetseer.com
User-agent: wget
Disallow: /

User-agent: *
# allow digested assets
Allow: /*?vsn=d$
# allow paginated sitemaps
Allow: /sitemap/*?page=
# allow paginated category pages
Allow: /directory/*?page=
# allow paginated spotlight pages
Allow: /software/*?page=
# allow paginated blog homepage
Allow: /blog?page=
# pages with query strings
Disallow: /*?*
Disallow: /cdn-cgi/
Disallow: /x/*
Disallow: /sem/
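
The catch-all "User-agent: *" group above relies on the Google-style pattern extensions: "*" matches any run of characters, a trailing "$" anchors a rule to the end of the URL, and when several rules match, the most specific (longest) rule wins, with Allow winning a length tie as Google documents it. Python's standard urllib.robotparser may not expand these wildcards, so the following is a minimal, hypothetical sketch of that matching logic for checking which paths this group would permit. The rule list and sample paths come from the file above; the function names and the tie-break details are assumptions, not an official parser.

```python
import re

def rule_to_regex(pattern: str) -> re.Pattern:
    """Translate a robots.txt path pattern ('*' wildcard, optional '$' end
    anchor) into a regex that matches from the start of the URL path."""
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    return re.compile("^" + body + ("$" if anchored else ""))

# Rules of the "User-agent: *" group above: (pattern, True for Allow).
RULES = [
    ("/*?vsn=d$", True),
    ("/sitemap/*?page=", True),
    ("/directory/*?page=", True),
    ("/software/*?page=", True),
    ("/blog?page=", True),
    ("/*?*", False),
    ("/cdn-cgi/", False),
    ("/x/*", False),
    ("/sem/", False),
]

def is_allowed(path: str) -> bool:
    """Longest-match-wins evaluation; Allow wins a tie in pattern length.
    A path that matches no rule is allowed by default."""
    best_len, allowed = -1, True
    for pattern, allow in RULES:
        if rule_to_regex(pattern).match(path):
            plen = len(pattern)
            if plen > best_len or (plen == best_len and allow):
                best_len, allowed = plen, allow
    return allowed

if __name__ == "__main__":
    # Hypothetical sample paths to illustrate the Allow/Disallow interplay.
    for p in ["/blog?page=2",            # paginated blog: Allow beats /*?*
              "/assets/app.css?vsn=d",   # digested asset: Allow beats /*?*
              "/directory/foo?sort=asc", # query string, no ?page=: Disallow
              "/cdn-cgi/trace"]:         # blocked prefix: Disallow
        print(p, "->", "allow" if is_allowed(p) else "disallow")
```

Run directly, the sketch prints one allow/disallow verdict per sample path, which makes it easy to confirm that the paginated Allow rules really do override the broad "Disallow: /*?*" rule for their specific query strings.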