#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/robotstxt.html

User-agent: *
# CSS, JS, Images
Allow: /core/*.css$
Allow: /core/*.css?
Allow: /core/*.js$
Allow: /core/*.js?
Allow: /core/*.gif
Allow: /core/*.jpg
Allow: /core/*.jpeg
Allow: /core/*.png
Allow: /core/*.svg
Allow: /profiles/*.css$
Allow: /profiles/*.css?
Allow: /profiles/*.js$
Allow: /profiles/*.js?
Allow: /profiles/*.gif
Allow: /profiles/*.jpg
Allow: /profiles/*.jpeg
Allow: /profiles/*.png
Allow: /profiles/*.svg
# Directories
Disallow: /core/
Disallow: /profiles/
# Files
Disallow: /README.md
Disallow: /composer/Metapackage/README.txt
Disallow: /composer/Plugin/ProjectMessage/README.md
Disallow: /composer/Plugin/Scaffold/README.md
Disallow: /composer/Plugin/VendorHardening/README.txt
Disallow: /composer/Template/README.txt
Disallow: /modules/README.txt
Disallow: /sites/README.txt
Disallow: /themes/README.txt
Disallow: /web.config
# Paths (clean URLs)
Disallow: /admin/
Disallow: /comment/reply/
Disallow: /filter/tips
Disallow: /node/add/
Disallow: /search/
Disallow: /user/register
Disallow: /user/password
Disallow: /user/login
Disallow: /user/logout
Disallow: /media/oembed
Disallow: /*/media/oembed
# Paths (no clean URLs)
Disallow: /index.php/admin/
Disallow: /index.php/comment/reply/
Disallow: /index.php/filter/tips
Disallow: /index.php/node/add/
Disallow: /index.php/search/
Disallow: /index.php/user/password
Disallow: /index.php/user/register
Disallow: /index.php/user/login
Disallow: /index.php/user/logout
Disallow: /index.php/media/oembed
Disallow: /index.php/*/media/oembed
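#
# Editorial note on pattern matching (an explanatory sketch, not part of the
# original ruleset): "*" and the end-of-URL anchor "$" used in the Allow and
# Disallow rules above are extensions to the original robots.txt standard.
# They are honoured by major crawlers such as Googlebot and bingbot, but
# smaller crawlers may treat them as literal characters. For example,
# "Allow: /core/*.css$" matches any URL under /core/ that ends in ".css",
# while "Allow: /core/*.css?" matches ".css" followed by a query string,
# since "?" has no special meaning and is matched literally.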
# Legal notice: fuhrpark.de expressly reserves the use of its content for commercial text and data mining (§ 44b UrhG).
# The use of robots or other automated means to access fuhrpark.de and/or its subpages, or to collect or mine data from fuhrpark.de and/or its subpages, without the express permission of fuhrpark.de is strictly prohibited.
# If you would like to apply for permission to crawl fuhrpark.de and/or its subpages, or to collect or use data, please contact lizenzen@schluetersche.de
# -------------------------------------
# Extra-Robots
Disallow: /reifentests*
Disallow: /verzeichnis/suche/
Disallow: /index.php/verzeichnis/suche/
Disallow: /stellenmarkt/
Disallow: /*/node/*?page*
Disallow: /*/node/*&page*
User-agent: Applebot
Disallow: /
User-agent: AhrefsBot
Disallow: /
User-agent: Amazonbot
Disallow: /
User-agent: anthropic-ai
Disallow: /
User-agent: AwarioBot
Disallow: /
User-agent: sentibot
Disallow: /
User-agent: SeznamBot
Disallow: /
User-agent: DataForSeoBot
Disallow: /
# Slurp
User-agent: Slurp
Disallow: /
# X11; compatible; semantic-visions.com crawler
User-agent: semantic-vision.com
Disallow: /
# Testcrawler
User-agent: Testcrawler
Disallow: /
# DENIC Crawler
User-agent: DENIC-Crawler/42
Disallow: /
# contao/crawler
User-agent: contao/crawler
Disallow: /
# Bing
User-agent: bingbot
Disallow: /*send_as_pdf*
Disallow: /*download_as_pdf*
Disallow: /*node*
Disallow: /*taxonomy*
# AdsTxtCrawlerTP
User-agent: AdsTxtCrawlerTP
Disallow: /
# AdsTxtCrawler
User-agent: AdsTxtCrawler
Disallow: /
# arquivo-web-crawler
User-agent: arquivo-web-crawler
Disallow: /
# Barkrowler
User-agent: Barkrowler
Disallow: /
User-agent: Bytespider
Disallow: /
# BLEXBot
User-agent: BLEXBot
Disallow: /
# BrightEdge
User-agent: BrightEdge
Disallow: /
User-agent: CCBot
Disallow: /
User-agent: ChatGPT-User
Disallow: /
# ClaudeBot
User-agent: ClaudeBot
Disallow: /
# Clubhouse-LinkPreview
User-agent: Clubhouse-LinkPreview
Disallow: /
User-agent: cohere-ai
Disallow: /
# CriteoBot
User-agent: CriteoBot
Disallow: /
# contao
User-agent: contao
Disallow: /
# crawler_eb_germany_2.0
User-agent: crawler_eb_germany_2.0
Disallow: /
User-agent: Diffbot
Disallow: /
# Dotbot
User-agent: dotbot
Disallow: /
# ev-crawler
User-agent: ev-crawler
Disallow: /
# Exabot
User-agent: Exabot
Disallow: /
User-agent: FacebookBot
Disallow: /
User-agent: GPTBot
Disallow: /
User-agent: Google-Extended
Disallow: /
# GrapeshotCrawler
User-agent: GrapeshotCrawler
Disallow: /
# ias_crawler
User-agent: ias_crawler
Disallow: /
User-agent: ImagesiftBot
Disallow: /
# IonCrawl
User-agent: IonCrawl
Disallow: /
# LCC
User-agent: LCC
Disallow: /
# LivelapBot
User-agent: LivelapBot
Disallow: /
# LightspeedSystemsCrawler
User-agent: LightspeedSystemsCrawler
Disallow: /
# magpie-crawler
User-agent: magpie-crawler
Disallow: /
# MBCrawler
User-agent: MBCrawler
Disallow: /
# MegaIndex.ru
User-agent: MegaIndex.ru
Disallow: /
# metajobbot
User-agent: metajobbot
Disallow: /
User-agent: meta-externalagent
Disallow: /
# MixrankBot
User-agent: MixrankBot
Disallow: /
# MJ12bot
User-agent: MJ12bot
Disallow: /
# Netvibes
User-agent: netvibes
Disallow: /
# Nicecrawler
User-agent: Nicecrawler
Disallow: /
User-agent: Omgilibot
Disallow: /
User-agent: Omgili
Disallow: /
# oBot
User-agent: oBot
Disallow: /
# OgScrper
User-agent: OgScrper
Disallow: /
# PerplexityBot
User-agent: PerplexityBot
Disallow: /
# PetalBot
User-agent: PetalBot
Disallow: /
# PMGCrawler
User-agent: PMGCrawler
Disallow: /
# proximic
User-agent: proximic
Disallow: /
# PubMatic
User-agent: PubMatic
Disallow: /
# radian6
User-agent: radian6
Disallow: /
# R6_CommentReader
User-agent: R6_CommentReader
Disallow: /
# R6_FeedFetcher
User-agent: R6_FeedFetcher
Disallow: /
# Scrapy
User-agent: Scrapy
Disallow: /
# SemanticScholarBot
User-agent: SemanticScholarBot
Disallow: /
# SEOkicks
User-agent: SEOkicks
Disallow: /
# SemrushBot
User-agent: SemrushBot
Disallow: /
# SEOkicks
User-agent: SEOkicks-Robot
Disallow: /
# sitepark
User-agent: sitepark
Disallow: /
# Testcrawler
User-agent: Testcrawler
Disallow: /
# TinEye-bot-live
User-agent: TinEye-bot-live
Disallow: /
# X11
User-agent: X11
Disallow: /
# Yandex
User-agent: Yandex
Disallow: /
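#
# Editorial note on group precedence (an explanatory sketch, not part of the
# original ruleset): under the Robots Exclusion Protocol (RFC 9309), a
# compliant crawler follows only the group with the most specific matching
# "User-agent" line. The bots given their own group above therefore ignore
# the rules in the "User-agent: *" group; bingbot, for example, is subject
# only to the four pattern rules in its own group, not to the general
# Disallow list at the top of this file.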