#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    https://example.com/robots.txt
# Ignored: https://example.com/site/robots.txt
#

# Website Sitemap
Sitemap: https://www.autonomous.ai/sitemap.xml

User-agent: *
User-agent: Googlebot
User-agent: Googlebot-Image
Disallow: /new-in/*
Disallow: /pre-launch
Disallow: /*/pre-launch
Disallow: /*/compare
Disallow: /*/talents
Disallow: /*/success-story
Disallow: /*/office-furniture-bulk-orders
Disallow: /*/business-offer
Disallow: /*/reseller-program
Disallow: /*/influencer-review
Disallow: /*/press-feature
Disallow: /*/track-your-order
Disallow: /*/share-and-earn-money
Disallow: /*/about
Disallow: /QC-*
Disallow: /*/QC-*
Disallow: /anon/
# Note: Disallow values must be URL paths relative to this host; full URLs
# such as https://m.autonomous.ai/ have no effect here. Rules for the
# m.autonomous.ai subdomain belong in that host's own robots.txt.
Allow: /p/office-furniture-bulk-orders

# Note: crawlers match User-agent tokens literally, so wildcard patterns
# like *Yandex.* may not match the bots they target.
User-agent: *Yandex.*
User-agent: *ichiro.*
User-agent: *moget.*
User-agent: *NaverBot.*
User-agent: *Baiduspider.*
User-agent: *baidu.*
User-agent: *Sogou.*
User-agent: *Youdao.*
User-agent: *MJ12bot.*
User-agent: *Goodzer.*
User-agent: *istellabot.*
User-agent: *Gigabot.*
User-agent: *DotBot.*
User-agent: *SeznamBot.*
User-agent: *ltx71.*
User-agent: *Spiderbot.*
User-agent: *opensiteexplorer.*
User-agent: *OpenLinkProfiler.*
User-agent: *majestic12.*
User-agent: *datagnion.*
User-agent: *sogou.*
User-agent: *ahrefs.*
User-agent: *scoutjet.*
User-agent: *changedetection.*
User-agent: *napoveda.seznam.*
User-agent: *linkfluence.*
User-agent: *smarter.*
User-agent: *mj12bot.*
User-agent: *mojeek.*
User-agent: *naver.*
User-agent: *deusu.*
User-agent: *slack.*
User-agent: *aihitdata.*
User-agent: *wotbox.*
User-agent: *kazbt.*
User-agent: *mediatoolkit.*
User-agent: *safedns.*
User-agent: *aboundex.*
User-agent: *webmeup-crawler.*
User-agent: *orangebot.*
# Directories
Disallow: /api*
Disallow: /dev.*