# See http://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
User-agent: *
# For any LLM training, we have implemented https://shopify.dev/llms.txt following https://llmstxt.org/ guidelines. You can append .txt to the end of any URL to get the raw text version of the page.
Sitemap: https://shopify.dev/sitemap.xml
Disallow: /*?*shpxid=*
Disallow: /beta/
Disallow: /workshops/
Disallow: /api/shipping-partner-platform/
Disallow: /docs/api/shipping-partner-platform/

# disallow Common Crawl bot in effort to prevent being added to the Common Crawl dataset (used in GPT training)
User-agent: CCBot
Disallow: /apps/default-app-home

# disallow ChatGPT plugins from accessing certain routes
User-agent: ChatGPT-User
Disallow: /apps/default-app-home