# robots.txt — forg.to
# Served as a static file to bypass Cloudflare's Managed Robots.txt,
# which was prepending AI-crawler block rules before our Next.js route.

# ── Default: all crawlers ─────────────────────────────────────────────────────
User-agent: *
Allow: /
Allow: /articles/
Allow: /products/
Allow: /@
Allow: /updates/
Allow: /topic/
Allow: /compare/
Allow: /explore
Allow: /docs
Allow: /help
Allow: /privacy
Allow: /terms
Disallow: /api/
Disallow: /settings/
Disallow: /schedule
Disallow: /notifications
Disallow: /onboarding
Disallow: /analytics
Disallow: /adminpvt/
Disallow: /sudoadmin/
Disallow: /_next/
Disallow: /ref/
Disallow: /affiliate/dashboard

# ── Googlebot ─────────────────────────────────────────────────────────────────
User-agent: Googlebot
Allow: /
Allow: /articles/
Allow: /products/
Allow: /@
Allow: /updates/
Allow: /topic/
Allow: /compare/
Allow: /explore
Allow: /docs
Allow: /help
Allow: /privacy
Allow: /terms
Disallow: /api/
Disallow: /settings/
Disallow: /_next/

# ── AI search crawlers — full public content access ───────────────────────────
User-agent: GPTBot
User-agent: OAI-SearchBot
User-agent: ChatGPT-User
User-agent: ClaudeBot
User-agent: PerplexityBot
Allow: /
Allow: /articles/
Allow: /products/
Allow: /@
Allow: /updates/
Allow: /topic/
Allow: /compare/
Allow: /explore
Disallow: /api/
Disallow: /settings/
Disallow: /_next/

# ── AI training crawlers — evergreen content only ─────────────────────────────
User-agent: CCBot
User-agent: anthropic-ai
User-agent: Bytespider
User-agent: cohere-ai
Allow: /
Allow: /articles/
Allow: /products/
Allow: /topic/
Disallow: /api/
Disallow: /settings/
Disallow: /_next/

# ── Sitemap ───────────────────────────────────────────────────────────────────
Sitemap: https://forg.to/sitemap.xml