# 📌 General Crawl Rules
User-agent: *
Disallow: /cart
Disallow: /checkout
Disallow: /orders
Disallow: /account
Disallow: /search
Disallow: /policies/
Disallow: /thank_you
Disallow: /*?*     # Blocks URL parameters (prevents duplicate content)
Disallow: /*tag=*  # Blocks tag-based duplicate indexing
Allow: /products/
Allow: /collections/
Allow: /blogs/
Allow: /pages/

# 📌 Sitemap for Search Engines
Sitemap: https://yousweety.com/sitemap.xml

# ✅ Allow Major Search Engine Bots (For SEO)
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: Slurp  # Yahoo
Allow: /

User-agent: DuckDuckBot
Allow: /

User-agent: Applebot
Allow: /

User-agent: Googlebot-News
Allow: /

User-agent: Googlebot-Image
Allow: /
Disallow: /cart

User-agent: Googlebot-Mobile
Allow: /

User-agent: Googlebot-Video
Allow: /

User-agent: AdsBot-Google
Allow: /

# ✅ Allow Social Media Bots (For Visibility & Engagement)
User-agent: Pinterestbot
Allow: /

User-agent: Twitterbot
Allow: /

User-agent: Facebot
Allow: /

User-agent: LinkedInBot
Allow: /

User-agent: InstagramBot
Allow: /

User-agent: WhatsApp
Allow: /

User-agent: TelegramBot
Allow: /

# ✅ Allow AI & Content Aggregators (For Traffic & Backlinks)
User-agent: AlexaCrawler
Allow: /

User-agent: NaverBot
Allow: /

User-agent: Flipboard
Allow: /

User-agent: Redditbot
Allow: /

User-agent: archive.org_bot
Allow: /

User-agent: Google-Site-Verification
Allow: /

User-agent: PetalBot
Allow: /

User-agent: BraveBot
Allow: /

User-agent: CensysInspect
Allow: /

# ✅ Allow SEO & Backlink Tools (For Insights)
User-agent: AhrefsBot
Allow: /

User-agent: SemrushBot
Allow: /

User-agent: Majestic-12
Allow: /

User-agent: MozBot
Allow: /

User-agent: SeznamBot
Allow: /

# 🚫 Block Bad Bots, Scrapers, and Spy Crawlers
User-agent: AhrefsSiteAudit
Disallow: /

User-agent: SemrushBot-SiteAudit
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: SimilarWebBot
Disallow: /

User-agent: ShoplazzaBot
Disallow: /

User-agent: DotBot
Disallow: /
User-agent: Bytespider
Disallow: /

User-agent: BLEXBot
Disallow: /

User-agent: MegaIndex
Disallow: /

User-agent: ZoominfoBot
Disallow: /

User-agent: Barkrowler
Disallow: /

User-agent: MauiBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Wget
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: Python-urllib
Disallow: /

User-agent: Go-http-client
Disallow: /

User-agent: Java
Disallow: /

User-agent: curl
Disallow: /

User-agent: AliHunter
Disallow: /

User-agent: KoalaInspector
Disallow: /

User-agent: CommerceInspector
Disallow: /

User-agent: DropshipSpy
Disallow: /

User-agent: AliExpressScraper
Disallow: /

User-agent: NicheScraper
Disallow: /

User-agent: DataForSeoBot
Disallow: /

User-agent: SpyFu
Disallow: /

User-agent: Sogou
Disallow: /

User-agent: Screaming Frog SEO Spider
Disallow: /

User-agent: AspiegelBot
Disallow: /

User-agent: SiteExplorer
Disallow: /

User-agent: Sistrix
Disallow: /

User-agent: VoilaBot
Disallow: /

User-agent: WebmeupBot
Disallow: /

User-agent: Exabot
Disallow: /

User-agent: MojeekBot
Disallow: /

User-agent: NetcraftSurveyAgent
Disallow: /

User-agent: Cliqzbot
Disallow: /

User-agent: VelenPublicWebCrawler
Disallow: /

User-agent: woorankreview
Disallow: /

# 🚫 Block Aggressive Crawlers from Yandex & Baidu
User-agent: YandexBot
Disallow: /

User-agent: Baiduspider
Disallow: /

# 🚫 Block AI Bots & Large Language Model Crawlers (Optional)
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: AI-Content-Crawler
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: Amazonbot
Disallow: /

User-agent: Ezooms
Disallow: /

User-agent: LuminateBot
Disallow: /

User-agent: SMTBot
Disallow: /

User-agent: 360Spider
Disallow: /

User-agent: Aboundexbot
Disallow: /

User-agent: LinkpadBot
Disallow: /
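
# ℹ️ Note on group matching (RFC 9309): a crawler that matches one of the
# named User-agent groups above obeys ONLY that group and ignores the
# "User-agent: *" rules. A bare "Allow: /" group therefore lifts the general
# Disallow rules for that bot. To keep a named bot out of private paths,
# repeat the rules inside its own group — a sketch (paths are examples,
# adjust to match the general rules above):
#
# User-agent: Bingbot
# Allow: /
# Disallow: /cart
# Disallow: /checkout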