# Begin robots.txt file

User-agent: BLEXBot
Disallow: /

User-agent: PetalBot
Disallow: /

User-agent: AspiegelBot
Disallow: /

User-agent: Yandex
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: dotbot
Disallow: /

User-agent: MauiBot
Disallow: /

User-agent: Baiduspider
Disallow: /

User-agent: coccocbot-web
Disallow: /

User-agent: RedekenBot
Disallow: /

User-agent: DataForSeoBot
Disallow: /

User-agent: *
Disallow: /*/ctl/                 # Googlebot permits *
Disallow: /admin/
Disallow: /App_Browsers/
Disallow: /App_Code/
Disallow: /App_Data/
Disallow: /App_GlobalResources/
Disallow: /bin/
Disallow: /Components/
Disallow: /Config/
Disallow: /contest/
Disallow: /controls/
Disallow: /Documentation/
Disallow: /HttpModules/
Disallow: /Install/
Disallow: /Providers/
Disallow: /Activity-Feed/userId/  # Do not index user profiles
Disallow: /login
Disallow: /login.aspx
Disallow: /DesktopModules/Journal/

# End of robots.txt file