Use the Robots.txt File to Prevent Crawling of Development and Staging Instances

When a robot visits a web site, it first checks for the site's robots.txt file and analyzes it to determine whether it's allowed to index the site. For Salesforce B2C Commerce to serve the robots.txt file, you must set up a hostname alias.

Here is a sample robots.txt file that prevents all robots from visiting the entire site:

User-agent: *   # applies to all robots

Disallow: /     # disallow indexing of all pages

The robot simply looks for a /robots.txt URI on your site, where a site is defined as an HTTP server running on a particular host and port number.
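To illustrate how a crawler evaluates the sample file above, here is a short sketch using Python's standard-library `urllib.robotparser`; the staging hostname is illustrative, not a real B2C Commerce URL:

```python
from urllib import robotparser

# Parse the sample robots.txt shown above.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",   # applies to all robots
    "Disallow: /",     # disallow indexing of all pages
])

# With "Disallow: /", a well-behaved robot may not fetch any page.
# The hostname below is a hypothetical staging instance.
allowed = rp.can_fetch("*", "https://staging.example.com/any/page")
print(allowed)  # False
```

A compliant crawler performs this same check against `/robots.txt` before requesting any other URI on the host.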

Note: Storefront Password Protection password-protects only dynamic pages, not static content such as images. Deploy a robots.txt file if you don't want images or other static content crawled.
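If you want dynamic pages to remain crawlable but need to keep robots away from static content, a narrower robots.txt can disallow just those paths. The directory names below are illustrative; substitute the paths your storefront actually serves static assets from:

```
User-agent: *
Disallow: /images/
Disallow: /static/
```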

Related Links

Generating a Robots.txt File

Assigning a Hostname Alias
