You must enable robots.txt submission so that search engines such as Google and Bing can crawl your eCommerce Pro site and submit pages for indexing. If pages are not indexed, they will not appear in search results.
- Log into the Magento admin panel.
- Navigate to Stores > Configuration > Catalog > XML Sitemap > Search Engine Submission Settings.
- Set the Enable Submission to Robots.txt value to Yes.
- Click Save Config.
- Navigate to Content > Theme > Configuration.
- Click Edit.
- Set Default Robots. (These settings can also be applied from the command line; see the sketch after the settings list below.)
Note: The default robots and custom instructions settings are based on user preferences.
Select from these settings:
- INDEX, FOLLOW: Instructs web crawlers to index the site and to check back later for changes.
- NOINDEX, FOLLOW: Instructs web crawlers to avoid indexing the site, but to check back later for changes.
- INDEX, NOFOLLOW: Instructs web crawlers to index the site once, but not to check back later for changes.
- NOINDEX, NOFOLLOW: Instructs web crawlers to avoid indexing the site and not to check back later for changes.
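If you prefer to manage configuration from the command line, both settings above can usually be applied with bin/magento config:set instead of the admin panel. This is a minimal sketch assuming a standard Magento 2 installation; the config paths and the example INDEX,FOLLOW value are assumptions and may differ in your eCommerce Pro build.

    # Enable Submission to Robots.txt (config path assumed: sitemap/search_engines/submission_robots)
    bin/magento config:set sitemap/search_engines/submission_robots 1

    # Set Default Robots for the default scope (config path assumed: design/search_engine_robots/default_robots)
    bin/magento config:set design/search_engine_robots/default_robots "INDEX,FOLLOW"

    # Flush the configuration cache so the new values take effect
    bin/magento cache:flush

Whichever Default Robots value you save is rendered on storefront pages as a robots meta tag, for example <meta name="robots" content="INDEX,FOLLOW"/>.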
DDI provides a standard robots.txt file in the custom instructions section of the robots.txt configuration. We allow the Google (Googlebot and Google Image Bot), Yahoo (Slurp), Bing, and MSN search engines to crawl the site every 15 minutes.
The DDI eCommerce Pro default robots.txt file also disallows crawling of certain pages, including account creation, CMS directories, search pages, and the checkout review, for security purposes.
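To make the description above concrete, a DDI-style default file could look roughly like the sketch below. The user-agent tokens, crawl-delay value, and disallowed paths are illustrative assumptions based on common Magento conventions, not a copy of the shipped DDI file.

    # Illustrative robots.txt sketch; the shipped DDI file may differ
    User-agent: Googlebot
    User-agent: Googlebot-Image
    User-agent: Slurp
    User-agent: bingbot
    User-agent: msnbot
    # "Every 15 minutes" expressed as a crawl delay in seconds; the actual value may differ
    Crawl-delay: 900
    # Keep account creation, CMS directories, search pages, and checkout out of crawlers' reach
    Disallow: /customer/account/create/
    Disallow: /app/
    Disallow: /var/
    Disallow: /catalogsearch/
    Disallow: /checkout/
    # The XML sitemap link is appended when Enable Submission to Robots.txt is set to Yes
    Sitemap: https://www.example.com/sitemap.xml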