Robots.txt for Ecommerce

If you have an ecommerce website and work with a good developer, you probably already have a robots.txt file in your root directory. Search engines use robots (also known as spiders or crawlers) to discover and categorize websites, and the pages they find during a crawl are the pages that can show up when someone searches on Google, Bing, or any other major search engine. Your robots.txt file is an effective way to tell search engines which areas of your website should not be processed or crawled.
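
At its simplest, robots.txt is a short plain-text file served from your site's root (yourdomain.com/robots.txt). A minimal example might look like the sketch below; the paths shown are placeholders, not recommendations for your particular store:

    User-agent: *
    Disallow: /admin/
    Disallow: /checkout/

The "User-agent: *" line means the rules apply to every crawler, and each "Disallow" line blocks a path from being crawled.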

To Crawl or Not to Crawl? That is the Question

Without a robots.txt file in place, your ecommerce website is completely open and crawlable, which sounds like a good thing. But using up crawl bandwidth on irrelevant and outdated content may come at the expense of crawling and indexing your important, valuable pages. You might even have some key pages that get skipped entirely.

[Image: Meticulosity's robots.txt file]

If left unchecked, a search engine robot may crawl your shopping cart links, wishlist links, admin login pages, your development site or test links, or other content that you might not want showing up in the search results. Crawls can find personal information, temporary files, admin pages, and other pages that contain information that you may not have realized was publicly accessible.
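For a typical store, that usually means disallowing cart, checkout, account, and internal-search URLs. The exact paths depend entirely on your platform (Magento, Shopify, WooCommerce, and so on), so treat the following as a hedged sketch rather than a copy-and-paste rule set:

    User-agent: *
    Disallow: /cart/
    Disallow: /checkout/
    Disallow: /wishlist/
    Disallow: /customer/account/
    Disallow: /catalogsearch/

Before adding rules like these, confirm what the equivalent URLs actually look like on your own site.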

It is important to remember that each website has a “crawl budget”: a limited number of pages that search engines will include in a crawl. You want to make sure that your most important pages are getting indexed and that you’re not wasting your crawl budget on temporary files and low-value URLs.

While a robots.txt file can be useful for blocking content that you don’t want indexed, robots.txt files sometimes inadvertently block content that website owners do want crawled and indexed. If you're having trouble with some of your key pages not getting indexed, the robots.txt file is a good place to check to identify the issue. It's a good idea to monitor what's in your robots.txt file and keep it updated.

There are a number of SEO tools you can use to see what a robots.txt file may be blocking, and one of the best is available for free in Google Search Console.
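
If you'd rather run the same kind of check outside of Google's tools, Python's built-in urllib.robotparser module can tell you whether a given URL is blocked for a given user agent. A minimal sketch, using placeholder URLs:

    from urllib import robotparser

    # Point the parser at the live robots.txt file (placeholder domain).
    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()

    # True means the URL is allowed to be crawled; False means it is blocked.
    print(rp.can_fetch("Googlebot", "https://www.example.com/cart/"))
    print(rp.can_fetch("Googlebot", "https://www.example.com/products/blue-widget"))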

[Screenshot: robots.txt Tester in Google Search Console]

With the Google Search Console robots.txt Tester, you can test specific pages to see if they are being crawled or not. Don't forget to leverage your XML Sitemap to list the pages you do want Google to crawl. If you're auto-generating your XML Sitemap, you might unintentionally be including pages that you're also disallowing in your robots.txt file. Keeping an eye on Google Search Console will alert you to errors like these in both your sitemap file and your robots.txt file. 
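
One easy win is to reference your sitemap directly in robots.txt so crawlers can always find it. Assuming your sitemap lives at the standard location (swap in your real URL), the line looks like this:

    Sitemap: https://www.yourdomain.com/sitemap.xml

Crawlers read the Sitemap directive independently of any User-agent group, so it typically sits at the top or bottom of the file.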

A robots.txt file can be one of the best things to happen to your ecommerce website, so if you're not utilizing its power, it's time to get started. If you need help setting up your robots.txt file, let us know, and we can help you out.
