Magento 2 Robots.txt File Examples

If you are a website developer, you already know the importance of the robots.txt file. It tells search engines such as Google, Bing, and Yahoo which parts of a site should be indexed and which files or folders should be excluded from indexing. The file normally lives in the website's root folder, so you can check any site's robots.txt at http://<yoursitename>/robots.txt. Most sites ship a robots.txt suited to the platform they run on; unfortunately, Magento does not include one by default, so you have to take the time to create the file yourself.

Improving Search Engine Performance Using Robots.txt in Magento 2

  1. Robots.txt helps prevent duplicate-content issues, one of the primary requirements for SEO success.
  2. Robots.txt also disallows search engines from indexing unwanted folders and files such as the admin, js, and test folders, logs, SVN files, PDF invoices, etc.
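To see why this matters for duplicate content: the same category page is often reachable at both /dresses.html and /dresses.html?Dir=asc, and search engines treat those as duplicates. The sketch below (not official crawler code) shows how a Google-style wildcard rule like the ones in the example further down matches such URLs; the rule_matches helper is a hypothetical name introduced here for illustration.

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Approximate Google-style robots.txt matching: * is a wildcard,
    $ anchors the end of the URL, and rules match from the start of the path."""
    pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(pattern, path) is not None

print(rule_matches("/*?Dir=asc", "/dresses.html?Dir=asc"))  # True  (blocked variant)
print(rule_matches("/*?Dir=asc", "/dresses.html"))          # False (canonical page stays indexable)
```

The canonical page stays crawlable while every sorted/filtered variant is excluded, which is exactly what the /*?Dir and /*?Mode rules below achieve.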

Robots.txt Example for Magento 2

Magento 2, released in November 2015, is the latest version, so below we provide a tested robots.txt file that you can use.

# Crawlers Setup
# Directories
User-agent: *
Disallow: /app/
Disallow: /bin/
Disallow: /dev/
Disallow: /lib/
Disallow: /phpserver/
#Disallow: /pub/
Disallow: /setup/
Disallow: /update/
Disallow: /var/
Disallow: /vendor/

# Paths (clean URLs)
User-agent: *
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /wishlist/
Disallow: /customer/account/
Disallow: /customer/account/login/

# Do not index the general technical directories and files on a server
#Disallow: /cgi-bin/

# Files
User-agent: *
Disallow: /composer.json
Disallow: /composer.lock
Disallow: /CONTRIBUTING.md
Disallow: /CONTRIBUTOR_LICENSE_AGREEMENT.html
Disallow: /COPYING.txt
Disallow: /Gruntfile.js
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /nginx.conf.sample
Disallow: /package.json
Disallow: /php.ini.sample
Disallow: /RELEASE_NOTES.txt

# Do not index the page subcategories that are sorted or filtered.
Disallow: /*?
Disallow: /*?Dir*
Disallow: /*?Dir=desc
Disallow: /*?Dir=asc
Disallow: /*?Limit=all
Disallow: /*?Mode*

# Do not index links with a session ID
Disallow: /*.php$
Disallow: /*?SID=

# CVS, SVN directory and dump files
Disallow: /*.CVS
Disallow: /*.zip$
Disallow: /*.svn$
Disallow: /*.idea$
Disallow: /*.sql$
Disallow: /*.tgz$

# Website Sitemap
Sitemap: http://<yoursitename>/sitemap.xml
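As a quick sanity check before deploying, you can feed a subset of these rules to Python's standard-library robots.txt parser. Note that urllib.robotparser only handles plain prefix rules, not the * and $ wildcard patterns, so only the directory rules are tested in this sketch:

```python
from urllib.robotparser import RobotFileParser

# A subset of the rules above (stdlib parser = prefix matching only,
# so wildcard rules like /*?SID= are not exercised here).
rules = """\
User-agent: *
Disallow: /checkout/
Disallow: /catalogsearch/
Disallow: /customer/account/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "http://example.com/checkout/"))           # False (blocked)
print(rp.can_fetch("*", "http://example.com/women/dresses.html"))  # True  (allowed)
```

If a URL you expect to be crawlable comes back False, a rule is broader than intended.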

How to create a robots.txt file

It is very simple to create a robots.txt file and put it on your server:

  1. Open Notepad (or any plain-text editor) and copy and paste the example above.
  2. Save the file as robots.txt.
  3. Upload the file to your web hosting root folder.
  4. Done.
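The steps above amount to writing a plain-text file into the web root. Here is a minimal sketch in Python (the public_html path is an assumption; substitute your hosting's actual document root):

```python
from pathlib import Path

# Hypothetical document root; replace with your server's actual web root.
web_root = Path("public_html")
web_root.mkdir(exist_ok=True)

# A minimal subset of the rules from the example above.
robots = """\
User-agent: *
Disallow: /checkout/
Disallow: /customer/account/
Sitemap: http://example.com/sitemap.xml
"""

# Crawlers only look for the file at the site root, i.e. /robots.txt.
(web_root / "robots.txt").write_text(robots)
print((web_root / "robots.txt").read_text().splitlines()[0])  # User-agent: *
```

After uploading, open http://<yoursitename>/robots.txt in a browser to confirm the file is being served.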