January 30, 2015

How to improve SEO of your Magento store? Part 1

Author: Victor Achilles

Simply publishing a website is not enough today. To attract a steady stream of visitors, you need to work on its SEO. SEO, or Search Engine Optimization, is a set of best practices that makes a website search engine friendly, improving the visibility of its pages and helping it attract relevant traffic.
Like any other website, an e-store built on Magento needs SEO, yet many merchants find it troublesome to manage. In professional hands, however, it can be relatively simple. In this first blog of the series, we share insights on two of the many ways to improve the SEO of your Magento store:

Managing URL Rewrites

Search engine friendly URLs are an important ranking factor for any ecommerce website. Magento's URL Rewrite feature creates rewrites for products and categories, plus custom rewrites for other pages in the store. It lets you replace dynamically generated URLs with search engine friendly URLs that include your most important and relevant keywords.
Magento's URL rewrite functionality can, however, cause SEO problems. URLs often end up appended with -1, -2, and so on; this is Magento's default behavior when another page is already using the same URL key, or when the key was used in the past. Bulk uploads via CSV can also confuse Magento and leave you with rewritten URLs. Once a URL key is in the rewrite table, you can't simply switch back. If you face this problem, we suggest deleting the rewrite entry for the previous URL, as sketched below; but make sure you have a professional at hand, since deleting the wrong rewrite can do real damage.
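As an illustration only, here is roughly how a stale entry could be found and removed in Magento 1.x, where rewrites live in the core_url_rewrite database table. The URL key and id below are hypothetical, and you should back up the table before deleting anything:
Example SQL (Magento 1.x)
-- Find rewrites that reference the old URL key (hypothetical key)
SELECT url_rewrite_id, request_path, target_path
FROM core_url_rewrite
WHERE request_path LIKE 'my-old-product%';
-- Remove only the stale entry, once you have confirmed its id
DELETE FROM core_url_rewrite
WHERE url_rewrite_id = 123;
After removing a rewrite, it is a good idea to refresh the Magento cache so the change takes effect everywhere.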

Setting Up a Robots.txt File

Robots.txt is a standard text file that lives in the root directory of your Magento site; Magento lets you use it to control which pages of your site are indexed by search engines. Search engine robots, or “bots,” are programs that crawl and index web content, including your store, and robots.txt is the accepted standard for telling them what they may index. When a bot visits your site, it first looks for a robots.txt file and, if one is found, follows the instructions in it.
Alongside the file itself, Magento's Default Robots option (under System > Configuration > Design > HTML Head in Magento 1.x) can be set to one of the following:

  • INDEX, FOLLOW (the default): Pages are indexed, and search engine bots follow links from them.
  • NOINDEX, FOLLOW: Pages are not indexed, but search engine bots follow links from them.
  • INDEX, NOFOLLOW: Pages are indexed, but search engine bots do not follow links.
  • NOINDEX, NOFOLLOW: Pages are not indexed, and search engine bots do not follow links.
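Whichever value you choose is emitted in each page's HTML head as a robots meta tag, along these lines:
<meta name="robots" content="NOINDEX,FOLLOW"/>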

Creating a Custom robots.txt File
If you need more control over which pages and directories search engine bots can index and follow, follow the example below.
Imagine that you have a store with a set of pages and folders that you don't want indexed by any search engine bot.
The pages and folders you may want to prevent from being indexed are:

  • /terms-of-service.html
  • /special-offers.html
  • /referral-discounts.html
  • /customer/
  • /review/
  • /media/

Here’s what the instructions would look like:
Example Robots.txt
User-agent: *
Disallow: /terms-of-service.html
Disallow: /special-offers.html
Disallow: /referral-discounts.html
Disallow: /customer/
Disallow: /review/
Disallow: /media/
And that’s it! Make sure to save your changes so that they take effect. To preview your robots.txt file, go to [your-store-name].com/robots.txt and verify that it reflects the customizations you made.
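As a quick check from the command line, you can also fetch the file with curl (www.example.com below stands in for your store's domain):
curl -s https://www.example.com/robots.txt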
