Preparing and launching your web store can be a stressful endeavour. Between procuring products, setting up reliable payment gateways and delivery options, and tackling countless other technical and organizational challenges, optimization of the web store for search engines can easily slip down the priority list. However, search optimization often makes the key difference between a successful (and profitable) web store and an unsuccessful one.
What is eCommerce SEO and what basic things should you have on your “to-do” list when creating a webshop? In the simplest terms, eCommerce SEO includes all activities related to optimizing your webshop for good positioning on search engines (be that Google or any other popular search engine). As with every other website, there are some technical (code- and server-related) basics that you can get right in order to ensure a good starting position when your potential customers are searching for the products you are selling.
When it comes to technical SEO basics for a successful web store, there are some general things you should have in mind:
1. Don’t block / hide your content from crawlers
The easiest way to shoot yourself in the foot from a technical SEO perspective (second only to your web store frequently being offline) is to block search engine crawling bots from accessing your website. What are these crawlers and why are they important? A web crawler, sometimes called a spider, is an Internet bot that systematically browses the World Wide Web, typically for the purpose of web indexing (web spidering). In short – they are the bots that add your website to the search engine database and ensure that it can be found by your potential customers.
Blocking them, or making their job of crawling your website harder, is a sure way to undermine all of the other SEO efforts you are making for your web store.
In order to enable proper functioning of crawlers on your web store you:
Shouldn’t use “noindex” and “nofollow” meta tags where they aren’t necessary
The “noindex” meta tag is used to prevent most search engine web crawlers from indexing a page on your site. It is placed in the <head> section of your page:
<meta name="robots" content="noindex">
Using this tag makes sense in some cases (as we will discuss later in the article), but you should be careful not to use it on all pages since it will block your complete store from indexing.
The “nofollow” meta tag (as explained by Google) provides a way for webmasters to tell search engines “Don’t follow links on this page” or “Don’t follow this specific link.” It can also be placed into the <head> section of your page, for example:
<meta name="robots" content="nofollow" />
It also can be used on individual links, blocking the crawlers from following links to specific pages. For example:
<a href="admin.php" rel="nofollow">admin panel</a>
Using this meta tag on the whole website can interfere with indexing pages that are linked from your content and menus.
Shouldn’t cloak or hide content
One of the techniques used in the “dark days” of SEO was to hide parts of your content from visitors, keeping it on site for ranking purposes only. This was done in various ways, such as using white text on a white background, using CSS to set text size to 0 or positioning content off-screen, locating text behind the image etc.
Attempting to use these techniques today will harm your SEO efforts, as search engines have become much smarter, and you will earn ranking penalties (or get dropped out of the index completely) if you try it.
In cases where you legitimately need to hide parts of your content, Google’s guidelines can serve as a great reference point.
Should properly set up your robots.txt file
The robots exclusion protocol (REP), or robots.txt is a text file that webmasters create in order to instruct robots (typically search engine robots) on how to crawl and index the pages on their website.
This is a very useful tool for “communicating” with crawlers that (among other things) allows you to specify which areas of your website you don’t want crawled. Entering the following code in this file will block all crawlers from accessing your website:

User-agent: *
Disallow: /
This robots.txt code can sometimes be forgotten in the process of moving the web store from the development environment to “live” status, so be sure to check your robots.txt file (it is usually located on www.yourdomain.com/robots.txt path) to confirm you aren’t blocking your web store from indexing. If you need more info, you can find it in the great guide by Moz.
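For contrast, a typical production robots.txt for a web store allows crawling of the whole site while keeping a few non-public areas out. A minimal sketch – the /cart/, /checkout/ and /admin/ paths and the sitemap URL are hypothetical placeholders, so adjust them to your store’s actual structure:

User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /admin/

Sitemap: https://www.yourdomain.com/sitemap.xml

Note that, unlike the blocking example above, the Disallow rules here target only specific paths, so the rest of the store remains fully crawlable.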
2. Fix the indexing issues (causing duplicate content in the search engine index)
One of the issues we often see on web stores is duplicate content (product descriptions, images etc.). Duplicate content is content that appears on the internet in more than one place (on more than one unique URL). When it is present, search engines have to decide which version is more relevant for a user’s search query, and they will rarely show multiple copies of the same content. This can lead to indexing and ranking of content you don’t want ranked, and can cause a direct loss in conversions and profit. In eCommerce we often see duplicate content due to dynamic generation of pages for product variants (for example, red and blue versions of the same product), where the content is pretty much the same, just served under different URLs.
How can you help crawlers to index the right piece of content? The recommended way to do this is by adding a “noindex,follow” meta robots tag. On dynamically generated pages, the code should look like this:
<meta name="robots" content="noindex,follow">
As mentioned before, “noindex” meta tags can be used to instruct crawlers about sections or pages of your web store that you don’t want to be indexed. Adding this code to your dynamic pages will ensure that they drop out of Google’s index as soon as they are crawled (if they are already indexed), and new pages with this code won’t be indexed. If you need additional info, Yoast has a nice guide on this.
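A related option worth mentioning alongside “noindex” (treat this as a supplementary sketch, not the approach described above): a rel="canonical" link element in the <head> of each variant page tells search engines which URL should receive the ranking signals. Here the /product/t-shirt URL is a hypothetical example of the preferred version:

<link rel="canonical" href="https://www.yourdomain.com/product/t-shirt" />

With this in place, the red and blue variant URLs can stay crawlable while their ranking signals consolidate to the main product page.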
Note: If you are removing already-indexed duplicate pages from the index – after you add the noindex,follow tag, they get crawled and drop out of the index – you can additionally block them using robots.txt or URL parameter settings.
3. Make your site fast
Page speed is a ranking factor whose influence has been increasing over the last few years. It refers to the amount of time a page needs to load completely, and it depends on a number of factors, ranging from your hosting all the way to your web store’s code. Optimizing it will make your website faster and directly impact your conversions (users don’t like to wait for a page to load) and search engine rankings.
Due to the large number of factors impacting speed, we won’t go in depth on this area, but as a good starting point we recommend using https://www.webpagetest.org/ or https://tools.pingdom.com/ for an initial assessment of your web store’s speed and potential issues.
We also recommend that you go for the low-hanging fruit first – check the sizes of media (photos, videos) on your web store, ads and banners, widgets (social buttons, comments etc.) and the size of your HTML/CSS code.
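Another common piece of low-hanging fruit is server-side text compression and browser caching. As a minimal sketch – assuming your store runs on an Apache server with the mod_deflate and mod_expires modules available (check with your host if unsure) – a few lines in your .htaccess file can noticeably cut load times:

# Compress text-based responses before sending them (requires mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache static assets instead of re-downloading them (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
</IfModule>

The speed-testing tools mentioned above will flag missing compression and caching headers, so you can verify the effect after applying changes like these.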
4. Optimize for mobile
With the latest statistics in 2016 showing that 80% of users own a smartphone and 48% of consumers start mobile research with a search engine (Smart Insights), it is clear that having a web store that is not optimized for mobile devices is a major issue.
Luckily, most of the modern eCommerce platforms natively offer support for mobile devices. You can do an easy check with Mobile-Friendly test by Google and go from there. It is extremely important to double check your “cart” and “checkout” pages (in addition to testing the complete payment funnel), since these are the pages where users are the most likely to drop-out if they encounter issues.
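One quick thing to verify while running the Mobile-Friendly test is that every page includes a viewport meta tag in its <head> – without it, mobile browsers render the page at desktop width, which is one of the most common causes of a failed test:

<meta name="viewport" content="width=device-width, initial-scale=1">

Modern eCommerce platforms typically include this by default, but it is worth confirming, especially on custom checkout templates.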
One of the newest initiatives by Google is preferential treatment of AMP (Accelerated Mobile Pages) in their mobile index. AMP significantly boosts page speed on mobile devices and is worth taking into consideration, especially if you are developing a new web store. You can read more about AMP for eCommerce in the detailed guide by Yoast.
5. Implement HTTPS sitewide
Another ranking signal that has gained importance over the last few years is HTTPS (also known as HTTP over TLS, or Transport Layer Security). Google is putting increased focus on security, so it is no wonder that in 2014 they announced that websites using secure connections would get a slight preference in search results.
As an eCommerce site, you are already using HTTPS in the login, account and payments sections, but it is highly recommended that you switch to HTTPS on the complete website. This should give you a slight boost in search rankings in the long term. Also, it is good from the user experience perspective, as more and more internet users are putting emphasis on their browsing (and shopping) security.
Please note: switching to HTTPS isn’t trivial and it can (at least in the short term) impact rankings; however, it definitely pays off in the long term. If you wish to learn more about HTTPS, you can consult this guide from WooCommerce.
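When you do switch, every HTTP URL should permanently (301) redirect to its HTTPS counterpart so that existing rankings transfer to the secure versions. A minimal sketch for an Apache server with mod_rewrite enabled (an assumption – equivalent settings exist for nginx and most hosting control panels):

# Redirect all HTTP traffic to HTTPS with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Using a 301 (permanent) rather than a 302 (temporary) redirect is what signals search engines to pass ranking signals to the HTTPS URLs.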
Fixing technical issues and implementing technologies/techniques we wrote about in this article can be a great foundation for the success of your web store on search engines. While you might feel overwhelmed at first, remember that not all of these have to be done at once. Focus on basic things first (crawling and indexing) and then move to other activities. If you are not sure how to implement something on your web store, or just want to find out more, contact us and we will be happy to help!