Assessing Where You’re At – Shopify SEO Audit
Before you launch into any kind of Shopify SEO work, it makes sense to analyze your current SEO performance. This way, you can set a baseline and measure improvements from there. Importantly, it will also highlight any major problem areas that you need to tackle first.
Pro Tip: If you want to optimize your baseline performance with one of the top premium Shopify themes that’s already SEO-optimized, the Shoptimized™ Theme takes care of a lot of the SEO headaches for you.
To establish a baseline, you’re going to need two essential free tools, Google Analytics and Google Search Console:
1. Google Analytics
“Without instruments, you’re going to eventually crash.” These were the words of my business mentor twenty years ago when I was running a construction business by the seat of my pants. His words are every bit as applicable to e-commerce as they are to any business.
And the best way to track and measure your store’s performance is Google Analytics. It’s great for tracking your key business metrics like Conversion Rate, Visitor Value, and Average Order Value, but it’s also a fantastic tool for benchmarking your SEO efforts and measuring improvements.
If you haven’t got Google Analytics set up on your store already, here’s Shopify’s guide on it.
2. Google Search Console
This tool (as well as Bing Webmaster Tools) plugs the gaps that Google Analytics leaves and gives you some vital information about your Shopify store and how the search engines view it.
GSC will quickly become your go-to tool for monitoring your store’s SEO-related factors like crawl errors, mobile usability, broken pages, schema markup, and inbound links to your store.
So now that you have both GA and GSC set up, here’s what you need to do:
Uncover Index Gaps
Go to google.com and enter the following (without the quotation marks): “site:yourstore.com”. This will tell you how many pages from your store Google has indexed.

The next step is to do a gap analysis between how many pages are indexed and how many are crawlable by an SEO spider tool like Screaming Frog. It’s free for up to 500 crawled URLs; you just need to download it to your computer and run it.

From the example above, we can see that Screaming Frog returns more than 311 results, so in this case it would be worth upgrading to the paid version to get the full picture.
In fact, a spider tool will rarely scan fewer pages than Google has indexed, because the robots.txt file blocks Google from crawling and indexing certain pages.
Once you have the full results you can export them to a spreadsheet and begin your index gap analysis. There are two key questions you need to ask:
- Which URLs are being indexed that shouldn’t be?
- Which URLs aren’t being indexed that should be?
Copy and paste the URLs that should and shouldn’t be indexed to a separate tab or spreadsheet and get to work.
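If your exports run to hundreds of URLs, a short script beats eyeballing two spreadsheets side by side. Here’s a minimal sketch, assuming you’ve saved your Screaming Frog crawl and your list of Google-indexed URLs as two CSV files, each with a “url” column (the file names and column name are assumptions; adjust them to match your own exports):

```python
import csv

def load_urls(path):
    # Assumes a CSV with a "url" column; trailing slashes are stripped
    # so the same page isn't counted twice.
    with open(path, newline="") as f:
        return {row["url"].strip().rstrip("/") for row in csv.DictReader(f)}

crawled = load_urls("screaming_frog_crawl.csv")  # what your spider can reach
indexed = load_urls("google_indexed_urls.csv")   # what Google has indexed

print("Indexed but not found in the crawl (review for de-indexing):")
for url in sorted(indexed - crawled):
    print(" ", url)

print("Crawlable but not indexed (review why Google is skipping them):")
for url in sorted(crawled - indexed):
    print(" ", url)
```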
One of the key reasons the spider tool can crawl more pages than Google has indexed is poor site architecture. Generally, this isn’t such a big deal for Shopify stores because of their rigid page structures for /pages, /products, and /collections.
Having said that, if your site architecture has gotten messy and a page is buried 4 or more clicks away from your homepage, you’ll need to do some housekeeping so Google can find it and index it. We’ll address poor site architecture later on in this guide.
The Proper Way to De-Index Shopify Store Pages
If you have pages indexed in Google that you don’t want to be, there’s a right way and a wrong way to remedy it.
Let’s say you created a new collection page for a 50% off flash sale and the promotion has ended but you’re still getting traffic to the page via Google. Obviously, you don’t want to keep sending traffic to a page that’s full of out-of-stock products (assuming the sale went well) or products where you’re giving away too much margin.
(I talk about the dangers of perpetual discounting here.)

The wrong (and most common) way to de-index a page is to edit your theme’s Liquid code and add a noindex or nofollow rule for the page you want de-indexed. The trouble is that the page is still discoverable via the sitemap or from other links on your store.
The right way to de-index a page is to find the page or collection that you want to de-index and create a metafield with the following values:
"namespace" : "seo"
"key" : "hidden"
"value" : 1
"value_type" : "integer"
This will add noindex and nofollow rules to the page, remove it from the sitemap, and remove it from your store’s site search.
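If you have a lot of pages to hide, you can set the metafield programmatically instead of by hand. Here’s a rough sketch using Python and the requests library against the Shopify Admin API; the store domain, access token, and product ID are placeholders, and note that newer API versions replace value_type with a type field (e.g. number_integer):

```python
import requests

SHOP = "yourstore.myshopify.com"  # placeholder store domain
TOKEN = "shpat_xxxxxxxxxxxx"      # placeholder private-app access token
PRODUCT_ID = 1234567890           # placeholder ID of the product page to hide

# POST the seo.hidden metafield to the product's metafields endpoint.
url = f"https://{SHOP}/admin/api/2020-01/products/{PRODUCT_ID}/metafields.json"
payload = {
    "metafield": {
        "namespace": "seo",
        "key": "hidden",
        "value": 1,
        "value_type": "integer",  # newer versions use "type": "number_integer"
    }
}
resp = requests.post(url, json=payload, headers={"X-Shopify-Access-Token": TOKEN})
resp.raise_for_status()
print(resp.json())
```

The same payload should work for a collection via its own metafields endpoint, though it’s worth checking the docs for your API version.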
Google Search Console Crawl Stats Report
The crawl stats report gives you an overview of how many pages Google crawls on each scan. Look out for sharp spikes or dips, or a peak that’s higher than the total number of indexed pages on the site.
From our example earlier, if you remember, Klevercase has 311 indexed pages, but you can see that more than 500 pages were crawled in late July.
The increase could be from new products/collections being added or the submission of a new sitemap.
The good news is that the image below shows the frequency we would expect for this store, since the weekly spikes correspond to the <changefreq> parameter in the sitemap.

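You can also check what your sitemap declares directly rather than inferring it from the crawl chart. Here’s a quick sketch that fetches a sitemap and tallies its <changefreq> values; the URL is a placeholder, and since Shopify’s /sitemap.xml is an index file, you’ll want to point it at one of the child sitemaps the index lists:

```python
import requests
from collections import Counter
from xml.etree import ElementTree as ET

SM = "{http://www.sitemaps.org/schemas/sitemap/0.9}"  # sitemap XML namespace

def changefreqs(sitemap_url):
    """Fetch a sitemap and tally its <changefreq> values."""
    root = ET.fromstring(requests.get(sitemap_url).content)
    return Counter(el.text for el in root.iter(SM + "changefreq"))

# Placeholder URL: Shopify's /sitemap.xml is an index, so point this at a
# child sitemap it references, e.g. sitemap_products_1.xml.
print(changefreqs("https://yourstore.com/sitemap_products_1.xml"))
```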
Lifetime Organic Traffic Drops
Lifetime organic traffic drops take one of two forms: slow attrition, or a sudden fall triggered by a Google algorithm update. If your store has been affected by an update, you’ll be able to correlate the dates in Moz’s Google Algorithm Change Log with your organic traffic in Google Analytics.
A sudden drop will be obvious, but slow attrition is harder to spot without keeping a keen eye on your metrics. You’ll find the report in Google Analytics under Acquisition, then select Source/Medium, as shown below.

Alternatively, Barracuda’s free Panguin Tool overlays the algorithm updates on your organic traffic report so you can correlate the dates more easily.
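If you’d rather run the comparison yourself, a few lines of Python can flag suspicious dips. This sketch assumes a hypothetical CSV export of daily organic sessions from GA, plus a hand-maintained list of update dates taken from Moz’s changelog (the entries below are only examples):

```python
import csv
from datetime import date

# Hand-maintained from Moz's Google Algorithm Change Log (example entries).
UPDATES = {
    date(2018, 8, 1): "Medic core update",
    date(2019, 3, 12): "March 2019 core update",
}

# Hypothetical GA export with "date" (YYYY-MM-DD) and "sessions" columns.
with open("organic_sessions.csv", newline="") as f:
    rows = sorted((date.fromisoformat(r["date"]), int(r["sessions"]))
                  for r in csv.DictReader(f))

# Flag any day in the week after an update where sessions fell 20%+ versus
# the previous day -- a crude signal worth eyeballing, not proof of a penalty.
for update_day, name in UPDATES.items():
    window = [(d, s) for d, s in rows if 0 <= (d - update_day).days <= 7]
    for (d1, s1), (d2, s2) in zip(window, window[1:]):
        if s1 and (s1 - s2) / s1 >= 0.2:
            print(f"{name}: sessions fell {s1} -> {s2} on {d2}")
```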

Slow attrition is symptomatic of not keeping on top of your SEO. Causes to investigate are a Google update resulting in a minor penalty, increased competition, or short-lived and dubious SEO practices.
Like I said in part one of this article series, unless you’re clued-up about good and bad SEO practices, you can easily be taken advantage of by unscrupulous SEO agencies or freelancers and get penalized by Google. So-called ‘Black Hat’ SEO will cost you dearly; Google penalties can take weeks or even years to recover from.
If you have competitors hot on your heels then you can’t afford to drop the ball with your SEO efforts. In part one, I also showed you how the number one spot gets the lion’s share of the traffic and sales so you have to maintain your rankings or leapfrog your competitors.
SEMrush is a handy tool for keeping track of your keywords’ performance. It’s a great sanity check and also plugs some of the gaps that Google Analytics and GSC leave.
Manual Actions – Google Search Console
If you get a ‘Manual Action’ notification, it’s a screaming priority for you to fix it, and fast. You’ll only ever get a notification when a human reviewer at Google has determined that pages on your site don’t comply with Google’s webmaster quality guidelines.
Common causes are a hacked site that’s been littered with third-party spam, or unnatural links. I’ve never seen it on a Shopify store, but I’ve seen it plenty of times with WordPress sites.

Site Architecture
Site architecture means structuring your site in an SEO-friendly way, with good housekeeping and usability.
Collections
Your store’s collection pages need a logic to them that instantly tells your visitors what they’re likely to find in each collection, and tells the search engines the same.
It’s always best to create your collections for your customers first and SEO second. Google is sophisticated enough to know when you’ve created user-unfriendly collections resulting in dropdown menus with 100+ items on them.

Doing some competitive analysis is always a good idea to see how your competitors organize their collections and which keywords they are targeting with each collection page.
As a rule of thumb, you’ll want to keep things as simple as possible for your visitors and choose one keyword for each collection. Don’t be tempted to create a collection for each target keyword within a cluster of related keywords; it will kill usability by confusing your customers, and your sales will suffer as a result.
Just make sure that the keyword is included in the collection’s URL as well as its page title.

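To audit this at scale, you can map each collection to its target keyword and flag any misses. A minimal sketch, with made-up keywords, URLs, and titles:

```python
# Hypothetical mapping of target keyword -> (collection URL, page title).
collections = {
    "leather iphone cases": (
        "https://yourstore.com/collections/leather-iphone-cases",
        "Leather iPhone Cases | Your Store",
    ),
    "wooden phone stands": (
        "https://yourstore.com/collections/phone-accessories",  # keyword missing
        "Accessories | Your Store",
    ),
}

def slugify(keyword):
    """Approximate a Shopify-style handle: lowercase, hyphen-separated."""
    return keyword.lower().replace(" ", "-")

for keyword, (url, title) in collections.items():
    problems = []
    if slugify(keyword) not in url:
        problems.append("URL")
    if keyword.lower() not in title.lower():
        problems.append("title")
    if problems:
        print(f"'{keyword}' is missing from: {', '.join(problems)}")
```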
Larger stores with a diverse range of products have a more complex job on their hands. If you have a large store that’s dependent on organic traffic then you’ll need to do some deep keyword research and organize your collections according to high-volume target keywords.
If you need help choosing keywords, see my in-depth guide in part one of this article series.
You can also do some user testing and heatmap analysis to help understand how visitors navigate your collections.