What to Do When Your Organic Traffic Drops

Losing organic traffic to your website can be a scary and costly experience, especially if you are unsure of the factors behind your site’s slipping traffic. Luckily, many of the scenarios that cause a sharp traffic drop can be fixed almost as quickly as the drop occurred. No matter the cause, checking the following aspects of your site can usually identify the issues adversely affecting your organic traffic.

Check Google Analytics

If your organic traffic trend looks like this, it’s very likely your site was somehow misconfigured.

If you notice a drop in organic traffic, the best place to start is likely the same place you noticed the drop in the first place: Google Analytics. Take a look at your organic traffic trend and try to pinpoint the date when traffic began to fall off. If traffic dropped sharply over a short period, it is more than likely the result of a website misconfiguration. If traffic declined gradually over a long span of time, it is more likely the result of external factors like competitors and keyword rankings. Once you’ve established the split between your current traffic and your prior “normal” traffic, compare organic landing pages for the two date ranges in Google Analytics. Is there a noticeable traffic drop on any one page or set of pages? If so, keep these specific pages in mind as you take a closer look at your site.
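If you’d rather pull this comparison programmatically, here’s a minimal sketch using the GA4 Data API via the google-analytics-data Python client. The property ID and date ranges are placeholders, and the dimension and metric names assume a standard GA4 setup:

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Filter, FilterExpression, Metric, RunReportRequest,
)

PROPERTY_ID = "123456789"  # placeholder -- replace with your GA4 property ID

client = BetaAnalyticsDataClient()  # auth via GOOGLE_APPLICATION_CREDENTIALS
request = RunReportRequest(
    property=f"properties/{PROPERTY_ID}",
    # Two date ranges: the period after the drop vs. prior "normal" traffic.
    date_ranges=[
        DateRange(start_date="2024-03-01", end_date="2024-03-21", name="current"),
        DateRange(start_date="2024-02-01", end_date="2024-02-21", name="previous"),
    ],
    dimensions=[Dimension(name="landingPage")],
    metrics=[Metric(name="sessions")],
    # Restrict the report to the organic channel.
    dimension_filter=FilterExpression(
        filter=Filter(
            field_name="sessionDefaultChannelGroup",
            string_filter=Filter.StringFilter(value="Organic Search"),
        )
    ),
)

# With multiple date ranges, the API appends the range name as the final
# dimension value on each row, so per-page sessions can be diffed by period.
for row in client.run_report(request).rows:
    page, period = row.dimension_values[0].value, row.dimension_values[-1].value
    print(period, page, row.metric_values[0].value)
```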

One other item to check in Google Analytics is your other traffic sources. Is the drop exclusive to organic traffic, or have other channels changed as well? If traffic has dropped across all channels, the issue is likely bigger than the organic channel alone and could be the result of technical issues on the website or the server. If organic traffic has dropped while other channels have grown, it’s worth verifying that organic traffic hasn’t slowed as a result of other channels like PPC. An increase in paid traffic can easily come at the cost of organic traffic, so it’s important to identify whether your visitors are still reaching the site, just via a different channel.
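A similar sketch can break sessions out by channel so you can see whether paid growth lines up with the organic decline; again, the property ID and dates are placeholders:

```python
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

client = BetaAnalyticsDataClient()
request = RunReportRequest(
    property="properties/123456789",  # placeholder GA4 property ID
    date_ranges=[DateRange(start_date="28daysAgo", end_date="yesterday")],
    dimensions=[Dimension(name="sessionDefaultChannelGroup")],
    metrics=[Metric(name="sessions")],
)
# Sessions per channel: a paid spike alongside an organic dip suggests
# the traffic may simply be arriving through a different door.
for row in client.run_report(request).rows:
    print(row.dimension_values[0].value, row.metric_values[0].value)
```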

Check Google Search Console


A Google Search Console property with a manual action penalty

Once you have an idea of the trajectory of your traffic, the next step is to look at Google Search Console. The first place to check is your messages, to ensure you haven’t been subjected to a manual action penalty. A manual action can be issued for any number of reasons, but it is uncommon to receive one without specifically participating in behavior that Google frowns upon. If you don’t have any manual actions, take a look at your indexation status and check whether there has been a drop-off in total indexed pages. Lastly, use the URL Inspection tool on pages where you’ve seen traffic slip and confirm that Google can crawl and index each page without issue.
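If you have more than a handful of pages to check, the Search Console URL Inspection API can automate that last step. Here’s a minimal sketch assuming a service account that has been granted access to the property; the credential file name and URLs are placeholders:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# "service-account.json" is a placeholder; the service account must be
# added as a user on the Search Console property.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

result = service.urlInspection().index().inspect(body={
    "siteUrl": "https://www.example.com/",            # your verified property
    "inspectionUrl": "https://www.example.com/page",  # page with slipping traffic
}).execute()

status = result["inspectionResult"]["indexStatusResult"]
# coverageState explains why the page is (or isn't) indexed;
# robotsTxtState flags robots.txt blocks.
print(status.get("verdict"), status.get("coverageState"), status.get("robotsTxtState"))
```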


Run a Crawl of Your Site

A Screaming Frog site crawl

If your site experienced a dramatic drop in traffic, the best way to pinpoint the problem is to run a crawl of your site using a tool like Screaming Frog or any number of online SEO tools. When reviewing the results of the crawl, there are several things to look out for that can result in pages being de-indexed or not being crawled, including the following:

Meta robots tags: ensure that no pages are unintentionally tagged with the meta robots noindex tag. Pages with a noindex tag will most likely be dropped from Google’s index the next time they are crawled, so an unintended noindex can have catastrophic results, especially if it affects an important page like your homepage.

X-Robots-Tag headers: in addition to meta robots tags, noindex directives can be communicated to search engines via the X-Robots-Tag HTTP response header. Because this header is typically configured at the server level, it can easily apply to an entire website. Some SEO tools do not check it by default, so be sure to confirm there are no unexpected directives in your header responses.

Robots.txt: a misconfiguration in your robots.txt file can easily block search engines from properly crawling your site. When running a crawl, make sure to instruct your crawler to obey your robots.txt file so that it simulates the behavior of Googlebot; some crawlers ignore robots.txt directives unless you change their settings. To quickly check whether Google can crawl a specific page, you can also use Google’s robots.txt tester in Search Console. If the page can’t be crawled, the tool will display the directive in the robots.txt file that is blocking it, and you can update your robots.txt file and test the page again. For a scripted check of all three directives at once, see the sketch after this list.
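As promised above, a short script can surface all three directives for a handful of URLs outside of a full crawler. A minimal sketch using requests and BeautifulSoup, with placeholder URLs and a simplified Googlebot user-agent:

```python
import urllib.robotparser
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

USER_AGENT = "Googlebot"  # simplified; real Googlebot UA strings are longer

def check_directives(url: str) -> dict:
    """Report robots.txt, X-Robots-Tag, and meta robots status for one URL."""
    parts = urlparse(url)
    rp = urllib.robotparser.RobotFileParser(
        f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()

    resp = requests.get(url, headers={"User-Agent": USER_AGENT}, timeout=10)
    meta = BeautifulSoup(resp.text, "html.parser").find(
        "meta", attrs={"name": "robots"})

    return {
        "robots_txt_allows": rp.can_fetch(USER_AGENT, url),
        "x_robots_tag": resp.headers.get("X-Robots-Tag"),  # e.g. "noindex"
        "meta_robots": meta.get("content") if meta else None,
    }

for url in ["https://www.example.com/", "https://www.example.com/page"]:
    print(url, check_directives(url))
```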

Beyond issues that can block your site from Google entirely, crawling tools can also surface problems that can strongly affect your site’s ability to rank on search engines, such as duplicate content (generally caused by a lack of a canonical tag), misconfigured redirects, or pages with missing or incomplete metadata.
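For the canonical piece specifically, the same approach extends to a quick check that each page declares the canonical URL you expect; the URL here is a placeholder:

```python
import requests
from bs4 import BeautifulSoup

def canonical_of(url: str):
    """Return the rel=canonical target of a page, or None if missing."""
    resp = requests.get(url, timeout=10)
    link = BeautifulSoup(resp.text, "html.parser").find("link", rel="canonical")
    return link.get("href") if link else None

url = "https://www.example.com/page"
canonical = canonical_of(url)
if canonical is None:
    print(f"{url}: no canonical tag")
elif canonical != url:
    print(f"{url}: canonicalizes to {canonical}")
```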

Check Your Keyword Rankings


At a minimum, check your average keyword positioning for major shifts in Search Console

If your organic traffic has decreased over a longer period of time rather than in a sharp drop, the issue could be a function of your keyword rankings slipping. If you use any kind of keyword rank tracking software, take a look at your tracked keywords and the trajectory of your rankings. If your rankings have slowly declined, it’s important to review the SERPs (Search Engine Results Pages) and determine which sites have pushed your pages down in the rankings. If you find that competitor sites are outranking you where they previously didn’t, visit the pages in question and try to determine why they are outranking yours: Are they targeting different keywords in their metadata? Does the page better satisfy the user’s intent with a better experience? It could also be that their pages are simply better optimized for the keyword in question. Identifying the gaps between your site and a competitor’s can help guide how to improve your pages, but moving back up in the rankings may require some trial and error as well as patience.
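If you don’t use rank-tracking software, the Search Console Search Analytics API can approximate this by comparing average positions across two periods. A minimal sketch, assuming the same service-account setup as above and placeholder site URL and dates:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE = "https://www.example.com/"  # your verified Search Console property

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

def avg_positions(start: str, end: str) -> dict:
    """Average position per query for a date range."""
    body = {"startDate": start, "endDate": end,
            "dimensions": ["query"], "rowLimit": 500}
    rows = service.searchanalytics().query(
        siteUrl=SITE, body=body).execute().get("rows", [])
    return {row["keys"][0]: row["position"] for row in rows}

before = avg_positions("2024-01-01", "2024-01-31")
after = avg_positions("2024-03-01", "2024-03-21")

# Flag queries that slipped three or more positions on average.
for query in set(before) & set(after):
    if after[query] - before[query] >= 3:
        print(f"{query}: {before[query]:.1f} -> {after[query]:.1f}")
```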


Organic traffic, like all of your website’s traffic, should be closely monitored so that you can quickly identify and fix potential issues before the effects worsen. Better yet, proactively checking the above items after major site changes can allow you to fix problems before they affect your traffic at all. If you do find yourself with a site where the damage has already been done, running through the above list can help identify the factors behind your site’s sagging organic traffic and get you back on track.
