How to Obliterate Every Bad Redirect & Crawl Error From Your Website

Marcus Taylor

Google recently announced an update to their guidelines surrounding sneaky redirects. While it seems very minor on the surface and targeted at a small minority of marketers using redirects manipulatively, it’s a good reminder to take a look at the topic of redirects.

In my experience, redirect issues tend to creep up. A good marketer will rarely make a mistake as big as missing a title tag or accidentally no-indexing a bunch of pages. But even the best of us have redirect issues.

Diagnosing crawl errors & bad redirects

The first step in cleaning up is to carry out a thorough diagnosis of all issues. To ensure a comprehensive diagnosis, I’d recommend running reports from a number of SEO tools. Below, I’ve used the Venture Harbour site as a guinea pig, running reports in Webmaster Tools, Screaming Frog and SEMrush.

1. Detecting Broken Links (404s) and Finding Problematic Hotspots

Broken links are not only a drain on bandwidth, they’re also one of the surest ways to get a user to leave your site.

One of my clients recently had over five hundred 404 errors showing up in Google Webmaster Tools. After fixing all of these, the average pages visited per user increased from 1.4 to 1.85 pages/visit, and there was a noticeable decrease in bounce rate.

Webmaster Tools remains my favorite tool for detecting 404 errors, as it shows you both the URL that’s returning the 404 and the pages that are linking to it.


404 errors are one of our biggest problems when it comes to crawl errors and bad redirects, so it’s a good idea to stamp these out as a priority.

2. Detecting Bad 301 Redirects

For spotting pointless 301 redirects, I like to use Screaming Frog’s SEO Spider. The biggest culprit for WordPress sites is accidentally linking to pages without the trailing slash, which, as you can see below, has caused a few silly 301 redirects on the Venture Harbour site.


In an ideal world you’d have no internal 301 redirects at all, as every link would (in theory) be updated to point at the URL it redirects to. While that may be practical for small sites, it could take weeks or months of work on larger ones.

That said, this report will help you understand the magnitude of the problem, and most likely give you a few quick wins that will solve the majority of issues.
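
If you just want to spot-check a handful of URLs without running a full crawl, a few lines of scripting will do the job. Below is a minimal sketch in Python, assuming the requests library is available; the example.com URLs are placeholders for your own internal links.

    # A quick spot-check for internal links that answer with a 301, for example
    # because the trailing slash is missing. The URL list is a placeholder;
    # in practice you'd paste in URLs from your own crawl.
    import requests

    urls = [
        "https://www.example.com/blog",   # no trailing slash, likely to 301
        "https://www.example.com/blog/",  # canonical version, should be 200
    ]

    for url in urls:
        # allow_redirects=False shows the redirect itself instead of silently
        # following it through to the final page
        response = requests.head(url, allow_redirects=False, timeout=10)
        if response.status_code in (301, 302):
            print(url, "->", response.status_code, "->", response.headers.get("Location"))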

3. Detecting All Other Crawl Issues

I have to admit, I only came across SEMrush’s Site Audit tool while preparing this blog post, but I’m glad I did. While my favourite thing about this tool is the clear bird’s-eye view it gives of a site’s on-site issues (incredibly handy for Panda clean-ups), it’s also pretty good at detecting any crawl issues that may have slipped through the cracks.

As we can see from the report below, I had a fair number of broken external links coming from a post I wrote on responsive WordPress themes (which had over 250 links on the page)! Checking those links manually would be time-consuming and incredibly dull, so it’s a relief that these tools make them easy to spot.


Cleaning up bad redirects

I’d recommend first exporting each of the reports above to CSV, so that you can compile one large list of all issues, categorized by response status.
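
As a rough illustration of that compilation step, here is a short Python sketch using pandas. The file names and the "Status Code" heading are assumptions; each tool labels its export columns differently, so rename them to match your own exports.

    # Combine the CSV exports from Webmaster Tools, Screaming Frog and SEMrush
    # into one master list and split it out by response status. The file names
    # and the "Status Code" heading are assumptions; adjust them to match
    # your own exports.
    import pandas as pd

    reports = ["webmaster_tools.csv", "screaming_frog.csv", "semrush.csv"]
    issues = pd.concat([pd.read_csv(path) for path in reports], ignore_index=True)

    # One file per status code (404s, 301s and so on) makes the clean-up
    # easy to divide up
    for status, group in issues.groupby("Status Code"):
        group.to_csv(f"issues_{status}.csv", index=False)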

To start off, we want to turn all 404 broken links into 301 redirects. To do this, you’ll want to create a column with the broken URL (with the domain stripped), and another column with the full URL of where the broken link should redirect. From here, it should be relatively easy to create a concatenation rule that looks something like the example below — which is ready to copy and paste into your .htaccess file.
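
As an illustration of that step, here is a small Python sketch that reads a two-column CSV and prints Redirect 301 lines ready for .htaccess. The file name and column headings are placeholders; the spreadsheet equivalent is a concatenation formula along the lines of ="Redirect 301 "&A2&" "&B2.

    # Read a two-column CSV of broken paths and their destinations and print
    # "Redirect 301" lines ready to paste into .htaccess. The file name and
    # column headings are placeholders; adjust them to match your spreadsheet.
    import csv

    with open("broken_links.csv", newline="") as f:
        for row in csv.DictReader(f):
            # e.g. Redirect 301 /old-page/ https://www.example.com/new-page/
            print(f'Redirect 301 {row["Broken Path"]} {row["Destination URL"]}')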


Once you’ve tackled all of your broken links, it’s time to focus on turning bad 301s into plain old 200s (working links). Because bad 301s come in all shapes and sizes, there’s not really an efficient trick to cleaning them up.

The best we can do is use the reports above to identify which 20% of bad 301s are causing 80% of the errors. Often this is a redirected link in your header or footer, as those links exist on every page of your site.

Once you’ve identified the biggest culprits, go through and update each link to point directly at the URL it redirects to. Hopefully you’ll end up with a site free of troublesome bad redirects and crawl errors.

Preventing crawl errors from occurring in the future

As mentioned in the introduction of this post, crawl errors are inevitable, especially if you run a large site with multiple writers.

However, there are a few ways you can reduce the likelihood of them popping up. The first thing to realize is that crawl errors compound: every time a piece of bad code is copied and pasted, or a new page is created with a crawl error on it, the problem grows. Think of crawl errors like pest control: clean up 99% of them and they’ll soon pop up again; clean up 100% and you’ll probably eradicate them for a long while.

Ayima Redirect Path & Broken Link Checker

One of the most effective and simple systems I’ve used to keep crawl errors at bay on my own sites and clients’ sites is ensuring that all site contributors have two Chrome plugins installed: Redirect Path and Check My Links.

I encourage writers to run Check My Links before any page is published to detect whether there are any broken links in their content.


On top of that, I recommend that everyone on the team keeps an eye on the Redirect Path Chrome plugin to spot any silly 301 redirects, like the one I just spotted below.


Between these two tactics, you have a good system for keeping crawl errors at bay. Of course, you’ll still need to run a site audit every now and again, but you’ll have far less to do if you implement systems that reduce issues in the first place.

In Summary

Admittedly, most of us have bigger fish to fry when it comes to SEO, and a few pesky 301s are unlikely to have any major repercussions. That said, redirects can easily get out of hand. Considering they indirectly affect site speed, user experience, conversion rate, search rankings and site security, I believe it’s worthwhile to create a system for efficiently keeping them at bay.

If anyone’s come across any other good plugins, tools or advice on handling crawl errors, I’d be keen to learn about them in the comments below.

Author bio

Marcus Taylor is the founder of Venture Harbour, a marketing agency that specializes in working with brands in the entertainment industries. Marcus also runs an eclectic portfolio of his own projects, including WhatisMyComfortZone.com, a comfort zone calculator he spoke about at TEDx, Qosy and MusicLawContracts.com.
