
Google’s Fetch and Render: Why It’s Important

Pat Marcello

Lots of folks think that SEO is about “gaming the system.” Well… that’s true of “black hat” SEO, but those of us who are trying to make pages easy for Google to crawl and evaluate are working towards what I like to think of as “natural” SEO.

We put in all the right meta tags, make sure that a page is about what its meta description tells search engines it’s about, and generally try to streamline things so that spiders won’t be caught in traps or leave pages entirely.

So, let’s say that you have recently built a new website. Is it search friendly? Or more importantly, is it Google friendly? No, Google certainly doesn’t pay me, and I don’t worship at the Google altar, either, but let’s face it: Google brings the most traffic, and for some reason, that traffic seems to convert. That’s why we want to please the gods of Google as much as we possibly can.

Where to Find Out If You’ve Done the Right Things

No matter how much you profess to hate Google, you have to agree that they have given us all the tools we need to be successful in search and much, much more. Parts of my business run on Google tools, and they give us lots of stuff to help us figure things out. Let’s take Google’s Fetch tool and its Fetch and Render option, for example.

If you’re new to SEO and/or online marketing, and don’t yet have your site registered with Google Webmaster Tools, do it. NOW!

First, go to http://google.com/webmasters and verify your site. You can upload a small file from Google to your main directory or add a snippet of code to your page. Once your site is verified as yours (or one you manage), all you have to do (assuming you’re already signed into Google) is click “Sign in to Webmaster Tools.”
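For reference, the two verification methods look something like this; the token below is a made-up placeholder, and Google generates the real one for you in the verification screen:

    <!-- Meta-tag method: paste inside the <head> of your homepage. -->
    <!-- The content value is a placeholder; use the token Google gives you. -->
    <meta name="google-site-verification" content="AbC123-placeholder-token" />

The file method is just as simple: Google hands you a tiny HTML file with a similar token in its name, and you upload it, unchanged, to your site’s root directory.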

[Screenshot: the “Sign in to Webmaster Tools” button]

Once inside, you’ll find a wealth of information to help you see how your site is faring in Google’s eyes (not Bing’s, Yahoo’s, or DuckDuckGo’s; only Google’s). Webmaster Tools doesn’t just tell you how many keywords a site is ranking for, but also how many backlinks Google is counting for that particular Web property, whether something has put your site on Google’s bad side, and, importantly right now, whether your site is “mobile friendly.”

So much of this information is critical to getting Google traffic. If you don’t go inside at least once a month to check for “you have issues” messages, including “manual actions” by Google, you could be living in a fool’s paradise, eh? You need to clear those things up right away.

However, if you’re doing things right, and not following anyone else’s scam tactics to rank well, you will probably never see such messages. But then again, you might be doing things wrong that you think are right. It’s best to check periodically to know you’re cool. It only takes minutes and it’s worth the effort.

How to Pinpoint Crawl Weakness

One element that many webmasters overlook is the “fetch” page function. I’m not just an SEO; I’m also a designer/developer, and I check every website I build to make sure that Google loves it. I allow Google to “fetch” it in my client’s account. (These functions only work for sites that have already been verified, so you can’t go into your account and fetch someone else’s site.)

When you run a fetch, Google sends out a robot to scan the page you designate in the query box, and you can tell it which platform to fetch for, too: desktop or various forms of mobile:

[Screenshot: the Fetch as Google form]

You can fetch your homepage by leaving the box blank, or fetch any interior directory or page by typing in its path. This will help you see whether there are any issues with connectivity or security, and it’s pretty quick. It takes less than 30 seconds and spits back your page as code, which is what Googlebot sees. A simple fetch never shows the “render” part of the equation, meaning what your visitors see.
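If you want a rough sanity check from outside Webmaster Tools, you can approximate a simple fetch from the command line. This is only a sketch, assuming you have curl installed and substitute your own URL:

    # Request the page roughly the way a crawler would: send Googlebot's
    # user-agent string and print only the response headers (status, redirects).
    # Note: -I sends a HEAD request, which some servers treat differently than GET.
    curl -I -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" https://www.example.com/

It won’t render anything, but it quickly shows the status code and any redirects Googlebot would run into.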

When you use the “Fetch and Render” option, Google will tell you about elements that spiders can’t see or are blocked from seeing. Googlebot runs through the page (or set of pages you designate by entering a directory) and all the links on or in it; “in it” meaning behind the face of your page, in the code. Those links can point to images, separate CSS files, JavaScript, or other code, for example.
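To make that concrete, here’s a hypothetical robots.txt that would cause exactly this kind of blocked-resource trouble; the paths are placeholders:

    User-agent: *
    # These rules block the stylesheets, scripts, and images the page links to,
    # so Googlebot can fetch the HTML but can't render it the way visitors see it.
    Disallow: /css/
    Disallow: /js/
    Disallow: /images/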

When the fetch and render is complete, Webmaster Tools then shows you two images – one that Google sees and another that your viewers see.

[Screenshot: Fetch and Render results for the MagnaSites homepage]

You’ll notice that the images for my homepage are different. Visitors are asked to “Like” my page on Facebook, which comes up in a lightbox pop-up. But in this fetch and render, Google was happy. Everything looked good and there were no crawl errors or other issues. (Better NOT be. Ha!)

Fixing Trouble Spots

But let’s take a site with issues. After Google completes the fetch, you’ll see this:

[Screenshot: a fetch that returned “Partial” status]

“Partial” means that Google was able to crawl part of your site with no issues, but there were other parts they couldn’t get near.

If you click the listing (/ in this case), you can find out which pages are at issue.

[Screenshot: the list of unreachable resources]

Happily, this error is only a small image that is unreachable. It probably took too long to load or is broken. Easily fixed. But it’s issues like this that you need to be watching for and repairing, especially if you’re posting elements from other websites (videos, slideshows, text links, etc.).

Other issues will have to do with scripts that might be running. You can also check your robots.txt and edit it to be sure that Google is seeing all that you want it to see and blocking the things you don’t want crawled. (There’s a robots.txt checker in Webmaster Tools, too.)
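As a sketch of the friendlier version (same hypothetical layout as the example above): block only what you genuinely want uncrawled, and leave rendering assets open:

    User-agent: *
    # Keep genuinely private areas out of the crawl...
    Disallow: /members/
    # ...and explicitly allow the assets Google needs to render pages.
    # (Anything not disallowed is crawlable by default anyway.)
    Allow: /css/
    Allow: /js/

    Sitemap: https://www.example.com/sitemap.xml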

Other codes that Google might return are:

  • Redirected: Your server (via an HTTP redirect) or JavaScript has redirected Googlebot to another page.
  • Not found: Though Google can contact your site, they can’t find the page you specified.
  • Not authorized: Google can contact your site, but you have blocked the page from being crawled.
  • DNS not found: The domain name that you entered isn’t registered or reachable.
  • Blocked: Your robots.txt file has blocked Google from reaching the page.
  • Unreachable robots.txt: Your robots.txt file is missing or couldn’t be reached.
  • Unreachable: Google reached your server, but was sent home without seeing the page. Probably timed out.
  • Temporarily unreachable: As above, but this is only a temporary issue.
  • Error: Google was prevented from fetching the page.
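Many of these statuses map directly onto what your server returns for the URL. If you want to see that response for yourself, here’s a small example using curl; substitute your own page:

    # Print the HTTP status code and any redirect target for a page.
    # 200 = reachable, 301/302 = "Redirected", 404 = "Not found",
    # 403 = "Not authorized"; a timeout will surface as "Unreachable."
    curl -sS -o /dev/null -w "%{http_code} %{redirect_url}\n" https://www.example.com/some-page/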

The Bottom Line

The best thing to do is to keep your pages crawlable. If you make a change to a specific page, run it through the fetcher at least once to be sure that it’s available and A-OK for Googlebot, unless you don’t want it to be. (Content in a membership site is one reason that comes to mind.) The more accessible and crawlable your pages are, the more Google and all the other search engines will love you.
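If you do want a page kept out of search (the membership example above), a meta robots tag is the usual lever. A minimal sketch, with the page itself left crawlable so Google can actually see the tag:

    <!-- Place inside the <head> of pages you don't want indexed,
         e.g., members-only content. Google must be able to crawl the
         page to read this tag, so don't also block it in robots.txt. -->
    <meta name="robots" content="noindex, nofollow" />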

And we all know what happens then, right? TRAFFIC. Without that, you have nada.

Pat Marcello is President and SEO Manager at MagnaSites.com, a full-service digital marketing company that serves small- to medium-sized businesses. Follow her on Facebook, Twitter, or Google+. Pat’s last article for SEMrush was "Google's Fetch and Render: Why It's Important."

Comments

Hi, I checked and it says partial; when I checked again, it said blocked [link removed by moderator]. Please tell me what to do.
It's really helpful for me, as you cleared up my fetch and render doubts. But there is still a problem: the status shows as partial. What is this? Can you please clarify? I have a digital marketing agency [link removed by moderator]
Pat Marcello
Nirav Patil
Hi Nirav,

I just checked your robots.txt. I'm wondering if the special characters around your AWRANGE text are confusing spiders. Here's what JohnMu (Google) says:

"Partial means Google was unable to read/decipher all of the components trying to load because they were blocked by robots.txt (or possibly another cause)."

Some element on your site isn't loading properly.

Also here in Google Webmaster Central:

"Partial: Google got a response from your site and fetched the URL, but could not reach all resources referenced by the page because they were blocked by robots.txt files. If this is a fetch only, do a fetch and render. Examine the rendered page to see if any significant resources were blocked that could prevent Google from properly analyzing the meaning of the page. If significant resources were blocked, unblock the resources on robots.txt files that you own. For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot. See the list of resource fetch error descriptions."

I'm wondering if it isn't the "Word Art" in your robots.txt file. Words are fine, but you have a series of dashes and such, which may be confusing spiders. I'd remove that and try again.
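For anyone else decorating their robots.txt: anything that isn't a directive should at least sit behind the # comment marker, which crawlers ignore. A quick illustration (the Disallow path is hypothetical):

    # --------------------------------------------------
    # Banners like this are safe ONLY as comments:
    # everything after a # is ignored by crawlers.
    # --------------------------------------------------
    User-agent: *
    Disallow: /private/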
Pat Marcello, you have really done a good job. I have a quick question: I fetched and rendered some pages on my website, and the status showed "partial." Then I submitted my sitemap; the result showed I submitted 253 pages and only one was indexed. What could be the reason? Thanks.
Pat Marcello
SOLA
Have you checked your robots.txt to be sure you're not disallowing 252 pages? That could easily be the problem.

But the "partial" render is interesting, too. What's your niche? Could it be that Google doesn't like some of your content? (Google thinks it owns the Internet, you realize.) If you have any hidden text, too many links, or links leading to what they consider "bad neighborhoods" (gambling, for example), or any content that's undesirable, or for several other reasons, Google can decide your pages don't belong in their Index. They can consider them "low-quality," even if they're completely on the up and up.

Or... Google just hasn't crawled those 252 pages yet. Even if you have a sitemap, that alone won't guarantee inclusion in the Index.
I am facing some problems with structured data on my website [link removed by moderator]. What should I do to resolve it? Please help.
Pat Marcello
Rohit Mishra
Rohit, are you using a plugin for structured data? That might help. Otherwise, it's a consultation. Here are some plugins you can try: https://wordpress.org/plugins/search/structured+data/.
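If you'd rather hand-code it than use a plugin, structured data usually goes into the page as a JSON-LD block in the <head>. A minimal sketch with placeholder values:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Google's Fetch and Render: Why It's Important",
      "author": { "@type": "Person", "name": "Pat Marcello" },
      "datePublished": "2016-01-01"
    }
    </script>

Run the result through Google's structured data testing tool to catch errors before they show up in Webmaster Tools.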
Pat, thank you for writing this great article; you helped me fix a few issues on our site [link removed by moderator]. But the issue I have now is that when I check my partial render, it says that Google blocked the script. Do you have any recommendations for this situation, or another article I can read?
I was just working on a new article on my site about marketing. I usually run the Fetch tool to "report" my new article to Google, but after reading your article, I'm not sure if that's actually how it works. This is my new article, by the way: http://www.rocasoftware.net/blog/importance-internet-marketing-brand/

I'll keep searching for information, but it would be great if you have information about how to let Google know that you just posted new material. Thanks for your information; it's been very helpful.
Pat Marcello
Rodrigo Roca
Hi Rodrigo,

That's not what the fetch tool is about. It's more about seeing how your page looks to a Google spider, after which Google will tell you if your page is ready for search. It has nothing to do with crawling.

Google crawls all pages automatically. If you want to take the time, you can submit your site to Google: https://support.google.com/webmasters/answer/6259634?hl=en. But as long as you already have Webmaster Tools, I don't think it's necessary. Google already knows your site exists.
Pat Marcello
Thanks for the informative post; it clears up a lot of questions. I have switched from http to https (done automatically with Weebly), and of 233 pages, only 53 are indexed (it was 190+ when I had http). I guess it needs time. Nearly all pages are showing the https URL when I search site: mysite dot com/blahblahblah dot html.

When images or text are linked to external sites with only http and not https, does this affect the indexing?

And if I FETCH a new page, it comes up in Google search results pretty quickly when searching site:blahblahblah dot com, but it takes forever (or never happens) if I leave Google to index the sitemap.

I will be reading more of your reports... thank you
Pat Marcello
trevor jones
Hi Trevor,

Great! I hope you will.

Google may be indexing the pages that get the most traffic first. My site was the same way. Now, of 301 pages, 268 are indexed. It was MUCH lower in the beginning. So, give Google a chance to catch up.
charle john
Thanks for posting this interesting article. It helped me a lot in solving the indexing problem my website [link removed by moderator] was suffering from. Once again, I would like to thank Pat Marcello for sharing this informative article.
Pat Marcello
charle john
Welcome, Charle! So happy to help.
It is really a nice and helpful piece of information. I’m happy that you shared this useful info with us. Please keep us informed like this. Thanks for sharing.
Thanks, as [link removed by moderator] is having difficulty getting its pages indexed in Google's search engine.
Thanks for this helpful info, as [link removed by moderator] has been having difficulty getting proper content and relevant keywords indexed.
We now see why.
Really helpful for our site [link removed by moderator].
Thanks :)
Thanks, www.highclones.com appreciates the help.
Should I do Fetch as Google or Fetch and Render? Also, do I repeat this process when I make changes to my pages? How do I know if my pages are indexed? Thank you so much!! :)
Pat Marcello
Alisha
Hi Alisha,

Sorry it's taken so long to get back to you. I've been pretty swamped. (Not complaining.)

If you do a Fetch, you'll see the page as a spider sees it. Fetch and render just means they'll show the aforementioned as well as how the page looks on the Web.

Webmaster Tools will let you know how many of your pages are indexed. The best thing to do is to submit a current sitemap. If you're using Yoast SEO, it's easy to get your sitemap information.
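And for anyone building a sitemap by hand instead of with a plugin, it's just a small XML file; the URLs and dates below are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2016-01-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/blog/some-post/</loc>
        <lastmod>2016-01-01</lastmod>
      </url>
    </urlset>

Upload it to your root directory and submit it under Sitemaps in Webmaster Tools.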
Very helpful advice, because we had to change all our meta tags and titles on our site www.telboysrecycling.co.uk, and now we have done the fetch properly. Many thanks.
I have added a post to my blog. After publishing the blog post, I added the URL for fetching in Google Webmaster Tools. But my blog post, http://www.navthemes.com/how-to-install-crazyegg-in-wordpress-website/, is still not cached. How much time will it take?
James Smith
Hello
Please let me know why this is happening with my blog.
Pat Marcello
James Smith
Hi James, I bet it's indexed now! :) No telling how much time things like that take. It's all Google. But getting your site into Webmaster Tools and submitting a sitemap will help.
I followed the given Google indexing strategies for my own website www.vtechedu.in. It worked well, but when I checked days later, the URL was not showing as indexed. Can you suggest another option for getting indexed in Google?
Pat, great article! Have you ever had an issue with the mobile friendly tool doing the same thing (partials/temporary errors with resources), but getting different results almost each time? It's been happening on a few sites.
Pat Marcello
J Silva
Again, I'm sure you resolved this by now. The mobile friendly tool and much else has changed, like Mobile-First indexing. Google is much better at getting mobile right.