Mobile indexing as it applies to AI

Cindy Krum’s keynote at EngagePDX a few weeks ago was both fascinating and a little disturbing:

More detail can be found in this post: Understanding Mobile-First Indexing (2/3): The Long-Term Impact on SEO

Mobile-first indexing is not the same as mobile-first design, which is about making sure your site is friendly to mobile devices. The internet is expanding to the point where Google cannot crawl everything, and crawling is in itself inefficient. Indexing is moving to a place where there are no URLs: Google (or Alexa, or a similar assistant) will tell you the answer to your question regardless of whether there is a URL associated with it. For marketers this means we need to be very clear about what question our content answers, and structured data matters again. See my recommendation to set your pages up with structured data and rich snippets, and then my reversal when Google told us rich snippets were over. Tagging your content correctly will help our future AI overlords understand that we hope to be the answer chosen for voice searches and other searches.
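If you want to experiment with that kind of tagging, a minimal sketch of a schema.org FAQPage block in JSON-LD looks like this (the question and answer text are invented placeholders, and which types Google rewards changes over time, so treat this as illustrative):

```html
<!-- Hypothetical example: JSON-LD declaring exactly which question this
     page answers. All content values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is mobile-first indexing?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Mobile-first indexing means Google primarily uses the mobile version of a page for indexing and ranking."
    }
  }]
}
</script>
```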

Knowledge graph entry for Knowledge graph

Click to embiggen: Knowledge Graph entry for Knowledge Graph

For people who use the internet, this is actually already happening. When you use Knowledge Graph results, or your Google Now results, or anytime you ask any voice-enabled device for an answer, you are not only using primitive AI, but you are also training the AI to understand context and relevance. The future is pretty amazing, and I’m sure there is no reason to worry about Skynet. Yet.

Cheat sheet for Google algorithm updates

Google has been very busy with updates over the past few years and I sometimes need a quick reference to remember the differences in the most important algorithm updates — Panda, Penguin, and Hummingbird. This infographic should be a good cheat sheet:

Google algorithm changes cheat sheet

Google algorithm changes cheatsheet — penguin, panda, hummingbird (click to embiggen)

Below is a little more information for reference:

Major Google algorithm updates 2011- 2014

  1. Panda was introduced in February 2011 and the last update was Panda 4.1 in October 2014.
    • What it is: An algorithm update that identifies and demotes low-quality pages such as content farms, thin content, and duplicate content — whether from scrapers or from infrastructure issues on your own site.
    • Action: Don’t republish stolen content, use a canonical tag to clean up duplicate content, and make sure what you publish on your site is valuable to your reader and that each URL has completely unique content.
    • Search Engine Land’s Google: Panda Update
  2. Penguin was introduced in April 2012 and the last update was Penguin 3.0 in December 2014.
    • What it is: An algorithm update that penalizes sites using spammy link techniques, like link farms and other “unnatural” linking practices.
    • Action: Clean up suspect links leading to your site. Use Google Disavow to distance your site from bad links. Don’t buy links.
    • Search Engine Land’s Google: Penguin Update
  3. Hummingbird was introduced in August 2013 (although it was not publicized until late September 2013) and is not exactly an algorithm update, but an actual change to the SERPs.
    • What it is: Refresh of the entire Google platform to direct searchers to content that answers questions more quickly with the user intent in mind. Hummingbird also takes into account previous searches and is a huge change to how Google provides results.
    • Action: Create content that answers the queries you think your target audience might have and do not try to “rank” for a word or, worse, multiple and not completely related words. Think of your content as a way to fully answer questions about whatever your site is about with very focused pages.
    • Search Engine Land’s Google: Hummingbird Update
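The canonical-tag cleanup recommended for Panda above is a one-line addition to the `<head>` of each duplicate page (the URL here is a placeholder):

```html
<!-- On every duplicate or parameter-laden variant of a page, point back
     to the one version you want indexed. URL is a placeholder. -->
<link rel="canonical" href="https://example.com/cat-hair-crafts/">
```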

Many of these changes are true improvements to the search experience, although many sites felt they were penalized unfairly. I’m certain more interesting updates await in 2015.

Google Authorship is dead, but rich snippets and structured data still matter

ErnestCat has questions about rich snippets

As of August, Google has completely discontinued all support of Google Authorship (SEL article). I’m disappointed because it was one of the more interesting experiments in semantic search, and also because I have to eat my words. Two years ago I began recommending author markup to all the authors and web sites I work with. But I’m taking back everything I said in this post: Using rel=author tags on blogs to optimize content ownership. The process was too convoluted, and because of that too few authors went to the trouble of linking their content to their Google+ accounts. Google tried to do the linking programmatically, but often failed in bizarre ways — giving Truman Capote credit for continuing to write for the NYT even after his death, for example. So that experiment has ended.

But rich snippets and structured data are not dead yet. In fact, even Google Authorship may not be completely dead and could return in another form, using methods not dependent on humans marking up their pages. Structured data for non-author content is still very much in use, and Google is still supporting the use of markup for SERP enhancements. It may be that eventually Google will use Knowledge Graph/Knowledge Vault and other sources to pull this structured data from web content and associate it with search queries — but for now, whatever markup you can add to your pages that helps Google understand what data your site provides can only help. That said, I wouldn’t recommend making a huge investment in adding this markup; Google obviously hasn’t quite figured out what help it needs from webmasters in creating results with structured data.

How to unpersonalize search results

The world beyond Google

People often ask how they can get an “unfiltered, unpersonalized view” of their “rankings” on Google. The short answer is: you can’t.

Unpersonalizing your Google results is a useless exercise
You can remove some personalization from your results, but from an SEO standpoint, there is absolutely no reason to monitor slightly less personalized Google results. Just like you, your customers and audience are getting personalized results in Google, and unless you figure out a way to recreate each visitor’s Google SERPs, the only real way to gauge SEO success or failure is monitoring visit totals from search engines to your site and a handful of other web traffic metrics. The Google Hummingbird update has taken personalization to the point where rankings and keywords are irrelevant for gauging Google success, and other traditional search engines like Bing, Yahoo, and Baidu also personalize results to varying extents. Personalization is the future of commercial search.

Getting less personalized Google results is possible

However, you may have a real need to get results that do not include the ones you have already seen. For example, when you’re doing research on a topic and want to gather as much new data as possible, seeing the same top pages you already visited is not helpful — you want to know MORE, not just reinforce your previous search choices. This is exactly where personalization fails. (The same is true for paid search: clicking on an ad increases the likelihood you will see that same ad again.)

There are many great posts by respected SEOs with step-by-step instructions on how to depersonalize your searches, as well as Firefox and Chrome plug-ins that will give you less personalized results. I haven’t found one that I could say with absolute certainty stripped all the personalization from my Google searches, but there may be one out there. This article from 2011 covers some of the most basic hacks: Google’s Un-Personalized Search. Tools to Hack the Code. Searching for [disable personalized search] may give you more options (many of which no longer work).

How I get unpersonalized results

Although I know I will never see absolute rankings for Google, I do have a couple of methods I use when I want to exit my own search history echo chamber.

Most guides for getting less personalized results in standard Google recommend multiple steps (many of which I don’t bother with):

  1. Log out of Google account
  2. Clear your search history
  3. Depending on your browser, set your privacy settings to not personalize, etc.
  4. Wait until the full moon and at midnight shout, “GoogleJuice! GoogleJuice! GoogleJuice!”
  5. Append &pws=0 to the end of your search URL
  6. Expand or change your location by also appending &gl=us (or whatever country code you want to search in) to your search URL

You will end up with a URL that looks something like this:
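Assuming a plain web search on google.com, the assembled URL would look something like this (the query term here is just an example):

```
https://www.google.com/search?q=dentist&pws=0&gl=us
```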

Here are my results:

Attempting to unpersonalize Google results with code

Attempting to unpersonalize Google results with code (click to embiggen)

All these results are local except for one. This isn’t really working for me, perhaps because I’m too lazy to change all my browser settings every time I want to get a purer search. A better method for me is to use the Google AdWords preview tool. Most likely you will need to sign in to use it. Follow these instructions to set up an account if you haven’t already. Then just input your search terms in the search field and see what you get:

Unpersonalizing Google results with Adwords preview

Unpersonalizing Google results with Adwords preview (click to embiggen)

You can adjust your results to see what people in different locations, languages, and devices might see. Here are my results:

Unpersonalized results from Google Adwords preview

Unpersonalized results from Google Adwords preview (click to embiggen)

Again, this is not useful for finding an absolute ranking unless your target audience has no search history or location, but it is good for finding other answers to the questions you may be trying to answer with Google. Also, you do have to choose some personalization options to get the results (language, country, device), so it is not completely unfiltered.

Using truly impersonal search engines
The best option I have found for getting unfiltered search results is to use a truly anonymous search engine like DuckDuckGo. Because DuckDuckGo is a completely different engine, it should not be used as an SEO tool to compare “ranking” against Google. It’s just a good search engine that does what search engines do best — find things on the web. Here’s my query:

Truly impersonal dentist offices on DuckDuckGo

Truly impersonal dentist offices on DuckDuckGo (click to embiggen)

The deep web option
If you really want to get not just unpersonalized, but completely anonymous search results outside the traditional web, you may want to try exploring the invisible web via Tor or other anonymizing networks. This is not a comparable alternative to Google searching on the open web, but a completely different option, to be approached with caution.

SEO without keywords

Life without keywords makes puppies sad

Pages, user intent, and user paths matter now

SEOs were finally adjusting to life without rankings when, late last summer, the Hummingbird algorithm update and the total keyword [not provided] change rolled out at around the same time. Every time there are Google changes the SEO world freaks out, but these were two pretty big shifts at once. With time, though, I’ve come to see this as an opportunity to re-energize my thinking about metrics, SEO, and the goals of web sites.

A world without keywords

Digital marketing has so many different data points available that focusing on keywords was the simplest and easiest option. But with so many variables, keywords never told the whole story. Now it’s time to work a little harder on SEO and metrics and create something better. And anyway, we cannot hack our way back to keyword data (believe me, I tried). Keywords are truly dead (for now, at least). The future is in refining our pages to answer the specific questions our visitors are trying to answer. And it just so happens that the Hummingbird update is very much centered on user intent and providing more relevant pages — highly focused pages are the way to simplify your metrics and win in Google with this new algo update.

What is your page for?

It’s not important to know all the keywords a page is ranking for, but it is important to already know what your page is about and how your visitors are using the page.  Questions to answer are:

  • Do you have visitors?
  • Where do these visitors come from?
  • Where do visitors go after they access your page?

Your metrics need to be as focused as your pages. And honestly, those giant “Top keywords” and “Top converting keywords” reports we digital marketing types were cranking out were never really that actionable or useful without context. Good news! We don’t have to make any more of those keyword reports — they are now officially meaningless. More useful data can be found by looking at the top pages overall to get a sense of what resonates with your audience. We can infer the keywords from that; most articles “rank” for the title of the content. But really, we should already know what our pages are for.

Hummingbird and user intent

Hummingbird deserves a complete post of its own, but I will echo Warren Lee’s statement that Hummingbird is about “concepts and not keywords” (Hummingbird In The Trenches: A Canary In The Coal Mine). Google is trying to answer complex questions more quickly using the data it has — the relationships between the keywords, the information Google has on the searcher, and the search history have far more influence on the SERPs than the keywords on the page. We can’t know how Google makes those leaps from intent to SERP. The only thing we can control is making sure our pages answer specific questions and support specific goals.

Metrics without keywords

Avinash Kaushik has a great model here, Search: Not Provided: What Remains, Keyword Data Options, the Future, based on his “Acquisition, Behavior, Outcome” metrics. He recommends sorting traffic sources by channel so you can tell which methods are working to drive traffic. Avinash, as always, does a better job than I can of providing examples, explanations, and even templates for Google Analytics, so I recommend taking a look at his post to get some ideas.

My simplified view of a keyword-free SEO strategy:

  • Figure out what your page is for. Think in terms of the goal of the user. Build (or rebuild) pages around goal-focused content to meet the user needs. If people come to the page and take an action, then you are creating content valuable to you and your audience.
  • What metrics should you look at now? The same ones you should already have available. Visits/visitors/etc. from:
    • Organic Search
    • Paid search
    • Direct visits
    • Social media
    • Other site referrals and whatever other channels you may be using to promote your content
  • Then look at what people did before (if possible) and what they did next. Did they take an action or did they bounce? What action did they take? Does it make sense based on the goals of the page?

Content should be built around intent and action
Let’s say you have a book about crafting with cat hair and you want to sell this book. Yes, this is really a thing. What you need is a page that meets the user intent. The data you want to look at is the data that tells you people are coming to this narrowly focused page and converting. Specifically, you need to answer whether people accessing the page built around ordering a book on making cat toys from cat hair are taking an action (ordering the book). To help with this, the page should work much like the order page for the item, with very few options to do anything not directly related to learning more about and buying the book. Narrowly focusing the page meets the needs of your user, fulfills the requirements for optimizing for Hummingbird, and will make your metrics much easier now that keywords are no longer available.

Amazon screenshot cat crafts

Amazon screenshot focused cat crafting book sale page

How NOT to build a web page to sell a book about cat hair crafts

Because we could always check the keywords after the fact to see what was working, many digital marketers created pages that tried to do everything for everyone. Those days are over. You should not have a page that describes why you like cats, how to sew, how to teach your cat to sew, etc. A page like this is not going to work in Google because there is no specific user value, much less a call to action. This method of digital marketing is OVER:

Unfocused cat hair crafting book sale page

Unfocused cat hair crafting book sale page

Useful data for focused pages

Here is what your data on cat hair crafting book pages (or any goal-focused page) should look like:

  1. Visits/visitors/etc. segmented by channel. This tells you what is generally working.
  2. Top referrers tied to action taken. The closest you can get to seeing user intent.
  3. Bounce rate, time spent on page, and other related metrics. Do people bounce every time they come from organic search? If so, your page is not meeting the user intent.
  4. Conversion and other action data. Do people take an action when they come to the page? What action? If this is the page that should compel the user to buy the book, then your key action is the click-through to the page to buy the book. If people are clicking any other link, this page is not working.
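As a sketch of this kind of page-level reporting (the data shape and channel names are invented for illustration; real rows would come from your analytics export), segmenting hits by channel and computing bounce and conversion rates per channel might look like:

```javascript
// Hypothetical hit log for one goal-focused page.
// "converted" here means the visitor clicked the buy-the-book link.
const hits = [
  { channel: "organic", bounced: true,  converted: false },
  { channel: "organic", bounced: false, converted: true  },
  { channel: "social",  bounced: false, converted: false },
  { channel: "direct",  bounced: false, converted: true  },
];

// Roll up visits, bounce rate, and conversion rate per channel.
function summarize(hits) {
  const byChannel = {};
  for (const hit of hits) {
    let row = byChannel[hit.channel];
    if (!row) row = byChannel[hit.channel] = { visits: 0, bounces: 0, conversions: 0 };
    row.visits += 1;
    if (hit.bounced) row.bounces += 1;
    if (hit.converted) row.conversions += 1;
  }
  for (const row of Object.values(byChannel)) {
    row.bounceRate = row.bounces / row.visits;
    row.conversionRate = row.conversions / row.visits;
  }
  return byChannel;
}
```

With the sample rows above, organic shows a 50% bounce rate and a 50% conversion rate — exactly the per-channel signal items 1, 3, and 4 in the list ask for.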

Still addicted to keywords?
If you cannot yet see giving up keywords and need to slowly wean yourself (or your colleagues/execs) off them, here are some suggestions:

  • Paid data – You can use paid data to help inform your organic keyword ideas by using the Keyword Planner tool or, if you are running AdWords, your matched search query data.
  • Google Webmaster Tools (GWT) data – You can also use the GWT Search Queries > Top queries report to see an aggregated view of the top keywords for CTR and visibility, along with an aggregated view of the top pages for those keywords. The key word (ha!) here is aggregated. This is not your old keyword data back from the dead; this is just a little extra information about Google results related to your site.
  • Legacy data – If your site has been around for a while and your offerings/products/topics have not changed much in the last 4 years, you can use your keywords from 2010 as a base for your work on user intent. But any keyword data from October 2011 or later will not be usable.
  • Google Trends – This may be the most useful tool for keywords, because it requires you to already know what your content is about but gives you ideas of how interest in a keyword is changing (or not) over time.

But really, keywords are dead. It is a big shift from a keyword-focused strategy to a content-focused strategy, but Google has changed the rules of the game and we don’t have much of a choice. It’s time to embrace that change and refocus our efforts on refocusing our pages. It will pay off in the long term, for our marketing goals and the goals of our visitors.

Google rank checking is a complete waste of time

Rank checking makes kittens angry

Angry kitten used with Creative Commons license from flickr

The goal of a website is to help humans take an action. The website may inform people where your business is and how to contact you, it may aim to create leads via registration, it may actually sell a product on the internet, or it may have another goal completely. The website should be built around creating the desired result, whatever the goal is. Rankings are not a result and should never be your goal. Because rankings are personalized, it is impossible to know what your baseline is, and even if you could know whether you had #1 rankings for all your favorite keywords, high rankings don’t translate into visits. Rankings are meaningless in 2013 and should not be considered a metric, much less a KPI.

Ranking is not a KPI
If you are working on marketing or optimizing a website and you become obsessed with the results of your tactics, to the point of believing your tactics are your KPIs, you are doing it wrong. Consider this a gentle intervention for rank checkers (I know how it is; I used to be a rank checker myself). SEO/SEM applies digital tactics to the website; that website should be part of a much larger content marketing strategy, which is in itself a smaller part of the overall marketing effort of a company. The goal of marketing yourself or your business with a website is not #1 rankings, so the goal of SEO should not be to get #1 rankings, or even to get massive traffic from Google — the goal of SEO is to support the wider marketing goals of your personal brand or business.

Google gives you the #1 results you asked for
Keep in mind, when you are tempted to report #1 rankings as a success metric (and it is so tempting because it looks so good in a PowerPoint), that Google is probably serving you your favorite web page as #1 because you keep searching for it and clicking on it. If you present this to your exec, and he or she has not been obsessively searching and clicking on that same keyword and URL, you may have some explaining to do when the executive does not get that great-looking #1 result.

Ranking does not exist
Manual rank checking no longer exists as anything more than a way to spot-check what Google is serving you in your personalized results. Your results have been personalized for years, as I noted in this 2011 post: Manual Rank Checking in the era of Google Personalization. This has not changed.

Jill Whalen explains very clearly and much more in depth Why Running Ranking Reports Is a Fool’s Errand. Even if you don’t have time to read the whole thing, scan through the bolded text. All this is true.

Auto rank checking violates Google’s TOS
Automated rank checking is also useless, and it violates Google’s TOS and can get your IP blocked. This has been the case for many years — I know because I dabbled in automated rank checking tools in my early wild years of SEO, and Google blocked my IP. But I made a recovery and vowed never to rank-check again. And as Jill Whalen notes in her post, Google may even be looking into penalizing the web sites of companies who do automated rank checking. So not just your IP, but your actual web site could get dinged if you are auto rank checking.

But even if you could get a true ranking report for your site and know to the minute what your rank in the SERPs is for every keyword and URL, what does it get you? Nothing. Rankings do not equal visits, and even visits do not always equal success. Before you rank-check again, stop and think about the goals of the business, and then step away from the Google.

Keyword research using the Google Adwords Keyword Planner

Forced Adwords sign-up makes kittens angry

Angry kitten used with Creative Commons license from flickr.

Updated October 17, 2013 to reflect yet more changes to this tool

As you may have noticed already, Google has replaced the old Keywords tool with the Google AdWords Keyword Planner. More from Google on this.

The biggest change for daily use is that you are now required to sign in to Google and associate an AdWords account with your Gmail ID in order to use the tool. You also must go through a couple of extra steps to get to the tool, but generally it still works the same — although you can no longer get mobile keyword totals separated out (and there are initial reports of unexplained changes in keyword totals).

I’ve been using this new version for over a month because I have an AdWords account and I’m always signed in. I’ve written up a quick overview to help people get back to keyword researching as quickly as possible.

Here is the new process:

1. You must be signed in to a Google AdWords account to use the tool. You do not need to be running Google AdWords campaigns (but Google would certainly appreciate it if you would); you just need to enable a Gmail account to run AdWords. It’s free. But this change does mean you will not be able to look up terms without signing in.

2. You will be redirected to something called the “Keyword Planner” if you try to go to the Keywords tool URL — or you can just access it directly.

3. You will have to choose from 3 options. Choose the twistie to Search for new keyword and ad group ideas:

Use this option to get keyword suggestions in the Google Keywords planner

Use this option to get keyword suggestions in the Google Keywords planner



4. This should give you options that are familiar if you used the old Google Keywords tool. You now have the option to exact-match your keywords in this view, to ensure you are shown keywords that are truly associated with your initial keyword choice. Enable the sliding button Only show ideas closely related to my search term. Input your terms or analyze a web page, and make other changes to further target your results if needed:

Enter keywords and choose exact match

Enter keywords and choose exact match in the Google AdWords Keyword Planner

5. When you get to the results, you can view all the other possible recommended terms and their estimated search volume by choosing the Keyword ideas tab. And if you forgot to choose exact match in the previous views, you can choose it in the left nav under Keyword options.

Keywords idea results from Google AdWords Keyword Planner

Keywords idea results from Google AdWords Keyword Planner (click to embiggen)

This should be enough to get started using the tool.

Here is a much more complete Step-by-step for using the Google AdWords Keyword Planner How to Use The Keyword Planner — The New Keyword Tool From Google AdWords.

Is encryption stripping out referrer data?

Evil 1990s kitten says check the Mac Plus for your lost data

Evil 1990s kitten says check the Mac Plus for your lost data

Is encryption causing you to lose search referrer info in your Web metrics?

The short answer is yes, but it has always been hard to match all referrals back to specific URLs. And for once, the problem is not just Google.

Encryption is already a problem

As most everyone who looks at data knows, analytics programs are definitely not capturing all keyword data, since Google began encrypting keywords in 2011. That encrypted keyword data is what is reported as “(not provided)” in Google Analytics. But Google has not encrypted data to the point where you cannot see the referring URL; only the keyword is stripped out.

You may be familiar with a referrer called “None,” “Direct,” or “No referrer” in your data. This referrer can come from apps, offline sources, and other non-browser referrals to your site (including, but not limited to, Google properties). It is not Google’s keyword encryption, and it is not a flaw in your metrics software; the “None” referrer just isn’t recordable as a URL, because most web metrics tools only track browser referrals.

However, there is another issue with encryption that may cause you to lose URL referral data, and as encryption becomes more broadly used you may start seeing more “None” referrals in your metrics. It works like this: when you are on a secure site (HTTPS) and click a link to a non-secure site (HTTP), the referrer URL is dropped in the transfer from HTTPS to HTTP.
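Browsers have since added a referrer policy mechanism that a linking site can use to opt back in to sending at least its origin across that HTTPS-to-HTTP boundary; a sketch of the markup (support varies by browser, so treat this as illustrative):

```html
<!-- On the linking (HTTPS) site: ask the browser to still send the
     origin (scheme + host) as the referrer, even on cross-origin links
     that downgrade to HTTP. Browser support for this varies. -->
<meta name="referrer" content="origin-when-cross-origin">
```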

Google has a solution for SSL search, but mobile and Safari searches may still be stripping out URL info
When Google decided to encrypt all signed-in searches, they fixed that particular HTTPS-to-HTTP problem with Google SSL search, because they really didn’t want us thinking Google had stopped sending any traffic to us — they just didn’t want to share the keywords with us.

But the problem still exists and complete loss of referrer info is a problem in some mobile searches using Google and Safari browsers. More info on this can be found at SEL: Safari Shifts To Google Secure Search in iOS 6, Causing Search Referrer Data To Disappear

So that is a real problem — a loss of referrer URL data because of encryption — though it is not a Google search problem, except possibly for mobile and Safari. The unfortunate thing is that I have always seen “None” referrer data in all the metrics I have monitored over the years, and it is impossible to know how big the problem could be when the source of “None” referrers is not trackable. But if you start seeing an unexplainable increase in “None” referrers, you may want to investigate whether your site or your audience behavior is changing in such a way that encryption could be a factor.

Getting started with structured data, rich snippets, semantic markup

4 out of 5 evil kittens recommend semantic markup

4 out of 5 evil kittens recommend semantic markup

Whether it’s called structured data, rich snippets, microdata, semantic markup, using the rel= tag, or schema.org hacking, structured data has been a hot topic this week in my SEO feeds and daily search discussions. And adoption of this markup is good news, because rich snippets have a lot of potential for high ROI in organic search marketing.

Structured data is not a new thing. It was not new when I wrote this post last March: using rel=author tags on blogs to optimize content ownership. But as was so succinctly stated in a post just last week, If You Don’t Care About Structured Data, You Suck at SEO. That may be a little harsh, but ignoring rich snippets means leaving money on the table, because you can manipulate your search listings right now with the power of schema.org without paying Google another cent — at least for now…

So why all this structured data buzz now? My guess is that it’s something SEO pros can actually control that Google supports. The past few years have been rough for the search world. We’re adjusting to over 50% [not provided] keyword data, new linking penalties, and constant Penguin and Panda updates rolling out, it seems, every day (why yes, there is one in progress right now; see the WebmasterWorld Google Update thread for more info). So a way to have control over your SERPs and help Google understand what your page is about is welcome.

Getting started with structured data

Another nice thing about rich snippets is that, unlike pretty much everything else related to Google, Google is giving out very good instructions for implementing structured data on your page. Google prefers you use schema.org and has conveniently created a FAQ and instructions.

And you can test your snippets with Google’s Structured Data testing tool.

Where should you use structured data?

Everywhere! Well, maybe not everywhere, but anywhere it might be helpful for a visitor to know more about your page and encourage him or her to click on that link in the SERP.

If you host or market a blog, you absolutely should use author markup. It’s a little more complicated to claim your blog in Google than it is to just toss some code on your page, but the payoff is that a searcher who visits your blog from a Google referral gets extra links to your content when they return to Google. More details are in the hidden benefit of authorship on Search Engine Land, and my original post on rich snippets for authors, using rel=author tags on blogs to optimize content ownership.

Think about your own user experience: wouldn’t you rather click on a blog post with a picture of a real person?

ljbanks rich snippet

ljbanks rich snippet

And I know when I’m looking for a product or restaurant, a review with stars in the results like Yelp provides is very enticing:

Yelp, schema.org, and my favorite bar

Yelp, schema.org, and my favorite bar — schema.org makes all of this possible!

Other places you might want to enable rich snippets are events, videos, and anything with ratings.
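As a sketch of the ratings case, schema.org review markup in microdata looks roughly like this (the business name and numbers are placeholders; check Google’s rich snippets documentation for the fields it currently supports):

```html
<!-- Hypothetical review markup; names and values are placeholders. -->
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="itemReviewed" itemscope itemtype="http://schema.org/Restaurant">
    <span itemprop="name">My Favorite Bar</span>
  </span>
  <span itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span> stars
  </span>
</div>
```

You can paste markup like this into Google’s Structured Data testing tool to confirm it parses the way you intend.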

And Google has even created a shortcut for site owners who have an event site in English: a WYSIWYG editor called the Data Highlighter that will do most of the work for you.

If you’re serious about your Google referrals, want to take positive action in a time of much sadness for SEO professionals, and have a minimum of technical skills, structured data is the way to go. A small investment could create a long-term payoff.

Google cache hack: See what Google sees

I’m pretty sure I stole this from another SEO, but I’ve been using it as the first step of basic SEO testing for so long that I’ve completely forgotten where I got it. Today I was reminded of what a great little hack it is when a colleague asked me how I always know what Google sees when it indexes a page. Use this hack when people ask “Is Google indexing this page?” or “Why does Google not index my content?” and they will think you have Google x-ray vision.

(Pro tip: usually, when a page on a site with no other major indexing problems is not surfacing in Google, it is a JS or AJAX issue, and Google cache is the first step in diagnosing it.)

Google cache is nothing new, but if you add it to your browser bookmarks you can immediately view what is happening in Google the moment someone IMs you a URL and asks “Why does Google hate this URL?”

How to Add Google cache to your browser bookmarks to instantly view Google’s view of your page

  • Right-click in your browser shell (under your URL field, usually) and, depending on your browser, you will have the option to “Add a bookmark,” “Add a page,” or something similar. I’m using Firefox and Chrome on a Windows machine in the examples below:

Chrome bookmark view

Chrome bookmark view



ffbookmark view

Firefox bookmark view



  • In the location field (Firefox) or URL field  (Chrome) enter:
  • Add the bookmark to whatever browser you do most of your SEOing in. Anytime you wonder what Google is seeing on a page, load the page in your browser and then click your “Google Cache” bookmark; you will see when Google last indexed the page, what URL Google indexed, and what that index looked like.
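For the bookmark location itself, a common version of this hack builds a Google cache URL from whatever page you are on (the webcache endpoint pattern here is an assumption based on how Google’s cache links have historically worked):

```javascript
// Sketch of the cache-lookup logic behind the bookmarklet.
// The webcache.googleusercontent.com endpoint is an assumption.
function cacheUrl(pageUrl) {
  return "http://webcache.googleusercontent.com/search?q=cache:" +
         encodeURIComponent(pageUrl);
}

// As a one-line bookmarklet, the same idea reads:
// javascript:location.href='http://webcache.googleusercontent.com/search?q=cache:'+encodeURIComponent(location.href)
```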

Google cache FF browser
Viewing the URL in Google cache is a great way to determine whether you have any weird redirect problems (self-inflicted or otherwise). Also, I usually click on the “Text-only version” of the page so that I can see whether Google can see the words I think the page should be indexed under. This is another great way to diagnose rendering issues and/or text that looks like text but is actually an image (basic SEO).

Text-only version Google cache

Text-only version Google cache


I’ve used a lot of SEO tools over the years, but this Google cache hack remains my standard first step in diagnosing Google indexing problems. And remember: spidering, indexing, and ranking are different things. If you do not see a cache for a page, it does not mean Google has not spidered it; and if you see a cache for a page and you are still not ranking for what you think you should, remember that all this proves is that Google has a cache for your page — no promises were made. Also, the ranking you see is not necessarily the ranking everyone else sees for your page.