Google Content Experiments: 12 Must-Knows Before Using



Hopefully you’ve considered doing A/B testing on your website to get more sales or leads. You may have heard of Google Content Experiments as a tool to help you do this.

But just because it comes under the Google umbrella, does that mean it’s any good? And how do you know whether it’s worth using? To help you answer these questions, I have put together a list of 12 essential things to know, a short video overview, and my own mid-term report card.

This tool used to be called Google Website Optimizer, which was pretty good considering it was free. Competition then grew, and in June 2012 Google decided to shut it down and offer a cut-down version of it in Google Analytics instead. This is what the tool interface currently looks like:

content experiments screenshot

So let’s get started with the 12 must-knows before you decide to use it to improve your website (or spend much more time with it):

The 12 Must-Knows About Google Content Experiments

  1. GOOD. The tool is very simple and easy to use and setup.
    One of the simplest things first – being part of Google Analytics, it benefits from Google Analytics’ great, easy-to-use interface. It’s also really simple to create a test (almost too simple, as you will see in my video below).
  2. BAD. Google has not really improved it much since launch.
    It was launched to much excitement and anticipation in June 2012, but since then Google hasn’t done much to improve its limited functionality (see below for more details on improvements). Red-headed stepchild springs to mind…
  3. GOOD. You can use Google Analytics goals to measure success.
    This is one of the best parts of the integration into Google Analytics – it makes it easy for you to test improving your website goals that you already use in Google Analytics (whether that be sales or something simple like a newsletter sign-up). See the video below to see this in action.
  4. BAD. There is still no visual editor to help you create tests.
    This is a major downfall of the tool, particularly as its main competitors offer a visual editor to make it much easier to create tests. Therefore you are still reliant on tech know-how to create a test page. Surely it can’t be that hard to add this feature? This is an example of the great visual editor in Optimizely:
    optimizely editor
  5. GOOD. You can use Google Analytics segments to analyze results.
    This is one of the main benefits of having this tool be part of Google Analytics – it means it’s simple to do further detailed analysis on your test results (like how major segments of visitors perform, for example paid search or first-time visitors). This gives you great insights to help you create better follow-up tests.
  6. BAD. It only offers split page testing – much too simple.
    This is a big issue with the tool. Unlike its previous incarnation or rival testing tools, it offers no ability to test specific page elements (like a button, image or form) without having to create a whole new page to test against (unless you are an expert who can use their new API functionality). Very frustrating.
  7. GOOD. Running tests with it doesn’t negatively impact your SEO.
    One question I often hear is whether this testing tool (and others) affects your SEO efforts. The good news is the answer is no – you can add canonical tags to your test variation pages to tell Google they are just alternate versions of the original page, so they won’t be indexed separately or compete with it in search results.
  8. BAD. Hard to implement in a flow of pages or on dynamic pages.
    Because you have to create alternative page versions to test, it’s hard to set up tests on pages that show dynamic content or can’t be redirected easily (like on a shopping cart). This limits what you can test and its potential impact. Their new API makes it slightly easier, but you need tech help to do so.
  9. GOOD. It’s perfect for using on websites built with WordPress.
    It’s great if you want to test improving a blog or simple website built on the WordPress platform – there is even a plug-in to make it really easy to add the necessary tracking code to your pages. But bear in mind that if you have a small website, you need enough traffic to get results – at least 1,000 uniques per week.
  10. BAD. It still uses ‘multi-armed bandit’ testing by default.
    Google’s decision to use this newer testing methodology has been quite controversial – it was introduced to help you get results quicker, but it doesn’t always give you the best result. So much so that they recently added an option to serve your test variations equal amounts of traffic instead – shame they hid this under the advanced options though.
  11. GOOD. Yes, it’s still free to use.
    Sure, you can’t beat free. But that matters less when you consider there are much better testing tools that cost little more than 50 cents per day (Optimizely has plans that start from just $17 per month). Which smart small business owner or online marketer can’t afford that to help boost their sales?
  12. BAD. You can’t do multivariate testing in it.
    Because it’s a split-page-test based tool, you can’t run multivariate tests. This is where you test multiple page elements at the same time to find the best-converting combination – for example, three headlines combined with two button designs gives six versions to test against each other. While beginners won’t really use this, it’s essential for any advanced online marketer.
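To illustrate point 7 above, this is roughly what the canonical-tag setup looks like on a test page – the URLs here are just placeholders for your own original and variation pages:

```html
<!-- In the <head> of the test VARIATION page, e.g. /landing-page-b.html -->
<!-- The canonical tag tells Google this page is an alternate version of the
     original, so it won't be indexed separately or dilute your rankings. -->
<link rel="canonical" href="http://www.example.com/landing-page.html">
```

If you create the variation pages by hand, it’s worth double-checking each one carries this tag pointing back at the original page before you start the test.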

A quick overview of the tool and whether it’s worth using

To help you understand some of these issues (and benefits) of using Google Content Experiments, check out this 8-minute overview video I created:

What’s been changed since it launched in 2012?

After the early buzz and excitement of the Google Content Experiments launch, unfortunately not a lot has changed. There have only been a few announcements – initially they made some early essential ‘fixes’ to satisfy users frustrated with the limited usage options.

Then sadly not much else has been improved – just a few other underwhelming announcements. First, one regarding the Google Analytics Content Experiments API (which is only useful if you have technical help available), and then one in September 2013 about a new testing methodology choice and AdSense revenue as a test objective (which seems a bit self-serving – only good if you are using AdSense on your site).
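For context on that API announcement, here is a rough sketch of what using the client-side part of it looks like. The experiment ID and button element below are made-up placeholders, and the `cxApi` calls are the ones from Google’s documentation rather than anything I have battle-tested myself:

```html
<!-- Load the Content Experiments client API; the experiment ID is a placeholder -->
<script src="//www.google-analytics.com/cx/api.js?experiment=YOUR_EXPERIMENT_ID"></script>
<script>
  // Ask the API which variation this visitor should see
  // (0 = the original, 1 and up = your variations). The choice is
  // remembered, so returning visitors keep seeing the same version.
  var variation = cxApi.chooseVariation();

  // Change a single element in place instead of redirecting visitors
  // to a whole separate test page.
  if (variation === 1) {
    document.addEventListener('DOMContentLoaded', function () {
      document.getElementById('signup-button').textContent = 'Start My Free Trial';
    });
  }
</script>
```

This does give you element-level testing of sorts, but as you can see it still needs a developer rather than a marketer – which is exactly the “technical help” caveat above.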

Other than that and a few case studies, the sound of crickets springs to mind. A real shame indeed, for a tool that shows great potential.

Google Content Experiments: The Mid-Term Report Card

Okay, so it’s been a year and a half since Google announced they were killing off Google Website Optimizer and moving the remains of it into Google Analytics. They have had plenty of time to improve the functionality, which initially seemed to show great early promise. Unfortunately, they haven’t really done much, and as you can see this tool is still only really suited to real beginners of A/B testing.

Not even the free price point is as much of a selling point anymore, because low-cost rivals like Optimizely make it a bit redundant.

Overall, since it launched, I give Google Content Experiments the following grades:

  • A for the initial idea and moving it into Google Analytics
  • C for the effort placed into creating it
  • D for functionality and improvements since launch
  • Overall a great initial idea, but must do much better in the future!

I suggest you also learn about the other better tools available to you:
Learn how it compares to its main rivals, Visual Website Optimizer and Optimizely.

Your thoughts on Google Content Experiments?

Have you considered using it or had any success with it? What are your biggest frustrations with it, or the things you love most about it? Please comment below, and please share this article if you have found it useful. Thanks!

  • Danny Michelson

    Very interesting. Was wondering what was happening with that tool. I tried it a few months ago and wasn’t too impressed. Good food for thought Rich!

  • http://rich-page.com/ Rich Page

    Thanks Danny! Yeah, let’s hope they improve it soon.

  • Alok Raghuwanshi

    This is a very interesting post!

  • http://alexrada.com/ Alexandru Rada

    Hello Rich,

    Nice post. I’ve used Google Experiments with some great results for my clients, and I don’t agree with 6, 8 and 12:
    - using the API is much easier than it appears;
    - you can do multivariate testing, although manually (not like Optimizely, for example) – by using the API;
    - it saves you tons of money when you test very high-traffic websites, and for those you will surely have a dev to help you out (for 1M visitors, Optimizely would probably have been $1,000)

    Thanks!

  • http://rich-page.com/ Rich Page

    Glad you found it interesting Alok. Thanks for the comment.

  • http://rich-page.com/ Rich Page

    Hi Alexandru – thanks for the comments. Do you have any details on the API ease of use? And for high traffic websites, why not just only test a low percentage of visitors? That way you don’t get hit with high costs on other tools, and you still get statistical significance in the results.

  • http://alexrada.com/ Alexandru Rada

    The API has two important calls (setChosenVariation and getChosenVariation). If you play with them, it’s easy to modify the page with jQuery (okay, you need to code a bit, but it’s extremely simple).

    I haven’t tested just a percentage of traffic precisely because of statistical significance – if you are chasing very small improvements (the site is already very well optimised), it’s hard to reach significance.

    At least for me I only increased by 3% in a test (100% population) and I went with my gut finally.

    And the segmentation is much better done in Analytics than recreating the segments from the experiment variables.

    Optimizely and the others are very good, don’t get me wrong. The thing with Analytics is that it makes sense to have the same system doing the tracking and creating the tests – the tracking can’t then be influenced by a separate system.

    Hopefully I explained correctly.

  • Gab Goldenberg

    I strongly disagree with the “won’t hurt SEO” byline – you’re buying the official line. Have you tried this yourself? I have and it screwed up Google rankings.

    Also, regarding the WP integration, it didn’t work well when it first came out, and again a year or so later, so I wonder if this is based on direct experience or just their claim? I don’t mean to be confrontational but there aren’t any details in the post that would suggest it’s your own experience …

  • http://rich-page.com/ Rich Page

    Hey Gab! I don’t actually use this tool because of its many shortfalls, so haven’t personally experienced whether split page testing actually impacts SEO, or whether its WP integration is any good. Just going on what I have read/heard for those two. Sorry you had such a bad time with the tool! What do you use instead?

  • Mark Hall

    Very helpful article, Rich, and a fast read. For me, BAD issues #8 and #12 are a showstopper for many of the tests I want to run. Thanks for also including the link to the VWO/Optimizely comparison table.

  • Shiv Ettes

    Interesting — thanks for putting the info in an easy to read layout. Seems like a decent program for the price;) Do you have any alternatives that you recommend that have made regular updates?