May marketing madness usability week, post #28
By Justin Cutroni
Director of Digital Intelligence, Cardinal Path
We all know that web analytics is a process. And the most critical part of that process is making changes. But deciding what changes to make can often be very difficult. That's where website testing comes into play!
Rather than blindly implementing what you "think" is the best solution, why not try a few different solutions and measure which one has the best impact on your website? Sounds logical, doesn't it? This is the basic premise behind A/B testing.
We all make assumptions about visitor behavior that are clouded by our familiarity with our own site. Testing your assumptions is the safest route and can also lead to great insights about your website's visitors.
In an A/B (or A/B/N) test, you compare an original version of something, like a landing page, against one or more new versions.
A multivariate test is like an A/B test, but you're usually testing multiple parts of a page, like the image, a button and maybe a headline.
A third type of test is a split test, where you test entire alternative processes, like two versions of a checkout flow, to see which one generates a better result.
A/B testing can be fairly easy to do, even for small companies with limited technical resources. Let's walk through the process.
In this post, you'll learn how to prioritize your A/B testing for maximum return.
What to Test
Finding out what to test is why we do analysis. Two of my favorite things to test are a landing page with a high bounce rate and a funnel step that leaks a lot of people. Both are fairly easy to identify in any analytics tool, and both are prime candidates for a good A/B test.
Estimating Test Length
Once you know what you want to test, it's time to figure out how much data you need. Honestly, there is a lot of math that goes into this. The easiest way to get an idea of the sample size is to use an estimation tool, like the one put out by the folks at Cardinal Path (full disclosure: that's where I work).
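If you're curious what those estimation tools are doing under the hood, here's a minimal sketch of the standard two-proportion sample-size formula. The function name and the example numbers (a 3% baseline conversion rate, a 20% relative lift worth detecting) are illustrative assumptions, not figures from any specific calculator:

```python
from statistics import NormalDist
import math

def sample_size_per_variation(baseline_rate, min_lift, alpha=0.05, power=0.8):
    """Rough estimate of visitors needed per variation to detect a given
    relative lift in conversion rate (standard two-proportion formula)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_lift)            # rate we hope the variation hits
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance level
    z_beta = NormalDist().inv_cdf(power)           # desired statistical power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2 * variance) / (p2 - p1) ** 2
    return math.ceil(n)

# 3% baseline, detecting a 20% relative lift: roughly 14,000 visitors per arm
print(sample_size_per_variation(0.03, 0.20))
```

The takeaway is how sensitive the numbers are: small baseline rates and small lifts drive the required sample size up fast, which is why low-traffic pages make poor test candidates.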
Creating your Test Variations
The great thing about A/B testing is that you only need to create one new variation, so you don't have to spend a lot of time designing. With a landing page, you might try something very simple: small changes to reduce clutter, a bigger call to action, or a different background color. The possibilities are really endless.
Implementing the Test
Finally, once you have your test variations, it's time to implement your test.
In the old days, when you wanted to run a test, you had to buy your IT department lots of Mt. Dew & Red Bull so they could work all weekend and custom code a testing system. Thankfully those days are over!
Don't get me wrong; running a test still requires some technical skills. But now it's more like tagging pages with an analytics tool. There are several good A/B testing tools available:
Tools For Anyone (free or almost free)
1. Google Website Optimizer: free, fairly simple to set up, requires that you build the test pages
2. Visual Website Optimizer: limited free plan, includes a WYSIWYG editor to test out changes
3. Optimizely: includes a WYSIWYG editor to create test versions, no free plan. I LOVE Optimizely. This is probably the easiest tool in the entire world. Seriously.
Advanced tools (i.e., you pay more and get more features)
4. Webtrends Optimize: enterprise-level testing with a lot of advanced features, UI built for business users rather than IT
5. Amadesa*: they claim that no IT help is needed to run tests
6. Test & Target: users can quickly design test versions, real-time results
7. Maxymiser: easy to use and implement
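All of these tools ultimately do the same basic thing when a visitor arrives: assign them to a variation and keep that assignment consistent across visits (usually via a cookie). As an illustration only, not how any particular vendor implements it, here's a sketch of deterministic bucketing by hashing a visitor ID; the variation names are hypothetical:

```python
import hashlib

VARIATIONS = ["original", "variation_b"]  # hypothetical test arms

def assign_variation(visitor_id: str) -> str:
    """Deterministically bucket a visitor so the same person always
    sees the same version of the page."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(VARIATIONS)  # hash maps ID to a stable arm
    return VARIATIONS[bucket]

print(assign_variation("visitor-123"))  # same input, same arm, every time
```

Consistency matters: if a returning visitor flip-flopped between versions, you couldn't attribute their conversion to either one.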
Analyzing the Results
This is where the testing tools really help out. In addition to simplifying implementation, they do a lot of the statistical work, telling you when a test is complete and which variation won. Different tools present this in different ways, but they will all declare a winner. The results from a Google Website Optimizer test are below.
In addition to analyzing the results using your testing tool it's a good idea to look at the bigger picture. How did the test affect other metrics? This is done using your analytics tool. Remember, just because a tool deems a variation the winner doesn't always mean that it's the best thing for the business.
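If you ever want to sanity-check what a tool is reporting, the core calculation is typically something like a two-proportion z-test. This is a generic sketch with made-up numbers (300 conversions from 10,000 visitors for the original vs. 360 from 10,000 for the variation), not the exact method any specific tool uses:

```python
from statistics import NormalDist
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variation B's conversion rate
    significantly different from original A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # pooled rate under "no difference"
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return z, p_value

z, p = ab_significance(conv_a=300, n_a=10000, conv_b=360, n_b=10000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05 here, so B's lift is significant
```

Even when the math says a variation won, the point above stands: check your analytics tool for side effects on other metrics before declaring victory.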
So the next time you need to make a change to fix a problem, don't just choose what you think is the best solution: test it.
This post is one out of Clicktale's month long May Marketing Madness series. Each of our daily posts will highlight and explain today's best practices, useful tips and smart tools to measure and improve your online business performance. This week's theme is usability. Make sure to stay tuned in for more!
Learn more about Clicktale's integration with Optimizely's multivariate testing tool.
About the Author
Justin Cutroni is Cardinal Path's Director of Digital Intelligence. As a respected leader in the web analytics industry, he helps organizations integrate web analytics into their decision making processes. Justin commonly interacts with senior level management to drive the strategic use of web data and collaborates with marketing and IT teams to develop implementation plans and processes needed to generate actionable data and business insights. An active participant in the web analytics community, Justin speaks at various industry events with a strong passion for sharing knowledge and advancing the analytics industry.
Justin is the author of Google Analytics Short Cut (O'Reilly Media, Inc., 2007) and Google Analytics (O'Reilly Media, Inc., 2010) and co-author of Performance Marketing with Google Analytics (Wiley, 2010). Justin holds a degree in Mechanical and Aerospace Engineering from Worcester Polytechnic Institute.
*Website not available anymore