5th Mar 2014

Test everything: the business case for A/B testing online

Khan Academy had a question. The non-profit had recently introduced a “sneak peek” feature into some of their online lessons. The feature was meant to encourage students to continue the lesson by giving them a preview of what they would soon learn. However, the user response to the new feature was mixed: some liked it, others seemed confused. Khan Academy was unsure how to proceed, and so they decided to measure more carefully how the “sneak peek” was working. For one week, they ran a 50/50 test on their website using two different versions of the same web page: half of the visitors saw the “sneak peek” feature and the other half didn’t. After the week, they had a clear picture of how the feature was performing, based on real user data: as shown in the graphic below, the “no sneak peek” version of the page was converting at a rate nearly 30% higher than the version including the “sneak peek”.

The Khan Academy’s 50/50 test for their “sneak peek” feature. Source: Khan Academy

“In every measure, hiding the sneak peek resulted in greater learning outcomes,” notes a Khan Academy blog post. “Now we can remove the sneak peek talk-through with a degree of confidence, because we’ve seen that students are more likely to keep going if they don’t see it.”

The sneak peek case is an example of A/B testing, which the testing service Optimizely defines as follows:

“An A/B test involves testing two versions of a web page - an A version (the control) and a B version (the variation) - with live traffic and measuring the effect each version has on your conversion rate. Start an A/B test by identifying a goal for your company then determine which pages on your site contribute to the successful completion of that goal.”
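
To make the mechanics concrete, below is a minimal sketch in Python of the kind of 50/50 test described above: visitors are split deterministically into the two buckets, and the resulting conversion rates are compared with a simple significance check. All names and numbers here are illustrative assumptions - this is not the implementation used by Khan Academy, Optimizely, or any other service.

```python
import hashlib
from math import erf, sqrt

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into A (control) or B (variation).

    Hashing the visitor id (an illustrative choice) keeps each visitor
    in the same bucket on every visit, unlike a fresh coin flip.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def two_sided_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Crude two-proportion z-test: how surprising is the observed gap?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via erf, doubled for a two-sided tail probability.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical tallies after a one-week 50/50 test.
n_a, conv_a = 10_000, 520   # A: control
n_b, conv_b = 10_000, 660   # B: variation

rate_a, rate_b = conv_a / n_a, conv_b / n_b
lift = (rate_b - rate_a) / rate_a * 100
p = two_sided_p_value(conv_a, n_a, conv_b, n_b)

print(f"A: {rate_a:.2%}  B: {rate_b:.2%}  lift: {lift:+.0f}%  p-value: {p:.4f}")
```

The significance check matters because a small gap between the two rates can easily be noise; commercial testing services perform a calculation along these lines before declaring a winner.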

A/B testing is a simple but powerful idea. It relies on good data and good processes throughout and, with those key building blocks in place, can be applied to just about anything you do online. You can test different copy or colours on your homepage, the placement of a key conversion button, or the subject line in your email campaign - or, for that matter, pretty much any aspect of your online presence.

Wired magazine describes how A/B testing has become standard practice for some of the world’s leading online brands:

“Over the past decade, the power of A/B testing has become an open secret of high-stakes web development. It’s now the standard (but seldom advertised) means through which Silicon Valley improves its online products. Using A/B, new ideas can be essentially focus-group tested in real time. Without being told, a fraction of users are diverted to a slightly different version of a given web page and their behaviour compared against the mass of users on the standard site. If the new version proves superior - gaining more clicks, longer visits, more purchases - it will displace the original; if the new version is inferior, it’s quietly phased out without most users ever seeing it. A/B allows seemingly subjective questions of design - colour, layout, image selection, text - to become incontrovertible matters of data-driven social science.”

At the recent Travolution Summit in London, Graham Cooke, CEO of the conversion optimisation firm Qubit, pointed out that marketing and IT teams are becoming increasingly integrated, especially in the online space. Also at Travolution, Mr Cooke’s co-presenter, Chris Bradshaw of ATD Travel Service, demonstrated another compelling A/B case that compared two versions of an online ad for ATD.

Two versions of an online banner for Attraction Tickets. Source: ATD Travel Service

In the ATD case, the “B” advertisement converted at a rate 52% higher than the “A” control version, giving the company a clear, data-based indication of which ad would drive better results.

Marketers today have access to better and more efficient tools that support a greater range and number of A/B tests than would have been possible even a few years ago. These tools are helping to move routine, ongoing A/B testing out of the realm of the big tech brands and into the reach of virtually all online marketers. Google is an example of a tech leader with extensive experience in A/B testing. The search engine giant has also released a freely available tool - Google Content Experiments - that allows anyone to run similar tests. Content Experiments ties in to Google Analytics and allows marketers to test up to five variations of a page at any one time. In its introduction to Content Experiments, Google makes the case for A/B testing as a basis for optimising websites and as a means of driving the conversions that are most important to your business goals:

“Let’s say you have a website where you sell house-cleaning services. You offer basic cleaning, deep cleaning, and detailed cleaning. Detailed cleaning is most profitable of the three, so you’re interested in getting more people to purchase this option. Most visitors land on your homepage, so this is the first page that you want to use for testing. For your experiment, you create several new versions of this web page: one with a big red headline for detailed cleaning, one in which you expand on the benefits of detailed cleaning, and one where you put an icon next to the link to purchase detailed cleaning. Once you’ve set up and launched your experiment, a random sample of your visitors see the different pages, including your original home page, and you simply wait to see which page gets the highest percentage of visitors to purchase the detailed cleaning. When you see which page drives the most conversions, you can make that one the live page for all visitors.”
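
The same comparison generalises to several variations at once (Content Experiments supports up to five). The sketch below simply ranks the variations by conversion rate; the variation names and figures are invented to echo Google’s house-cleaning example, and in practice you would also apply a significance check like the one above before promoting the winner.

```python
# Hypothetical outcome of the house-cleaning experiment described above:
# visitors shown each variation, and how many purchased detailed cleaning.
experiment = {
    "original":          {"visitors": 5_000, "purchases": 150},
    "big-red-headline":  {"visitors": 5_000, "purchases": 186},
    "expanded-benefits": {"visitors": 5_000, "purchases": 171},
    "purchase-icon":     {"visitors": 5_000, "purchases": 158},
}

# Conversion rate for each variation.
rates = {name: d["purchases"] / d["visitors"] for name, d in experiment.items()}

for name, rate in sorted(rates.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:>18}: {rate:.2%}")
print("Winner:", max(rates, key=rates.get))
```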

Beyond a free service like Content Experiments, there is also a growing number of service providers specialising in A/B testing and other aspects of website optimisation, including Optimizely, Qubit, Unbounce, and many others. The interesting thing about such services is that they often allow marketers to set up and run multiple tests without additional development or IT intervention on the websites being tested, which opens the door to more frequent and more extensive optimisation efforts. However, putting effective processes in place to manage any such testing is a key step. A recent MarketingSherpa survey of marketing executives found that A/B testing becomes a powerful window into customer behaviour once solid management processes are in place:

“Overall, 47% of marketers said they use website optimisation and testing to draw conclusions about customers. But when we segment the data, it gets interesting. The more mature the marketing organisation, the more it is able to use A/B testing to learn about customer behaviour… 76% of trial phase marketers - those who do not have a process or guidelines for optimisation or testing - do not use testing to learn about customers and build a customer theory. However, 75% of strategic phase marketers - those who do have a formal process with thorough guidelines routinely performed - use testing to learn about customer behaviour.”

In a recent post on the Harvard Business Review blog, Wyatt Jenkins, a vice president at Shutterstock, sets out some important tips for drawing up your testing processes:

  • Build a small testing team. “All you need to perform tests is a 3-4 person team made up of an engineer, a designer/front-end developer, and a business analyst (or product owner).”
  • Look at all the available metrics. “The metric you tried to move probably won’t tell the whole story. Plenty of very smart people have been puzzled by A/B test results. You’ll need to look at lots of different metrics to figure out what change really happened.”
  • Look at results by customer segment. “Often a test doesn’t perform better on average, but does for particular customer segments, such as new vs. existing customers. The test may also be performing better for a particular geo, language, or even user persona. You won’t find these insights without looking beyond averages by digging into different segments.” (A simple sketch of this kind of segment breakdown follows this list.)
  • Think small and fast. “Don’t spend months building a test just to throw it away when it doesn’t work. If you have to spend a long time creating it, then you’re doing it wrong. Find the smallest amount of development you can do to create a test based on your hypothesis; one variable at a time is best.”
  • Test again. “Keep a backlog of previously run tests, and try re-running a few later. You might be surprised what you find.”
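
As a complement to the segmentation tip above, here is a minimal sketch of how test results might be broken down by customer segment rather than averaged overall. The records, segment names, and field names are illustrative assumptions; a real test would pull this data from your analytics platform. A variation that looks flat on average can still win decisively within one segment, which is exactly the effect Mr Jenkins describes.

```python
from collections import defaultdict

# Hypothetical per-visitor records: variant seen, conversion outcome, and a
# customer segment (all field names and values invented for illustration).
records = [
    {"variant": "A", "converted": False, "segment": "new"},
    {"variant": "B", "converted": True,  "segment": "new"},
    {"variant": "A", "converted": True,  "segment": "existing"},
    {"variant": "B", "converted": False, "segment": "existing"},
    # ...thousands more rows in a real test
]

def rates_by_segment(records):
    """Conversion rate for each (segment, variant) pair."""
    visitors = defaultdict(int)
    conversions = defaultdict(int)
    for r in records:
        key = (r["segment"], r["variant"])
        visitors[key] += 1
        conversions[key] += r["converted"]  # True counts as 1
    return {key: conversions[key] / visitors[key] for key in visitors}

for (segment, variant), rate in sorted(rates_by_segment(records).items()):
    print(f"{segment:>8} / {variant}: {rate:.0%}")
```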

Mr Jenkins also offers an extended set of tips on his personal blog that builds on the best practices in his Harvard Business Review post, and we highly recommend it for further reading on the subject.

When we talk about conversion optimisation and A/B testing, what we are really talking about is creating a culture of experimentation in the marketing effort. A/B testing is not a fix for every business problem, and it is certainly not a basis for determining major innovations or shifts in strategy. However, conversion optimisation is an important factor in driving results for many marketers - one that is best understood as a process of incremental change and continuous improvement.
