Analytics is about knowing how your videos perform and then putting that knowledge to work. In August, we spoke to the brilliant Corey Halverson, Akamai’s product line director for media analytics, and taught you how to make sense of confusing analytics results. This time, we spoke to Halverson and his colleague Girish Bettadpur, a senior product manager at Akamai, about how to put that knowledge to good use.
The process is designed to show which elements on your site are affecting your viewers’ behavior. By altering key elements in a controlled way and studying the metrics, such as how long a video was watched or how many people completed a video, you can learn how to make your site more successful.
You might use A/B testing to study your video player’s design, or the effect of your site’s overall design. What you test is up to you. As the A/B name suggests, you’ll have two elements that you’re testing against each other: the site as it is and a page, player, or content variation that you create.
As an example, says Bettadpur, you could study how many videos viewers watch per visit, with the idea that adding ratings on videos alters that number. The test group will see ratings, while the control group won’t. At the end of the test, you’ll discover which version led to more videos being watched per viewer. If having ratings leads to more views, you’ll want to put that in place for all your videos going forward.
While the name might suggest a 50/50 split of test subjects, you don’t have to assign them that way, says Halverson. You can randomly assign every third or fourth visitor to the test group if you want it smaller. The important thing is to get a large enough sample by the end of the test period.
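The assignment approach Halverson describes can be sketched in a few lines. This is a minimal illustration, not Akamai’s implementation: it hashes a hypothetical visitor ID so each visitor lands in the same group on every visit, with the test fraction set to one in four to mirror the “every fourth visitor” idea.

```python
import hashlib

def assign_group(visitor_id: str, test_fraction: float = 0.25) -> str:
    """Deterministically place a visitor in the test or control group.

    Hashing the visitor ID keeps the assignment stable across visits,
    so the same person always sees the same variation.
    """
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100  # spread visitors across 100 buckets
    return "test" if bucket < test_fraction * 100 else "control"
```

Because the hash is deterministic, a returning visitor won’t flip between seeing ratings and not seeing them mid-test, which would contaminate the results.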
The results of your test might surprise you. At Akamai, Halverson studied whether or not sports fans would watch online sports videos longer if the videos were of higher quality. He found that higher bitrates didn’t automatically lead to longer viewing times, because they required more buffering.
Ad policies are often tested on video sites, to see if viewers will watch for shorter amounts of time when they get more ads. To test the impact of ads, sites add more or space them out differently. The control group sees business as usual.
“It takes the question of ‘What if I show more ads on my site?’ and makes it something measurable and actionable,” says Halverson.
Putting it into Practice
To start running your own A/B tests, you’ll need to work with an analytics system that supports A/B testing. Naturally, Akamai’s media analytics tool is one option. It collects visitor metrics and creates ratios from those metrics, such as the number of views per visitor.
Once you’ve got an analytics system, you need to decide what you want to test against. Pick a variable that you’ll change, such as showing ratings, and decide how many visitors will go into your test group. Halverson recommends letting tests run for two weeks, since you need a large enough sample to get a representative group. When the test is over, go into your analytics tool and see which group performed better.
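The comparison at the end of the test boils down to computing the same ratio for both groups. Here’s a minimal sketch using the views-per-visitor metric from Bettadpur’s ratings example; the session data is entirely hypothetical.

```python
def views_per_visitor(sessions: dict[str, int]) -> float:
    """Average number of videos watched per visitor.

    `sessions` maps a visitor ID to the number of videos that
    visitor watched during the test period.
    """
    return sum(sessions.values()) / len(sessions)

# Hypothetical results after a two-week test:
control = {"u1": 2, "u2": 1, "u3": 3}  # no ratings shown
test = {"u4": 4, "u5": 2, "u6": 3}     # ratings shown

if views_per_visitor(test) > views_per_visitor(control):
    print("Ratings variation wins: roll it out to all videos.")
```

In practice the analytics tool computes these ratios for you; the point is simply that the decision rule is a straight comparison of one metric across the two groups.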
While most analytics packages say they offer A/B testing, less than half of them really support it and make it easy, says Halverson, so shop around.
To begin a test, you’ll need to input the alternate page design or content matter you’re testing against. The analytics system itself doesn’t create the two designs.
“A/B testing is a way of operating, it’s how you make decisions. It’s not necessarily that you do one A/B test and you’re done,” says Halverson. “Once a customer begins A/B testing, they don’t have to make guesses anymore.”
Got it? Post any questions down in the comments area. Come back next month when we’ll talk to Halverson about using real-time data to make snap decisions.