Video split-testing (or, How to Always Be Right)

Business, in our experience, is chock-full of battles of opinions.  I’m sure we’re not the only ones who have had to deal with questions like this…

  • “Is it worth spending a lot of time on this if we’re just putting it on the web?”
  • “Everyone else keeps it simple.  Can’t we just do the same?”
  • “Next time, can we use two lights instead of three?  Do we have to use lights at all?”
  • “I don’t like the background music.  Can we change it?”
  • “Can you add a shot of X?”
  • “Can we just cut this part of the video?”
  • “Could this be finished in the next hour?”

We videographers have a lot of ideas about what video should be.  It needs to sound good, it needs to be well-lit, in focus, and, well, beautiful.   Maybe it needs to be color-corrected, too.  And some nice graphics would make us look like we know what we’re doing.  And the message needs to be perfectly clear.   And we need to persuade our audience.  We want them to take action.

In other words:  we have to take time to get this video exactly right.

But is it worth it?

Have you ever wondered whether it’s worth spending days color-correcting your shots?  Or spending an hour setting up lights?  Or hauling all that heavy gear around?  Even if you’re not wondering, your clients might be.

And what if the message isn’t as clear as you think?  What if the video only makes sense to you?  Worst of all, what if your boss (or client, or colleagues) is right, and you really should make all the changes they ask for?

Wouldn’t it be nice if you could avoid all the battles-of-opinion and just know which style is the one that’ll have the greatest effect on your audience?

 

Enter:  the split-test.

As you might’ve gathered, we here at Fixing Your Video are big fans of science.

Have you ever heard of a technique called split-testing?  It’s used all the time for testing new website designs.  Here’s how it works:

  • Two (or more) versions of a website are designed.
  • Visitors to the site are randomly served Version A or Version B.
  • If Version A gets more hits, or collects more email addresses, or whatever, it’s declared the winner!
  • Version A becomes the new website design.
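The random-assignment step above can be sketched in a few lines of JavaScript.  (This is just an illustration of the concept – the function and file names are made up, not anyone’s real API.)

```javascript
// A minimal sketch of random assignment in a split test.
// Each visitor is served one variant at random, so over many
// visitors each version receives roughly half the traffic.
function pickVariant(variants) {
  const index = Math.floor(Math.random() * variants.length);
  return variants[index];
}

// Hypothetical example: two cuts of the same video.
const versions = ["intro-long.mp4", "intro-short.mp4"];
const served = pickVariant(versions);
```

In a real test you’d also want to remember the choice (in a cookie, say), so a returning visitor keeps seeing the same version.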

It may come as no surprise that you can do the same thing with video.

At its core, videography is an art – but there’s no reason it can’t also be a science.  The next time you sit down at your workstation to bring out the skin tones and recover some of those shadows, and you just don’t feel like bothering, make two versions – one with the color correction and one without – and split-test them!  Maybe your viewers will love the untouched version just as much, freeing you up to do bigger and better things.  Or maybe you’d like to try a new editing technique.  Or some new graphics.  Or anything, really.

But what if you discover, through testing, that your audience doesn’t seem to care how good the video looks?  There’s always that thought in the back of my head:  maybe I’m wrong.  Maybe, in my current situation, it’s really not worth my time, and I might as well make something that… well, sucks.

Instead of stubbornly sticking to my guns, I’d rather just know.  I want me some hard data.  Does our audience care, for instance, about a polished product?  If pretty lighting makes audience retention go up, then you’re totally justified.  And the next time someone asks you not to bring any lights, just point to the data!  But if it is pointless for you to make this video look nice, then maybe your time is better spent tweaking your script.  Or making a difficult concept a little more clear.

 

I’m intrigued.  How the heck do you pull this off?

All it takes is a little bit of JavaScript.  Using this sweet tool we made (based loosely on the sample provided so kindly by Wistia), you can generate A/B codes for your favorite video provider.  Just supply it with the URLs of the two (or more) videos you’d like to test, and it’ll give you a new-and-improved embed code that’ll choose a video at random and play it.  Afterwards, you can look at your analytics data and see which one was the most effective!

Here’s a permanent link to the A/B testing tool.

(By the way:  This is a beta version.  If it does not work, please hit me up by email or on Twitter so that I can fix it.)
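We won’t reproduce the tool’s actual output here, but conceptually an A/B embed code boils down to something like this.  (The URLs and the container ID below are placeholders, not what the tool really generates.)

```javascript
// Conceptual sketch of an A/B embed snippet: choose one candidate
// video URL at random, then load it into the page's player.
// The URLs and element ID are placeholders for illustration.
function chooseVideoUrl(urls) {
  return urls[Math.floor(Math.random() * urls.length)];
}

const candidates = [
  "https://example.com/embed/video-a",
  "https://example.com/embed/video-b"
];
const chosen = chooseVideoUrl(candidates);

// In the browser, drop the chosen video into an iframe:
if (typeof document !== "undefined") {
  const frame = document.createElement("iframe");
  frame.src = chosen;
  frame.width = "640";
  frame.height = "360";
  document.getElementById("video-slot").appendChild(frame);
}
```

Since each visitor only ever loads one of the two URLs, your analytics will report each video’s numbers separately – which is exactly what you need to compare them.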

 

So… what, exactly, am I supposed to test with this tool?

Here are some ideas.

  • Does this video work better with the whistle-and-clap music, or something a little more “corporate”-sounding?
  • Male or female voiceover?
  • Do we really need to painstakingly slash this video down to one minute?  Can’t it be five minutes?  Or ten?  Will our audience really lose interest that quickly?
  • Do we need to keep the long opener, or cut it?
  • Does it really improve our retention rate to have an animated intro?  Couldn’t that also hurt our retention rate?

Make two videos, upload them, plug their embed codes into this tool, and see which one has a better retention rate.  Or a higher view count, or a greater number of shares, or whatever you’d like to improve.

It might not be practical to split-test every lighting configuration, but it’d be easy to split-test, say, a rush job versus a video that’s been polished to a high shine.  It’s a new spin on the old “Can’t we try it both ways?” tactic.  Nearly every style question can be turned into a hypothesis that can be tested.

 

Just one caveat:  you’ll need a pretty big audience to get reliable results.  This tool can help you figure out how big it needs to be.  Here, also, is a good article on how to conduct these tests in a way that wouldn’t make a seasoned statistician chuckle.
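For a ballpark figure, here’s the standard normal-approximation formula for how many viewers each version needs.  Treat it as a rough sketch – a proper calculator is more careful – and note the default z-values correspond to a 5% significance level and 80% power.

```javascript
// Rough per-variant sample size for comparing two conversion
// (or retention) rates, using the simple normal approximation:
//   n = (zAlpha + zBeta)^2 * (p1(1-p1) + p2(1-p2)) / (p1 - p2)^2
// p1: baseline rate, p2: the rate you hope to detect,
// zAlpha: z for the significance level (1.96 ≈ 5%, two-sided),
// zBeta: z for the desired power (0.84 ≈ 80%).
function sampleSizePerVariant(p1, p2, zAlpha = 1.96, zBeta = 0.84) {
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  const effect = p1 - p2;
  return Math.ceil(((zAlpha + zBeta) ** 2 * variance) / (effect ** 2));
}

// Example: telling a 20% retention rate apart from a 25% one
// takes on the order of a thousand viewers per version.
const needed = sampleSizePerVariant(0.20, 0.25);
```

The takeaway: the smaller the difference you’re trying to detect, the more viewers you need – which is why small-audience tests so often produce noise instead of answers.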

 

Now, if we could just ask a favor…

We’d like you to watch this video.

This is Mouse.  You could call him our mascot, but he’s really just everyone’s best friend.  Here he is doing some cool tricks.  (Consider it a treat for making it to the end of this post!)

Are we conducting a split test right now?  Maaaaaaybe.

 

(This post was revised on October 8, 2015.)
