Shortly after inventing the light bulb, Thomas Edison was asked how it felt to fail so many times before finally achieving his goal. He famously responded that he did not fail but that he successfully discovered several hundred ways not to make a light bulb.
Inbound marketing sometimes looks a lot like finding several ways not to increase business before finally hitting on the right formula that generates customer response. Then, because the public's attitudes, tastes, and opinions are ever-changing, we keep tweaking our message in response to new circumstances to optimize performance.
One way to spend less time discovering ways not to engage customers is A/B testing, also called split testing, applied to your calls to action (CTAs). You already know that CTAs help increase engagement with your customers, but where one button might generate 50 responses, a different button might generate 150. This is where A/B testing enters the equation.
A/B testing is simple in concept and really goes back to the basic scientific method. You set up two buttons to see which triggers the stronger reaction from your audience, then collect data (click-through rate and click-to-submission rate) to determine which option generated the better outcome. The call to action with the most responses wins. Voila!
Of course, nothing is quite so simple, so let's break down the parts of a successful A/B test for your calls to action.
First, determine your desired outcome. What do you really want to happen as a result of your call to action? You might aim to increase the click-through rate on a specific ad, get customers to request a free consultation, or grow your newsletter subscription rate.
Then, generate two different calls to action that are identical in every way except for one element. Most often, A/B tests gauge the effectiveness of language (for example, "call now" versus "visit us today"), but you can also test design elements like the shape, color, copy, or placement of your buttons. Every other element of your website must stay the same so you know any difference in response is due solely to that call to action.
Next, you show each call to action to a separate, randomly assigned group: Group A sees only option A and Group B sees only option B. And then you wait, which may be the hardest part of A/B testing. Give your test ample time to collect enough data to make the comparison meaningful.
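If you're curious what that random split looks like under the hood, here is a minimal sketch in Python. The `visitor_ids` list and the `split_audience` helper are hypothetical names for illustration; any unique per-visitor key (a cookie ID, an email hash) would work the same way, and most marketing platforms do this assignment for you.

```python
import random

def split_audience(visitor_ids, seed=42):
    """Randomly assign visitors to Group A or Group B (50/50 split).

    A fixed seed makes the split reproducible for this example;
    a live test would use a fresh random assignment.
    """
    rng = random.Random(seed)
    shuffled = list(visitor_ids)
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

group_a, group_b = split_audience(range(1000))
print(len(group_a), len(group_b))  # 500 500
```

The key design point is that assignment is random and exclusive: no visitor lands in both groups, so any difference in response can be attributed to the CTA and not to who happened to see it.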
Finally, you assess the results, which is where the rubber meets the road: you find out whether you discovered another way to not generate web traffic or finally hit the magic combination that keeps outperforming every other option. Programs like HubSpot make analyzing the data simple and can track multiple metrics beyond raw clicks, such as views-to-clicks %, submissions, and clicks-to-submissions %.
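The arithmetic behind those metrics is straightforward, and a quick significance check tells you whether a gap between versions is likely real or just noise. Below is a sketch using invented numbers echoing the 50-versus-150-clicks example from earlier (these are not the results shown later in the post); the standard two-proportion z-test stands in for whatever statistics your analytics platform runs for you.

```python
import math

def rate(successes, total):
    """Simple conversion rate, e.g. views-to-clicks % or clicks-to-submissions %."""
    return successes / total if total else 0.0

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: is the difference in click-through rate
    between Version A and Version B bigger than chance would explain?"""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (p_a - p_b) / se

# Hypothetical test: each version shown to 2,000 visitors
print(f"CTR A: {rate(50, 2000):.1%}")   # 2.5%
print(f"CTR B: {rate(150, 2000):.1%}")  # 7.5%
z = two_proportion_z(50, 2000, 150, 2000)
print(f"|z| = {abs(z):.2f}")  # |z| above 1.96 means significant at the 95% level
```

Here |z| comes out well above 1.96, so the gap between the two buttons would be statistically significant, which is exactly the kind of confidence you want before declaring a winner.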
Here are real results from an A/B test:
You can see from these results that the CTA button Version A (with the dark blue background) performs much better. The next step would be to eliminate the white-background Version B and start another A/B test, this time pitting the winning Version A against a new Version B with a different button shape or text.
You can almost imagine Thomas Edison performing split testing as he sought out the perfect combination of materials for his light bulb. Testing your calls to action will get you to the success you're after faster and, like Edison, show you what not to do along the way.
Do you go with your gut most of the time, or do you test your ideas using marketing analytics? Let us know if you're Version A: Go With Guts Guy/Gal or Version B: Ruled by Stats. Which version are you?