Your new digital marketing campaign may be motivated by any number of goals – you may be aiming to increase revenue, generate leads or earn likes and shares on social. So, now we’re going to answer the question: How can you guarantee the most successful marketing campaign? Just kidding. (tap, tap… is this thing on?)
Guess what? There’s no formula. There’s no “this will work every time exactly this way.” So what do we recommend when approaching a new digital marketing campaign for your brand? Change it up, introduce some A/B testing into the mix, expand on what works and get rid of what doesn’t. The greatest risks often yield the greatest rewards. Remember, “don’t let the fear of striking out keep you from playing the game.” Your approach to testing will make all the difference between a foul ball and a home run.
1. Evaluate current practices.
Look at past campaigns and whether they were successful. What variables were unique to each campaign? What do you think specifically worked for these campaigns? What didn’t work? Were you happy with the end results, or did you see obvious room for improvement? Make note of these variables – they will be good “test subjects” for the future.
2. Pick test subjects.
Treat this like a lab experiment. You will need test variables to alter in order to know what works best. Variables we suggest changing up to keep you on your toes include:
- Ad Copy
- Email Copy
- Email Subject Lines
- Landing Page Copy
- Calls to Action/Offers
- Homepage Content and/or Layout
- Social Copy
- Form Fields
Prioritize the items on this list that are most relevant to your organization and that you think will have the greatest impact on your audience. For example, if email marketing is a big part of your overall marketing strategy, consider prioritizing “email subject lines” – the subject line decides whether or not someone will actually open your email, so it could be worth testing a couple of variations.
3. Use A/B testing.
First, let’s review A/B testing. A/B testing is an experiment you run on two versions of one piece of content (two versions of an email subject line, for example). The two versions are then sent to two different, equal-sized groups. After sending, you can analyze metrics and see which version performed better. Remember to test one variable at a time so you can draw accurate conclusions from your data.
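If you’re running a test manually, the random, equal-sized split is worth getting right – slicing a sorted list (alphabetically, by signup date) can quietly bias one group. Here’s a minimal sketch in Python; the function name and the example email addresses are our own illustration, not part of any particular platform:

```python
import random

def split_audience(recipients, seed=None):
    """Shuffle a recipient list, then split it into two equal-sized groups.

    Shuffling first keeps the groups comparable; slicing an ordered list
    could put all of one segment (e.g., earliest signups) in one group.
    """
    shuffled = list(recipients)
    random.Random(seed).shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]  # group A, group B

# Illustrative usage with made-up addresses:
recipients = [f"user{i}@example.com" for i in range(1000)]
group_a, group_b = split_audience(recipients)
```

Each recipient lands in exactly one group, and with an even-length list both groups come out the same size.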
To start, create two versions of your campaign component. One will be your control – this will resemble what you’ve done in the past (A). The other will be your variation – this is your opportunity to try something new (B). Platforms such as HubSpot have a built-in A/B testing tool to make this process more streamlined and manageable. (By the way, we also recommend their Intro Guide to A/B Testing.) If you’re not using a marketing automation platform like HubSpot, you can also run tests manually.
Give both versions your very best. They should be different in style and tone, but not in quality – both should still point your audience toward the desired goal. Launch version A and version B to randomized groups of similar size at the same time, then give the test time to develop data. Make sure you wait long enough to gather a measurable amount of interaction. Then it’s time to evaluate. Always evaluate.
- Which of the two versions was most successful?
- Why do you think that is?
- Are there changes you need to make to your buyer persona profiles to reflect your customers’ preferences?
- Have you added or lost customers since your last test?
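One way to judge whether the gap between your two versions is real or just noise is a two-proportion z-test on the conversion counts. The sketch below uses only the Python standard library; the conversion numbers in the usage line are hypothetical, purely for illustration:

```python
from math import sqrt

def ab_z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: how many standard errors apart are the
    conversion rates of version A and version B?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se  # |z| > 1.96 is roughly significant at the 95% level

# Hypothetical results: 50/1000 conversions for version A, 70/1000 for version B.
z = ab_z_score(conv_a=50, n_a=1000, conv_b=70, n_b=1000)
```

A |z| above roughly 1.96 means the difference would be surprising by chance alone; anything smaller falls into the “no significant difference” case.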
Even if you don’t see a significant difference in the performance of the two versions, don’t let it get you down – “there’s no crying in baseball!” Learn from the experiment and move on to the next test.
On the flip side, if there is a significant difference between how your two variations performed, that’s awesome! Apply what you learned to future campaigns, but don’t get comfortable just yet! Keep testing. What else can you alter slightly and test against this success?
Basically, what we’re saying is: there’s always room for improvement. And A/B testing? It’s as easy as 1, 2, 3.
At Pyxl, we are always eager to test new ideas because we see every experience as an opportunity to be better. If you want to work with us on improving your next campaign, contact us today.
Updated: Nov 06, 2020