Jason Puckett

April 19, 2016

5 Steps to Test and Optimize Your AdWords Mobile Ads

Benchmarking, testing and optimizing your mobile preferred ads in AdWords is more important than ever. In 2016, mobile traffic will surpass desktop traffic for most verticals, and AdWords is already accounting for this. Our data shows that mobile preferred ads earn a drastically higher CTR than your current benchmarks. Let’s look at how and why.

Mobile shoppers are a force to be reckoned with, and digital marketers must adjust their strategies. How do you know if your ads are mobile-optimized? How do you know if your strategy is working? AdBasis has crafted a five-step plan for you to benchmark, test and optimize your mobile ads.

What is the goal here?

  1. To measure exactly how much “mobile device preference” improves CTR compared to “no preference”.

  2. To find the optimal ad content for both “mobile preferred” and “no preference” ads. In other words, to optimize by device.

Creating exact copies of search ads within the same ad groups and enabling Mobile Device Preference yields a huge improvement in CTRs. AdBasis customers have run this exact experiment 50+ times, and the average improvement to CTR is 40.1%. Yes, you read that correctly. See some examples below:

Here’s a good one:

Mobile Benchmark 1

And another:

Mobile Benchmark 2

Why Does This Drive Such High Improvement?

There are two reasons why mobile preferred ads will help improve your CTRs:

  1. AdWords actually prefers these ads and gives them higher positioning. Not many AdWords managers realize this. If AdWords is weighing your ad against a competitor’s for the same keyword at the same bid, your ad will outrank theirs and receive better positioning when the user is searching from mobile.

  2. You should be using mobile specific content. Mobile users respond to ads that speak to the fact that you have a great mobile experience. Read the full report here.

Ok, Mobile Preference works. What now?

Here is a bit of instruction on a multivariate test we love to execute for mobile benchmarking:

There are two variables being tested here: (1) a single creative change and (2) a change to device preference.

Five easy steps:

1. Make your creative change

Examples (select one):

  • Two headline variations
  • Two description line (D1 & D2) combinations
  • Two display URLs

This will essentially be an A/B test from a creative content standpoint.

2. Create each ad combination for both device preference settings.

You will essentially have four different ad variations in this test.

2 creatives x 2 devices = 4 total variations
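As a quick sketch, the full variation matrix is just the cross product of your creative variants and the two device preference settings (the creative names below are hypothetical placeholders, not AdBasis data):

```python
from itertools import product

# Hypothetical creative variants and AdWords device preference settings
creatives = ["Headline A", "Headline B"]
device_prefs = ["mobile", "no preference"]

# 2 creatives x 2 device settings = 4 total ad variations
variations = [
    {"creative": c, "device_preference": d}
    for c, d in product(creatives, device_prefs)
]

for i, v in enumerate(variations, start=1):
    print(f"Variation #{i}: {v['creative']} / {v['device_preference']}")
```

Each of these four variations gets its own ad in every ad group you include in the test.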

Mobile Benchmark 3

3. Choose Your Ad Groups

It is always an AdBasis best practice to distribute your experiments across as many ad groups as possible. If you are unable to distribute tests easily across multiple ad groups because your ad groups all require unique content, please consider using AdBasis custom parameters. Testing across ad groups will allow you to aggregate data for faster and more reliable decisions.

4. Distribute and Collect Data

Aggregate data by creative variation (#1-4) and measure how each ad performs across all ad groups as a whole. Once you reach 90% confidence, move on to step 5.
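For a sense of what “90% confidence” means here, one common approach is a two-proportion z-test on the aggregated click and impression totals. This is a minimal sketch with made-up numbers, not the exact statistical method AdBasis uses:

```python
import math

def ctr_significance(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test on CTRs.

    Returns (lift, confidence), where lift is B's relative CTR
    improvement over A and confidence is the one-sided probability
    that B's true CTR beats A's.
    """
    ctr_a = clicks_a / imps_a
    ctr_b = clicks_b / imps_b
    # Pooled CTR under the null hypothesis (no difference)
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (ctr_b - ctr_a) / se
    # Normal CDF via erf gives the one-sided confidence level
    confidence = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    lift = (ctr_b - ctr_a) / ctr_a
    return lift, confidence

# Hypothetical aggregated totals: ad A vs. its mobile preferred copy B
lift, conf = ctr_significance(200, 10_000, 280, 10_000)
print(f"Lift: {lift:.1%}, confidence B > A: {conf:.1%}")
```

If the confidence value is at or above 0.90, you have enough signal to declare a winner and move on.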

5. Final Step: Pick 2 Ads

The conclusion of this experiment will be two winning ads. You will have a winner for “mobile preferred” and a winner for “no preference”. You will have a great understanding of which ad creative works best for each device, and you will also understand how big of an impact mobile device preference has on your AdWords campaigns.

It should be your goal to have an optimal ad for mobile and desktop within every ad group in your AdWords account. Sometimes that’s easy, sometimes it’s not. If you’re looking for a solution to implement this through technology, we’d love to chat.

Have a question for us? We'd love to hear from you!
