When it comes to programmatic ad-tech companies, they all seem to have three things in common. (a) They all seem to have more data than everybody else, (b) they all seem to have a longer algorithm than everybody else, and (c) they all seem to have a company founder who single-handedly invented the internet whilst doing one-armed push-ups.
If you judge these companies purely by their sales collateral, it becomes very difficult to separate the wheat from the chaff. It’s my job to look at these technologies all day, every day, and I worry that if I can’t figure out which technology is good and which isn’t, how the hell is a CMO going to tell the difference?
Therefore, I decided to run another head-to-head test of over a dozen of the world’s largest DSPs to prove emphatically which technology was working better than the rest. The results, however, took me by surprise.
Note: the list above is only a sample of ad-tech companies, not the definitive list of who was involved in the test. Please also don’t ask which ad-tech vendors participated in the test, as I can’t say.
Defining the Test
This isn’t my first rodeo when it comes to reviewing ad-tech, so I was pretty sure I knew how to set up the tests from a digital ad-tech perspective. Firstly, I needed help from a third-party ad server, so I asked my friends at Sizmek if they would help. Check! They offered to help out free of charge as they realised that these tests would (hopefully) benefit the industry. Thanks to my great friends at Sizmek (thanks, Imran Masood). Then I needed an ad-operations team; these are the men and women behind every digital ad campaign. They are the unsung heroes of the digital advertising industry, and luckily I had one such team working for me, so a huge thank you to this team for running the test. Specifically to Carmen Feleo, who put in countless hours behind this test.
Then I needed a client: an advertiser with an e-commerce brand, somebody who would be able to provide me with some creative and didn’t mind if I deployed a massive amount of tracking pixels on their site. I asked a friend of mine, Malcolm Ball, as he runs a company that sells health products. The product is called Nutrikane, and it’s a dietary supplement that helps maintain blood-glucose levels. The upshot is that if you mix this powder with water and take it once a day, it can help regulate your blood sugar levels, which is a big deal for people who are diabetic. It was nice to know that some of this programmatic testing was actually helping diabetics.
The goal of the test was to find out which of the DSPs could drive the lowest CPA for purchases of Nutrikane online. In my mind this test was very simple; it was an old-fashioned shoot-out between some of the world’s largest programmatic companies. However, what happened next really shocked me.
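For readers less familiar with the metric the shoot-out was judged on, here is a minimal sketch of how CPA (cost per acquisition) is calculated. The DSP names and figures below are hypothetical, purely for illustration; they are not results from the test.

```python
def cpa(spend, conversions):
    """Cost per acquisition: total ad spend divided by conversions.
    Returns None when there are no conversions (CPA is undefined)."""
    if conversions == 0:
        return None
    return spend / conversions

# Hypothetical spend and conversion counts for two made-up DSPs.
results = {
    "DSP A": {"spend": 5000.00, "conversions": 40},  # CPA = $125.00
    "DSP B": {"spend": 5000.00, "conversions": 25},  # CPA = $200.00
}

for name, r in results.items():
    value = cpa(r["spend"], r["conversions"])
    print(f"{name}: CPA = ${value:.2f}")
```

In a head-to-head like this, the DSP with the lowest CPA at comparable spend is the one whose buying and optimisation are working hardest.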
60% of the DSPs said NO!
When I called these companies and explained that we wanted to run a test, they were at first very excited. However, when I explained that other DSPs were also involved in the test, they became more hesitant and many of them pulled out. Here are some of the reasons the majority of the DSPs gave for not wanting to participate.
- I’m very interested in participating in the test, but unfortunately this was not the right timing.
- At this stage we won’t take part on this test but we are definitely in next time for a “prospecting test”.
- If you want to truly test the power and efficiencies of an optimization algorithm, they must provide a campaign with enough data (at least 3,000 pixel fires per month) to be statistically significant.
- So, we’ll sit out this test and we hope another opportunity opens up for a more robust test in the not too distant future.
- We just don’t believe it would be a true reflection of the real time machine learning and platform.
- We need between 300 – 1,000 conversion points on site. Sorry.
- My concern is the focus on pure CPA, which puts us fair and square in the red ocean (lower funnel DSPs bloodbath/Dogfight). A place we are consciously not playing.
- I feel that testing vendor effectiveness and efficiencies on a larger scale will yield a much more definitive outcome and we would love to be a part of it. – Followed by “Sorry. We changed our minds.”
If you are a CMO thinking about using a DSP to launch a brand from scratch, there are a number of key takeaways from these findings. I can summarise them as follows:
- Minimum Quantity of Conversion History – Many DSPs will require a minimum number of conversions in order for their ‘algorithm’ to be able to build look-alike models. This number varied massively from vendor to vendor, but to be on the safe side, you will need circa 1,000 conversions per month for the data modelling to be useful.
- Upper versus Lower Funnel – Some DSPs consider themselves “prospecting” technology and others “retargeting” technology. I have no problem working with either, however, ask your DSP whether their core business is upper- or lower-funnel activity before engaging with them.
- Oranges vs Mangoes – You may have heard me talk about this before, but I feel there are two types of ad-tech in the world. Mangoes are ad-tech companies with a large, solid core of technology wrapped in a relatively thin layer of soft client-service skills. These are good companies, as you need a large, solid piece of technology at the core of a DSP. Then there are other companies (Oranges) with only a tiny amount of tech wrapped in a very large amount of soft, fleshy service. It all looks good from the outside; however, when you cut it open you see just how small the tech really is.
You may be wondering “How did the test go?” or “Who won?”. Well, you are just going to have to wait until next week, when I’ll write another short article about how the test ran.