Should you worry about opens and clicks?
Marketers often rely on open and click rates for quick campaign performance evaluations. Yet these metrics have long been known to be inaccurate. We tend to believe that things on the internet improve over time. That is not the case with email metrics - they are getting less accurate. Why? Should you be worried, and should these metrics be abandoned altogether?
Opens and clicks, or more accurately open rates and click rates, have long been the most important metrics for email marketers. A single number tells how well a campaign has performed. A single number to judge engagement by. The underlying interactions have been the go-to data for recipient targeting, marketing automation, and list-cleaning operations. However inaccurate, there are many use cases for this interaction data.
Over the past couple of years, we watched these KPIs grow steadily until everything broke - Apple announced its Mail Privacy Protection (Apple MPP for short). That was the moment open rates skyrocketed. Marketers could no longer trust their data.
But the problem started long before MPP - the steady growth of open rates was one of the symptoms.
For the longest time, the tracking of opens was considered as accurate as it could be. We knew about the inaccuracies caused by tracking images not being loaded, email clients blocking images for security reasons, and so on. The overall consensus in the industry was that, while inaccurate, the data represented real interactions and could therefore be trusted.
Unlike opens, clicks were considered highly accurate. Clicks could not be easily intercepted by email clients (as always, there are exceptions) and provided firm data. Over time, however, we started to see more and more emails opened and links clicked in patterns that did not add up. The opens and clicks occurred at extremely high speed - messages opened immediately after delivery, and multiple links from a single email clicked within a single second. These were clearly non-human interactions!
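To make those patterns concrete, here is a minimal Python sketch of how such interactions stand out in raw event data. The thresholds and event format are illustrative assumptions for this sketch, not the rules any particular platform actually uses:

```python
from datetime import timedelta
from collections import defaultdict

# Illustrative thresholds - assumptions for this sketch, not real detection rules.
OPEN_SPEED_LIMIT = timedelta(seconds=2)    # "opened immediately after delivery"
CLICK_BURST_LINKS = 3                      # distinct links clicked within one second

def looks_non_human(delivered_at, events):
    """events: list of (kind, url, timestamp) tuples for a single message,
    where kind is "open" or "click" and timestamp is a datetime."""
    opens = [ts for kind, _, ts in events if kind == "open"]
    clicks = [(url, ts) for kind, url, ts in events if kind == "click"]

    # Pattern 1: an open recorded almost instantly after delivery.
    if any(ts - delivered_at < OPEN_SPEED_LIMIT for ts in opens):
        return True

    # Pattern 2: several distinct links from one email clicked in the same second.
    links_per_second = defaultdict(set)
    for url, ts in clicks:
        links_per_second[ts.replace(microsecond=0)].add(url)
    return any(len(urls) >= CLICK_BURST_LINKS for urls in links_per_second.values())
```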
We had so many questions… Did Rossum’s Universal Robots* escape to conquer the world of email?! How do we stop them?! How do we make the analytics accurate again?
We soon found that the Robots opening emails and clicking the links did not come to conquer email. They came to protect it - from the evil Robots! (Oh yes, there is such a thing as an evil Robot - spam, phishing, and malware are often sent and controlled by these evil Robots.) By opening the messages and following the links inside, the good Robots were able to identify phishing attacks and spam campaigns faster and better than before, protecting individual recipients.
Clearly, these were not robots we wanted to stop, or should stop. We had to find a way to let them do their work while keeping our analytics as accurate as possible. And we were not the only ones affected - our industry peers were facing the same problem.
The mission was clear - we had to find a way to identify such robots without interfering with them. You would think that if we could spot the robots in our data, flagging them would be simple. Not quite, because most of these robots want to stay undetected so they can stop the bad robots. The detection method had to be efficient, accurate, and work in real time.
After weeks of testing machine learning (because that's the cool tech), which proved resource-heavy and could not run in real time, we came up with a different method - one that relies on multiple data points about the source of each interaction. Our datasets allow us to identify non-human interactions in real time and flag them as such. Non-human interactions no longer affect analytics, targeting, or automation, as they are not counted as opens or clicks but as bot-opens and bot-clicks instead.
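As a rough illustration of the idea, a rule-based classifier of this kind can be sketched in a few lines of Python. Every network, user-agent marker, field, and threshold below is a placeholder for illustration only; the actual data points and rules used in production are not the ones shown here:

```python
import ipaddress
from dataclasses import dataclass

# Placeholder signals - illustrative assumptions, not production detection data.
KNOWN_SCANNER_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]   # documentation range
KNOWN_SCANNER_AGENT_MARKERS = ("ExampleLinkScanner", "ExamplePrefetchProxy")

@dataclass
class Interaction:
    kind: str                      # "open" or "click"
    ip: str                        # source IP of the tracking request
    user_agent: str                # User-Agent header of the tracking request
    seconds_after_delivery: float  # how quickly the interaction followed delivery

def classify(event: Interaction) -> str:
    """Label an event as 'open'/'click' or 'bot-open'/'bot-click'."""
    ip = ipaddress.ip_address(event.ip)
    if any(ip in net for net in KNOWN_SCANNER_NETWORKS):
        return f"bot-{event.kind}"
    if any(marker in event.user_agent for marker in KNOWN_SCANNER_AGENT_MARKERS):
        return f"bot-{event.kind}"
    if event.seconds_after_delivery < 2:        # suspiciously fast
        return f"bot-{event.kind}"
    return event.kind
```

The point is not the specific rules but the approach: a handful of cheap lookups per event keeps the flagging fast enough to run inline, which a heavier machine-learning model could not do for us.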
The crucial moment came with the launch of Apple's MPP. As Apple MPP was heavily publicized, many marketers asked us about the feature and how we were going to respond. For us this was not a big deal - just another good robot doing its work (even though the motivation was different this time). For the marketers it was a major event - in some cases it could mean 80% open rates on their campaigns… except none of those opens would be real. We spent a week monitoring the behavior before we enabled the detection patterns that flag Apple MPP opens as non-human.
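To show why that flagging matters for reporting, here is the arithmetic with made-up numbers (not real campaign data):

```python
# Made-up numbers, only to illustrate the gap between raw and human open rates.
delivered = 10_000
tracked_opens = 8_000    # everything the open pixel recorded, MPP prefetches included
bot_opens = 6_200        # opens flagged as non-human (Apple MPP and other robots)

raw_open_rate = tracked_opens / delivered                   # 80% - looks great
human_open_rate = (tracked_opens - bot_opens) / delivered   # 18% - the real engagement

print(f"raw: {raw_open_rate:.0%}  human: {human_open_rate:.0%}")
```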
We implemented our detection technique in Mailkit about 5 years ago (long before Apple MPP), and Omnivery has had it since its launch 2 years ago. It is not 100% accurate, but it has a very low false-positive rate. We review the underlying datasets every month to make sure our customers always get the most accurate data. Reliable engagement data allows our customers to remove unengaged recipients from their lists and drive marketing automation toward the engaged ones.
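For example, list cleaning driven only by flagged, human interactions might look like this simple sketch; the record format and the 180-day inactivity window are illustrative assumptions, not a recommendation:

```python
from datetime import datetime, timedelta

# Illustrative inactivity window - an assumption for this sketch.
INACTIVITY_WINDOW = timedelta(days=180)

def engaged_recipients(recipients, now=None):
    """recipients: iterable of dicts with 'email' and 'last_human_interaction'
    (a datetime or None). Bot-opens and bot-clicks never update that field."""
    now = now or datetime.now()
    return [
        r["email"]
        for r in recipients
        if r["last_human_interaction"] is not None
        and now - r["last_human_interaction"] <= INACTIVITY_WINDOW
    ]
```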
The effect of non-human interactions varies country by country and industry by industry. We had the privilege of participating in writing a paper on this topic within M3AAWG - Exploring the Impact of Nonhuman Interactions on Email Send Metrics. No matter where you stand, non-human interactions are affecting your email campaigns more than you may think. How is your platform dealing with them?
* "R.U.R." (Rossum's Universal Robots) is a science fiction play written by Czech author Karel Čapek. The play is set in a factory that produces artificial workers called "robots" based on the Czech word “robota” for forced labor. "R.U.R." is credited with introducing the word "robot" into popular culture and has had a lasting impact on science fiction literature and popular culture.