About Larry Kim
Larry Kim is the CEO of MobileMonkey, Inc., a provider of chatbots for mobile marketers. He's also the founder of WordStream, the world's largest PPC management software company, managing $1 billion in ad spend for over 10,000 customers worldwide and employing over 200 people. Larry's blogs on marketing and entrepreneurship are read by millions of people every month. In 2015, he won Search Marketer of the Year from PPC Hero, Search Engine Land, and the US Search Awards.
In his Learn Inbound talk, Larry reverse engineers how RankBrain and other Google machine learning algorithms actually work, exposing various critical vulnerabilities. And he shares unusual, completely white-hat ways to greatly benefit from future algorithm updates. The biggest SEO ranking factor shift of all time is now underway, moving away from links and keywords towards RankBrain user engagement signals. Watch this session if you want your rankings to live. Join the resistance today!
- Google is using machine learning for image search, speech recognition, dynamic translation, YouTube, and AdWords (which is entirely based on ML for ad targeting).
- RankBrain, described by Google as the "third most important ranking factor," has been affecting long-tail queries since the second half of 2015.
- Identify which pages are unlikely to survive SEO Judgment Day (also known as donkeys). In Search Analytics, find which pages have low CTR and high bounce rates and improve them. Define what a good CTR is based on your own data in your niche.
Hey guys. Good evening, Dublin, thanks for having me, it's so great to be here, thank you organizers and sponsors. Before we begin this evening, just an important disclaimer: there are different types of presentations, and in preparing this content I didn't just read a bunch of Google press releases and summarize them for you. This is a different type of presentation. I'm going to be sharing with you some interesting experiments and theories about how SEO is really working, and it's a little bit forward looking, but I think that's an important aspect of SEO; it's not just about doing what everyone else says to do. There's a little bit of experimentation and stuff like that, and that's why we love it.
All right, so with that out of the way, let's begin. So imagine a day, not too far off in the distant future: you're just doing your SEO stuff, content marketing, only to find all of the rankings that you worked so hard on missing from the search results page, obliterated in a microsecond by a new order of machine-learning-enabled algorithms. And that's exactly what happened two years from today, on SEO Judgment Day, when Google Search changed the way they rank stuff in the biggest way ever. It wasn't just rankings; it was algorithms for spam detection, image search, video search, query interpretation, an all-new generation of algorithms powered by machine learning. And so millions of websites were impacted, including websites that had managed to fly under the radar of Penguin and Panda and all these other previous updates.
So in a desperate attempt to save our future and our children's future from these algorithms, John Connor, the leader of the SEO resistance in the year 2020, has sent me back in time, equipped with weapons and tactics from the future, to warn you about the impending changes to come for your websites before it's too late. So SEO Judgment Day actually happens two years from now; that's September 2018, Google's 20th birthday. Now, don't quote me on that, time travel is very tricky, and my presence at this conference has notably altered the timeline. So let's get into the details here.
Before talking about all the tactics and tricks, I want to make sure we're all on the same page here in terms of machine learning and what all of this means. All right, so Google Search algorithms today, or at least historically, weren't machine learning enabled; they were what's known in computer science as heuristic algorithms. A heuristic algorithm, or an expert system, is just a big, hand-coded mess of code; they basically wrote a big program to organize all the information in the world. It's not dynamic and learning: every time they need to make a change, they actually rewrite sections of this big algorithm and release it periodically, like every six months or so they'll do a big push and you'll see the rankings change. So that's kind of the historical approach to this algorithm.
Machine learning is a little bit different; it learns. By definition, there's a computer program that's trying to check whether the page and the content being served match the intent of the user's search. If yes, then the computer says, "Wait, this is a great result, we should probably remember this and use it in the future." If it's a fail, if the document does not match the intent of the query, then it remembers that too, and is not likely to use that result again in the future. All right, so before we move on here, are you with me? Does everyone understand the difference between machine learning and a conventional heuristic system?
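To make that distinction concrete, here's a toy sketch in Python of the remember-what-worked feedback loop just described. This is very much my illustration, not anything Google has published; the class name and the satisfied/not-satisfied signal are invented for the example.

```python
# Toy feedback loop: results that satisfy a query get remembered and
# preferred next time; results that fail get demoted. A heuristic system,
# by contrast, would rank the same way every time until it was rewritten.
from collections import defaultdict

class FeedbackRanker:
    def __init__(self):
        # (query, document) -> running success score
        self.wins = defaultdict(int)

    def record(self, query, doc, satisfied):
        """After serving `doc` for `query`, note whether the user was happy."""
        self.wins[(query, doc)] += 1 if satisfied else -1

    def rank(self, query, docs):
        """Order candidate docs by what has worked for this query before."""
        return sorted(docs, key=lambda d: self.wins[(query, d)], reverse=True)

ranker = FeedbackRanker()
ranker.record("ppc tips", "page_a", satisfied=False)
ranker.record("ppc tips", "page_b", satisfied=True)
print(ranker.rank("ppc tips", ["page_a", "page_b"]))  # page_b now ranks first
```

The point of the sketch is just that the ranking changes on its own as feedback accumulates, with no one rewriting the algorithm.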
All right. So where does Google use machine learning? There was quite a big announcement last fall where they were talking about using machine learning in RankBrain, but actually that's a laggard; that's kind of one of the last pieces of the puzzle. They've been using machine learning in many other aspects of their business, including image search, speech recognition, and translation APIs. Think about YouTube search and suggested videos: there are no links, there are no keywords to use, so they have to be using something else to figure out what to suggest.
In 2013, Matt Cutts did an interesting Google Webmaster video on the porn vertical. Basically what he said was that there's a difference between popularity and authority, and what he pointed out was that high-value sites typically don't link to porn sites because they just don't want to be associated with them. So Google needed to come up with some other algorithms to figure out how to rank this stuff. He didn't say it specifically, but my theory is that however Google ranks porn, that is the future of SEO.
Google AdWords, this is actually my background: the platform has been entirely based on machine learning since inception. When you buy a keyword on AdWords, within 200 impressions it figures out the ad targeting by auditioning the keywords and ads that you have against different types of queries, and it figures that out on the fly; there are no links and no on-page-content SEO kind of thing. The Facebook News Feed, I'm sure you know EdgeRank is dead; that newsfeed algorithm is all machine learning. You've probably noticed that if you engage with certain people and certain types of content, you'll actually see more of that in your feed tomorrow, and vice versa. Even stupid Twitter ads are machine learning based. So I think machine learning systems are pretty much everywhere, and they're finally coming to Google Search algorithms, and I think in the community there's a good amount of debate around what this means for our day-to-day. A lot of that debate is around semantics: is this a core algorithm change, or is it just a side validation signal that Google uses?
And my point is just to say: A, it's definitely impacting rankings. And B, if it is impacting rankings, who cares about the semantics? We ought to care about this. So basically what I'm suggesting here is: come with me if you want your website rankings to live. I've always wanted to say that.
All right, I know what you're thinking: "It's a lot to take in here, what the heck?" I've been sent from the future to warn you about future things, and you're wondering, "Why didn't they just send Rand Fishkin, if this has to do with the SEO rebellion in the future?" Well, actually, unfortunately Rand was unavailable. You see, next year he actually writes this really awesome book, and so in 2018 he was on an international movie tour; they actually turned it into a movie. It's a great book by the way, I won't spoil the ending. But I'm just kidding guys, this is a joke. But I'm sure you have a lot of questions, like, "What does all this mean?" And who the heck am I to be sent back from the future to warn you about this?
So I just want to introduce myself really quickly. My name is Larry, I'm the founder of WordStream. I started this company about seven or eight years ago, initially working out of a bakery because it had free Wi-Fi and unlimited Diet Coke refills. So probably not unlike most of you: kind of an individual proprietor, a one-man show doing Internet consulting work. I later turned that into a software company; my background is in electrical engineering. Today I have a company with over 200 people, and we manage almost a billion dollars of ad spend for over 10,000 customers worldwide. Prior to the marketing stuff, I was doing software engineering for various server products. And this is kind of relevant: I spent a couple of years of my life building long-tail keyword research tools, including related-keyword tools, and a lot of this machine learning and query interpretation stuff is actually an area of interest for me personally, from an academic perspective. So that's one of the reasons why I'm actually presenting on an SEO topic, as opposed to just a pure AdWords topic, today.
All right, so enough about me. Back to our story: what's RankBrain? According to Google, it was primarily used on long-tail queries, however now it's actually used on all queries, "and it's query interpretation that changes rank." That's a direct quotation. So how does it work? That's the million-dollar question. They don't actually tell you; obviously it's not their job to tell us how this works, so we're going to have to figure this out. But basically, here's my theory of how this works, and this is not an official Google diagram, okay, this is Larry's diagram. But I'm from the future, so you can trust me. I think what happens is, someone does a query on Google; originally it was long tail, but now it's any query. Basically, the system needs to make a guess and figure out: what is the user actually looking for? What content? And then there's a test: did the results have what the user wanted, yes or no? If yes, RankBrain says, "Great, next time I see stuff like this I'm going to send people here." And if the answer is no, it's like, "Rats, next time I'm going to try something else."
All right, so they're saying that RankBrain is the number three most important ranking signal in SEO. And if this is true, you would expect to be able to see evidence of some of this rank manipulation upon doing some data analysis; how could we not see it if it's so important? And so over the last couple of months I've actually been chasing after RankBrain, trying to isolate this signal amongst the 200 or whatever number of organic signals they use to rank stuff. One of the ways that you can do this is called a differential diagnosis.
So when they rolled this stuff out initially, they said, "Oh, we're only using it on long-tail terms, and we're not using it on head terms." Makes sense, right? If you're going to roll out something new, why would you roll it out on the stuff that you have the most confidence in, where it could make things worse? It's better to use it on the stuff that you're not 100% sure on, so there's less risk and more potential upside. So doing a differential diagnosis, kind of isolating the two, I was able to analyze the behavior of long-tail keywords versus head terms for organic search. And what I found was that the organic click-through rates for long-tail queries were dramatically higher than for head terms at any given position. This actually makes a lot of sense, because what machine learning wants to do is not just give you 10 answers that you have to scroll through; it wants to give you one answer that's the correct answer. You see what I'm saying? So what we're seeing, as I track these keywords over time, is a steepening of the click-through rate curves, because rather than giving you 10 mediocre answers, they're piling the good answers on at the top. So I thought that was interesting.
But you might be thinking, "Well, wait a minute Larry, doesn't it make sense that long-tail keywords have higher click-through rates: higher commercial intent, clearer query intent? This is normal." That's a good argument; it's very hard to figure out the answers to this stuff. But basically, I just ran the same keywords in AdWords to figure out what the click-through rate characteristics looked like for long-tail keywords versus head terms there. And it looks pretty different; it's the opposite. In the top positions, the head terms and the long-tail keywords have similar click-through rate characteristics, and they diverge as you go down the page. So it's basically the opposite of what I see in organic search.
So guys, it's confusing here. The problem is that click-through rates and rankings are codependent variables. It's like: where did the Terminator come from? Well, it came from Cyberdyne Systems. Well, how did Cyberdyne create the Terminator? Well, they used a part, like a CPU, from a robot that came from the future. Well, how is that possible if it wasn't created yet? That's a chicken-and-egg problem: what came first, the click-through rate or the organic ranking? And the way you resolve these chicken-and-egg problems in math is you do something called a relative analysis. You back out the expected click-through rate for any given position; obviously, the higher you are on the organic search results page, the higher your expected click-through rate. So by measuring only the extent to which a page-and-query pair beats, or is beaten by, the expected click-through rate, I can figure out a relative click-through rate that's normalized by position. And basically what I've discovered here is that, on average, for every 3% you beat the expected click-through rate, your ranking improves by one spot. You see what I'm saying? So if you're beating the expected click-through rate for a given spot, you're likely to show up in those one, two, three, four, five, six spots. If you're falling below the expected click-through rate, it's not that there's a penalty; it's just that you don't get any bonus points. And if everyone else is getting these bonus points, it's like, "Wait a minute, I didn't get any bonus points," so it looks like you're being pushed down; really, you just didn't get any ranking bonus.
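As a back-of-envelope illustration of that relative analysis, here's a small Python sketch. The expected-CTR table below is made up for the example (in practice you'd derive it from your own data), and the "+1 spot per 3 points" figure is just the average rule of thumb from the talk, not a published formula.

```python
# Illustrative expected CTR at each organic position (invented numbers).
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def relative_ctr(position, ctr):
    """How far a page beats (or misses) the CTR expected at its position."""
    return ctr - EXPECTED_CTR[position]

def estimated_rank_boost(position, ctr):
    """Apply the talk's observed average: +1 spot per 3 points of CTR beaten."""
    return relative_ctr(position, ctr) / 0.03

# A page in spot 3 with a 16% CTR beats expectation by 6 points,
# so this model predicts roughly a two-spot improvement.
print(round(estimated_rank_boost(3, 0.16), 2))  # prints 2.0
```

Note that the model only awards bonus points: a page below expectation gets a negative number here, but per the talk that just means "no bonus," not an explicit penalty.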
All right, so that's my theory and I'm sticking with it: exceptionally high click-through rate, long-tail keywords get this additional ranking boost. And this makes a lot of sense, because in machine learning in general that's the pattern you would expect: instead of spreading clicks around equally amongst the pages, it becomes increasingly winner-take-all; they just want to provide the answer.
And so the question here, I guess, is: if I'm right, what should SEOs and content marketers be doing to prepare ourselves for these changes that are coming in terms of RankBrain and other machine learning based updates to the algorithms? Now, the SEO community is all over the place on this. There are a bunch of people who point out that Google isn't confirming any of this; they're not denying it, but they're not confirming it either. So those people say: just do nothing. But my point is merely that it's not the job of Google's PR spokespeople to tell us how this actually works; that's our responsibility.
The second thing is, there are a lot of people who think that this thing is so complicated that there's absolutely nothing we can do to optimize for it, and I think this is totally wrong; this is a case of really smart people maybe not fully understanding the problem. Basically what I'm suggesting here is, if you're with me so far that there's this machine learning system auditioning your content, what you need to do is make it so that every time these algorithms audition your content, you crush those auditions, you pass the test, and Google then associates those queries with your content from that day forward.
The challenge, unfortunately, is that these new machine learning algorithms are a little tougher to game than previous algorithms; you blow them up and they just keep coming back. And so that's where some of these weapons come into play. I've brought back with me five SEO weapons that you can use to survive SEO Judgment Day. Are you ready for this?
So our number one SEO weapon from the future is my donkey and unicorn detector. Now, I know Joanna was talking about unicorns previously in terms of billion-dollar companies; I'm going to be talking about unicorns in a different context: I'm talking about remarkable stuff. So, how do you figure out if your content is a unicorn or a donkey? Essentially, Google doesn't make it very easy. Basically, what you need to know is: what's the click-through rate for the organic content that you're publishing? First of all, it's very difficult to know what that click-through rate even is. And secondly, it's very difficult to know whether that click-through rate is any good. Like if I tell you, "You've got a 4% click-through rate at position 3.8," is that good? Or is it terrible? Hard to know, right? So, don't worry, here's how you figure it out. You go into Search Console, Webmaster Tools, and you download your query data; you can check off click-through rates, average position, impression data, and all this stuff. So you download that data; they'll only give you a thousand queries, but it's still better than nothing. Now you have your data in terms of click-through rates and positions, but you still have to figure out whether it's any good. So how do we figure out if those click-through rates are great or if they're donkeys? There are a lot of studies out there that tell you what the average click-through rates should be for different positions.
All right, this is one from Moz. And I'm suggesting that you ignore these; you do not want to use these. Why? Because average click-through rates are totally dependent on your niche. If you're in a shopping niche where Google has bought out the page with these ridiculous, huge, compelling ad units taking up the entire top half, that's going to change the organic click-through rate characteristics considerably versus a different niche that doesn't have them. So what I'm suggesting is: don't even bother using those studies; instead, perform a relative analysis of your own content. You've downloaded your query data, which includes the click-through rates and the average positions; dump it into a spreadsheet, so you're only looking at your own data, and ask: of my own content, what is beating the average? That red line on the plot is just an exponential trend line I fitted. And from this I can see, of the content on my own site, which are the donkeys and which are the unicorns. The unicorns are the stuff that's really killing it, like the top 10%, in terms of beating the expected click-through rate for any given position.
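If you'd rather run that relative analysis outside a spreadsheet, here's a hedged Python sketch. It assumes your Search Console export gives you (page, average position, CTR) rows; the exponential trend line mirrors the red line described on the slide, and the 10% cutoffs follow the talk's rough definition of unicorns and donkeys. The page names and numbers in the usage example are invented.

```python
import math

def classify(rows, top_frac=0.10):
    """rows: list of (name, avg_position, ctr). Returns (unicorns, donkeys)."""
    # Fit log(ctr) = a + b*position by least squares -- the same thing a
    # spreadsheet's "exponential trend line" does.
    n = len(rows)
    sx = sum(p for _, p, _ in rows)
    sy = sum(math.log(c) for _, _, c in rows)
    sxx = sum(p * p for _, p, _ in rows)
    sxy = sum(p * math.log(c) for _, p, c in rows)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    expected = lambda p: math.exp(a + b * p)

    # Rank every page by how far its CTR sits above or below the trend line.
    scored = sorted(rows, key=lambda r: r[2] - expected(r[1]), reverse=True)
    k = max(1, int(n * top_frac))
    unicorns = [name for name, _, _ in scored[:k]]
    donkeys = [name for name, _, _ in scored[-k:]]
    return unicorns, donkeys

# Invented data: eight pages on a typical decay curve, one overperformer
# ("star") and one underperformer ("dud").
rows = [("page-%d" % p, p, 0.2 * math.exp(-0.2 * p)) for p in (1, 2, 5, 6, 7, 8, 9, 10)]
rows += [("star", 3, 0.22), ("dud", 4, 0.045)]
unicorns, donkeys = classify(rows)
print(unicorns, donkeys)  # star is the unicorn, dud is the donkey
```

Because the trend line is fitted to your own rows, the classification is relative to your niche, which is exactly the point of ignoring industry-average CTR studies.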
Your donkeys are the opposite; those are those pieces of junk at the bottom of the page there. All right. So, donkeys versus unicorns, what's the difference? People think, "Oh, they kind of look the same." They're very different, guys. Content unicorns are actually six times better than donkeys. Here I've plotted the average click-through rate in black; you can see that the unicorns tend to do two times better than average in terms of click-through rate for a given position, and the donkeys are three times worse than that average. So basically, there's a lot of upside here if you can convert a donkey into a unicorn, because you would be increasing your click-through rate by six times. So even if you don't believe my theory that high click-through rate listings get bonus points, you should still do this, because you'll get six times more clicks, do you follow? But if I'm right, then in addition to getting six times more clicks, you'll actually get better rankings as well. And so the question is, how do we do this? We've detected our donkeys, the bottom 10% of our content, the stuff at the very bottom of that chart.
This brings me to my number two SEO weapon, which is a donkey-to-unicorn converter. Okay, so guys, you see these SEO title tags, "big data solutions," they all suck. Do you know how I know they suck? Because an SEO headline, a keyword-rich headline where you put in the keyword and the secondary long-tail keyword and all that ranking junk in the title, is the same thing as using dynamic keyword insertion in AdWords. And in that scenario we actually have a lot of data; I've got like a billion dollars of data here, and I can do this analysis.
So I looked at hundreds of millions of ads and tried to figure out: what are the ads that generated really, really high click-through rates? It was very interesting. The keyword-rich headlines were like an upper-middle-class express ticket, if you will. If you just want to be slightly above average, the 75th or 80th percentile, then you should use dynamic keyword insertion; you should use an SEO title with those keyword-dense headlines. Unfortunately, when I zoomed in on just the very top outliers, the unicorns, the top 5%, top 1% of ads, what I found was that dynamic keyword insertion in AdWords is actually less likely to produce unicorns.
So basically, I thought this was interesting: what is it about these non-dynamic-keyword-insertion ads that produces such ridiculously high click-through rates? I'm not talking 5% higher, I'm talking six times higher click-through rates, like 600%. And so, I just downloaded a bunch of data, a couple of billion dollars of data, went through this stuff looking at just the top 0.00001% of ads, and I'm telling you what I found in common here.
So the first thing: it wasn't that they were using keywords in their titles, it was that they were leveraging emotional triggers. And what type of emotional triggers? Well, there were 10 of them, sorry, 9 of them: fear, laughter, amusement, joy, etc. So step one, when you're writing these headlines that need to get five or six times the average click-through rate, you have to pick one of these emotions that make people click like crazy. The second step of this headline exercise is not to write the headline from the perspective of the company. Don't make it like the company is speaking to the user; make it so that one of these personas is speaking to the user, like the bearer of bad news, the hero or the villain, the comedian, or the feel-good friend. The third thing they did was employ a commonly used viral template for headlines.
So the format: emotional hook, content type, topic. "Four Awesome SEO Strategies to Defeat RankBrain." See, I used this template for my title, and look, it sold out, so that means it's working. Just kidding. Guys, the point is you don't have to do "RankBrain, RankBrain algorithm"; those old title-tag formulas are stupid, don't do that. You don't even have to use the keyword, because these algorithms are really smart at figuring out what the topic is about.
Now, because I'm from the future, I just wanted to share a little bit of information. There's been a lot of debate in the SEO industry for many, many years trying to figure out: do SEO rankings have anything to do with Facebook shares and Twitter shares? Google always denies it, but whenever you look at your stuff on Twitter and Facebook, you always see the pattern where your most-shared stuff is doing the best in organic search. Guys, there is a relationship, but it's an indirect relationship: the same emotions that make people want to share stuff are the same emotions that make people want to click on things like crazy. So that's the relationship: the topic is really exciting, and so it generates a high click-through rate and a high share rate, which generates the high rankings. And that was proven on April 25, 2018.
All right. So guys, how do we come up with these unicorn headlines? I've given you the formula, the three-step formula. A unicorn, we're talking top 5%, top 10%; it's like playing a lottery where the odds of winning are 1 in 10. The problem is we're biased: we think all of our ideas are unicorns, and really they're not; they're like narwhals or rhinoceroses or hummingbirds, or some other animal that looks like a unicorn but really isn't. So we need to figure out what we're dealing with when we try these new headlines: is it a unicorn, or is it something that just looks like a unicorn, like a unicorn fish? And so, the idea here: if the odds of winning are 1 in 10, you have to audition 10 different headlines. You can do more if you want, it'll improve your odds, but 10 is the minimum. And I just have a question for you guys: if you see these headlines, how many headlines do you see here? Barry, how many do you see? No, it's one. These are the same headline over and over with different punctuation. See, it's just comma, dot, dot, dot.
So when I say you need to write 10 different headlines, I'm saying it's not just mucking around with the punctuation or the capitalization; you really need 10 different headlines, 10 different emotional hooks, 10 different personas, 10 different triggers. So I've done this here: I've come up with 10 plausible unicorn candidates, but I still don't know which one is the unicorn. The worst thing you could do is change the title tag of your organic content every week, because Google will be like, "What the heck is going on with this page? It just keeps changing its title; it must be dynamically generated or something." So I'm not suggesting you willy-nilly change the title every week. Instead, just use AdWords: you create an ad group and upload the 10 different possible titles, and bid to a position, like the top spot or the number two spot, so that you're always comparing apples with apples. And you don't necessarily have to pay for the most expensive markets; like if you're doing business in New York City, Ireland is actually substantially cheaper than New York.
All we're doing here is basically a user survey, so bid wherever it's cheap. This is a very important concept: use broad match for keyword targeting. Why? Broad match is basically RankBrain. Broad match is this thing in AdWords where you put up a keyword and it tries to figure out all the different possible related queries that keyword should be targeted to. So by using broad match you'll actually get a very good sampling of the types of queries that RankBrain would audition your content for. All right, so are you with me so far?
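The auditioning step above might look something like this in code. To be clear, the headlines, numbers, and the `pick_unicorn` helper are all invented for illustration; the 200-impression floor echoes the AdWords figure mentioned earlier in the talk, and the method (same broad-match keyword, comparable position, pick by CTR) is what the talk describes.

```python
def pick_unicorn(results, min_impressions=200):
    """results: list of (headline, impressions, clicks).
    Ignore headlines without enough data yet, then return the
    highest-CTR headline and its CTR."""
    tested = [(h, clicks / imps) for h, imps, clicks in results
              if imps >= min_impressions]
    if not tested:
        raise ValueError("no headline has enough impressions yet")
    return max(tested, key=lambda t: t[1])

# Invented audition data from one ad group, all bid to the same position.
results = [
    ("7 Enterprise SEO Tools Reviewed", 1200, 24),
    ("We Tested 7 SEO Tools. One Clearly Won", 1150, 92),
    ("SEO Tools: Big Data Solutions", 1300, 13),   # the keyword-stuffed donkey
    ("New Headline Still Gathering Data", 90, 9),  # too few impressions to judge
]
headline, ctr = pick_unicorn(results)
print(headline, round(ctr, 3))
```

Once the winner is clear, you do the one-time title swap on the organic page rather than iterating titles live in the search results.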
Larry's Donkey-to-Unicorn Converter, in summary: we find the donkeys, we write 10 different headlines, we test those headlines on AdWords, we figure out which one is the winner, and then we do a one-time swap of the headline, from the donkey to the unicorn. That's basically what you do. And just to prioritize your efforts here: it's usually the case that maybe 5% of your content produces 50% of your traffic. Like for WordStream, I've got 1,600 articles that I've written over like six years, and 50 of them generate half the traffic. So what we did was prioritize the optimization efforts on those top 50 articles, which is a lot easier than doing it on 1,600.
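That prioritization step is just a Pareto cut, which is easy to automate. Here's a minimal Python sketch; the page names and traffic counts are invented, and `top_pages_for_share` is my helper name, not a real API.

```python
def top_pages_for_share(traffic, share=0.5):
    """traffic: dict of page -> visits. Return the fewest pages (biggest
    first) whose combined visits cover `share` of the total."""
    total = sum(traffic.values())
    chosen, covered = [], 0
    for page in sorted(traffic, key=traffic.get, reverse=True):
        if covered >= share * total:
            break
        chosen.append(page)
        covered += traffic[page]
    return chosen

# Invented traffic numbers for five pages.
traffic = {"guide": 400, "glossary": 300, "post-a": 120, "post-b": 100, "post-c": 80}
print(top_pages_for_share(traffic))  # the two big pages cover half the traffic
```

Run this over your full analytics export and the returned list is your optimization shortlist, the "top 50" instead of all 1,600.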
All right, a note of caution here: don't use those spam sites like CrowdSearch.me, these bots that will click on your organic search rankings to artificially increase click-through rates. They actually work pretty well right now, but the problem is, there's this notion of a log file: Google can remember where these clicks came from, they log it, and so they could conceivably go back in time looking for the spam at some point in the future and just eradicate it all after the fact. So I wouldn't be using stuff like CrowdSearch.me, especially since that's exactly how their ad-fraud systems work in AdWords. I'm not suggesting that they'll use the exact same technology to fight click fraud on organic as on paid, but they could definitely be inspired by those ideas, and that technology has been around for over 15 years.
All right, guys, we've got RankBrain by the balls, this is awesome. Oh no, they've got this backup system; it's reactivating itself. We're going to need even bigger guns. So this brings me to my number three SEO weapon from the future, which is an engagement rate donkey and unicorn detector. Basically, the backup system they have in place has to do with something called dwell time. What Google is doing is not just looking at the click-through rates; they also want to know whether the user then goes back to the Google Search results page and clicks on something else because the first result wasn't what they were looking for. Now, we can't measure dwell time ourselves, because it's measured off of our site; it's measured on the Google Search results page.
But what we can do is measure user engagement metrics on our own site that are proportional to dwell time. They're not identical, but they're proportional, like bounce rate. You would expect that if you have a low bounce rate, it's not likely that people are jumping right back to the search results page, you follow? So I crunched a bunch of numbers on thousands and thousands of queries, and what I can tell you is that if you have a good, low bounce rate, like 74%, 75%, 76%, then you're eligible: more often than average, you tend to find that you're in the top four spots. Whereas if you have a really, really high bounce rate, like 90% or something, then you're less likely to show up in those top spots. Now, of course there's going to be some relationship between bounce rate and organic position, but my point is, this is a very unusual relationship. See that kink in the graph? That's what mathematical people call a discontinuity, and it really looks algorithmic in nature to me, where you have these two different curves that are discontinuous.
Another thing I noticed is that if your bounce rate is low enough, you're eligible for those good, top positions. But even if you make the bounce rate lower, like 73% or 72%, it's not like you get any additional ranking points. So it's more like a pass/fail: as long as it's good enough, you're eligible for those top spots, whereas if it's terrible, you become less eligible. Beating the bar by even more doesn't seem to help.
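That pass/fail shape, with its kink, can be sketched as a simple piecewise function. This is purely a model of the pattern described in the talk: the 76% threshold is the rough figure mentioned above, and the decay slope below is made up for illustration.

```python
def top_spot_eligibility(bounce_rate, threshold=0.76):
    """Toy model of the observed pattern: below the threshold you're fully
    'eligible' for top spots (no extra credit for going lower); above it,
    eligibility decays. The discontinuity in slope is the kink in the graph."""
    if bounce_rate <= threshold:
        return 1.0  # pass: beating the bar by more adds nothing
    # fail: eligibility falls off the further you are above the bar
    # (the 4.0 decay slope is illustrative, not measured)
    return max(0.0, 1.0 - 4.0 * (bounce_rate - threshold))

print(top_spot_eligibility(0.72))  # same result as 0.76: pass/fail, not a gradient
print(top_spot_eligibility(0.90))  # well above the bar, much less eligible
```

The point of the sketch is the shape, not the numbers: a smooth correlation between bounce rate and rank would have no kink, whereas a threshold test produces exactly this kind of discontinuity.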
So about dwell time: we know that Google measures dwell time because in a blog post on March 10th, 2011, they talk about this new feature where, if you searched for something, clicked on a link, and hit back within like 5 or 10 seconds, this thing that says "block all these results" would magically appear. And they said that they were going to use this as a ranking signal. The feature is no longer with us, but it at least proves that they're tracking the metric. I just wanted to back this up with a little additional research. Time on site is another user engagement metric that we can track, and what I found was that as long as you have good enough engagement, you're eligible to show up in the top six positions; any worse than that and the rankings gradually die off. Again, it's just a pass/fail. And I would expect these metrics to be very different for different verticals, because in e-commerce, for example, a perfectly good visit might be much shorter.
All right, so the key takeaway here is task... I believe that these algorithms are using task completion rates. So bounce rates, time on site, dwell times, they're kind of a secondary backup metric to validate whether or not the high click-through rate was warranted. Because I could trick people into clicking through to my site by offering free beer or free kittens or free iPads, but if there are no free iPads or kittens to be had, they'll just bounce right back. So that makes a lot of sense intuitively, and in fact that's what we're seeing in our data here.
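One hypothetical way to model that clickbait check: a high click-through rate only counts if engagement backs it up. The expected-CTR and bounce-rate thresholds here are made-up numbers, not anything Google has published:

```python
# Hypothetical "was the click warranted?" check, approximating a
# task-completion signal: beat the expected CTR *and* keep visitors.
# Both default thresholds are invented for illustration.
def ctr_validated(ctr: float, bounce_rate: float,
                  expected_ctr: float = 0.04,
                  max_bounce: float = 0.76) -> bool:
    """True when a page out-performs expected CTR without bouncing users."""
    return ctr > expected_ctr and bounce_rate <= max_bounce

honest_page = ctr_validated(ctr=0.06, bounce_rate=0.70)  # passes both checks
free_ipads = ctr_validated(ctr=0.12, bounce_rate=0.95)   # great CTR, fails backup
```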
So getting back to our engagement rate donkey and unicorn detector. Guys, this slide takes forever, I'm like, "I'm not getting any reaction from you guys." It's a handheld device, you can use it, and it's super easy to use. Basically, I've mapped out the distribution points for a couple of billion dollars of ad spend, and what I can tell you is the median conversion rate is around 2%. The unicorns tend to be 5 times better, 5 or 6 times better, so 11%; that's the top decile. And no matter what industry you're in, whether it's e-commerce or finance or whatever, that rule of the unicorns being 3 to 5 times better than the donkeys held true for all industries.
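The detector itself can be sketched as a percentile cut: sort your landing pages by conversion rate and flag the top decile. The sample rates below are invented; the talk's figures were roughly a 2% median and 11% for the top 10%:

```python
# Illustrative donkey/unicorn detector: flag the top decile of pages
# by conversion rate. Sample rates are hypothetical.
def unicorn_cutoff(conversion_rates: list[float]) -> float:
    """Return the conversion rate that marks the top 10% of pages."""
    rates = sorted(conversion_rates)
    return rates[int(len(rates) * 0.9)]

rates = [0.01, 0.015, 0.02, 0.02, 0.025, 0.03, 0.04, 0.05, 0.08, 0.11]
cutoff = unicorn_cutoff(rates)
unicorns = [r for r in rates if r >= cutoff]  # the pages worth cloning
```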
So basically, I think what we want to do is make it so that our content, our sites, convert at these unicorn levels rather than the donkey levels. And the way we're going to convert donkeys into unicorns, well, first of all, we're already halfway there. Remember we were talking about raising click-through rates; generally, if you raise your click-through rates organically, you'll also raise your conversion rates. Why? If you can get people excited about whatever it is you're offering, that excitement generally carries through to a purchase or sign-up. It's the same idea as a landing page: if you can get people excited about it, more of them will actually convert. Well, that concept also applies upstream to the organic listing.
I can tell you how not to do this. We looked at these high-performing conversion rate unicorns, the top 10%, and asked what the common denominator was. It had nothing to do with the fonts, colors, spacing, images, all this stuff. I know you've probably read some story like, "Once upon a time there was a landing page, and we changed the font color, and boom, a 5% increase in conversions." No, no, no, that's not how it works. What really happens is that the early lead you see actually disappears over time. Why? The reason it worked initially was that it was new and exciting, and the reason it stops working is that it's no longer new and exciting anymore. This is what we call fatigue. So basically, I think marketers discount this idea; they think the gains will stick forever, but they don't.
Instead, what I'm suggesting is that you try big changes rather than going after these little changes. If all you're doing is mucking around with the lipstick-on-the-pig kind of thing on your existing offering, all you're doing is trying to summit donkey hill. What you really should be doing is just throwing it out... I'm not talking about a 5% increase in conversion rate; I'm talking about a 500% increase in conversions. There's no way to summit unicorn hill by making dumb tweaks to your donkey offer. What you have to do is actually change your offer dramatically.
Can I just give you an example from my own company, WordStream? We're not idiots, we've been around for six or seven years, but for the first three years of the company's life, I was offering a free trial of WordStream, our PPC management software, and it wasn't terrible; it had like a 2% or 3% conversion rate. But the problem was it took a lot of effort and time. People had to log into the system and learn all the tips and tricks and all this stuff; it could take over an hour to figure out. And so we ended up with conversion rates in the low single digits. For like three years we tried mucking around with the videos, and the copy, and the user experience. Then the idea hit me that what we should really do is flip this thing around: rather than pushing people into a trial of some software, just give them an instant report card of how they're doing, comparing their click-through rates, cost per conversion, and cost per click to other companies in their industry. Small business owners are competitive, they want to know how they're doing, and so it motivates people, it pulls people through the funnel. It's also more appealing; more people are interested in this. And the net result of changing the offer rather than just tweaking the donkey was a 40% conversion rate on this thing, and it's persisted over time.
I'll tell you the truth: WordStream would not exist if it wasn't for that change. It only took me 90 days to write the software, and it changed the fortunes of the company from a single-digit-million company to a multi... a much bigger company.
All right guys, RankBrain has been terminated, congratulations. What are we going to do now? Are we just going to go home and call it a day? No, we're going to do some more time travel to make sure that these algorithms don't mess with us in the future. So we're moving on to my number four SEO weapon from the future, which has to do with Facebook Ads and the Google Display Network. How can we use time travel to influence click-through rates and conversion rates?
So guys, let me give you a little explanation of how advertising works. The way it works is we promote some inspirational, memorable piece of content to our target audience. The people who consume that content don't necessarily take action right away, but they become biased; they'll have heard of us. Later, step three, when the need arises for the products and services we're selling, they'll do one of two things. Either they'll do a branded search for the products or services you're selling, in which case you've won automatically, because branded searches have like 40% or 50% click-through rates. Or they'll do an unbranded search for your solution, but when presented with 7 or 8 or 9 or 10 options, they'll recall your advertising and be biased towards choosing you rather than the other people on that list. And in fact we've done a lot of research on this, and what we found was that brand affinity and brand awareness can dramatically increase both click-through rates and conversion rates, by as much as 2 to 10 times. It's not just us doing this research; obviously Facebook is doing this type of research too, and they're showing that if you run Facebook ads to your target market, it actually increases paid and organic search ROI. Just a warning: Facebook Ads is also machine learning, so you can't just promote junk ads; they actually have to be great, inspiring, interesting things in order to show up. But I just wanted to give you a couple of examples of how this works in the real world.
This is what I do for my business; maybe you've seen some of my ads. I'm targeting people who might have an interest in the software that I'm selling but aren't yet in the market for it, you see what I'm saying? Maybe they're using HubSpot, or maybe they're in the search engine marketing industry, or maybe they're a business owner interested in entrepreneurship. The usual types of people who are likely to search for my stuff in the future. Then there's demographic targeting; one option has to do with an anniversary within 30 days. Say you're a jeweler: that would be an example of an audience you could target on Facebook Ads who are likely to search for rings and all that stuff in the near future, so you could target the area around your shop. And the neat thing about this is it's ridiculously cheap; you can target a thousand people for $1 or $2.
Another example is behavioral targeting. You can target people who've purchased certain things. This one is limited here in the UK and in Ireland, but basically it looks at credit card purchase data to see what people have bought. So maybe there are certain things you're complementary to or competitive with; like if you're selling boats, let's target people who bought anchors.
If you're selling coffee machines, let's target people who buy coffee, that kind of stuff. Now you might be wondering, "What the heck, how does Facebook know that I bought coffee or that I bought a boat?" Well, Facebook knows your email and your phone number, and your credit card company knows your phone number and email as well. Remember when Facebook said, "Can you please set your mobile password-recovery phone number and your secondary email so that you never get locked out of Facebook?" They're just trying to get all your contact information so they can match you up with these data providers, and so they end up having all this information on what you're purchasing. So I can go after people who are business travelers if I'm a tourism company or something like that.
All right. A final one that I wanted to talk about is this notion of custom affinity audiences in the Google Display Network. I think this is really interesting because it's based on search history. Maybe there's a set of websites your audience goes to; for me it would be people who visit Search Engine Land, the AdWords blog, or Search Engine Watch, so I can target display ads to the people who visit those sites. I'm just biasing them so that they'll remember me, so that when the need arises they'll consider my offer.
If you want to expand beyond these ideas, you can upload your customer lists into Facebook and Twitter and clone those audiences. Rather than you having to decide which behaviors, demographics, and interests are relevant to your target market, it will just figure it out by analyzing your list and finding people who are 99% similar in terms of demographics, interests, and behaviors. All right, we're almost done here. So that was time traveling to the past; we're now able to time travel to the future to make sure that not only will they be biased towards us on their inaugural search, but every subsequent search goes our way too, do you follow? And the way to do this is remarketing. "I'll be back." I've always wanted to say that.
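The audience-cloning idea can be sketched as a similarity search: score each prospect's interest vector against the average profile of your uploaded customer list and keep the closest matches. All names, feature dimensions, and numbers here are hypothetical, and real lookalike systems use far richer signals:

```python
import math

# Rough sketch of lookalike-audience cloning via cosine similarity.
def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

# Interest dimensions (invented): [marketing, entrepreneurship, coffee]
customer_profile = [0.9, 0.8, 0.1]      # average of the uploaded customer list
prospects = {
    "alice": [0.85, 0.75, 0.2],         # resembles existing customers
    "bob":   [0.05, 0.10, 0.95],        # mostly just likes coffee
}
lookalikes = [name for name, vec in prospects.items()
              if cosine(vec, customer_profile) > 0.95]
```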
This is one of my company's... I love Top Card, I don't know if you guys watch Shark Tank here, if it's popular, but basically they did a deal with Mr. Wonderful last fall, and they sell these three-dimensional pop-up cards. This is a unicorn pop-up card; that's actually my idea. Basically, what we do is make it so that when people visit their card site... because card purchases are unpredictable, it's like, "I don't know when your birthday is," or, "I don't know when your mother's birthday is." So we want to make sure that they just remember the brand essence, and so we put together this saturated-color video with a lot of emotional triggers, kind of showcasing their products. What it really does is bias people to remember what they have to offer, and it's remarkably cheap, because video ads have ridiculously high engagement; we're paying like half a penny for these video views. All right, so that's our ad targeting stack powered by time travel; you can download the slides later.
A final note here: it's this notion of deleting your bad neighborhoods. Google has said that RankBrain is a ranking factor. Are you following? Google says RankBrain is a ranking factor. It follows that if it's a ranking factor, there need to be certain metrics Google can use to do relative comparisons of competing documents on different sites; otherwise, how can they rank stuff? Meaning there has to be some kind of RankBrain score, for lack of a better term. My guess is... I don't have a ton of data on this, but my guess is that it's based on your normalized click-through rates and engagement rates, averaged at the domain level as well as at the page level. And if that's true, that's actually how it works for all the other, older signals like PageRank and the other systems they used to use. It's also how AdWords works: there's a keyword-level quality score and an account-level quality score.
So basically my suggestion here is, if you have these neighborhoods that are terrible, bad neighborhoods with really, really high bounce rates and low click-through rates, just delete them. They're not really helping you because they're not generating much traffic or conversions to begin with, but it's likely that they're being averaged into some kind of domain-level score, and you're better off just pulling the plug on those, in my opinion.
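The arithmetic behind that argument is simple to demonstrate: if some domain-level score averages per-page engagement, pruning a low-CTR, high-bounce page that brings little traffic lifts the average. The scoring function and page data here are hypothetical, not Google's:

```python
# Sketch of the "delete your bad neighborhoods" math with invented data.
pages = {
    "/unicorn-guide": {"ctr": 0.08, "bounce": 0.65},
    "/popular-post":  {"ctr": 0.05, "bounce": 0.72},
    "/dead-weight":   {"ctr": 0.01, "bounce": 0.95},
}

def domain_avg_ctr(site: dict) -> float:
    """A toy stand-in for a domain-level engagement score."""
    return sum(p["ctr"] for p in site.values()) / len(site)

before = domain_avg_ctr(pages)
# Prune pages that are both low-CTR and high-bounce.
pruned = {url: p for url, p in pages.items()
          if not (p["ctr"] < 0.02 and p["bounce"] > 0.90)}
after = domain_avg_ctr(pruned)   # higher once the donkey page is gone
```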
All right guys, congratulations, mission complete here. We've defeated the robots, let's hear it for humanity. I just want to summarize a couple of key points here, what does all that mean? We've been chasing robots all this evening, what the heck does it all mean for us as marketers? And basically I think what it means is that SEO is changing in a very profound way, more profound than ever before. The old SEO bag of tricks and tactics that we have isn’t working anymore.
Here's one from Rand Fishkin: eight old-school SEO practices that are no longer effective. My friend Marcus Tober over at Searchmetrics conducted research showing that these old SEO tactics are actually making things worse. What he's saying is, if you have the keyword in your title, you're actually less likely to rank; he's showing negative correlations for that in certain verticals, and negative correlations with the number of backlinks in certain competitive verticals.
So basically, in summary, let's not just wait for these algorithmic bombs to drop on our sites in the form of updates. Instead, what we should be doing is getting into the virtuous cycle of unicorn land. Remember, even if my theories are incorrect, if you improve your click-through rates and conversion rates, you'll still get more clicks and sales, as much as 6 to 10 times more. But as I've shown in the data here, I'm pretty confident that those improvements will lead to even better rankings, which becomes a virtuous cycle of even more sales and leads for your business. The opposite is the death spiral of RankBrain, other machine learning systems, and the old algorithms: that's where you have garbage click-through rates and garbage engagement rates. That's a terrible situation you should fix anyway, but it's going to lead to even worse situations for you and your business.
Finally, just be open to expanding your inbound marketing efforts to harness the power of online advertising, and remember how advertising really works. We're going to promote content that is inspiring and memorable to make people remember your brand and what you have to offer, and actually create demand, as opposed to just harvesting the existing searches for the products and services of your business. People become biased because they've consumed this content; they don't necessarily purchase right away, but when they do, they'll do branded searches or be biased towards choosing you from an unbranded search page. So that's all I have today, guys. Be a unicorn in a sea of donkeys. Thank you so much.