author
Michael King
iPullRank

About Michael King

An artist and a technologist, all rolled into one, Michael King recently founded boutique digital marketing agency iPullRank. Mike consults with companies all over the world, with brands ranging from SAP, American Express, HSBC, SanDisk, General Mills, and FTD to a laundry list of promising startups and small businesses.

In his Learn Inbound talk, Mike will distil the various elements of Search Engine Optimisation that you absolutely must do to be successful. You will learn how to effectively manage your crawl allocations and internal link equity, optimise content, and capitalise on new machine learning driven technologies to scale.

Key Takeaways

  • How to effectively manage your crawl allocations and internal link equity

  • How to optimise content and capitalise on new machine learning driven technologies to scale

Video Transcription

Michael: Learn Inbound, make some noise, y'all.

Audience: Woo!

Michael: Yes, yes. So I just realized as I was looking at myself on a screen that I'm kinda dressed like Mario in a way. I didn't even realize that until just now. I wanna talk about excitement before I get into this. You know, when I was heading to the airport in New York to get here, I wasn't excited at all. And when I got to the airport, I realized I'd left my wallet at home. And this was 9:48 p.m., or excuse me, around 9:00 p.m. in New York last night, and it was the last flight out, and if I missed that flight, I wasn't gonna make it here. So I was like, "Well, fuck it. I'm just not gonna have a wallet."

And so immediately I was just like, "What do I do?" And I realized, I've got so much love in Dublin, like I have so many friends here. And I reached out to a friend, I was like, "Yo, let me PayPal you some money and meet me at the airport in the morning, so that when I get there I'll have some money." And so on that flight, and then also just this morning hanging out with a lot of these speakers that are good friends of mine, I got more and more excited to be here. So I just wanna say thank you all for being such an awesome city, because I'm very excited to talk to y'all today.

Audience: Woo.

Michael: All right. So I wanna talk to you about SEO of course, and I'm calling this The Player's Guide because this is you, this is Google. And you're gonna face all types of obstacles trying to beat Google, you're gonna go through all these hoops, you're gonna do all these things, and you're gonna be like, "It's gonna be great." And this is gonna happen. Google is gonna be like, "Our princess is in another castle." So I'm not talking about strategy today. You're gonna hear a lot about strategy from a variety of different people. I'm gonna talk to you about things that I would do tactically to make these things work for you from an SEO perspective.

So we're gonna go through things that are common, that clients ask me about, or just, you know, random people ask me about, and how I would approach them. Like, what would we do? So let's start with keyword research. I don't think there's any hacks for keyword research, because my process for keyword research has always been very audience driven. And what we're trying to get towards is understanding who's the user behind the visit, and what is their expectation, what are they trying to do, so we have the right content for those people.

So effectively, you're creating segmented content strategies. How are you speaking to the person when they need what they need, and then how do you effectively make sure your message is the one that they're looking for? So the concept behind that, or the one that I've kinda coined, is called Persona Driven Keyword Research. So we do a lot of audience research to understand, you know, who that user is, what their motivations are, what they care about, how they search. And then we align those searches with the user journey, and we also align those with the right KPIs.

So it doesn't make sense for us to be like, "Okay, we're gonna put out this infographic and we're gonna expect to make a lot of sales from it." No one looking for infographics is ready to buy something. This is a fact. So if you do that, you're not... you know, you're gonna lose your job and you're gonna say that Mike told me to do that, and you're gonna be a liar. So what you should do instead is think about, "Okay, well, a buyer's guide, that's much lower funnel." That's something we wanna measure on sales, and we wanna have the right keywords aligned with it.

So effectively this requires content strategy, it's not just content marketing. Because, you know, if you just make this infographic and you put it out and, you know, you got some shares or whatever, you've done content marketing. But once you've built a system for creating useful and usable content, and then maintaining it and sourcing it and so on and so forth, you've done content strategy. Well, if we're talking about keyword research, I really feel like there's four tools that are still useful to me, because when these keyword research tools give you these ranges, honestly, I don't know what to do with that. I can't show a client and be like, "Oh, yeah, here's a range of what you can expect." They're gonna be like, "So? When are we gonna get the top range? I care about the top range."

So I would imagine some people either average that, or they use the top range...or the upper bound, because that's what the client wants to see. But me, I still wanna use those whole numbers. So SEMrush is good for that. I'm pretty sure we're all familiar with that tool. GrepWords is also a tool that can do that; it's an API-driven tool. There's a variety of different plugins out there for Excel and such where you can pull that data. However, I don't know how often they update. Keywordtool.io will also give you that whole number data, and then so will Ahrefs.

So those are the four tools we spend the most time in for keyword research, but again, we're bringing it back to the audience so we can have a strategy behind what we're doing. So let's talk a little bit about content, because I can't get out of this without talking about content. And the thing that I focus on the most when we're talking about content from an SEO perspective is TERM RELEVANCE, or TF*IDF. For those that aren't familiar with that, it stands for Term Frequency–Inverse Document Frequency.

And so it's this concept that if we have a hundred documents that are ranking for the keyword Basketball, and they also have these co-occurring terms like, you know, Dunking, LeBron James, well, your page that you want to rank for basketball also has to have those terms. So the interesting thing about that is, you know, you've seen many ranking factor studies where this is always one of the top two or three things that people are saying, like, "Hey, this is what matters to ranking." So what I've seen in our experience is that a lot of our clients, when we do this, even if there isn't an increase in authority as in links, when we do the TF*IDF optimization, they shoot up in the rankings pretty easily.
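To make the idea concrete, here's a minimal sketch of the TF*IDF comparison using scikit-learn's TfidfVectorizer. It is only an illustration of the concept, not how Ryte or any specific tool computes term relevance, and the document texts are hypothetical placeholders for the pages that currently rank for your target keyword.

```python
# Minimal TF*IDF sketch: find heavily weighted co-occurring terms in the
# ranking pages that your own copy isn't using yet. Texts are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer

ranking_pages = [
    "basketball dunking highlights featuring LeBron James ...",
    "how to improve your basketball dunking technique ...",
    "LeBron James basketball career and championship stats ...",
]
your_page = "our basketball shoes are built for comfort and style ..."

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(ranking_pages)   # TF*IDF over the ranking set
terms = vectorizer.get_feature_names_out()

# Average weight of each term across the ranking pages
avg_weights = tfidf.mean(axis=0).A1
top_terms = sorted(zip(terms, avg_weights), key=lambda t: -t[1])[:10]

# Flag the weighted co-occurring terms your copy doesn't mention
missing = [(term, round(w, 3)) for term, w in top_terms if term not in your_page.lower()]
print("Co-occurring terms to consider adding:", missing)
```

In practice you'd feed in the actual copy of the top-ranking URLs for the keyword, but the workflow is the same: see which weighted terms the ranking set shares and your page lacks.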

So the tool I would recommend for that is a tool called Ryte, or Content Success by Ryte. They used to be called onpage.org. And so what this tool will do for you is, you put in the keyword, it's gonna pull and calculate the TF*IDF for the ranking pages for that keyword, and then you can put in your URL and see how that compares. So in this case, we have a URL for the keyword Antivirus Software, and that green line graph across the orange bar chart shows that this client is not really using any of these words. And so Content Success has a section in it where you can basically bring your copy into it and then optimize it or write it, and then see which keywords it's suggesting that you should use more often in your copy.

And then it'll also show you the weighting of that as you go. So when you're thinking about your content, I definitely recommend bringing this into your workflow, because it's gonna go a very long way. Also, I forgot to mention, on the flight over here I was sitting next to a very large hairy man, and I was not able to sleep at all, so if there's any lapses in what I'm talking about, it's because I haven't slept. So let's talk about log file analysis. Well, Crawl Budget is the name of the game here. We're trying to understand how Google is using their crawl allocation, or your crawl allocation, throughout your site, based on a variety of different factors.

And so there's a variety of different tools out there. You know, if you're working with an enterprise website, they are likely already using the ELK Stack, and Kibana is part of that. So with this tool you can do some pretty heavy-duty log file analysis. Or you can even just take the files into Excel, and, you know, that's gonna be pretty difficult because log files tend to be very big. You know, you're talking about gigs and gigs and even terabytes for big sites. But let's say you just wanna do a sample, real quick-and-dirty analysis. You basically just open your log file with the text-to-columns wizard in Excel, and, you know, you just slice up the columns, and you can see the different HTTP codes and so on and so forth, and then you can start to do your analysis.

But the one thing that you're gonna wanna do, because you wanna really narrow this down to the actual Googlebot visits, is you're gonna want to verify Googlebot. And so this is a plug-in that you can use in Excel to look up the IP address through a reverse DNS lookup, and it'll tell you whether or not it's an actual Googlebot visit, rather than one of these SEOs who are just, you know, looking at your site. So then you can just essentially slice and dice it and compare it with whatever data you want based on that URL, so you can get a sense of, you know, why Google is crawling the different sections of the site, or how often they are crawling.
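If you'd rather script that check than run it in Excel, here's a minimal Python sketch of the same verification idea: reverse-DNS the requesting IP, check the hostname, then forward-resolve it back to the same IP. The example IP is just an illustration of what you'd pull from a log line.

```python
# Minimal Googlebot verification sketch: reverse DNS, hostname check,
# then forward DNS confirmation back to the original IP.
import socket

def is_verified_googlebot(ip: str) -> bool:
    try:
        host, _, _ = socket.gethostbyaddr(ip)            # reverse DNS lookup
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        forward_ips = socket.gethostbyname_ex(host)[2]   # forward confirmation
        return ip in forward_ips
    except (socket.herror, socket.gaierror):
        return False

# Example: an IP address taken from one of your log lines
print(is_verified_googlebot("66.249.66.1"))
```

Run that over the client IP column of your parsed log and you can filter the analysis down to verified Googlebot hits only.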

Screaming Frog also has a Log File Analyser tool, and for whatever reason, you gotta spell that with an S. I don't get it. But I mean, we really just improved this language, so y'all just get on board. Just think about my accent. Anyway, so this is a pretty simple tool that you can leverage for, you know, a variety of different log file analysis ideas or concepts. I'm just gonna walk through a few of the different use cases that you should be looking at. So again, you can get a sense of where Google is crawling and why. So you can see the URLs that are crawled based on verified Googlebot visits, so that step that I just walked you through and you can do in Excel, this tool will automatically do it.

And then you can identify the Low Value-add URLs based on query parameters. So any of the URLs that are being crawled that are, you know, duplicate for whatever reason, or what-have-you, you can just, you know, look at them based on a query string. And then you can look at the Most versus Least Crawled based on number of events, so it'll show you that right about here, and you just filter or sort by number of events.

So you can see what's the most crawled sections of your site. You can also look at that by Subdirectory as well, and this fantastic animation behind me is showing it. You can see that by content type, so are your images being crawled a lot? Why? Obviously, we want our HTML pages to be crawled the most, and then we can look at Crawl Frequency by User-Agent. So is Bing crawling you guys a lot, or is it just a bunch of spammers or, you know, spam bots and such?

Then you can look at URLs Crawled by Day, Week or Month. You can discover where the Crawl Errors are. You can also find the Inconsistent Responses, so things that you think are 301 redirects that are suddenly 404s; real easy to identify that. You can see those Errors by Subdirectory, you can see Errors by User-Agent again, so is it, you know, the spam bots that are seeing it, or is it potentially Googlebot mobile that's seeing things that desktop isn't, and so on?

You can look at your Redirects throughout your site, and this is something you definitely wanna look at a lot, because oftentimes we find that when people do big changes across their site, there's a bunch of links to redirects, and that's always gonna be a loss in link equity. Ideally, you would always have the final destination URL that you're linking to. And then you can also see which bots are crawling your site. So is it a bunch of spam bots? Is it spoof bots? You can verify the IPs of the different search bots. You can see which pages are too large on your site, you can also see the pages that are slow. And then you can combine your crawl data from the Screaming Frog SEO Spider with this and then get a lot of insights from that combination as well.

So you can see which URLs are not crawled by Google but are picked up by Screaming Frog. You can see which spam bots you should block. You can see the frequency by depth throughout the site and the internal linking structure. You can also see the Crawl Frequency based on the Meta Robots and robots.txt directives, and then also you can see the Crawl Frequency by link data.
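If you'd rather reproduce a few of those checks with a script rather than click through a tool, here's a rough pandas sketch over an already-parsed log export restricted to verified Googlebot hits. The file name and column names ("url" holding the request path, for example) are assumptions; adjust them to whatever your log parser outputs.

```python
# Rough pandas sketch of a few log-file checks: parameterised (low value-add)
# URLs, most vs. least crawled URLs, and crawl events by subdirectory.
import pandas as pd

log = pd.read_csv("verified_googlebot_hits.csv")   # hypothetical parsed export

log["has_params"] = log["url"].str.contains(r"\?", regex=True)
log["section"] = log["url"].str.extract(r"^(/[^/?]*)", expand=False)

# Most vs. least crawled URLs by number of events
crawl_counts = log.groupby("url").size().sort_values(ascending=False)
print(crawl_counts.head(10))   # most crawled
print(crawl_counts.tail(10))   # least crawled

# How much crawl activity is going to parameterised URLs?
print(log.groupby("has_params").size())

# Crawl events by subdirectory
print(log.groupby("section").size().sort_values(ascending=False).head(10))
```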

So you can do all that essentially kind of manually using that tool, or you can use Botify, which is kinda like the power tool for this type of stuff. So you just basically bring in your crawl data, or excuse me, your log file data with the crawls you've done there, and it does a lot of automated charts for you, and you can dig into everything pretty heavily as well. One of the things that's very interesting to me is that Venn diagram. So what are these URLs that Google is crawling and Botify has crawled? And then what are the ones that, you know, they're not crawling at all?

So you can get a sense of what these issues are just by digging into those different URLs. But the bottom line is that log files give you a lot more detailed insight into Google, so irrespective of what anybody says today, you're gonna get more insight from looking at your log files and understanding what it is that Google cares about. So for instance, we had a client that lost a ton of traffic and they weren't sure why. They were like, "Hey, you know, we've been running these TV ads and, you know, we're doing link building and, you know, I know Penguin happened but that wasn't us." And they were like, "We also turned off Paid Search. I don't know what's going on." So what I did is I layered all that data across that time series, so they could see.

So typically when you get hit by an algorithm update, you see a huge spike in crawl activity and then a huge drop-off. And so I layered that with all the other stuff they talked about, and then I also layered in when the Penguin update happened, and I was like, "Hey, guys, you wanna stop buying links or not?" Cool. So another interesting thing is that what we expect as SEOs is that, you know, the crawl patterns are generally dictated by the authority that you have throughout the site. But what we found in a variety of different, you know, reviews of this is that Social Shares have a much higher correlation with where and when Google crawls.

So next thing I wanna talk about is Auditing JavaScript Sites because there is no way that you are not gonna run into a JavaScript-driven website at this point. And the standard SEO thinking is like, "Oh, yeah, just avoid JavaScript." No, that's not gonna happen because the reality of it is the web has moved on, and we're still holding on to like these static sites, and Google is actually catching up pretty heavily as well. So the first concept I wanna bring up is the fact that View Source is dead. Looking at a website using View Source is not valuable to you at all at this point, except for just checking out the difference between that and what is actually displayed.

So you wanna spend more of your time in the Inspect Element part of the browser. And so what this is showing you is what we call the Computed DOM, or Document Object Model, and effectively this is what happens, you know, when the page is rendered. These are all the transformations that are seen. Whereas the View Source version of this is just the text for that page, downloaded and not executed. So what you really need to understand when you're auditing these different types of sites is the difference between the View Source version and the Computed DOM. And a quick way to understand this and see if there are disparities is to do two different crawls in Screaming Frog.

So you can do the text-based crawl, and then do the JavaScript rendering crawl, and then you compare this hash here. So if these hashes are different, then that means there is a disparity between the two different versions of the page, and then you're gonna wanna look closer at those pages. So the key thing to note about this is anything that requires a user action is not going to get indexed. So whatever loads at first will be indexed, but if it requires you to click on a tab, or to scroll, or what-have-you for that stuff to display, Google will never index it.
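For a single URL, you can do the same source-versus-rendered comparison yourself. Here's a rough sketch that fetches the raw HTML with requests and the rendered DOM with Playwright and compares hashes; it assumes you've installed both packages (and run `playwright install chromium`), and the URL is a placeholder. In practice almost any JavaScript will change the DOM, so treat a mismatch as a prompt to look closer, the same way you would with the two Screaming Frog crawls.

```python
# Compare the raw ("View Source") HTML against the rendered DOM for one URL.
import hashlib
import requests
from playwright.sync_api import sync_playwright

def content_hash(html: str) -> str:
    return hashlib.md5(html.encode("utf-8")).hexdigest()

url = "https://example.com/"   # placeholder URL

raw_html = requests.get(url, timeout=30).text   # the unexecuted source

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(url, wait_until="networkidle")
    rendered_html = page.content()              # the computed DOM, serialised
    browser.close()

if content_hash(raw_html) != content_hash(rendered_html):
    print("Disparity: rendered page differs from the source HTML; look closer.")
else:
    print("No disparity detected between source and rendered versions.")
```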

I would also encourage you to check out this blog post by Justin Briggs; it really walks you through it step by step. It's pretty much the same as how we always do it: you effectively render the two different versions, and then crawl them and see what the difference is.

So a lot of clients will come to me and they'll say, "Hey, how do we own more of the SERP? How do we get more positions for this keyword? This very competitive keyword?" It's not a thing. The only way you're gonna get more positions for a keyword is if the competition on that keyword is weak, and in that case, it's likely to have very low search volume as well. So you don't wanna be thinking about, you know, how do we get number one, two, and three for the term TV. It doesn't make sense.

But what you should think about is, you know, how do we get more featured snippets, because featured snippets are the key play here. And so there's been a lot of great research on this, especially coming out of STAT, and, you know, they're always triggered by the different question queries and such, and we're seeing changes in the different types.

So there's paragraphs, lists, and tables, and you really just wanna optimize for that. But the thing is, you would expect that you need to be number one for that. You don't. We've seen it be triggered by positions as deep as 71, so it's not necessarily just about authority, it's about being the right result. And when you're thinking about the paragraph type, the sweet spot is around 46 characters. So how do you get these featured snippets? Well, one, you wanna research the query space. So what type of results are being triggered based on these different types of keywords?

So if it's a table result or it's an ordered list result, that's the type of thing that you're going to want to optimize your page for. Then you wanna have a header on the page that is near your answer, that directly reflects what that query is. So if the question is, you know, how do I get girls? Well, put that in your header tag. Not that that's the question I wonder about. He's like, "That's a little too close to home, man." So then you wanna have the right structured text very close to that. So again, if it's a table, if it's, you know, the ordered list or a paragraph you want it to be very close in proximity to your header tag, and then you wanna do a fetch and render in Google search console.

And this is one of the only things in organic search where you can get pretty much instant gratification, because if it works for you, you'll see it pop up in the next hour or two. Next item, Internal Link Building. This has been proven a number of times by a variety of different companies, especially those with large websites. When you build a lot of internal links to pages throughout your site, those pages rank better. And the sweet spot, based on the different studies that I've seen and also what I've seen with our clients, is that for a big site it's around 2,000 links; after that you get diminishing returns.

So if you have a page that's very important to you and you have a large site, build 2,000 internal links to it. There are a variety of tools out there that'll help you with visualizing your Internal Linking Structure; this is something that the guys at Port Interactive put together. Searchmetrics also does a pretty cool job of this, where it shows you the internal link equity that's flowing to the different pages. Then there's a gentleman named Paul Shapiro who's worked out a way that you can do this using R.

So effectively you just, you know, plug a Screaming Frog export into it, and then it'll calculate the internal PageRank, and then you can figure out where you can adjust that internal link equity. And then Will Critchlow from Distilled just last week came up with some new thinking on this as well. But the quick win is: look at your pages that have the most internal link equity and then build links from those pages to the pages that don't. Pretty simple, right?
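Paul Shapiro's version of this is written in R; here's a rough Python equivalent of the same internal PageRank idea using networkx. It assumes a Screaming Frog "all_outlinks" export with "Source" and "Destination" columns and an example domain filter, so adjust those to your own export and site.

```python
# Rough internal PageRank sketch over a Screaming Frog all_outlinks export.
import pandas as pd
import networkx as nx

links = pd.read_csv("all_outlinks.csv")   # hypothetical export
internal = links[links["Destination"].str.startswith("https://www.example.com")]

graph = nx.from_pandas_edgelist(
    internal, source="Source", target="Destination", create_using=nx.DiGraph
)
internal_pr = nx.pagerank(graph)

# Pages holding the most internal link equity - candidates to link FROM
top_sources = sorted(internal_pr.items(), key=lambda kv: -kv[1])[:20]
for page_url, score in top_sources:
    print(f"{score:.5f}  {page_url}")
```

From there, the quick win is exactly as described: take the URLs at the top of that list and add links from them to the important pages that sit near the bottom.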

All right, let's get into some Speed Hacks. I like Speed Hacks because I get to get nerdy, and, you know, I don't really look like a nerd, so it's kind of like cognitive dissonance for y'all. You're like, "What? This guy's a rapper. He's talking about speed shit." Anyway, so the code in your site matters, and it's not just, you know, how it's written, it's also how it's laid out.

So the way that a webpage is constructed in your browser is that the first thing that happens is, you know, there's a request to the server, and then it gets the HTML and it starts to build the document object model. And then once it runs into CSS, it starts to build the CSS object model, and then once it starts to find JavaScript, it stops constructing the DOM until that JavaScript is complete, unless you're running asynchronous JavaScript.

And so all of these different resources require a request and a response, that round trip to the server, and this all takes time. So there's this concept called the Critical Rendering Path, which talks about what is the minimum series of actions that have to happen for the page to essentially render. And so this is what you're seeing when you're looking at PageSpeed Insights; it's telling you how your code is impacting the Critical Rendering Path. So there's another tool within DevTools called the Code Coverage tool, which will show you how much of your code is not being used at all on that page, and so you can figure out what can be pulled out to then speed up that page.

There's also another tool called Critical, which will load the page for you in a variety of different dimensions, and it'll pull out the CSS that's actually used and the code that's actually used, so you can then use it on your site. Shout out to my man Bastian for that tip. Also, you should look into HTTP/2, because of that idea that I just laid out to you: every resource or every file on the page is a request and a response. Well, HTTP/2 will speed that up, because you download a...or you ping the server for a URL and it's like, "Oh, you need all these other things," and it just starts to push them to you, rather than you needing to request each of them. But I will say that Googlebot is not crawling with HTTP/2 yet.

So to that point, what are some other things that we can do to speed things up? Well, Pre-browsing hints. I love these because they're super nerdy and they're super cool and they make things faster. So you have Preresolve, Prefetch, Pre-render, and Preconnect. And Preconnect, this is a rel directive that you can just set up in the head of the page, and effectively what it does is it makes a direct connection to that server, so then once you start to need to download stuff, it's already connected and it speeds it up. And as for when you wanna use that, if you look in the...what is this? The Network section of Chrome DevTools, if you see a lot of idle time, that's a good situation where you might wanna use Rel-Preconnect.

And it's real simple. You know, it's just like a rel canonical tag or something like that. You stick a domain name in there and then it automatically connects to it. Then there's something called Rel-Prerender. What Rel-Prerender does is it loads a page in an invisible tab in the background, and then when the user clicks on the link to that page, it loads instantly.

Google uses this themselves in the SERPs. So if you type in a branded query like CNN, it's gonna automatically pre-render that page, so once you click the link to CNN, it immediately loads. So I did a test on this to see if this actually works, right? And so what I did is I sent thousands of pre-rendered visits to a series of pages, also had a control set, and then what I saw is that the Rel-Prerendered pages actually did perform a lot better.

The thing is, you can only set one URL for pre-render, so how do I pick which one? So what we did is we started to use the Google Analytics API to figure out what is the most likely next page that users will go to. And here's your code. You guys wrote that down, right? And so what we're doing is we're just injecting pre-render programmatically, and it improved our site speed 68.35%. A couple caveats here. You don't wanna use it on mobile, because people's mobile devices are already slow enough, and you don't wanna use it with analytics packages outside of Google Analytics unless you account for it, because otherwise you'll have fake sessions.

So the way you get around that is deferring the load of certain items with the Page Visibility API. So effectively what you can do is say, "Okay, we're only gonna load our analytics once this page is actually visible, not when it's just pre-rendered." And then there's also Rel-Preload, where you can preload assets on the same page, you know, as you go.

So the thing that's interesting here is that, you know, let's say you have an image that is very deep in your JavaScript, and it takes forever for the code to execute, so it takes a while to then download that image. If you preload that, then it's gonna make the page appear faster. But the thing you wanna remember about PageSpeed is you don't have to load the entire page fast, you just have to load the above-the-fold content fast, because then it looks like it's loaded fast to the user.

Now, as far as how you can get more accurate with regard to which page is loaded for Rel-Prerender, there's some other people that have kinda taken my idea further. So there's this concept of using essentially statistical modeling software to determine what is the page that this user is most likely to go to. So effectively you plug in GA data and then use Markov Chains, and then it tells you, like, "Hey, this is the URL that they are most likely to go to."

And then you put that in. And then there's another person that got even more nerdy with this and they're predicting what you're gonna click on based on you slowing down your mouse. So as you get closer to a link, you slow down your mouse and then they do the Rel Pre-render on that, and here's your code for that as well.
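The actual code lives on the slides, but here's a rough, generic sketch of the first-order Markov chain idea: count page-to-page transitions from exported session paths, pick the most likely next page, and emit the prerender tag you'd inject. The CSV format (one pipe-delimited path per row) is an assumption, not the Google Analytics API itself, and the JavaScript that writes the tag into the head is left out.

```python
# Rough "most likely next page" sketch for programmatic rel=prerender.
from collections import Counter, defaultdict
import csv

transitions = defaultdict(Counter)

with open("session_paths.csv", newline="") as f:       # hypothetical export
    for row in csv.reader(f):
        pages = row[0].split("|")                       # e.g. "/home|/pricing|/signup"
        for current_page, next_page in zip(pages, pages[1:]):
            transitions[current_page][next_page] += 1

def most_likely_next(page):
    """Return the page users most often visit after `page`, if any."""
    counts = transitions.get(page)
    return counts.most_common(1)[0][0] if counts else None

current = "/pricing"
candidate = most_likely_next(current)
if candidate:
    # The tag you'd inject into the <head> of the current page
    print(f'<link rel="prerender" href="https://www.example.com{candidate}">')
```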

Let's talk a little bit about Indexation. Indexation is obviously something we need to manage with our websites, especially if we have very large ones; it's very difficult for us to know where Google is gonna crawl next, or whether they're crawling enough. So there's a little-known tip here. You can actually request, in the crawl section of Google Search Console, for the crawl team to ramp up your crawl.

And whenever we do any type of migration or, you know, some big site change, we'll put a little note in there, and then they will ramp up the crawl for about two to three days. You can also use URL Profiler to check indexation, because it's one of the only ways that we can know at scale whether or not these pages are actually in the index.

So another thing that clients ask me about a lot is like, "What is this gonna get me? What's the forecast here?" And I'm just like, "This is complete bullshit, but I'll do it." So, I mean, forecast modeling is questionable because, you know, we all know there's no way to tell that if we get you to number one, how many clicks you're gonna get. But a lot of people will use those anecdotal CTR models that you find on different websites, and that doesn't make any sense because Google actually gives you that data. So I'd encourage you to pull that data out of Google Search Console and build your own models. But as far as how you would, you know, actually calculate this stuff, it really doesn't get much better than the standard ways of, you know, doing traffic, versus CTR, versus conversions and so on.
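As a sketch of what "build your own model" can look like, here's a minimal pandas snippet that turns a Search Console performance export into a CTR-by-position curve for your own site, instead of borrowing an anecdotal curve. The file name and column names ("position", "clicks", "impressions") are assumptions; match them to your export or API pull.

```python
# Build a site-specific CTR-by-position model from a GSC performance export.
import pandas as pd

gsc = pd.read_csv("gsc_performance_export.csv")   # hypothetical export

gsc["position_bucket"] = gsc["position"].round().astype(int)
ctr_model = (
    gsc.groupby("position_bucket")[["clicks", "impressions"]]
    .sum()
    .assign(ctr=lambda df: df["clicks"] / df["impressions"])
)

print(ctr_model.head(20))   # your own CTR curve by ranking position
```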

But bottom line, pull that data out and make sure you're getting the performance-level data. And also, you can pull even more data out of the GSC API using trees, and shoutout to William for letting me know about that one. Okay, Link Analysis. So aside from indexation, there's really no way for us to know what Google is measuring for links. So there's all these link indices out there: there's Ahrefs, there's Majestic, there's Moz, and so on and so forth. But their metrics don't really mean anything if Google doesn't have those links. So I prefer to use a combination of tools, you know, just to give the client an idea of where we're standing, but again, it's hard for us to use any of these tools and really know what's going on.

I love CognitiveSEO just because it allows us to really dig into link spam; it also allows us to visualize these opportunities. LinkDetox is great for identifying link spam as well, and LinkResearchTools also has a section for reviewing disavow files. People tend to use the disavow file like it's a machete, when it should really be more like a scalpel. So I would encourage you, when you get a new site, to take a look at that disavow file using this tool and, you know, just look at it yourself and make sure that you're not, you know, just chopping down the entire forest. That metaphor really just kept going.

And so if anyone is still doing link removals, LinkDetox and Pitchbox actually integrate with each other, so you can completely scale that as well. All right, let's talk a little bit about Outreach, because that's the one thing that, you know, everyone will always struggle with until the end of time. So I've got a few different tactics that I use that have been kinda time-tested. So I do something that we call, you know, Video Outreach with a tool called BombBomb, and what we find is that we get something like an 80% response rate when we send these videos, because it's not some random person from some random country sending you a poorly worded email, it's you actually seeing me.

And so what BombBomb allows you to do is send a very short video, and it gives you analytics on when people actually look at it as well. So another tactic that I use pretty heavily, using BuzzSumo, is we identify what's the most shared piece of content on the site, and then when we reach out to people, we talk to them about that piece of content.

Why? Because people are already talking to them about that piece of content, so it doesn't seem so out of left field when you're like, "Oh, yeah, I looked at your contact page. Your blog is great." Like you just sound corny when you do that. But if instead, you talk to them about something that people are already talking to them about, they're gonna be far more engaged with you.

So fundamentally, Outreach and link building are exactly like sales. And it's so funny because, you know, in the Inbound Community we are so against outbound sales, but we sit here and do link building all day. My homie John Henry said it best. He explained how you can actually turn a link builder into an outbound salesperson, but effectively we both build a funnel, we both optimize that funnel, wherein we're creating prospects, we're turning those into leads, turning those into opportunities, turning those into customers. Well, let's talk a little bit about Machine Learning. You know, I had to bring that up, that's one of the buzzwords. Let me just say Machine Learning, Drones, Blockchain, Cryptocurrency, let's just get them all out.

So the one thing I wanna be clear about with Machine Learning is it's often conflated with, you know, some of the other ideas of artificial intelligence and deep learning. These are distinct things: artificial intelligence is a big idea, machine learning is a type of artificial intelligence, and deep learning is a type of machine learning. So effectively, what machine learning is, is a way for computers to draw conclusions without being explicitly programmed. I like to say it's a mathematical version of guess-and-check. Just like we do CRO, we're like, "Hey, we think this might work. Let's try it. Hey, the test worked or it didn't work." It's guess-and-check.

People like Ali are a lot better at the guess-and-check than maybe me, but it's still guess-and-check. So there's two different types, or there's a lot of different flavors of Machine Learning, but two key ones I wanna talk about are Supervised learning and Unsupervised learning. So Supervised learning is when you say, "Here's the data set. Here are the outcomes. Here's a new data set. Give me what the outcomes are based on that data set." And then Unsupervised learning is when you just give it a data set and say, "Hey, what are the patterns? What are the outcomes? You tell me."

Then there's Predictive Modeling. So based on what's happened before, whether that's leads or sales or what-have-you, what's going to happen with these users that have similar characteristics? Then there's also Natural Language Processing. We see a lot of that going on in search, especially with voice search and things of that nature, but it's really the computer understanding what you're asking it.

And then there's Chatbots. Chatbots are fantastic in that you can teach them how to respond to things that a user is saying to them. So training Chatbots typically involves, you know, putting a lot of data into them and having them sort it out and figure out what the response would be to your question based on that data set.

So sales software has solved many of Outreach's scale issues, in that a lot of sales software is doing these things to qualify leads, to speak to leads, to move them through the funnel. They use it for Lead Qualification and Scoring, they use it for Close Prediction, they use it for Prospecting, they use it for Lead Intelligence. So if Outreach is like sales and many of these tools already have this stuff, why don't we do the same with Outreach?

All right, so typically when you're doing Machine Learning, you're using programming languages, whether it's R or Python. I'm not gonna go that deep with you all today. There are three things that you can use Machine Learning for to scale your outreach: Prospecting, Initial Outreach Research, and Overcoming Objections. This tool is called Orange Canvas.

This tool allows us to do Machine Learning tasks and all the other data mining tasks via drag and drop, so it's not requiring you to know how to code, not requiring you to know math. Again, this is you doing guess-and-check. So what you can do is essentially drag and drop the different models and then, you know, see what works for you. So what we're gonna do here is use supervised Machine Learning to predict which sites are gonna be worthwhile prospects.

Now, I would imagine if you've been doing prospecting for a long time, you've been collecting a lot of data for a long time. You know what good prospects are, you know what bad ones are. So essentially I have all my data on all the prospects, and then I have one column that says whether or not it's a good prospect. So zero is a bad prospect, one is a good prospect.

And then I wanna clean that data and then specify which metric we're trying to optimize against, and then I try different models to see what actually works. So this is literally me just dragging and dropping. There's a file here, and then I do the test and score with all these different models, and then I do the confusion matrix to see which one did the best, and then I pick a model based on that.
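For anyone who does want to script it instead of dragging and dropping, here's a rough scikit-learn sketch of the same supervised workflow: historical prospects with a 0/1 "good prospect" label, a couple of candidate models, confusion matrices to compare them, then predictions on a new file. The file names and column names ("good_prospect", "domain") are assumptions standing in for whatever your prospecting data contains.

```python
# Supervised prospect-scoring sketch: train on labelled historical prospects,
# compare two models, then score a new batch of prospects.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

history = pd.read_csv("past_prospects.csv")          # features + "good_prospect" label
X = history.drop(columns=["good_prospect", "domain"])
y = history["good_prospect"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

models = {
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "logistic_regression": LogisticRegression(max_iter=1000),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, confusion_matrix(y_test, model.predict(X_test)), sep="\n")

# Pick the winner and score a fresh batch of prospects
best = models["random_forest"]
new_prospects = pd.read_csv("new_prospects.csv")
new_prospects["predicted_good"] = best.predict(new_prospects.drop(columns=["domain"]))
print(new_prospects.sort_values("predicted_good", ascending=False).head())
```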

So at this point, I realized that the Random Forest classification works best, and then I do predictions: I plug in a new file of data, and then I get my data table of prospects, and that's what that looks like. So, relatively simple. I mean, you guys do way harder things than that. And then on the research side for your prospects, there's a variety of different tools that are leveraging machine learning. CrystalKnows, I would imagine a lot of you have already seen; it's giving you details on how do I speak to this person I'm reaching out to? What type of communication style do they use?

There's also PeoplePattern, which will take social media profiles and then figure out what the persona is for those users. And then again, going back to this concept that I talked about before where we're like, "Okay, let's look at the most shared content on a given site." Well, you can run it through a Text Summarization tool, so then you can read just a little bit of it and, you know, do your outreach, rather than having to read a whole, you know, 500-word blog post. So effectively that idea improves that BuzzSumo tactic that I just talked about.
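As a stand-in for whichever summarization tool you use, here's a naive extractive sketch: score sentences by word frequency and keep the top few, so you can skim the gist of a prospect's most-shared post before you reach out. The article text is a placeholder, and real tools do considerably better than this.

```python
# Naive extractive summarisation: keep the sentences densest in frequent words.
import re
from collections import Counter

article = """Paste the prospect's most-shared post here. It can run to several
paragraphs; the scoring below simply picks the densest sentences."""

sentences = re.split(r"(?<=[.!?])\s+", article.strip())
words = re.findall(r"[a-z']+", article.lower())
freq = Counter(words)

def score(sentence):
    tokens = re.findall(r"[a-z']+", sentence.lower())
    return sum(freq[t] for t in tokens) / (len(tokens) or 1)

top = sorted(sentences, key=score, reverse=True)[:3]
summary = " ".join(s for s in sentences if s in top)   # keep original order
print(summary)
```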

And the last thing I wanna talk about is overcoming objections using Chatbots. So what you'll find if you're doing a lot of outreach is that you're gonna get a lot of the same objections over and over, like people are like, "Oh, we don't take these types of posts." Or, "We're not gonna build your links." Or, "We need to be paid." Or whathaveyou.

All of those things are gonna become patterns, so it's gonna be very easy for you to spot them. So what I'm suggesting here is that you can tie a Chatbot to your outreach email, and have it automatically respond, so you don't have to waste your time fielding responses for things that aren't likely gonna close.

So what you use is a tool called API.AI, and this is a tool that allows you to set up a Chatbot without any code. So effectively what you're doing is you're saying, "Here's what someone says to me and here's what the response might be." And it will learn from every additional touchpoint, so every person that talks to it, it's gonna learn how to respond even better. And then what you can do, once you've set that up, is you can connect it to your Gmail using Zapier, or however you say it, and then it will automatically respond to your emails where those objections happen.

So who am I? I'm Zora's dad. My name is Mike King. I am a recovering big agency guy, also a Full Stack Developer. I run a boutique marketing agency called iPullRank; we are based in New York. We do all these things, and to the point of me dropping all my buzzwords earlier, we're also launching an initial coin offering called Skratch for another website that I own called undergroundhiphop.com. That's all I got.
