If you have spent any time studying SEO and listening to industry thought leaders, you have undoubtedly heard the advice to cater – above all else – to the user experience. It gets a lot of attention. And rightfully so! If you think about Google’s mission to organise the world’s information so it is universally accessible and useful, this strategy falls in line to an extent.
But catering to the user experience is not enough. In this article, I argue it is secondary to the experience of the Google Search Bot. While it is not acknowledged nearly as often as it should be, in the SEO hierarchy of needs, the Search Bot is the most important visitor to your site.
Here’s why. The Search Bot is essentially the gatekeeper between your content and your target audience in the search engine results pages (SERPs). While it is important to cue up the ultimate user experience on your website, that investment is worthless if the Bot is unable to access, crawl, understand, and index the content. That is why I argue the Search Bot is the most important visitor to your website. Only by giving Google the ultimate crawling and indexing experience – so its Bots can do their job efficiently and effectively – will you validate the investment in user experience and command the search attention you deserve. It’s that simple.
Introducing Google’s Perfect World
What does a website built for Search Bots look like?
If you take for granted that the Search Bot is the most important visitor to your website, the next logical question follows: what should your website look like? My colleagues at Huckabuy call this “Google’s Perfect World”: website content is delivered as flat HTML, everything is marked up with world-class structured data, and pages load instantly.
Build Google’s Perfect World Via Dynamic Rendering
The most important SEO policy change in the last decade.
Fortunately, the solution to this problem isn’t a mystery at all. Google has been very public in recent years about how they want websites to be designed to facilitate the ultimate crawling experience.
- In 2014, they endorsed structured data markup as the preferred method of communication with their Bots.
- In 2018, they announced official support for dynamic rendering, a setup in which your server detects Search Bot requests and serves them a flat, pre-rendered version of each page, while human visitors continue to receive the full JavaScript experience.

To implement dynamic rendering, you just need to add a tool or step in your server infrastructure to act as the renderer. Here are some of the suggested steps it can perform:
- Converting your web pages into flat HTML
- Marking content up with relevant structured data
- Caching pre-rendered pages so they are instantly available for crawling
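The steps above can be sketched in a few lines of code. The following is a minimal, illustrative sketch – not a production implementation – showing the core decision a dynamic renderer makes: inspect the User-Agent, and serve a cached flat-HTML page to Search Bots while humans get the JavaScript app. The bot pattern, cache contents, and function names are all hypothetical.

```python
import re

# A representative (not exhaustive) set of Search Bot User-Agent fragments.
BOT_PATTERN = re.compile(r"googlebot|bingbot|duckduckbot", re.IGNORECASE)

# Hypothetical in-memory cache of pre-rendered flat-HTML pages, keyed by URL path.
# In practice this would be filled by a headless-browser rendering step.
PRERENDER_CACHE = {
    "/": "<html><body><h1>Home</h1></body></html>",
}

def is_search_bot(user_agent: str) -> bool:
    """Return True when the request appears to come from a Search Bot."""
    return bool(BOT_PATTERN.search(user_agent))

def choose_response(path: str, user_agent: str, js_app_shell: str) -> str:
    """Serve cached flat HTML to bots; serve the JavaScript app to humans."""
    if is_search_bot(user_agent) and path in PRERENDER_CACHE:
        return PRERENDER_CACHE[path]
    return js_app_shell
```

Note that both audiences receive the same underlying content – only the delivery format differs – which is what keeps this approach within Google’s guidelines rather than cloaking.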
Again, this is a critical policy change from Google. Essentially, they studied internet trends, recognised the explosion of content and the influx of new JavaScript frameworks, and realised things weren’t going to change – businesses weren’t going to restructure their websites en masse just because Google asked. So they effectively threw up their hands and asked instead for versions of websites that their Bots could easily crawl, understand, and index.
What makes this initiative so powerful is that it allows you to cater to your most important visitor – the Search Bot – but also serve the interests of the user without making any critical tradeoffs.
The SEO Benefits of Dynamic Rendering
There are many.
Ultimately, when you give Search Bots the perfect crawling experience, a lot of good comes to your website. Here are five benefits to consider.
1) Google sees, understands, and indexes more of your website content.
2) Maintain a version of your website designed for the user experience.
Google’s John Mueller has described user experience as a “soft ranking factor”. With dynamic rendering in place, developers can continue using state-of-the-art front-end frameworks that give human visitors an engaging experience. They can use React and Angular, for example, without fear of SEO repercussions.
3) Capture valuable insights into Google’s crawling behaviour.
When you add a dynamic renderer to your server infrastructure, you can monitor exactly which clients are requesting pages from it. For Search Bots, this means you can track when and how often they visit your website, how many pages they crawl, and how much time they spend downloading content, among other important metrics. It is a unique opportunity to report on Google’s behaviour and adjust your SEO strategy as a result.
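As a rough illustration of the kind of reporting this enables, the sketch below summarises a hypothetical renderer access log into a few crawl metrics: visits per day, unique pages crawled, and average download time. The log format and field names are assumptions for the example, not a real renderer’s output.

```python
from collections import Counter
from datetime import datetime

# Hypothetical access-log entries: (ISO timestamp, user agent, path, download ms).
LOG = [
    ("2024-05-01T08:00:00", "Googlebot/2.1", "/", 120),
    ("2024-05-01T08:05:00", "Googlebot/2.1", "/pricing", 95),
    ("2024-05-02T09:30:00", "Googlebot/2.1", "/", 110),
]

def crawl_stats(log):
    """Summarise bot activity: visits per day, unique pages, mean download time."""
    per_day = Counter(datetime.fromisoformat(ts).date().isoformat() for ts, *_ in log)
    pages = {path for _, _, path, _ in log}
    avg_ms = sum(ms for *_, ms in log) / len(log)
    return {
        "visits_per_day": dict(per_day),
        "unique_pages": len(pages),
        "avg_download_ms": avg_ms,
    }
```

Metrics like these complement what Google itself reports in Search Console’s crawl stats, but from your own server’s point of view.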
4) Increase the number of ranking keywords associated with your content.
When you mark up your website content with structured data – the language of search engines – Google is able to better understand your content and make more connections with the search queries your target market is making. As a result, you will tend to see your number of ranking keywords grow, which means you are appearing more often in front of your most qualified potential customers.
5) Qualify for rich results in organic search.
When you mark up your content with structured data, it qualifies for a new organic search experience called rich results. Rich results are Google’s way of saying thanks for helping their Search Bots understand what your business is all about. These features enhance the standard blue links that you are accustomed to seeing in SERPs.
There are over 30 applicable rich result opportunities including:
- Frequently Asked Questions
- Ratings and Reviews
- Product and Service information (Pricing, Availability, etc.)
These features are embedded directly on the SERPs to make your links stand out from the competition, encourage more impressions and click-throughs, and satisfy queries faster than ever.
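To make this concrete, here is a minimal sketch of what FAQ markup looks like in practice: a small helper that builds schema.org `FAQPage` structured data (the JSON-LD format Google accepts) from question-and-answer pairs. The helper function and sample content are illustrative, not part of any specific tool.

```python
import json

def faq_jsonld(pairs):
    """Build schema.org FAQPage structured data from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }

markup = faq_jsonld([
    ("What is dynamic rendering?",
     "Serving pre-rendered flat HTML to Search Bots."),
])
# The resulting JSON is embedded in the page inside a
# <script type="application/ld+json"> tag for the Bot to read.
print(json.dumps(markup, indent=2))
```

Markup like this is what makes a page eligible for the FAQ rich result; eligibility is Google’s call, but without the structured data the page cannot qualify at all.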
Google’s Perfect World Matters Now And In The Future
Technical SEO has come full circle.
Image Source: Technical SEO Is a Necessity, Not an Option
If you take nothing else from this article, remember that a robust SEO strategy is predicated on the perfect conversation between your website and Google. If Google can’t come to your website, access your content, and understand what you are about, then everything else you do for the human angle is all for naught.
Finally, note that building a version of your website for Google’s Perfect World is not a short-term strategy for the here and now. If you are wondering what it will take to future-proof your strategy well into the new decade, look no further than structured data, which powers recent innovations like voice search and zero-click results that may very well take over the SERPs as we know them sooner rather than later.
Technical SEO is difficult. It takes a lot of marketers out of their comfort zones, which is why they tend to avoid these initiatives. It’s why fixes like dynamic rendering haven’t yet gained the traction and notoriety they deserve. But most things that are difficult tend to have big payoffs. Remember that structured data, dynamic rendering, and Google’s Perfect World are the building blocks for a sustainable, long-term organic search channel plan. Now is your chance to distinguish your SEO from the competition.