The webinar How is technical SEO shifting to the new standard? is the first episode in the SEO in Orbit series, and aired on April 17th, 2019. For this episode, we spoke with Detlef Johnson about how technical SEO has become an essential element when ranking a page in 2019. How does technical SEO give you an advantage in an evolving search landscape, and what does it promise for the future?
SEO in Orbit is the first webinar series sending SEO into space. Throughout the series, we discussed the present and the future of technical SEO with some of the finest SEO specialists and sent their top tips into space on June 27th, 2019.
Presenting Detlef Johnson
Detlef Johnson is Editor at Large for Third Door Media. He writes a column for Search Engine Land entitled “SEO for Developers.” Detlef is one of the original group of pioneering webmasters who established the professional SEO field more than 20 years ago. Since then he has worked for major search engine technology providers, managed programming and marketing teams for Chicago Tribune, and consulted for numerous entities including Fortune 500 companies. Detlef has a strong understanding of Technical SEO and a passion for Web programming. As a noted technology moderator at the SMX conference series, Detlef will continue to promote SEO excellence combined with marketing-programmer features and webmaster tips.
This episode was hosted by François Goube, serial entrepreneur, and the Co-Founder and CEO of OnCrawl. He has founded several companies and is actively involved in the startup ecosystem. Passionate about semantic analysis and search engines, he loves to analyze Google's scientific publications and is a regular speaker at SEO conferences.
What early SEO looked like
Search engines also looked very different. Google was still a newcomer, and AltaVista held most of the market share.
What is technical SEO?
Technical SEO is the assurance, from a technical point of view, that your published content can be crawled and indexed by search engines. In 2019 we're largely talking about Google when we refer to "search engines", but this also includes Bing and DuckDuckGo.
These engines use bots, which can be thought of as robot users, that visit your website to pick up your content so they can store it in their own databases and deliver it in response to user queries.
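As an illustration of that pick-up step, here is a minimal sketch of what a crawler does with a fetched page: extract the content worth indexing and the links to queue next. The HTML string is a hypothetical example.

```python
from html.parser import HTMLParser

class PageExtractor(HTMLParser):
    """Collects the title and outgoing links from a page, roughly
    as a crawler would before storing the content in an index."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# A hypothetical fetched page:
html = "<html><head><title>Example</title></head><body><a href='/about'>About</a></body></html>"
extractor = PageExtractor()
extractor.feed(html)
print(extractor.title)   # the content a bot would store
print(extractor.links)   # URLs queued for the next crawl step
```

A real bot does vastly more (fetching, deduplication, rendering, scheduling), but the crawl-extract-queue loop is the core of it.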
[Note: this has changed since the airing of the episode. Google now uses Evergreen Googlebot, which is kept up-to-date with the latest releases of Chrome. The Evergreen bot is used both for indexing and for Google’s user-facing testing tools.]
Technical SEO has to look at issues like these and solve them for businesses whose websites struggle with visibility.
What to be excited about in technical SEO
Opportunities in the modern world of impressive frameworks center around remembering web standards and adhering to them. What Detlef finds most exciting includes the following concepts:
Semantic HTML5 containers allow you to describe your content. You use markup to provide meaningful information about your content, not only for search engines, but for many other user agents as well.
This is fundamental to the web and incredibly important to how it works, but it is often overlooked by some of the large frameworks and by some people who design and develop websites.
This type of markup is often cited by SEOs, but often overlooked by modern website designers. They forget that there are facilitating standards that Google pays attention to and that can enhance your search listings.
There are many new things that Detlef finds exciting, although he's a little reserved about them. One example is the AMP (Accelerated Mobile Pages) standard for mobile indexing in the future.
For Detlef, this is one of the most exciting developments of the last year or two. We've recently found ways to use cache server capabilities for implementing SEO. This is also known as Edge SEO and is an entirely new concept.
This is something that will be able to help companies migrate from one platform to another by taking advantage of cloud capabilities.
History of technical SEO
In the early days of the internet, SEO was seen as something done by webmasters; there were no marketers in the original SEO groups back in the 1990s. There was no marketing SEO; all SEO at the time was technical.
The appearance of the GoTo.com search engine with penny clicks suddenly drew the interest of marketers because they were able to buy results. Google AdWords solidified this effect, which turned Google into a behemoth ad agency.
Marketers are now intimately ingrained in SEO practice, and we can say that technical SEO has waned.
Resurgence of technical SEO
SEOs as marketers
Detlef believes that SEOs who are unfamiliar with web development will likely stay in the marketing department. There’s still work there to be done to translate what is being done on the technical side.
Technical SEOs as full-fledged developers
To Detlef, the technical field is one where SEOs should become full-fledged developers in order to be able to address issues. This should go beyond an ability to "talk to the developers", into being able to provide snippets of code. Technical SEOs should be able to get their hands dirty and fix things on the client side.
This is Detlef’s primary focus as he tries to bridge the gap between SEO and web development. He tries to leverage his column on Search Engine Land, SEO for Developers, to encourage developers to participate in technical SEO, and to encourage marketers who have development skills to ask questions and participate as well. This is something he sees as being very important for the future.
Technical SEO today
As search engines grow and use increasingly advanced web technologies, the technical aspects of a page that make it easy or difficult for a modern webpage to rank also evolve. Here are some of the first and most critical technical concepts to look at when ensuring that a page is SEO-friendly.
In SEO, we talk a lot about rendering. For marketers, this might be a vague concept, although you probably have some idea of what it means.
The first thing you need to understand is that there may be some content that doesn’t get displayed right away. It might have a delayed rendering, or it might be rendered in a constant update cycle. This can be seen if you’ve loaded a page and you’re seeing something that’s changing in real time.
Modern frameworks have taken that concept and applied it to a whole user interface. That's the role of frameworks like React, Angular, or Vue; these, along with Ember, are the main ones, though many more exist. They provide web developers with the ability to design a whole user interface where everything from the start can be updated in real time to reflect the state of a database. If the database changes because the score changes, this comes down the pipe and is updated on your screen without you needing to reload the page.
This sort of thing poses problems for search engines because whenever they come to your site to index your content, they're only going to get one snapshot in time. This raises a question: is the data in the snapshot old? Is it new?
The frameworks don’t always help search engines unpack all of this and render it at the specific moment when the page is visited by the search engine. Often they leave that work to the client (the user’s browser), and search engine bots really just don’t operate that way. They have a separate process for rendering content. They’ll index your content, and then if they deem it “worthy” of rendering, it will be sent to a separate rendering queue. Rendering costs search engines a lot of energy and money.
When you have content with delayed rendering, the recommendation from a technical standpoint–which has been echoed by Google itself–is to try to do pre- or hybrid rendering.
Detlef’s preference is for pre-rendering: server-side rendering (SSR). The reason for this is that we’re talking about optimizing for search engines. If you’re only half-optimizing, you might want to look at hybrid rendering. If you’re trying to go all the way with your optimization, the only way to do that is to server-side render.
If your website doesn’t change in real time, why are you even using React in the first place?
This problem brings up fundamental questions of web development and design like the one above. As a technical SEO, Detlef loves this discussion and thinks it’s an essential one to have. He believes many sites are over-engineered because–in this example–React is the new toy; developers are interested in acquiring experience with it because it’ll be great for their career. This leads them to sell React to corporations, which can lead to indexing issues and to the business suffering.
This is just one example of how it’s important to get SEOs involved in all decisions early in the process in order to help make website technology and structural decisions that impact SEO.
You may still have situations where client-side rendering is required. In this case, you still have lots of options. You can detect the user agent strings and send pre-rendered pages to Googlebots while allowing users to use client-side rendering. This needs to be part of the discussion around modern website development and technical SEO with modern frameworks.
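This user-agent approach, sometimes called dynamic rendering, can be sketched in a few lines. The bot signatures and page bodies below are illustrative assumptions, not a production allowlist:

```python
# Minimal sketch of dynamic rendering: bot traffic gets a pre-rendered
# HTML snapshot, while regular browsers get the client-side app shell.
# Bot signatures and page bodies are illustrative assumptions.
BOT_SIGNATURES = ("Googlebot", "bingbot", "DuckDuckBot")

PRERENDERED = "<html><body><h1>Product list</h1><ul><li>Widget</li></ul></body></html>"
APP_SHELL = '<html><body><div id="root"></div><script src="/bundle.js"></script></body></html>'

def render_for(user_agent: str) -> str:
    """Return full HTML for crawlers, the JS app shell for everyone else."""
    if any(bot in user_agent for bot in BOT_SIGNATURES):
        return PRERENDERED
    return APP_SHELL

print(render_for("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # pre-rendered HTML
print(render_for("Mozilla/5.0 (Windows NT 10.0) Chrome/74.0"))  # app shell
```

In practice this routing usually lives in the web server, CDN, or a service like a prerendering proxy rather than in application code, but the decision logic is the same.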
This has been a trending topic over the past couple of years.
Google has been behind the AMP standard to force increased page speed. They’re always excited to pull out statistics on how they’ve sped up significant parts of the web.
Google definitely recognizes the importance of page speed and gives a ranking boost to faster sites.
Page speed, however, is not a silver bullet. Even if you have fast pages, you still need to have quality content that is popular and linked to.
Speed should not be the one area where you fall behind, but you don't have to have a perfect score on Lighthouse (a Chrome developer tool that offers many metrics). A lot of page speed fixes are both simple to implement and good for improving user experience.
As it's key for user experience, it shouldn't be neglected. You can improve speed through best practices for building websites:
- Compressing images
- Bundling scripts (combining resources to limit the number of web requests)
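The bundling idea can be sketched in a few lines; the file names and script contents here are illustrative assumptions:

```python
# A minimal sketch of script bundling: concatenating several JS files
# into one reduces the number of HTTP requests a page triggers.
# File names and contents are illustrative assumptions.
scripts = {
    "menu.js": "function openMenu(){/* ... */}",
    "carousel.js": "function nextSlide(){/* ... */}",
    "analytics.js": "function track(){/* ... */}",
}

def bundle(files: dict) -> str:
    """Join script sources, with a comment marking each original file."""
    parts = [f"/* {name} */\n{source}" for name, source in files.items()]
    return "\n".join(parts)

bundled = bundle(scripts)
print(f"Requests before: {len(scripts)}, after: 1")
```

Real build tools (webpack, Rollup, and the like) add minification, dependency resolution, and source maps on top of this, but the request-count payoff is the same.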
Another big topic in the SEO industry is machine learning, from RankBrain to entity-based indexing.
How search engines use it
Machine learning is clearly something that's essential to Google for serving excellent search results, notably through RankBrain.
Search engines use machine learning to predict what would be good results for queries they’ve never encountered before. In these cases, machine learning engineering can surface results that may be of interest to the user.
Should SEOs be using it?
From the publishing side, machine learning probably has less usefulness.
It’s something that can still be explored. In the SEO world, there’s a “fight fire with fire” attitude: if Google’s doing machine learning, we should be doing machine learning in response. Detlef is not in favor of that. The investment required to develop a machine learning application is significant, and ranking payoffs aren’t guaranteed.
However, it can, in certain circumstances, make a huge difference in rankings for some larger companies. Detlef gives the example of Amazon.com, which may use machine learning to analyze and change a few things about what they present to search engines.
Machine learning would be particularly useful in helping to change your content in response to conditions. The question then becomes: do you really want to give your content over to an algorithm?
Even if you’re pretty confident in your algorithm, there are serious hazards.
Error in automatic fact-checking on footage from the Notre Dame (Paris) fire in April, 2019
SEOs have also started using machine learning models to predict ranking changes and search behavior. Detlef believes there’s space to explore here. It’s a trendy technology, but he believes we need to wait to see machine learning mature before he’s willing to recommend that people invest in it for SEO.
Evolving search UI for mobile search
We see more and more comprehensive SERPs. And this is a great time to discuss the subject, since we're in the middle of seeing a change at Google.
When Google was first created it was just a search bar, their logo, and two buttons (“Search” and “I’m feeling lucky”). And the SERPs were just ten blue links.
They’ve slowly started adding elements as they’ve bought properties, such as Keyhole to enhance Google Maps. This allowed them to bulk out their search results to look more like Yahoo!. SEOs used to tease Yahoo! for being a portal and looking too busy.
Mobile use makes SERP design a relevant question today. Screen real estate on mobile is incredibly precious because of the relatively smaller screens, which makes a stripped down approach pertinent.
While Google uses knowledge panels and carousels for results like news and AMP results, they're doing their level best to become HAL 9000, the computer that can do everything.
Detlef believes that the mobile user experience is key, and that Google has been diligent at producing a good mobile experience as a search engine. It's interesting to see how different search engines measure up when you compare mobile user interfaces. Detlef's favorite is a throwback to the original look of Google: DuckDuckGo, which presents only search results with no additional elements added. However, Detlef recognizes that, as a technical SEO, he's not necessarily the best representative of all search engine users.
Key areas to follow in the future might include voice search for mobile and evolving mobile SERP real estate.
Technical SEO includes the use of technical tools to measure and understand SEO performance. One major tool is the analysis of web server log data.
In the “old days”, log analysis was central to SEO, though we no longer do as much log analysis today.
Difficulty with analytics data
The OnCrawl Log Analyzer impressed Detlef particularly because of how OnCrawl crosses log data with analytics data.
Analytics data shows you user information. However, the growing use of ad blockers means that many users block scripts, including analytics scripts. For some sites, this means that there are fewer and fewer hits visible in analytics dashboards. In essence, to protect personal data, search analytics has gone "dark": Google no longer gives us the keyword in the referring URL string… (DuckDuckGo still does!)
It’s been Google’s evil doing ever since its search went dark on us and provided tokenized referring strings instead of google?q=the+user+query https://t.co/BjwRnfR4eO
— Detlef Johnson (@deckarts) April 17, 2019
In the “old days”, SEOs would have packages based on log data with the stats of all the traffic to a website. Now, everyone’s been pushed by Google into using Google Analytics for the same purposes.
Advantage of log file analysis
Detlef thinks that log file analysis gives you a far greater amount of information, provided that you know how to use the tools to mine log files, whether using a BI tool like OnCrawl or on the terminal itself, using grep utilities to find what you're looking for. This may depend on your level as an SEO.
Abstracting away from your server into applications that sit on top of it (the website itself, Google Analytics with its scripts) has let SEOs' skills get rusty.
Log file analysis is as pertinent today as it was in the beginning. It’s incredibly useful, and even more so now with the advent of privacy and security concerns that lead to the use of ad blockers. This means that you’ll have a better view of your traffic by looking at logs rather than just Analytics.
Detlef uses Google Analytics in combination with log files.
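As an illustration of how accessible log mining can be, here is a minimal sketch that pulls search engine bot hits out of Apache-style combined log lines. The log lines are fabricated examples; this kind of traffic generally never appears in analytics dashboards, since bot visits don't fire analytics scripts:

```python
import re

# Fabricated Apache combined-format log lines for illustration.
LOG = """\
66.249.66.1 - - [17/Apr/2019:10:00:01 +0000] "GET /products HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.7 - - [17/Apr/2019:10:00:02 +0000] "GET /products HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows NT 10.0) Chrome/74.0"
66.249.66.1 - - [17/Apr/2019:10:00:03 +0000] "GET /old-page HTTP/1.1" 404 310 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

# Pull the request path and status code out of each line.
pattern = re.compile(r'"(?P<method>\w+) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

bot_hits = []
for line in LOG.splitlines():
    if "Googlebot" in line:
        m = pattern.search(line)
        if m:
            bot_hits.append((m.group("path"), m.group("status")))

print(bot_hits)  # which URLs Googlebot crawled, and with what status
```

Even this tiny example surfaces something analytics can't: Googlebot hitting a 404 on /old-page. A tool like the OnCrawl Log Analyzer does this at scale and crosses it with crawl and analytics data.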
Tips for technical SEOs
This will also allow you to understand how the major frameworks function, and how to advise people who want to rank a site that uses these technologies. This understanding will help you advise people on how to do SSR, whether it's universal (Angular) or HTML string (React)… It will allow you to talk the talk and walk developers who work in these frameworks through the process of indexing. It will also allow you to keep a sharp eye out for opportunities with Edge SEO, which allows you to do both basic and advanced SEO things "on the edge" without having to involve a developer directly, while still collaborating with the development team.
The more you become a developer yourself, the better off you will be.
If programming is not for you at all, you might want to keep your technical SEO to the basic level.
Favorite technical trick: Edge SEO
Cache or Edge SEO is one of Detlef’s current favorite things to have in his arsenal.
Least favorite SEO myth: number of H1s per page
The idea that you can only have one H1 per page is a myth–and one which Detlef might have had a hand in propagating in the early days. It’s perfectly fine to have more than one, and it might even be logical to have more than one. An optimized page in many cases will have only one, but having more than one does not spell disaster.
Why? Watch this: https://t.co/WWBoHOQUNr
— Barry Adams (@badams) November 8, 2018
Flash was destroyed by Steve Jobs not allowing it on the iOS platform. Flash had its own unique problems.
Hats off to Dan Taylor and all of the people on the cutting edge. Detlef had taken a hiatus from SEO, and upon returning didn’t see any innovation that stood out as much as Edge SEO.
Yes, you can certainly serve e-commerce product pages–if you want to.
AMP comes with inherent problems:
- You’re storing your content on Google cache
- Until April 2019, you couldn’t get your own URL
- Google's desperation to keep AMP from failing: Google moved forward with the web packaging (signed exchanges) specification before it had been approved as a standard in order to force the issue
- You may never get hits on your origin server despite visits to AMP pages
You as a publisher have to make a decision: does AMP add value or detract from your value? News sites, for example, really have to struggle with this decision.
As a webmaster, Detlef is not in favor of the inherent transfer of control of your content to Google, as a commercial enterprise.
SEO on any big site built on Angular or other dynamic frameworks requires advanced tactics. On a big site, you'll likely use logic to power your metadata and titles, etc. It becomes unwieldy if everything is totally static, so you'll end up using some language to use includes for advanced SEO.
When it comes to Angular or React, your SEO is driven by a database and the algorithm on the front-end.
Detlef doesn’t have a preference: he’ll use whatever the job requires.
The underlying issue is network distance: every request you make as a user of the internet passes through a series of hops, the nodes or switches between you and a server that may be across the world, and each round trip adds latency. Multiplexing, in a nutshell, lets multiple requests and responses share a single connection instead of paying that cost one request at a time.
Cache servers located near the user solve this issue. Cache servers contain a copy of the site as it generally is within a refresh cycle, so that it can be served directly, skipping all of the steps in the middle.
HTTP/2 gives us more facility here, since multiplexing is built into the protocol. Detlef thinks people should be moving to HTTP/2 as they can.
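A back-of-envelope sketch of the latency math makes the point; all numbers below are illustrative assumptions, not measurements:

```python
# Why multiplexing and edge caches matter. With one resource fetched
# per round trip to a distant origin, latency adds up; HTTP/2 lets
# requests share one connection and overlap, and an edge cache
# shortens the trip itself. All numbers are illustrative assumptions.
RTT_ORIGIN_MS = 100   # round trip to an origin server across the world
RTT_EDGE_MS = 10      # round trip to a nearby cache server
RESOURCES = 20        # scripts, styles and images on the page

one_at_a_time = RESOURCES * RTT_ORIGIN_MS   # serial fetches, worst case
multiplexed = RTT_ORIGIN_MS                 # all requests in flight at once (ideal)
edge_cached = RTT_EDGE_MS                   # multiplexed *and* served nearby

print(one_at_a_time, multiplexed, edge_cached)  # 2000 100 10
```

Real pages fall somewhere between the worst and ideal cases (browsers open parallel connections, and not every resource is cacheable), but the orders of magnitude explain why both HTTP/2 and edge caching pay off.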
SEO in Orbit went to space
If you missed our voyage to space on June 27th, catch it here and discover all of the tips we sent into space.