Technical SEO has grown more complex for many reasons: the choice of hosting, the variety of website platforms, how often content changes, and even what Google expects to show on its results pages.
That’s why maintaining your website’s overall performance can be difficult without a technical SEO checklist, especially if you don’t have a technical background. So let’s go through the basics so you can do it yourself and understand the process. Even if you decide to hire an expert later, you’ll know what they did to your website and why. Remember, the most expensive SEO services are often the ones where the outcomes are unclear.
What Is Technical SEO?
If you’ve followed along this far, you should already have a budding sense of what this term covers and how far its implications reach. Put simply, technical SEO refers to the optimization of a website’s backend structure to improve its visibility and ranking on search engines.
Read more: What Will SEO Stand for in 2024
Contrary to what some might believe, technical SEO isn’t a novel concept that has recently sprung into existence. Its roots can be traced back years, underlining its enduring significance in the realm of digital optimization.
A clear example from 2002 involves researchers from the University of Nebraska-Lincoln. As mentioned in a Wired article, they looked into the issue of “link rot.” While creating graduate-level biochemistry courses, they noticed many hyperlinks were quickly disappearing, which led them to track the loss systematically. Within 20 months, they discovered that nearly one-fifth of their links were gone. This early study of how easily digital content can disappear highlights the basic challenges that have shaped today’s understanding of technical SEO.
Essential Technical SEO Terms to Understand
While there’s a vast lexicon associated with technical SEO, we’ll zero in on the most pivotal terms. By optimizing, adding, or upgrading these components, you can confidently expect a lift in organic traffic.
- Page Performance & Core Web Vitals
- Website Navigation and Crawl Depth
- URL Structure and Breadcrumbs
- Crawl Errors
- Dead Links
- Schema Markup and Rich Snippets
- XML Sitemaps and Crawl Budget
- robots.txt
Technical SEO Checklist
At the heart of SEO is content – it represents our business, our value proposition, and what we bring to the table for our customers. With a solid foundation in content, everything else tends to fall seamlessly into place.
We’ll use content as the starting point and show you how technical SEO wraps around it. Technical SEO isn’t there to put your content in chains; it’s there to help your content reach the target audience and communicate your message clearly.
And regardless of whether you are running an SEO campaign for an established brand or working on SEO for a startup, the same rules apply.
Page Performance & Core Web Vitals
Page performance is not the same as raw speed. The emphasis is on the user experience, and the objective is a website that loads smoothly even on a 4G network. This is where Core Web Vitals become essential.
Visualize a potential customer relaxing on the beach, browsing your site on a 4G connection; the website should load without hitches. The initial content should appear on the page within roughly the first two seconds (Google treats 1.8 seconds or less as “good”). For the user, this means they won’t need to return to Google to find another site; they’ll see that your site offers what they were searching for. The time it takes for that first piece of content to render is known as First Contentful Paint (FCP).
Once the FCP is in place, we need to offer more, and quickly.
Thus, we address First Input Delay. First Input Delay (FID) measures the time it takes for a website to respond when a user first interacts with it. In simpler terms, it’s the delay between when you click or tap something on a webpage and when the website actually starts doing something in response. A shorter FID means the website feels more responsive and quick to users.
Note: In March 2024, INP will take the place of FID in the Core Web Vitals. INP (Interaction to Next Paint) checks how quickly a webpage reacts when you click, tap, or type on it. It looks at all interactions during your visit and reports close to the slowest one, while ignoring extreme outliers. When you see a group INP value in a report, it means that for 75% of visits to that page (or group of pages), interactions were at least that fast.
Finally, we address Largest Contentful Paint. Largest Contentful Paint (LCP) tracks how long it takes for the main, most prominent content element (such as a large image or block of text) to load and display when a page is opened. This metric tells you how long users wait before they see the page’s main content.
It’s crucial that FCP, FID (soon INP), and LCP are all optimal. The indicator of success? A consistent 28-day average score. Occasional minor issues on your site, be it from a suboptimal plugin or hosting hiccups, won’t be harshly judged by Google. What’s essential is maintaining a favorable 28-day average.
Bonus tip – alongside the loading and responsiveness metrics, make sure Cumulative Layout Shift (CLS), the third Core Web Vital, is also good (green). CLS quantifies how much webpage content unexpectedly shifts or “jumps” during loading, and it matters for your SEO.
Consider reading an online article. Just as you’re about to tap a link or move to the next section, the layout shifts because of a late-loading image or ad. That sudden movement is annoying, right? CLS captures this experience by assigning a value to the unexpected shifts on a page.
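If you want to see these numbers from real visitors rather than from lab tests alone, a small snippet in the page’s JavaScript can collect them. Below is a minimal sketch, assuming the `web-vitals` npm package from the Chrome team is installed and that a hypothetical `/analytics` endpoint exists to receive the data:

```typescript
// Minimal field-measurement sketch using the `web-vitals` package (assumed installed).
// Each callback fires in the visitor's browser once the metric is known.
import { onCLS, onFCP, onINP, onLCP } from "web-vitals";

// Hypothetical collection endpoint; swap in your own analytics pipeline.
function report(metric: { name: string; value: number; rating: string }) {
  navigator.sendBeacon("/analytics", JSON.stringify(metric));
}

onFCP(report); // First Contentful Paint
onLCP(report); // Largest Contentful Paint
onINP(report); // Interaction to Next Paint (replaces FID in March 2024)
onCLS(report); // Cumulative Layout Shift
```

The 28-day figures you see in Search Console and PageSpeed Insights come from Chrome’s field data, so collecting the same metrics yourself makes it easier to spot a regression before it shows up there.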
To keep these metrics in the green, start with a reliable hosting choice like WPEngine for WordPress or WooCommerce users, or a platform like Shopify. The second factor is how the website is built. Unless you opt for a custom-built solution, using various apps and plugins is inevitable. A basic guideline: if an app doesn’t improve performance or conversion rates, avoid it. This reduces maintenance and boosts page performance.
Read also: WordPress SEO Tutorial
Website Navigation and Crawl Depth
The 3-click rule suggests that users should be able to find any information on a website within just three clicks or less from the homepage. If it takes more than three clicks, the design might be too complicated or not user-friendly.
However, it’s worth noting that even if a website has a deeper structure, where it might take more than three clicks to reach a certain page, Google recognizes the value of larger websites. It dedicates more resources to crawling bigger sites, even when their crawl depth often goes beyond three levels. And visitors don’t necessarily see browsing through a large catalog as a downside.
So, while the depth of a website is important for user experience and rankings, there are clear exceptions to the 3-click rule.
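If you’d like a rough picture of how deep your own pages sit, the sketch below runs a breadth-first crawl from the homepage and records how many clicks away each discovered URL is. It assumes Node 18+ (for the built-in fetch), uses example.com as a placeholder domain, and relies on a deliberately naive regex for link extraction, so treat the output as an approximation rather than a replacement for a dedicated crawler:

```typescript
// Rough crawl-depth estimate: breadth-first crawl from the homepage,
// recording at which click depth each internal URL is first discovered.
const ORIGIN = "https://www.example.com"; // placeholder domain

async function crawlDepths(maxDepth = 4, maxPages = 500): Promise<Map<string, number>> {
  const depths = new Map<string, number>([[`${ORIGIN}/`, 0]]);
  let frontier = [`${ORIGIN}/`];

  for (let depth = 1; depth <= maxDepth && frontier.length > 0; depth++) {
    const next: string[] = [];
    for (const url of frontier) {
      if (depths.size >= maxPages) break;
      let html: string;
      try {
        const res = await fetch(url);
        if (!res.ok || !(res.headers.get("content-type") ?? "").includes("text/html")) continue;
        html = await res.text();
      } catch {
        continue; // unreachable page; a real audit would log this as an error
      }
      // Naive href extraction; a production crawler would use a real HTML parser.
      for (const match of html.matchAll(/href="([^"#?]+)"/g)) {
        try {
          const link = new URL(match[1], url).toString();
          if (link.startsWith(ORIGIN) && !depths.has(link)) {
            depths.set(link, depth);
            next.push(link);
          }
        } catch {
          // skip malformed hrefs
        }
      }
    }
    frontier = next;
  }
  return depths;
}

crawlDepths().then((depths) => {
  const deep = [...depths].filter(([, d]) => d > 3);
  console.log(`${deep.length} of ${depths.size} discovered URLs sit more than 3 clicks deep`);
});
```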
URL Structure and Breadcrumbs
When possible and if the platform permits, align your URLs with your breadcrumbs.
Let’s take Decathlon as a simple example.
- Page Name: Bike Accessories
- URL: /sports/cycling/bike-accessories/
- Breadcrumb Path: Home > Sports > Cycling > Bike Accessories
This is a great example of a clean URL that mirrors the page’s position in the site. If you’re not convinced, just check the first page of Google for “bike accessories” and see their listing.
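You can also expose that same path to search engines as BreadcrumbList structured data. Here’s a minimal sketch in the page’s JavaScript; the example.com URLs are placeholders that simply mirror the path above:

```typescript
// BreadcrumbList structured data mirroring the Home > Sports > Cycling > Bike Accessories path.
const breadcrumbSchema = {
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  itemListElement: [
    { "@type": "ListItem", position: 1, name: "Home", item: "https://www.example.com/" },
    { "@type": "ListItem", position: 2, name: "Sports", item: "https://www.example.com/sports/" },
    { "@type": "ListItem", position: 3, name: "Cycling", item: "https://www.example.com/sports/cycling/" },
    { "@type": "ListItem", position: 4, name: "Bike Accessories", item: "https://www.example.com/sports/cycling/bike-accessories/" },
  ],
};

// Inject it as JSON-LD so crawlers can read the same path the visitor sees in the breadcrumb.
const script = document.createElement("script");
script.type = "application/ld+json";
script.textContent = JSON.stringify(breadcrumbSchema);
document.head.appendChild(script);
```

Many platforms and SEO plugins can emit this markup for you, so check before adding it by hand.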
Crawl Errors
Use Google Search Console to identify and resolve crawl errors. Until they’re fixed, Google will keep requesting those outdated URLs. The risk? Potential visitors may do the same. Consider the scenario where an externally linked URL you’ve removed sends a user to a 404 page. That damages not only SEO but your brand image as well.
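One common way to handle a removed URL that still attracts links is a permanent (301) redirect to the closest living page. The sketch below shows the idea with a bare Node HTTP server; the paths are hypothetical, and on a real site you would normally configure this in your CMS, web server, or CDN instead:

```typescript
import { createServer } from "node:http";

// Hypothetical mapping of removed URLs to their closest living replacements.
const redirects: Record<string, string> = {
  "/old-bike-accessories/": "/sports/cycling/bike-accessories/",
  "/summer-sale-2022/": "/sale/",
};

createServer((req, res) => {
  const target = req.url ? redirects[req.url] : undefined;
  if (target) {
    res.writeHead(301, { Location: target }); // permanent redirect passes most link value along
    res.end();
    return;
  }
  res.writeHead(404, { "Content-Type": "text/plain" });
  res.end("Not found"); // in production, serve a helpful 404 page here
}).listen(3000);
```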
Dead Links
Whether internal or external, dead links compromise the browsing experience. Use tools like Screaming Frog to identify and fix all dead links.
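If you just want a quick spot-check between full crawls, a few lines of Node 18+ (built-in fetch) can do it; the URL list below is purely hypothetical:

```typescript
// Quick dead-link spot-check: report any URL that errors out or returns a 4xx/5xx status.
const linksToCheck = [
  "https://www.example.com/old-landing-page/",
  "https://www.example.com/blog/seo-basics/",
];

async function findDeadLinks(urls: string[]): Promise<string[]> {
  const dead: string[] = [];
  for (const url of urls) {
    try {
      // HEAD keeps it light; some servers reject HEAD, so fall back to GET if needed.
      const res = await fetch(url, { method: "HEAD", redirect: "follow" });
      if (res.status >= 400) dead.push(`${url} -> ${res.status}`);
    } catch {
      dead.push(`${url} -> unreachable`);
    }
  }
  return dead;
}

findDeadLinks(linksToCheck).then((dead) => dead.forEach((entry) => console.log("Dead link:", entry)));
```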
Schema Markup and Rich Snippets
Using schema markup on your website is a way to give search engines more details about your content. Think of it as giving clues to search engines so they can display your website’s information in a more attractive and informative way. For example, if you have a movie website, you can use a movie snippet to show key details like cast and release date.
If you’re selling products, product snippets can highlight prices, availability, and ratings. For food bloggers, recipes can be enhanced with cooking times, ingredients, and even images. Websites with frequently asked questions can benefit from FAQ rich results, showing answers directly in the search results.
There’s also the Knowledge Card that provides a quick snapshot of important info, and the Knowledge Graph that presents connected information on a topic. And if your website collects reviews, showcasing star ratings and customer feedback can really make your website stand out.
Pro tip: Use the Schema Markup Validator tool to verify the schema markup on your pages.
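To make the idea concrete, here’s a hedged sketch of Product structured data with placeholder values; you could log it and paste the output into the validator mentioned above, or embed it in a `<script type="application/ld+json">` tag:

```typescript
// Product structured data with placeholder values (name, price, and ratings are illustrative only).
const productSchema = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Clip-On Bike Light",
  image: "https://www.example.com/images/bike-light.jpg",
  description: "A rechargeable clip-on light for city cycling.",
  offers: {
    "@type": "Offer",
    price: "19.99",
    priceCurrency: "EUR",
    availability: "https://schema.org/InStock",
  },
  aggregateRating: {
    "@type": "AggregateRating",
    ratingValue: "4.6",
    reviewCount: "128",
  },
};

// Print the JSON-LD so it can be pasted into the Schema Markup Validator or a page template.
console.log(JSON.stringify(productSchema, null, 2));
```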
XML Sitemap and Crawl Budget
An XML sitemap is like a roadmap for search engines, showing them which pages on your website you want to appear in search results (often called SERPs).
Think of it this way: if you have a book and only certain pages are important for someone to read, you’d mark those pages. Similarly, with a sitemap, you’re marking the important pages of your website for search engines. If there’s a page you don’t want to appear in search results, you shouldn’t include it in the sitemap. On the other hand, if you want a page to be easily found by search engines, make sure it’s in the sitemap.
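Most platforms generate a sitemap automatically, but to show what the file boils down to, here’s a minimal sketch that writes one by hand; the URLs and dates are placeholders, and in practice you would pull the page list from your CMS:

```typescript
import { writeFileSync } from "node:fs";

// Placeholder page list; in practice this would come from your CMS or router.
const pages = [
  { loc: "https://www.example.com/", lastmod: "2024-01-15" },
  { loc: "https://www.example.com/sports/cycling/bike-accessories/", lastmod: "2024-01-10" },
];

// Build the minimal XML a sitemap needs: one <url> entry per page you want indexed.
const sitemap =
  `<?xml version="1.0" encoding="UTF-8"?>\n` +
  `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
  pages
    .map((p) => `  <url>\n    <loc>${p.loc}</loc>\n    <lastmod>${p.lastmod}</lastmod>\n  </url>`)
    .join("\n") +
  `\n</urlset>\n`;

writeFileSync("sitemap.xml", sitemap); // serve this from your site root and reference it in robots.txt
```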
robots.txt
Robots.txt is like a doorman for your website. It tells search engines which pages they can enter and which they can’t. For technical SEO, it’s important because it helps guide search engines to the important parts of your site and keeps them away from areas you don’t want them to see. By using robots.txt correctly, you make sure search engines understand and show your website the way you want them to.
Some common types of pages that are often disallowed include:
- Admin pages: These are internal pages used for managing a website, and there’s no reason for them to appear in search results.
- User profiles: On some websites, especially community or membership sites, individual user profiles might be kept private from search engines.
- Search result pages: These are dynamic pages created when users search for something on a site. Indexing them can lead to a lot of redundant pages in search results.
- Temporary pages: Pages that are under construction or are used for testing might be disallowed until they’re ready.
- Sensitive content: Pages that contain confidential or sensitive information.
- Duplicate content: If there are multiple versions of the same page, it’s common to disallow the duplicates to prevent SEO issues.
- Files and directories: Sometimes, it’s not just pages but entire directories, or specific file types (like PDFs), that site owners might not want search engines to index.
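Pulling a few of those categories together, here’s a hedged sketch that writes out a simple robots.txt; every path is a placeholder, so adjust the rules to your own site structure (and note that wildcard patterns like `*.pdf$` are honored by Google but not by every crawler):

```typescript
import { writeFileSync } from "node:fs";

// Placeholder robots.txt covering a few of the categories listed above.
const robotsTxt = `
User-agent: *
Disallow: /wp-admin/      # admin pages
Disallow: /members/       # user profiles
Disallow: /search         # on-site search result pages
Disallow: /staging/       # temporary or test pages
Disallow: /*.pdf$         # a specific file type (Google-style wildcard)

Sitemap: https://www.example.com/sitemap.xml
`.trimStart();

writeFileSync("robots.txt", robotsTxt); // must live at the root, e.g. https://www.example.com/robots.txt
```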
Remember, just because a page is disallowed in the robots.txt file doesn’t mean it’s hidden from the world. It only means search engines are asked not to crawl it. If someone has the direct link, they can still access the page. If true security or privacy is required, other measures, such as password protection, should be taken.
Conclusion
Why is all this crucial? Google prioritizes sites with robust technical SEO, from FCP to functional links. If it didn’t, users would keep landing on sites that offer a subpar experience and might drift to alternative search engines. That’s precisely why Google rewards sites that get technical SEO right.
Noticed 50 dead links or lagging page speed? Don’t panic. Google won’t immediately penalize you. Just ensure you address the issue promptly rather than letting it linger.
Prepare for the future with our 2024-focused Technical SEO Checklist. For comprehensive guidance and expert strategies, our SEO Consulting services are here to elevate your website’s potential. Are you ready to take your site to the next level?