I am a huge believer in link building and content. However, technical SEO is just as important: it is the foundation on which all other SEO efforts rest. Ensuring that search engines can easily crawl and index your site is essential.
Simply put, technical SEO refers to optimizing the technical aspects of a website to ensure that your site meets the technical requirements of search engines, ultimately with the goal of improving organic rankings.
Technical issues that may hinder this process range from broken internal links, incorrect use of robots.txt files, and poorly structured sitemaps to other factors such as site architecture, URL structure, and page speed.
Today, technical SEO can be offered as a standalone SEO service. You’ll be surprised at what a good technical SEO audit, combined with clear deliverables and action steps, can do for your business.
Nonetheless, to move the needle holistically for traffic, rankings, and sales conversions, you’ll still require link building, conversion rate optimization, and content creation.
So, Why is Technical SEO Important?
Impact on SEO Rankings
Technical SEO directly influences your site’s ability to rank in search engine results pages (SERPs). Issues like slow load times and non-responsive design can hurt rankings, as Google’s Core Web Vitals documentation makes clear.
User Experience and Conversion Rates
Technical SEO done well not only helps search engines crawl your site better, but it also enhances user experience by ensuring your site is fast, secure, and easy to navigate.
This keeps users engaged and can lead to higher conversion rates. For instance, if your site loads quickly and is easy to navigate, users are more likely to stay longer and interact with your content, potentially leading to increased sales or leads.
Here is a Basic Checklist for Technical SEO:
For the uninitiated, here’s a basic checklist you can run through for technical SEO without hiring an expert SEO consultant.
Site Structure
- Have you organized your site’s content in a logical, hierarchical manner?
- Ensure all important pages are easily accessible within a few clicks from the homepage.
- Create a clear and consistent navigation system.
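As an illustration, a logical hierarchy for a small business site might look like this (the pages are placeholders):

```
Homepage
├── Services
│   ├── Technical SEO
│   └── Link Building
├── Blog
│   └── Individual blog posts
├── About
└── Contact
```

Every important page sits within two clicks of the homepage.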
URL Structure
- Are you using clear, descriptive URLs that accurately reflect the content of each page?
- Avoid long and complex URLs with unnecessary parameters.
- Use short, keyword-rich URLs that clearly indicate the page content.
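To illustrate, compare a parameter-heavy URL with a clean, descriptive one (both URLs are made up):

```
Avoid:  https://www.example.com/index.php?id=132&cat=7&ref=nav
Prefer: https://www.example.com/services/technical-seo/
```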
XML Sitemaps
- Have you created and submitted XML sitemaps to search engines?
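If you want to see what one looks like under the hood, an XML sitemap is just a list of your URLs in a standard format. A minimal sketch (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/technical-seo/</loc>
    <lastmod>2024-05-15</lastmod>
  </url>
</urlset>
```

Most CMSs and SEO plugins generate this file automatically; you then submit the sitemap URL in Google Search Console.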
Page Speed
- Have you enhanced your site’s load times to improve user experience and search engine rankings?
- Use tools like Google’s PageSpeed Insights to identify and fix issues affecting your page speed.
- Optimize images, minify code, and leverage browser caching.
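As one concrete example of browser caching, on an Apache server you can set cache lifetimes in `.htaccess`. A minimal sketch (assumes the mod_expires module is enabled; the lifetimes are illustrative):

```apacheconf
# Browser caching via mod_expires (Apache)
<IfModule mod_expires.c>
  ExpiresActive On
  # Images rarely change: cache for a year
  ExpiresByType image/png "access plus 1 year"
  ExpiresByType image/webp "access plus 1 year"
  # CSS and JS change more often: cache for a month
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```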
Mobile Responsiveness
- Is your site optimized for mobile devices?
- Use Google’s Lighthouse (the standalone Mobile-Friendly Test has been retired) to check your site’s responsiveness and make necessary adjustments.
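One quick thing to verify: a responsive site should declare a viewport in the `<head>` of every page, otherwise mobile browsers fall back to rendering the desktop layout. A minimal sketch:

```html
<!-- Tells mobile browsers to scale pages to the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1" />
```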
Security Protocols
- Have you implemented HTTPS to secure user data?
- Secure your website with an SSL certificate to protect user information.
- Ensure that all pages on your site are accessible via HTTPS.
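To make sure every page resolves over HTTPS on an Apache server, a common approach is a rewrite rule in `.htaccess`. A minimal sketch (assumes mod_rewrite is enabled; other server stacks such as Nginx use different syntax):

```apacheconf
# Force HTTPS on all pages (Apache, requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTPS} off
# 301-redirect any plain-HTTP request to its HTTPS equivalent
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```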
Tools for Technical SEO
There are many complex tools you can use for technical SEO. However, the real value of technical SEO comes from aggregating data from different tools such as Google Search Console, Google Analytics, SEMrush/Ahrefs, and Screaming Frog.
Nonetheless, free tools like Screaming Frog and Google Search Console can be used for basic audits, and paid tools such as Sitebulb and SEMrush can complement them.
These tools can help you identify and resolve a range of issues, from broken links and thin content to slow loading times.
- Screaming Frog: Freemium tool for crawling websites and identifying technical issues.
- Google Search Console: Provides insights into indexing status and optimization suggestions.
- PageSpeed Insights: Analyzes page load performance and suggests fixes.
- Lighthouse: Chrome extension and DevTools audit covering performance and mobile-friendliness.
- SEMrush: Has its own site audit, though it is mainly used for keyword research.
Steps to Conduct a Technical SEO Audit
Check for XML Sitemaps
Ensure your sitemaps are correctly formatted and submitted. This helps search engines discover and index your content efficiently. Regularly updating and submitting your sitemaps ensures that search engines are aware of all your pages.
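Besides submitting the sitemap in Google Search Console, you can also reference it from your robots.txt so any crawler can discover it. A minimal sketch (the sitemap URL is a placeholder):

```
# robots.txt at the root of your domain
User-agent: *
Allow: /
Sitemap: https://www.example.com/sitemap.xml
```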
Check for Mobile Friendliness
Unfortunately, Google has sunset the Google Search Console Mobile Usability report, the Mobile-Friendly Test tool, and the Mobile-Friendly Test API. Nonetheless, Google is well aware of the growing number of mobile users: making sure your site is mobile-friendly remains key.
Today, you can use Google’s Lighthouse extension to check your site’s responsiveness and make necessary adjustments.
Check Page Speed
Page speed is a ranking factor, especially since Google introduced Core Web Vitals. Use tools like PageSpeed Insights to analyze and improve your loading times. Faster page speeds enhance user experience and can lead to better engagement and higher rankings.
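PageSpeed Insights runs Lighthouse under the hood. If you prefer testing locally, Lighthouse also ships as a command-line tool via npm (the URL is a placeholder):

```
# Install and run Lighthouse from the command line
npm install -g lighthouse
lighthouse https://www.example.com --view
```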
Check for Crawling, Indexing, and Rendering Issues
Indexing and Rendering
Ensure that search engines can render your site’s content correctly, particularly for JavaScript-heavy websites. Google Search Console’s Coverage Report is useful for this.
Secondly, proper indexing ensures that your important pages are visible in search results. You can keep important pages indexed by resolving issues like blocked resources or incorrect ‘noindex’ tags.
For non-technical business owners: a ‘noindex’ tag is one directive of what is called a robots meta tag.
A robots meta tag is an HTML element that gives search engines instructions on how to handle the crawling and indexing of a webpage.
It is inserted into the <head> section of a webpage and typically appears in this format:
<meta name="robots" content="noindex" />
This specific tag informs search engines not to index the page, ensuring it doesn’t appear in search results. On your end, make sure this tag is only implemented on pages that are actually meant to stay out of the index.
Canonicalization
Canonicalization is a big word, but it basically means this: when there are multiple versions of the same page, Google chooses only one to keep in its index.
The chosen version is called the canonical URL, and it is the one that will appear in search results.
Google uses various signals to determine the canonical URL:
- Canonical Tags
- Duplicate Pages
- Internal links
- Redirects
- Sitemap URLs
The simplest way to check how Google has indexed a page is by using the URL Inspection tool in Google Search Console. This tool will display the canonical URL that Google has selected.
Canonical issues are closely related to duplicate content issues.
If Google identifies duplicate content on your website without clear guidance, it might not index the page you prefer.
To address this, you can add a canonical tag (<link rel="canonical" href="…" />) in the HTML of each page. This tag informs search engines which version of the page is the primary one to be indexed.
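For example, if the same page is reachable at both a tracking-parameter URL and a clean URL, both versions can declare the clean URL as canonical. A minimal sketch (the URL is a placeholder):

```html
<head>
  <!-- Placed on every duplicate or variant version of the page -->
  <link rel="canonical" href="https://www.example.com/red-shoes/" />
</head>
```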
Crawl Budget: HTTP Errors
Yes, it actually costs search engines money to crawl your site, so there is merit in optimizing your site for search engine crawlers. This includes avoiding duplicate content and unnecessary pages.
HTTP errors, such as 404 and 410 pages, can significantly consume your crawl budget. You don’t want search engines to waste valuable resources crawling pages that don’t exist instead of indexing important content. Secondly, these errors also negatively impact user experience.
Hence, by fixing all 4xx and 5xx status codes, you not only optimize your crawl budget but also enhance the overall experience for your visitors.
This smoothly brings us to our next point: 301 redirects.
301 Redirects
A 301 redirect is a server-side action that permanently redirects users from one URL to another.
This type of redirect is crucial for SEO because it passes nearly all of the original page’s PageRank to the new page.
The new page retains the authority and ranking power of the original content, inheriting the search engine rankings and link equity of the old page.
The reason it’s commonly termed as “permanent” is straightforward: it tells search engines that the original URL is no longer in use and has been replaced by a new page. Consequently, Google and other search engines update their indexes to reflect this change, replacing the old URL with the new one.
When to Use 301 Redirects
A 301 redirect should be used when a page has been permanently removed or deleted from your site, but you want to preserve its traffic, rankings, and links.
Use Cases for 301 Redirects:
- Site Migration: If you’re moving your site to a new domain, a 301 redirect ensures a seamless transition.
- Multiple URL Access Points: If users can access your site through various URLs (e.g., `http://example.com/home`, `http://home.example.com`, or `http://www.example.com`), it’s best to choose a preferred URL and use 301 redirects to send traffic from the other URLs to this canonical destination.
- Website Mergers: When merging two websites, 301 redirects ensure that links to old URLs point to the correct pages.
- Broken Pages (404): Redirecting 404 pages to relevant content helps retain users and preserves link equity.
- Fixing Duplicate Content: Using 301 redirects can help resolve duplicate content issues by pointing duplicate pages to a single, canonical page.
How to Implement a 301 Redirect
The simplest way to implement a 301 redirect is by adding a rule to the `.htaccess` file located in the server’s root folder.
SEMrush has an excellent guide on how to do it.
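As a rough sketch, a single-page 301 on an Apache server can be as simple as one line (the paths and domain are placeholders; Nginx and other stacks use different syntax):

```apacheconf
# .htaccess in the server's root folder (Apache)
# Permanently redirect the old URL to its replacement
Redirect 301 /old-page/ https://www.example.com/new-page/
```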
Optimize Website Architecture and Internal Links
Flat Website Hierarchy
Establish a clear hierarchy and structure for easy navigation and indexing.
This includes using a consistent URL structure and organizing content logically. I recommend going with a flat website hierarchy, where every important page sits within a few clicks of the homepage.
The concept of topical maps was popularized by Koray Tuğberk GÜBÜR, who explains at length the technical intricacies (which can be extremely confusing) of topical maps and how search engines index a site.
I use an 80/20 rule, especially when it comes to local SEO:
- Create a basic topical map
- Use keyword research and map keywords to topics
Therefore, creating a topical map, mapping it to keyword research, and defining your website hierarchy helps you plan a website structure that is optimal for both users and search engines.
Internal Linking
Implement strategic internal linking to distribute page authority and help users navigate your site. This also aids search engines in understanding the relationship between your pages.
The number, anchor text, and quality of internal links to any given page can be audited using Screaming Frog.
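Anchor text matters because it tells search engines what the target page is about. A quick illustration (the link target is hypothetical):

```html
<!-- Vague: "Click here" says nothing about the target page -->
<a href="/services/technical-seo/">Click here</a>

<!-- Descriptive: reinforces the target page's topic -->
<a href="/services/technical-seo/">technical SEO audit services</a>
```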
I shall come up with an automated process for this eventually.
Delete/Audit Thin and Duplicate Content
Identify and enrich pages with thin content that offers little value to users. Tools like Screaming Frog and Sitebulb can audit your site and help identify duplicate content:
- Enriching thin content can improve user experience and boost your rankings.
- Use canonical tags for duplicate content and 301 redirects where necessary.
Summary: Technical SEO by a Singaporean-Led Team
Yes, most SEO experts overcomplicate the technical SEO audit process. However, our agency is in the process of building out a well-oiled technical SEO audit process.
This technical audit should cover:
- Technical audits
- Content audits
- Keyword research
- Content planning
This eliminates the need for multiple conflicting deliverables. The process should run on Google Sheets, making it easily accessible to clients and team members.
It should also aggregate data from multiple sources: Screaming Frog, Sitebulb, sitemaps, Google Search Console, SEMrush, and Google Analytics. The result is an all-in-one process that allows us to analyze and categorize each page, assigning technical or content actions as required.