Gabriel Both

SEO Manager at Hike SEO

In this video, we’ll discuss Technical SEO and how you can identify and resolve the bottlenecks hindering your website’s ranking and organic traffic. You can watch the video or read the text summary below:

Here’s an overview of what we will cover:

  1. Definition and importance of Technical SEO.
  2. Anatomy of a webpage and its elements.
  3. The hierarchy of Technical SEO, explaining different levels of importance.
  4. Site Structure and navigation.
  5. Crawling, loading, and indexing elements.
  6. Identification and rectification of duplicate and thin content.
  7. Importance of page loading speed and other key technical aspects.

Technical SEO is essentially optimizing the technical aspects of a website to improve its search engine ranking and visibility. This includes server configuration, site code, and site structure, aiming to make your website user-friendly and discoverable by search engines.

The importance of Technical SEO lies in its ability to improve your site’s credibility, security, mobile-friendliness, and ultimately visibility. It lays a solid foundation on which your content can be crawled, indexed, and ranked effectively.

Three crucial elements form a webpage – HTML, CSS, and JavaScript. HTML is the Hypertext Markup Language that creates context around content, helping browsers and search engines understand it. CSS or Cascading Style Sheets style your website visually, determining the appearance of HTML elements and fonts. Lastly, JavaScript determines how the elements on a website behave when users interact with them.
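
As a minimal sketch of how the three layers fit together (the page content here is invented purely for illustration):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <title>Technical SEO Basics</title>
  <!-- CSS: controls how the content looks -->
  <style>
    h1 { font-family: Georgia, serif; color: #1a1a2e; }
  </style>
</head>
<body>
  <!-- HTML: gives the content structure and meaning -->
  <h1>Technical SEO Basics</h1>
  <button id="menu-toggle">Menu</button>
  <!-- JavaScript: controls how elements behave when users interact -->
  <script>
    document.getElementById('menu-toggle')
      .addEventListener('click', () => alert('Menu opened'));
  </script>
</body>
</html>
```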

Diving into the hierarchy of Technical SEO, crawlability is at the bottom, highlighting the importance of your site’s discoverability by search engines. Indexability follows, determining how well your pages are set up to be indexed. Accessibility, rankability, and clickability form the top layers, all contributing to your website’s SEO performance.

Site structure and navigation are crucial aspects of Technical SEO. A well-structured website is easy for users to navigate and allows link equity to flow smoothly between pages. Tools like the Hike platform offer insights into your site structure and how your pages are connected.

Breadcrumbs and pagination are essential elements for an optimized site structure. Breadcrumbs are navigational structures that guide users and search engines through your site, while pagination helps divide content into multiple pages, improving user experience and page load times.
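
A breadcrumb trail is usually just a short block of HTML links, one per level of the site hierarchy. A minimal sketch (the page names are invented):

```html
<!-- Breadcrumb trail: each link steps one level up the site hierarchy -->
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/blog/">Blog</a> &gt;
  <span>Technical SEO Checklist</span>
</nav>
```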

Lastly, a consistent URL structure helps users and search engines identify the page content. URLs should include the page’s primary keyword, avoid filler words, stay short, and follow a logical folder structure so they are friendly to both users and search engines.
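
For example (both URLs are hypothetical):

```
Harder to read:  example.com/index.php?p=8823&cat=2
Keyword-rich:    example.com/blog/technical-seo-checklist
```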

Crawling, Loading and Indexing

This section focuses on the ease with which a website can be explored by search engine bots to discover new pages. The loading speed of a web page is crucial. If the page loading speed is slow, the crawler may time out, leading to incomplete loading or skipping of the page. It is essential to know which pages are indexable by default and which are blocked, as blocked pages will not show up in search results.

Google Search Console (GSC) Usage

GSC is used to identify issues and check for crawl errors. The report shows the pages that Google was unable to crawl, which prevents them from being indexed. These errors can be identified and rectified via the Page Indexing report in Google Search Console.

URL Inspection Tool

This tool checks the status of a specific page. By pasting the URL into the search bar in the URL inspection section, it is possible to determine if a page is indexable or blocked.

Crawl Stats

These stats provide information about how frequently Google crawls your website. By going to Settings and clicking “Crawl stats”, you can see the number of crawl requests, total download size, average response time, and other performance metrics for your site.

Website Crawling

Google will automatically crawl a public website. It is recommended to have an account with Hike, an SEO tool that can crawl your site and flag any technical issues that require attention.

XML Sitemaps

XML sitemaps are essential for search engines to easily discover and crawl new, updated, or removed pages on your site. Most CMS platforms such as WordPress, Squarespace, and Wix automatically create and update XML sitemaps.
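
If you ever need to build one by hand, a minimal sitemap follows the sitemaps.org protocol. A sketch (the URL and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/technical-seo-checklist/</loc>
    <!-- lastmod tells crawlers when the page last changed -->
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```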

Sitemap Index Pages

These are higher-level pages that contain multiple sitemaps. By submitting these to Google Search Console, you can direct Google to crawl them promptly.
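
A sitemap index is effectively a sitemap of sitemaps. A sketch (the filenames are placeholders; WordPress SEO plugins generate similar files automatically):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each entry points to a child sitemap, not to individual pages -->
  <sitemap>
    <loc>https://www.example.com/post-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/page-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```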

Robots.txt File

This file contains rules for how search engines should crawl your website. It should ideally link to your XML sitemap. The rules may include which spiders are allowed, how quickly they can crawl, and which directories or pages they can or can’t crawl.
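
A typical robots.txt might look like this (the disallowed directory and sitemap URL are placeholders for your own):

```
# Rules below apply to all crawlers ("*"); individual bots can be targeted by name
User-agent: *
# Keep crawlers out of admin pages
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap_index.xml
```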

Internal Linking

Internal linking allows web crawlers to discover new pages and build a map of the relationships between page topics. It also aids users in discovering new, relevant content.
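
An internal link is a standard anchor tag; descriptive anchor text (rather than “click here”) tells crawlers what the target page is about. A sketch (the path and anchor text are invented):

```html
<!-- Descriptive anchor text helps crawlers understand the linked page -->
<p>For a full walkthrough, see our
  <a href="/blog/technical-seo-checklist/">technical SEO checklist</a>.</p>
```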

Fixing Dead Links

Fixing dead links is essential, as they create a negative user experience. By creating a 301 permanent redirect to the next most relevant page, dead links can be fixed effectively.
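
The exact mechanism depends on your server or CMS. As one example, on an Apache server a single .htaccess rule creates the redirect (both paths below are placeholders):

```apache
# 301 (permanent) redirect from a removed page to the closest relevant page
Redirect 301 /old-service/ https://www.example.com/services/
```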

Redirect Chains

Redirect chains occur when redirected pages are redirected again. These chains cause slower page load times, increased server load, poor user experience, decreased crawl performance, and reduced link equity. The goal is to remove the middle pages in the redirect chain.
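
In other words, every redirect should point straight at the final destination (URLs invented for illustration):

```
Before:  /old-page → /interim-page → /final-page   (two hops per visit)
After:   /old-page     → /final-page               (one hop)
         /interim-page → /final-page
```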

Duplicate or Thin Content

Google aims to provide a high-quality user experience, and duplicate or thin content can degrade that experience. It is important to provide unique, useful, and original content. Duplicate pages have identical or highly similar content, while thin content refers to pages with minimal or no content.

Noindex Pages

Adding a noindex tag to pages that should not appear in search results can be useful. However, it should not be used as a substitute for canonical tags.
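
Both tags live in the page’s head section. A sketch (the canonical URL is a placeholder):

```html
<!-- noindex: keep this page out of search results, but still follow its links -->
<meta name="robots" content="noindex, follow">

<!-- canonical: for duplicate pages, point instead at the preferred version -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```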

Avoiding Thin Content

Every page should have some unique, useful content on it. Thin or empty pages should be avoided as they can hinder the user experience and obstruct search engines’ understanding of the pages.

Page Loading Speed

This refers to how quickly a webpage loads, which could range from one second to five or more. The slower the loading speed, the worse the experience for users and web crawlers. Slow loading may cause web crawlers to time out, resulting in skipped pages that never get indexed. Google’s free PageSpeed Insights tool can be used to check a website’s loading speed and identify improvements.

Image Optimization

Images should be in a web-friendly format like JPEG or next-gen formats like WebP for quicker loading. Images should be resized to fit the purpose on the page, ensuring they aren’t excessively high resolution. Also, the width and height attributes should be defined to prevent layout shifting issues during page loading.
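
For example (the file name and dimensions are invented):

```html
<!-- Explicit width and height let the browser reserve space,
     preventing layout shift while the image loads -->
<img src="/images/site-structure-diagram.webp"
     alt="Diagram of a three-level site structure"
     width="800" height="450">
```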

Page Size Optimization

Alongside image optimization, review third-party scripts and unused plugins, as they can slow down page loading. Use caching plugins like W3 Total Cache if you’re using WordPress.

Content Delivery Network (CDN)

A CDN speeds up loading by serving copies of your site from multiple servers around the world, leading to faster load times and added security. Managed platforms like Wix, Squarespace, and WordPress.com have CDNs built in.

Reduce Third-Party Scripts

Fewer scripts mean faster loading times. Be selective and consolidate functionality into a single platform or plugin where possible. Google Tag Manager can help manage scripts from one place, so you don’t clutter your site with too many of them.

Other technical SEO aspects to consider:

Hreflang Tags for International Websites

Hreflang tags inform search engines about the language or country your content is intended for, ensuring it appears in the right localised search results. They also help prevent duplicate content issues between regional versions of a page.
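
Hreflang tags are link elements in the page’s head, one per language/region variant, plus an x-default fallback. A sketch (all URLs are placeholders):

```html
<!-- Each page lists every language/region variant, including itself -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
<link rel="alternate" hreflang="de" href="https://www.example.com/de/" />
<!-- x-default is the fallback for users who match no listed variant -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```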

Fix 404 Links

Broken or dead links create a negative user experience. These should be redirected to the most relevant page.

Structured Data

Structured data is machine-readable information about your content that tells search engines what it is about and can enhance how your pages appear in search results. Common vocabularies and formats include schema.org (often written as JSON-LD) and Open Graph tags.
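
As a minimal sketch, JSON-LD structured data is a small script in the page’s head (the headline and date below are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Checklist",
  "author": { "@type": "Person", "name": "Gabriel Both" },
  "datePublished": "2024-01-15"
}
</script>
```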

XML Sitemap Validation

For custom-built XML sitemaps, it’s crucial to ensure they follow the sitemap protocol’s requirements and validate correctly.

Noindex Tag and Category Pages

CMS platforms like WordPress automatically create new pages for tags and categories. These pages often lack valuable information and can create unnecessary clutter. Using plugins like Rank Math or Yoast SEO, these pages can be “noindexed” to keep them out of search engine results.

Page Experience Signals

These affect user experience directly. Intrusive interstitials, such as pop-ups the user didn’t ask for, should be avoided. Ensure your page is mobile-friendly and served over a secure SSL connection.

Technical SEO issues are flagged as Hike actions, so it’s important to check for and fix them regularly using platforms like hikeseo.co.

For any further queries or assistance, feel free to reach out!
