13th of September 2022 by Alex Mungo

Understanding technical SEO is essential if you want to ensure your web pages are structured for both humans and crawlers. It can make the difference between your website:

1. ranking well.

2. ranking not so well.

3. not ranking at all.

Obviously, we want our website to rank well, so while proper keyword research, great content, and on-page optimisation are crucial, you should also ensure your website meets all technical requirements.

What is Technical SEO?

Technical SEO is the process of optimising your website to meet technical requirements set by search engines so that your website ranks higher organically.

Laying The Foundations of Technical SEO

When laying the foundations of technical SEO make sure the website is:

1. easily understood by search engines.

2. fast at loading pages.

3. easy to crawl.   

This means you need to improve your website's structure, page loading speed, indexability, and crawlability.

Site Navigation and Structure

If your website’s content is not structured in a way that makes it easy for search engines to crawl and index, you may find that some of your web pages are being crawled but not indexed. This can have a detrimental effect on your overall ranking, and structural problems only get harder to fix the longer they are left.

Ideally, you want a flat website structure. This means content is organised in a pyramid under the homepage so that any page, even the deepest, can be reached in three to four clicks. A flat structure makes it easier for crawlers to discover every page and article, even on a large website with thousands of pages.
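One way to check how flat your structure actually is is to measure each page's click depth from the homepage. Below is a minimal Python sketch, assuming you have already crawled your site into an internal link graph (page URL to the pages it links to); the URLs shown are hypothetical:

```python
from collections import deque

def click_depths(link_graph, home="/"):
    """Breadth-first search from the homepage.

    Returns the minimum number of clicks needed to reach each page.
    Pages missing from the result are unreachable from the homepage.
    """
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical internal link graph for illustration
link_graph = {
    "/": ["/blog", "/services"],
    "/blog": ["/blog/technical-seo"],
    "/blog/technical-seo": [],
    "/services": [],
}
depths = click_depths(link_graph)
```

In a flat structure, no value in `depths` should exceed three or four; pages deeper than that (or absent entirely) are candidates for better internal linking.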

Site navigation and structure also underpin properly formatted sitemaps and the robots rules that keep bots away from pages you do not want crawled.

Website Speed

Google in particular takes website speed very seriously. According to the company, speed is an important part of the user experience, and because Google wants to send people to websites that offer a great experience, faster websites tend to rank higher.

Google is also pushing websites to provide a great experience for mobile users. It understands that the overwhelming majority of searches, especially for products and services, happen on mobile devices, and it wants those users to have a great experience on the websites they visit.

Numerous online tools can help you find out how fast your website is and what is slowing it down. Google's own PageSpeed Insights tool reports speed-related issues on your website and suggests how to rectify them.

Indexing & Crawlability

Search engines use robots, spiders, bots, or crawlers to discover content on your website. Site structure and navigation are crucial for getting your website crawled and indexed, but there are other things you need to think about.

First, you need an XML sitemap. This lists links to everything on your website, including images, textual content, audio files, and videos. Google even states that sitemaps are among its most important sources for discovering URLs.
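A bare-bones XML sitemap follows the sitemaps.org protocol and looks like this (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2022-09-13</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/technical-seo</loc>
  </url>
</urlset>
```

Each `<url>` entry needs only a `<loc>`; `<lastmod>` is optional but helps crawlers prioritise recently updated pages.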

Second, you can add an optional robots.txt file. This file tells bots which parts of your website they may crawl. You can tell them to ignore certain pages or skip certain resources, such as your JavaScript and CSS files. This file is very powerful, and you should only use it if you know what you are doing.
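A simple robots.txt might look like the following (the paths and sitemap URL are placeholders for illustration):

```text
# Apply to all crawlers
User-agent: *
# Keep bots out of the admin area
Disallow: /admin/
# Point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Note that `Disallow` only stops compliant bots from crawling a path; it does not by itself guarantee a page stays out of the index.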

Dead Links and Orphan Pages

Dead links hurt user experience because it is frustrating to follow a link only to land on a 404 page. Search engines do not like these pages either, and because they follow every link on a website, they find more dead links than users ever will. You should do a periodic audit to discover and remove all dead links on your website for a better user experience and ranking.
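The audit step can be sketched in Python. This is a minimal illustration, assuming you have already fetched each page's HTML and recorded the HTTP status code for every URL (a real audit would fetch those statuses over the network); the page content and statuses below are hypothetical:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href found in <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def dead_links(pages_html, statuses):
    """Return links that resolve to a client error (4xx), e.g. a 404 page.

    `statuses` maps each URL to the HTTP status recorded when fetching it.
    """
    found = set()
    for html in pages_html:
        parser = LinkExtractor()
        parser.feed(html)
        found.update(parser.links)
    return sorted(url for url in found if 400 <= statuses.get(url, 200) < 500)

# Hypothetical page content and fetch results
pages_html = ['<a href="/about">About</a> <a href="/old-post">Old post</a>']
statuses = {"/about": 200, "/old-post": 404}
broken = dead_links(pages_html, statuses)
```

Anything in `broken` is a link to fix: either update it to the page's new URL or remove it.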

Orphan pages are pages that do not have any links pointing to them. Because there are no links leading bots to these pages, they do not get crawled or indexed, and this is a serious issue, especially for websites with many orphan pages. Although orphan pages do not impact user experience, they impact crawlability, so you should audit your site and add internal links to every orphan page you find.
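Finding orphans can be as simple as comparing your sitemap against your internal link graph: any sitemap URL that nothing links to is an orphan. A minimal sketch, assuming you have both lists already (the URLs here are hypothetical):

```python
def orphan_pages(sitemap_urls, link_graph, home="/"):
    """Return sitemap URLs that no page links to.

    `link_graph` maps each page to the pages it links to.
    The homepage is treated as always reachable.
    """
    linked = {home}
    for targets in link_graph.values():
        linked.update(targets)
    return sorted(set(sitemap_urls) - linked)

# Hypothetical sitemap and link graph
sitemap_urls = ["/", "/blog", "/landing-page-2019"]
link_graph = {"/": ["/blog"], "/blog": []}
orphans = orphan_pages(sitemap_urls, link_graph)
```

Each URL returned needs at least one internal link added before crawlers will reliably discover it.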

Thin and Duplicate Content Issues

Google says that thin content is not penalised directly, yet pages on the first results page typically have over 1,500 words, which suggests thin content can still cause a page to rank poorly. A related issue that Google and other search engines take seriously is duplicate content. This can be the same content appearing at more than one URL on your own site (which mostly happens when using a CMS) or content duplicated across other websites.

Google may penalise you if it decides you are not the content's original author, so you should find and remove duplicates as soon as possible.

Google wants pages that rank highly to be helpful and provide a great user experience. They must also meet the requirements of modern search engines and browsers. Doing technical SEO properly will help you eliminate the issues that cause your website to rank poorly.

Alex Mungo

Founder of Go Mungo SEO, Alex has a passion for SEO and enjoys sharing his knowledge and experience through his writing.
