
One of the most critical parts of every SEO campaign is technical SEO. In this article, we will help you get started with actionable items to optimize your websites from a technical perspective.

Imagine your website as a car: you need to build it properly so that it can run at all. If you want it to run fast with lower fuel consumption and less maintenance, you need to optimize the vehicle!

What is Technical SEO?

Technical SEO refers to optimizing your website's back-end and front-end code so that search engines can crawl, index, and rank your website more effectively.

Here are the two most important aspects of technical SEO:

  1. Crawling: how search bots access and collect data on your website;
  2. Indexing: how search engines store your pages and show them in their search results (SERPs). If your website is not in Google's index, no one can find it through Google Search.


Why is technical SEO so important?

A website with good technical bones helps search engines do their jobs more effectively and efficiently.

Imagine if your website is not even accessible to Google's bots: how do you expect it to rank on Google?

Moreover, a technically optimized website needs less ongoing SEO effort because it distributes link equity (link juice) more efficiently.

 

What websites need technical SEO?

Essentially every website needs technical SEO, especially when SEO is part of the brand's digital marketing strategy.

Large publishers and e-commerce stores with thousands of URLs need a different approach, while small business websites normally require less effort.

 

Technical SEO 101

In a technical SEO campaign, you need to consider the many aspects of your website that may affect how search engine spiders behave.


Accessibility

The first thing you want to check is whether your website is accessible to search engine bots, because every SEO effort that comes after will be wasted if Google can't access your website.

You can check this in your robots.txt file, a plain-text file on your server. It is found at the root of your domain, at this path: "/robots.txt".

For example, https://giantcreativeinc.com/robots.txt is where you can find our website’s robots.txt file.

This file is a "gatekeeper" that tells all web crawlers how they may access and crawl your web pages. You can block bots from crawling certain parts of your website with a few simple rules (directives).

So, you want to make sure that Google bots can crawl the pages that you want to rank.

You can use Google's robots.txt Tester to check whether Google's bots are blocked from accessing your content. You can also inspect the file manually if you understand the rules or want to learn more about them.
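For example, a minimal robots.txt might look like the sketch below. The blocked directory (a typical WordPress admin folder) and the sitemap URL are placeholders; adjust them to your own site:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://yourdomain.com/sitemap_index.xml

This tells all crawlers they may crawl the whole site except /wp-admin/ (with one exception for the admin-ajax.php endpoint) and points them to the XML sitemap.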

Discoverability

The second area you want to look at is whether your website content is discoverable or not.

What if your website is accessible, but search engine bots just CAN'T discover your content? As far as search engines are concerned, that content might as well not exist, and it will never be indexed.


Internal Links

You want to ensure that your website has a solid internal link structure so that crawlers can discover new URLs from the pages they already know.

One of the most common places to look at is your website’s navigation bar. You want to make sure that it contains links to all important pages on your website.

It's recommended that every web page be reachable within three clicks of your homepage. SEOs call this page depth, and it should be no more than three.

Tip: if you link to external websites, you can consider marking those links with rel="nofollow" (see the example below). This tells search bots not to follow the link to the other website, which helps prevent losing link juice. But if the external page is authoritative and helpful for your users, you should not mark it "nofollow".
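As a quick sketch, a nofollow link in HTML looks like this (the URL and anchor text are placeholders):

    <a href="https://example.com/some-resource" rel="nofollow">External resource</a>

Without the rel="nofollow" attribute, links are followed by default.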

XML sitemap (sitemap.xml)

Google recommends webmasters build and submit XML sitemaps for websites that are big (more than 500 URLs).

In practice, though, nearly every website nowadays has an XML sitemap to maximize discoverability, even if it's a microsite.

An XML sitemap normally resides at one of the following paths:

  • /sitemap.xml
  • /sitemap_index.xml

For example, https://yourdomain.com/sitemap_index.xml

XML sitemaps are created for search engines' consumption. They act as a guide that helps search engines crawl your website more effectively.

A typical XML sitemap contains links to all web pages and posts on a website.
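A bare-bones sitemap looks something like this (the URL and date below are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://yourdomain.com/sample-page/</loc>
        <lastmod>2023-01-15</lastmod>
      </url>
    </urlset>

Each <url> entry lists one page; <lastmod> is optional but can help crawlers prioritize fresh content.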

You can find various tools on the internet that generate an optimized sitemap file for you. However, the sitemaps these external tools produce are static, which means you have to regenerate them whenever you publish new content. That quickly becomes time-consuming and is not a good long-term strategy.

We recommend using a CMS (Content Management System) such as WordPress, because it creates and updates the sitemap automatically for you, which makes your life much easier.

Indexability

You need to ensure that Google and other search engines can index your web pages. If there are pages marked noindex or canonicalized to another URL, double-check that this is intentional.

Unless you have a specific reason, you should NOT see these code snippets in the <head> section of pages you want indexed:

  • <meta name="robots" content="noindex">: this tells search bots crawling the URL not to index it.
  • <link rel="canonical" href="https://yourdomain.com/another-webpage">: if the canonical link is not the current URL, you are telling search engines that this page is a duplicate of another one. Search engines normally won't index canonicalized URLs.

HTTP status codes

The next thing you want to check on your website is the HTTP status codes of internal and external URLs. In most cases, Google won’t index non-200 pages.

Here are three of the most common status codes:

404: Page not found

This means the URL cannot be found. The page may have been deleted, so it returns no content. This hurts UX, and it hurts SEO too.

301: Permanent Redirect

The URL that users or bots land on redirects permanently to another URL, either internally or externally. If the destination page is not what users expect, it can hurt UX significantly.
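For illustration, a permanent redirect response simply sends the visitor a Location header pointing to the new URL (the URL below is a placeholder):

    HTTP/1.1 301 Moved Permanently
    Location: https://yourdomain.com/new-page/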

200: OK

This is the status code that you want to see across your website. It means the page is OK, accessible, and able to serve content to users.

You can see a status code report for your web pages in Google Search Console, under the Coverage report.

Google finds these URLs because they are linked from somewhere on the internet, either via backlinks or via your own internal links.

However, in most cases, webmasters are not able to control natural backlinks, so they focus on internal link optimization.

You want to ensure that you place links to the correct destinations with the HTTP-200 status code.

For example, if there is a typo in your hyperlinks to /pahe/ instead of /page/, it will cause a broken internal link because /pahe/ doesn’t exist on your site.

You can use Screaming Frog or Ahrefs Webmaster Tools to find all broken internal links. Google Search Console is very limited in showing where a broken URL is referenced from, so it's not the best tool for this job.


Mobile Usability

Nowadays, mobile accounts for 58% of all Google searches, and Google has updated its algorithm to mobile-first indexing. There is no reason not to optimize your website for mobile devices.

First, make sure your website is accessible from mobile devices. Second, check font sizes, colors, buttons, loading speed, and so on to make sure the layout doesn't break on small screens.
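For example, a responsive page normally declares a viewport in its <head>, which lets the layout adapt to the device width (this is a general best practice rather than a rule specific to any one site):

    <meta name="viewport" content="width=device-width, initial-scale=1">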

Loading Speed

 

Images

The first place to look for speed optimization is your images. Most websites today use a lot of visual elements, and unoptimized images will slow your loading time tremendously.

Here are some quick tips to optimize your images:

  1. Compress your images (no more than 1MB each);
  2. Use next-gen formats like WebP instead of PNG, JPG, etc.;
  3. Assign width and height attributes to your images to avoid layout shift (CLS), as shown in the example below;
  4. Avoid using unnecessary images purely for design purposes.
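For tips 2 and 3, an image tag with a next-gen format and explicit dimensions might look like this (the file name and dimensions are placeholders):

    <img src="/images/hero.webp" alt="Technical SEO illustration" width="1200" height="630">

Declaring width and height lets the browser reserve space before the image loads, which prevents the layout from shifting.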

Cache

Caching can greatly improve your website's loading speed. It stores copies of a web page's files in temporary memory, either on the server or in visitors' browsers.

If you previously visited a web page, the next time when you return, that page will load faster.

Most web hosting offers server-side caching support. There are also many plugins available in WordPress for caching.
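As a minimal sketch, browser caching is typically controlled with a Cache-Control response header; the max-age value below (one year, in seconds) is just an illustration for static assets such as images or CSS:

    Cache-Control: max-age=31536000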

HTML / CSS / Javascript

If you are not familiar with web development, you would need help from web developers to optimize your codes.

If you use WordPress, you can easily install third-party plug-ins to help optimize your codes.

Schema markup

Schema markup is a snippet of code in the HTML document that communicates with search engines, giving them structured information about an entity on the page.

Google has become intelligent enough to understand a website without any schema markup, but it's still best practice to install the correct structured data snippet on your web pages.

Some of the most commonly used Schema types are:

  • Organization: telling search engines that your website represents an organization;
  • Local Business: telling search engines that your website represents a local business;
  • Article: telling search engines that the specific web page is an article.

There are also a handful of schema markup generators on the internet. For common use cases, you can visit Google Search Central to find examples of schema markup, or you can browse the biggest library of schema types at https://schema.org/.
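For instance, a minimal Organization markup in JSON-LD (all values below are placeholders) can be placed in a page's <head> like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Your Company Name",
      "url": "https://yourdomain.com/",
      "logo": "https://yourdomain.com/logo.png"
    }
    </script>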

Conclusion

You don’t need to be an expert in web development to become a technical SEO. However, understanding how search engines crawl and interact with websites is critical.

User experience also affects SEO, so you want to optimize your website not only for search engines but from a user’s perspective as well.
