Even if you have the very best site with the best content, if your technical SEO is messed up you're not going to rank!
At the very core, search engines like Google should be able to find, crawl, render and index the pages of your website.
What is Technical SEO?
Technical SEO is the process of making sure that your website meets the technical requirements of modern search engines, which in turn helps improve your organic rankings.
Make sure you are on top of these factors for the best technical SEO results:
- XML Sitemaps
- robots.txt
- Mobile friendly (Responsive)
- URL Structure
- SSL / HTTPS Enabled
- Website Speed / Performance
- Optimized 404 Page
We'll go step by step, so let's get started without any delay...
Step 1: XML Sitemaps
If your website has content, it also has URLs, and search engines need to crawl those URLs to index them and display them in search results. But search engines need a source from which they can find those URLs.
Here is where XML sitemaps come into play. A sitemap.xml is a set of the URLs your website contains, which makes it easier for search engine crawlers to index them without any additional effort.
Normally, the sitemap for any website is located in its root folder, i.e. if www.stackmantle.com is your URL, then www.stackmantle.com/sitemap.xml would be your sitemap URL. If you do not have a sitemap at this URL, you need to create one and make it available there. If you're using WordPress, you can use the Yoast SEO plugin.
By default, search engines automatically look for sitemap.xml in the root folder. However, if you log in to tools like Google Search Console and Bing Webmaster Tools, you can provide a custom sitemap URL.
We would not recommend hosting sitemap.xml at a custom URL. Think about other search engines: you would end up creating accounts just to register your custom sitemap, and if you run an SEO audit with some tools, they will report that you do not have a sitemap.xml in your root folder / for your website.
Here are some sitemap.xml rules that you should follow:
- Always keep your sitemap.xml in your root directory
- A single sitemap.xml cannot contain more than 50,000 URLs
- You can split large sitemaps into separate categorized files and link them from a sitemap index
Take a look below for more details on the sitemap URL and the structure for splitting sitemap URLs across different files.
For more sitemap.xml rules, you may refer to the sitemap protocol.
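For reference, a minimal sitemap.xml looks something like this (the URLs and dates here are just placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.stackmantle.com/category/seo</loc>
    <lastmod>2021-01-01</lastmod>
  </url>
</urlset>
```

And when you split a large sitemap into categorized files, a sitemap index in the root directory ties them together:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.stackmantle.com/sitemap-posts.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.stackmantle.com/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>
```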
Step 2: robots.txt
By default, search engines crawl everything on your website, i.e. every accessible URL, asset, etc. Let's say there is a specific part of your website which you don't want search engines to index, such as your admin URLs or comment URLs; you can list it in robots.txt. Put correctly, it also helps with better crawl-traffic management.
Normally, robots.txt for any website is located in its root folder, i.e. if www.stackmantle.com is your URL, then www.stackmantle.com/robots.txt would be your robots.txt URL. If you do not have robots.txt at this URL, you need to create one and make it available there. If you're using WordPress, you can use the Yoast SEO plugin.
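A minimal robots.txt might look like this (the disallowed paths are just examples; adjust them to your own site):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /comments/

Sitemap: https://www.stackmantle.com/sitemap.xml
```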
Here are some robots.txt rules that you should follow:
- Always keep your robots.txt in your root directory
- You may put your sitemap URL in your robots.txt for better indexing
- The filename should be in lowercase (robots.txt); it is case sensitive
- You can specify what should be allowed to be indexed and what should not
For more robots.txt rules, you may refer to the robots.txt creation steps.
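As a quick sanity check, you can verify how crawlers would interpret your rules with Python's standard-library robots.txt parser (a minimal sketch; the rules and URLs below are just illustrative):

```python
from urllib.robotparser import RobotFileParser

# An example robots.txt, parsed from a string so no network access is needed
robots_txt = """User-agent: *
Disallow: /admin/

Sitemap: https://www.stackmantle.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Anything under /admin/ is blocked; everything else is crawlable
print(parser.can_fetch("*", "https://www.stackmantle.com/admin/login"))    # False
print(parser.can_fetch("*", "https://www.stackmantle.com/category/seo"))   # True
```

Running this against your own file before deploying it helps catch a typo that would accidentally block your whole site.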
Step 3: Mobile Friendly (Responsive)
Let's face it: over the past few years, the number of tablet and mobile users has increased drastically. Most people have handheld devices and want all their information at their fingertips. It's absolutely necessary to have a responsive website, i.e. a single codebase that works for devices of every size, where the code decides how to render on each device.
Ex: open StackMantle in desktop mode and in mobile mode. We haven't hard-coded it for different devices; instead, we used Bootstrap to make it responsive.
Search engines love responsive websites, and you also want your website to be responsive for your users. Keeping your website responsive not only makes your site open properly on devices of any size, it also lets search engines know that your website is user friendly and is OK for their users.
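If you're building pages yourself rather than using a framework like Bootstrap, the usual starting point is a viewport meta tag plus CSS media queries. A minimal sketch (the class name and breakpoint are just examples):

```html
<!-- Tell mobile browsers to render at the device width, not a zoomed-out desktop view -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .sidebar { float: right; width: 30%; }
  /* On narrow screens, stack the sidebar below the main content */
  @media (max-width: 600px) {
    .sidebar { float: none; width: 100%; }
  }
</style>
```

This is exactly what frameworks like Bootstrap do under the hood, just with a ready-made grid and breakpoints.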
Step 4: URL Structure
Yups, next comes your website's URL structure. No, I'm not talking about sitemaps, but about your website's URLs themselves: the URL that is created when you publish a post, a category or something else.
Let's take an example:
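For instance (these URLs are purely illustrative):

```
Friendly:   https://www.stackmantle.com/category/seo
Unfriendly: https://www.stackmantle.com/index.php?id=123&cat=7&ref=abc
```

The friendly URL tells both users and search engines what the page is about before they even open it.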
Here are some URL structure rules that you should follow:
- Always keep your URL structure friendly, because search engines expect it
- Keep your URLs short and sweet; lengthy URLs are not healthy for SEO
- Always keep at most 2 folder hops in your URL structure, i.e. after your base URL you should have "/" only two times. Ex: www.stackmantle.com/category/seo has only 2 "/", i.e. 2 folder hops, which makes it easy to understand and friendly
Step 5: SSL / HTTPS Enabled
Having a website and making it SEO optimized is fine, but what about SSL / HTTPS? You also have to make sure that you have a secure connection enabled for your website. It gives search engines confidence that your website is secure, and it gives your users confidence that their data transfers with your website are secure.
You should make sure that you 301-redirect HTTP to HTTPS for your website. If you are using an Apache server on Ubuntu, you can use Certbot to generate a free SSL certificate for your website. Moreover, Certbot issues free SSL certificates for any of your websites. You may click here and check the issuance steps for your server software and OS.
Here are some SSL / HTTPS rules that you should follow:
- Redirect all your URLs so that they load over https instead of http
- If you are using an Apache server, you can edit your .htaccess file or the Apache config to force https connections
- Never load URLs without https; it will hurt your rankings in search engines and your users might lose confidence in your website
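As a sketch, the HTTP-to-HTTPS 301 redirect on Apache can be done with a few .htaccess rules (this assumes mod_rewrite is enabled; test it on a staging copy before going live):

```apache
RewriteEngine On
# If the request did not arrive over HTTPS...
RewriteCond %{HTTPS} off
# ...send a permanent (301) redirect to the HTTPS version of the same URL
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

A 301 (permanent) rather than a 302 (temporary) redirect is what tells search engines to transfer the old URL's ranking signals to the https version.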
Step 6: Website Speed / Performance
Website speed / performance matters a lot to your website. As per Google, any website that loads within 3 seconds, or at least shows interactive content within 3 seconds, is in good shape. This not only covers your technical SEO aspects but also reduces your bounce rate.
Ask yourself: why would you wait for a website that takes 5 / 8 / 10 seconds to load? It just irritates you, right? Moreover, you then think of clicking on some other link in the search results.
You can use tools like Google PageSpeed Insights. StackMantle uses these tools too; PageSpeed Insights also suggests what you could improve to cover your SEO aspects.
It checks all aspects, like load time, responsiveness, paint time and first meaningful paint, and gives you a detailed analysis of what needs to be improved and where.
If you are using Google scripts like Google Analytics or Google AdSense, then the results might differ by 40 to 50%. The StackMantle team is working on this issue and trying to find an efficient solution for it. But until then, you may use the above tools to check your website's speed / performance.
Step 7: Optimized 404 Page
An optimized 404 page provides a proper gateway for your users if they land on content or a page that does not exist; moreover, it also helps search engines drop dead pages from their index. Here's how.
How does 404 page help search engines?
Let's say you have two URLs on your website, www.stackmantle.com/some-url and www.stackmantle.com/category/seo, which were indexed by search engines. For some reason you restructured your site, and www.stackmantle.com/some-url was replaced by some other URL or was deleted. The next time search engines access that URL to re-index it, they'll get a 404 status code from your server, which means the URL is no longer valid and needs to be removed from the index.
Another scenario: when a user lands on a broken link, instead of showing only a plain 404 page, you can show content such as posts / products they might be interested in, suggested posts / products, and what not.
You see, you can get some traffic from your 404 pages too. That's called playing smart :-)
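On Apache, a custom 404 page is typically wired up with an ErrorDocument directive (a sketch; the page path is just an example). The important part is that the server still returns a real 404 status code rather than a 200 (a "soft 404"), so search engines know to drop the URL:

```apache
# In .htaccess or the vhost config: serve this page for missing URLs
# while keeping the 404 status code in the response
ErrorDocument 404 /404.html
```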
I hope you loved this article and that it was able to add some value to your knowledge stack. More on the way; make sure you're subscribed for unique content like this.