The Technical Side of SEO

Things to Check Your Developer is Doing to Improve Search Engine Ranking

Previously we published an article about what SEO is and how you can increase your SEO ranking with non-technical steps. Now comes the technical part: sometimes, even after following the steps in our SEO guide, your search engine ranking won’t improve, and that comes down to how your website was built.

Typically, if your website lacks a sitemap, if your meta tags are wrong or underused, or if it simply relies too heavily on JavaScript, search engines may not index your site properly. This is why it’s a great idea to raise these points with your developer early on and discuss how they approach them.

Meta tags are set to “Noindex” or “Nofollow”

When a website is still being built and you don’t want it found yet, or when you don’t want duplicate sites to be indexed by search engines, a “noindex” or “nofollow” directive can be added to the meta tags. “Noindex” tells search engines like Google not to index the page, while “nofollow” tells them not to follow the links on it. Make sure these have been set to index/follow when your website is launched.
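For reference, these directives live in a robots meta tag inside the page’s head; the values below are the standard ones recognized by major search engines:

```html
<!-- While the site is in development: keep it out of search results -->
<meta name="robots" content="noindex, nofollow">

<!-- At launch: allow indexing and link-following (or simply remove the
     tag entirely, since index, follow is the default behavior) -->
<meta name="robots" content="index, follow">
```

A quick view-source check for "noindex" after launch is an easy way to catch a tag that was left in by mistake.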

Website speed - how fast does it load

We’ve mentioned this many times because it matters for multiple reasons. A website that loads fast and is responsive is essential if you want search engines to index you and place you in the top results.

Search engines will still index slow sites, but if you end up on the third or fourth results page, that’s no good for you. Secondly, a fast website keeps your visitors around.

Your sitemap is incorrect or nonexistent 

A sitemap is a file that search engines read to understand your website more effectively. It provides basic information about what is on your website: the pages, videos, and other files it contains. Make sure it exists and is kept up to date.
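For reference, a minimal XML sitemap (the format defined by the sitemaps.org protocol) looks like this; the URLs and dates here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2022-09-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/services</loc>
    <lastmod>2022-08-22</lastmod>
  </url>
</urlset>
```

The file usually lives at the site root (e.g. /sitemap.xml) and can be submitted to search engines directly or referenced from your robots.txt file.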

Broken internal or outbound links

Have you ever visited a website and clicked a link, only to find that it’s broken or doesn’t go where you expected? It’s frustrating, we know. Bad or broken links are a sign of poor user experience, and search engines pick up on this. Make sure your links work and point to the right places.

The next part of this is to have the appropriate links available to the user at all times. So when your services or products page is being built, make sure to include links to the order page, and your contact us page.

Optimizing your robots.txt file

A robots.txt file is a list of instructions that tells search engines how to crawl your website and which parts to skip. Crawlers only spend a limited amount of time on each site (the so-called crawl budget), so that time is best spent on your most important pages, hence the robots.txt file.

Pages that typically shouldn’t be crawled are:

  • Admin pages 
  • Shopping cart 
  • Checkout pages
  • Temporary files

Check that your developer hasn’t unintentionally disallowed search engines from crawling your main pages.
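A minimal robots.txt covering the pages listed above might look like this; the paths are placeholders and depend on how your site is actually structured:

```
User-agent: *
Disallow: /admin/
Disallow: /cart/
Disallow: /checkout/
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
```

Note that Disallow stops pages from being crawled, not necessarily from appearing in results; for a page that must stay out of search entirely, a noindex meta tag is the more reliable tool.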

Not mobile-friendly enough

Since Google introduced its ‘Mobile-First Indexing’, you are ranked primarily on how the mobile version of your website performs. This never used to be the case, but with the vast majority of people accessing the internet from their phones, it only makes sense. There are a few simple things you can ask your developer to do.
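The most basic of these is making sure every page declares a responsive viewport; without this tag, phones render the desktop layout zoomed out, which hurts both usability and mobile ranking:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Beyond that, responsive layouts, readable font sizes, and tap targets that aren’t too small are the usual checklist items.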

Your website was coded in a very complex way

As with spoken languages, if someone has written something in an overly complicated way, we either don’t understand it or don’t even try to. It is the same with search engines: your developer should keep the site’s code clean and simple rather than superfluous.

The user experience isn’t very user-friendly

User experience affects you in multiple ways, but where SEO is concerned it can severely impact your ranking. There are many ways to improve user experience, but consider a website where you have to click too many times (more than three or four) to get where you want: the page or content is buried too deep. This makes it harder for both people and search engines to find, so keep the most important information and pages quickly and easily accessible.

Make sure you are using HTTPS 

HTTPS is essentially the same as HTTP, except that it encrypts data in transit and is therefore more secure. This extra security is why search engines rank secure sites higher. You can easily check whether your website uses HTTPS by looking at the address bar: if there’s a padlock on the left, the connection is secure and using HTTPS. If not, you should seriously consider migrating, not only for SEO but also for the extra security.

If you’re selling things on your website, some visitors will look for the padlock in the address bar before purchasing. If they don’t see it, they may fear being scammed, hesitate to buy, and take their business elsewhere.
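If your site runs on an Apache server, one common way to force HTTPS is a 301 redirect in the .htaccess file; this is a sketch, and the exact setup depends on your hosting:

```
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

You also need a valid TLS certificate installed; many hosts provide one for free via Let’s Encrypt.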

Check for multiple versions of your website

You may know that it is now very common to omit the ‘www’ when typing an address; this is made possible by 301 redirects. You need to check that Google is indexing only one version of your website, which is what you want. This ties into the earlier point about noindex/nofollow: duplicate versions of your site shouldn’t serve their own copy of the content, but should instead redirect (via 301) to the version you want indexed.

As an example, our site is https://ahdesign.website; this is what you’ll see in the address bar (though you won’t see the https://, as it is represented by the padlock). An alternative version is https://www.ahdesign.website. When you visit this second address, it redirects (via 301) to the first one, which is what you want. If instead you stay on the second address, there are two versions of your website out there, and this will hurt your SEO ranking.
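On Apache, the www-to-bare-domain redirect can be sketched in .htaccess like this (the domain is captured automatically, so no placeholder is needed; a DNS-level or hosting-panel redirect achieves the same result):

```
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ https://%1/$1 [L,R=301]
```

Whichever direction you choose (www or non-www), the key is to pick one canonical version and redirect everything else to it.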

No crawling errors

This is exactly what it sounds like: search engines are constantly crawling the web and indexing websites. If a crawler hits an error on your site and can go no further, it prevents your site from being properly indexed, so these errors should be found and fixed.

Clean and informative URL structure

We’ve mentioned the importance of this in our previous articles: having a clean and informative URL structure is imperative. It is the only way that a person or a search engine can look at your URL and see exactly what they’re about to visit.

For example, you are currently on https://ahdesign.website/post/technical-side-of-seo. Here, ahdesign.website is the domain name and technical-side-of-seo is the slug. This URL is informative both to search engines for SEO purposes and to the people visiting your site.

Now imagine the URL was https://ahdesign.website/post/452830946271940. Nothing can be gleaned from this; there is no information about what the page contains. Hence the importance of clean and informative URLs.

Scott Green

A Content Specialist with a passion for words. He’s experienced in SEO, graphic design, content management, marketing, proofreading, and editing.
