Technical SEO: Accessibility and Crawlability | The Audit Lab
Trying to get your head around Technical SEO? There’s a lot to cover, but fear not, we can break it down for you. Let’s focus on crawlability!
 

Technical SEO: Accessibility and crawlability


If you have a website, or your job involves improving or writing for one, you will of course have heard of SEO. At least, we hope you have. There are a few branches under the SEO umbrella – including on-site content and link building – but today we’re here to give you a simple breakdown of what goes into Technical SEO.

So, what is Technical SEO exactly?

Well, let’s start with the basics of understanding general SEO (Search Engine Optimisation). 

It’s a term that covers the process of adjusting, building or optimising websites for search engines – with the aim of ranking highly in search results, organically. Who doesn’t want that? If you’ve had any experience with SEO strategy, you’ll know it’s an ever-changing field where the goalposts move constantly, so staying on top of SEO best practices is key to being able to adapt and react.

The Technical SEO branch focuses on optimising websites for crawling and indexing. In short, you want search engines to be able to access, interpret and index your site with as little friction as possible. It’s also worth knowing that Google has moved to mobile-first indexing (rolled out gradually, with most sites switched over by 2020), which favours mobile-friendly, high-performing websites over those that aren’t.

And whereas other SEO branches delve more into content strategy – focusing on the right keywords in relevant content – Technical SEO is concerned with the infrastructure of the website. Put simply: if you make it easy for a search engine to access, crawl and index your whole website, it’s far more likely to favour that site in its ranking results.

If the crawl bots are happy with your website, its visitors most likely will be too – and a good search engine wants to show only the best, easiest-to-navigate websites… with good content, of course.

Getting started with Technical SEO

Getting your website right in terms of Technical SEO is also about improving functionality, to improve search visibility. Think meta tags, sitemaps, linking, JavaScript indexing, etc.

So, where to start? First, it’s important to realise that for a website to stand a good chance of ranking well, there needs to be an overlap – so yes, get the Technical SEO right, but don’t forget about your SEO content marketing either, as this plays an important part too.

Once you’re confident with your content, and you’ve done your keyword research, it’s time to make sure that not only can humans read it well, but that search engines can too. Don’t forget about the robots, OK?

Knowing your website, technically

There are a few technical things that are crucial for those in the SEO game to understand, and that will help a website rank better. Load times, Core Web Vitals and user experience are all factored into where Google decides to rank a website.

Take JavaScript, for instance – a much-debated topic in Technical SEO. Google will often crawl a page without executing its JavaScript first, coming back later to render it, meaning vital parts of a page that rely on JavaScript may not get indexed straight away. What a waste! And that’s just one example of how poor technical foundations can hold back an otherwise well-optimised website.

Are domains important for Technical SEO?

There’s more to it than just naming and setting up a website: you need to pick a preferred domain and tell search engines what it is. By default, websites are accessible both with and without the ‘www.’ in front of the domain name. So, if you were to name your website fashionistas.com, for example, it would be reachable either way. But, and it’s a big BUT, this can confuse search engines, as they may treat the two versions as two different websites – and you really don’t want that.

That confusion can create duplicate content issues, indexing problems and lost rankings. That’s why it’s crucial to tell search engines which domain you prefer. Pick one, set it up and stick to it, making sure search engines know (301 redirects and canonical tags won’t hurt either).
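As a sketch (the domain here is hypothetical), once you’ve settled on, say, the ‘www.’ version, each page can declare it as the canonical URL in its <head>, while the other host redirects to it:

```html
<!-- In the <head> of each page on the preferred domain (hypothetical URL) -->
<link rel="canonical" href="https://www.fashionistas.com/" />
```

The redirect itself is configured on the server or via your host’s control panel; together, the redirect and the canonical tag point search engines at one consistent version of the site.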

The journey from server to browser

Knowing more about the pathways of how a website works is key to understanding Technical SEO and how to improve it for a website. 

The web works via servers and browsers. Users request domains via browsers every day to shop, read, research, communicate, watch, apply for jobs… you name it. Every website has an IP (Internet Protocol) address, attached to its domain name via the DNS (Domain Name System).

A quick exchange then takes place: the browser requesting a web page triggers a DNS lookup, which converts the domain name to its IP address. The browser then sends a request to the server for the code that makes up the web page (HTML, CSS and JavaScript). The server responds by sending those resources – the website’s files – for the searcher’s browser to assemble.

It’s not quite finished here, though.

Once the browser has received all the resources from the server, it puts them together to render the web page so the user can view it. And voilà! From code to visual format, you can now see the site as intended!

As you can see, even though all of this seems to happen instantly when we browse, any minor issue or interruption can affect the decoding, page load times and rendering of a site – which in turn affects SEO, user experience, search engine crawlers and, ultimately, ranking.

What types of website codes are there?

There are three common types of website codes: 

  • CSS (presentation – fonts, colours and layout)
  • JavaScript (behaviour – dynamic and interactive elements)
  • HTML (content – body copy, titles and so on)
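To see how the three fit together, here’s a minimal (entirely hypothetical) page: HTML supplies the content, CSS the presentation, and JavaScript the behaviour:

```html
<!DOCTYPE html>
<html>
  <head>
    <style>
      /* CSS: presentation */
      h1 { color: navy; font-family: sans-serif; }
    </style>
  </head>
  <body>
    <!-- HTML: content -->
    <h1 id="greeting">Hello</h1>
    <script>
      // JavaScript: behaviour – updates the heading after the page loads
      document.getElementById("greeting").textContent = "Hello, world";
    </script>
  </body>
</html>
```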

HTML

Hypertext Markup Language 

No matter what you do to modify a web page – adding anchor text and internal links, rewriting content, adding visuals – it all happens through changes to its underlying HTML code. Behind the scenes, the code is changing.

Every bit of that code, every element, will be crawled by Google, which assesses how relevant and how navigable that web page is for a search query – and, you guessed it, this then determines a ranking result!

CSS

Cascading Style Sheets

CSS is the code focused on how a website looks. Where HTML helps describe a website and its content, CSS styles it. The substance and the style pair. But, how does this relate to SEO? 

One thing you need to know is that as search engines – in particular, Google – evolve, what may have helped with SEO one year, can have you facing a Google penalty with damaged search rankings another year.  

Before 2014, Google’s indexing process rendered web pages much like a text-only browser would. After that, the system began rendering pages more like an actual browser does. Older SEO tactics hid links and text with CSS to manipulate search rankings – try that now and you’ll be violating Google’s guidelines!

A great thing about CSS is that it can reduce the weight of your HTML, since styling can live in separate stylesheet files. Why does SEO like this? Smaller file transfers and leaner code make page load times faster, which is a major asset for a good user experience – which is also what Google likes.
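In practice, that means moving styling out of the HTML and into an external stylesheet (filenames hypothetical):

```html
<!-- Instead of repeating inline styles on every element... -->
<p style="font-family: sans-serif; color: #333;">Some text</p>

<!-- ...link one cacheable stylesheet in the <head>: -->
<link rel="stylesheet" href="/css/main.css" />
```

The browser downloads main.css once and reuses it across pages, so each HTML file stays small.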

JavaScript

This is the code that focuses on how a website behaves. With HTML and CSS, you had the substance and the style. When JavaScript came onto the coding scene, it gave websites the opportunity to be dynamic – less static page code, more interactivity. Some more oomph.

From pop-ups to ad displays, this code enables more interactivity. It’s particularly useful for calls to action, like enticing people to sign up to a newsletter.

With all that said, this is the code you do have to be careful with when working on SEO. As mentioned, it can cause a lot of issues if handled incorrectly, mainly because search engines don’t view the code the way humans do. Important elements of a web page – tags, links, text – that are loaded with JavaScript, rather than present in the HTML, remain invisible until the page is rendered. So that all-important first crawl from a search engine can miss these critical elements.

When using JavaScript, make sure crawlers aren’t blocked from accessing any JavaScript files or resources, so they can view your web pages the way any browser can. And be mindful that, because rendering happens in a second wave, a crawler can initially miss elements that only become visible once the JavaScript has executed.
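A minimal illustration of the problem (content hypothetical): the paragraph below is empty in the raw HTML and only gains its text once the script runs, so a crawler that hasn’t rendered the page yet sees nothing:

```html
<p id="description"></p>
<script>
  // This text exists only after the JavaScript executes;
  // a text-only crawl of the raw HTML will not see it.
  document.getElementById("description").textContent =
    "Handmade leather bags, free UK delivery.";
</script>
```

If that text matters for ranking, put it in the HTML itself and use JavaScript only for enhancement.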

Making search engines read your websites

For a search engine to read your content effectively, it needs to be labelled in a way that organises it into readable elements – a schema of sorts, often referred to as ‘structured data’. There are thousands of schema markups to implement as structured data, but it’s important to choose the ones that best suit your web pages. Use Google’s Rich Results Test to see how accessible a web page is, and which rich results the structured data within it can generate.
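For example, an article page might describe itself with schema.org Article markup in a JSON-LD script tag (all values below are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO: Accessibility and crawlability",
  "author": { "@type": "Organization", "name": "The Audit Lab" },
  "datePublished": "2021-01-01"
}
</script>
```

Search engines read this block alongside the visible content, which is what makes rich results possible.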

Improving functionality and user experience

Google loves it when you provide a high level of user experience: fast website speed, no scroll fatigue, elements that work and easy navigation. And when a search engine rates your website’s functionality and user experience highly, it will be more likely to show it and rank it well. It wants to be the best search engine, after all, and that means providing the best results to its customers. Remember, Google is still a business.

So, get this right and you will be rewarded. It’s also why many websites that perform well in other aspects of SEO – great backlinks, strong referring domains – can still suffer in ranking results if they aren’t satisfying Google on user experience. Or, to put it simply, aren’t meeting Google’s webmaster guidelines.

Another sure way to improve user experience is to recognise that most users browse on mobile devices. Take 2019, when around 3,986 million of the world’s 4,388 million active internet users were on mobile – roughly 90% of all internet users had mobile access. This is the main reason Google’s indexing has switched to mobile-first, and it’s vital to make sure your website is mobile-friendly, so users get just as good an experience on mobile as on desktop.

How does page speed affect Technical SEO?

Visitors don’t want a slow-loading page. This is where you often see high bounce rates, as people quickly click off and head to a competitor with a faster – and often more mobile-friendly – website (especially important with mobile-first indexing in mind). Search engine algorithms like Google’s take this into account, which is why it can affect SEO greatly.

So, what are the main factors that can slow down a website and load times?

  • Images – make sure images are in the right format, alt text is optimised, files are compressed, and an srcset attribute serves different versions of an image for different situations. Think responsive design
  • Condense and bundle files – code files can be minified by removing things like spaces and line breaks and shortening variable names. Bundling combines a group of files in the same coding language into a single file
  • Reduce redirects – every redirect means more waiting time for users (and Google), so keep these to a minimum
  • JavaScript – as covered in the code section, this code can be missed initially by web crawlers. Browsers also have to build a DOM tree by parsing the HTML before rendering a page; if the browser encounters a script during this process, it has to stop and execute the code before continuing, which can really slow a site down
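The “condense” step above can be sketched in a few lines. This is a toy example only – real minifiers handle many more cases – but it shows why minified files transfer faster:

```javascript
// Toy sketch of minifying CSS: strip comments, collapse whitespace
// and trim spaces around punctuation. Not a production minifier.
function naiveMinifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, "")  // remove /* ... */ comments
    .replace(/\s+/g, " ")              // collapse runs of whitespace
    .replace(/\s*([{};:,])\s*/g, "$1") // drop spaces around punctuation
    .trim();
}

const before = "/* brand colours */\nbody {\n  color: red;\n}";
console.log(naiveMinifyCss(before)); // "body{color:red;}"
```

Fewer bytes on the wire means a faster first paint, which is exactly what the bullet above is after.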

How does the robots.txt file relate to Technical SEO?

One clever tool is the robots.txt file, which gives search robots directions for crawling your website. Be careful here: don’t block your site’s JavaScript or CSS files in robots.txt, as this can prevent robots from crawling your site fully and mean they miss important elements. Not what you want search engines and crawlers to be missing out on!
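A typical robots.txt (domain and paths hypothetical) lives at the root of the site and looks something like this:

```text
# Served at https://example.com/robots.txt (hypothetical domain)
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
# Note: /css/ and /js/ are deliberately NOT disallowed,
# so crawlers can fetch them and render pages properly.
Sitemap: https://example.com/sitemap.xml
```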

Making sure a website has a sitemap

Following the theme of crawling, a sitemap is the basic structure of a website that helps search engines locate, crawl and index all of its elements and content. It also tells search engines which pages on a site are the most important.

Types of sitemaps

  • Image – helps Google locate all images on a site
  • Video – helps Google understand any video content on a page
  • XML – the most common one and links to different pages on a website
  • News – helps Google locate content on websites that are approved for Google News
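A minimal XML sitemap (URLs and dates hypothetical) follows the sitemaps.org format:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/technical-seo/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
</urlset>
```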

Sitemaps help you and search engines keep track of all the pages on your site. Especially if you’re a new brand putting out a new website, you can’t rely on just a few backlinks – a sitemap helps Google find and locate all of your pages, so it all helps with your Technical SEO.

If you’re on WordPress, plugins such as Yoast SEO can auto-generate sitemaps for you. You can also use online generator tools – for example, we ran our website through an XML sitemap generator to give you this XML sitemap example.

And that’s crawlability for Technical SEO in a nutshell! As you can see, Technical SEO is quite a big topic and there are many chapters. Once you’ve got your head around accessibility and crawlability, there’s just a bit more to cover, so stay tuned for more content and guides on the topic of SEO by following us on social media – catch us on Facebook, Instagram and LinkedIn.  

Not sure if your website’s SEO is up to scratch? Don’t worry – we’re sticklers for data and know just what to do, thanks to our SEO experts.

Give us a call on 01204 394347 or drop us a message.
