Technical SEO: best practices for enhanced search visibility

SEO in Switzerland is quite particular: if you want to rank on a national scale, you must implement a strategy that covers the country's different official languages.

Technical SEO is one of the three main pillars of SEO. It is important to ensure that your site meets the technical expectations and requirements of search engine robots.

In this guide, you will find precise recommendations and best practices for implementing efficient technical SEO in Switzerland.

And don’t hesitate to download my checklist at the end of the article!

What is technical SEO?

Technical SEO represents all the modifications and improvements made to a website in order to rank better on search engines.

It aims to optimize the indexing of a website, but also its accessibility for robots.

Search engines such as Google, Bing, or Yahoo use robots (crawlers) that browse entire websites to understand and index their content.

Technical SEO can be improved independently of the two other pillars of SEO: on-page optimization (keyword research and content) and off-page optimization.

The main elements to improve are the code of the site and the speed of the servers where it is hosted.

The importance of technical SEO for search engines

From an optimization point of view, this is the first thing to put in place: a prerequisite on which the other two pillars can then build.

Indeed, even if your content is of very high quality, an unoptimized technical foundation leaves you with very limited chances of appearing in the results pages.

The purpose of these optimizations is to build a solid foundation for your SEO strategy. You must therefore work on the technical part of your site by taking into account the following three elements:

  • Performance 
  • Accessibility for robots
  • Indexing

Unlike the other two pillars of SEO, technical SEO does not require regular work.

Once a site audit has been carried out, you simply need to ensure that subsequent updates to the site do not have a negative impact on technical SEO.

To facilitate its implementation, here are the different points to improve, in order of priority.

The general performance of the website

The general performance of a site is defined by how quickly its web pages load for visitors (page speed).

Also, search engines take into account the user experience in relation to the content displayed.

In other words, the faster a site is and the more accessible the content, the better the overall performance of the site.

On average, 60% of website traffic comes from mobile devices (mobile first). For this reason, search engines promote sites that are adapted for this type of device. 

A framework specially designed to speed up mobile versions of pages, called AMP (Accelerated Mobile Pages), was developed a few years ago.

However, with the standardization of Core Web Vitals, the implementation of AMP is no longer really necessary for technical SEO.

Google offers tools, such as its “Mobile Friendly” test, to check whether a site is optimized for this specific type of screen.

Therefore, all content must be able to resize automatically according to the size of the screen.

Images, content, but also buttons or links must be displayed in such a way that they are visible and usable on small screens (Mobile friendly).
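
As an illustration, responsive behaviour usually starts with a viewport meta tag and media that scale with the screen. A minimal sketch (the image file name is a placeholder):

```html
<!-- Tell mobile browsers to use the device width instead of a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- Let the image shrink on small screens instead of overflowing the viewport -->
<img src="header.jpg" alt="Header image" style="max-width: 100%; height: auto;">
```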

In May 2020, Google announced a new update of its algorithm: the “Core Web Vitals” factors.

Loading speed, a mobile-friendly design and a stable structure of a site are now key factors to aim for the first page in search results.

Google Core Web Vitals

Definition of the Core Web Vitals of a website

Rolled out between June 15 and September 2, 2021, the Core Web Vitals are ranking factors for Google in its search results.

These factors take into account the user experience, loading speed and visual stability of a website. 

Even though Core Web Vitals generated a lot of hype at launch, it is generally agreed that the relevance and context of a web page's content remain more important in SEO.

Nevertheless, a site optimized for Core Web Vitals is an integral part of the technical SEO strategy.

Google has established three standard factors to define Core Web Vitals:

  • Largest Contentful Paint (LCP): the time it takes for a page's main content to load, from a user's perspective.
  • First Input Delay (FID): the delay between a user's first interaction with the page and the moment the browser is able to respond to it.
  • Cumulative Layout Shift (CLS): the visual stability of a web page's content once displayed to the user.

We can analyse and optimize the Core Web Vitals of a site through two tools provided by Google:

  • PageSpeed Insights: https://pagespeed.web.dev/
  • Lighthouse, available in the Chrome browser's developer tools

The Google Search Console also offers a dedicated section to evaluate the performance of a site.

Loading speed – Largest Contentful Paint (LCP)

The connectivity of Internet users has evolved greatly in recent years. Not so long ago, browsing the Internet on a mobile device was simply not a reality.

Most people used computers to browse websites, with limited connection speeds.

Nowadays, portability and connectivity have greatly increased. We can easily watch a video or a film in high definition on public transport.

For this reason, the loading speed of content and pages (page speed) is now a priority for Google.

If the Largest Contentful Paint test fails, you can review and improve the following points:

  • The hosting of the site. A good host generally ensures very good loading times. (By the way, I personally use the services of Alphosting and am very satisfied!)
  • Limit third-party scripts. Some third-party elements (fonts, videos, tracking and conversion pixels) can slow down a site. It is important to make sure that the different scripts are only loaded when they are needed.
  • Implement “lazy loading”. This technique defers the loading of images and other elements so that they only appear when the user scrolls down the page (see the snippet below).
  • Review the “heavy” elements of a page. Reducing the weight and dimensions of certain images, graphics, or files can improve loading time.
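
For the lazy-loading point, modern browsers support a native loading attribute on images and iframes; a minimal sketch (the file names and URL are placeholders):

```html
<!-- The browser only fetches this image when it approaches the viewport -->
<img src="product-photo.jpg" alt="Product photo" width="800" height="600" loading="lazy">

<!-- The same attribute works for iframes, e.g. embedded videos -->
<iframe src="https://www.example.com/embed/video" title="Video" loading="lazy"></iframe>
```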

First Input Delay (FID)

The second factor is the time it takes for a user to actually interact with a page.

In other words, some pages may display quickly, but the loading of different elements may delay the ability to interact with the content.

For the First Input Delay to pass, it must be below 100 ms. It can be improved with the following elements:

  • Minimize JavaScript or load it asynchronously. While JS files are loading, it is almost impossible for the user to interact with the page, so it is worth reducing the amount of JS or setting conditions on when it loads (see the snippet below).
  • Limit the use of external scripts. As with the LCP factor, some scripts (heatmaps, conversion pixels, etc.) can have a negative impact on the First Input Delay. Setting up loading conditions can improve the interaction time.
  • Enable browser caching. This allows the content of a page, including JS files, to load faster on repeat visits.
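
For the JavaScript point, the defer and async attributes keep scripts from blocking the HTML parser; a minimal sketch (the script names are placeholders):

```html
<!-- defer: downloads in parallel, executes only after the HTML is parsed, preserves order -->
<script src="app.js" defer></script>

<!-- async: downloads in parallel, executes as soon as it is ready (order not guaranteed);
     suitable for independent third-party scripts such as analytics -->
<script src="analytics.js" async></script>
```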

Cumulative Layout Shift (CLS)

This last factor takes into account the stability of a page once it is loaded. In other words, the content of a page, once displayed, should not shift.

Google has implemented this factor to put an end to some very popular practices.

Unscrupulous webmasters inserted ads in strategic places (in the middle of the navigation or in place of a menu) once the page was loaded.

Thus, a visitor who initially wanted to click on an image or a button ended up involuntarily clicking on an advertisement.

From now on, sites that use these practices are downgraded and no longer appear in search engines.

To be sure to validate this factor, you must:

  • Specify the dimensions of each media element in the code (images, videos, infographics). This way, the user will not see elements change size while browsing the page (see the snippet below).
  • Reserve specific locations for advertising. This way, standard formats (banner, rectangle, skyscraper) do not affect the overall layout of the page even if they are empty.
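
In practice, these two points translate into explicit dimensions on media and reserved space for ad slots; a minimal sketch (the file name, sizes, and class name are hypothetical):

```html
<!-- Explicit width/height let the browser reserve the space before the image loads -->
<img src="infographic.png" alt="Infographic" width="1200" height="800">

<!-- A container with a fixed minimum height keeps the layout stable even if the ad slot stays empty -->
<div class="ad-slot" style="width: 300px; min-height: 250px;">
  <!-- 300x250 rectangle ad loads here -->
</div>
```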

Accessibility for search engine robots

To be sure that all the pages of your site are indexed, it is necessary that the relevant content is accessible to robots.

In simple terms, the robots crawl your site to analyze the content and rank the different pages in the search engines. 

Thus, the search engine will be able to propose certain sections or pages of your site in its results if your content answers a visitor's query.

Therefore, if certain sections or pages of your website are not accessible to the robots, it is a missed opportunity to appear in the search engines.

For this reason, you need to make sure that all the pages relevant to your business are accessible to robots and users. 

Tools available for technical optimization

There are many tools available to help you identify problematic pages. For example, Screaming Frog reproduces the behaviour of Google's robots.

You can also use Google Search Console, which provides a lot of information about the various errors your pages may have.

You can use this tool to check which pages and elements are not yet indexable.

Optimizing the accessibility of a site allows you to cover a greater number of queries and increase its visibility to generate more traffic.

The importance and the implementation of the robots.txt file

The robots.txt file is one of the first elements analysed by the robots.

What is robots.txt?

The file gives crawlers precise instructions about which pages to explore and which files to ignore. You can also add a Crawl-delay directive indicating how often you want your pages to be crawled (note that some crawlers honor it, but Google ignores it).

How to optimize the robots.txt file?

The robots.txt file is often thought of as a tool for de-indexing pages.

However, this is not its primary function. The robots.txt file prevents crawlers from visiting certain pages by limiting access to them.

You must first ensure that the robots.txt file is present on your site. If it is not, search engines consider that the entire content is indexable.

The file must be placed at the root of the site, and its name must be written in lowercase.

It is also advisable to make changes one by one to avoid too great an impact on the indexing of a site.

In terms of good practice, you can ask crawlers not to explore the checkout process, the member areas of your site, and any images or files (JavaScript, CSS, PDF) that you do not want indexed (see the example below).
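
As an illustration, a robots.txt file applying these recommendations could look like this (the paths are hypothetical and must match your own site structure):

```
# Rules for all crawlers
User-agent: *

# Keep the checkout process and member area out of the crawl
Disallow: /checkout/
Disallow: /account/

# Block a file you do not want crawled
Disallow: /files/internal-report.pdf

# Optional: point crawlers to the sitemap
Sitemap: https://www.example.com/sitemap.xml
```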

Optimize the indexing of web pages

This is the last important point of technical SEO. Once your site is “structurally” optimized, you need to establish indexing rules to improve the visibility potential of your pages.

To do this, you need to define which parts of the website you want to appear in search engines.

You can use the “noindex” tag on pages that you do not want to be displayed on Google, as shown below.
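
The directive takes the form of a meta tag placed in the <head> of the page; a minimal sketch:

```html
<!-- Ask search engines not to show this page in their results -->
<meta name="robots" content="noindex">
```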

Also, you must limit duplicate content. The rule here is simple: each URL of your website must correspond to unique content.

That is to say, each page of your site should be accessible through only one link.

Create and optimize the sitemap.xml file

The sitemap is one of the main components in the technical SEO strategy. It offers a more efficient way to understand the structure of a site and the relationships between pages. 

What is a sitemap.xml file?

The XML sitemap can be compared to a map of a website. It lists all the pages, articles, video files, and images, and also allows them to be classified by importance.

It also gives information on the date of update of a page, as well as the various languages in which it is available.

How to optimize a sitemap.xml file?

Making sure that all the pages (in other words, the URLs) are present is a very good start. 

Unlike a robots.txt file, there can be several sitemaps for the same website.

For example, if you have a multilingual website, as is often the case in Switzerland, you can create an XML sitemap for each language.

Moreover, you can split your sitemap by type of pages (category, e-commerce, blog, etc.).

Finally, once established, you can submit your different sitemaps in the Google Search Console so that they are taken into account more quickly.
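
As an illustration, a minimal sitemap entry looks like this (the URLs are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/fr/blog/technical-seo/</loc>
    <lastmod>2022-01-15</lastmod>
  </url>
</urlset>
```

A sitemap index file can then group the per-language sitemaps:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-fr.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-de.xml</loc></sitemap>
</sitemapindex>
```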

The structure of URLs 

URLs also have a relative impact on technical SEO.

They give Google clear information about the structure of the website (its directories).

Also, if a keyword is present in the URL, it helps search engines better understand the context and content of the page.

What is the structure of URLs?

The structure of a URL involves the domain or subdomain as well as all the directories that follow it. It gives the path to the different categories and pages of a website.

How to optimize the structure of URLs?

To optimize a URL, you must first make sure that it is as short as possible. 

It is also necessary that the targeted keywords appear in the address.

Furthermore, you must make sure that the structure of the website is reflected in the structure of the URLs. 

That is to say, each main category must be located as close as possible to the domain, and so on (e.g. domain.com/language/category1, domain.com/language/category2, etc.).

Special characters such as accents and capital letters should of course be avoided.

Finally, you should avoid having URLs with linking words or articles. It is recommended to use dashes “-” instead of underscores “_” between each word. 
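
As a hypothetical before/after illustration:

```
Avoid:  https://www.example.com/FR/The_Article_2022?id=387
Prefer: https://www.example.com/fr/blog/technical-seo-guide
```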

The breadcrumb

Breadcrumbs are also part of the criteria of Google's algorithm.

Generally implemented to improve the user experience of a website, the breadcrumb also has its importance in technical SEO.

What is the breadcrumb on a web page?

The breadcrumb allows users to navigate your website without having to click on the back button of their browser.

It is most often placed at the top of the page and displays the parent categories of the page you are on.

How to optimize the breadcrumb?

To optimize it, you must first ensure that it’s present on all pages. 

It is also important to show all the categories and sub-categories of each page. 

This is for a better understanding of the overall structure of the website for visitors, but also for robots.
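
Search engines pick up the breadcrumb more reliably when it is also described as BreadcrumbList structured data (a notion covered in the structured data section below); a minimal sketch with hypothetical page names and URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://www.example.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO" }
  ]
}
</script>
```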

The Hreflang tag

The hreflang tag allows search engines to identify the language used on each page.

What is the hreflang tag?

It is an essential element on multilingual sites, in particular to avoid duplicate content.

It allows the robots to identify the language of the page they are exploring and to classify it under the corresponding version.

For example, a page in French has a hreflang=”fr” tag. 

The hreflang tag can also be used to target versions of a page intended for a specific country.

Therefore, a page in French intended for Switzerland will have a hreflang=”fr-ch”.  

The version intended for France will have a hreflang=”fr-fr”.

This limits the indexing of local content in countries that are not targeted and avoids duplicate content.

How to optimize the hreflang tag in your technical SEO strategy?

You can first set up a hreflang tag on all your web pages, specifying both the language and the country if your target country (like Switzerland) has several national languages.

Finally, you should reference all the alternative pages on which the same content is available in another language.

For example, a page in French for Switzerland has a hreflang=”fr-ch” tag, but also a rel=”alternate” hreflang=”de-ch” tag if the page also exists in German (see the example below).
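
In the page's <head>, this translates into one rel="alternate" link per language version, including the page itself; a minimal sketch with hypothetical URLs:

```html
<link rel="alternate" hreflang="fr-ch" href="https://www.example.com/fr-ch/page/">
<link rel="alternate" hreflang="de-ch" href="https://www.example.com/de-ch/page/">
<!-- Fallback for visitors whose language/region is not listed -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/">
```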

The canonical tag

Duplicate content is something to avoid when it comes to improving your technical SEO. To do this, we can use a canonical tag on all web pages.

It is a piece of HTML code that identifies the main version of a page among its different variants.

What is the canonical tag?

Thanks to the canonical tag, it is quite easy to identify the main pages and the web pages with similar content that do not need to be indexed.

Typically, an e-commerce site often offers products with different options. 

To avoid problems during indexing, a good practice is to define the main canonical page by default. 

Then, add a canonical tag on the other pages that refers to this main page (avoiding duplicate content).

This way, search engine spiders can simply identify which page should appear in the results.
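
Concretely, each variant page carries a canonical tag in its <head> pointing to the main version; a minimal sketch with a hypothetical URL:

```html
<!-- Placed on /product-123?color=blue and any other variant of the page -->
<link rel="canonical" href="https://www.example.com/product-123/">
```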

How to optimize the canonical tag?

The first step is to select the main pages. 

The URLs in the canonical tag should not have any parameters. A good practice is also to use only lowercase letters (in general).

Finally, the canonical tag should contain only the final URL; do not point it at a redirect or create redirect chains.

You can check for any errors in Google Search Console.

Structured data

Search engines offer more and more answers directly in their results.

Thus, we can very quickly find the price of a flight or the cooking time of a recipe without leaving the search results.

What is structured data?

Structured data allows the data of a page to be classified in a standard way.

This way, search engines can more easily “understand” the context, interest, and meaning of a piece of content.

Specific tags are inserted on certain elements of a page to designate them as specific content.

There are three main standards for structured data (JSON-LD, Microdata and RDFa), but the most widely used is JSON-LD (understood by all search engines).

Structured data is mainly used for event pages, recipes, but also for product pages.

How to optimize structured data?

First, you need to make an inventory of your website's pages. You can evaluate the different types of structured data available on the schema.org website (https://schema.org/).

Once the data types are selected, you add the different JSON-LD elements in the code of the page.

Finally, you can test the page with a validator to see if the structured data has been taken into account.
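
As an illustration, a recipe page could embed a JSON-LD block like the following (the values are hypothetical); Google's Rich Results Test (https://search.google.com/test/rich-results) can then serve as a validator:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Swiss chocolate fondue",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "cookTime": "PT20M",
  "recipeIngredient": ["200 g dark chocolate", "2 dl cream"]
}
</script>
```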

301 redirects as part of technical SEO improvements

Managing 301 redirects is an important part of the technical SEO strategy. Especially during a redesign or major changes in the structure of your website.

What is a 301 redirect?

Used when a page is given a new URL, a 301 redirect sends the user (and search engines) to the new content.

Thus, an old web page that is indexed in search engines but deleted from a site will have to be redirected to the new web page.

If you do not redirect obsolete URLs, visitors will hit 404 errors, and duplicate content may appear, both of which are harmful in the medium and long term for good technical SEO.

How to use 301 redirects effectively and monitor them in Google Search Console?

You have to make sure that every URL indexed by Google is correct.

To do this, Google Search Console allows you to analyse in detail all your pages and also provides a list of URLs with 404 errors.

Some of your pages may have been created temporarily. If these pages have been indexed by search engines, they can sometimes generate duplicate content.

Adding a 301 redirect allows you to limit this problem and, above all, avoid losing traffic.

A good management of redirects is essential to maintain a consistent website structure.
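
On an Apache server, for example, a 301 redirect can be declared in the .htaccess file; a minimal sketch with hypothetical paths:

```apache
# Permanently redirect the old URL to the new one
Redirect 301 /old-article/ https://www.example.com/new-article/
```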

Set up a secure https website as part of the technical SEO strategy

To simplify, when a link is clicked, a request is sent to a server where the different elements of a website are hosted.

In the same way, when you submit a form on a contact page, you send data to a server.

Implementing the secure “HTTPS” protocol guarantees secure data transfers between visitors and the servers where your website is hosted.

What is the https protocol?

It is a protocol that has a significant importance, on the one hand for the visitors of your website, but also for the search engines.

This protocol is generally used to secure data transfers between servers and a user’s browser.

Today, 81.2% of websites use the HTTPS protocol. Most browsers issue a warning on websites that do not use a secure protocol.

Google has announced that it is penalizing sites without an SSL certificate and will no longer show them as a priority in its search results.

How to set up a website in HTTPS?

For a website to use the “HTTPS” protocol, you must first obtain an SSL certificate. This certificate will encrypt the data passing from a server to the browser.

Once the certificate is established, you must set up a server-side redirection on all HTTP URLs to the HTTPS protocol.

Finally, you must verify that internal and external links (in the content of the pages) use URLs in HTTPS format.
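
Again taking an Apache server as an example, the HTTP-to-HTTPS redirection can be handled with mod_rewrite in the .htaccess file; a minimal sketch:

```apache
RewriteEngine On
# If the request did not come in over HTTPS...
RewriteCond %{HTTPS} off
# ...send a permanent (301) redirect to the same URL in HTTPS
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```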

The checklist for a successful technical SEO

To put all the chances on your side when implementing technical SEO for your website, I have created a checklist to make sure you don't forget anything.

To summarize 

SEO is definitely a long-term investment, and an important part of any digital marketing strategy.

Setting up an effective on-page or technical SEO strategy takes some time.

Technical SEO is one of the first actions to take to make sure your website is indexable and accessible.

By following these recommendations and implementing the optimizations as soon as possible, you give your website the best chance of ranking in the top results of search engines. (This is also important for a successful international SEO strategy.)

Of course, even though a successful SEO strategy is a medium to long term investment, technical SEO must be taken into account from the beginning.

However, if technical SEO was not a priority due to lack of time or resources, yesterday was the perfect day to start the improvements, but today is also a great opportunity. 

Don’t hesitate to contact us if you want more information about a site audit or if you have any questions about your technical SEO strategy.