While technical SEO is something only some of us practice rigorously, it touches every site owner's work. After all, which part of SEO isn't technical if you look at it thoroughly?
Today's technical checklist covers SEO issues, mistakes, tips and recommendations. We wanted to cover, as effectively as possible, all the elements that matter for making your website user-friendly, efficient, visible in SERP, functional and easy to understand. So gather all the information you have on your site and let's get to work.
I. Website Speed
When we talk about speed, there are a few things to consider for making your site efficient and easy to access for your users. A faster loading time means higher conversions and lower bounce rates. For that, we've selected some mandatory speed optimization suggestions. Using Google's PageSpeed Insights, you can run quick and easy analyses of your website's loading time.
The tool has improved over the years, and for large websites you can now see helpful charts that show how each website is performing. One example is the Page Load Distribution chart.
The Page Load Distribution chart uses two user-centric performance metrics: First Contentful Paint (FCP) and DOMContentLoaded (DCL). First Contentful Paint marks the moment the browser renders the first bit of content on the screen. DOMContentLoaded marks the moment when the DOM is ready and there are no stylesheets blocking JavaScript execution. Following the chart, these two metrics show which percentage of page loads is fast and which pages, the average and slow ones, need improvement.
Another example is the speed and optimization indicators, which show where each website stands. In the picture shown below, we can see the FCP and DCL scores. These two metrics use data from the Chrome User Experience Report. It indicates that the page's median FCP (1.8s) and DCL (1.6s) rank it in the middle third of all pages, and that the page has a low level of optimization because most of its resources are render-blocking.
1. Reduce and Minify Your Resources
One of the first actions that comes to mind when we talk about website speed is reducing the number of resources. When a user enters your website, a call is made to the server to access the requested files. The larger those files are, the longer it takes to respond to the request. The best thing you can do is delete unnecessary resources and then minimize the overall download size by compressing the remaining ones.
PageSpeed shows which files can be optimized through minification. When we talk about resources, we mean HTML, CSS, and JavaScript files; depending on the situation, the tool will list the HTML, CSS, and JavaScript resources that need work. Below you can see such an example:
For each kind of resource, you have individual options:
Below you can see an example of how to minify your CSS:

Source: www.keycdn.com
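To make the idea concrete, here is a minimal sketch of what minification does to a CSS rule (the selector and values are invented for the example); the minified version applies exactly the same styling with fewer bytes:
/* Before: comments, whitespace and long notations add extra bytes */
.article-title {
    font-size: 24px;
    color: #333333;
    margin: 0 0 16px 0;
}
/* After: the same rule, minified */
.article-title{font-size:24px;color:#333;margin:0 0 16px}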
There are 3 steps to follow in the minification process, explained by Ilya Grigorik, web performance engineer at Google:
Let's take, for example, a photography website whose pictures need to carry a lot of information, such as camera settings, camera type, date, location and author. That information is crucial for this particular website, while for another website it might be irrelevant.
2. Improve Your Server Response Time
Server response time refers to how long it takes your server to return the HTML needed to begin rendering the page. Basically, when you enter a page, a request is sent to the server, and the time it takes to deliver that information is the server response time.
There are lots of reasons why a website has a slow response time. Google names just some of them:
The server response time determines how much time Googlebot needs to access your data, and whether it takes 1, 2, 3 seconds or more can decide whether your visitor converts or not. Google says that you should keep the server response time under 200ms.
There are 3 steps you need to follow to test and improve the server response time:
3. Optimize Your Images
Images are another element that can be optimized to reduce page load. Besides all the information an image carries, as mentioned before, it also adds lots of bytes to a page, making the server take more time than it should to deliver everything. If we optimize the images, the page loads faster because we've removed the additional bytes and irrelevant data. The fewer bytes the browser has to download, the faster it can download and render content on the screen.
Since GIF, PNG, and JPEG are the most used image formats, there are lots of solutions for compressing images.

Source: www.cssscript.com
Here are a few tips and recommendations to optimize your images:

Source: www.quora.com
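To complement these tips, here is a minimal, hedged HTML sketch (file names and sizes are placeholders) that serves a compressed WebP version to browsers that support it, falls back to an optimized JPEG for the rest, and declares explicit dimensions so the browser can reserve the space before the image arrives:
<picture>
  <!-- Smaller WebP file for browsers that support the format -->
  <source srcset="camera-settings.webp" type="image/webp">
  <!-- Compressed JPEG fallback for everything else -->
  <img src="camera-settings.jpg" alt="Camera settings panel" width="800" height="533">
</picture>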
4. Reduce the Number of Redirects & Eliminate Redirect Loops
Redirects can save you from a lot of trouble regarding link equity/juice and broken pages, but they can also cause you lots of problems if you have tons of them. A large number of redirects will slow down your website: the more redirects, the more time a user must spend to reach the landing page.
One other thing worth mentioning is that you should have only one redirect per page; otherwise, you risk creating a redirect loop. A redirect loop is a chain of redirects that comes back to the same page, which confuses the browser: it won't know which page to show and will end up returning a pretty nasty error.

Source: www.matrudev.com
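To illustrate the point, here is a minimal Apache .htaccess sketch (the paths are hypothetical) that contrasts a redirect chain with the single redirect you should aim for:
# Avoid a chain like /old-page -> /old-page-2 -> /new-page:
#   Redirect 301 /old-page https://www.example.com/old-page-2
#   Redirect 301 /old-page-2 https://www.example.com/new-page
# Point the old URL straight at its final destination instead:
Redirect 301 /old-page https://www.example.com/new-page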
If you have 404 pages, there are lots of ways to customize them and give users some guidance so you won't lose them. Design a friendly page and send the user back to your homepage or to another relevant, related piece of content.
To find the broken pages on your website, you can use Search Console by looking at Crawl » Crawl Errors, then clicking on Not found (if any).
Site Explorer offers a similar feature, pointing out the link juice you are losing (the number of referring domains and links for each broken page).
5. Leverage Browser Caching
The browser cache automatically saves resources on the visitor's computer the first time they visit a new website. When they return to that page, those saved resources help them get the desired information faster. This way, the page load speed is improved for returning visitors.
For visitors that want to return to a page or visit a new page that in a specific moment can't be accessed, there's the option to view the cached version directly from SERP.
The best way to significantly improve the page load speed is to leverage the browser cache and set it according to your needs.
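As a rough sketch of what that can look like on an Apache server (assuming mod_expires is enabled; the exact lifetimes are up to you):
<IfModule mod_expires.c>
  ExpiresActive On
  # Static assets rarely change, so let returning visitors reuse them
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png "access plus 1 year"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>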
6. Minimize Render-Blocking JavaScript and CSS
When you perform a speed test with Google's PageSpeed Insights, you will see the message Eliminate render-blocking JavaScript and CSS in above-the-fold content if you have blocked resources that delay the rendering of your page. Besides pointing out those resources, the tool also offers some great technical SEO tips regarding:
You can remove render-blocking JavaScript by following Google's guidelines, which suggest avoiding or minimizing the use of blocking JavaScript through three methods:
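These methods boil down to inlining small critical scripts and marking the rest as asynchronous or deferred. A minimal sketch (the script names are placeholders):
<!-- async: download in parallel and execute as soon as the script arrives -->
<script async src="analytics.js"></script>
<!-- defer: download in parallel but execute only after the document is parsed -->
<script defer src="non-critical.js"></script>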
If Google detects a page that delays the time to first render because it contains blocking external stylesheets, you should optimize CSS delivery. In this case, you have two options:
For WordPress users there are simpler solutions:

Source: www.webid-online.com

Source: www.factoriadigital.com
Check the Enable box for the Minify option, select Manual mode, click Save all settings, and then add the scripts and CSS files that you want to minify. After that, you're set.
II. Website Functionality & Usability
7. Make Sure Your Resources Are Crawlable
Having non-crawlable resources is a critical technical SEO issue. Crawling is the first step, right before indexing, which is what puts your content in front of users' eyes. Basically, Googlebot crawls the data and sends it to the indexer, which renders the page; after that, if you're lucky, you'll see that page ranking in SERP.

www.slideshare.net/ryanspoon
It is very important that the users see the same content that the Googlebot does.
If your CSS files are blocked from crawling, Google won't be able to see the pages the way a user does. The same applies to JavaScript if it isn't crawlable. With JavaScript it is a little more complicated, especially if your site is heavily built on AJAX: the server needs to be set up to send an accurate version of the site to Google.
If you're not blocking Googlebot from crawling your JavaScript or CSS files, Google will be able to render and understand your web pages like modern browsers do.
Google recommends using Fetch as Google to let Googlebot crawl your JavaScript.

Source: www.webnots.com
8. Verify the Indexed Content
James Parsons, expert in content marketing and SEO, explains in an article on AudienceBloom the crucial significance of the indexing phase for a website.
Search Console can provide lots of insightful information regarding the status of your indexed pages. The steps are simple: go to Google Index, then to Index Status, and you'll see a chart similar to the one shown below:
Ideally, the number of indexed pages should match the total number of pages within your website, minus the ones you don't want indexed. Verify that you've set up proper noindex tags. If there is a big difference, review the pages and check for blocked resources; if everything looks OK there, check whether some of the pages simply weren't crawled and therefore couldn't be indexed.
If you still haven't found anything out of the ordinary, test your robots.txt file and check your sitemap, following the next two steps (9 and 10).
9. Test Your Robots.txt File to Show Google the Right Content
Testing your robots.txt file helps you tell Googlebot which pages to crawl and which not to crawl. This is how you control which of your data Google can access.
You can view your robots.txt file online by going to http://domainname.com/robots.txt. Make sure the order of the directives is right. It should look similar to what you can see in the next picture:
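As a point of reference, a minimal robots.txt sketch might look like this (the disallowed paths are placeholders; yours will differ):
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://www.domainname.com/sitemap.xml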
Use the robots.txt Tester tool from Search Console to write or edit robots.txt files for your site. The tool is easy to use and shows you whether your robots.txt file blocks Google web crawlers from specific URLs. The ideal situation would be to have no errors:
The errors appear when Google is unable to crawl a specific URL due to a robots.txt restriction. There are multiple reasons for that, and Google names just some of them:
The common issues that appear when Googlebot is blocked from accessing your website happen because:
After you've checked the issues and found the blocked resources pointed out in the Tester tool, you can test again and see if your website is OK.
10. Review Your Sitemap to Avoid It Being Outdated
An XML sitemap explains to Google how your website is organized. You can see an example in the picture below:

Source: statcounter.com
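If you have never looked inside one, here is a bare-bones sitemap.xml sketch (URLs and dates are placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.domainname.com/</loc>
    <lastmod>2017-01-16</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.domainname.com/blog/</loc>
    <lastmod>2017-01-10</lastmod>
  </url>
</urlset>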
Crawlers read the sitemap to understand how a website is structured in a more intelligible way. A good structure means better crawling. Use dynamic XML sitemaps for bigger sites, and don't try to manually keep robots.txt, meta robots, and the XML sitemaps in sync.
Search Console comes to the rescue once again. In the Crawl section, you can find the Sitemaps report, where you can add, manage and test your sitemap file.
At this point, you have two options: test a new sitemap or test a previously added one. In the first case:
In the second case, you can test an already submitted sitemap, click on Test and check the results.
There are three things you need to do in this second case.
11. Test Your Hashbang URLs with Fetch as Google
Hashbang URLs (URLs that contain #!) can now be checked and tested in Fetch as Google. John Mueller acknowledged that Google has the ability to fetch and render hashbang URLs via the Search Console.
Google had stopped supporting them on March 30, 2014, but that changed when it announced, on October 14, 2015, that it was deprecating its AJAX crawling scheme. At the moment, hashbang URLs can be tested.
Below you can see two situations for the same website. The first picture shows the list of resources before using the fetch and render feature with the hashbang URL, and the second one shows the situation after the fetch and render action was performed.

Source: www.tldrseo.com
12. Optimize Your Crawl Budget
The term "crawl budget" was explained for the first time on January 16, 2017, when Gary Illyes described what it means for Google.
Crawl budget refers to how many resources a server allocates to crawling, or how many pages the search engines crawl in a specific period of time. Google says there is nothing to worry about if your pages tend to be crawled every day; the issues appear on bigger sites, which is why it is very important to optimize your crawl budget.
Maria Cieślak, SEO expert, explains in an article on DeepCrawl the importance of optimizing your crawl budget.
The crawl rate limit also comes into the discussion: it caps the maximum fetching rate for a given site.
The actions recommended for optimizing the crawl budget are:
13. Set up 301 Redirects When Migrating Your Site
Site migration is a recommended operation when the website changes completely and the same domain won't be used anymore. Setting up 301 redirects also applies when you switch from HTTP to HTTPS and want to preserve the link equity.
In case of a site migration, it is crucial to set up the redirects correctly. To avoid losing lots of links and ending up with broken pages, it is best to follow a correct 301 redirection procedure. For that, take into consideration the next recommendations; for the most part, we've already covered some of them in the previous steps:
14. Don't Use Meta Refresh to Move a Website
Since we've talked about the redirection plan for migrating a site, it is best to understand why Google doesn't recommend using meta refresh for moving a website. There are three ways to define redirects:
Aseem Kishore, owner of Help Desk Geek.com, explains why it is better not to use this meta refresh technique:
When possible, always try to use HTTP redirects, and don't use a <meta> element. HTTP redirection is the preferred option, but sometimes web developers don't have control over the server and must use other methods. HTML redirection is one of them, yet Google strongly discourages web developers from using it.
If a developer uses both HTTP and HTML redirects and they stop being identical, the page might end up in an infinite loop, which leads to other problems.
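To make the difference concrete, here is a hedged sketch with placeholder URLs. The HTML redirect Google discourages:
<meta http-equiv="refresh" content="0; url=https://www.example.com/new-page/">
And the preferred server-side alternative, expressed here as an Apache directive that returns an HTTP 301:
Redirect 301 /old-page/ https://www.example.com/new-page/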
In case you want to move a site, Google's guidelines recommend following the next steps:
15. Create an HTML Version of Your Flash Site
Creating a Flash site without a redirect to the HTML version is a big SEO mistake. Flash content might look appealing, but just like JavaScript and AJAX, it is difficult to render. The crawler needs all the help it can get to crawl the data and send it to the indexer, so the Flash site must redirect to an HTML version.
If you have a pretty site, what's the point if Google can't read it and show it the way you'd want it to? Flash websites might tell a beautiful story, but it's all for nothing if Google can't render them. HTML is the answer! Build an HTML version with SWFObject 2.0, which lets you embed Flash content while serving alternative HTML content.
16. Use Hreflang for Multi-Language Websites
Hreflang tags are used for language and regional URLs. It is recommended to use the rel="alternate" hreflang="x" attribute to serve the correct language or regional URL in Search results in the following situations:
Maile Ohye, former Developer Programs Tech Lead at Google, explains how site owners can expand to new language variations and keep the site search-engine friendly:
Based on these options, you can apply multiple hreflang tags to a single URL. Make sure, though, that the hreflang values you provide are valid:
We've documented a complete guideline on the vital hreflang & multi-language website mistakes that most webmasters make, which we recommend you follow.
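For reference, a minimal sketch of hreflang annotations placed in the <head> of each language version (URLs are placeholders); every version should carry the full set, including a link pointing to itself:
<link rel="alternate" hreflang="en" href="https://www.example.com/en/" />
<link rel="alternate" hreflang="de" href="https://www.example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />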
17. Use the Secure Protocol – HTTPS
On August 6, 2014, Google announced that the HTTPS protocol had joined its list of ranking factors and recommended that all sites move from HTTP to HTTPS.
HTTPS (Hypertext Transfer Protocol Secure) encrypts the data and doesn't allow it to be modified or corrupted during transfer, protecting it against man-in-the-middle attacks. Besides the improvement in data security, it has other benefits, such as:
If you use HTTPS, you will see a lock icon before the URL in the navigation bar:
If your website doesn't use the HTTPS protocol, you'll see an information icon instead; click on it and a message will alert you that the connection is not safe, therefore the website is not secure.
While it is best to move from HTTP to HTTPS, it is crucial to find the best way to recover all your data after moving your website. For instance, lots of users complained that they lost all of their social shares after moving the website, and the same thing happened to us.
After we experienced the same issue, we created a guideline on how to recover Facebook (and Google+) shares after an HTTPS migration that you can easily follow:
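If you make the move, the server also has to send visitors and crawlers to the secure version. A minimal Apache sketch, assuming mod_rewrite is enabled:
RewriteEngine On
# Send every HTTP request to its HTTPS equivalent with a permanent 301
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]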
18. Build Easy-to-Follow URLs
We've talked about this topic various times before because it is important to have easy-to-follow URLs. Avoid having query parameters in your URLs: you can't keep track of such URLs in Analytics, Search Console and so on, and they make link building harder. You might lose linking opportunities because of your URLs' appearance.

Source: blogspot.com
If you're a WordPress user, you have the option to personalize and set up your permalink structure. If you take a look at the next picture, you can see the options you have for your URL structure.
Building friendly URLs is not so hard; you can follow the next 3 tips:
By building easy-to-read URLs that contain the focus keyword, you are thinking about your users and therefore focusing on user experience. David Farkas has the same vision on the matter:
III. Content Optimization
19. Replace Broken Images
Sometimes the images on a webpage aren't available, so a broken image is displayed in the visitor's browser. It can happen to anybody, and there are lots of reasons for it, but it is not a pretty situation. You know the saying: a picture is worth a thousand words, and a missing picture, replaced by an ugly icon and an error message, says something as well…
A solution would be to add an onerror handler on the IMG tag:
<img src="http://www.example.com/broken_url.jpg"onerror="this.src='path_to_default_image'" />
1
<img src="http://www.example.com/broken_url.jpg"onerror="this.src='path_to_default_image'" />
Some webmasters say that Chrome and Firefox recognize when images aren't loaded and log it to the console, while others disagree.
Sam Deering, web developer specialized in JavaScript & jQuery, offers some great steps to resolve these issues:
20. Strengthen Your Internal Linking
Internal links are the connections between your pages and, thanks to them, you can build a strong website architecture by spreading link juice, or link equity, as others refer to it.
Creating connections between similar pieces of content is what the term silo content refers to. This method involves creating groups of topics and content based on keywords and defining a hierarchy.

Source: www.seoclarity.net
There are a lot of advantages to building internal links, because they:
The more the relevant pages are linked to one another, the more often they are crawled, and as the crawling frequency rises, so can the overall ranking in search engines.

When you audit internal links, there are four things that need to be checked:
21. Remove Duplicate Content
When we talk about technical SEO, we also think of duplicate content, which is a serious problem. Be prepared and review the HTML Improvements report in Search Console to remove the duplicates.
Keep unique and relevant title tags and meta descriptions across your website by looking in Search Console at Search Appearance » HTML Improvements.
In Search Console, you can find a list of all the duplicate content, leading you to the pages that need improvement. Remove or review each element and craft new titles and meta descriptions. Google loves fresh and unique content; the Panda algorithm confirms it.
Another option is to apply the canonical tag to pages with duplicate content. The rel=canonical tag shows search engines which URL is the original source. Canonicalizing duplicate URLs to avoid content duplication is a recommended practice.
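As a quick sketch, the tag goes in the <head> of the duplicate page and points to the version you want indexed (URLs are placeholders):
<!-- Placed on https://www.example.com/product?color=blue -->
<link rel="canonical" href="https://www.example.com/product" />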
Jayson DeMers, Founder & CEO of AudienceBloom, considers that duplicate content can affect your website and discourage search engines from ranking it, and that it can also lead to a bad user experience, as he explains on Forbes.
22. Use Structured Data to Highlight Your Content
Structured data is the way to make Google understand your content and help users land directly on the page they are interested in, through rich search results. If a website uses structured data markup, Google might display it in SERP as you can see in the next picture:
Besides rich snippets, structured data can be used for:

Source: www.link-assistant.com
The vocabulary used for structured data is schema.org, which helps webmasters mark up their pages in ways the major search engines can understand. You can highlight your content using this markup.
If you want to appear in rich search results, your site's pages must use one of the three supported formats:
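Those formats are JSON-LD, Microdata, and RDFa, with JSON-LD being the one Google generally recommends. A minimal sketch for an article, with placeholder values:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Checklist",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "datePublished": "2017-01-16",
  "image": "https://www.example.com/images/checklist.jpg"
}
</script>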
After you highlight your content using structured data, it is recommended to test it with the Google Structured Data Testing Tool. Testing will show you whether you set it up right and whether you comply with Google's guidelines, because you can get penalized for spammy structured markup.
Google doesn't guarantee that every piece of content highlighted with structured data markup will appear as a rich result.
23. Keep a Reasonable Number of Links On-Page
People from the web community often associate pages with 100 links or more with "link farms". UX also plays a significant role in deciding how many links belong on a single page: a piece of content overloaded with links will distract users and fail to offer them real information because most of it is linked. Add links only where they are relevant, offer extra information or credit a source.
Patrick Sexton, Googlebot Whisperer, explains in an article on Varvy why it is important to keep a reasonable number of links per page:
In general, the more links there are on a page, the more organized that page needs to be so the user can find the information they came for. Also, be careful to look for natural ways to add links and don't violate Google's guidelines for building links. The same recommendation applies to internal links.
24. Avoid Canonicalizing Blog Pages to the Root of the Blog
John Mueller said in one Google Webmaster Hangout that Google doesn't encourage canonicalizing blog subpages to the root of the blog as the preferred version. Subpages aren't a true copy of the blog's main page, so doing that makes no sense.
You can listen to the whole conversation from minute 16:28:
Even if Google sees the canonical tag, it will ignore it because it thinks it's a webmaster's mistake.
IV. User-Friendlier Website
Google cares about UX, so why wouldn't you? Many experts think that UX is crucial for the future of SEO, especially with the evolution of machine learning technology. David Freeman, Search Engine Land columnist, has a strong opinion on the role of UX:
25. Set up Your AMP the Right Way – Mobile Friendlier
Google recommends using AMP (Accelerated Mobile Pages) to improve the UX it values so highly. Since the Google AMP change will affect a lot of sites, it is best to understand how it works and the right way to set it up on different platforms: WordPress, Drupal, Joomla, Concrete5, OpenCart, or a custom AMP implementation.
On this topic, we created a guideline on how to implement AMP because it is a process which needs full understanding. Google AMP doesn't directly affect SEO, but indirect factors that result from AMP can.
26. Add Breadcrumbs for Better Navigation
Breadcrumbs, used by Hansel and Gretel to find their way back home, are implemented by websites with the same purpose: to lead the user through the website. They help visitors understand where they are located on the website and give directions for easier navigation.

Source: www.smashingmagazine.com
Breadcrumbs can improve the user experience and help search engines get a clear picture of the site structure. They fulfill the need for a secondary navigation on the website, but they shouldn't replace the primary navigation.
Another advantage is that they reduce the number of actions and clicks a user must take on a page. Instead of going back and forth, users can simply use the level/category links to go where they want. The technique works especially well for big websites and e-commerce sites.
W3Schools exemplifies how to add breadcrumbs in two steps.
<ul class="breadcrumb"> <li><a href="#">Home</a></li> <li><a href="#">Pictures</a></li> <li><a href="#">Summer 15</a></li> <li>Italy</li> </ul>
<ul class="breadcrumb">
<li><a href="#">Home</a></li>
<li><a href="#">Pictures</a></li>
<li><a href="#">Summer 15</a></li>
<li>Italy</li>
</ul>
/* Style the list */
ul.breadcrumb {
padding: 10px 16px;
list-style: none;
background-color: #eee;
}
/* Display list items side by side */
ul.breadcrumb li {
display: inline;
font-size: 18px;
}
/* Add a slash symbol (/) before/behind each list item */
ul.breadcrumb li+li:before {
padding: 8px;
color: black;
content: "/\00a0";
}
/* Add a color to all links inside the list */
ul.breadcrumb li a {
color: #0275d8;
text-decoration: none;
}
/* Add a color on mouse-over */
ul.breadcrumb li a:hover {
color: #01447e;
text-decoration: underline;
}
If you want a simpler solution, you can use plugins for WordPress, such as Breadcrumb NavXT Plugin or Yoast SEO.
Go to SEO in Dashboard, then click on Advanced and select Enable Breadcrumbs » Save changes. This method will apply the default setting for your breadcrumbs.
Conclusion
Firstly, this SEO guide offers solutions and points out directions on how to make a website fast and decrease the loading time by following the recommendations from Google PageSpeed Insights and Google developers' guidelines.
Secondly, we went through the functional elements of a website, by trying to check and resolve issues related to crawling errors, indexing status, using redirects and making a website accessible to Google.
Thirdly, we looked at improving and optimizing the content by resolving critical technical SEO issues: removing duplicate content, replacing missing information and images, building a strong website architecture, highlighting content and making it visible to Google.
Lastly, we pointed out two issues regarding mobile-friendliness and website navigation.
