Opinions on the importance of fixing technical SEO errors vary widely: some believe it is essential, while others consider it only mildly important. Whatever the level of significance, it is a fact that technical errors affect your ability to rank well on Google and other popular search engine results pages.
Whenever we provide businesses with a Website SEO Audit, we run into the same issues time and time again.
Some sites have hundreds, if not thousands, of errors, while others have fewer than 20. Whatever the total, every site has some potential for on-site improvement. Ultimately, you want your site to rank as high as possible on search engines, so there is no reason not to fix these errors if they exist on your site.
4xx and 5xx Status Codes
4xx and 5xx status codes are considered by many to be "critical on-site SEO errors".
A 4xx status code indicates a client-side error: the web browser requests a file from the server, and the server responds with a code such as 404 to indicate the page couldn't be located. In this case, the browser doesn't know whether the page is missing temporarily or permanently.
A 5xx status code is the server-side counterpart: the error occurred on the server, not the client. Common causes include misconfiguration or a problem with your web server itself, which can make your entire site unavailable to visitors.
We have found that 4xx and 5xx errors are fairly common and can occur for many reasons; the most frequent cause is broken links. Whenever your site links to a file that doesn't exist, a visitor will likely see a 404 (4xx status code) error. We recommend checking that every link on your site returns a valid response from the server, whether it points to images, videos, HTML files, or server-side files (PHP, ASPX, CFM, etc.). If your server is down and clients are receiving a 5xx status code, find the cause and fix the problem as soon as possible.
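As a rough illustration of how an audit tool buckets these responses, here is a minimal Python sketch. The function name and the sample URL-to-status mapping are hypothetical, not taken from any real crawler:

```python
def classify_status(code: int) -> str:
    """Return a coarse audit category for an HTTP status code."""
    if 200 <= code <= 299:
        return "ok"
    if 300 <= code <= 399:
        return "redirect"
    if 400 <= code <= 499:
        return "client error"   # e.g. 404 Not Found: fix or remove the broken link
    if 500 <= code <= 599:
        return "server error"   # e.g. 500/503: investigate the web server itself
    return "unknown"

# A tiny made-up crawl report: each internal URL paired with the code returned.
pages = {"/": 200, "/old-post": 404, "/contact": 500}
broken = {url: code for url, code in pages.items()
          if classify_status(code) in ("client error", "server error")}
print(broken)  # {'/old-post': 404, '/contact': 500}
```

In a real audit the status codes would come from live requests rather than a hard-coded dictionary, but the bucketing logic is the same.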
Crawler Warnings
Pages with crawler "warnings" carry instructions that block search engines from crawling or indexing them. These instructions are essential for keeping pages or directories that don't need to appear in search results out of the index, but if they are used incorrectly, they will block essential pages or directories that you do want indexed.
The robots meta tag is a useful piece of information concerning your site's ability to be indexed. For example, if you include this tag in your page's head,
<meta name="robots" content="noindex" />, you are telling search engines not to index the specific page where it occurs. However, within dynamic web applications like WordPress, if this code exists on one page, it likely exists on all pages, which significantly reduces the amount of search traffic your site will receive.
Another place to check for this error is the robots.txt file. This file lives in your site's root directory and, like the robots meta tag, gives search engines crawling instructions. If you find a line like this in your robots.txt file,
Disallow: /, you are telling search engines not to crawl anything on your site.
We always recommend checking your pages' headers and your robots.txt file to make sure that only pages you don't want indexed carry a "noindex" declaration or a "Disallow: *file name*" rule.
Title Tags
The HTML title tag,
<title>My Page Title</title>, specifies the title of a web page. These titles are displayed on SERPs (Search Engine Results Pages) as the clickable headline for an individual result. Think of your web page's title as an accurate description of the page and its content.
A title tag is a major element, as it is the first thing users see for your page in SERPs. We recommend title tags that are between 50 and 60 characters, accurately describe the content on the page, and are not stuffed with keywords.
Pro Tip: The best title has the following components:
Brand | Primary Keyword Secondary Keyword
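As a quick illustration, here is a small Python check against the 50-60 character guideline; the brand and keywords in the sample title are made up:

```python
# Hypothetical title following the Brand | Primary Keyword Secondary Keyword pattern.
title = "Acme Plumbing | Emergency Pipe Repair in Denver, Colorado"

def title_length_ok(title: str) -> bool:
    """True if the title fits the recommended 50-60 character window."""
    return 50 <= len(title) <= 60

print(len(title), title_length_ok(title))  # 57 True
```

A check like this is easy to run over every page in a sitemap export to flag titles that will be truncated or look sparse in results.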
Meta Descriptions
The HTML meta description tag,
<meta name="description" content="My really cool, important meta description about my blog post.">, gives a summary of the content on the page. Descriptions run anywhere from one sentence to two or three sentences and should contain relevant keywords and information about the page. They are shown on SERPs under the page's URL.
Because meta descriptions are not direct ranking factors, many conclude that they are unimportant, a view we strongly disagree with. Here is what Moz has to say:
Google announced in September of 2009 that neither meta descriptions nor meta keywords factor into Google's ranking algorithms for web search.
Meta descriptions can, however, impact a page's CTR (click-through-rate) on Google which can positively impact a page's ability to rank.
The optimal length for a meta description is anywhere from 50 to 300 characters. Make sure you write compelling, comprehensive descriptions that are not stuffed with irrelevant keywords. Also, avoid double quotes: Google will cut off your meta description wherever a double quote appears, so use HTML entities (&quot;) in their place.
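If your CMS doesn't escape quotes for you, Python's standard html module can do it; the description text below is a hypothetical example:

```python
import html

# A hypothetical meta description containing double quotes.
desc = 'Our guide to "technical SEO" covers status codes, robots.txt, and more.'

# html.escape converts " to &quot; (and also &, <, >, and '),
# so the attribute value in the meta tag can't terminate early.
safe = html.escape(desc)
print(safe)
```

The escaped string drops straight into the content attribute of the meta tag without breaking the markup.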
Content Issues
Whenever possible, avoid the following content issues:
- Duplicate Pages or Duplicate Content
- Duplicate Title Tags
- Thin Content (50 words or fewer on a page)
- Slow page speed or load time
In conclusion, countless factors go into a page's ability to rank. While on-site errors aren't the only mark against your site, they are important to fix. Whenever you have the opportunity to help your site rank higher, take it, and you will soon see your efforts pay off. Contact us with any further questions!