Website Technical Audit

A technical audit is the basic stage of internal site optimization.

A website audit helps determine how technically ready the site is for promotion and identifies major errors that may interfere with the indexing and ranking of your resource by search engines.

The main points to pay attention to when conducting a technical audit:

Main site mirror (whether the www and non-www versions of the site are merged, and which mirror is the main one). For example: http://bostonseocompany.site
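
A sketch of how the www mirror could be glued to the non-www version with a 301 redirect, assuming an Apache server with mod_rewrite enabled (adapt the domain and rules to your own setup):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.bostonseocompany\.site$ [NC]
RewriteRule ^(.*)$ http://bostonseocompany.site/$1 [R=301,L]
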
Correct handling of 404 errors. If a page does not exist or has been deleted, the site must return a 404 error page. It is important to check that the server responds to such requests with HTTP/1.1 404 Not Found; otherwise, such a page may end up in the search engine index.
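
On Apache, for example, a custom error page can be configured roughly like this (the /404.html path is only an illustration; Apache keeps the 404 status as long as the target is a local page):

ErrorDocument 404 /404.html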

Canonical addresses (rel="canonical").

Allows you to tell the search engine which document in a group of pages with the same content is the main version. Incorrect use may result in pages being pushed out of the index.
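
For example, a duplicate page can point to its main version with a tag like this in its <head> section (the URL here is purely illustrative):

<link rel="canonical" href="http://bostonseocompany.site/catalog/page/">
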
Human-friendly URLs on the site. “Human-understandable” URLs serve page optimization for both usability and search engines (an additional occurrence of keywords in the URL, and highlighting in the search results).
Navigation through scripts. Links to other pages of the site should use the <a> tag, which is correct from an SEO point of view. However, other technologies are sometimes used for this, in particular Flash or JavaScript. Robots do not follow such links, so if you want internal website optimization to have an effect, you should not use them. If this cannot be avoided, duplicate such links with the <a> tag.
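
For instance, a link handled by JavaScript can still carry a normal href that robots are able to follow, while the script handles the click for users (openCatalog() is a hypothetical function used only for illustration):

<a href="/catalog/" onclick="openCatalog(); return false;">Catalog</a>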

Remove broken links. There is no need to “spray” page weight on materials that return 404 errors or to send users and robots to non-existent URLs.
Page loading speed. Slow page loading negatively affects the attitude of both users and search engines toward the site. A user may leave without waiting for the page to load fully, and a search robot may remove the page from the index or demote it.
Cross-browser compatibility. To reduce the bounce rate, it is important to understand how the pages of your site are displayed in different browsers and, if necessary, make corrections to the code.
Cloaking. The use of optimization methods as a result of which search robots and users see different content on the site’s pages. Cloaking is considered a purely “black-hat” method of website optimization, and all search engines fight against it. Hidden text on a site is one form of cloaking.

Region of the website. To rank for geo-dependent queries, the site must be assigned a target region.
Multiple redirects. Using many redirects is not recommended; they should be used only when really necessary. To send users from old pages to new ones, it is better to use a 301 redirect.
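
For example, on an Apache server a single old page can be pointed to its new address with a 301 redirect like this (the paths are hypothetical):

Redirect 301 /old-page.html http://bostonseocompany.site/new-page.html
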
Absence of technical duplicate pages. Each page of the site should be accessible at only one physical address; if this is not the case, the duplicate must be removed or closed from indexing in robots.txt. The page address should also not contain session identifiers or lists of CGI parameters.
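
Duplicate URLs with session identifiers can be closed in robots.txt, for example like this (the sessionid parameter name is only an example; major search engines support the * wildcard in Disallow rules):

User-agent: *
Disallow: /*?sessionid=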

Proper use of the noindex tag and the nofollow attribute. Internal optimization involves checking that pages and links on the resource are indexed correctly.
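
In practice this usually means checking markup such as the robots meta tag and the rel="nofollow" attribute, for example (the external address is illustrative):

<meta name="robots" content="noindex, nofollow">
<a href="http://external-site.example/" rel="nofollow">External link</a>
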
Reliable hosting. Site availability 24×7 and the hosting provider’s rapid response to downtime problems are important components of promotion.
Mobile optimization (mobile-friendliness). Searching for information on the Internet from smartphones has become commonplace, so search engines now prefer mobile-friendly sites in mobile search results.
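
At a minimum, a mobile-friendly page declares a viewport in its <head>, for example:

<meta name="viewport" content="width=device-width, initial-scale=1">
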
Generating the correct robots.txt file.

Robots.txt is a file in the root directory of the site that contains instructions for search robots.
In the robots.txt file, you can restrict the indexing of pages that should not appear in search engines (for example, duplicates or on-site search pages), as well as specify the main site mirror, the crawl rate, and the path to the sitemap in XML format.
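
A sketch of such a file might look like this (the paths and domain are illustrative; note that the Host directive is recognized only by some search engines, notably Yandex, and Crawl-delay is likewise not honored by all robots):

User-agent: *
Disallow: /search/
Crawl-delay: 5
Host: bostonseocompany.site
Sitemap: http://bostonseocompany.site/sitemap.xml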

If you are internally optimizing a new site, you will most likely have to create robots.txt from scratch and place it in the root directory. This is easy to do: open a plain text editor such as Notepad, add the necessary directives, save the file, and upload it to the root directory of the site.

What exactly should be in the file?
Robots.txt must have at least two directives:

User-agent: – indicates which search engine the instructions are intended for. To address all search engines at once, use an asterisk (*) as the User-agent value.

Disallow: – indicates which directories or pages of the site should not be indexed. For example, you can close a file or a folder from indexing by specifying the following directives:

Disallow: /file.html
Disallow: /folder/

If you do not want to close any pages of the site from indexing, it is enough to specify the following:

User-agent: *
Disallow:
