This article covers a well-established program for conducting a thorough, multi-faceted site audit. Despite its popularity in the West, far from all SEO specialists, let alone website owners, know how to use Screaming Frog SEO Spider or what questions it can answer. We decided to fill this gap, publish a short guide to Screaming Frog SEO Spider, and explain how to work with it.
- General information about Screaming Frog
- Getting Started with Screaming Frog SEO Spider: Initial Setup
- Basic features of Screaming Frog SEO Spider
- Response codes
- Page Titles, Meta Description
- H1 and H2
- Additional popular Screaming Frog features
General information about Screaming Frog
The Screaming Frog site-audit tool will tell you almost everything about a website and also helps with routine tasks; for example, it can generate a Sitemap file. The program was created by a British programmer who spent several years doing SEO for websites. Perhaps it was thanks to this extensive hands-on experience that he managed to create such a complete and useful product.
Screaming Frog is installed locally and crawls websites from your own computer. Depending on your needs, you can choose the limited free version, which allows you to crawl up to 500 URLs per site, or the full paid version.
Besides the limit on the number of scanned pages, the free version of Screaming Frog also offers a reduced set of features. It is fine for trying out the program, but for a full site audit it is better to purchase an annual license. Considering that it can replace most of the narrower analysis services you may have used before, you will save money in the end.
The program is only available in English, but everything is simple and clear, so getting to grips with its tools will be easy.
Getting Started with Screaming Frog SEO Spider: Initial Setup
Actually, everything is simple here. To start collecting data, enter the site address in the field provided and click the Start button. The program will automatically load all the information and lay it out neatly across tabs for convenient study.
If you need to scan only individual folders or pages of the site, go to Configuration > Include in the Screaming Frog menu and list them.
Conversely, if you need to exclude certain sections from the crawl, go to Configuration > Exclude. Here you can specify particular URLs, or exclude folders, pages, and files that contain specific words or characters.
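Screaming Frog's Include and Exclude fields accept regular expressions. As a rough illustration of the idea (this is a hedged Python sketch, not the program's internal code, and the example patterns and URLs are made up):

```python
import re

def should_crawl(url, include_patterns=None, exclude_patterns=None):
    """Mimic include/exclude URL filtering with regular expressions.

    A URL is crawled when it matches at least one include pattern
    (if any are given) and matches no exclude pattern.
    """
    if include_patterns and not any(re.search(p, url) for p in include_patterns):
        return False
    if exclude_patterns and any(re.search(p, url) for p in exclude_patterns):
        return False
    return True

# Example: crawl only the /blog/ folder, but skip print versions.
include = [r"https://example\.com/blog/.*"]
exclude = [r".*\?print=1"]

print(should_crawl("https://example.com/blog/post-1", include, exclude))  # True
print(should_crawl("https://example.com/shop/item", include, exclude))    # False
```

The same pattern syntax works for both fields: Include narrows the crawl down, Exclude carves pieces out of it.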
Let's consider several basic crawl settings in Screaming Frog SEO Spider:
- If you want to speed up the crawl, or simply have no need to analyze certain file types, you can exclude them under Configuration > Spider > Crawl.
- Similarly, if you only need to analyze certain page elements, such as Title, Description, and H1, you can enable them in the Extraction tab and disable the rest.
- In the Limits tab, the Limit Crawl Depth setting lets you restrict how many clicks deep from the start page the crawler will go.
- To make the Screaming Frog crawler treat pages the way search engine robots do, tick the Respect Noindex and Respect Nofollow boxes in the Advanced tab. As a bonus, this also reduces crawl time.
- In the Preferences tab, you can specify the desired length limits for Title, Description, and other scanned elements.
- If the site is closed from indexing but you still want the Screaming Frog crawler to analyze it, select the "Ignore robots.txt" option under Configuration > Robots.txt.
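To understand what "Ignore robots.txt" switches off, you can reproduce the standard robots.txt check with Python's built-in parser. This is a simplified sketch of the general mechanism, not Screaming Frog's own logic, and the rules and URLs are illustrative:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly (normally you would fetch it from the site).
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

def allowed(url, user_agent="*", ignore_robots=False):
    """Return True if a crawler may fetch the URL.

    ignore_robots=True mimics the "Ignore robots.txt" option:
    every URL is treated as crawlable.
    """
    return True if ignore_robots else parser.can_fetch(user_agent, url)

print(allowed("https://example.com/private/page"))                     # False
print(allowed("https://example.com/private/page", ignore_robots=True)) # True
```

With the option off, a disallowed URL is simply never fetched; with it on, the rules file is bypassed entirely.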
Now let's get acquainted with Screaming Frog's main functionality and find out how to use the information the program provides.
Basic features of Screaming Frog SEO Spider
The main site-analysis tools are organized into sections of the program, presented as individual tabs. Let's go through them.
Internal Tab
This tab presents information about the site's internal links. Depending on the settings, page data can be sorted alphabetically or by priority according to page depth, starting from the site's main page.
To study detailed information about any page, click its address in the results. You will see the page's size, server response code, word count, images, headings, meta tags, snippet, internal and external links, and other information that helps you judge whether everything is in place or needs adjustment.
External Tab
This tab presents information about external links, that is, links pointing from your own site to third-party resources. It also shows encodings, link types, the location of the page containing the external links, and other data.
If the main domain has subdomains and they are not included in the settings, Screaming Frog will treat links to them as external. To prevent this, tick Crawl All Subdomains in the Configuration menu; the program will then treat subdomains as part of the main site.
Protocol Tab
Here you can see pages served over http and https separately. If the site runs on https but Screaming Frog shows pages on http, check their server response code (the Status Code should be 301) and make sure nothing links to them in the Inlinks tab.
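The same mixed-protocol check can be scripted over a list of crawled URLs. A hedged sketch, where the URL-to-status mapping is made up for illustration:

```python
from urllib.parse import urlparse

def find_http_pages(crawled):
    """Return URLs still served over plain http from a {url: status_code} dict.

    On an https site, each of these should answer with a 301 redirect;
    anything else is worth fixing.
    """
    problems = []
    for url, status in crawled.items():
        if urlparse(url).scheme == "http" and status != 301:
            problems.append(url)
    return problems

crawl_results = {
    "https://example.com/": 200,
    "http://example.com/old-page": 200,  # should redirect, but returns 200
    "http://example.com/moved": 301,     # correctly redirects
}
print(find_http_pages(crawl_results))  # ['http://example.com/old-page']
```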
Response Codes Tab
This tab shows server response codes. At the top of the page you can set a filter by response status class and other parameters. For example, you can check:
- which pages are blocked by the robots.txt file - Blocked by Robots.txt;
- which return no response - No Response;
- which are automatically redirected to other pages - Redirection (3xx);
- which return client-side errors - Client Error (4xx);
- and which return server-side errors - Server Error (5xx).
After analyzing this information, you can find faulty response codes and fix them, or remove links that lead to nonexistent pages.
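The status-class filters above boil down to bucketing URLs by the first digit of their response code. A minimal sketch (the URL-to-code mapping is made up for illustration):

```python
def group_by_status_class(pages):
    """Bucket URLs by response status class, like the Response Codes filters."""
    buckets = {"2xx": [], "3xx": [], "4xx": [], "5xx": [], "no response": []}
    for url, code in pages.items():
        if code is None:
            buckets["no response"].append(url)
        elif 200 <= code < 300:
            buckets["2xx"].append(url)
        elif 300 <= code < 400:
            buckets["3xx"].append(url)
        elif 400 <= code < 500:
            buckets["4xx"].append(url)
        else:
            buckets["5xx"].append(url)
    return buckets

pages = {
    "/": 200,
    "/old": 301,
    "/missing": 404,
    "/broken": 500,
    "/timeout": None,
}
result = group_by_status_class(pages)
print(result["4xx"])  # ['/missing']
```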
URL Tab
If you want to find pages whose addresses themselves look suspicious, go to this tab and see which URLs Screaming Frog flags as problematic. Checking these pages can greatly simplify an optimizer's life. The list of problem pages may include those with:
- an incorrect URL;
- a dynamic address;
- a duplicate, etc.
Page Titles and Meta Description Tabs
These tabs are dedicated to the site's meta tags: they collect information about the Title and Description of each page and the mistakes made in writing them. The most important meta tag issues are:
- values that duplicate the H1 heading;
- non-unique values (duplicated across different pages);
- values that are too short or excessively long;
- missing values, etc.
All of these are errors that need correcting.
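Length limits are a common pitfall here. A sketch of the kind of check these tabs perform; the 30-60 character range for Title is a widely quoted guideline rather than a fixed rule, and the sample data is made up:

```python
def audit_title(title, min_len=30, max_len=60):
    """Flag common Title problems the way the Page Titles filters do."""
    issues = []
    if not title:
        issues.append("missing")
        return issues
    if len(title) < min_len:
        issues.append("too short")
    if len(title) > max_len:
        issues.append("too long")
    return issues

def find_duplicate_titles(pages):
    """Return titles used on more than one page ({url: title} input)."""
    seen = {}
    for url, title in pages.items():
        seen.setdefault(title, []).append(url)
    return {t: urls for t, urls in seen.items() if len(urls) > 1}

print(audit_title("Buy shoes"))  # ['too short']
print(find_duplicate_titles({"/a": "Home", "/b": "Home", "/c": "About"}))
```

The same two checks (length bounds and duplicates) apply to Description values, just with different limits.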
H1 and H2 Tabs
These tabs display all the data related to the headings on each page of the site. As with meta tags, here you can spot violations in how these elements are written and make the necessary changes in good time.
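Extracting headings from a page's HTML to spot duplicate or missing H1s can be sketched with the standard-library parser. This is a simplification (real pages are messier, and the sample markup is invented):

```python
from html.parser import HTMLParser

class HeadingCollector(HTMLParser):
    """Collect the text of h1 and h2 elements from an HTML document."""
    def __init__(self):
        super().__init__()
        self.headings = {"h1": [], "h2": []}
        self._current = None

    def handle_starttag(self, tag, attrs):
        if tag in self.headings:
            self._current = tag
            self.headings[tag].append("")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            self.headings[self._current][-1] += data

html = "<h1>Main</h1><h2>Part one</h2><h1>Second h1!</h1>"
collector = HeadingCollector()
collector.feed(html)
print(collector.headings["h1"])  # ['Main', 'Second h1!'] - two h1s is a problem
```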
Images Tab
This section presents data on all the images used on the site. This is very useful information for optimization: it lets you detect unreasonably "heavy" files, reduce the overall "weight" of the site, and speed up its loading.
Canonicals Tab
This tab registers canonical links (the rel="canonical" tag), which resolve problems when the same content appears on different pages. Analyzing them lets you find errors such as several rel="canonical" attributes on one page, or a page that is closed from indexing being indicated as the canonical one, etc.
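A page with more than one rel="canonical" is exactly the kind of error this tab surfaces. A minimal sketch of detecting it with the standard-library parser (the sample markup is invented):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect every rel="canonical" href found in the document."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonicals.append(attrs.get("href"))

html = """<head>
<link rel="canonical" href="https://example.com/a">
<link rel="canonical" href="https://example.com/b">
</head>"""

finder = CanonicalFinder()
finder.feed(html)
# More than one canonical on a page is an error worth fixing.
print(len(finder.canonicals))  # 2
```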
Additional popular Screaming Frog features
The most commonly used Screaming Frog feature, which lets you get the collected data in a convenient and familiar form, is exporting to a separate Excel file. To do this, click the Export button, which is available on every screen of the program.
It is also very quick and simple to create a Sitemap.xml file, which helps search engine bots crawl the site more efficiently. Screaming Frog has a Sitemaps section where you can generate this file.
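A Sitemap.xml is just a small XML document listing the site's URLs. Screaming Frog builds it for you, but the format itself can be sketched with the standard library (the URLs here are illustrative, and a real sitemap may also carry optional fields like lastmod and priority):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

xml_text = build_sitemap(["https://example.com/", "https://example.com/about"])
print(xml_text)
```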
While testing Screaming Frog SEO Spider, our experts noted that it is a very convenient, understandable, and useful tool for auditing and optimizing a site, one that definitely deserves the attention of SEO professionals and site owners alike. The large amount of important information presented in an easy-to-digest format, together with a simple interface, makes the program an excellent solution for optimizers, both experienced and beginners. We recommend downloading the trial version and testing the program's main features.
And what tools do you use to audit the site?