Screaming Frog Log File Analyser 3.0

The Screaming Frog SEO Log File Analyser allows you to upload your log files, verify search engine bots, identify crawled URLs and analyse search bot data and behaviour for invaluable SEO insight. Download for free, or purchase a licence to upload more log events and create additional projects.

What can you do with the SEO Log File Analyser?
The Log File Analyser is lightweight but extremely powerful, able to process, store and analyse millions of lines of log file event data in a smart database. It gathers key log file data to allow SEOs to make informed decisions. Some common uses include:

Identify Crawled URLs
View and analyse exactly which URLs Googlebot & other search bots are able to crawl, when and how frequently.

Discover Crawl Frequency
Get insight into which search bots crawl most frequently, how many URLs are crawled each day and the total number of bot events.
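To illustrate what crawl-frequency data looks like, here is a minimal sketch (not the tool's own code) that counts daily events per bot by scanning Apache-style combined log lines. The sample log lines and the `daily_bot_events` helper are hypothetical, for demonstration only.

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache combined log format.
LOG_LINES = [
    '66.249.66.1 - - [10/Mar/2019:06:25:14 +0000] "GET /page-a HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [11/Mar/2019:07:01:02 +0000] "GET /page-b HTTP/1.1" 200 3072 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '157.55.39.5 - - [11/Mar/2019:08:14:45 +0000] "GET /page-a HTTP/1.1" 404 512 "-" '
    '"Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
]

# Capture the date portion of the bracketed timestamp, e.g. "10/Mar/2019".
DAY_RE = re.compile(r'\[(?P<day>[^:]+):')

def daily_bot_events(lines, bot_token):
    """Count log events per day for lines whose user-agent contains bot_token."""
    counts = Counter()
    for line in lines:
        if bot_token.lower() in line.lower():
            m = DAY_RE.search(line)
            if m:
                counts[m.group('day')] += 1
    return counts

print(daily_bot_events(LOG_LINES, 'Googlebot'))
# → Counter({'10/Mar/2019': 1, '11/Mar/2019': 1})
```

A production version would parse the full log line (IP, method, URL, status, bytes) rather than substring-matching the user-agent, but the counting principle is the same.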

Find Broken Links & Errors
Discover all response codes, broken links and errors that search engine bots have encountered while crawling your site.

Audit Redirects
Find temporary and permanent redirects encountered by search bots, which may differ from those seen in a browser or simulated crawl.

Improve Crawl Budget
Analyse the most and least crawled URLs and directories of your site to identify waste and improve crawl efficiency.

Identify Large & Slow Pages
Review the average bytes downloaded & time taken to identify large pages or performance issues.

Find Uncrawled & Orphan Pages
Import a list of URLs and match it against log file data to identify orphan or unknown pages, and URLs which Googlebot hasn't crawled.
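Conceptually, this matching is a pair of set differences between the URLs you know about and the URLs seen in the logs. The sketch below uses hypothetical data to show both directions of the comparison; it is an illustration of the idea, not the tool's implementation.

```python
# Hypothetical data: URLs known from a site crawl vs. URLs seen in log file events.
crawled_site_urls = {"/", "/about", "/products", "/contact"}
log_file_urls = {"/", "/products", "/old-landing-page"}

# Uncrawled: known to you (in the site crawl) but never requested by bots in the logs.
uncrawled = crawled_site_urls - log_file_urls

# Orphans: requested by bots in the logs but absent from your site crawl.
orphans = log_file_urls - crawled_site_urls

print(sorted(uncrawled))  # → ['/about', '/contact']
print(sorted(orphans))    # → ['/old-landing-page']
```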

Combine & Compare Any Data
Import and match any data with a 'URLs' column against log file data. Import crawls, directives, or external link data for advanced analysis.

Crawled URLs – View and analyse exactly which URLs have been crawled by search bots, such as Googlebot and Bingbot.
Crawl Frequency – Analyse the most & least frequently crawled URLs by search bot user-agents.
Full Event Data – Access full log file event data for every URL discovered in the logs by timestamp.
Errors – Identify client side errors, such as broken links and server errors (4XX, 5XX response codes).
Redirects – View permanent & temporary redirects (301 or 302 responses).
Inconsistent Response Codes – Quickly view URLs with inconsistent response codes over a period of time.
Time Of Last Response – View exactly when a search bot last crawled a URL (and the first, as well as every other event!).
Average Bytes – Analyse the average bytes of every crawled URL direct from log file event data.
Average Response Time (ms) – Discover the average response time of every URL.
Referers – View the number of referer events for every URL discovered.
Directories – Analyse the most and least frequently crawled directories and sections of the site.
Uncrawled URLs – Import a list of URLs & discover URLs which have not been crawled.
Orphan URLs – Import a list of URLs & discover which are in log data, but not known by you.
Analyse Bots Over Time – Upload multiple log files at once or over time, to analyse and measure bot activity.
Compare Any Data – Upload any data with a ‘URLs’ header to automatically match against log file data and analyse.
Verify Search Bots – Automatically verify search bots such as Googlebot, and view IPs spoofing requests.
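Bot verification typically follows the reverse-then-forward DNS check that Google documents for Googlebot: look up the PTR hostname for the requesting IP, confirm it ends in googlebot.com or google.com, then resolve that hostname forward and confirm it maps back to the same IP. The sketch below implements that logic with injectable resolvers so it can be demonstrated offline; the `fake_reverse`/`fake_forward` mappings are made-up stand-ins, while the default `socket` calls are real standard-library functions.

```python
import socket

def verify_googlebot(ip, reverse_dns=None, forward_dns=None):
    """Verify a claimed Googlebot IP via reverse-then-forward DNS.

    The resolver arguments default to real DNS lookups but are injectable
    so the logic can be tested without network access.
    """
    reverse_dns = reverse_dns or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward_dns = forward_dns or socket.gethostbyname
    try:
        host = reverse_dns(ip)
    except OSError:
        return False  # no PTR record at all
    if not host.endswith(('.googlebot.com', '.google.com')):
        return False  # hostname is not in Google's crawler domains
    try:
        return forward_dns(host) == ip  # forward lookup must match the original IP
    except OSError:
        return False

# Offline demonstration with stand-in resolvers (real use omits these arguments):
fake_reverse = {'66.249.66.1': 'crawl-66-249-66-1.googlebot.com',
                '203.0.113.9': 'spoofed.example.com'}
fake_forward = {'crawl-66-249-66-1.googlebot.com': '66.249.66.1'}

print(verify_googlebot('66.249.66.1',
                       reverse_dns=fake_reverse.__getitem__,
                       forward_dns=fake_forward.__getitem__))  # → True
print(verify_googlebot('203.0.113.9',
                       reverse_dns=fake_reverse.__getitem__,
                       forward_dns=fake_forward.__getitem__))  # → False (spoofed)
```

The forward-confirmation step is what defeats spoofing: an attacker can set any PTR record on an IP they control, but cannot make Google's hostnames resolve back to it.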
