Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Anyone used Screaming Frog SEO Spider for Google optimization?

         

born2run

3:10 am on Dec 15, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi,

There is a company that sells Screaming Frog SEO Spider, an SEO tool.

Anyone used this? Is it any good for Google SEO? Thanks in advance!

aakk9999

3:25 am on Dec 15, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



We have discussed this tool in both of our Favourite SEO Tools threads, so I suggest you head there first:

Favourite SEO Tools [webmasterworld.com]
2013 Favourite SEO tools [webmasterworld.com]

I have been using it for over 4 years and I find it excellent. It started out as essentially Xenu with a nicer interface, but over time many new features have been added. I personally cannot imagine doing a site audit without it now.


Mod's note: Let's stick to the tool review. Blatantly promotional posts will be deleted.

born2run

4:35 am on Dec 15, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Thanks for the rec. Anyone else using this software? Any configuration tips? Thanks again!

netmeg

1:17 pm on Dec 15, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It's one of two tools I consider mandatory. As far as configuration goes, it depends on what you're doing. Start with it as-is and see what you get.

EssayPartner

2:20 pm on Dec 15, 2014 (gmt 0)

10+ Year Member



I use it only to find out how many internal and external links a site has in total, and that's all.

aakk9999

3:19 pm on Dec 15, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



For starters, I use it to:

- check response codes (internal and external)
- check there are no internal redirects
- check external redirects
- check whether any internal or external links return 404, and determine where they are
- check that canonicals are correct
- check meta noindex and other meta tags
- check I do not have unexpected URLs and that there is no infinite URL space
- check <title> duplication, length and missing titles
- check where <title> is the same as <h1>
- check meta descriptions
- check anchor text in links
- check for duplicate pages using the hash column
- as I work with a multilingual site, sort by language folder to get all URLs per language
- get a crude view of performance by running Screaming Frog with 8 threads concurrently on 2-3 machines and letting them hit the server
- check which pages are built really slowly server-side by looking at response time (using a few runs and comparing which pages are slow in all of them)
- create a sitemap
- find instances of a certain text in the HTML when investigating something
- check incoming and outgoing links for every page
- and other things

It has been invaluable for domain redevelopments, domain migrations and site audits, as it saves a lot of time. But as with any tool, it is only as good as the person using it. You need to know what you are looking for and what to do with the results.

WhateverWorks

4:05 pm on Dec 15, 2014 (gmt 0)

10+ Year Member



It seems to me Screaming Frog only spiders a very reduced sample of my URLs (compared to Xenu). Am I doing something wrong?

netmeg

4:13 pm on Dec 15, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It's a paid tool. The free version only spiders 500 URLs.

All that aakk9999 said, plus you can find pages that don't have a particular code snippet (like Google Analytics, Tag Manager or a conversion code), find out how many clicks away from the home page your content is, and everything dumps into Excel; it's invaluable for creating redirects when you're migrating to a new site.

(Also, Dan Sharp who is one of the founders, is very accessible on Twitter and very open to suggestions and improvements, which has been great)

born2run

3:16 am on Dec 16, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Thanks guys for the recs. Is there a Mac version of Xenu?

aristotle

7:01 pm on Dec 17, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



aakk9999 wrote:
check where <title> is the same as <h1>

Is there some kind of taboo against using the same term for both the title and the <h1> header?

I've been doing it a long time on a lot of articles, and it evidently has never caused any ranking problems. Many of those articles have ranked number 1 in Google for that same term for years.

lucy24

8:58 pm on Dec 17, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Is there a Mac version of Xenu?

Not unless one has been recently added. But there's a downloadable version of the w3 link checker that you can run locally. If I remember rightly, it involves Terminal at some point :(

Planet13

10:26 pm on Dec 17, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If I remember rightly, it involves Terminal at some point


:)

aakk9999

10:28 pm on Dec 17, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Is there some kind of taboo against using the same term for both the title and the <h1> header

Not a taboo, more like a missed opportunity :)

aristotle

12:07 am on Dec 18, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Thanks Sandra - you make a good point - that's a good way to look at it. Occasionally I make them different for that reason too.

But I usually make them the same because I don't want anything to confuse or distract the readers, which might happen if they saw an unexpected title at the top of the article.

born2run

3:06 am on Dec 21, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi, does this SEO Spider app by Screaming Frog allow for wildcard searches of its results?

E.g. search for: example.com/*.jpg

* being the wildcard character. Thanks!

phranque

6:01 am on Dec 21, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



if you have the paid version, you can specify a regular expression for URLs to include in or exclude from the crawl.

born2run

12:17 pm on Dec 21, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



No, not the URLs to include/exclude. After the audit is done, the app shows a table of URLs.

I'm referring to that table: there is a search box where you can enter part of any URL. I was wondering if that search box allows wildcard searches as well, and if so, what's the syntax? Thanks!

aakk9999

1:43 pm on Dec 21, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I am not sure what you can do in the unpaid version, but in the paid one, once you have a table of URLs, the search box at the top right will accept regular expressions.

So, taking your example, to search for: example.com/*.jpg

One way to achieve the above is to enter the search pattern in the search box like this: example.com/.*\.jpg where .* is your wildcard pattern and \. is an escaped dot.

(I am sure Lucy has a more efficient regular expression pattern to use, but the above is simple and it works.)
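To see how that pattern behaves outside of Screaming Frog, here is a small Python sketch; the URL list is made up purely for illustration, it just stands in for the crawled results table:

```python
import re

# Hypothetical sample of crawled URLs, standing in for the results table
urls = [
    "http://example.com/images/photo.jpg",
    "http://example.com/photos/cat.JPG",
    "http://example.com/about.html",
    "http://example.com/jpg-guide.html",
]

# .* is the wildcard, \. is an escaped (literal) dot
pattern = re.compile(r"example\.com/.*\.jpg")
matches = [u for u in urls if pattern.search(u)]
print(matches)  # only /images/photo.jpg: the match is case-sensitive and needs a literal ".jpg"
```

Note that a plain regex like this is case-sensitive, so .JPG files would be missed unless you allow for that.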

born2run

2:31 pm on Dec 21, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Thanks. What if I want to search for example.com/*jpg?

Can you give me the search format? There is no dot between * and jpg.

aakk9999

5:22 pm on Dec 21, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Then use this: example.com/.*jpg
This will give you results for anything that has the string "jpg" anywhere after the domain name.

If "jpg" must be at the end of the URL, then search for example.com/.*jpg$
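The difference between the two patterns can be sketched in Python (the sample URLs are hypothetical):

```python
import re

urls = [
    "http://example.com/a/photo.jpg",
    "http://example.com/jpg-guide.html",
    "http://example.com/thumb.jpg?size=200",
]

# "jpg" anywhere after the domain name
anywhere = [u for u in urls if re.search(r"example\.com/.*jpg", u)]

# "jpg" only at the very end of the URL, anchored with $
at_end = [u for u in urls if re.search(r"example\.com/.*jpg$", u)]

print(anywhere)  # all three URLs contain "jpg" somewhere
print(at_end)    # only /a/photo.jpg actually ends with "jpg"
```

The $ anchor is worth knowing about: it excludes things like jpg-guide.html and image URLs with query strings appended.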

tangor

5:35 pm on Dec 21, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Try using an anchor ^jpg and forget *

I'm sure lucy24 will have a better answer.

born2run

12:50 pm on Dec 23, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Thanks guys I am grateful.

aakk9999

6:52 pm on Dec 23, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Try using an anchor ^jpg and forget *

This will not work - the result is no URLs at all (I have just tried it in the Screaming Frog search box, to filter the list of URLs that were the result of the crawl).

Entering example.com/.*jpg in the Screaming Frog search box above the list of URLs filters the list the way born2run wants. Since this is just filtering of already-crawled results, the inefficient .* probably does not matter much.
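For what it's worth, the empty result is exactly what regular-expression semantics predict: ^ anchors the match to the start of the string, and no full URL starts with "jpg". A quick Python check, with a made-up URL:

```python
import re

urls = ["http://example.com/photo.jpg"]

# ^jpg only matches strings that BEGIN with "jpg" - full URLs begin with "http"
starts_with_jpg = [u for u in urls if re.search(r"^jpg", u)]
print(starts_with_jpg)  # [] - matches nothing
```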