Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
Empty Dance Card - urls that never show on any SERP
tedster




msg:3162882
 8:01 pm on Nov 20, 2006 (gmt 0)

It seems to me that Google is doing more and more to measure domain quality and not just URL quality. So last week at PubCon I was playing with an idea, and I wonder if anyone has any response to it.

If you have a Webmaster Tools account, you can see that Google is keeping track of what amounts to "impressions" -- what search terms are returning your domain's URLs on the SERP, whether they are clicked on or not.

So what about a URL that never gets tapped for any SERP at all, over some period of time - its dance card is empty. Could having too many of these in a domain be a negative algorithm factor?
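The "empty dance card" ratio is easy to state concretely. A minimal sketch, assuming a per-URL impression count like the one GWT reports; the URLs and counts below are invented for illustration:

```python
# Hypothetical sketch of tedster's idea: what fraction of a domain's
# indexed URLs never appeared on any SERP over some window?

def empty_dance_card_ratio(impressions):
    """impressions: dict mapping URL -> SERP appearances in the window."""
    if not impressions:
        return 0.0
    unshown = sum(1 for count in impressions.values() if count == 0)
    return unshown / len(impressions)

site = {
    "/": 1200,
    "/widgets/green": 340,
    "/widgets/fuzzy": 0,  # never tapped for any SERP
    "/about": 0,
    "/contact": 0,
}
ratio = empty_dance_card_ratio(site)  # 3 of 5 URLs -> 0.6
```

Whether a high ratio counts against the domain is exactly the open question.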

[edited by: tedster at 8:33 pm (utc) on Nov. 20, 2006]

 

idolw




msg:3162914
 8:24 pm on Nov 20, 2006 (gmt 0)

heh, this is what i was thinking about for some time now.
all of us selling something have sites with lots of pages of cool content, but we mostly try to put our landing selling pages at the top of the SERPs.
so our sites have lots of links to the few pages that sell and almost no links to the content we put around those landing selling pages. pretty artificial, but it works.
I guess more links to other pages can let you gain more trust. however, this is the tough part, as we always prefer to get links to what we sell and would prefer not to put too much link building effort into anything around it.
I don't believe the situation described above is very negative, but some links to non-selling but informative pages may not only give more trust but also be helpful for branding purposes.
in fact, this is what i am testing now on a new site, to see if it can help rank quicker.

tedster




msg:3162930
 8:38 pm on Nov 20, 2006 (gmt 0)

I just had a light go off when I noted the ease with which Webmaster Tools can give you this data -- data that you cannot get from any other avenue except straight from Google. I can't help but think that they are accumulating this information for some purpose or other. Call me a cynic, but I don't think they built that functionality just for webmasters' convenience alone.

Also I've got to say thanks to the GWT team - it is nice to get the information.

idolw




msg:3162953
 8:58 pm on Nov 20, 2006 (gmt 0)

once they get 80% of webmasters to use their analytics tools, they will start using user behaviour on a huge scale. thanks to that data, the only page most of us will have in the google index will be the landing page ;)

tedster




msg:3162956
 9:01 pm on Nov 20, 2006 (gmt 0)

Ah, but they have the data anyway, whether you use GWT or not, right?

BigDave




msg:3163023
 10:03 pm on Nov 20, 2006 (gmt 0)

My response would be a qualified "yes, but..."

To me, the more interesting question would be, "If they are using this information, where are they using it?"

They could certainly use that information on specific popular searches. They could even use it for less popular searches within the same theme. Would it make sense for them to use it in the very long tail, you know, those searches that are the equivalent of supplemental pages?

And now that you have brought up the possibility, get ready for the posts of those concerned that this one factor is suddenly *the* important factor when it comes to their ranking. They are going to start going to libraries and searching and clicking on their results to try and boost their positions.

goubarev




msg:3163054
 10:27 pm on Nov 20, 2006 (gmt 0)

Oh, of course, the first time I saw it - I had no doubt this is where it's going... and I can't wait until it gets there!

Yeh, Google is smart, but if one thinks about it... what would be the next step? or how would one take advantage of this situation? -- The next step is to "get as many clicks on your listing as possible"! Sounds very familiar, but nobody is doing this!

Example: Ask any SEO if it's a good idea to put "Click here for ..." in the title of the page?!

I am sure in a couple of years the "rules" of getting to the top of the free SERPs will be almost the same as getting to the top of AdWords...

idolw




msg:3163058
 10:32 pm on Nov 20, 2006 (gmt 0)

Ah, but they have the data anyway, whether you use GWT or not, right?

not really, tedster. they only see those hitting the back button.
OK, and those clicking the ad on MFA sites. need to switch to YPN for R&D department financing ;)

idolw




msg:3163063
 10:37 pm on Nov 20, 2006 (gmt 0)

I am sure in a couple of years the "rules" of getting to the top of Free SERPs would be almost the same as getting to the top Adwords...

Free SERPs? what are those? i thought we were spending millions a month to get the free #1 spot.

goubarev




msg:3163126
 11:50 pm on Nov 20, 2006 (gmt 0)

heheh, idolw.
Yep, you're right...
not the "free SERPs"...
but the SERPs where one doesn't pay per click...

tedster




msg:3163199
 12:58 am on Nov 21, 2006 (gmt 0)

not really, tedster. they only see those hitting the back button

Guess I wasn't clear enough, sorry. What I'm talking about has nothing to do with clicks or measuring traffic in any way. It's just whether a certain URL shows up at some position or other in a real search result - in other words, "organic impressions", no matter if that result gets clicked on or not.

That kind of data definitely does not require a GWT account for Google to measure it. It only requires a GWT account for you to get that kind of feedback.

If you have a lot of URLs in the index that are never tapped for any search at all -- and say that the number of such URLs is far beyond the mean for all websites -- then is that poor showing a negative reflection on your domain as a whole? I've never heard anyone anywhere talk about this factor. But it's clear as a bell that Google records it, or else they could not give a GWT report on it. Whether (and how) they use that data is another story.
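The "far beyond the mean" test could be sketched as a simple outlier check. The population mean, standard deviation, and two-sigma cutoff below are all invented numbers, since nothing is known about how (or whether) Google does this:

```python
# Hypothetical outlier test on a domain's zero-impression ratio.

def beyond_the_mean(ratio, mean=0.25, stddev=0.10, sigmas=2.0):
    """True if a domain's share of never-shown URLs is an outlier."""
    return ratio > mean + sigmas * stddev

suspicious = beyond_the_mean(0.60)  # 0.60 > 0.45 -> True
typical = beyond_the_mean(0.30)     # False
```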

idolw




msg:3163637
 12:48 pm on Nov 21, 2006 (gmt 0)

they always say they rank pages, not sites. but we know the truth.
i say: go get some links to other pages of the site and see if it makes any difference to your site's shape.
At the moment, I believe it can be helpful, especially for more general terms in your niche, the so-called "trophy keywords".

tedster




msg:3163766
 3:06 pm on Nov 21, 2006 (gmt 0)

PageRank is only a URL-relative factor, but several Googlers at PubCon, including Matt Cutts, mentioned something about using "an increasing number of site-wide measures" or words to that effect.

annej




msg:3163786
 3:26 pm on Nov 21, 2006 (gmt 0)

"an increasing number of site-wide measures"

Now that has me curious. Any hints as to what these might be?

tedster




msg:3163799
 3:37 pm on Nov 21, 2006 (gmt 0)

Given the context of that comment, one factor relates to history of the domain and its marketing/linking profile over time.

Also, in one session, Matt made it clear that ownership of other domains that are involved in something highly dodgy might raise at least a question about an apparently clean domain owned by the same person. In fact, he read out some other domain names to one site owner and basically asked what they were doing with those domains -- it was a kind of warning that many in the room took note of.

theBear




msg:3163835
 4:00 pm on Nov 21, 2006 (gmt 0)

The so called empty dance card in and of itself is meaningless.

If you were to place a page about a topic on the web and provide links to it, it may take some time for a search request to come along for which the S/E would even consider the page worthy of being included in a SERP.

If, however, the page should appear in a commonly searched SERP, ranked high, and then doesn't get clicked, you'd have something to think about if you were the S/E.

Likewise, if people still click on the page's link even when it appears in position 734, the S/E might wish to do a bit of thinking on the matter.
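Both anomalies share one shape: click-through rate judged relative to position, not raw clicks. A hypothetical sketch, with an entirely fabricated expected-CTR curve (no real Google figures here):

```python
# Illustration: the interesting signal is CTR relative to position.

EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10}  # made-up curve, tail = 1%

def ctr_anomaly(position, impressions, clicks):
    """Negative: underperforming its slot. Positive: outperforming it."""
    expected = EXPECTED_CTR.get(position, 0.01)
    actual = clicks / impressions if impressions else 0.0
    return actual - expected

top_but_ignored = ctr_anomaly(1, 1000, 10)     # high rank, no clicks
deep_but_clicked = ctr_anomaly(734, 1000, 50)  # position 734, still clicked
```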

Now I'm going back to the sidelines to watch folks go insane participating in the datacenter watch threads.

Halfdeck




msg:3163848
 4:07 pm on Nov 21, 2006 (gmt 0)

Matt made it clear that ownership of other domains that are involved in something highly dodgy might raise at least a question about an apparently clean domain owned by the same person.

This implies to me Google is definitely using domain registration data, since IP isn't enough to determine domain ownership, though you can go a long way I suppose by analyzing IPs and linking patterns.

Genie




msg:3163854
 4:15 pm on Nov 21, 2006 (gmt 0)

What a coincidence that you bring this up, tedster. I was just thinking the other day that SERP views plus click data would be really helpful to Google in exactly the way theBear suggests. I certainly hope that they act on it.

Right now Google is ranking my site high for elements of the url that have nothing to do with the content of the site. Needless to say no-one is clicking on my site in such search results. On the other hand people are clicking on my pages for certain relevant keywords, even though they rank way down for them.

But imagine how complex it must be to analyse all this data for billions of pages and searches, and then adjust the algorithm to improve SERPs.

[edited by: Genie at 4:17 pm (utc) on Nov. 21, 2006]

theBear




msg:3164139
 8:03 pm on Nov 21, 2006 (gmt 0)

"But imagine how complex it must be to analyse all this data for billions of pages and searches, and then adjust the algorithm to improve SERPs."

Yep, it ain't a small undertaking.

jatar_k




msg:3164151
 8:14 pm on Nov 21, 2006 (gmt 0)

You could make the case, or at least understand the assumption, that having too many pages with an 'empty dance card' could weigh down the assumed value of the rest of your pages.

If your SERP-less pages reached a certain threshold as a percentage of total pages, then you could also see how it would show some pages being over-optimized and the site as a whole being less useful for the user.

Analyze a little click data from GWT and all of a sudden that site gets a penalty.

then take into account that G knows your other domains, that they happen to deal with many unrelated topics, and that those sites may, or may not, have a similar 'empty dance card' page percentage (a similar footprint), and then a whole network of sites gets weighted.

this is all obviously gross extrapolation but there is a strange logic to the whole thing

I see what you're getting at, Ted, interesting way of looking at it.

Gee Ted, maybe it's all about identifying a site's footprint. ;)
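The network-footprint extrapolation could be sketched as a similarity check across a set of commonly owned domains. The tolerance and the ratios below are made up:

```python
# Gross extrapolation in code form: several domains under one owner
# sharing a near-identical "empty dance card" percentage look like
# one footprint.

def similar_footprint(ratios, tolerance=0.05):
    """ratios: unserped-page fraction per owned domain."""
    return max(ratios) - min(ratios) <= tolerance

network = [0.72, 0.74, 0.70]          # three domains, similar profiles
flagged = similar_footprint(network)  # True: one footprint
```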

BigDave




msg:3164165
 8:33 pm on Nov 21, 2006 (gmt 0)

"an increasing number of site-wide measures" or words to that effect.

I never understood the argument that "Google ranks pages, not sites" based solely on the fact that PR is based on pages. There are so many quality indicators that only make sense on a site wide basis.

theBear




msg:3164205
 9:06 pm on Nov 21, 2006 (gmt 0)

I would expect a site to have a fairly large serpless vs serped footprint if they are new or even if old and added a lot of new "stuff" (highly technical word for content)....

I still have problems with that aspect of what appears to go on from time to time.

annej




msg:3164217
 9:21 pm on Nov 21, 2006 (gmt 0)

But imagine how complex it must be to analyse all this data

I think there is no doubt that Google has ways to look at a lot more than single pages. The question is whether they are doing it on a widespread basis or just when something alerts them to possible problems with individual pages or sites.

jatar_k




msg:3164218
 9:21 pm on Nov 21, 2006 (gmt 0)

I would expect a site to have a fairly large serpless vs serped footprint if they are new or even if old and added a lot of new "stuff"

exactly, now identify 'an increasing number of site-wide measures' and think of

page age (maybe even number of significant changes)
domain age (to that owner)
site history
site wide ranking and clickthroughs
I am sure others could be added

you can see how a profile emerges and the value (to G) of adding more site/domain signals

if a disproportionate number of pages on your site are unserped and unvisited then G could use other factors to decide whether that means your site is less or more relevant
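The emerging profile could be pictured as a weighted blend of site-wide signals. Every signal name and weight below is hypothetical:

```python
# A site/domain profile as a weighted blend of site-wide signals.

WEIGHTS = {
    "domain_age": 0.3,       # age of domain under this owner
    "page_age": 0.2,         # page age and change history
    "click_history": 0.3,    # site-wide ranking and clickthroughs
    "serped_fraction": 0.2,  # share of pages that ever reach a SERP
}

def site_profile_score(signals):
    """signals: dict of signal name -> value normalized to 0..1."""
    return sum(w * signals.get(name, 0.0) for name, w in WEIGHTS.items())

established = site_profile_score({"domain_age": 0.9, "page_age": 0.8,
                                  "click_history": 0.7, "serped_fraction": 0.9})
brand_new = site_profile_score({"domain_age": 0.1, "serped_fraction": 0.2})
```

Missing signals score zero, so a new site with many unserped pages sits low on the profile even before any penalty enters the picture.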

theBear




msg:3164327
 11:24 pm on Nov 21, 2006 (gmt 0)

"if a disproportionate number of pages on your site are unserped and unvisited then G could use other factors to decide whether that means your site is less or more relevant"

Ah but you see being relevant doesn't require that you be an old, well linked, or even technically clean site.

That is but one aspect of the never ending battle that is currently being fought.

However those automagic aspects of shelving pages in file 13 or putting them on page 1 slot 1 can't automagically take such information into account.

Even detailed click data could prove subject to the same manipulation as other aspects of world wide wobbley or is it still wait.

jatar_k




msg:3164336
 11:34 pm on Nov 21, 2006 (gmt 0)

>> Ah but you see being relevant doesn't require that you be an old, well linked, or even technically clean site.

very true, but

if you are old and well linked then the majority of the time you would be seen as relevant, G's just playing the odds and using site wide factors to possibly increase their odds of identifying relevant sites/pages

theBear




msg:3164345
 11:47 pm on Nov 21, 2006 (gmt 0)

Been there, done that, got the t-shirt, sweat shirt, hat, coffee mug, umbrella, etc... and boot imprint on the hind quarters.

Like I say, automagic is automagic, and nothing about automagic implies correctness or anything else.

Oh don't get me wrong jatar_k, I understand proxy indicators, and a whole lot more.

However I came from a world where the very last $0.01 was accounted for, all of the time, every time.

BigDave




msg:3164354
 12:05 am on Nov 22, 2006 (gmt 0)

Giving this a little thought, I realized a similar area where this could very well explain what a lot of us have been seeing, though there are certainly a lot of other explanations as well.

Many of us have noticed how the longer our sites rank well for long tail searches in a theme, the better our ranking on the more general searches.

So what if, rather than using the data as tedster postulates, they are using your historical ranking in more specific searches to adjust your ranking in less specific searches.

Spending some months ranking for "Fuzzy green widgets" will then help you for searches on "fuzzy widgets" and "green widgets". Once those crawl to the top and stay there for a few months, "widgets" will start working its way up in the SERPs.

This does happen; the question is whether this is the reason it happens, and whether it affects other things as well.
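The conjecture, reduced to toy mechanics: time spent ranking for specific phrases feeds a boost to a broader phrase they contain. The phrases, the containment test, and the 12-month normalizer are all invented:

```python
# Toy sketch: long-tail ranking history boosting a head term.

def head_term_boost(histories, head_term):
    """histories: dict of phrase -> months spent ranking well."""
    related = [months for phrase, months in histories.items()
               if head_term in phrase and phrase != head_term]
    return sum(related) / 12.0  # a year of long-tail history -> 1.0

histories = {"fuzzy green widgets": 6, "green widgets": 4, "fuzzy widgets": 2}
boost = head_term_boost(histories, "widgets")  # (6 + 4 + 2) / 12 = 1.0
```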

theBear




msg:3164391
 12:46 am on Nov 22, 2006 (gmt 0)

BigDave

I think you are seeing that over time Google understands that your content actually covers various aspects of a topic.

I've also noticed that there is frequently a lag between ranking (even slightly) on the most common phrases and beginning to pick up a lot of long tail coverage, followed at last by higher rankings on the more common phrases. But whether the cause is in fact the long tail, or just the amount of time it takes to fully calculate the effects of all the factors in ranking, I can't say.

A lot of folks will say that they checked their rankings and nothing changed except traffic levels, some later on have remarked that eventually traffic levels returned.

Acts like everything is following a periodic wave form.

[edited by: theBear at 12:58 am (utc) on Nov. 22, 2006]

tedster




msg:3164585
 5:35 am on Nov 22, 2006 (gmt 0)

The way I've been thinking about Google lately, despite the hundreds of individual factors involved in rankings, everything they measure seems to fall into three buckets: relevance, quality and trust. The musings that started me posting this thread came from wondering what signals of quality they might use, especially domain-wide signals. I think you can have poor quality and still get high marks for trust.

Quick click-backs for another SERP choice would be a URL specific signal of low-quality, but I'm just pondering what else there might be.
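That quick click-back signal could be measured per URL as the fraction of SERP clicks that bounce straight back for another choice. The 5-second cutoff below is an arbitrary stand-in:

```python
# Quick click-backs ("pogo-sticking") as a per-URL rate.

QUICK_BACK_SECONDS = 5  # arbitrary cutoff for a "quick" return

def quick_clickback_rate(dwell_times):
    """dwell_times: seconds on the page before returning to the SERP,
    or None if the visitor never came back."""
    if not dwell_times:
        return 0.0
    quick = sum(1 for t in dwell_times
                if t is not None and t < QUICK_BACK_SECONDS)
    return quick / len(dwell_times)

rate = quick_clickback_rate([2, 3, 40, None, 120])  # 2 of 5 -> 0.4
```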

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved