Forum Moderators: Robert Charlton & goodroi


Google Updates and SERP Changes - July 2009 - part 2

         

tedster

12:16 am on Jul 14, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



< continued from [webmasterworld.com...] >

It's not the trademarked ghost dataset that went missing, and it wasn't a rebuild like the Halloween update.

No, but the overall technique has a familiar feel to it. More than one dataset may be involved this time - and perhaps many more. Interesting that three weeks ago we were hearing reports of googlebot spidering like crazy, and in recent days, reports of googlebot not even showing up for some sites.

[edited by: tedster at 5:09 am (utc) on July 15, 2009]

cangoou

11:10 am on Jul 19, 2009 (gmt 0)

10+ Year Member



Don't forget Google AdSense - if you have AdSense on your pages, Google knows what's going on too.

@CainIV: Could you name three datacenters showing the three types of SERPs, please? No matter what I check, I only find two of them. Or are they moving/rotating?

tedster

4:02 pm on Jul 19, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The current "bounce rate" debate around the SEO community just goes on and on. From my reading, people are using this term to point to two very different metrics.

1. SEARCH RESULTS: The same user comes from, and bounces back to, the Google SERP where they try a different click.
2. ANALYTICS: A user enters a page on the site from anywhere at all, and leaves without visiting a second page on that site.

#1 is a potential (if cloudy) metric for Google's algo, but #2 is not. Using the Analytics bounce rate to improve a single site's stickiness is a very useful webmastering activity. In that case, each metric is coming from one site -- and whether it trends up or down has meaning.

But using Analytics to compare bounce rates across two different sites is like asking "Which is better, a camera or a bicycle?"
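To make the distinction concrete, here's a minimal sketch in Python. The data model is entirely hypothetical - nothing here reflects how Google or GA actually compute anything:

```python
def analytics_bounce_rate(sessions):
    """Metric 2: share of sessions that viewed exactly one page on the site.

    `sessions` is a list of pageview counts, one per visit (invented model).
    """
    if not sessions:
        return 0.0
    single_page = sum(1 for pages in sessions if pages == 1)
    return single_page / len(sessions)


def serp_bounce_rate(serp_clicks):
    """Metric 1: share of SERP clicks where the user came back to the
    results page and tried a different result - something only the
    search engine itself can observe."""
    if not serp_clicks:
        return 0.0
    bounced = sum(1 for c in serp_clicks if c["returned_and_reclicked"])
    return bounced / len(serp_clicks)


# 4 of 10 visits saw only a single page -> 40% Analytics bounce rate
print(analytics_bounce_rate([1, 1, 3, 2, 1, 5, 2, 1, 4, 2]))  # 0.4

# 2 of 5 SERP clicks bounced back and reclicked
clicks = [{"returned_and_reclicked": r} for r in (True, False, False, True, False)]
print(serp_bounce_rate(clicks))  # 0.4
```

The point of the toy: metric 2 needs only the site's own logs, while metric 1 can only be measured by whoever serves the results page - which is why only #1 is even a candidate ranking signal.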

Hissingsid

7:25 pm on Jul 19, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Came back from a week away and not only have we gone back to #1 on .co.uk for our two main target terms but also have mini site links. I think I may go away for another week and see if the rest come back.

Seriously, I'm seeing the top $ terms being affected differently to the lesser terms. I wonder how much AdWords data is being used in developing the semantic dictionary used in the UK.

Cheers

Sid

santapaws

9:26 pm on Jul 19, 2009 (gmt 0)

10+ Year Member



"Which is better, a camera or a bicycle?"

But you could ask which is better, a camera or a bicycle, when it must meet the following criteria.

ken_b

10:04 pm on Jul 19, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Bounce rate:
1. SEARCH RESULTS: The same user comes from, and bounces back to, the Google SERP where they try a different click.
2. ANALYTICS: A user enters a page on the site from anywhere at all, and leaves without visiting a second page on that site.

#1 is a potential (if cloudy) metric for Google's algo, but #2 is not.

But Google gets a look at our bounce rates for visitors from other sources [using the same or similar search terms] when we use Analytics.

Is that useful to G? Perhaps. Does it affect our G SERPs? May be worth pondering a bit.

tedster

10:32 pm on Jul 19, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



People have been pondering that ever since Google bought Urchin and turned it into GA. And all along the way since then, Google reps have specifically denied the connection. As the recent debacle with changing search referrers showed, the two Google teams (organic search and analytics) don't even talk to each other!

Another factor - the algo is trying to rank billions of pages in comparison to each other. How could Google use a metric that is only available for a percentage of those pages and not the others?

Even if I learned that Google is using bounce rate as a ranking factor [they're not right now], what about that information is actionable? What could I do with that tidbit to improve my rankings and traffic? I am not going to try spoofing a lower bounce rate. And I already monitor bounce rates and work to improve them anyway.

Plus, I can't see bounce rates for my competitors - none of them leave their server logs open for me :(

CainIV

1:05 am on Jul 20, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"search phrase semantics, seasonality and click through rate"

The original idea I was trying to get across was not specifically bounce rate, which has been (and will forever be) debated until Google actually comes out and reports new information about it.

The idea was more about 'shaping' results based on a combination of semantics and data Google has on items such as click-through rate in the SERPs (not necessarily bounce rates at a home page, although this could be used soon enough).

tedster

2:13 am on Jul 20, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well, there you have it. Most certainly that is the kind of measurement that Google uses to determine how well their efforts stack up. Not whether webmasters complain about the SERPs going to hell - we've been complaining for a long time:

Results seem to have stabilised this morning so have been running some checks.

The spam level is *high* [have they switched to Northern Light's algo?] - doorways, refreshes, hidden text, cloaking - it's all there.

Try a few searches and wander through the cache listings, it's ugly.

July 2000 update Thread [webmasterworld.com]

CainIV

9:45 pm on Jul 20, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



On a side note, seeing some heavy duty genres getting thrown upside down and all of their pocket change dumped out. Very interesting.

tedster

9:51 pm on Jul 20, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm also seeing rankings returned for some pages that had literally been missing for months - and the reasons they went missing were mostly inscrutable to me.

I'm leaning toward this explanation: some heavy-handed filters that were doing a lot of collateral damage were changed. They were not just dialed back, but they were replaced by a different "calculation". Gathering the data for that new filter would have been part of the motive for the extra spidering we saw, just before this update started.

CainIV

10:54 pm on Jul 20, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Nice observation tedster.

I am noticing on one index that a significantly large number of index pages for some prominent websites have gone missing. I know this is a classic symptom of an update, and was definitely something we were seeing last year during the last 3-5 days of that update.

Tack into that another round of -50 from what I am seeing. Anyone else seeing these changes as of the last 12-18 hours?

tedster

1:23 am on Jul 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



...large number of index pages for some prominent websites have gone missing

Another anomaly that I'm seeing - an unusually high number of bugs, where the same URL is listed twice on a site: operator query. I wouldn't be surprised if the recipe for this update needs some further improvement.

Love2Blog

1:32 am on Jul 21, 2009 (gmt 0)

10+ Year Member



I am seeing my site listed twice on the same query - one listing is the index page and the next an inner page, both in a row. I am also suddenly seeing a ton of .uk sites that I usually do not see in my SERPs, and I am in the US.

jkdt0077

9:10 am on Jul 21, 2009 (gmt 0)

10+ Year Member



The keywords I'm checking don't seem to have changed for two weeks now, which suggests the update has finished? Although it seems you lot are still seeing changes quite frequently.

Either way, if these results stay the way they are currently then my default homepage will most definitely be changing to Bing.

For our main keyword we are now ranking #23, from #3 a few weeks ago. Above our site now, we find:
2 x US sites
2 x Companies that have been in Administration for several months
1 x Wikipedia article

All of which were nowhere to be seen before this update. The rest of the sites above us are either junk or giants of the industry. The idea of traffic based rankings seems to be fairly spot on for this particular keyword.

Shaddows

9:26 am on Jul 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



All of which were nowhere to be seen before this update. The rest of the sites above us are either junk or giants of the industry. The idea of traffic based rankings seems to be fairly spot on for this particular keyword.

Eh? So you get less traffic than a bunch of junk and bankrupt companies?

The main problem I have with traffic data (other than data collection and/or manipulation) is that it would affect all terms for a page (or site or domain, depending on your belief). I do not see that. It seems to me that search results are more semantically dependent, and much less site-dependent, than before.

jkdt0077

9:48 am on Jul 21, 2009 (gmt 0)

10+ Year Member



Eh? So you get less traffic than a bunch of junk and bankrupt companies?

No, I didn't say rankings were based only on traffic; however, the sites that are in administration were considered giants and had massive amounts of traffic.

It just seems like those sites up there at the top right now, which don't appear to work on their on-page SEO very much, are being affected by something else - which could be the traffic they receive from offline marketing (TV, radio, etc.).

...And there are some junk sites mixed in for good measure.

It's just a mess really. I don't claim to know how they're producing these rankings I'm just suggesting things based on what I can see for the keywords I measure.

lethal0r

10:31 am on Jul 21, 2009 (gmt 0)

10+ Year Member



Not over yet, no way. One of my sites still isn't back, and some website that is just a broken template shows #1 for a search on my domain name with no TLD.

drall

12:57 pm on Jul 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Has anyone noticed lower spidering from googlebot the last week or so? I am seeing this on many sites.

iamhrh

2:34 pm on Jul 21, 2009 (gmt 0)

10+ Year Member



Hi everyone. We're still -50ish from the May 14th round of penalties/changes/whatever this is. We haven't seen the results in our niche change for what seems like several weeks now. We used to be number one for several keywords, and we are no better than #42 for *any* of them now. This is so frustrating!
Has anyone from the May 14th round recovered? Does anyone still believe this has to do with the devaluation of back links?
Also, I'm just curious about this: has anyone talked with others outside of this forum about all of this? Considering how badly some of us have been hit, you would think there would be more conversation about this all over the web - not just here. Maybe I'm missing it?
Drall, as far as spidering goes, we have seen MUCH lower spidering over the past month and a half or so.

drall

2:44 pm on Jul 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Same here, iamhrh - across many sites with completely different topics. It also looks like TBPR is updating again; I am seeing major drops on many big properties out there. Lots of sites I am watching are dropping from 6/7 to 3/4.

ddogg

4:27 pm on Jul 21, 2009 (gmt 0)

10+ Year Member



Lots of fluctuations still in my niche, this update is definitely not over yet.

whitenight

4:41 pm on Jul 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks tedster, for handling that octave of the spiraling dna time/space helix....

You know how much i dislike it. :)

this update is definitely not over yet.

This is the strangest update I've seen.

The update is definitely "over" in some sectors.
But I'm also seeing the weird "unfinished" fluctuations in many others.

Very, very odd.

Another buggy rollout?

Different datasets using different algos?!

Has anyone noticed lower spidering from googlebot the last week or so? I am seeing this on many sites

Yes, my thought is that when gbot fully respiders, we'll get the rest of the sectors finished with the update - but who knows.

some heavy-handed filters that were doing a lot of collateral damage were changed

It would be wise for G to correct this so that's a good thing.

It seems to me that search results are more semantically dependent, and much less site-dependent, than before.

Yes, shaddows, we're seeing the same thing here.

petehall

5:04 pm on Jul 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The current "bounce rate" debate around the SEO community just goes on and on. From my reading, people are using this term to point to two very different metrics.
1. SEARCH RESULTS: The same user comes from, and bounces back to, the Google SERP where they try a different click.
2. ANALYTICS: A user enters a page on the site from anywhere at all, and leaves without visiting a second page on that site.

#1 is a potential (if cloudy) metric for Google's algo, but #2 is not. Using the Analytics bounce rate to improve a single site's stickiness is a very useful webmastering activity. In that case, each metric is coming from one site -- and whether it trends up or down has meaning.

But using Analytics to compare bounce rates across two different sites is like asking "Which is better, a camera or a bicycle?"

I think a bounce rate is quite an easy idea to implement.

If the percentage of users who click back for a particular site on a search phrase increases, the likelihood that the site holds useful/usable content for the user is reduced.

Either way, the ranking on that phrase (and that phrase only) is affected negatively if the bounce percentage increases.
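As a toy illustration of that mechanism - purely speculative, with an invented baseline and weight, not anything Google has confirmed - a per-phrase demotion might look like:

```python
def adjusted_score(base_score, bounce_pct, baseline_pct=0.25, weight=0.5):
    """Demote the ranking score for ONE phrase in proportion to how far
    the observed bounce-back rate exceeds a baseline. Both constants
    are made up for illustration."""
    excess = max(0.0, bounce_pct - baseline_pct)
    return base_score * (1.0 - weight * excess)


print(adjusted_score(100.0, 0.25))  # 100.0 - at the baseline, no change
print(adjusted_score(100.0, 0.75))  # 75.0  - 50-point excess, half weight
```

Because the bounce rate is measured per search phrase, only the score for that phrase moves; the site's other rankings are untouched, matching the "that phrase only" idea above.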

CainIV

5:11 pm on Jul 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"The update is definitely "over" in some sectors."

I was all with you on this one until late last night, when in a particular keyword set I watch, the number one website dropped from position 1 (they had been there for years) to -50 for everything, including their company name, and two new competitors moved from 10+ to #5 and #6.

Very strange, it is almost as though they are applying this update across keyword 'blocks'? Some sets still remain absolutely unchanged.

whitenight

5:28 pm on Jul 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Very strange, it is almost as though they are applying this update across keyword 'blocks'? Some sets still remain absolutely unchanged.

This would correlate with the observations of Google trying to find the best pages vs. site-dependent authority.

Perhaps they ARE trying to implement different "algos" for different "blocks" of keywords, which would obviously need unique criteria.

It would also explain the need for different "penalties/filters" (of course, I argue they still haven't gotten this right) for those unique keyword/industry "blocks".

Edited to add: if this is the case, and they successfully implement it, it would eliminate many of the yo-yo problems people are having.

tedster

5:45 pm on Jul 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The semantic component to this update might be two-fold:

1. Different analysis of the backlinks, weighted more by the topic of the linking page (or page segment) rather than going so heavily with anchor text.

2. A new taxonomy for query types, again based on a recalculation of semantic relationships.
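A rough sketch of what point 1 might mean in practice - the 80/20 blend and the sparse topic-vector model are my assumptions, not anything Google has published:

```python
import math


def cosine(a, b):
    """Cosine similarity between two sparse topic vectors (topic -> weight)."""
    dot = sum(w * b.get(k, 0.0) for k, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def link_weight(linking_page_topics, target_topics, anchor_match,
                topic_w=0.8, anchor_w=0.2):
    """Blend the linking page's topical relevance with its anchor-text
    match, letting topic dominate (the split is a guess)."""
    return topic_w * cosine(linking_page_topics, target_topics) + anchor_w * anchor_match


# Under this weighting, an on-topic page with weak anchor text outweighs
# an off-topic page with exact-match anchor text:
on_topic = link_weight({"widgets": 1.0}, {"widgets": 1.0}, anchor_match=0.2)
off_topic = link_weight({"recipes": 1.0}, {"widgets": 1.0}, anchor_match=1.0)
print(on_topic > off_topic)  # True
```

That flip - topical context of the linking page trumping the anchor text - is exactly the kind of change that would reshuffle sites whose rankings leaned on exact-match anchors from unrelated pages.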

Love2Blog

6:26 pm on Jul 21, 2009 (gmt 0)

10+ Year Member



My sites still wavering from pages 5-6 on a daily basis, nothing has returned since early June -50, seeing different results on page 1 on a daily basis for several KW sets I watch. Constant flux.

The only explanation I can find for my sites when comparing to competitors is a devaluing of backlinks, especially for those with the targeted anchor text.

tedster

6:47 pm on Jul 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Love2Blog, if you fell to page 5 from page 1, and it happened in early June - that sounds like a penalty. Yes, it's most likely about backlinks but it started before this update began, and that's why I lean towards a penalty and not just a new algo factor.

--------

I keep thinking back to that rumor I reported in part one of the July Updates thread [webmasterworld.com]. A lot of the ranking changes I am seeing line up with the idea of "less weight for less relevant links" -- or maybe "more weight for more relevant links".

Wish I could track down the actual source of the rumor ;)

CainIV

8:21 pm on Jul 21, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"1. Different analysis of the backlinks, weighted more by the topic of the linking page (or page segment) rather than going so heavily with anchor text.

2. A new taxonomy for query types, again based on a recalculation of semantic relationships."

Definitely there is a backlink component here judging by the -50 wave I am seeing.

Perhaps when the ratio of topically related inbound links to low-weight, off-topic links falls too low for a given website, the site loses trust, which correlates to positioning.

This might explain the -50: a very small ratio of topically related links to unrelated links could trigger that penalty. It would also explain why the newest wave of -50s seemed to start occurring more frequently in late May.

I would absolutely agree that a new taxonomy is likely being built. It works in direct correlation with (in fact, it governs) inbound link assessment, IMHO. Once the taxonomy is developed, the rules are rewritten down the line.
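Speculation aside, the ratio trigger described above is easy to sketch - the 10% threshold and the flat 50-position demotion are illustrative guesses only:

```python
def apply_minus_50(position, related_links, total_links, min_ratio=0.10):
    """Return the SERP position, demoted by a flat 50 spots when
    topically related links make up too small a share of the
    backlink profile (threshold is a guess)."""
    if total_links == 0:
        return position
    return position + 50 if related_links / total_links < min_ratio else position


print(apply_minus_50(1, related_links=5, total_links=100))   # 51 - penalized
print(apply_minus_50(1, related_links=30, total_links=100))  # 1  - unaffected
```

A hard threshold like this would also match the all-or-nothing pattern people report: sites either sit at their normal position or land a uniform ~50 spots lower, with nothing in between.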

Robert Charlton

9:53 pm on Jul 21, 2009 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



1. Different analysis of the backlinks, weighted more by the topic of the linking page (or page segment) rather than going so heavily with anchor text.

This is in line with what I'm seeing, based on several competitive searches I watch in different market areas.

It appears that less weight is now being given for links with relevant anchor text but which come from less relevant pages. I'm also seeing that these rankings are currently fairly volatile. I've seen these pages drop from mid first page down to as far as the third page, then move up to the second page and then up to the first page. Now they appear to be fluctuating between page one and page two.

Pages on the same sites with more solid backlinks have remained rock solid.

This 209 message thread spans 7 pages.