It's not the trademarked ghost dataset that went missing, and it wasn't a rebuild like the Halloween update.
No, but the overall technique has a familiar feel to it. More than one dataset may be involved this time - and perhaps many more. Interesting that three weeks ago we were hearing reports of googlebot spidering like crazy, and in recent days, reports of googlebot not even showing up for some sites.
[edited by: tedster at 5:09 am (utc) on July 15, 2009]
1. SEARCH RESULTS: The same user comes from, and bounces back to, the Google SERP where they try a different click.
2. ANALYTICS: A user enters a page on the site from anywhere at all, and leaves without visiting a second page on that site.
#1 is a potential (if cloudy) metric for Google's algo, but #2 is not. Using the Analytics bounce rate to improve a single site's stickiness is a very useful webmastering activity. In that case, each metric is coming from one site -- and whether it trends up or down has meaning.
But using Analytics to compare bounce rates across two different sites is like asking "Which is better, a camera or a bicycle?"
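Since the two metrics keep getting conflated, here's a minimal Python sketch of the distinction. The record formats are invented for illustration - neither Google nor Analytics exposes data in this shape:

```python
# Hypothetical records, for illustration only.

def serp_bounce_rate(serp_clicks):
    """#1: pogo-sticking. The user clicks a result, comes straight back
    to the same Google SERP, and tries a different click."""
    bounces = sum(1 for c in serp_clicks
                  if c["returned_to_serp"] and c["clicked_another_result"])
    return bounces / len(serp_clicks)

def analytics_bounce_rate(sessions):
    """#2: single-page sessions. The visitor arrives from anywhere at all
    (search, a link, a bookmark) and leaves without a second pageview."""
    bounces = sum(1 for s in sessions if s["pageviews"] == 1)
    return bounces / len(sessions)

clicks = [{"returned_to_serp": True, "clicked_another_result": True},
          {"returned_to_serp": False, "clicked_another_result": False}]
visits = [{"pageviews": 1}, {"pageviews": 4}]
print(serp_bounce_rate(clicks))       # 0.5 -- tied to a specific query
print(analytics_bounce_rate(visits))  # 0.5 -- mixes every traffic source
```

Only #1 is tied to a search phrase at all; #2 mixes in every traffic source, which is why comparing it across sites says so little.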
Seriously, I'm seeing the top $ terms being affected differently to the lesser terms. I wonder how much AdWords data is being used in developing the semantic dictionary used in the UK.
Cheers
Sid
1. SEARCH RESULTS: The same user comes from, and bounces back to, the Google SERP where they try a different click.
2. ANALYTICS: A user enters a page on the site from anywhere at all, and leaves without visiting a second page on that site.
#1 is a potential (if cloudy) metric for Google's algo, but #2 is not.
But Google gets a look at our bounce rates for visitors from other sources [using the same or similar search terms] when we use Analytics.
Is that useful to G? Perhaps. Does it affect our G SERPs? May be worth pondering a bit.
Another factor - the algo is trying to rank billions of pages in comparison to each other. How could Google use a metric that is only available for a percentage of those pages and not the others?
Even if I learned that Google is using bounce rate as a ranking factor [they're not right now], what about that information is actionable? What could I do with that tidbit to improve my rankings and traffic? I am not going to try spoofing a lower bounce rate. And I already monitor bounce rates and work to improve them anyway.
Plus, I can't see bounce rates for my competitors - none of them leave their server logs open for me :(
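One way to picture the coverage point in code - a hypothetical sketch, not anything Google has described. If a signal exists for only some pages, the scorer has to substitute a neutral prior for the rest, and identical pages end up scored differently just because one happens to be measured:

```python
NEUTRAL_PRIOR = 0.5  # assumed average bounce rate for unmeasured pages

def score(page, base_score, bounce_rates):
    # bounce_rates covers only the fraction of pages that run Analytics
    bounce = bounce_rates.get(page, NEUTRAL_PRIOR)
    return base_score * (1.0 - 0.1 * bounce)  # small, illustrative weighting

print(score("a.example", 1.0, {"a.example": 0.9}))  # measured: 0.91
print(score("b.example", 1.0, {}))                  # unmeasured: 0.95
```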
The original idea I was trying to get across was not specifically bounce rate, which has been (and will forever be) debated until Google actually comes out and reports new information about it.
The idea was more around 'shaping' results based on a combination of semantics and data Google has on items such as click-through rate in the SERPs (not necessarily bounce rates at a home page, although this could be used soon enough).
Results seem to have stabilised this morning, so I have been running some checks. The spam level is *high* [have they switched to Northern Light's algo?]: doorways, refreshes, hidden text, cloaking - it's all there.
Try a few searches and wander through the cache listings, it's ugly.
July 2000 update Thread [webmasterworld.com]
I'm leaning toward this explanation: some heavy-handed filters that were doing a lot of collateral damage were changed. They were not just dialed back, but they were replaced by a different "calculation". Gathering the data for that new filter would have been part of the motive for the extra spidering we saw, just before this update started.
I am noticing on one index that a significant number of index pages for some prominent websites have gone missing. I know this is a classic symptom of an update, and was definitely something we were seeing last year during the last 3-5 days of that update.
Tack onto that another round of -50 penalties, from what I am seeing. Anyone else seeing these changes as of the last 12-18 hours?
Either way, if these results stay the way they are currently then my default homepage will most definitely be changing to Bing.
For our main keyword we are now ranking #23, from #3 a few weeks ago. Above our site now, we find:
2 x US sites
2 x Companies that have been in Administration for several months
1 x Wikipedia article
All of which were nowhere to be seen before this update. The rest of the sites above us are either junk or giants of the industry. The idea of traffic based rankings seems to be fairly spot on for this particular keyword.
All of which were nowhere to be seen before this update. The rest of the sites above us are either junk or giants of the industry. The idea of traffic based rankings seems to be fairly spot on for this particular keyword.
Eh? So you get less traffic than a bunch of junk and bankrupt companies?
The main problem I have with traffic data (other than data collection and/or manipulation) is that it would affect all terms for a page (or site or domain, depending on your belief). I do not see that. It seems to me that search results are more semantically dependent, and much less site-dependent than before.
Eh? So you get less traffic than a bunch of junk and bankrupt companies?
No, I didn't say rankings were based only on traffic; however, the sites that are in administration were considered to be giants and had massive amounts of traffic.
It just seems like those sites that are up there at the top right now, which don't appear to work on their on-page SEO very much, are being affected by something else - which could be the traffic they receive from offline marketing (TV, radio, etc.).
...And there are some junk sites mixed in for good measure.
It's just a mess really. I don't claim to know how they're producing these rankings; I'm just suggesting things based on what I can see for the keywords I measure.
You know how much I dislike it. :)
This update is definitely not over yet.
This is the strangest update I've seen.
The update is definitely "over" in some sectors.
But I'm also seeing the weird "unfinished" fluctuations in many others.
Very, very odd.
Another buggy rollout?
Different datasets using different algos?!
Has anyone noticed lower spidering from googlebot over the last week or so? I am seeing this on many sites.
Yes, my thought is that when gbot fully respiders, we'll get the rest of the sectors finished with the update - but who knows.
some heavy-handed filters that were doing a lot of collateral damage were changed
It seems to me that search results are more semantically dependent, and much less site-dependent than before.
Yes, shaddows, we're seeing the same thing here.
The current "bounce rate" debate around the SEO community just goes on and on. From my reading, people are using this term to point to two very different metrics.
1. SEARCH RESULTS: The same user comes from, and bounces back to, the Google SERP where they try a different click.
2. ANALYTICS: A user enters a page on the site from anywhere at all, and leaves without visiting a second page on that site.
#1 is a potential (if cloudy) metric for Google's algo, but #2 is not. Using the Analytics bounce rate to improve a single site's stickiness is a very useful webmastering activity. In that case, each metric is coming from one site -- and whether it trends up or down has meaning.
But using Analytics to compare bounce rates across two different sites is like asking "Which is better, a camera or a bicycle?"
I think a bounce rate is quite an easy idea to implement.
If the % of users who click back for a particular site on a search phrase increases, the likelihood that the site holds useful/usable content for the user is reduced.
Either way, the ranking on that phrase (and that phrase only) is negatively affected if the bounce % increases.
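A rough sketch of how that per-phrase adjustment could work (the baseline and demotion step here are invented - this is the poster's idea in code, not a known Google mechanism):

```python
# Back-click rates per (query, site) pair -- hypothetical data.
back_click_rate = {
    ("widget reviews", "example.com"): 0.72,
    ("widget reviews", "other.example"): 0.31,
}

BASELINE = 0.50  # assumed typical back-click rate for the phrase

def adjust_rank(query, site, current_rank):
    rate = back_click_rate.get((query, site), BASELINE)
    if rate > BASELINE:
        # Demote in proportion to how far above baseline the rate is.
        # Rankings for the site's other phrases are untouched.
        return current_rank + round(10 * (rate - BASELINE))
    return current_rank

print(adjust_rank("widget reviews", "example.com", 3))    # demoted to 5
print(adjust_rank("widget reviews", "other.example", 8))  # stays at 8
```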
I was all with you on this one, until late last night, when a particular keyword set I watch saw the number one website drop from position 1 (they have been there for years) to -50 for everything, including their company name, and two new competitors moved from positions 10+ to 5 and 6.
Very strange, it is almost as though they are applying this update across keyword 'blocks'? Some sets still remain absolutely unchanged.
Very strange, it is almost as though they are applying this update across keyword 'blocks'? Some sets still remain absolutely unchanged.
This would correlate with the observations of Google trying to find the best pages vs. site-dependent authority.
Perhaps they ARE trying to implement different "algos" for different "blocks" of keywords, which would obviously need unique criteria.
It would also explain the need for different "penalties/filters" (of course, I argue they still haven't gotten this right) for those unique keyword/industry "blocks".
Edited to add: if this is the case, and they successfully implement it, this would eliminate many of the yo-yo problems people are having.
1. Different analysis of the backlinks, weighted more by the topic of the linking page (or page segment) rather than going so heavily with anchor text.
2. A new taxonomy for query types, again based on a recalculation of semantic relationships.
The only explanation I can find for my sites when comparing to competitors is a devaluing of backlinks, especially for those with the targeted anchor text.
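For illustration, here's what hypothesis #1 above might look like as code - weighting a backlink by the topical similarity between the linking page and the target rather than by anchor text. The similarity measure and weights are invented stand-ins:

```python
def topical_similarity(topics_a, topics_b):
    """Jaccard overlap of two pages' topic sets -- a stand-in for
    whatever semantic comparison the engine actually uses."""
    a, b = set(topics_a), set(topics_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def link_weight(linking_page_topics, target_page_topics, anchor_matches):
    topic_w = topical_similarity(linking_page_topics, target_page_topics)
    anchor_w = 0.2 if anchor_matches else 0.0  # anchor text reduced to a minor bonus
    return topic_w + anchor_w

# A targeted-anchor link from an off-topic page now scores lower than a
# generic link from a closely related page:
print(link_weight(["mortgages"], ["widgets"], anchor_matches=True))            # 0.2
print(link_weight(["widgets", "gadgets"], ["widgets"], anchor_matches=False))  # 0.5
```

Under a scheme like this, the "devaluing of backlinks with targeted anchor text" would just be the anchor bonus shrinking relative to topical fit.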
--------
I keep thinking back to that rumor I reported in part one of the July Updates thread [webmasterworld.com]. A lot of the ranking changes I am seeing line up with the idea of "less weight for less relevant links" -- or maybe "more weight for more relevant links".
Wish I could track down the actual source of the rumor ;)
2. A new taxonomy for query types, again based on a recalculation of semantic relationships.
Definitely there is a backlink component here judging by the -50 wave I am seeing.
Perhaps when the ratio of topically related inbound links to off-topic inbound links (those given a low weight score versus the topic of the page) is low, the website loses trust, which correlates with positioning.
This might work to explain the -50, which could show that a very small ratio of topically related links versus unrelated links triggers that penalty. This would also explain why the newest wave of -50s seemed to start happening with increasing frequency in late May.
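The ratio theory in sketch form (the threshold and the size of the drop are guesses for illustration, not known values):

```python
RELATED_RATIO_THRESHOLD = 0.15  # hypothetical trip point
PENALTY_POSITIONS = 50          # the "-50" drop

def apply_ratio_penalty(related_links, unrelated_links, current_rank):
    total = related_links + unrelated_links
    if total == 0:
        return current_rank
    if related_links / total < RELATED_RATIO_THRESHOLD:
        return current_rank + PENALTY_POSITIONS
    return current_rank

print(apply_ratio_penalty(related_links=20, unrelated_links=900, current_rank=1))   # 51
print(apply_ratio_penalty(related_links=200, unrelated_links=300, current_rank=4))  # 4
```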
Absolutely, I would agree that a new taxonomy is likely being built. This correlates directly with (in fact, it governs) inbound link assessment, IMHO. Once the taxonomy is developed, the rules are rewritten down the line.
1. Different analysis of the backlinks, weighted more by the topic of the linking page (or page segment) rather than going so heavily with anchor text.
This is in line with what I'm seeing, based on several competitive searches I watch in different market areas.
It appears that less weight is now being given to links with relevant anchor text that come from less relevant pages. I'm also seeing that these rankings are currently fairly volatile. I've seen these pages drop from mid first page down to as far as the third page, then move up to the second page and then up to the first page. Now they appear to be fluctuating between page one and page two.
Pages on the same sites with more solid backlinks have remained rock solid.