
Google.com SERP Changes - July 2008

   
4:22 pm on Jun 30, 2008 (gmt 0)

5+ Year Member



< continued from [webmasterworld.com...] >

Has anyone recovered from the June 4th catastrophe?

[edited by: tedster at 5:43 am (utc) on July 1, 2008]

5:18 am on Jul 13, 2008 (gmt 0)

5+ Year Member



When things get churned up, does that balance change?

Not the SERPs I'm watching - shuffling merely re-orders what's already there. Not much coming in or moving out of top 10.

At first I thought this might be a test, but as time goes on I'm starting to wonder if this isn't a more permanent change. Still not seeing this in wide circulation on SERPs, though.

7:13 am on Jul 13, 2008 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I'm still leaning toward the collection of some kind of statistical information. One guess I've had involves sending different versions of the SERP to different teams of human reviewers. Also, those SERPs where page 1 is cycling between two states could be automated data collection. The question then would be what variation is being measured between the two states, and how your website can avoid being caught in that wave.
8:10 am on Jul 13, 2008 (gmt 0)

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



...it would be good to figure out if Google is testing your url for its performance, or testing two different versions of the algorithm.

It's hard to say whether they're watching particular urls so much as watching large numbers of pages as they react to algo changes, evaluating each set of changes, and then noting the combinations that give greatest user satisfaction.

On some searches I'm watching, e.g., I'm getting the impression that they're turning the dupe filters higher and lower, while at the same time cycling everything except the top one or two pages up and down within various ranges in response to other factors they're evaluating. This is on top of the fabled 200 factors, only some of which might be stable at any one time.

12:34 pm on Jul 13, 2008 (gmt 0)

5+ Year Member



Our site just tanked today at 3 AM. We have two link-list style sites with submitted links and content. Our other site was hit on June 27/06 and then returned on July 15/06. Traffic held until the next year, June 29/07, when it tanked once again and has never returned. Now this newer site, which is similar in style, tanked again today, 7/13/08. Each time, Google traffic went down by about 70%. I really wish I knew what was going on. I understand rankings change, but why are they changing by such large amounts? I guess all I can do is keep my fingers crossed that traffic comes back soon.

[edited by: Northstar at 12:36 pm (utc) on July 13, 2008]

5:31 pm on Jul 13, 2008 (gmt 0)

5+ Year Member



As it pertains to SERP changes and site loss, here is an interesting observation. I couldn't find anything posted about this except general concerns about webmasters losing sites from the SERPs 'where they see it'.

For testing I used 10 computers at 4 different locations. At the home location, where 2 of the 10 computers were used, only one computer cannot see the rankings.

1. Competitor site A ranks first page for all major terms, but ranks nowhere when searched from one of the 2 computers at the home location. All other locations and offices are fine. (Noticed around June and still in effect.)

2. Our site B ranks first page for all major terms, but ranks nowhere when searched from that same computer. (Noticed in July; traffic has been down slightly since June but is holding.) The site depends on 'discretionary' spending, so we're blaming the economy.

3. Client site C ranks first page for all major terms. The client has never (in over a year) been able to see his site ranked from his IP location, but all other testing locations are fine.

4. Other sites we develop can be viewed from all locations, even the one 'home' computer mentioned. At least for now.

What I find odd is the inability to view specific rankings of specific sites at any one computer, time and time again (at least based on the current testing environment). The sites mentioned above all rank well, have strong PR, have Alexa ratings under 500k, and are 5-6 years old.

Interesting: Google Sitelinks are available for all the sites, except that they're missing when viewed from that one computer.

My one computer can't be the sole viewer of the rankings loss; how many others are out there that cannot see what all my other locations see?

I realize this is a very specific observation and we're dealing with thousands of DCs, but it's real.

Has anyone run a test like this? Any insight into what could be wrong, or is this a rogue wave about to pummel these sites? Just a tweaking of their algo? Maybe Google is penalizing me for running too many queries from my computer. (Kidding.)

11:21 pm on Jul 13, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



This may not be at all relevant, but bear with me....

I used Google's Keyword Tool a few days ago and noticed that in Firefox it defaulted to English, United States, but in Mozilla Seamonkey it defaulted to English, United Kingdom.

In Mozilla Seamonkey I had edited the language preferences to be:
[en-gb]
[en]
[en-us]

but in Firefox they had been left at the default (presumably [en-us] only).

So, browsers do return hidden information to Google, stuff that you might not realise is signalling something about your location, stuff that you might not know about.

Whether it has any effect on SERPs, I have no idea, but I suspect that in some cases it might.
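
If you're curious what your own browser is actually sending, here's a minimal sketch of a little local page that echoes the Accept-Language header back at you. Python, standard library only; the values in the comments are just my assumption of typical defaults.

from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Firefox at default settings typically sends something like
        # "en-us,en;q=0.5"; the Seamonkey profile above would send the
        # en-gb / en / en-us list instead (exact q-values vary by build).
        lang = self.headers.get("Accept-Language", "(none sent)")
        body = ("Your browser sent: Accept-Language: %s\n" % lang).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("localhost", 8000), EchoHandler).serve_forever()

Point a couple of different browsers at http://localhost:8000/ and compare what each one reports.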

11:37 pm on Jul 13, 2008 (gmt 0)

5+ Year Member



mrrob: make sure you are NOT logged in to your Google WMT account, or any other Google account, like Google Base/Froogle, etc. They will all provide different SERP results on that computer vs. when you are logged out. Could this be the reason for the discrepancies you are seeing?
11:42 pm on Jul 13, 2008 (gmt 0)

WebmasterWorld Senior Member dstiles is a WebmasterWorld Top Contributor of All Time 5+ Year Member



I don't suppose any of this is due to google screwing up site verification? Or is that just a side effect?

For the third or fourth time in a couple of months we've had to re-verify our nine webmaster-listed sites. No problem - google verified them immediately. The sites range from a few pages to several thousand.

The odd point was the reason they had all become un-verified: when last tested, the sites all returned the following "error":

"We've detected that your site's home page returns a status of 200 (Success) in the header."

Which I always understood was a GOOD status!
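
(For what it's worth, you can confirm the status your home page actually returns with a few lines of Python; www.example.com below is a placeholder for your own domain.)

from urllib.request import Request, urlopen

# Ask for just the headers, the same thing a verification bot looks at.
req = Request("http://www.example.com/", method="HEAD")
with urlopen(req) as resp:
    print(resp.status, resp.reason)   # 200 OK is the normal, healthy answer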

I have to say this doesn't appear to have affected any of the sites' positions significantly, but the un-verification probably only happened about 5 days ago during a site scan, so it may not have had time to percolate yet.

For the record: all sites except one have a google sitemap.

A well-established site not included in the webmaster list (and with no sitemap) changed position for ONE set of keywords during the past five hours or so, down from 41 to 44. The actual position isn't too surprising; other, more relevant keywords currently put the site near the top. There is no detectable positional difference between two computers at the same IP (Linux and Windows).

We will keep a better eye on the positions than we have done.

11:50 pm on Jul 13, 2008 (gmt 0)

5+ Year Member



I ran into that verify error a couple of days ago; it told me the home page returned a status of 200 (Success), but the site was unverified.

This has actually been going on with me for a number of months: sites keep coming up un-verified, and I just click to re-verify them with no problem. I hadn't gotten the 200 (Success) thing until recently, however. I'm using meta tags to verify at the moment; I think I'm going to try switching to the other method.

12:09 am on Jul 14, 2008 (gmt 0)

5+ Year Member



Congratulations, sahm; my rankings are still nowhere. Do you have any dupe meta title/description errors?

WMT says that I have 73 pages (out of thousands) that have duplicate descriptions. I do need to follow up and fix those.

It also says I have about 2,000 duplicate title tags, which are almost all in my message forums, not the main content of my site.
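
For anyone who wants to audit this outside of WMT, here's a rough sketch of the same duplicate-title check; the URLs are placeholders, so swap in pages from your own sitemap.

import re
from collections import defaultdict
from urllib.request import urlopen

urls = [
    "http://www.example.com/",
    "http://www.example.com/forum/topic-1",
    "http://www.example.com/forum/topic-2",
]

# Group pages by their <title> text, then report any title shared
# by more than one URL.
titles = defaultdict(list)
for url in urls:
    html = urlopen(url).read().decode("utf-8", errors="replace")
    m = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    titles[m.group(1).strip() if m else "(no title)"].append(url)

for title, pages in titles.items():
    if len(pages) > 1:
        print("Duplicate title %r on %d pages:" % (title, len(pages)))
        for page in pages:
            print("   " + page)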

My returned rankings have held through the weekend, I'm keeping my fingers crossed.

12:13 am on Jul 14, 2008 (gmt 0)

5+ Year Member



How long did the traffic go for?
2:49 am on Jul 14, 2008 (gmt 0)

5+ Year Member



helpnow: I'm completely logged out of WMT. Also, the site has been verified for some time.

dstiles: Same IP location, 2 different computers. Consistently different results.

I did conduct a datacenter check again earlier; this time I checked 30 of them, and none showed the results I see at my one 'rogue' computer.
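
For reference, that kind of datacenter check can be scripted, though treat the sketch below as illustration only: the IPs are placeholders (real DC lists circulated on forums and changed constantly), and automated querying is against Google's terms.

from urllib.parse import urlencode
from urllib.request import Request, urlopen

DATACENTERS = ["64.233.0.1", "66.102.0.1"]   # placeholder IPs
QUERY = "blue widgets"                        # example query
DOMAIN = "example.com"                        # example domain to look for

for ip in DATACENTERS:
    url = "http://%s/search?%s" % (ip, urlencode({"q": QUERY}))
    req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        html = urlopen(req, timeout=10).read().decode("latin-1")
        print(ip, "found" if DOMAIN in html else "not found")
    except Exception as exc:
        print(ip, "request failed:", exc)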

6:02 am on Jul 14, 2008 (gmt 0)

5+ Year Member



How long did the traffic go for?

38 days...

6:50 am on Jul 14, 2008 (gmt 0)

5+ Year Member



So, browsers do return hidden information to Google, stuff that you might not realise is signalling something about your location, stuff that you might not know about.

FYI - FF sends the language settings with every request. This isn't hidden information. Check the headers and you'll see the language settings going out with each and every request.

I don't suppose any of this is due to google screwing up site verification? Or is that just a side effect?

I'm pretty familiar with the verification issues in GWT. This problem is very widespread, so I think you can safely discount that theory.
8:33 am on Jul 14, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



What I meant by "hidden" is that your average user cannot see that information merely by looking at the normal stuff that appears in the browser window. It's not an option they can select on the site they are visiting, and it isn't a parameter seen in the search URL sent to Google (there are several language and location parameters that can be added to the Google search URL). It's a difference in browser settings "under the hood" which could affect things without you realising that's where Google is getting a hint from.
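
To make the contrast concrete, here's a small sketch of the visible parameters I mean: "hl" (interface language) and "gl" (country) can be put right in the search URL, unlike the under-the-hood browser settings. The query values are just examples.

from urllib.parse import urlencode

def google_search_url(query, hl="en", gl="uk"):
    # hl = interface language, gl = country bias; both are plainly
    # visible in the URL, unlike the Accept-Language header.
    return "http://www.google.com/search?" + urlencode(
        {"q": query, "hl": hl, "gl": gl})

print(google_search_url("blue widgets"))            # UK-biased results
print(google_search_url("blue widgets", gl="us"))   # US-biased results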
11:28 am on Jul 14, 2008 (gmt 0)

5+ Year Member



I wonder if this could be a penalty for duplicate meta titles and descriptions. The program I use produces pages with duplicate meta tags on continuation pages. Both of my sites that were affected by this drop in traffic use the same program and have issues with duplicate meta tags. So far this is the only thing I have found that the two sites have in common.
11:50 am on Jul 14, 2008 (gmt 0)

5+ Year Member



Same here: a bug in a recent VBseo upgrade caused my forum to produce thousands of duplicate pages, and 7 days after the upgrade traffic was at 0, down from 6k. This is the only thing that had changed on my site. I only show at the very bottom of results, like the last page. I fixed it a few days ago along with lots of other on-page stuff; time will tell.

[edited by: Dave_Hybrid at 11:54 am (utc) on July 14, 2008]

2:42 pm on Jul 14, 2008 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



-> I wonder if this could be a penalty for duplicate meta titles and descriptions. <-

What about site owners who don't use WMT? Are they not affected by this penalty?
Is Google only looking at WMT sites? Does Google assume WMT users are SEOs or business-related sites? That would make it easy for Google to guess who is a professional webmaster, wouldn't it?

Are you using WMT? I do! And I lost >50% on two domains!

3:45 pm on Jul 14, 2008 (gmt 0)

5+ Year Member



I can't believe just using WMT would cause a penalty. Last year when my site dropped in Google I wasn't using WMT at all; I only started using it after the drop in traffic.
3:47 pm on Jul 14, 2008 (gmt 0)

10+ Year Member



helpnow: We also have a full recovery today from June 4 - we are better than before. We made many changes... I can document them if anyone cares. Our 40 days and 40 nights in the wilderness are over.

We'd also be interested in looking at the changes you made. If you'd be so kind, please let us know. We lost most of our google traffic gradually over the course of last month.

Many thanks,
ianama

4:41 pm on Jul 14, 2008 (gmt 0)

5+ Year Member



When I first saw a drop in traffic from Google I put it down to these stupid proxy sites. I have no problem with them generally, but when they get their URLs indexed over yours it is going to cause dup content problems. Reading this thread, though, makes me think I may be wrong.

I've also been monitoring my server logs for Googlebot entries recently, and over the past couple of weeks Googlebot has just been reading my sitemap index and the child sitemaps. Very few of my actual web pages are being crawled.

Could this be related to my sites dropping from the index?
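
In case it helps anyone compare notes, here is roughly how the Googlebot activity above can be tallied from a combined-format access log; the log path is an assumption, so adjust it for your server.

import re
from collections import Counter

hits = Counter()
with open("/var/log/apache2/access.log") as log:   # assumed path
    for line in log:
        if "Googlebot" in line:
            m = re.search(r'"(?:GET|HEAD) (\S+)', line)
            if m:
                hits[m.group(1)] += 1

# The twenty paths Googlebot requests most often; if only the sitemap
# files show up here, that matches what I'm describing above.
for path, count in hits.most_common(20):
    print("%6d  %s" % (count, path))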

< continued here [webmasterworld.com...] >

[edited by: tedster at 8:04 am (utc) on July 15, 2008]
