Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

27 June screwup - theory

         

donelson

10:26 pm on Jul 8, 2006 (gmt 0)

10+ Year Member



We have a bunch of sites that have been badly affected by the 27 June screwup.

However, we have one site that is still #1 for the two main keywords.

I have looked at various theories, to no avail so far.

Here's another ---

Do any of you have badly affected sites in which the home page has AdSense with pictures right above the AdSense banner?

I have four pix semi-aligned above the three- or four-text AdSense listings.

Google actually wrote me an email a while back saying this was okay as long as the pictures were not intended to mislead visitors, just to "draw the eye" to the AdSense area.

BUT, the site I have that's not affected by the 27 June screwup does NOT have these pix above the AdSense area.

Yes, another screwy theory --- anyone else think this might be a problem?

steveb

9:38 pm on Jul 10, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"So if your site dropped that means all the other sites are doing better in the existing algo."

No it doesn't mean that at all.

June 27th was just another screw-up. Problems with sites obviously have nothing to do with an algo. What algo decides that a supplemental result that doesn't exist should replace the main index page for a domain, or that random pages from a domain will be pinned at the bottom of site: search results and rank horribly while the rest of the domain does just fine?

This has happened seven or eight times now, and while something new could happen next, the most likely thing to assume is things will next happen as they have in the past... some screwups this time will be fixed next time, while other screwups will take place.

Martin40

10:06 pm on Jul 10, 2006 (gmt 0)

10+ Year Member



So how about www <<>> non-www 301 being seen as PR-spamming?

It should be obvious, though, that it's not an algo change that caused 6/27. If it were just a data refresh on an existing algo, then the 6/27 results should have been seen before 6/27.

[edited by: Martin40 at 10:09 pm (utc) on July 10, 2006]

almar

10:13 pm on Jul 10, 2006 (gmt 0)

10+ Year Member



We finally recovered from the Jagger(s) stabbing in the back and regained our placement in the SERPs. Now that we're financially starting to recover from that loss, we're being pinched by the June 27 screw-up. Our daily AdSense revenue could barely buy us dinner this week; it hurts Google's pocketbook as much as ours. What gives? When will search and ads ever be on the same page at Google? If you think they are, ask our Google salesperson, who is forever clueless as to how the engine works.

ashear

10:29 pm on Jul 10, 2006 (gmt 0)

10+ Year Member



It seems to me that Google would benefit from opening up a beta search. Before any major update, allow the public a chance to say "This looks great" or "This sucks" and ask why.

They seem to want help with all of their other products, why not search?

ontrack

11:22 pm on Jul 10, 2006 (gmt 0)

10+ Year Member



>> 64.233.189.104 I'm back too without any site: problems

Interesting: in my case the ranking is still much worse than before, but my index page shows up first for site:www.mysite.com, though not for site:mysite.com.

Can someone explain what the difference is between the results for site:www.mysite.com and just site:mysite.com?

I pay a hosting company to host my site rather than doing it myself, and I've never done anything special regarding www or non-www. Is there something important I'm missing that I should ask them to do on the server?
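For what it's worth, the standard server-side fix for the www/non-www question is a 301 redirect, so that only one hostname ever gets indexed. A minimal sketch, assuming an Apache host that honors .htaccess and has mod_rewrite enabled (example.com is a placeholder); this is exactly the kind of thing you can ask your hosting company to set up:

```apache
# Redirect every non-www request to the www hostname with a permanent (301)
# redirect, so Google treats www.example.com as the single canonical version.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The same pattern works in reverse if you prefer the non-www version; the important part is picking one and redirecting the other with a 301, not a 302.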

Thanks :)

Northstar

12:44 am on Jul 11, 2006 (gmt 0)

10+ Year Member



Looking at Google today, it looks like some of my pages that were cached on June 27th were cached again on July 6th. Do they cache pages very often? Whatever happened on June 27th wasn't a normal caching of pages, was it? I haven't seen any change in traffic. It is still way, way down.

SuddenlySara

1:06 am on Jul 11, 2006 (gmt 0)



we should just call this update "clean up"

Northstar

1:10 am on Jul 11, 2006 (gmt 0)

10+ Year Member



I just checked 64.233.189.104 and I'm back to index and main pages first. I hope this is a good sign.

trinorthlighting

1:29 am on Jul 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



steveb,

If you read Matt's blog, it was a data refresh on the existing algo. That means they did not have all the data. Once the data dropped back in, the SERPs adjusted. A lot of the "deindexed" sites came back into the index.

So, like I said, if you took a drop in the SERPs, that means other sites outrank you in the existing algo.
I am just pointing out that Matt stated that the changes on the 27th are going to stick.

Matt also said there will be another data refresh in two weeks. They obviously realized they were missing some data.

steveb

1:43 am on Jul 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"That means they did not have all the data."

No it doesn't. Where did you get that idea?

"So, like I said, if you took a drop in the serps, that means other sites outrank you in the exisiting algo."

And that's not what you said. You said "So if your site dropped that means all the other sites are doing better in the existing algo," which is illogical at best. I don't know what you think you mean to say, but a data refresh means the data was refreshed. Like Matt said, it was not an algo change but merely a data refresh, and obviously refreshing the data can mean some will be added, some will be lost, and some will be misinterpreted.

SuddenlySara

1:51 am on Jul 11, 2006 (gmt 0)



trinorthlighting, you need to read between the lines at Matt's blog. He is not going to tell you what is happening; he does not know. Let's not use his blog to figure out their search results here.

trinorthlighting

2:26 am on Jul 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Steve,

I was referring to a lot of the Big Daddy dropped data being reindexed and the supplemental index being recrawled and refreshed. So, with newer data now in the index, if you suddenly fell in the SERPs, then you really need to look at what sites replaced you.

Just keep in mind, during Big Daddy when a lot of sites dropped, people actually looked at their sites, found mistakes, and fixed them; I know I did. I was busy making meta tags unique, getting W3C compliant (this helped me find a lot of simple errors that were preventing me from being spidered properly), etc. This all means people actually improved their sites, and Googlebot finally caught the newer data. I started changing my site up right at the beginning of the dropped-pages era of Big Daddy, and on the 27th Google finally caught it.


kidder

3:50 am on Jul 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



trinorthlighting - I'm not sure what market sectors you're watching, but the search results should tell you this whole thing is far from done. Google is about doing search well; right now it is far from doing that. I've seen quite a few sites resurface since the 27th, but many of them are just gap fill from where I sit.

There is still a big problem when older sites drop off the charts altogether.

There is still a big problem when thousands of sites went supplemental overnight on the 27th.

There is still a big problem while Google is reporting site page counts that are not even within 20% of actual.

The site: command is still a big problem, reporting supplementals before the index page.

You can bet your left one Google is not even halfway happy about how things stand right now, you can also bet we are not looking at a finished product this week. Expect further changes, nothing is more certain.

elbimbo

4:52 am on Jul 11, 2006 (gmt 0)

10+ Year Member



64.233.189.104
Here too, the site: command shows the index page first, and for some keywords I'm back in the top 20, but not in the top 10 like before the 27th.

CainIV

6:34 am on Jul 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member




"So, with newer data now in the index, if you suddenly fell in the serps, then you really need to look at what sites replaced you."

Not true whatsoever, especially in the case of canonical errors with perfectly fine sites and hyphenated domain problems.

Contrary to belief, being W3c compliant, as good as it is for the web world, has nothing to do with ranking well.

Dayo_UK

8:42 am on Jul 11, 2006 (gmt 0)



"So, with newer data now in the index, if you suddenly fell in the serps, then you really need to look at what sites replaced you."

Agree with CainIV

Not true whatsoever. If we were talking about a change where some sites appeared at the top of the SERPs while others lost ground to them because they had more backlinks, better PR, better title tags, better keyword density, etc., then fine.

But the fact is that the sites that have lost position are now not being read (I suppose this term fits) properly by Googlebot, and one of the side effects of this is easy to see: the homepage is no longer top in a site:domain.com search.

This is not a new problem with Google. It has been discussed lots of times, and G have even talked about fixing it, but that was a year ago and still we are no nearer.

As steveb says, it is logical to assume that we will continue on this path of screw-up after screw-up, as Google have been unable to fix this problem to date.

Just Guessing

9:19 am on Jul 11, 2006 (gmt 0)

10+ Year Member



Guessing (as ever): "refreshing data used by an existing algorithm" = computing a site-wide quality factor

How often does Google compute site-wide things like:
% of IBLs that are trusted
% of OBLs that are relevant
% of OBLs that are affiliate links
% of OBLs that are reciprocal
% of pages in the site that are near duplicates

Also "refreshing data used by an existing algorithm" doesn't state whether the same parameter settings (weightings?) were used in the existing algorithm.

site-wide quality factor < X ==> splat
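Purely to illustrate the speculation above, here is a sketch of what a thresholded site-wide quality factor could look like. Everything here is hypothetical: the metric names come from the list above, but the weights, the clamping, and the threshold X are invented for the example.

```python
# Hypothetical illustration of a "site-wide quality factor" threshold.
# The weights and threshold below are invented; Google has published
# nothing of the sort.

def site_quality(ibl_trusted_pct, obl_relevant_pct, obl_affiliate_pct,
                 obl_reciprocal_pct, near_duplicate_pct):
    """Combine site-wide percentages (each 0.0-1.0) into one score.

    Trusted inbound links and relevant outbound links count as positive
    signals; affiliate links, reciprocal links, and near-duplicate pages
    count as negative ones. The result is clamped to [0, 1].
    """
    score = (0.4 * ibl_trusted_pct
             + 0.3 * obl_relevant_pct
             - 0.1 * obl_affiliate_pct
             - 0.1 * obl_reciprocal_pct
             - 0.1 * near_duplicate_pct)
    return max(0.0, min(1.0, score))

SPLAT_THRESHOLD = 0.2  # the hypothetical X in "quality < X ==> splat"

def splat(score):
    """True if the site-wide score falls below the splat threshold."""
    return score < SPLAT_THRESHOLD

# A site with mostly trusted links and little duplication stays safe,
good_site = site_quality(0.8, 0.9, 0.1, 0.1, 0.05)
# while a site full of affiliate/reciprocal links and duplicates splats.
bad_site = site_quality(0.1, 0.2, 0.9, 0.8, 0.9)
```

The point of the sketch is only that a single site-wide number, refreshed in one batch, would produce exactly the all-or-nothing "splat" behavior people are describing, without any change to the ranking algo itself.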

Just Guessing

steveb

9:29 am on Jul 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"So, with newer data now in the index, if you suddenly fell in the serps, then you really need to look at what sites replaced you."

I can't imagine how the heck you come up with that. It should be clear at this point that first you need to look at your own domain. Other domains are entirely irrelevant to this phenomenon. This is not an update, so there is nothing algorithmic to talk about. It's about one batch of data replacing another, which means some errors were fixed and some errors were made, and in this case it seems clear that many more errors were made than were fixed by the refresh. Talk about algos or penalties or ranking is not just missing the point; it goes off in unrelated directions.

Northstar

10:53 am on Jul 11, 2006 (gmt 0)

10+ Year Member



This makes no sense. Last night my site was showing up normally on both google.com and 66.102.9.104. This morning it is back to being messed up again, with supplementals first. What is going on?

ScottD

12:15 pm on Jul 11, 2006 (gmt 0)

10+ Year Member



I think Just Guessing is on the right track to some degree.

It's a data update that happened (per MC), but what is the data? And how is it used?

The site: problem is not new; it's just that the sites suffering from it are new to the problem.

However, at the same time, the site: problem is screwy: it makes no sense for Google to list some obscure or even supplemental page before the home page.

Still, you can't fix Google, so maybe concentrate on what Google is seeing in your "data".

trinorthlighting

12:57 pm on Jul 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's not about backlinks; my homepage has none, and very little PageRank as well. I know that in the sector I am watching, I was about the only one who made good solid changes to the pages (mostly improving internal linking structure and following Google's guidelines to a T).

As far as the W3C goes, validation helped me catch title tags that were not closed, bad href tags, etc. Once I fixed those and the pages were recrawled, they were no longer supplemental.
Unclosed title tags are a killer in Google. I learned that lesson the hard way.
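A full W3C validation pass is the real answer here, but as a quick illustration of the kind of check involved, here is a small sketch (a hypothetical helper, not part of any validator) that flags pages where opening <title> tags outnumber closing ones:

```python
import re

def has_unclosed_title(html):
    """Return True if <title> tags outnumber </title> tags in the page.

    A crude stand-in for a real validator: it only counts tags, so it
    can't catch nesting problems, but it does catch the common case of
    a <title> that is never closed.
    """
    opens = len(re.findall(r'<title\b', html, re.IGNORECASE))
    closes = len(re.findall(r'</title>', html, re.IGNORECASE))
    return opens > closes

# A page whose <title> is never closed (the kind of bug described above)
broken = "<html><head><title>My Widgets<head><body>...</body></html>"
# The same page with the title properly closed
fixed = "<html><head><title>My Widgets</title></head><body>...</body></html>"
```

Running the broken page through a real validator reports the same problem, along with the other structural errors that regexes alone will miss.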

I see very recently cached supplemental pages in my sector, where before the 27th the supplemental results were dated June to August. Now they are more recent. Supplementals do factor into the equation: as that index was recrawled and rescored, it was going to cause changes. Those changes are finally hitting with the data refresh.

It is not that any of you have bad sites; I looked at most of your sites in the past and they are good. What I am saying is: look at the sites that replaced you and the changes they made, and see if you spot a trend.

Tinus

1:49 pm on Jul 11, 2006 (gmt 0)

10+ Year Member



My theory:
Supplementals first is not a sign of problems at Google but the effect of the filter Google applies to their data. When the filter lands on your site, this is what you see.

One of my sites had suffered supplementals-first since 22 September, and traffic from Google was low. On 27 June the changes came, and it was homepage and non-supps first for me. I have extremely good results now.
I think the 22 September filter was lifted from the sites suffering under it and applied again to new data. Back then the filter was assumed to be triggered by duplicate content and internal linking (using too many keywords in the link text).

My ten cents.

Tinus

2:01 pm on Jul 11, 2006 (gmt 0)

10+ Year Member



<<.... Still, you can't fix Google, so maybe concentrate on what Google is seeing in your "data">>

Wise words!

Jakpot

2:12 pm on Jul 11, 2006 (gmt 0)

10+ Year Member



<<.... Still, you can't fix Google, so maybe concentrate on what Google is seeing in your "data">>
Wise words!

If only I could define what they are seeing.

mimo

2:39 pm on Jul 11, 2006 (gmt 0)

10+ Year Member



"But the fact is that the sites that have lost position are now not being read"

I hope it will be as you said.

Well, I will wait a month before writing my opinion.
I manage different websites created with the same HTML template.
Some old sites with relevant keywords (which sat in the first 3 positions over the last year) have disappeared or dropped from the rankings. Some new sites have jumped to first position; this is very strange.

The rankings that I see give an advantage to:
- affiliate programs (duplicate content)
- keyword stuffing in the title and in an H1 at the top
- keyword density (more than 8% per keyword)

I put my hope in what Matt Cutts said: the advice he'd give is to make sure that the site adds value and has original content.

M3Guy

3:09 pm on Jul 11, 2006 (gmt 0)

10+ Year Member



>>>we should just call this update "clean up"

You can call this update/refresh many things, but a 'clean up' is not one of them.

There are many options available to you; just enter a word of your choice in the stars.

This update is a monumental '**** up' that no one will admit to, and it will keep being referred to as a data refresh.

skweb

3:16 pm on Jul 11, 2006 (gmt 0)

10+ Year Member



There is something else I am seeing across my portfolio of affected websites: the pages that have gone supplemental are not there because they lack good incoming links; they are showing the wrong page titles. As others are reporting, rather than using just the text from the title tag, for some strange reason Google is also using snippets from the text below it. I don't buy the theory that pages have gone supplemental because they didn't have enough links. I have websites that were launched just weeks ago with hardly any incoming links, and nothing has happened to them; actually, they rank much higher than I would expect.

Lorel

4:36 pm on Jul 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



ScottD:
whilst suffering from the site: problem I actually got an answer from Google confirming that the site was not suffering because of any penalty, so I don't think it's that

Google also doesn't think it's a penalty to be hijacked, when your home page and other important pages completely disappear because someone has set up 302 redirects to them.

Re. the changes in site command --

I have a client who has lost up to 80% of his pages since April (the counts fluctuate up and down). Until recently, the www version of his domain was showing only the home page in the site: command, but now its page count is the same as for the non-www version (still about 57% of pages missing, most of them totally unique content; none of them are directly linked from the home page, however).

This site has been up for about 9 years and had a 302 redirect set up in February, and this is the first time Google has credited the site:www.domain command with more than the home page. Now both commands show the same number of pages.

So it appears to me it is finally fixing the canonicalization issue for this site, although about 2/3 of the pages are still missing.

BTW, this site has been ranking #1 for most of its terms even before the design and while pages are missing, so those pages have not been penalized. It is hurting in traffic, however (down by a third), because of the missing pages.

Martin40

8:41 pm on Jul 11, 2006 (gmt 0)

10+ Year Member



"So it appears to me it is finally fixing the canonicalization issue"

So if they're messing with canonicalization, could a www <<>> non-www 301 throw an algo (beta) off base?

See also [webmasterworld.com...]

[edited by: Martin40 at 8:51 pm (utc) on July 11, 2006]

Martin40

9:25 pm on Jul 11, 2006 (gmt 0)

10+ Year Member



It's also interesting to see that a 2.5 year old Apache 301 tutorial is now featured on this forum's homepage (July 4th). Coincidence?

[edited by: Martin40 at 9:32 pm (utc) on July 11, 2006]

This 192-message thread spans 7 pages.