So: if your pages are still supplemental, feel free to write to sesnyc06 [at] gmail.com with the subject line of "stillsupplemental" (all one word), and I'll ask someone to check the emails out.
Hope that helps, and I'm glad that lots of people are seeing a full recovery,
[edited by: Brett_Tabke at 5:20 pm (utc) on Mar. 22, 2006]
So I remain quietly confident.
Yeah .... part of me does too, but silence is part of not knowing .... and that's got a lot of us jumping up and down.
More communication from Matt's team would help the webmasters .... but I guess, what else can they say?
1] Doing a site: search on BD shows all but the first 15 or so results as supplemental.
2] Doing individual keyword searches shows most results for the same site as non-supplemental.
3] Doing a search for the site name only, without www and .com, shows no supplementals, and thousands of results.
From this, it appears that the site: search has a glitch which turns up many more supplementals than there are in reality.
Bear in mind, these are observations on one site only, so it would be interesting to see if others notice the same.
I only bring this up because this site appears to be suffering from the same problems that most of the posters here have experienced.
My first take on this website is that it has serious canonical issues (www and non-www pages are duplicated throughout the site).
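The real fix for www/non-www duplication is a server-side 301 redirect so only one hostname ever serves content, but as a rough illustration of what "collapsing" the variants means, here is a minimal Python sketch. The `canonicalize` helper and the `example.com` URLs are hypothetical, just for illustration; nothing here comes from the actual site under discussion.

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, prefer_www=True):
    """Map www/non-www (and empty-path) variants of a URL onto one
    canonical form, so duplicate addresses collapse to a single page."""
    scheme, netloc, path, query, _ = urlsplit(url)
    host = netloc.lower()
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    elif not prefer_www and host.startswith("www."):
        host = host[len("www."):]
    # An empty path and "/" are the same page; drop any fragment too.
    return urlunsplit((scheme, host, path or "/", query, ""))

print(canonicalize("http://example.com/page.html"))
# http://www.example.com/page.html
```

On a live site the same mapping would be enforced with a 301 redirect at the web server, so spiders and inbound links consolidate onto one version instead of splitting PR between two.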
Any thoughts, guys, on this post? I wish to establish what to do next to assist sites that have been stuffed by this algo change (albeit one possibly not finished, or f@cked up, or whatever, by Google). With a severe traffic fall-off on a number of authority sites, it's either try to fix the sites to fit what Google wants PDQ, or sack staff :-
Have fellow webmasters noticed a drop in rank for pages with the following:-
1. Dynamic pages (a preference for Google to list a static page that's not so relevant rather than a dynamic page that is highly relevant, regardless of backlinks)?
2. Pages that have JavaScript that may open windows with advertising links?
3. Pages that simply have a high number of internal links on them (following, perhaps, a turn of the link dial to accept only pages with a lower number of outbound links than previously acceptable)?
4. And in general, is PR worth next to nothing now, as a PR0 can outrank a PR5, and a PR6 doesn't count for much either?
I'm trying to get my head round this update. One of our sites, and a few of the other ones we work on, were clearly stuffed by Google, and I can't understand it.
One example: a dynamic page, rich in content relevant to a keyword string, that was previously ranking OK at position 8 in the SERPs was dropped. The page is a PR5 and has backlinks from two PR7 authority sites (not paid for, genuine) on topic, amongst many other backlinks. Meanwhile, a static page with possibly few backlinks other than internal ones, and nowhere near the content, ranks at position 18 from the site instead.
In other situations, various dynamic pages that were like, say, "blue widgets in widgetville", and that ranked top 5 for that term, have now vanished; instead we see, maybe at position 33, "pink widgets in widgetville". It's like Google prefers to list pages that are not so relevant.
It's like Google has decided that if you have more than one page containing the keyword string, it will go for the weakest one and drop the prime ones into the supplemental bin.
One final thing: a page about something really specific, one of only about 7 pages on the whole net on the topic, ranks #1 out of the 7, but as a supplemental result. What's that about?
Any thoughts fellow webmasters can give me would be appreciated because doing nothing is not an option now
Yep - no doubt in a million years that the site in question has a canonical url problem.
If you look at <rk> values for that site then they are returning again - although they are still split from non-www to www - which is a bit worrying for the fix that is supposed to be forthcoming. But if Google can at least rank/crawl pages even if there is a non-www/www version it would be a start.
I have not seen a site go supplemental yet that did not have a canonical-URL-related problem, although others are saying they are sure they have no canonical problem.
Would love a sticky if anyone thinks their site has been hit but it's not due to this issue.
Let Google rank expired domains / 404s / copied sites / totally unrelated sites on top, and kick honest sites into the supplemental club.
Google will go the Internet Explorer way.
Cya all. I'll continue doing what I have been doing. I don't need to SEO my site to help a search engine list it.
Are there others in this situation? Any thoughts on this situation?
Yes this describes my situation exactly, also.
That describes my situation:
My old site (online since 2000) has 900k pages listed in non-BD and 4,000k pages in BD, and has been in supplemental hell since early March 2006.
My new site, online since Jan 2006 (1,900k pages in BD), dropped overnight in traffic by 90%. No supplemental hell, but big changes across the BD DCs (~600k pages ... 9,000k (!) on the original BD DC (188.8.131.52)).
HTH, Greetings from Germany
[edited by: tedster at 7:13 pm (utc) on Mar. 19, 2006]
[edit reason] charter [webmasterworld.com] [/edit]
I stuck to my guns with Google and got past their usual canned-response rubbish to find out the site is not penalised in any way. Apparently I should just sit tight and all will be OK.
Traffic is not too adversely affected yet, as it looks like the DCs rotate and a lot of the time the old (good) results get shown. I just hope that the new (screwed-up) results are not the ones that stick when this all concludes.
I set up a shopping cart on a site that had been doing pretty well for a few years (all products were listed on static pages), and as soon as the shopping cart went up, all the shopping-cart pages turned supplemental and all major keywords tanked, even though I had set up robots.txt for Google to stay out of that directory. I have since put rel=nofollow on every link to the shopping cart, so we'll see if that helps. The pages all validate and there are no other problems with the site, not even hijackers. The only culprit I see is supplementals.
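For what it's worth, whether a Disallow rule actually keeps Googlebot out of a directory can be sanity-checked offline with Python's standard-library robots.txt parser. A small sketch, assuming a hypothetical /cart/ directory on example.com (not the actual site or rules from this post):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking a shopping-cart directory
ROBOTS_TXT = """\
User-agent: *
Disallow: /cart/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Cart URLs are blocked; the rest of the site stays crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/cart/item?id=1"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/products.html"))   # True
```

Worth noting: robots.txt only stops crawling. A blocked URL that is still linked to can appear in the index as a URL-only entry, which may be part of what's being seen here.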
Matt Cutts Said,
March 19, 2006 @ 9:10 pm
Gary, I'm still talking about this issue with the crawl/indexing folks. I'm still working on it.
This open line of communication has to be good news. Matt is responding quickly to my requests for updates every few days. I have asked for a further update as soon as he believes they have a fix.
I see this as positive for the time being at least.
All supped. I don't plan on doing it, as I would rather wait this out, but what would happen if you wanted to start over with a new domain name... could your pages then get reindexed into the normal index over time using the new domain?
Thanks for the load of stickies I have had.
Everyone I have seen so far has split PR between the non-www and the www (e.g. canonical problems).
Looks like a different stage of the same problems that have blighted Google for so long.
Where are you, folks? News in Matt's blog, only 9 hours old:
"Abhilash, I checked on one site that was in the supplemental situation and it had gotten ~240 regular pages back. It may take some of the sites a little while to be crawled again, but I'm trying to keep it foremost in the minds of the crawl folks."
Now I have new hope that this glitch will be fixed.
Greetings from Germany,
Google Sitemaps showed yesterday that it hadn't visited my sitemap for a week. But it did visit it. I resubmitted the sitemap, and now it is working correctly again.
There's something wrong at the point where the data gets passed through, I think.
March 21, 2006 @ 12:00 am
"Abhilash, I checked on one site that was in the supplemental situation and it had gotten ~240 regular pages back. It may take some of the sites a little while to be crawled again, but I'm trying to keep it foremost in the minds of the crawl folks."