Forum Moderators: Robert Charlton & goodroi
This time I prefer to keep calm about this update. So instead of starting a thread about how evil-ish Google is, I decided to start one to discover the cause of all this mess.
I'll try to explain all the SEO-relevant characteristics of my affected site (I have other sites not affected by this update), and I hope more people do the same, so we can find a pattern and act accordingly.
Morphology:
The site is three years old and is structured in folders; every folder is about a different theme, and the themes are not related to each other.
Every folder has articles (unique content), a discussion forum, a links section and in some cases, photo galleries.
Most of the articles have a thread in the forum to discuss them, and the first user comments are displayed under the article. After the comments there is a link to the related forum thread, so the discussion can go on without disturbing the article too much.
The article has a link to the thread, but the forum thread has no link back to the article.
I run Adsense ads in all the pages of the site.
Inbound links:
4 of the subwebs (folders) of the site have an inbound link from 4 different DMOZ categories.
There are some (maybe 3 or 4) link exchanges, but from/to related sites.
Outbound links:
All the outbound links are to 'good' sites. The outbound links are usually only in the links section, and some directly from the articles.
Inner linkage:
The main page of the domain links to all the folders of the site.
All the pages in a folder have a link to the rest of the pages of the same folder.
In addition, the footer of every page has links to the root pages of the other folders.
Every folder has a valid sitemap, submitted to Google a few months ago.
Special folders:
One of the mini-sites (folders) is a 'free photo album' application, so there are a lot of pages with the same text but a different picture.
Another mini-site is a directory of hotels and restaurants in a city in Spain, so again there are a lot of 'similar' pages.
Evolution in the serps:
For the last two months the number of indexed pages in Google has been growing, after a stay in supplemental hell.
The position in the SERPs for a broad range of searches was quite good, always on the first page for my targeted keywords and variations.
Panic actions:
I know I should have stayed away from making changes now, but.... Today I've created a robots.txt that excludes Googlebot from indexing the images of the free photo album site, and the detail pages of every hotel and restaurant on the Spanish city site.
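For what it's worth, the file I ended up with looks roughly like this. The folder names here are made up for illustration; the real site uses different ones:

```text
# Keep Googlebot out of the duplicate-heavy sections only
User-agent: Googlebot
Disallow: /photoalbum/images/
Disallow: /cityguide/hotel-details/
Disallow: /cityguide/restaurant-details/

# Everyone else (and every other folder) stays crawlable
User-agent: *
Disallow:
```

Note that the more specific Googlebot block takes precedence for Googlebot, so the open "User-agent: *" section doesn't undo the exclusions.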
---------------
Any similarity with your affected sites?
[edited by: tedster at 8:07 pm (utc) on June 28, 2006]
[edited by: FinanceGirls at 6:26 pm (utc) on June 28, 2006]
P.S. Listen to Tigger; he was a very good friend to me, and many others, during Jagger last year. He knows the ropes.
Best Wishes
Colin :-)
My main 6-year-old site has dropped into the abyss. I would never have imagined that this small site would ever rank that high. Something must be terribly wrong.
It isn't a white hat site, it's a brilliant pure white helmet site!
I really hope it's a (recoverable) mistake on Goog's side, I really hope so. Do I sound desperate? I am.
Them that has the Gold, makes the Rules.
If too many advertisers or adsense publishers are losing money on this, then SO IS GOOGLE.
And no matter what Anyone says about "editorial independence", that independence is never allowed to badly affect the sacred Revenue Stream. Otherwise the shareholders revolt.
The last time we went through this (much too recently), everything eventually settled back into (mostly) the same rankings we had before.
At this point, I am just going to bend over and enjoy (?) being scr*wed by Google AGAIN for a while. Sigh.
WOW... this is bad... just WOW... I hope Google is looking at this as we speak.
I have an old, established site, with 2 dmoz listings, a wiki listing, etc.
Checking my site: command, all my results are supplemental, though all the pages are still listed. Curiously, the main index page is not at the top of the results, though it's still in the list.
[edited by: FrostyMug at 6:49 pm (utc) on June 28, 2006]
I saw quite a few of my high-traffic sites plummet in Google referrals mid-day (East Coast USA) on the 27th. My site: command shows plenty of pages indexed, if not all of them; it's just the traffic that's GONE.
Is there another thread (I didn't see any) that discusses WTH happened on June 27th, or is this the thread? :p
Is there another thread?
In addition to this present thread, "June 27- changes are a complete disaster", there is June 27 - we fully recovered from traffic drop [webmasterworld.com]
Apparently the new results are quite the mixed bag. I don't think we have any conclusions so far.
[edited by: tedster at 7:10 pm (utc) on June 28, 2006]
One thing I noticed was that the sites being served up right now don't have any affiliate links in them.
I'm seeing that as well, which is what happened on 1/5 and lasted for two days. Give it a little time; I hope it will recover again.
One of my sites that made it to the top yesterday has zero affiliate links. My main site which dropped has many.
One of my sites that made it to the top yesterday has zero affiliate links. My main site which dropped has many.
I also do not believe this theory.
I use a single affiliate, on only 3% of one of my domains.
Most of my domains with far less Google traffic have no affiliate links.
Only Google AdSense. I think Google does not have anything against AdSense.
One of my sites that made it to the top yesterday has zero affiliate links. My main site which dropped has many.
I've got pages with affiliate links (in some cases, pages that consist only of annotated affiliate links) that continue to rank #1 for competitive phrases. So I don't think affiliate links per se are a reason to be dropped. (Affiliate sites might or might not be a different story.)
One oddity, though: I have a few pages of annotated affiliate links that went missing from page 1 of the SERPs a year ago and have reappeared and disappeared at various times ever since. Right now they seem to be missing again, but that could change tomorrow or next month.
My take: Big Daddy is using a newer Mozilla bot, and it is very picky about coding. How does your coding look on your sites?
When I was going through my site I found a bunch of href errors. Once those were cleaned up, bam, I was back in the index.
All of my traffic came back, and I have pages in the index with zero PageRank, etc...
I cleaned up my code and became W3C compliant. My site came right back, in a higher SERP position.
Google has a good thing going, and they're not going to mess it up because "you won't play ball with W3C".
Remember, SHAREHOLDERS rule Google now.
I believe that this is (another) stupid Google mistake, just like the last TWO.
(Unfortunately, it's going to cost us AND THEM a pretty penny!)
The new Mozilla bot is not as forgiving of bad code. One of my sites is proof of that. Here is what made me drop out of the index originally:
1. Title tags not being closed
2. Header tags not being closed
3. Href tag capitalization: I had a capital A HREF instead of a lowercase a href
My pages slowly started dropping out when the newer Mozilla bot came around. Then once Big Daddy was released, my pages dropped out completely.
I fixed the simple errors and I was reindexed...
What I am saying is that bad code is what causes a lot of sites to drop. One of my sites is proof of that. Check your coding for errors, especially on internal pages.
A good indicator is to look at your web stats. If you see Googlebot visiting just one page at a time and not following links (especially when there are new links on that particular page), then you might have some code that is tripping Googlebot up.
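To illustrate the kind of check being described, here is a toy lint of my own for exactly the three errors listed above: unclosed title tags, unclosed header tags, and the capital A HREF. This is just a sketch with a made-up function name; a real validator such as the W3C one catches far more:

```python
import re

def lint_html(html):
    """Rough check for the three errors discussed in this thread.
    Returns a list of warning strings (empty means none found)."""
    warnings = []
    # 1. Title tag not being closed: count opens vs. closes
    if len(re.findall(r"<title\b", html, re.I)) != len(re.findall(r"</title>", html, re.I)):
        warnings.append("unclosed <title> tag")
    # 2. Header tags (h1-h6) not being closed
    for n in range(1, 7):
        opens = len(re.findall(rf"<h{n}\b", html, re.I))
        closes = len(re.findall(rf"</h{n}>", html, re.I))
        if opens != closes:
            warnings.append(f"unclosed <h{n}> tag")
    # 3. Capitalized anchor tags, e.g. <A HREF=...> instead of <a href=...>
    if re.search(r"<A\s+HREF", html):
        warnings.append("uppercase A HREF (use lowercase <a href>)")
    return warnings

page = '<title>Widgets<h1>Widgets</h1><A HREF="/x">x</A>'
print(lint_html(page))
# prints ['unclosed <title> tag', 'uppercase A HREF (use lowercase <a href>)']
```

Run it over a saved copy of a page that Googlebot seems to stall on; if it flags anything, the W3C validator will show you the full picture.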
[edited by: trinorthlighting at 9:29 pm (utc) on June 28, 2006]
So, you are saying that Google cares more about W3C than content? I don't believe it.
I think the point is that finding and fixing coding errors can help you a lot in getting pages fully spidered and properly indexed -- not that you get brownie points for W3C validation.
I have probably checked several hundred domains that were having Google troubles since Big Daddy rolled out. In almost every case I saw either basic coding errors (especially bad table tags or unclosed tags in general) or a lack of really basic "best practices" (unique titles and meta descriptions, and unique url structures).
It is worth the time to write good code. The whole danged search engine is automated and algorithmic. Why just hope that your particular errors are already accounted for?
If you do all this, then Google may still glitch up on you -- but the situation is simpler and more likely to get fixed quickly.
You said in your original post you have 3-4 links from link exchanges.
Even though they might be relevant, they could be hurting as well.
3 or 4 links don't seem like enough reason for a drop of 200 positions, but well, it's a possibility.
Is there anyone without any reciprocal link that has been hit?
3 or 4 links don't seem too much reason for a downfall of 200 positions, but well, it's a possibility. Is there anyone without any reciprocal link that has been hit?
Matt Cutts recently posted an article about a site that had site-wide links to other sites in its footers. Matt's conclusion: paid links!
My footers look exactly like that, but the links are all my own sites. But a bot wouldn't know that.
Anyone with links in his/her footers?
[edited by: Martin40 at 10:03 pm (utc) on June 28, 2006]