Forum Moderators: Robert Charlton & goodroi
Here’s the email I sent Google and the response I got:
[Email excerpts of ANY type or length are not allowed on
WebmasterWorld. There are no exceptions to this rule.
Terms of Service #9 [webmasterworld.com]]
[edited by: tedster at 12:50 am (utc) on April 21, 2006]
Mine are custom-made in Dreamweaver, totally unique and not the cookie-cutter, commercial plug-and-play type.
You work hard for several years, up late at night, falling asleep at the computer, to create a great site with unique content. The bots come along and you are as excited as a new father...
.. then they take it all away in one single weekend.
It's like having your house stolen, removed off the foundations with no explanation... except for the toilet... because that's where all our hard work has just gone.
This is a wake-up call, guys... we have to be more independent. We are being treated like little puppets on strings: one snip and we are dead.
Martin.
Google Sitemaps won't fix this! Something is not right with G. I'm seeing pages vanish by the day, and the few other webmasters I chat to within my market are all seeing the same thing.
I agree with you; there is something not right with G's index. I have seen a lot of strange problems since last weekend. For example, a keyword with 300,000 results has only two pages of SERPs on some datacenters; the same web page (completely identical, one URL) was listed twice, on page 1 and page 2 (they fixed this). A sandboxed website of mine was shown as "Domain.Tld" instead of "title of domain.tld" when searching "site:www.domain.tld" on the 64.233.187.104 dc.
One of my personal fan sites uses the Google API as its in-house search engine, and I found that the SERPs for my website have reverted to early January.
It's really about time G put their hands up and said something is not working and a fix is being worked on!
They are busy working on... see the top-right corner of this page <g>.
It used to be the case that you worked hard at ranking your sites; now it's a battle just to keep them in the index!
Same stuff is happening to some of my sites; one's gone from 1.2 million to 364 pages indexed.
______________________________________________
Could someone please explain to me in plain English how it is humanly possible to have 1.2 million pages on Google?
Surely there is some serious automation going on that Google has figured out.
yandos, could you please send me the URL privately? My comprehension of how one produces over 1 million pages obviously needs some serious education.
Thank you.
Martin.
My comprehension of how one produces over 1 million pages obviously needs some serious education.
All 3rd-level pages have disappeared.
Maybe it's a duplicate content filter gone mad? I call that a pretty silly fight against duplicate spam, if that is what Google is trying to achieve.
What we see now is homepages appearing instead of exact deep-linked matches, forcing the user to click through to the wanted information, and honest publishers being hit instead of spammers. I would imagine that if people really wanted to inflate the index with their pages, they would already have changed (automatically) a few words to reach, say, 85% duplication, shuffled the whole nonsense, or generated another template to make the page look different to the bot in order to show their copied content, right?
What a mess. I liked the previous practice of filtering by relevancy and pulling irrelevant results to the end of the SERPs (for all I care, they could kick them out completely as well).
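The "85% duplication" figure above hints at shingle-style similarity checks. As a hedged illustration only (not Google's actual filter, whose details are unknown), here is a minimal sketch of word-shingle overlap between two pages:

```python
def shingles(text, k=4):
    """Set of k-word shingles (overlapping word windows) from a text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=4):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "the quick brown fox jumps over the lazy dog near the river bank"
tweaked  = "the quick brown fox leaps over the lazy dog near the river bank"
print(round(similarity(original, tweaked), 2))  # → 0.43
```

Note how changing a single word disturbs every shingle that covers it, dropping the score from 1.0 to about 0.43 here, which is exactly why the automated word-swapping described above can slip copied content past a naive shingle filter.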
However, one sentence in Google's answer to a previous poster caught my attention:
In general, webmasters can improve the rank of their sites by increasing
the number of high-quality sites that link to their pages.
So you have to get valuable inbound links to your deep pages; a sitemap or a deep link from your own website does not suffice? This would be crazy.
Has Google given up its relevancy criteria and is instead automatically cutting out valuable deep content? Like "enough already, we've seen it all..."
Let's hope that it's only an unsuccessful test.
It's quite possible. Think 10-year-old forums, joke sites, recipe sites, blogs, or anything old with user-created content. There are a lot of sites that fit that description, and they are perfectly legitimate.
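The arithmetic behind that is worth spelling out. A back-of-the-envelope sketch (all numbers hypothetical, not any real site) of how an old forum reaches seven figures of URLs without anyone hand-writing a page:

```python
# Hypothetical ten-year-old forum; the point is the multiplication,
# not any particular software or site.
posts_per_day = 150
days = 10 * 365
posts = posts_per_day * days            # 547,500 posts over ten years
per_post_urls = posts                   # a "show single post" view per post
posts_per_page = 15
thread_pages = posts // posts_per_page  # paginated thread listings
members = 60_000                        # one profile page each

crawlable = per_post_urls + thread_pages + members
print_view = crawlable                  # many forums add a print version of each page
total = crawlable + print_view
print(f"{total:,}")                     # prints 1,288,000
```

Per-post views and print-friendly duplicates are precisely the kind of automatic URL inflation a duplicate-content filter might target, which would tie back to the deindexing complaints in this thread.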
I've noticed that many of the posts about pages gone missing are from people who have massive sites (not all). I can't help but think that G is finally clearing out some of the dross. To be honest - if you have a million pages, and you're not Wikipedia, most of it was generated to increase SE traffic. We all know this here. So does Google.
[edited by: Atomic at 2:08 am (utc) on April 21, 2006]
Har de, you are funny!
I'd say with several hundred thousand pages and having a backlink from NASA... that you are doing pretty good and certainly no dummy.
Since we can't benefit from URLs in here, could you please send me the URL to your site privately?
Thanks, there really is life beyond Pluto.
Martin.
I've noticed that many of the posts about pages gone missing are from people who have massive sites (not all). I can't help but think that G is finally clearing out some of the dross. To be honest - if you have a million pages, and you're not Wikipedia, most of it was generated to increase SE traffic. We all know this here. So does Google.
You've hit the nail on the head here. Please don't take this the wrong way, but it is precisely dumb talk like that which appears to be winning the day at Google nowadays.
The notion that you can reliably generalise about the "quality" of a website by a simple count of its pages, or by its rate of growth, or by its linking structure, or by its subject matter, or by its age... is just plain ridiculous. Of course there is spam out there, but the term is far too liberally applied. All too often, people use it to describe any website that is unlike their own.
Ironically, by many of its own metrics, Google is the biggest spam site out there. 99.9999999% of Google is just scraped duplicate content, is it not?
I'll have to chip into this topic as well.
Some of our clients' websites have content disappearing and titles mixed up with those of other sites sharing the same server.
I've just applied a dedicated IP address to some of the sites to help fix the shared titles.
This happened 2 weeks ago when the PR was being updated. Today the titles are still wrong, but it's too early for them to have updated it.
Issues we can see so far are:
Wrong titles, with some reporting missing titles altogether
Pages disappearing
Incorrect PR values, probably based on missing content/pages
Huge flux in SERPs
What to do?
Nothing. Just like Florida and the other updates: take a drink, enjoy spring, decorate the house, write a will, try something new, and leave Google to dance.