complain about a real problem
Can't see the sandbox being a "real" problem, or a problem at all. It's there to stop the SPAM. If I start a new company called Mesothelioma Lawyers Ltd you reckon I should show up in the top 500 purely because that's my company name?
Sure, the "sandbox", whatever it is, hurts some. It hurts people who are creating sites for free traffic. Many of them are spammers/freeloaders. It also hurts others. They - particularly anyone starting a new business with a business plan that relies on free SE traffic - are probably better off staying unemployed (or employed if they can find a job). Any new site starting off on the premise that free traffic will sustain it deserves to fail.
So, if you remove the sandbox as a reasonable cause for complaint, and remove most of the other whining, we'd reduce this thread to one page and those that can't even be bothered to read it will get a personal reply from Googleguy because he owes them.
Of course there is not enough room for everybody in the top 10, but in the top 10,000 for my brand name there should be room for me.
For a search term related to my biz I can understand if there are 100 sites in front of me that are better than me, but not if the first 1,000 spots are blogs with links posted by spammers. Just garbage sites that have nothing to do with that search word come up.
Come on, you gotta admit that Google is broken.
[edited by: Kangol at 7:54 pm (utc) on Feb. 15, 2005]
Google can't give recommendations for getting favorable results in SERPs, because there isn't room for everyone in the top 10 or 100 results for a competitive keyphrase.
Very true, but there does not seem to be any room in the [strong]top thousand[/strong] for the sites that are still in the sandbox after one year.
IMHO, the current Google Webmaster guidelines (including the "quality guidelines") are clear and helpful enough. Google could boil the entire page down into one sentence: "Build it and make it crawlable, and they'll come."
Crawlable is not the problem. Featurable is.
Bogus backlinks from blogs, forums, and guest books are the primary problem in my eyes. Google needs to figure out something for devaluing these links or the SERPs will never get better.
That said, I do use Google for obscure, longer-phrase searches. They are still good for that, but for any of the competitive areas, forget it.
Another site on the first page was a link farm. The rest was wiki and complete spam.
I've never seen results this bad from any search engine.
Just wanna note that I'm out of the sandbox - after nearly one year in it - and now in the new big index. Hope this holds true for others too / will happen to you too now!
links are king.
Nope, not for my site. I have a low PR 3 and only 10 good links, not from very high PR sites but on my topic. My pages are pure content, with only a very decent related Amazon link at the bottom (I dropped a decent AdSense spot before Allegra :-)). I should also note that my site is very small in my competitive area, up against very big, spammy competitors, for example from China.
And my top pages are back where they were on the ancient Geocities site this new site was derived from. I mean back to the results of 2002, with my new dedicated site, after nearly a year in the sandbox.
I doubt the dupe penalty kicks in at 50%.
"Has anybody had an educated guess at whereabouts it would kick in?"
To the people who have been penalized: what is your page similarity? I mean, 70%, 80%, 90% similar? More, less?
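To put a rough number on "page similarity", here is one hedged sketch of how it could be measured: break each page's text into word shingles and take the Jaccard overlap of the two sets. Nobody outside Google knows what measure (if any) they actually use, and the shingle size here is an arbitrary illustrative choice, so treat the percentages this produces as comparable to each other, not to whatever any dupe filter might look at.

```python
def shingles(text, n=5):
    """Return the set of word n-grams ("shingles") in a page's text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(page_a, page_b, n=5):
    """Jaccard overlap of two pages' shingle sets, in the range 0.0-1.0."""
    a, b = shingles(page_a, n), shingles(page_b, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Toy pages: one word changed out of ten.
original = "the quick brown fox jumps over the lazy dog today"
tweaked  = "the quick brown fox jumps over the sleepy dog today"

print(round(similarity(original, original), 2))  # → 1.0
print(round(similarity(original, tweaked), 2))   # → 0.33
```

Note how a single changed word knocks out every shingle that spans it, so shingle similarity drops much faster than a naive "percent of words in common" figure would suggest.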
Personally I don't think I've been "penalised" as such, I think I'm just collateral damage from a lousy algorithm.
My worst hit site has unique, hand-written content pages. Other than the side-menus etc there is zero dupe content within the site. I have not found any redirect hijacks or direct copies of the site - a few screen scrapers taking snippets, nothing more.
For me a dupe content filter is not the cause of the drop.
My one consolation is that this must be hurting Google as much as it is me. In the last week I have recovered about half of the traffic I lost to Allegra. It's coming via massively increased (in real terms) referrals from (mainly) Yahoo then also MSN and minor engines.
Judging by my logs, Googla Vista is losing traffic fast.
Why should a PR zero page feature high in the SERPs, knocking out a higher-PR page?
Either a high PR means better content or it doesn't.
Also, why doesn't Google take more notice of its own directory? If it knows a human has agreed that a site fits into category X, it should feature high for the same category X search term.
Also, doorway pages should carry PR zero, and directory sites should not feature, IMO. Take the lot of them out and put all directory sites under a "Directory Sites" related search only. They just clog the index with more cr@p.
Anyway, rant over. It's certainly time Google did something to move towards quality content results for once. Content should always be king if you ask me.
The site is now back in the serps and getting traffic again. But it's not getting even close to the traffic it was before the hijacking.
I am positive the hijackings are causing dupe penalties, and I have added random stuff to all my old and new sites to prevent them from getting dupe penalties again.
All of my sites have stopped losing rank and every week they get more and more traffic, slowly but steadily rising again.
Disallow that section with robots.txt? Then remove the disallow after the pages have been redone? Will Google restore the rest of the site after it notices that part has been disallowed?
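For reference, a minimal robots.txt sketch of that first idea (the /duped-section/ path is made up for illustration; whether Google restores the rest of the site afterwards is exactly the open question above):

```
User-agent: Googlebot
Disallow: /duped-section/
```

Once the pages in that section have been redone, deleting the Disallow line lets Googlebot recrawl them.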
Leave your site the way it is on your present domain. Get another domain, move your site to it, then copy and paste, search and replace the heck out of your site on the new domain. Change the order of keywords in title/header. Combine pages, throw whatever on it. Feed it to Google.
Let me explain a bit.
I've had several sites banned/penalized by Google and they've never made it back. Leave your site for the other engines. MSN, Yahoo and Ask Jeeves can bring you some decent traffic. Don't go making big changes on your site for Google.
As mentioned in a different thread, I did well for "obscure place", but if I switched the search to "obscure place famous city" my site nearly vanished.
Now I can include "famous place" in searches for a few terms (haven't tested many), and the site is doing reasonably, more as I'd anticipate.
Visits nicely up, too; and Google now well outperforming Yahoo for sending visits (reverse of previous situation). Change from around 7 Feb.
Hope this continues; certainly set to encourage me to add more, keep on developing the site.
Onward, to glory! (and not, err, back to the sandbox quagmire... fingers crossed)
The recovery started 3 days ago, with approx 100 to 300 G hits being added daily to each day's unique total.
Here's what my logs look like (last few days), average uniques for both sites. They used to average 3,000 uniques per day pre-Allegra:
¦¦¦¦¦¦ (800 unique)
¦¦¦¦¦¦ (700 unique)
¦¦¦¦¦¦¦ (870 unique)
¦¦¦¦¦¦¦ (900 unique)
¦¦¦¦¦¦¦¦¦ (1400 unique)
¦¦¦¦¦¦¦¦¦¦ (1500 unique)
¦¦¦¦¦¦¦¦¦¦¦ (1750 unique)
It is like the pages are moving up the G dial again.
I've said it before and I'll say it again: hold your horses. This update is not over and numbers are still being crunched. I don't think anyone can draw a definite conclusion re: Allegra. It does not look like a major algo change - simply a case of a glitch, or way too much data to crunch.
The often-bad SERPs, with redirecting untitled pages and/or link farms appearing in 1st position for various key terms, further support my theory that the SERPs are full of raw spider data that hasn't yet been processed through the normal algo, PR, and spam filters.
Why these pages are made public to clog the "surfer Joe index" is another good question, which may strongly indicate that the glitch theory is what's happening here.
Nice clean results on Y though for the same keyphrase...check it out.
Now, wouldn't G have filtered this page from the SERPs if it had had a chance to properly process it against the algo?
My conclusion: it is just a major (and hopefully temporary) glitch!
[edited by: ciml at 12:06 pm (utc) on Feb. 16, 2005]
[edit reason] Examplified [/edit]
Second, which DCs are you seeing that are doing that well?
Your theory about the data not being processed yet makes sense, since it seems like our sites (which can't even rank for their own domain names) get no backlink, PR, or any other credit at all. It's either not being processed, or simply ignored because of a filter.