
Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of LAN & goodroi

Google SEO News and Discussion Forum

This 610 message thread spans 21 pages: < < 610 ( 1 ... 7 8 9 10 11 12 13 14 15 16 [17] 18 19 20 21 > >     
Update Allegra Part 2 Update 2-2-2005
Macro

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 27999 posted 11:51 am on Feb 9, 2005 (gmt 0)

complain about a real problem

Can't see the sandbox being a "real" problem, or a problem at all. It's there to stop the SPAM. If I start a new company called Mesothelioma Lawyers Ltd you reckon I should show up in the top 500 purely because that's my company name?

Sure, the "sandbox", whatever it is, hurts some. It hurts people who are creating sites for free traffic. Many of them are spammers/freeloaders. It also hurts others. They - particularly anyone starting a new business with a business plan that relies on free SE traffic - are probably better off staying unemployed (or employed if they can find a job). Any new site starting off on the premise that free traffic will sustain it deserves to fail.

So, if you remove the sandbox as a reasonable cause for complaint, and remove most of the other whining, we'd reduce this thread to one page and those that can't even be bothered to read it will get a personal reply from Googleguy because he owes them.

 

Kangol

5+ Year Member



 
Msg#: 27999 posted 7:53 pm on Feb 15, 2005 (gmt 0)

Europeforvisitors,
I see that you have great appreciation for Google. I also like this engine but this last update is a mess. Things are wrong there. The results are bad from any point of view.

Of course there is not enough room for everybody in the top 10, but in the top 10,000 for my brand name there should be room for me.

For a search term related to my biz, I can understand if there are 100 sites in front of me that are better than mine, but not if the first 1000 spots are blogs with links posted by spammers. Just garbage sites that have nothing to do with that search word come up.

Come on, you gotta admit that Google is broken.

[edited by: Kangol at 7:54 pm (utc) on Feb. 15, 2005]

BeeDeeDubbleU

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 27999 posted 7:53 pm on Feb 15, 2005 (gmt 0)

Google can't give recommendations for getting favorable results in SERPs, because there isn't room for everyone in the top 10 or 100 results for a competitive keyphrase.

Very true but there does not seem to be any room in the [strong]top thousand[/strong] for the sites that are still in the sandbox after one year.

IMHO, the current Google Webmaster guidelines (including the "quality guidelines") are clear and helpful enough. Google could boil the entire page down into one sentence: "Build it and make it crawlable, and they'll come."

Crawlable is not the problem. Featurable is.

tama

10+ Year Member



 
Msg#: 27999 posted 9:01 pm on Feb 15, 2005 (gmt 0)

Google is definitely broken. Check out the results for "<snip>." Nothing but wiki and complete spam.

[edited by: ciml at 9:16 pm (utc) on Feb. 15, 2005]
[edit reason] No specifics please. [/edit]

webhound

10+ Year Member



 
Msg#: 27999 posted 9:27 pm on Feb 15, 2005 (gmt 0)

This update, in our categories, is the same as it has been for the last year: links are king, regardless of how crappy the site is.

Bogus backlinks from blogs, forums, and guest books are the primary problem in my eyes. Google needs to figure out something for devaluing these links or the SERPs will never get better.

That said, I do use Google for obscure, longer-phrase searching. They are still good for that, but for any of the competitive areas, forget it.

tama

10+ Year Member



 
Msg#: 27999 posted 9:28 pm on Feb 15, 2005 (gmt 0)

Sorry about the term being mentioned.

Kangol

5+ Year Member



 
Msg#: 27999 posted 9:32 pm on Feb 15, 2005 (gmt 0)

WOW,
10 out of 10 nonsense results for the search term you mention. I cannot believe how bad those results are. I wonder what actual users that search for this think.

tama

10+ Year Member



 
Msg#: 27999 posted 9:34 pm on Feb 15, 2005 (gmt 0)

#1 for a very competitive term was a 404 which simply said "Page Does Not Exist." No PR, no backlinks, no content.

Another site on the first page was a link farm. The rest was wiki and complete spam.

I've never seen results this bad from any search engine.

theBear

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 27999 posted 9:38 pm on Feb 15, 2005 (gmt 0)

Maybe they ran out of sort room when sorting the results into priority order and forgot to check the return code from the sort ;) . Or the wiki was a cloaked Wookie ;).

xcomm

10+ Year Member



 
Msg#: 27999 posted 9:53 pm on Feb 15, 2005 (gmt 0)

BeeDeeDubbleU,

Just wanna note that I'm out of the sandbox - after nearly one year in it - and now in the new big index. Hope this is true for others too / will happen to you too now!

webhound,

links are king.

Nope, not for my site. I have a low PR of 3 and only 10 good links, not from very high PR sites but on my topic. My pages are pure content, with only a very decent related Amazon link at the bottom (I dropped a decent AdSense before Allegra :-)). I should also note that my site is very small in my competitive area, up against very big spammy competitors, for example from China.
And I'm back with my top pages where they were on my ancient Geocities site, which this new site was derived from. I mean back at the 2002 results, with my new dedicated site, after nearly a year in the sandbox.

walkman



 
Msg#: 27999 posted 10:20 pm on Feb 15, 2005 (gmt 0)

To the people who have been penalized: what is your page similarity? I mean, 70%, 80%, 90% similar? More, less?

glitterball

10+ Year Member



 
Msg#: 27999 posted 10:42 pm on Feb 15, 2005 (gmt 0)

Hi walkman,

Is there a tool somewhere that can give me a percentage similarity figure for the pages on my website?

walkman



 
Msg#: 27999 posted 10:44 pm on Feb 15, 2005 (gmt 0)

I would search for "Similar Page Checker" or something similar. Search Yahoo, of course ;)
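Checkers like the ones being discussed generally boil down to a sequence-similarity ratio over the pages' text. A minimal sketch of that idea in Python, using the standard library's difflib (the sample page strings here are made up for illustration, and real tools differ in how they tokenize and what template text they strip first):

```python
from difflib import SequenceMatcher

def page_similarity(text_a: str, text_b: str) -> float:
    """Estimate how similar two pages' text is, as a percentage (0-100)."""
    return SequenceMatcher(None, text_a, text_b).ratio() * 100

# Two near-duplicate product pages, differing only in one word
page_one = "Widgets for sale. Red widgets, blue widgets, free shipping."
page_two = "Widgets for sale. Teal widgets, blue widgets, free shipping."

score = page_similarity(page_one, page_two)
```

Because different tools strip navigation and templates differently, two checkers can report noticeably different percentages for the same pair of pages, which is worth remembering when comparing the figures posters quote in this thread.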

Jakpot

10+ Year Member



 
Msg#: 27999 posted 10:52 pm on Feb 15, 2005 (gmt 0)

links are king

Not for the keywords I follow, and neither is PR.
On what basis they are ranked is a mystery.

glitterball

10+ Year Member



 
Msg#: 27999 posted 10:52 pm on Feb 15, 2005 (gmt 0)

Thanks for that.

My site's pages range from about 22% to about 50% similar to each other (there is some crossover where the same product is in more than one category).

xrtza

5+ Year Member



 
Msg#: 27999 posted 10:56 pm on Feb 15, 2005 (gmt 0)

Duplicate content is always a possibility, but my gut tells me otherwise. The name game thing seems to me more like a spellchecker: if your company name isn't in Google's dictionary, then it can't be a keyword. Branding is getting harder to establish. Has anyone seen trademarked names not showing up for a brand/company search? Just a thought...

There might be a lot of spamming in this area in an attempt to exploit close-proximity text relationship association. Search engines learn as they go; for example, if the text "elephant widget" is used enough, eventually the engines will make a connection and assume elephant is a type of widget. Keyword suggestion tools will reveal some of these. My company name is very distinct, and if you use any commercial keyword tool and put in my company name, all my keywords are there. Originally there were zero suggestions related to my company name; now they are right on target. I think maybe they want to make it not so easy to hijack someone's brand, or to establish a new one by flooding references to make the association. OK, maybe I am out there... Just a thought... be nice, I am new here.
LOL

leoo24

10+ Year Member



 
Msg#: 27999 posted 10:56 pm on Feb 15, 2005 (gmt 0)

I have a real estate site that was knocked out of the SERPs this update and on Dec 17th. It is a directory of estate agents across hundreds of towns; unfortunately each of those pages is very similar, 70-80%, so I can put the site's fall down to that.
So I've now got to spend a good while filling those pages with individual text to try and reduce that percentage.

walkman



 
Msg#: 27999 posted 11:07 pm on Feb 15, 2005 (gmt 0)

"My site's pages range from about 22% to about 50% similar to each other (there is some crossover where the same product is in more than one category)."

I doubt the dupe penalty kicks in at 50%.

leoo24

10+ Year Member



 
Msg#: 27999 posted 11:13 pm on Feb 15, 2005 (gmt 0)

has anybody had an educated guess at whereabouts it would kick in?

walkman



 
Msg#: 27999 posted 11:28 pm on Feb 15, 2005 (gmt 0)

has anybody had an educated guess at whereabouts it would kick in?

Considering the many sites that have templates but only brief listings (jobs, real estate, scripts, shopping sites with many products), I would say about 80-85%.
valeyard

10+ Year Member



 
Msg#: 27999 posted 11:53 pm on Feb 15, 2005 (gmt 0)

To the people who have been penalized: what is your page similarity? I mean, 70%, 80%, 90% similar? More, less?

Personally I don't think I've been "penalised" as such, I think I'm just collateral damage from a lousy algorithm.

My worst hit site has unique, hand-written content pages. Other than the side-menus etc there is zero dupe content within the site. I have not found any redirect hijacks or direct copies of the site - a few screen scrapers taking snippets, nothing more.

For me a dupe content filter is not the cause of the drop.

My one consolation is that this must be hurting Google as much as it is me. In the last week I have recovered about half of the traffic I lost to Allegra. It's coming via massively increased (in real terms) referrals, mainly from Yahoo, then also MSN and the minor engines.

Judging by my logs, Googla Vista is losing traffic fast.

RichTC

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 27999 posted 12:21 am on Feb 16, 2005 (gmt 0)

Perhaps Google should change the PageRank bar to a score out of 100 rather than 10, and give a higher PageRank to sites rich in deep content.

Why should a PR zero page feature high in the SERPs, knocking out a higher PR page?

Either a high PR means better content or it doesn't.

Also, why doesn't Google take more notice of its own directory? If it knows a human has agreed that a site fits into category X, that site should feature high for the same category X search term.

Also, doorway pages should carry PR zero, and directory sites should not feature, IMO. Take the lot of them out and put all directory sites under a "Directory Sites" related search only. They just clog the index with more Cr@p.

Anyway, rant over. It's certainly time Google did something to move towards quality content results for once. Content should always be king if you ask me.

eyezshine

10+ Year Member



 
Msg#: 27999 posted 12:22 am on Feb 16, 2005 (gmt 0)

From what I've seen: on my site that was hijacked, I just added 4 random quotes to the bottom of all my pages, which may have changed each page about 5-10% from the hijacker's copy, and that seemed to do the trick.

The site is now back in the serps and getting traffic again. But it's not getting even close to the traffic it was before the hijacking.

I am positive the hijackings are causing dupe penalties, and I have added random stuff to all my old and new sites to prevent them from getting dupe penalties again.

All of my sites have stopped losing rank and every week they get more and more traffic, slowly but steadily rising again.

BeeDeeDubbleU

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 27999 posted 12:25 am on Feb 16, 2005 (gmt 0)

Just wanna note that I'm out of the sandbox - after nearly one year in it - and now in the new big index. Hope this is true for others too / will happen to you too now!

Well done Xcomm. When did this happen and have you been out for a few days?

AndyA

5+ Year Member



 
Msg#: 27999 posted 12:58 am on Feb 16, 2005 (gmt 0)

Some of the pages on my site may be a little too close for comfort in setting off the duplicate filter. What's the best way, short term, to ensure the fastest recovery?

Disallow that section with robots.txt? Then remove the disallow after the pages have been redone? Will Google restore the rest of the site after it notices that part has been disallowed?
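For reference, the robots.txt approach described above would look something like this; the /widgets/ directory is a hypothetical stand-in for whatever section holds the near-duplicate pages:

```
User-agent: Googlebot
Disallow: /widgets/
```

A disallow only stops Googlebot from crawling those URLs; whether the rest of the site recovers once the disallowed section drops out is exactly the open question the post raises, and removing the disallow later lets the reworked pages be recrawled.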

guitaristinus

10+ Year Member



 
Msg#: 27999 posted 1:11 am on Feb 16, 2005 (gmt 0)

AndyA,

Leave your site the way it is on your present domain. Get another domain, move your site to it, then copy and paste, search and replace the heck out of your site on the new domain. Change the order of keywords in title/header. Combine pages, throw whatever on it. Feed it to Google.

Let me explain a bit.

I've had several sites banned/penalized by Google and they've never made it back. Leave your site for the other engines. MSN, Yahoo and Ask Jeeves can bring you some decent traffic. Don't go making big changes on your site for Google.

docbird

10+ Year Member



 
Msg#: 27999 posted 1:27 am on Feb 16, 2005 (gmt 0)

Seems I'm out of sandbox - for the time being. (For site that went live last March; an older site appears little changed.)

I mentioned in a different thread that I did well for "obscure place", but if I switched the search to "obscure place famous city" my site nearly vanished.
Now I can include "famous city" in searches for a few terms (I haven't tested many), and the site is doing reasonably, more as I'd anticipate.

Visits nicely up, too; and Google now well outperforming Yahoo for sending visits (reverse of previous situation). Change from around 7 Feb.

Hope this continues; certainly set to encourage me to add more, keep on developing the site.
Onward, to glory! (and not, err, back to the sandbox quagmire... fingers crossed)

androidtech

10+ Year Member



 
Msg#: 27999 posted 2:16 am on Feb 16, 2005 (gmt 0)

Google is making me love Microsoft (MSN search).

That statement, although true, is bizarre on way too many levels.

max_mm

10+ Year Member



 
Msg#: 27999 posted 3:24 am on Feb 16, 2005 (gmt 0)

I have some good news for a change. The traffic on the two of my sites that got hit hardest has started to creep back again.

The recovery started 3 days ago, with approx 100 to 300 G hits being added daily to a day's total uniques.

Here's what my logs look like (the last few days), average uniques for both sites. They used to average 3000 uniques per day pre-Allegra:

¦¦¦¦¦¦ (800 unique)
¦¦¦¦¦¦ (700 unique)
¦¦¦¦¦¦¦ (870 unique)
¦¦¦¦¦¦¦ (900 unique)
¦¦¦¦¦¦¦¦¦ (1400 unique)
¦¦¦¦¦¦¦¦¦¦ (1500 unique)
¦¦¦¦¦¦¦¦¦¦¦ (1750 unique)

It is like the pages are moving up the G dial again.

I've said it before and I'll say it again: hold your horses. This update is not over and the numbers are still being crunched. I don't think anyone can draw a definite conclusion re Allegra. It does not look like a major algo change; simply a case of a glitch, or way too much data to crunch.

The often bad SERPs with redirect untitled pages and/or link farms appearing at 1st position for various key terms further support my theory that the SERPs are full of raw spider data that hasn’t been processed via the normal algo, PR and spam filters, yet.

Why they are made public and allowed to clog the "surfer Joe index" is another good question, which may strongly indicate that the glitch theory is the case here.

Patience!

max_mm

10+ Year Member



 
Msg#: 27999 posted 3:58 am on Feb 16, 2005 (gmt 0)

P.S.
"<current news story>" (no quotes) on G still brings up an untitled redirect document at first position. Page Cache date 14 Feb.

Nice clean results on Y though for the same keyphrase...check it out.

Now wouldn't G have filtered this page from the SERPs if it had had a chance to properly process it against the algo?

My conclusion: it is just a major (and hopefully temporary) glitch!

[edited by: ciml at 12:06 pm (utc) on Feb. 16, 2005]
[edit reason] Examplified [/edit]

walkman



 
Msg#: 27999 posted 4:40 am on Feb 16, 2005 (gmt 0)

max_mm,
good news is not allowed in this thread ;)

First, congrats.
Second, on what DCs are you doing that well?

Your theory about the data not being processed yet makes sense, since it seems like our sites (which can't rank for our own domains) get no backlink, PR, or any other credit at all. It's either not being processed, or simply ignored because of a filter.

max_mm

10+ Year Member



 
Msg#: 27999 posted 5:08 am on Feb 16, 2005 (gmt 0)

Thanks walkman. I just hope it lasts.

Re what DCs:
The hits data in my post is from my server logs. I gave up checking any DCs for my keywords. They still fluctuate a lot... a waste of time / no point.

Simply follow your server logs closely for Google referrals... they should (hopefully) start to climb slowly as your keywords move up the dial and G continues to crunch the PR numbers and finalise this update.

Hope you (and everyone else affected) will experience return to normal business soon!

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved