
Google News Archive Forum

This 105 message thread spans 4 pages; this is page 4.
Post Austin SERPS Starting to Improve
Keep turning the crank back google
customdy
msg:211403
1:54 am on Feb 5, 2004 (gmt 0)

In the last few hours I have noticed a moderate improvement in keywords that were heavily filtered in Florida and Austin. One of my competitors, whose domain name is keyword1_keyword2, is now back in the top 10; he was gone in both Florida and Austin, and it doesn't look like he made any changes.

We are now back to page #2 or #3 on most 2 word searches. We have reduced keyword density, but I think it is more of a tweak that Google is doing.

Keep it coming, Google.

 

Hissingsid
msg:211493
5:15 pm on Feb 6, 2004 (gmt 0)

No matter how G changes the algo, there will always be ways to spam and cheat. But if they continue with this one, the web will soon be full of those balloon sites I was describing earlier on this thread.

Moving deck chairs around on the Titanic!

Can you sink a search engine?

Best wishes

Sid

VERY Long PS
Has anyone noticed that links that lead somewhere, but not to anywhere Googlebot can go, seem to be given the same weight in this algo as real open links? Most of the outbound links from these crappy directories are via some form of script with a long query string in the URL. As a side issue, there's even one of these URLs listed in the SERPs above me which redirects to my site.

The main point is that Googlebot seems to note that there are links to somewhere with the right anchor text in them, but it can't be checking where they are going; it just uses them to feed the ranking algo, irrespective of whether they are blind alleys or not.

There seem to be two kinds of these links: ones to a script on the current domain (or an associated domain) with a query string which is fed into the redirection script, and ones on the affiliate domain in which the query string tells the click counter at the other end who sent the referral. Both kinds of link seem to be counted by Google in this new algo.

Perhaps this is specifically the new spam. If this is about DomainPark, that makes sense, because all of the links on the pages generated by Google DomainPark would be this kind of link but would have the right keywords in the anchor text.

What do you think?

Best wishes again

Sid
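The two link shapes Sid describes can be sketched in a few lines. This is a minimal illustration only: the script name (`go.cgi`), domains, and parameter names (`url`, `ref`) are hypothetical, not taken from any real directory.

```python
from urllib.parse import urlencode, urlparse, parse_qs

def local_redirect_link(destination):
    """First kind: a link to a redirection script on the current (or an
    associated) domain; the real target is hidden in the query string,
    so a crawler that stops at the script URL never sees the page."""
    return "http://www.example.com/go.cgi?" + urlencode({"url": destination})

def affiliate_link(affiliate_id, target_path):
    """Second kind: a link to the affiliate's own domain; the query
    string tells the click counter at the other end who sent the referral."""
    return ("http://www.affiliate-example.com" + target_path
            + "?" + urlencode({"ref": affiliate_id}))

def destination_of(link):
    """To learn the real target of the first kind, a crawler would have
    to pull it back out of the query string and follow the redirect."""
    params = parse_qs(urlparse(link).query)
    return params.get("url", [None])[0]

link = local_redirect_link("http://www.example.org/widgets.html")
print(link)
print(destination_of(link))  # http://www.example.org/widgets.html
```

In both cases the anchor text sits on the visible link while the destination is buried in the query string, which is the situation Sid suggests the algo is crediting without verifying.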

subway
msg:211494
5:41 pm on Feb 6, 2004 (gmt 0)

Has anyone got any theories about precisely *why* these pages with hundreds of unrelated outgoing links are performing so well in this daft algo?

Yes...

A search engine ranks sites by keeping the cr*p at the bottom and sending the relevant non-cr*p to the top (simplified, but true). When you apply a filter to very content-rich, SEO'd sites (that give the user what they are looking for), and the filter is probably 50x harsher than it needs to be, the once good and relevant sites get pushed to the bottom, exposing a "vacuum of cr*piness": sites that escape the filter, probably because they lack backward links but are still heavily spammy, rise to the top. The "spammy directories" have always been there, just never so noticeable before.

[edited by: subway at 6:02 pm (utc) on Feb. 6, 2004]

thumpcyc
msg:211495
5:53 pm on Feb 6, 2004 (gmt 0)

This Thread, "Post Austin SERPS Starting to Improve"

"Post" as in after?

Is the Austin update over?

Are there still G DNS entries missing in action?
Are there still multiple sets of SERPs? (Or is it now everflux?)

Has the fat lady sung already? Is GG joining in the tune?

I have no answers for the big G's current state of being, just questions. The main one: is Austin over?

Thumpcyc

drewls
msg:211496
6:03 pm on Feb 6, 2004 (gmt 0)

From where I sit, Austin isn't any more over than it was on the day it started. There are still different sets of SERPs. They're still hiding things from us (though I expect that to continue indefinitely, as Google appears to have decided to create an adversarial relationship with webmasters, rather than the former, more friendly and open one).

pavlin
msg:211497
6:21 pm on Feb 6, 2004 (gmt 0)

Hissingsid:
Google bot seems to note that there are links to somewhere with the right anchor text in them but it can't be checking where they are going

Why would GG have problems following the links?

Zeberdee
msg:211498
6:25 pm on Feb 6, 2004 (gmt 0)

"as Google appears to have decided to create an adversarial relationship with webmasters"

Don't forget the adversarial relationship which they have created with many searchers.

Does anyone know when we can expect our pages to be removed from Google? We put that noindex,nofollow tag for googlebot all over our site on Tuesday, but we still appear for all sorts of unrelated searches.

When will that filter through so that we are gone completely?
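For reference, the directive Zeberdee describes is a robots meta tag in each page's head, and what a crawler does with it can be sketched as below. The page markup is illustrative only; a tag aimed specifically at Google would use `name="googlebot"` rather than the generic `"robots"`.

```python
from html.parser import HTMLParser

# Illustrative page carrying the noindex,nofollow directive.
PAGE = """<html><head>
<meta name="robots" content="noindex,nofollow">
<title>Example page</title>
</head><body><a href="/other-page">a link</a></body></html>"""

class RobotsMetaParser(HTMLParser):
    """Collects the tokens from any robots/googlebot meta tag on a page."""

    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() in ("robots", "googlebot"):
            for token in attrs.get("content", "").split(","):
                self.directives.add(token.strip().lower())

parser = RobotsMetaParser()
parser.feed(PAGE)
print("noindex" in parser.directives)   # True: drop the page from the index
print("nofollow" in parser.directives)  # True: don't follow its links
```

The tag only takes effect when the crawler next fetches the page and the index is rebuilt, which is consistent with pages lingering in results for a while after the tag goes up.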

caveman
msg:211499
6:29 pm on Feb 6, 2004 (gmt 0)

There are still different sets of SERPs.

Consider the possibility that this is now normal.

They're still hiding things from us.

This has always been normal. :-)

Chelsea
msg:211500
6:32 pm on Feb 6, 2004 (gmt 0)

Google appears to have decided to create an adversarial relationship with webmasters

I would be cautious about posting this sort of message, because it is precisely what may be used to justify search results that continue to decline in quality.

caveman
msg:211501
9:03 pm on Feb 6, 2004 (gmt 0)

LOL...

G may have more than one goal WRT the SERPs they're serving right now, but irritating webmasters isn't one of them.

steveb
msg:211502
9:08 pm on Feb 6, 2004 (gmt 0)

To repeat, what I noticed about +a, +www and +keyword was that they were completely different from two days ago (except that I had never checked +keyword before).

They have no relationship whatsoever to pre-Austin, pre-Florida, or pre-Paleozoic.

Previously, +a seemed to pull up the old results that were vaguely similar to normal results and vaguely similar to the old anchor text trash algo.

Now, I see results that seem to be algorithmically valuing "a" as if it were gold. It brings up sites where "...keyword a..." is in the title.

In other words, +a will never show old algorithm or "unfiltered" results. It will not show anything that can be sanely compared to the present "normal" results. Previously adding +a changed little. Now adding +a changes things to a different type of searching altogether.

But then, +www shouldn't change as much, since "...keyword www..." isn't in page titles as often as "a" is. What I see here is a completely different ranking algorithm. There isn't a noticeable effect on "lost" sites.

Finally, keyword +keyword... what the heck should this do anyway? Make it so the keyword appears at least twice on a page? Value double keyword density? Highly value sites that have the keyword twice in a title? Beats me. But what I get is yet another very different group of results... although these are at least vaguely similar to "normal" results -- except that two of the worst-SEO'd sites on the planet that are genuine niche authority sites happen to rank drastically higher this way (like moving from 100+ to the top twenty).

I dunno, but this is completely different than a couple days ago. Minimally +a is worthless for any comparison or speculating now.

MyWifeSays
msg:211503
9:12 pm on Feb 6, 2004 (gmt 0)

Zeberdee,

You can get your pages out in 24 hours. Try [services.google.com:8882...]

Hissingsid
msg:211504
10:09 pm on Feb 6, 2004 (gmt 0)

Why would GG have problems following the links?

I thought that preventing Googlebot from following links, and thereby passing on PR, was why this kind of obfuscated query was invented. Are you suggesting that robots are hitting PPC links and that Overture, Espotting, etc. are charging for the clicks?

One odd thing about some kinds of redirected URLs is that they actually get into the index as though they were themselves pages, but they have no content, and Google cannot list any description with the link to what purports to be a page but is really a URL that ultimately links to a page. If Googlebot had followed it properly, rather than stopping and indexing the URL as a page, then it would have some content to add to the index.

Best wishes

Sid

pavlin
msg:211505
10:15 pm on Feb 6, 2004 (gmt 0)

I don't get the point. How exactly do they prevent Googlebot from following the links?

centrifugal
msg:211506
10:42 pm on Feb 6, 2004 (gmt 0)

>>I dunno, but this is completely different than a couple days ago. Minimally +a is worthless for any comparison or speculating now.

Maybe because G intentionally made it so webmasters would be even more baffled? Just a guess, but it seems in keeping with the rest of G's actions as of late.

zgb999
msg:211507
1:17 pm on Feb 8, 2004 (gmt 0)

"Finally keyword +keyword... what the heck should this do anyway? "

steveb,
From what I am seeing, it looks like this query kicks out some pages that got their SERP position because of synonyms. keyword +keyword seems to prefer pages that use the keyword itself and not a synonym of it.
