|No matter how G changes the algo, there will always be ways to spam and cheat. But if they continue with this one, the web will soon be full of those balloon sites I was describing earlier in this thread. |
Moving deck chairs around on the Titanic!
Can you sink a search engine?
VERY Long PS
Has anyone noticed that links which lead somewhere, but not anywhere that Googlebot can go, seem to be given the same weight in this algo as real open links? Most of the outbound links from these crappy directories go via some form of script with a long query string in the URL. As a side issue, there's even one of these URLs listed in the SERPs above me which redirects to my site.
The main point is that Googlebot seems to note that there are links to somewhere with the right anchor text, but it can't be checking where they are going; it just uses them to feed the ranking algo irrespective of whether they are blind alleys or not.
There seem to be two kinds of these links: ones pointing to a script on the current domain (or an associated domain), with a query string that is fed into the redirection script, and ones on the affiliate domain, where the query string tells the click counter at the other end who sent the referral. Both kinds of link seem to be counted by Google in this new algo.
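The first kind described above can be sketched like this (a minimal illustration with made-up names like `go.php` and `dest`; real directories vary): the visible href stays on the directory's own domain, and the real destination travels only inside the query string, so a crawler that stops at the script URL never reaches the target page.

```python
# Sketch of how a directory's redirect link hides its destination.
# The href points at a script on the directory's own domain; the real
# target is percent-encoded into the query string and only revealed
# when the script issues an HTTP redirect.
from urllib.parse import urlparse, parse_qs

def redirect_target(link_url):
    """Return the destination a redirect link would actually send a
    browser to, by decoding its query string (parameter name 'dest'
    is hypothetical)."""
    query = parse_qs(urlparse(link_url).query)
    return query.get("dest", [None])[0]

# Anchor text can say anything ("blue widgets"); the href never
# mentions the destination domain directly:
link = "/go.php?dest=http%3A%2F%2Fexample.com%2Fwidgets"
print(redirect_target(link))  # → http://example.com/widgets
```

The second (affiliate) kind works the same way structurally, except the script lives on the destination's side and logs who sent the click before redirecting.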
Perhaps this is specifically the new spam. If this is about DomainPark, it makes sense, because all of the links on the pages generated by Google DomainPark would be this kind of link, but would have the right keywords in the anchor text.
What do you think?
Best wishes again
|Has anyone got any theories about precisely *why* these pages with hundreds of unrelated outgoing links are performing so well in this daft algo? |
A search engine ranks sites by keeping the cr*p at the bottom and sending the relevant non-cr*p to the top (simplified, but true). When you apply a filter to very content-rich, SEO'd sites (ones that give the user what they are looking for), and the filter is probably 50x harsher than it needs to be, the once good and relevant sites get pushed to the bottom, exposing a "vacuum of cr*piness": sites that escape the filter, probably because they lack backward links but are still heavily spammy, and so rise to the top. The "spammy directories" have always been there, just never so noticeable before.
[edited by: subway at 6:02 pm (utc) on Feb. 6, 2004]
This Thread, "Post Austin SERPS Starting to Improve"
"Post" as in after?
Is the Austin update over?
Are there still G DNS entries missing in action?
Are there still multiple sets of SERPs? (Or is it now everflux?)
Has the fat lady sung already? Is GG joining in the tune?
I have no answers for the big G's current state of being, just questions. The main one: is Austin over?
From where I sit, Austin isn't any more over than it was on the day it started. There are still different sets of SERPs. They're still hiding things from us. (though, I expect that to continue indefinitely, as Google appears to have decided to create an adversarial relationship with webmasters, rather than the former, more friendly and open one.)
|Google bot seems to note that there are links to somewhere with the right anchor text in them but it can't be checking where they are going |
Why would GG have problems following the links?
"as Google appears to have decided to create an adversarial relationship with webmasters"
Don't forget the adversarial relationship which they have created with many searchers.
Does anyone know when we can expect our pages to be removed from Google? We put that noindex,nofollow tag for googlebot all over our site on Tuesday but we still appear for all sorts of unrelated searches.
When will that filter through so that we have gone completely?
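For reference, the tag described above is usually placed in the `<head>` of every page; this is a standard form addressed specifically to Googlebot (how quickly Google acts on it is another question):

```html
<!-- Tells Googlebot not to index this page or follow its links -->
<meta name="googlebot" content="noindex,nofollow">
```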
|There are still different sets of SERPs. |
Consider the possibility that this is now normal.
|They're still hiding things from us. |
This has always been normal. :-)
|Google appears to have decided to create an adversarial relationship with webmasters |
I would be cautious about posting this sort of message, because it is precisely what may be used to justify search results that continue to decline in quality.
G may have more than one goal WRT the SERPs they're serving right now, but irritating webmasters isn't one of them.
To repeat, what I noticed about +a, +www and +keyword was that they were completely different than two days ago (except that I had never checked +keyword before).
They have no relationship whatsoever to pre-Austin, pre-Florida, or pre-Paleozoic.
Previously, +a seemed to pull up the old results that were vaguely similar to normal results and vaguely similar to the old anchor text trash algo.
Now, I see results that seem to be algorithmically valuing "a" as if it were gold. It brings up sites where "...keyword a..." is in the title.
In other words, +a will never show old algorithm or "unfiltered" results. It will not show anything that can be sanely compared to the present "normal" results. Previously adding +a changed little. Now adding +a changes things to a different type of searching altogether.
But then, +www shouldn't change as much, since "...keyword www..." isn't as often in any page titles as "a". What I see here is a completely different ranking algorithm. There isn't a noticeable effect on "lost" sites.
Finally keyword +keyword... what the heck should this do anyway? Make it so the keyword appears at least twice on a page? Value double keyword density? Highly value sites that have the keyword twice in a title? Beats me. But what I get is yet another very different group of results... although these are at least vaguely similar to "normal" results -- except that two of the worst seo'ed sites on the planet that are genuine niche authority sites happen to rank drastically higher this way (like moving from 100+ to top twenty).
I dunno, but this is completely different than a couple days ago. Minimally +a is worthless for any comparison or speculating now.
You can get your pages out in 24 hours. Try [services.google.com:8882...]
|Why would GG have problems following the links? |
I thought that preventing Googlebot following links and thereby passing on PR was why this kind of obfuscated query was invented. Are you suggesting that robots are hitting PPC links and that Overture, Espotting etc are charging for the clicks?
One odd thing about some kinds of redirected URLs is that they actually get into the index as though they were themselves pages, but they have no content, so Google cannot list any description with the link: it purports to be a page but is really a URL which ultimately links to a page. If Googlebot had followed it properly, rather than stopping and indexing the URL as a page, it would have some content to add to the index.
I don't get the point. How exactly do they prevent Googlebot from following the links?
>>I dunno, but this is completely different than a couple days ago. Minimally +a is worthless for any comparison or speculating now.
Maybe because G intentionally made it so webmasters would be even more baffled? Just a guess, but it seems in keeping with the rest of G's actions as of late.
"Finally keyword +keyword... what the heck should this do anyway? "
From what I am seeing, it looks like this query kicks out some pages that got their SERP position because of synonyms. keyword +keyword seems to prefer pages that use the keyword itself and not a synonym of it.