
Google News Archive Forum

Post Austin SERPS Starting to Improve
Keep turning the crank back google
customdy




msg:211403
 1:54 am on Feb 5, 2004 (gmt 0)

In the last few hours I have noticed a moderate improvement in keywords that were heavily filtered in Florida and Austin. One of my competitors, whose domain name is keyword1_keyword2, is now back in the top 10; he was gone in both Florida and Austin, and it doesn't look like he made any changes.

We are now back to page #2 or #3 on most two-word searches. We have reduced keyword density, but I think it is more of a tweak that Google is doing.

Keep it coming, Google.

 

pavlin




msg:211463
 11:04 am on Feb 6, 2004 (gmt 0)

OK, I do not believe that this is about pages anymore. It is in fact about sites! But maybe we would have to give that word a new meaning.

Until now, when I searched for a keyword, the first results pointed to pages dedicated to the keyword. Now they point to sites that merely mention the keyword.

When I search for a green widget, the first results will probably contain:
1. Link to the Google directory
2. Link to DMOZ
3. Article on a newspaper or TV site saying somewhere in the text "She was wearing a green widget tonight"
4. Link to a page on a large online retailer (in most cases not exactly the page selling the green widget, but a page that links to it)
5. Link to a large forum, where there is a post by a member nicknamed green widget

I think we should not mix up the meanings of the words site and domain. But anyway, the sites are the kings now.

The king (read: content) is dead! Long live the king (authority)!

.....

The main problem with linking is this: until now, when you placed a link on page A to page B with anchor text green widget, G showed page B when the user searched for green widget. Now it shows page A.
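
A minimal toy sketch of that reversal - the page names, links, and scoring below are invented purely for illustration, and this is not Google's actual algorithm:

links = [
    # (source_page, target_page, anchor_text)
    ("page_a.html", "page_b.html", "green widget"),
    ("directory.html", "page_b.html", "green widget"),
]

def rank(query, credit_source=False):
    # credit anchor text to the target (old behaviour) or to the source (new behaviour)
    scores = {}
    for source, target, anchor in links:
        if query in anchor:
            page = source if credit_source else target
            scores[page] = scores.get(page, 0) + 1
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(rank("green widget"))                      # old: page_b.html wins
print(rank("green widget", credit_source=True))  # new: the linking pages win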

[edited by: pavlin at 11:52 am (utc) on Feb. 6, 2004]

djgreg




msg:211464
 11:09 am on Feb 6, 2004 (gmt 0)

steveb:
it is obvious that these searches bring completely different results:
keyword +a
searches pages containing keyword and 'a'
keyword +www
searches pages containing keyword and 'www'
keyword +keyword
searches pages containing keyword and again keyword

pavlin:
In my area there are no DMOZ/Google Directory or other directories in the top SERPs. Indeed the SERPs are pretty relevant, although my sites are gone for no reason ;-)
By the way: as subdomains are usually seen as autonomous domains, maybe it is not a good idea to split your content across several subdomains? Maybe Google regards each subdomain as a different "domain", and suddenly your website has only one page left because you have divided the rest into subdomains?

greg

steveb




msg:211465
 11:23 am on Feb 6, 2004 (gmt 0)

"that's been the case since Austin went live"

"it is obvious that these searches bring completley other results"

The presumption is that all pages have "a", "www", and the keyword meaninglessly repeated, so the SERPs should be about the same. And they have been.

What has changed now, at least on whatever datacenter I'm getting, is that the search for +a is wildly different from what it has been over the past weeks. Previously the +a would bring up what people would call "unfiltered" results. What they were doesn't matter. The point is that what is showing now has no relationship at all to what was showing for +a 24 hours ago. These +a results are different from anything before, and also completely different from +www and +keyword -- and when I say completely, I mean there aren't five sites in the top 100 that overlap. It is very, very strange.
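
A small sketch of the comparison being made here - counting how many results two query variants share in their top 100; the function and inputs are illustrative only:

def overlap(results_a, results_b, top_n=100):
    # how many URLs the two result lists share in their top N
    return len(set(results_a[:top_n]) & set(results_b[:top_n]))

# e.g. overlap(results_for("keyword +a"), results_for("keyword +www"))
# the observation above corresponds to a value below 5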

[edited by: steveb at 11:26 am (utc) on Feb. 6, 2004]

steveb




msg:211466
 11:25 am on Feb 6, 2004 (gmt 0)

"If you want to sell everything to everybody you won't sell anything to anybody..."

Um, no. Amazon ain't hurting.

Hissingsid




msg:211467
 11:35 am on Feb 6, 2004 (gmt 0)

I don't know what to say about this, except I'm stunned.

Do these searches:

keyword +a
keyword +www
keyword +keyword (use the same word twice)

Hi Steve,

I thought you had seen this before.

In my case, if I search with +a or +www appended to my term, I pop back to #1 or #2 where I was pre-Florida, even though I've made big changes to my pages and site.

keywords +keywords brings back something very similar to current SERPs here in the UK.

I think that you know my keywords, go take a look.

<rant starts>
This algo isn't just immature, it's not been carried to full term and is premature. At the moment they've got it in an intensive-care neonatal unit on mechanical ventilation and 24-hour nursing.

It is bad enough that a search engine with 80% of the search market launched a new, untested algo on the world in November; rolling it out to a much wider range of terms in January, before they had even fixed the massive holes in it, is almost unforgivable. I can only think they believed we were crying "wolf" in November and December, and that someone had taught the whole of the GooglePlex to chant the mantra "Webmasters are only complaining because their site got dropped; the algo is good, just close your ears and the noise will die down".

I'm not saying that the "a baby could cheat the system" algo that existed pre-Florida was anywhere near perfect, but this current one is several orders of magnitude worse than it was.

Wasn't Google wonderful when it didn't carry ads and presented a nice utopian view of the WWW? Somehow the complete focus on one deadly sin leads to exactly what one might predict from such a path.
</rant>

Gets off soap box, takes deep breath and tries to get on with work.

Best wishes

Sid

Hissingsid




msg:211468
 11:39 am on Feb 6, 2004 (gmt 0)

Hi Steveb,

I see that you are referring to a specific new event, not what was being seen previously using +www etc. Perhaps the techies at the GooglePlex are trying to stop our playthings rather than focusing on what is really broken in the algo.

If blocking these Boolean searches resulted in the rest of the SERPs coming right, I would be the first to applaud them, but I don't think fixing the SERPs is their motivation.

Best wishes

Sid

valeyard




msg:211469
 12:04 pm on Feb 6, 2004 (gmt 0)

keyword +a
keyword +www
keyword +keyword (use the same word twice)

These put up completely different results from each other and from anything ever seen on this planet before.

Here's my pure speculation:

If the user searches for "blue widget +wotzit" then they really, really, really want to know about wotzit. So give a big algorithmic boost to that word.

Yes, almost all pages use "a" and "www", but probably at different positions, densities, etc. Could that be causing this?

Are you effectively asking Google for "Pages about the word 'a', and by the way I'm interested in 'keyword' as well"?

edit: clarified that I'm the one speculating!

[edited by: valeyard at 12:40 pm (utc) on Feb. 6, 2004]

pavlin




msg:211470
 12:15 pm on Feb 6, 2004 (gmt 0)

djgreg:
I guess you are using the German version of Google, which is (almost?) unaffected by the update.
There are no problems with the subdomains. G is using lots of other signals to recognize a site - combinations of pages that are similar in some way across different domains and subdomains.

Kennyh




msg:211471
 12:22 pm on Feb 6, 2004 (gmt 0)

Pavlin, re: 'I guess you are using the German version of Google, which is (almost?) unaffected by the update.'

Not true; if it were, the results on .de would be the same as they were pre-Austin. They're not. It's true that .de and the other European versions of Google have shown different results from .com over the last couple of weeks, but that doesn't mean they've been unaffected. In fact, the results on .de and the others have been growing closer to .com over the last week or so.

pavlin




msg:211472
 12:31 pm on Feb 6, 2004 (gmt 0)

I have experimented with the .de results for an hour today, and from what I saw the algo there is different. Maybe there have been some changes, but the rules that apply are not the same as on the .com's. And the German results really do make sense.

Kennyh




msg:211473
 12:37 pm on Feb 6, 2004 (gmt 0)

None of that means that the .de results are 'unaffected', just that they are different, as has been pointed out many times in this and other threads. And while the .de results make more sense to you, to me they're just as bad and irrelevant.

webdude




msg:211474
 12:44 pm on Feb 6, 2004 (gmt 0)

For those that are seeing some improvement, have you made any changes?

See this thread

[webmasterworld.com ]

pavlin




msg:211475
 12:51 pm on Feb 6, 2004 (gmt 0)

I see no point in arguing about G.de's SERP relevancy.
The main question is whether this algo will stay or not. If it does, I should forget about making meaningful sites with quality content and start doing balloon sites - with lots of pages and lots of links randomly displayed.
And of course install some forums and then post in them.

Hissingsid




msg:211476
 12:52 pm on Feb 6, 2004 (gmt 0)

Yes, almost all pages use "a" and "www", but probably at different positions, densities, etc. Could that be causing this?

Are you effectively asking Google for "Pages about the word 'a', and by the way I'm interested in 'keyword' as well"?

The strange thing is that adding +www or +a puts back the pre-Florida results, and pages that were inexplicably dropped go back to the top. What is interesting is the correlation between doing this search and pages returning to where they really should be - not just mine, but many more that I've looked at.

I'm not sure that the way you are describing how Google produces the results for these searches is right.

This is from the Google Advanced search help page, quoted here for the purposes of education only.
"" + " Searches
Google ignores common words and characters such as "where" and "how", as well as certain single digits and single letters, because they tend to slow down your search without improving the results. Google will indicate if a common word has been excluded by displaying details on the results page below the search box.

If a common word is essential to getting the results you want, you can include it by putting a "+" sign in front of it. (Be sure to include a space before the "+" sign.)

It seems to me that what it is doing is finding pages that rank highly for the term which also include a very common word, so common in fact that almost every page includes it.
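
A rough sketch of the mechanism the quoted help page describes - common words dropped from the query unless forced back in with a leading "+"; the stop-word list below is invented for illustration:

STOP_WORDS = {"a", "the", "www", "where", "how"}

def parse_query(raw):
    # drop common words unless the user forces them with a leading "+"
    terms = []
    for token in raw.split():
        if token.startswith("+"):
            terms.append(token[1:])          # forced inclusion
        elif token.lower() not in STOP_WORDS:
            terms.append(token)
    return terms

print(parse_query("keyword a"))    # ['keyword'] -- "a" is dropped
print(parse_query("keyword +a"))   # ['keyword', 'a'] -- "a" is kept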

Why is it always the case (with minor movements) that when you do this the relevant pre-Florida SERPs reappear? If you can answer that without being dismissive, I will be very interested to read the answer.

Best wishes

Sid

Essex_boy




msg:211477
 12:57 pm on Feb 6, 2004 (gmt 0)

Strange, this: I have a site that was knocked out in May listed again in parts.

However, it's entirely made up of affiliate links, which I thought Google was trying to get rid of.

Now look me in the eye and tell me it's working fine.

Chelsea




msg:211478
 1:01 pm on Feb 6, 2004 (gmt 0)

Sssid,

Ref: +a, +www, and the older -nonsense

These effects have certainly, and I think with some justification, been used as arguments for the existence of some sort of 'filter'. The idea is that a Boolean search somehow confuses whatever 'filter' system is in place and returns the old results.

This is probably the simplest explanation, but of course, despite what Occam said, the simplest explanation isn't *always* the true one.

(And of course, there's a newer principle: if it is critical of Google, it must be a conspiracy theory :)

Or as Des-shopping-cartes famously wrote:

"I link therefore I'm spam"

hee hee :)

Kennyh




msg:211479
 1:19 pm on Feb 6, 2004 (gmt 0)

'Or as Des-shopping-cartes famously wrote:

"I link therefore I'm spam"'

Uh-oh Friday gag alert! ;-)

Sssid - FWIW, the Boolean searches on the keywords I'm monitoring don't give pre-Florida results. They do give results which are different from regular searches post-Austin, but not as different as they were a week ago. Mysteriouser and mysteriouser...

Tomseys




msg:211480
 1:42 pm on Feb 6, 2004 (gmt 0)

All I know is that a few months ago, I could find pretty quickly what I wanted to find on Google. Now when I do a search, it returns bunches of directories. And directories that have different topics on the same page. Like a big mish-mash of results within results. Some of the directories have hundreds of keywords in a faint font at the bottom of the page, like the porn pages do.

There is no information in these results, only buy this or buy that, poorly presented.

Chelsea




msg:211481
 1:50 pm on Feb 6, 2004 (gmt 0)

Has anyone got any theories about precisely *why* these pages with hundreds of unrelated outgoing links are performing so well in this daft algo?

glengara




msg:211482
 2:02 pm on Feb 6, 2004 (gmt 0)

You did ask.....

According to My Theory, we are seeing G try to identify topical "experts" using the Hilltop parameters.
This means directories, links pages, anything that looks like a resource.

As "expert" pages are not supposed to be the ones returned, I can only assume the process is a slowly evolving one.

Chelsea




msg:211483
 2:10 pm on Feb 6, 2004 (gmt 0)

But it's the apparent unrelatedness of the outgoing links that puzzles me.

I was doing a job search in the UK today on G, and one of these pages was the top result. Just hundreds of outgoing, barely related links with a search box at the top (great, let's search twice ;). But the inner search provided nothing on the original search topic, despite the page being #1 of 500k listed by Google.

Maybe Des-shopping-cartes was right :)

pavlin




msg:211484
 2:32 pm on Feb 6, 2004 (gmt 0)

I still believe that the core of the problem is that G is reversing the way it treats links. As I said, if you place a link on page A to page B and the text of the link is keyword, G now serves page A when a user searches. That's why all of the directories are now at the top. I experimented with this using some of the links on my sites.
I guess this is a way to identify those "hubs", but it seems G doesn't know what to do with them.

Chelsea




msg:211485
 2:44 pm on Feb 6, 2004 (gmt 0)

As I said, if you place a link on page A to page B and the text of the link is keyword, G now serves page A when a user searches.

There's definitely a ring of truth to this - it even explains why a description of a page can now outrank the page itself.

Totally unsustainable, though. Let's say I link out to the top 100 in my SERPs, so now I get into the top 100 - then everyone links to the top 100, including me, so that they can get into the top 100. Before you know it, none of us is in the top 100; instead there are 3.5 billion directories all linking to each other, all of equal status, and no one can find any original content anymore :)

Illustrative only of course; but you can see that it could turn into a runaway process.
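
A deliberately crude simulation of that runaway process - every number below is made up purely for illustration:

content_pages, directory_pages = 1000, 10

for generation in range(5):
    converts = content_pages // 2           # assume half the remaining content sites copy the winning strategy
    content_pages -= converts
    directory_pages += converts
    print(generation, content_pages, directory_pages)

# content pages dwindle while directories multiply -- the "3.5 billion
# directories all linking to each other" end state, in miniature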

But this won't matter, because surfers will turn away from Google if they only serve up directories.

[edited by: Chelsea at 2:59 pm (utc) on Feb. 6, 2004]

valeyard




msg:211486
 2:44 pm on Feb 6, 2004 (gmt 0)

Why is it always the case (with minor movements) that when you do this the relevant pre-Florida SERPs reappear? If you can answer that without being dismissive, I will be very interested to read the answer.

Last thing I want to do is be dismissive, I'm just bouncing ideas around.

You're right about the use of a plus; I was thinking of other SEs where, by default, not all the keywords have to be on the returned pages.

Here's a guess: if Google is now using something like Hilltop or whatever to get recommendations from authorities, this might simply not work with common words. Who's an authority for "a"? Thus it falls back to the good old algorithm we know and love.

This could also explain why some queries have been hit harder than others. If the new algorithm can't identify sufficient authorities/hubs for a term then it uses the old algo.
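
valeyard's guess above, coded up as a minimal sketch - an authority-based ranking that falls back to the old ordering when too few authorities can be found. The threshold and both ranking functions are placeholders, not anything Google has confirmed:

MIN_AUTHORITIES = 3

def rank(query, find_authorities, old_rank, new_rank):
    authorities = find_authorities(query)
    if len(authorities) < MIN_AUTHORITIES:
        # a common word like "a" or "www" has no meaningful authorities,
        # so the query drops back to the familiar pre-Florida ordering
        return old_rank(query)
    return new_rank(query, authorities)

results = rank(
    "keyword +a",
    find_authorities=lambda q: [],                   # nothing authoritative for "a"
    old_rank=lambda q: ["old-style results"],
    new_rank=lambda q, auth: ["authority-weighted results"],
)
print(results)  # ['old-style results']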

Hissingsid




msg:211487
 3:08 pm on Feb 6, 2004 (gmt 0)

Last thing I want to do is be dismissive, I'm just bouncing ideas around.

Hi,

Sorry I'm just being a grumpy forty something ;)

This +www or +a kind of gives me hope. That, coupled with the fact that it's not just my site/pages that have been dropped but millions of others too - and, even more important, the fact that the pages that have replaced them are not relevant or are auto-generated directories that just p*ss people off - makes me think that things have to change.

If they don't change, I think I'll move into auto-generated directories of SERPs from a decent search engine and put AdSense ads on them to generate some income.

Every silver lining has a cloud!

Best wishes

Sid

pavlin




msg:211488
 3:14 pm on Feb 6, 2004 (gmt 0)

Chelsea:
That's the real problem, I guess.
No matter how G changes the algo, there will always be ways to spam and cheat. But if they continue with this one, the web will soon be full of those balloon sites I was describing earlier in this thread.

There is a basic rule in life - if it works, do not fix it!
It should be written on the wall of every Google office, I guess!

otnot




msg:211489
 4:01 pm on Feb 6, 2004 (gmt 0)

The first thing I noticed with +a and +www is that who is listed in AdWords changes, as does the number of ads showing. Whether this has anything to do with the natural results is anyone's guess.

Net_Wizard




msg:211490
 4:05 pm on Feb 6, 2004 (gmt 0)

>No matter how G changes the algo, there will always be ways to spam and cheat.<

You're right on the money. No algo is perfect, and it will always be subject to exploitation. The closest to a near-perfect algo we have had is the pre-Florida algo, IMO; at least content sites and ecommerce sites coexisted in balance.

tribal




msg:211491
 4:32 pm on Feb 6, 2004 (gmt 0)

Hey guys, is it me or are the results back to pre-Austin? I checked some sites, and the results looked good - very good. I see the same, or maybe even better, positions than I had pre-Austin.

valeyard




msg:211492
 4:35 pm on Feb 6, 2004 (gmt 0)

Sorry I'm just being a grumpy forty something ;)

I know the feeling only too well!

This +www or +a kind of gives me hope.

Yep. It means the old, quality-based algo is still there somewhere. So all we have to do is either persuade Google to admit they were wrong or teach every user in the world to add "+www" to their searches.

The latter will probably be easier. I've made a start but might need just a little help :-) C'mon you journos reading this, you could have an exclusive!

If they don't change, I think I'll move into auto-generated directories of SERPs from a decent search engine and put AdSense ads on them to generate some income.

You, me, and many others who used to wear white hats. If Google thought three thousand million pages was a lot to index, they ain't seen nothing yet.

Hissingsid




msg:211493
 5:15 pm on Feb 6, 2004 (gmt 0)

No matter how G changes the algo, there will always be ways to spam and cheat. But if they continue with this one, the web will soon be full of those balloon sites I was describing earlier in this thread.

Moving deck chairs around on the Titanic!

Can you sink a search engine?

Best wishes

Sid

VERY Long PS
Has anyone noticed that links that lead somewhere, but not to anywhere Googlebot can go, seem to be given the same weight in this algo as real, open links? Most of the outbound links from these crappy directories go via some form of script with a long query string in the URL. As a side issue, there's even one of these URLs listed in the SERPs above me which redirects to my site.

The main point is that Googlebot seems to note that there are links to somewhere with the right anchor text in them, but it can't be checking where they are going; it just uses them to feed the ranking algo, irrespective of whether they are blind alleys or not.

There seem to be two kinds of these links: ones to a script on the current domain (or an associated domain), with a query string which is fed into the redirection script, and ones on the affiliate domain, where the query string tells the click counter at the other end who sent the referral. Both of these kinds of link seem to be counted by Google in this new algo.

Perhaps this is specifically the new spam. If this is about DomainPark, that makes sense, because all of the links on pages generated by Google DomainPark would be this kind of link but would have the right keywords in the anchor text.
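
A rough sketch of the two link shapes described above - anchor text pointing at a redirect or click-counting script rather than a real destination; the URLs, file extensions, and parameter names are invented for illustration:

from urllib.parse import urlparse, parse_qs

def looks_like_redirect(href):
    # heuristic: a script-style path whose real target hides in the query string
    parsed = urlparse(href)
    query = parse_qs(parsed.query)
    script_like = parsed.path.endswith((".cgi", ".php", ".asp"))
    has_target_param = any(k in query for k in ("url", "goto", "id", "ref"))
    return script_like and has_target_param

# 1) redirect script on the directory's own (or an associated) domain
print(looks_like_redirect("http://directory.example/go.php?url=widgets"))   # True
# 2) affiliate-style click counter on the other end's domain
print(looks_like_redirect("http://merchant.example/track.cgi?ref=12345"))   # True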

What do you think?

Best wishes again

Sid
