
Google SEO News and Discussion Forum

October 2007 Google SERP Changes
stevenjm




msg:3465616
 4:00 am on Oct 1, 2007 (gmt 0)

< continued from [webmasterworld.com...] >

Backlinks have indeed been devalued, though selectively.

.gov backlinks are still strong, as are the Yahoo directory and other large authority-type sites.

The backlinks that are still showing up all seem to have one thing in common: they also have the URL in text. That suggests the text URL is now very strong, since it outweighs the negative effect of the actual backlink on the same page.

It's no secret that Google has been going after paid linking, and I think this is part of their strategy to tackle it. Where it will lead is anybody's guess.

I think you are correct about which industries are affected by search volume.

Taking it further: although the text URL is the obvious exploit of the moment, it's also possible the algo is correlating the numbers between what IT considers a legitimate link (from its list of sources) and links from other sites, and at a certain point is assuming paid linking or heavy SEO and imposing a backlink penalty.

I don't think it's about "juice" to good links. I think it's about a penalty for bad links, and the ratio of good to bad links.
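
To make the ratio idea concrete, here's a minimal Python sketch (purely illustrative - the trust list, the classifier, and the 0.5 threshold are my own assumptions, not anything Google has published):

# Hypothetical good/bad backlink ratio penalty.
TRUSTED_SOURCES = {".gov", ".edu", "dir.yahoo.com"}  # assumed "good" sources

def is_good_link(source_url: str) -> bool:
    """Crude stand-in for whatever trust list the algo might use."""
    return any(marker in source_url for marker in TRUSTED_SOURCES)

def backlink_penalty(backlinks: list[str]) -> float:
    """Return a multiplier applied to link value: 1.0 means no penalty."""
    if not backlinks:
        return 1.0
    good = sum(1 for url in backlinks if is_good_link(url))
    ratio = good / len(backlinks)
    # Below some good-link ratio, assume paid linking / heavy SEO and dampen.
    return 1.0 if ratio >= 0.5 else ratio / 0.5

links = ["http://example.gov/page", "http://cheap-links.example/1",
         "http://cheap-links.example/2", "http://cheap-links.example/3"]
print(backlink_penalty(links))  # 0.5 - only half the link value survives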

[edited by: tedster at 7:38 pm (utc) on Oct. 1, 2007]

 

kamikaze Optimizer




msg:3470571
 7:56 am on Oct 6, 2007 (gmt 0)


It would just make sense that if a link leading to a page disappears, and no other links leading to it have been crawled (yet), the page should not return in the SERPs. This would affect blogs most, since they link from their front-page articles a lot and have the "similar article" links that are always changing.

[edited by: kamikaze_Optimizer at 8:00 am (utc) on Oct. 6, 2007]

adfree




msg:3470587
 9:10 am on Oct 6, 2007 (gmt 0)

Understood. Disadvantaged at best.

It is a blog with a comments function (allowing anyone to comment), so there is a chance for updates by the public.

I like the update box; maybe I'll use it to point to similar writings and on-topic news.

That'll do just fine without having to touch the copy and change the character of the presentation.

Thanks bunches!

followgreg




msg:3470678
 2:10 pm on Oct 6, 2007 (gmt 0)

I am still totally baffled by some websites that continue to keep their rankings in some industries.

There are websites now ranking for mid-competitive keywords based on factors definitely originating from a very dark place.
In a notable industry I have watched closely for a long time, backlinks from pharmacy-type sites (homepage links!) rank you among the authorities.

This is really looking for trouble, with spammers and text-link buyers all over the place.

The plural results in our industry are the worst ever, making sense one time out of 5 or 10, except eventually for the so-called authorities.

The geolocation amazes me. Regular national/international US sites are replaced by foreign sites on Google.com but still rank on some other Google TLDs...

I don't understand their geolocation procedure.
A website that says "whatever country" widget, with domain registration in "whatever country", is logically less likely to interest US users.

At the same time, text-link buyers live in paradise: they rank exactly where they have chosen to. I really feel like I've wasted my time reporting all types of scams and textlinks-only rankings.

Blogs are still king of the hill. I don't get this one either. From where I stand, 99% of blogs are as good as Geocities personal pages used to be. Blogrolls and other link schemes, such as creating multiple domains or simply having a blog on a separate domain, bump all sites into the top 20/30.
Any cr.p looks like good Google food as long as it is fresh, whatever the relevancy, whatever even the language...

Half of the web seems to have gone supplemental. No improvements whatsoever in text-link hunting, and certainly not in irrelevant backlink schemes.

Yet not all searches have been affected over the past 3-4 weeks. Quality has degraded only on the not-so-competitive searches.

To tell the exact reality: I find Google SERPs looking like Live.com. Not sure if Google knows what that means, but to me Google has complicated their algorithm so much that it filters out all but cheap promotion on usually unnoticeable websites.

-----

gehrlekrona




msg:3470694
 2:55 pm on Oct 6, 2007 (gmt 0)

followgreg,
I am totally with you on this one! In my area I see tons of sites showing up from out of nowhere. I also see a lot fewer results for many of the searches I do; results that used to show hundreds of thousands of pages are now in the tens of thousands instead. For the searches I do, like for widgets for sale, the results haven't changed a lot: the same sites (except mine) show up. But as soon as I add a location, I get all sorts of cr@p: blogs, sites from wherever, eBay listings, and tons of pages from sites with location subdomains.
It seems that they can't get the location searches right and that their usual filters don't apply anymore.
I still see a lot of Chinese spam sites, even with the infamous .[space]cn.
As far as sites being in supplemental, I also agree. I have been checking sites, and lots of them have tons of pages in supplemental. Most of my pages have gone supplemental for some odd reason, and I have spent hour after hour trying to figure out why. Hijacked? Duplicates? Spelling errors? Other penalties? Not fresh enough? Links changed/added?
Has anyone here "recovered" lately?

Errioxa




msg:3470699
 3:00 pm on Oct 6, 2007 (gmt 0)

The .cn page (page A) automatically adds links to other pages (B): comments in blogs, forums, search boxes, etc.

Simultaneously, other pages (C) are spammed by the .cn to link to other people's pages (B), into which the link has been introduced with the same keyword. With this, they obtain similar content and (B) gains more authority. They camouflage the links among .edu and authority sites.

Scheme:

B links to A (our web)
C --> A, B
D --> A, B, C
E --> A, B, C, D
F --> A, B, C, D, E
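
To see why the pyramid works, here's a tiny Python sketch (illustrative only - the page names and graph shape come straight from the diagram above) that builds the scheme and counts inlinks:

# Sketch of the spam pyramid described above: each new page links
# to every page created before it, so early pages pile up inlinks.
from collections import defaultdict

pages = ["A", "B", "C", "D", "E", "F"]
links = defaultdict(set)

links["B"].add("A")  # B links to A (our web)
for i, page in enumerate(pages[2:], start=2):  # C, D, E, F
    links[page] = set(pages[:i])  # each links to all earlier pages

inlinks = defaultdict(int)
for src, targets in links.items():
    for dst in targets:
        inlinks[dst] += 1

for page in pages:
    print(page, inlinks[page])
# A gets 5 inlinks, B gets 4, ... the target of the scheme wins.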

[edited by: Errioxa at 3:01 pm (utc) on Oct. 6, 2007]

KanukBoy




msg:3470901
 10:49 pm on Oct 6, 2007 (gmt 0)

Here are 2 observations. I apologize if this is old news...

Observation 1...

I am seeing dynamic pages with parameters in the URL get beaten by "static" pages.

For example, static pages like:
- domain.com/widgets-countryX.htm
- domain.com/countryX/widgets.php

are beating out dynamic pages like:
- domain.com/widgets.php?country=12

Personally, my own dynamic pages have almost disappeared for the keyword CountryX Widgets. The Google toolbar shows gray PR. Only my index page, which is static and links to all my dynamic pages, appears in the SERPs in any meaningful way for CountryX Widgets, and it ranks very poorly.

Almost all of the top 30 CountryX Widgets SERPS are static pages.
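
For what it's worth, the rewrite that maps the dynamic shape onto the static shape is trivial; here's a minimal Python sketch (the country lookup table and URL shapes are made up for illustration):

# Illustrative sketch: map the dynamic URL shape described above
# onto a static-looking path. The country table is hypothetical.
import re

COUNTRY_BY_ID = {"12": "countryX"}  # assumed lookup table

def to_static(url: str) -> str:
    """domain.com/widgets.php?country=12 -> domain.com/countryX/widgets.php"""
    match = re.match(r"(?P<host>[^/]+)/widgets\.php\?country=(?P<id>\d+)", url)
    if not match:
        return url  # already static, or an unknown shape
    country = COUNTRY_BY_ID.get(match.group("id"), "unknown")
    return f"{match.group('host')}/{country}/widgets.php"

print(to_static("domain.com/widgets.php?country=12"))
# domain.com/countryX/widgets.php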

Observation 2...
The SERPs seem to be messed up only in the searches where Google suggests search refinements.

Anyone else seeing this? Are these old phenomena?

g1smd




msg:3470908
 11:16 pm on Oct 6, 2007 (gmt 0)

Stuff with parameters usually has other underlying internal duplicate content issues caused by poor design.

Once those problems are overcome (and for sites that I know have overcome those difficulties) I don't see a ranking problem.

nency




msg:3471315
 8:31 pm on Oct 7, 2007 (gmt 0)

I noticed one huge thing in the Google results.

For a few keywords I was in the top 5 for the last 2 years, and a few weeks ago I dropped on google.com, but only if you are checking from the USA. If you check from an EU country, I am still in the top 5.

followgreg




msg:3471772
 1:29 pm on Oct 8, 2007 (gmt 0)

There's been an additional switch turned on between yesterday and today, I think. It's getting even worse on google.com.

I don't know if they've messed up their geolocation on purpose or not, but there are rather ridiculous search results.

Examples of sites starting to rank on Google.com, according to Yahoo Site Explorer:

>site 1: 10 incoming links, including 5 from Poland, 2 from India, and 3 from totally unrelated sites in quite suspicious industries (casino and the like...)

>site 2: 15 incoming links, including 4 scrapers of previous MSN rankings and 11 links from lowest-end general directories full of AdSense and text-link networks.

I noted a total of 7 sites similarly "relevant" for some searches. I am talking here about competitive (as far as paid clicks go, yet a low number of searches) 2-5 word queries.

I've seen that before, but only on MSN.

So if my understanding is correct, my choices are now: click on the authority site or a Google partner (eBay, about.com, ...), take a chance on buying from the other side of the galaxy, or click on paid links?

I am very surprised that Google ends up with such a "thing".
It's doubtful that they did not notice. This has been going on for what, 3-4 weeks, and it started much earlier at a lower degree.

I am also disappointed in webmasters here and elsewhere who probably see their own short-term interest and do not raise any alarms to stop this debacle. Look around a little bit, people: search for medium-traffic-volume keywords and see what you are given along with your usual authority sites.
Then compare to MSN for the fun of it.
WE also have sites that benefit from whatever Google is doing right now; I never thought they would rank for anything competitive, since they were never promoted, yet they do. But I think nobody will benefit from more foreign sites and cheaply SEO'd results along with 1-voice-only authorities (or so-called). What is the point, really? Does Google have technical issues, I mean real huge ones? Or is this all a giant test in which we are all guinea pigs for a month? Seriously!

------

whitenight




msg:3471785
 1:42 pm on Oct 8, 2007 (gmt 0)

hmm. I've been calling for a naming of this "Universal Update" :cough: for 3 weeks now.

Saw a nice news report on the BBC today explaining how Goog (and the others) had better get their "universal search"/Web 2.0 results together and more accurately display what searchers are really looking for.

And guess what: that's what they're trying to do.
Unfortunately, Goog has a nasty habit of being in perpetual "beta" and rolling out their testing to everyone.

Long story short - Goog knows Web 2.0 will eventually take Goog market share, so it's trying its darnedest to implement Universal Search ASAP to keep up. As always, those changes have interesting side effects.

europeforvisitors




msg:3471958
 5:06 pm on Oct 8, 2007 (gmt 0)

Long story short - Goog knows Web 2.0 will eventually take Goog market share, so it's trying its darnedest to implement Universal Search ASAP to keep up.

Universal Search has nothing to do with "Web 2.0" and everything to do with user education and common sense. If you were expanding your site from, say, widgets to widgets plus whatsits and thingamabobs, wouldn't you want your widget users to know that your site now covers whatsits and thingamabobs as well? And wouldn't you make it as easy as possible for those existing widget users to find your content on whatsits and thingamabobs?

whitenight




msg:3471970
 5:18 pm on Oct 8, 2007 (gmt 0)

Universal Search has nothing to do with "Web 2.0" and everything to do with user education and common sense

To paraphrase the report ... Google is failing to understand what searchers are looking for.
(see 1-word search threads)

Web 2.0 properties which are user-generated more intuitively target what a searcher is looking for because.. ummm... err... uhh.. oh yeah!, because it's generated by people -
not a stale outdated education-based algo from 10 years ago.

Last time I checked, universal search incorporated local search, user-generated videos, user-picked news topics and gee, Goog sure seems to like user-generated Wiki for their results.

The searchers are educating Goog, not the other way around.

[edited by: tedster at 5:50 pm (utc) on Oct. 8, 2007]

europeforvisitors




msg:3472010
 5:54 pm on Oct 8, 2007 (gmt 0)

Sure, and Google's search results incorporate "user-generated content," including sites like yours and mine. But that was happening a long time before Google Universal Search and the October 2007 Google SERP changes.

whitenight




msg:3472012
 5:59 pm on Oct 8, 2007 (gmt 0)


Vertical search is important because it's one of the two major things I've long talked about as being how search will advance. First generation search analyzed words on a page to rank content. Second generation search tapped into link analysis. Third generation search to me is looking at both user input (what we visit; what we click on; personalized results) and making search go more vertical.

Google 2.0 - Google Universal Search [searchengineland.com]
(for your reading pleasure)
Funny that title of the page, no?

More educational reading for those who like to be informed.
[en.wikipedia.org...]

Robert Charlton




msg:3472134
 7:51 pm on Oct 8, 2007 (gmt 0)

To get off this semantic discussion and back on topic about SERP changes...

I'm seeing some movement in sites that I'm very familiar with that I can only attribute to more weight being assigned to onpage factors. It may be temporary... I just noticed it this morning. Anyone else seeing this?

Arctrust




msg:3472169
 8:29 pm on Oct 8, 2007 (gmt 0)

Robert:

I have seen the same - more weight to on-page factors.

It seems that up to a few weeks ago, the winning on-page profile was roughly:
Least Density for a KW with Highest Prominence

Now it seems to have changed to:
Highest Density with Highest Prominence
(this is a pretty hard target to hit if your site is not spammy)

Naturally there are other factors (G claims there are about 200) that get factored into the algo, but for the KWs I check... this is what the highest-ranking sites ALL have in common, except for Wiki and Answers, which seem to have received a helping hand from G.

ARC

[edited by: Arctrust at 8:35 pm (utc) on Oct. 8, 2007]

tedster




msg:3472211
 9:36 pm on Oct 8, 2007 (gmt 0)

Bill Slawski just posted - for the second time - about a Google patent application [seobythesea.com] filed in July 2007. The patent is about Authority Document Identification [appft1.uspto.gov], and it details an approach where one authority document on a domain can boost other URLs on the site, even if the specific search terms are not on-page.

If this approach is what is kicking in - and it's certainly possible, given that the patent is associated with localization - then Google would be folding "on-site" factors into their algo. So in making sense of the current shifts, we might have on-page, on-site, and off-site factors to consider. Bill considers how this same technology could extend far beyond "location signals".

In fact, this could account for something I'm seeing. Some urls seem to be getting a rankings boost when the entire site is predominantly about the search term, compared to other urls where the entire site is not so precisely focused.

zjacob




msg:3472375
 2:35 am on Oct 9, 2007 (gmt 0)

Apart from all the other speculation about what is going on with the SERPs, my observations suggest that there are things going on that go beyond traditional SEO.

Rather, many of the changes in the SERPs suggest that Google is weighting SERP CTR and the bounce-back rates from those clicks more heavily, similar to what they've increasingly been doing with AdWords in determining a Quality Score.

Maybe their new garbage-identifying mechanism weights this "Quality Score" of your SERP listing more heavily?
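
As a thought experiment only, the kind of listing-level "quality score" being speculated about here might look like this Python sketch (the formula and weights are invented, not anything Google has confirmed):

# Speculative sketch of a SERP "quality score" built from CTR and
# bounce-back rate. The combination is an arbitrary illustrative choice.

def serp_quality_score(impressions: int, clicks: int, bounces: int) -> float:
    """Score in [0, 1]: high CTR is good, bouncing back to the SERP is bad."""
    if impressions == 0 or clicks == 0:
        return 0.0
    ctr = clicks / impressions
    bounce_rate = bounces / clicks
    return ctr * (1.0 - bounce_rate)

# A listing clicked often but abandoned quickly scores poorly:
print(serp_quality_score(impressions=1000, clicks=200, bounces=180))  # 0.02
print(serp_quality_score(impressions=1000, clicks=100, bounces=10))   # 0.09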

slawski




msg:3472414
 4:22 am on Oct 9, 2007 (gmt 0)

Hey Tedster,

In that post, I did bring up the earlier patent application, but it was because a newer one does expand the process involved to areas outside localization:

Propagating useful information among related web pages, such as web pages of a website [appft1.uspto.gov]

Here's the abstract:

Web pages of a Website may be processed to improve search results. For example, information likely to pertain to more than just the Web page it is directly associated with may be identified. One or more other, related, Web pages that such information is likely to pertain to is also identified. The identified information is associated with the identified other Web page(s) and this association is saved in a way to affect a search result score of the Web page(s).

The two share an author, Daniel Egnor, who was the technical lead for Google Local Search when the first patent application was written, and moved over to the algorithmic search team. The documents also share some ideas on how a page could possibly be seen as an authority page. The earlier one focused upon which page might be the best one (an authoritative page) for a business determined to be at a specific location.

The second one discusses localization, but it also explores determining whether a specific page might be an authoritative one for a specific query term, or for a category, as well as a business at a specified location.

The kind of boosting that you describe is detailed in the newer patent filing - folding "on-site" factors into their algorithms.
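
Reading the abstract literally, the propagation step might look something like this rough Python sketch (the site graph, the scores, and the 0.3 damping factor are all invented for illustration):

# Rough sketch of the propagation idea in the patent abstract: an
# authority page's score "spills over" to related pages on the same site.

site_scores = {"/ramada-cincinnati": 0.9, "/rooms": 0.2, "/dining": 0.1}
related = {"/ramada-cincinnati": ["/rooms", "/dining"]}  # assumed site graph

def propagate(scores: dict[str, float], graph: dict[str, list[str]],
              damping: float = 0.3) -> dict[str, float]:
    """Add a fraction of each authority page's score to its related pages."""
    boosted = dict(scores)
    for authority, neighbours in graph.items():
        for page in neighbours:
            boosted[page] = min(1.0, boosted[page] + damping * scores[authority])
    return boosted

print(propagate(site_scores, related))
# /rooms rises to roughly 0.47 and /dining to roughly 0.37, so related
# pages can rank for the authority page's terms without having them on-page.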

tedster




msg:3472855
 4:01 pm on Oct 9, 2007 (gmt 0)

Thanks for sorting that out for me Bill - I was a bit crossed up. I'm wondering if you, or anyone else, is seeing evidence for this kind of boost of a url in the SERPs.

I have one client who keeps getting Google traffic to odd pages that are one click away from an important page. Usually these are long-tail searches, with the "authority" phrase as only part of the complete query. It's hard to nail this down definitively as the result of this patent technology, but it is suggestive. Even landing on a "wrong" page, the traffic is doing OK, because as a rule visitors use the site navigation to check out the "main" page.

If this is that patent in action, I guess the fine tuning and "machine learning" will take a while to kick in effectively.

kamikaze Optimizer




msg:3472869
 4:14 pm on Oct 9, 2007 (gmt 0)

Ted: Evidence... I am seeing it in about 4%-6% of my traffic right now. It was not this way a week or so ago.

I just do not feel that the site I am seeing it on is authoritative, by any means.

slawski




msg:3472883
 4:30 pm on Oct 9, 2007 (gmt 0)

I spent most of the weekend going through the patent application, and really haven't had much chance to explore the SERPs or analyze data.

I do think that this kind of authority might show more clearly in results where there probably should be an "authority"-type site, as in their example of "Ramada Cincinnati."

It's probably a good result that the Cincinnati-specific page on the Ramada site shows up ahead of the TripAdvisor result, or Yahoo Travel, or other travel-related sites. Is that this patent application in action? I don't know for certain.

Robert Charlton




msg:3473060
 7:56 pm on Oct 9, 2007 (gmt 0)

I haven't had a chance to review the patent, but this discussion brings to mind a comment made by Adam Lasnik at the SES San Jose "Meet the Crawlers" panel in Aug 2006. I was struck by it enough to write it down, and I mention it here as it seems germane to the discussion.

Adam said: "Link love to a page may or may not always be appropriate to spread to a whole domain."

Regarding...
...Google traffic to odd pages that are one click away from an important page...

...I remember also an earlier discussion with Marcia, where she'd mentioned observing jumps in ranking for a page which she attributed to inbound links to a page one click away. I thought I had observed the same thing. We didn't know what might explain it, though.

Regarding the increased weight for on-page factors that I reported in my post above, I'm seeing it on several other sites across a fairly wide range of terms. These all happen to be index pages, so I don't think, within these sites at any rate, that the cause is the authority boost the patent covers.

[edited by: Robert_Charlton at 7:58 pm (utc) on Oct. 9, 2007]

kidder




msg:3473231
 10:55 pm on Oct 9, 2007 (gmt 0)

I'm looking at AU results. During the last 24 hours I have seen the most unusual results ever. I have 3 machines on my network; my main workhorse machine gives me a completely different set of results for a search term than my laptop, which is on the same network. I mean way different - the results on my main PC are very, very bad, and a couple of authority sites in my vertical don't even rank for their own domain names. I walk over to my laptop and the authority sites are back in place and the search results are fine. If I check my traffic stats, the numbers are down only slightly, so I am guessing this is not that widespread.

I thought this might clear up overnight but it's still the same today.

Can anyone explain this?

g1smd




msg:3473232
 10:57 pm on Oct 9, 2007 (gmt 0)

I see, for one site, page counts of 244,000, or 550,000, or 650,000, depending on which Class C block I look at.

I see many other sites with similar anomalies.

Things are all over the place.
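
For anyone who wants to compare counts across datacenters themselves, a rough Python sketch might look like the following (the IPs are placeholders, not real datacenters, and the "of about N" regex assumes the results-page wording of the time - adjust both before relying on it):

# Rough sketch for comparing site: result counts across datacenter IPs.
import re
import urllib.parse
import urllib.request

DATACENTER_IPS = ["203.0.113.1", "203.0.113.2"]  # placeholder IPs

def result_count(ip: str, query: str) -> int | None:
    """Fetch a results page from one datacenter IP and pull out the count."""
    url = f"http://{ip}/search?q={urllib.parse.quote(query)}"
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    match = re.search(r"of about ([\d,]+)", html)
    return int(match.group(1).replace(",", "")) if match else None

for ip in DATACENTER_IPS:
    print(ip, result_count(ip, "site:example.com"))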

johnlim9988




msg:3473246
 11:56 pm on Oct 9, 2007 (gmt 0)

"I see for one site, page counts of 244 000, or 550 000, or 650 000, depending on which Class C block I look at. "

What is the meaning of "depending on which Class C block I look at"? Does it mean if you see from different PC at different class C? Like you use proxy?

Thanks.

kidder




msg:3473249
 11:58 pm on Oct 9, 2007 (gmt 0)

So maybe the job is getting too big for Google; maybe they need to trim a lot of the excess from the results and provide a leaner, better product. It's not like end users won't notice the reduction in quality.

steveb




msg:3473299
 12:51 am on Oct 10, 2007 (gmt 0)

I'm afraid I'm not getting why this is significant:
"Google traffic to odd pages that are one click away from an important page"

This is how it has worked from day 1. Basically, any page "one click away" from an important page is linked from that important page, so of course it benefits from the important page's link.

"Get links from important pages" is hardly news.

kamikaze Optimizer




msg:3473305
 1:00 am on Oct 10, 2007 (gmt 0)

Hey Steve: What I have going on is traffic landing on the wrong page: a search for blue widgets, which I clearly have a page for, is displaying my yellow widget page in the SERPs.

steveb




msg:3473345
 1:51 am on Oct 10, 2007 (gmt 0)

That makes more sense as a phenomenon to be worried about, but still that has always gone on too. Google has often ranked two of my less relevant (and no stronger PR-wise) pages ahead of my most relevant page for some terms.

(If this "wrong page" thing is now more prevalent than previously, okay, that's something quite notable.)

iSrchNkd11




msg:3473349
 1:54 am on Oct 10, 2007 (gmt 0)

I think the whole thing is just completely absurd. G has totally messed up my site and its 100,000+ pages. I am showing 8 pages indexed in the regular index and 127,000 pages on the DCs. What does that mean? Anyone?
