
Google SEO News and Discussion Forum

February 2007 Google SERP Changes - part 2
MetroWebDev




msg:3260874
 10:59 pm on Feb 22, 2007 (gmt 0)

< continued from [webmasterworld.com...] >

trinorth,

You definitely bring up a good point. Up until very recently our site was a bit of a mess navigation-wise, and it was hard for users to find a lot of the information. That was corrected as of last week, so it will be a few weeks before I can see how the traffic patterns change, but yes, right now the percentage of our more than 1200 pages that are visited regularly is very small. As a long-term strategy we've always gone with 'build quality content', so we continue to add pages with valuable information pertaining to our industry.

Even though most visitors stay confined to a small portion of the site, shouldn't the other unique content be left as is? It fills smaller niches that pertain to more focused user groups. Our industry is very divided by state, so pretty much every informational section we add has to have state-by-state breakdowns, which makes it hard to combine this info into fewer pages.

Also, other than immediate back-out rates, can Google track users' paths and time spent on a site if the user doesn't have the Google Toolbar installed? Doesn't this bias rankings towards the behavior of users who have the toolbar installed?

Thanks for giving me something else to consider, it is much appreciated.

[edited by: tedster at 5:31 pm (utc) on Feb. 23, 2007]

 

crobb305




msg:3266959
 7:14 pm on Feb 28, 2007 (gmt 0)

I see different PageRank values showing at 72.14.203.84, though I do not know whether this is new or left over from the last PR update.

Jim_Olsson




msg:3267058
 8:16 pm on Feb 28, 2007 (gmt 0)

I have done the following to reverse the Google penalty/flaw affecting my site:

I have written several very critical articles about Google Search, in Swedish and in English, that also describe my problem.

I have reported the problem to Google several times. This is the latest version:

"Google is broken? Unjust penalty?
We are running a site (name excluded here) with a technology profile. Since end of december most of our recent detailed articles from 2006-2007 about lcd-tv, and some other subjects, have been listed just above end of results even those with good ranking (see screenshot). On the contrary, using Live Search they are listed among top 10. Our articles are well researched and works well according to Google Analytics.

This is the second identical report of many similar. For a few days Search was OK now nothing works as usual. I am very tired of this mess."

I sent the latest complaint to both:
Google: Inappropriate or irrelevant search results
[google.com...]
and
Google Share Success
[google.com...]

tedster




msg:3267084
 8:49 pm on Feb 28, 2007 (gmt 0)

Jim, what you are describing is not just a February change but something that has been discussed intensively here since December - we named it Google's 950 Penalty [webmasterworld.com] and that link goes to the fifth part of a very long discussion. The other name people are using is the "end of results" penalty.

You will find in that discussion several webmasters who did manage to regain rankings after taking various steps, even though the exact nature of the problem is still not 100% clear. Perhaps you can pick up on something in that discussion that works for you.

[edited by: tedster at 12:39 am (utc) on Mar. 1, 2007]

anax




msg:3267147
 10:11 pm on Feb 28, 2007 (gmt 0)

This is actually a continuation of the October 21 Massacre [webmasterworld.com], which I believe was the first report of the problem. Everything since then, such as the Google vs. Splogs [webmasterworld.com] report, the 950 penalty, the minus thirty penalty, etc., has been a continuation of the same underlying phenomenon (Google flailing around in search of anti-spam techniques and doing a lot of collateral damage in the process).

Biggus_D




msg:3267185
 10:55 pm on Feb 28, 2007 (gmt 0)

anax,

Do you still have a problem? Do you update your site often?

About the duplicate content: I've read that some CMSs create several URLs, but according to an interview with Vanessa Fox from Google, there will be no penalty for having duplicate content (that is, having many URLs pointing to the same page).

According to Fox, you can use a Google SiteMap to tell the search engine which URL you do want listed. Otherwise the Google bots will do it for you and pick just one of your multiple URLs. Either way, there won't be a penalty.

(There's also a link to the video interview...)
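
To illustrate what that might look like in practice (my own rough sketch, not anything from the interview, and the URLs are made up), a bare-bones script can write a sitemap that lists only the one URL you prefer for each page:

# Rough sketch: write a minimal sitemap.xml that lists only the preferred URL
# for each piece of content, leaving out duplicates such as session-ID or
# print-view URLs. The URLs below are invented for illustration.

preferred_urls = [
    "http://www.example.com/widgets/blue-widgets.html",
    "http://www.example.com/widgets/red-widgets.html",
]

def write_sitemap(urls, path="sitemap.xml"):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append("  <url><loc>%s</loc></url>" % url)
    lines.append("</urlset>")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

write_sitemap(preferred_urls)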

[edited by: Biggus_D at 10:58 pm (utc) on Feb. 28, 2007]

Undead Hunter




msg:3267195
 11:17 pm on Feb 28, 2007 (gmt 0)

Tedster:

I think some of us are seeing a different variation. I know my site, and at least one other person's on this thread, ranks #1 if you type in the domain name, or domain.com. But actual pages searched by title, ones that used to be in the top 10 SERPs, are now about 10 pages back, in the 100+ range.

First we were very high for almost 3 weeks, then down completely for 11 days, up for 3 weeks again and have been down since last Thursday.

Reading about the minus 30 penalty: this isn't it. It's also not the end-of-results penalty; we're just buried. And it's been up and down... I suspect it will come back up again, but who can be certain?

Just for some clarification, although I don't know if that clarified anything or not.

optimist




msg:3267199
 11:27 pm on Feb 28, 2007 (gmt 0)

Based on all the discussions here, what would happen if we added a sidebar to every page of a site that was not just link-based but content-based?

Would we trigger the 950 penalty?

---------------

Second theory...

The days of free traffic are over, and sites will rotate through the SE indexes indefinitely. Hence we have the current versions of MSN and Google; only Yahoo seems to hold stable rankings.

Undead Hunter




msg:3267209
 11:40 pm on Feb 28, 2007 (gmt 0)


Clarification:

I just checked some of my SERPs, and we ARE #1 for a series of phrases. I assumed the remaining traffic to the site was from Yahoo/MSN/elsewhere, but Google proper does list a handful of our pages in the top spot.

Why I can't search for my own name and have the About Us page come up, when it's always been #1 for my name (it's now about 10 pages, or 110 results, down), while a handful of other pages still rank highly, only adds to my confusion.

I should add that these pages are "duplicate content", in that other sites have since picked up or are running similar copies of these articles, even 1 or 2 of the #1-spot articles. Although, again, we featured them first.

Overall traffic and revenue are holding at about a third of our previous average.

I guess this is better than having the entire site demolished, but I'd love to know what could trigger the majority of pages to be buried, but not the others.

Also: recent pages don't seem to be coming up except on exact searches. But that was always sketchy for the first few weeks after adding content even though it was spidered quickly.

tedster




msg:3267264
 12:56 am on Mar 1, 2007 (gmt 0)

My apologies - the link I posted is the one I wanted, but I used the wrong anchor text (too many windows open makes for copy/paste troubles). It should read the "950 penalty" and I've now edited the incorrect link text so it reads properly. I do think this is what you are describing.

My current take on this - and not everyone agrees with me - is that we are seeing collateral damage from an application of Phrase Based Indexing and Retrieval [webmasterworld.com]. A "last step" re-ranking of the preliminary search results could account for what we are seeing. According to the patent, if the phrase-based spam testing is positive (rightly or wrongly), then the url's relevance score can be reduced by a huge factor for that phrase only -- leaving the url free to still rank well on other phrases, including the domain name itself.
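
As a crude illustration of that "reduced by a huge factor for that phrase only" idea (strictly my own sketch, with invented names and numbers, not anything taken from the patent):

# Toy sketch of per-phrase demotion: a URL flagged for one query phrase keeps
# its normal score for every other phrase. All names and numbers are invented.

PENALTY_FACTOR = 0.001  # stand-in for the patent's "huge factor" reduction

def rerank(results, query_phrase, flagged_pairs):
    """results: list of (url, base_score); flagged_pairs: set of (url, phrase)."""
    reranked = []
    for url, score in results:
        if (url, query_phrase) in flagged_pairs:
            score *= PENALTY_FACTOR  # pushed toward the end of results for this phrase only
        reranked.append((url, score))
    return sorted(reranked, key=lambda pair: pair[1], reverse=True)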

followgreg




msg:3267319
 2:21 am on Mar 1, 2007 (gmt 0)

Tedster,

You may be correct. But if that's what we have been seeing, especially since last weekend, then this must be a false-positive type of test.

All the ingredients of spam (IMO) have remained unfiltered since last week, while the collateral damage demotes legit websites.

I can't believe that a smart anti-spam filter would keep content like:
'At my keyword1 company we offer keyword2 services for keyword3 and keyword1. When you are looking for keyword1 just let us know and our keyword2 experts will help you in the process of buying keyword3'

The above is taken from live examples of a few sites currently doing better than ever, sites that used to be nowhere to be seen on page 1 or 2, or that had difficulties in the past. These sites also happen to buy text links, link to totally unrelated sites, usually have "links" pages on the homepage, have few but almost 100% identical backlinks, and are linked back from unrelated sites.

I'm an average webmaster, and I don't understand why such sites would benefit from high rankings at all.

In the end it misleads consumers, and it is dangerous overall. The last clean results were somewhere around 6 months ago; the past few weeks have been a catastrophe in terms of quality and in their impact on legit businesses (what you call collateral damage).

Hopefully this is a false positive; otherwise, from what I can see, it was not worth the effort.

----

Biggus_D




msg:3267360
 3:32 am on Mar 1, 2007 (gmt 0)

Sounds like this phrase-based spam filter needs a huge improvement.

Something as simple as: if some pages are detected as spam but others are #1 or in the top X, then MAYBE this is not spam.

tedster




msg:3267429
 5:09 am on Mar 1, 2007 (gmt 0)

I agree. IF this is phrase-based spam detection -- and we don't have certainty on that -- then the issue becomes how a false positive (as the site owner sees it) would be tripped. The patent [webmasterworld.com] appears to be looking for autogenerated pages (Markov chain stuff) and low-value MFA content.

This patent is looking for statistically significant deviations. It compares, for example, a non-spam page with 8 to 20 occurrences of related phrases to a spam document with a rate of 100 to 1000. That sounds pretty hard to do "by accident".

Now the patent does go on to detail several other approaches - one, for instance, calculates the maximum expected number of "natural" occurrences (E) and uses E to establish a threshold. But even that configuration still sounds to me like it would be very hard to unknowingly stuff a page's copy to that degree.
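
To make the numbers concrete, a toy version of that threshold check might look like this (my own sketch with invented figures, not the patent's actual math):

# Toy sketch: count occurrences of related phrases on a page and flag it only
# when the count is far beyond a "natural" expected figure. The expected count
# and multiplier below are invented for illustration.

import re

def phrase_count(text, phrases):
    text = text.lower()
    return sum(len(re.findall(re.escape(p.lower()), text)) for p in phrases)

def looks_stuffed(text, related_phrases, expected_natural=20, multiplier=5):
    """Flag the page if related-phrase occurrences wildly exceed the expected count."""
    return phrase_count(text, related_phrases) > expected_natural * multiplier

# e.g. looks_stuffed(page_text, ["blue widgets", "widget prices", "cheap widgets"])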

But what if we're not just talking about body copy, but anchor text -- both as an off-page and on-page criterion, and especially with internal links? I've already seen several urls regain decent rankings by backing off on phrase repetition in their anchor text. It's not yet enough evidence for me to yell "Eureka!" but it is enough to get my attention. We've discussed this before in a more generic way -- without the "phrase based patent" mechanism but more simply as an "over-optimization penalty" (OOP).

Especially if a site has a long "laundry list" style of menu, it can be easy to overlook excessive repetitions. I have not seen this in every "end of results" example I've looked at, but it is pretty commonly present. So there's no Eureka moment for now, but the trail does have a certain scent to it. Google is mighty complex these days, so pinning down things like this can be quite elusive.

The patent also mentions phrases that have "a distinguished appearance in such documents, such as delimited by markup tags or other morphological, format, or grammatical markers." There we go -- more of those factors we've discussed in our OOP discussions.

If I owned a site where an important search term just got shot down to the end of results, I would definitely look for whether there was an accidentally heavy hand.
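
If you want a quick way to eyeball your own templates for this, something like the following home-grown check (just my own sketch, standard library only) will tally the most repeated anchor phrases on a page:

# Home-grown check: tally how often each anchor phrase appears in a page's
# links, to spot "laundry list" menus that repeat the same phrase excessively.
# Standard library only; the file name in the usage note is made up.

from html.parser import HTMLParser
from collections import Counter

class AnchorTally(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.current = []
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.current = []

    def handle_data(self, data):
        if self.in_link:
            self.current.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            anchor = " ".join("".join(self.current).split()).lower()
            if anchor:
                self.counts[anchor] += 1
            self.in_link = False

# Usage: tally = AnchorTally(); tally.feed(open("page.html").read())
#        print(tally.counts.most_common(10))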

madmatt69




msg:3267435
 5:16 am on Mar 1, 2007 (gmt 0)

Yeah I just wonder about the internal linking thing. I mean if a page on my site is about Blue Widgets, why would I link to it with text other than Blue Widgets?

It seems really hard to figure out. If they're turning the knobs on anchor text, they've got them cranked way too hard to one side... and then for a while they crank it the other way.

And we're all caught in the crossfire it seems.

Site owners shouldn't have to change their navigation to try and convince Google they aren't spam.

Unless of course they link to their Blue Widgets page with "Blue Widgets - Click here for Blue Widgets and the best in Blue Widgets Online from Blue-Widgets.biz" :)

tedster




msg:3267472
 6:40 am on Mar 1, 2007 (gmt 0)

It takes a lot more than one link to get into trouble - even if that link is sitewide and part of the main menu.

tflight




msg:3267674
 12:24 pm on Mar 1, 2007 (gmt 0)

tedster, much of what you are saying rings true with my site. I used to have a section of my sidebar called "Most popular Widgets" which gave people quick access to those pages. That section of the sidebar appeared on just about every page on my site.

Whenever I would add a page to the "most popular" section it would seem to hold its ranking well for about a month. Then sometimes it would increase in ranking just a little bit, but soon after it would take a significant dive.

It seemed like every page (well, almost every page; see the note below) I added to the "most popular" section would take an extremely significant dive a month or more later, on the same day MC would mention a "data refresh" or people here would mention significant swings in ranking.

People have always talked about building incoming links at an even pace, naturally. Perhaps the same applies to internal links.

Here is where it got interesting, though. The widgets I have pages for that I would add to the "most popular" section can be put into three groups. Let's call them company Green, company Blue, and company Red. Companies Green and Blue are the most popular, with company Red in third place.

Pages that I promoted to the "most popular" section of my site from company Red, the third most popular, never seem to be impacted. It is always the pages from company Blue and company Green that get pushed back. There is nothing considerably different between those pages on my site.

< continued here: [webmasterworld.com...] >

[edited by: tedster at 7:20 pm (utc) on Mar. 2, 2007]
