
Google SEO News and Discussion Forum

Google.com SERP Changes - September 2008
TerrCan123




msg:3735579
 8:01 pm on Aug 31, 2008 (gmt 0)

< continued from: [webmasterworld.com...] >

I think Google is weeding old or stagnant pages out of the index to make way for new pages; it's the only way they can keep up with the internet, IMO. I recently did a search for a topic from 2002 and it was like going back into the stone age of search. Everything now is what is happening today, not years ago. I don't know what all your sites are about, but even on the top sites it seems they weed the pages.

For example, I ran a search for an electronics product from 2000, only 8 years ago. You can barely find traces of it in the sites I searched via Google. Now do the same search for a product from today, say the iPhone. There are probably a billion pages on that. Now I am not saying they are doing things wrong, but with the millions of pages added to the internet every day, they have to delete or else perhaps run out of space. I just wish they had the ability to search the archives easily for topics or products that are "old". Right now you can do that with Google News but not Google search.

Anyway my point is I think Google looks at a site and compares all the content, then keeps some of the most recent content in the results including the higher PR stuff and puts the older stuff in supplemental. That is only a guess but seems to be what is happening.

Since the older stuff I looked for was probably dropped into the deepest parts of these sites I couldn't find it with Google anymore.

Maybe, though, this is the way internet search will be: you use it for today's content only. If they had to archive all our sites, I don't think it is possible, not with all the pages being added.

[edited by: tedster at 4:46 pm (utc) on Sep. 1, 2008]

 

SEOPTI




msg:3739291
 6:26 pm on Sep 5, 2008 (gmt 0)

Crawling has nothing to do with lost rankings. They need to crawl to decide if a site fixed the problems.

Mbwto




msg:3739367
 8:25 pm on Sep 5, 2008 (gmt 0)

How can we get the ranking back then? We have 90,000 pages: 500 pages of unique content, and the rest is a mash-up.

potentialgeek




msg:3739487
 12:09 am on Sep 6, 2008 (gmt 0)

I'm seeing some changes in ranking for phrases that may suggest a weighting recalibration on the words in those phrases.

Keyword1 + Keyword2

An algo can weight the first more than the second, vice versa, or anything in between.

I remember back when Google used to focus on the first word, putting much more weight on it. You had to score high on Keyword1 to get a good ranking for the next word (Keyword2).

Current guess: if your site has many pages which include Keyword1, though not necessarily right next to Keyword2 in those instances, Google "lifts" your ranking slightly for the phrase Keyword1 + Keyword2.

In other words, the algo compares the single-incidence exact phrase against the multiple incidences of the phrase's component words. (Theme Referencing.)

Or a dial shift to increase ranking based on a more concentrated, established theme.

At the same time, I may be seeing a dial-down based on words that seem more spammy on a site.

Another one of those annoying fine-line Google algo issues?!
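Purely as an illustration of that guess (not Google's actual algorithm; the weights, spam-term list, and formula below are all invented), a toy score for the phrase Keyword1 + Keyword2 might blend the exact-phrase hits on a page with how often the component words appear across the rest of the site, minus a dial-down for spammy-looking words:

```python
# A made-up scoring blend to illustrate the guess above; the weights,
# the spam term list, and the whole formula are assumptions, not Google's algo.
SPAMMY_TERMS = {"cheap", "free", "guaranteed"}  # illustrative list only

def phrase_score(page_text, site_pages, kw1, kw2,
                 w_exact=1.0, w_theme=0.3, w_spam=0.5):
    text = page_text.lower()
    phrase = f"{kw1} {kw2}".lower()
    exact = text.count(phrase)                       # exact-phrase hits on this page
    # "Theme" signal: how often the component words appear across the whole site
    theme = sum(p.lower().count(kw1.lower()) + p.lower().count(kw2.lower())
                for p in site_pages)
    # Dial-down for words that look spammy on this page
    spam = sum(text.count(t) for t in SPAMMY_TERMS)
    return w_exact * exact + w_theme * theme - w_spam * spam

site = ["Widgets made in New York since 1990.",
        "Our New York store stocks thousands of widgets."]
print(phrase_score("new york widgets for sale, cheap and guaranteed", site,
                   "new york", "widgets"))
```

The only point is the shape of the blend: a sitewide "theme" signal nudging a phrase ranking up, and a spam signal nudging it back down.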

p/g

yuppiemtl




msg:3739498
 12:39 am on Sep 6, 2008 (gmt 0)

My 2 cents. Google's organic traffic to my website was slowly increasing over the last few months. On September 4th I experienced a huge drop; I'm now only getting 23% of the traffic I used to get from Google.

My website has not changed much recently, so I'm sure Google has somehow tweaked their ranking algorithm.

HuskyPup




msg:3739510
 1:51 am on Sep 6, 2008 (gmt 0)

Why try to comprehend something that is out of the control of its supposed architects?

artistnos




msg:3739597
 7:32 am on Sep 6, 2008 (gmt 0)

In times like these I always refer to AOL for comparative site: search results. I work on the premise that Google sells the results to AOL and therefore they must be absolutely spot on and correct. Google can play around as much as they like with their own pages, but they are responsible for anything they sell or hire out. I checked one of my sites on AOL and only 44 pages are listed, and hey presto, on Google there are 1426 pages listed.

I don't know what it means but it might mean something to someone.

All the best

Col ;-)

night707




msg:3739676
 2:06 pm on Sep 6, 2008 (gmt 0)

Is there anything like a new minus-50 penalty?

How can a site go down like that without any big modifications?

Stefan




msg:3739863
 11:36 pm on Sep 6, 2008 (gmt 0)

How can we get the ranking back then? We have 90,000 pages: 500 pages of unique content, and the rest is a mash-up.

Dump 89,500 of your pages, redo the navigation, and hope for the best.

SEOPTI




msg:3740084
 5:35 pm on Sep 7, 2008 (gmt 0)

It is really difficult to keep huge sites with 20k+ URLs in the index without tripping penalties.

I know what I'm talking about. Most of these sites trip either the co-occurrence filter or the cookie-cutter (thin URLs) filter.

So beware making huge sites until you have the content to support all those URLs without creating thin pages.

Mbwto




msg:3740493
 3:31 pm on Sep 8, 2008 (gmt 0)

Dump 89,500 of your pages, redo the navigation, and hope for the best.

The duplicate content is very useful to the visitors, because we are combining content from 10 different websites to generate a mash-up where the content becomes very vital for someone looking for it.

I know what I'm talking about. Most of these sites trip either the co-occurrence filter or the cookie-cutter (thin URLs) filter.

SEOPTI, thank you for the information.

SEOPTI - our website is not a thin-URL site; it does provide good value to the visitors. How does Google differentiate between a good site and a cookie-cutter website? If this is done by some manual editor in some third-world country, who has never seen America and whose qualification is as good as a dollar an hour, then he will definitely remove us from the SERPs, because he will never understand the value our client is providing.

The client is a startup company, so they can't afford to just rewrite the 89,500 pages of content for the search engines.

Beverly




msg:3740539
 4:14 pm on Sep 8, 2008 (gmt 0)

Re: "When you buy a link, buy it for your competitors first. Then watch them, if their ranking goes down, you can buy more links pointing to your competitors and this way you can kick them out of Google."

That is SO VERY scummy! Makes me sick.

drall




msg:3740612
 5:54 pm on Sep 8, 2008 (gmt 0)

Our largest site has around 200k pages indexed with around 100k pages in the primary index.

Here is my advice for larger amounts of data

- many deep ibl's
- semi to fully unique value add content
- avoid templates sitewide
- sensible site map
- unique title tags
- unique meta descrips
- more deep ibl's
- huge root pr

without most of the above you will never support 100k urls in the primary index.
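For two items on that list (unique title tags and unique meta descriptions), a quick audit can flag templated duplicates before a large site goes live. A minimal Python sketch, assuming you already have (url, title, description) rows from your own crawler or a CSV export; the sample rows are made up:

```python
# Group URLs that share the same title or meta description, so templated
# duplicates can be rewritten. The rows below are invented sample data.
from collections import defaultdict

pages = [
    ("/widgets/new-york", "New York Widgets | Example.com", "Compare widgets in New York."),
    ("/widgets/boston",   "Boston Widgets | Example.com",   "Compare widgets in New York."),
    ("/widgets/chicago",  "Boston Widgets | Example.com",   "Compare widgets in Chicago."),
]

def find_duplicates(rows, field):
    groups = defaultdict(list)
    for url, title, description in rows:
        value = title if field == "title" else description
        groups[value].append(url)
    return {value: urls for value, urls in groups.items() if len(urls) > 1}

print("Duplicate titles:      ", find_duplicates(pages, "title"))
print("Duplicate descriptions:", find_duplicates(pages, "description"))
```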

bwnbwn




msg:3740697
 6:42 pm on Sep 8, 2008 (gmt 0)

The client is a startup company
Here is the deal right here: it takes time, and with a startup of 100k pages that are 95% duplicate, no wonder you're gone.

Seriously, you really need to do something with the dup content. Being a new site, it won't ever get off the ground without some serious, and I mean serious, links. The amount of links it will take will then more or less flip another filter, as you're building too many links too fast.

Best advice I can give is to keep the 95k pages in a different folder and keep it from getting spidered via robots.txt, and get the 500 unique pages up there to draw links, submit to directories, and do whatever you can to get links for the site in a controlled manner.

The 95k in dup pages will be the death of the site unless ya do something about it.
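If you do take that route (moving the mash-up pages into their own folder and blocking the folder in robots.txt), it is worth confirming the block applies to the right URLs. A small check using Python's standard urllib.robotparser; the domain and folder names below are placeholders:

```python
# Assumes a robots.txt along the lines of:
#   User-agent: *
#   Disallow: /mashup/
# and verifies which URLs Googlebot may still fetch.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")   # placeholder domain
rp.read()

for url in ("https://www.example.com/mashup/widget-123.html",
            "https://www.example.com/guides/new-york-widgets.html"):
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", verdict)
```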

Mbwto




msg:3740785
 8:00 pm on Sep 8, 2008 (gmt 0)


- many deep ibl's
- semi to fully unique value add content
- avoid templates sitewide
- sensible site map
- unique title tags
- unique meta descrips
- more deep ibl's
- huge root pr

Thanks for your good information

Currently we have semi-unique title tags. The first part of the title comes from our database and the next part is common to a particular category of pages. We are working on making as many pages as possible with unique titles.

Does ibl mean inbound linking?

Mbwto




msg:3740795
 8:10 pm on Sep 8, 2008 (gmt 0)

The amount of links it will take will then more or less flip another filter, as you're building too many links too fast.

We are worried about the same thing. As a lot of money is invested in this project, the client doesn't want to buy any links at all.

The website is 1 year old; we came out of the sandbox in six months.

We are getting good traffic from MSN and Yahoo. We have good repeat visitors too. But the client is worried because our Google traffic has diminished a lot.

Best advice I can give is to keep the 95k pages in a different folder and keep it from getting spidered via robots.txt

The client wants to get rank for those; he believes that the information is very important and that with our mashup we are adding a lot of value to the content.

bwnbwn




msg:3740859
 9:47 pm on Sep 8, 2008 (gmt 0)

with our mashup we are adding a lot of value to the content
Are you adding content to content scraped from another site?
Where does this content you are adding value to come from, and what kind of value are you adding to this content that the client expects to rank for?

Mbwto




msg:3740893
 10:37 pm on Sep 8, 2008 (gmt 0)

bwnbwn -

we use Google Maps, whois, and many other APIs to create data that is very helpful for visitors to compare things

Mbwto




msg:3740901
 10:51 pm on Sep 8, 2008 (gmt 0)

Another thing I realized now: the pages we lost rank for are all unique-content pages.

For example -

We had unique content on New York widgets and we were ranking for "New York Widget" with that page. But now, we have no rank for that unique-content widget page.

We still rank (or recently got a rank of) 180 for the keyword "New York Widget" with an inner page that talks about "Manhattan Widget". "Manhattan Widget" is a mashup page and is not all unique content, but it is value-added content.

I see this pattern for all the keywords: unique content lost rank; inner pages with not-all-unique content (not very related pages) still have a rank, but the ranks are 100+.

Does anyone see similar issues? SEOPTI - can you check if you are in a similar situation?

I Thank you all for your help

Thanks

SEOPTI




msg:3740928
 12:04 am on Sep 9, 2008 (gmt 0)

drall, the goal is not to get 100k URLs into the index; that is the easy part. The difficult part is to keep those URLs without tripping any penalties.

Mbwto, yes, I lost rankings for almost all of my sites on June 4. I'm convinced it's a new cookie-cutter and/or thin-pages filter.
After adding NOINDEX to the thin parts, some sites are coming back.
Of course you will get less traffic this way.
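A minimal sketch of the kind of change described here: flag pages that look thin and emit a noindex,follow robots meta tag for them, so they drop out of the index while staying crawlable. The word-count threshold is an arbitrary assumption, not a known filter value:

```python
# Hypothetical thin-page guard: pages below a made-up word-count threshold get
# a noindex,follow robots meta tag; everything else stays indexable.
THIN_WORD_COUNT = 150   # arbitrary threshold; tune for your own site

def robots_meta(body_text: str) -> str:
    if len(body_text.split()) < THIN_WORD_COUNT:
        return '<meta name="robots" content="noindex,follow">'
    return '<meta name="robots" content="index,follow">'

print(robots_meta("Short mash-up stub with little unique text."))
print(robots_meta("A genuinely unique article paragraph. " * 60))
```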

drall




msg:3740945
 12:43 am on Sep 9, 2008 (gmt 0)

Actually, if you look closer, you will see that my advice goes directly to the whole filter-tripping issue you are talking about.

Stefan




msg:3740962
 1:36 am on Sep 9, 2008 (gmt 0)

The duplicate content is very useful to the visitors.

The SEs might not see it the same way. They might see it as scraped material that is posted on your domain only to scoop others' traffic (from the domains it was scraped from).

Sorry, bredren, if you've only got 500 pages of unique content, that's all you should have online.

barcodeuk




msg:3741863
 10:10 am on Sep 10, 2008 (gmt 0)

< moved from another location >

We lost most of our top listings on Google on the 20th of June 2008 for most of our keywords. We had top 1-10 (page 1) organic listings for many keywords on our web site that disappeared overnight, and we no longer rank even for obvious keywords such as our company name, domain name, etc.

Our site is not blacklisted by Google. There are about 1000 pages indexed by Google, but we don't appear on the first search results page as before. We appear on the last page, or 5-6 pages deep in the Google SERPs.

We have no idea why, so we have no way of knowing what we are doing wrong and do not know how to correct the "problem"!

Is there anyone who knows?

As far as we are aware, we haven't done anything to violate the Google webmaster guidelines.

We'd be most grateful if someone could advise us on what we should do to improve our ranking.

Thank you

Kind Regards
Tim

[edited by: tedster at 10:57 am (utc) on Sep. 10, 2008]
[edit reason] moved from another location [/edit]

tedster




msg:3741879
 11:04 am on Sep 10, 2008 (gmt 0)

Hello barcodeuk, and welcome to the forums.

What you describe has characteristics of two problems. One we've been calling the "-950 penalty" and you can read about it here: [webmasterworld.com...]

As much trouble as the -950 penalty (over-optimization penalty) is, the second part of your report is a bit more disturbing - not ranking for your own domain name. That usually shows that your domain has lost Google's trust, and that often happens if they suspect you of buying or selling links that pass PageRank. It can also happen if your site has been hacked and the hacker placed hidden links on some of your pages - that's known as "parasite hosting".

potentialgeek




msg:3742667
 12:17 pm on Sep 11, 2008 (gmt 0)

I may be seeing indications that Google applies the same kind of extra scrutiny to heavily visited pages as it does to highly competitive keywords (even if there isn't a lot of competition).

p/g

tedster




msg:3743447
 12:44 pm on Sep 12, 2008 (gmt 0)

p/g - what kind of extra scrutiny are you seeing? Manual checks from the 'plex? Editorial input adjustments to the SERPs?

barcodeuk




msg:3743471
 1:11 pm on Sep 12, 2008 (gmt 0)

Hi Tedster,

Thank you very much for your reply. We haven't bought or sold any links and we don't participate in any link schemes. As far as we are aware, we haven't violated the Google guidelines. So we have no idea what we should do to get our ranking back.

outland88




msg:3744681
 7:04 pm on Sep 14, 2008 (gmt 0)

Looks to me like, around the beginning of September, Google began to make major manipulations on the top keywords in my areas. In quite a few cases they seem to have vanquished sites. In other words, either get out the checkbook for AdWords or depend more on the long tail. Anybody see this in their areas? Oh, I forgot to mention: they did this all for relevance.

tedster




msg:3744796
 2:38 am on Sep 15, 2008 (gmt 0)

outland - there's a new thread with a similar observation. This seems to be worth a dedicated discussion.

Keyword specific 1 page drop penalty after human review [webmasterworld.com]

potentialgeek




msg:3745478
 6:54 am on Sep 16, 2008 (gmt 0)

> p/g - what kind of extra scrutiny are you seeing? Manual checks from the 'plex? Editorial input adjustments to the SERPs?

Manual checks are something I've never been able to figure out so I can't say. I'm guessing it's a similar type of scrutiny algo as with a competitive keyword.

It's not unreasonable, really, for the same algo to be applied to highly trafficked pages even if there isn't competition based on the principle that Google wants to satisfy users. If the content was poor, they would be sending many users to junk pages... But they don't have too much to judge a site by relative to similar competition, so they would have to use other signals.

On a different keyword note, I just noticed my old 950'd site got a rebound for a top single kw to almost top 10. Meanwhile, for the main kw in the industry, I haven't really seen any recovery yet, or only a minimal one (ten places better at best), leaving it near 90.

Not that I'm too worried, though. For this particular kw, if you're not in the top 5, you're toast. The first five sites give most users the data they want, so they don't bother to visit the others.

I just noticed in my sector Google revised its related search suggestions to make them more relevant to the season (instead of the averages for the year).

I've also noticed Google is quicker to update its suggestions on suddenly "competitive" or highly searched phrases.

For example, if a person becomes newsworthy, you could get the top two search phrases from the last, say, 48 hours about that person. I think it used to be based on the averages for a week or two if not longer.

p/g

mr_koozie




msg:3746289
 1:45 pm on Sep 17, 2008 (gmt 0)

How about this one. My site was listed between #1-5 on Google for the past 4 years for, let's say, the word "widgets", also placing #1-5 for the singular "widget". As of Sept 4-5 I am nowhere to be found (gone completely) for the word "widgets", but I maintain my #1 position for "widget" and other related searches containing both the singular and plural of my keyword. The only place things have changed is in the plural of the keyword alone. No recent content changes. Anyone have any suggestions on what to do?

tedster




msg:3746301
 2:10 pm on Sep 17, 2008 (gmt 0)

Welcome to the forums, mr_koozie.

That could be a real kick in the teeth. Are you certain your URL is gone completely for the plural form of the word? In other words, have you looked all the way through to the last page of the SERP?

I'm thinking this might be the dreaded over-optimization penalty (the -950 penalty [webmasterworld.com].) The thresholds for that penalty get recalculated from time to time, and your site might be just over the edge now, whereas it never was before.

When we are talking about single words, there is often a difference in user intention between the singular and the plural form (informational intent, versus buying intent and so on) and Google continues to work on ways to disambiguate those two results.

It's common to see a set of "Searches related to:" results at the bottom of a single keyword SERP. Sometimes comparing the differences between the singular and plural pages in that area can bring an insight as to how Google is currently disambiguating the two word forms.
