Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
The 5 Deadly updates
Have we learnt anything?
Iguana
WebmasterWorld Senior Member 10+ Year Member
Msg#: 27860 posted 9:13 pm on Feb 5, 2005 (gmt 0)

We've had a number of these 'limited' Google updates (with Allegra also being an algo adjustment) - surely enough for those of us affected to come to some sort of conclusion about what is happening? But I certainly haven't figured it out, and I gather from all the update threads that no-one else is really sure. My interest is not in pages that move from #1 to #7, but in cases where a whole site disappears off the radar and all of its pages end up at least 3 pages down.

So far (dates may not be precise) we've had 5 updates with disastrous consequences for some sites and no difference for others. Certain sites seem to drop off the radar and often appear back in their old places at the next update.

My Data
I have 3 main sites (all in the ODP at least once; Site 2 also in Yahoo). All sites have been around for at least 3 years and have done very well for the phrases they target (targeting done using just the title, H1, and maybe one further mention in the text) - traditionally #1 or #2 for the precise phrase.

1 - 3000 page music site (affiliate links)
2 - a lyrics site (but with original content nowhere else on the web!) with about 40 pages - no affiliate links
3 - a review site for submitted CDs with 200 pages - original content/no affiliate links

I can think of 10 possible reasons why Site 1 should be affected (affiliate links, multiple links from some other sites, etc) but not Site 2 or Site 3. Looking at what has happened in the updates:

Aug 10 - Site 2 pages relegated to about 100 places below where they had been
Aug 25 - Site 1 drops 40+ positions for most searches
Sept 23 - Site 1 reappears in its old positions. Site 2 is back but never completely regains its former positions
Dec 16 - Site 1 drops again
Feb 01 - Site 1 improves (but only slightly). Site 3 drops dramatically - even on its name it's difficult to find, and on an exact string search in the Google Directory it's last of 4 results!

Theories:

Glitch
In August/September I was happy with the glitch explanation. But with the same things happening again and again, I think there is something deliberate going on. I did come up with the theory that my Site 1 drops were consistent with Google removing all the credit from links (internal and external) to my home page - leaving only the links to my deep pages to give me some sort of standing.

Hilltop
Maybe Amazon, epinions, kelkoo etc. are all regarded as authority sites. Generally they don't link out to other sites (apart from e-commerce sites to which they are affiliated). This would mean that these sites' pages are favoured (which they are), and maybe the sites that drop hugely do not have enough authority links and are just folded into the results at a much lower place due to other factors in the algo. I have noticed that my individual pages linked from the ODP pretty much maintain their positions.

LSI
Looking at more detailed searches (bandname + 'websites', say), it's difficult to see how many of the first-page results achieved their positions - they may mention the individual words, but only scattered around the page. Latent Semantic Indexing would partly explain this, but not the incredible drops that some sites are seeing.
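For anyone unfamiliar with the idea: LSI projects pages and queries into a small "concept" space, so a page can rank for a phrase without repeating its exact words. A minimal sketch of the mechanism (the term-document counts are entirely invented for illustration) using a truncated SVD:

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = 4 documents.
# Docs 0-2 are music pages; doc 3 is a page about PC cooling fans.
terms = ["band", "lyrics", "review", "widget", "fan"]
A = np.array([
    [2, 0, 1, 0],   # "band"
    [1, 2, 0, 0],   # "lyrics"
    [1, 0, 2, 0],   # "review"
    [0, 0, 0, 2],   # "widget"
    [0, 0, 0, 1],   # "fan"
], dtype=float)

# Truncated SVD: keep only the k strongest latent "concepts".
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
docs_latent = (np.diag(s[:k]) @ Vt[:k]).T   # each row: a document in concept space

def rank(query_terms):
    """Fold a query into concept space, rank docs by cosine similarity."""
    q = np.array([1.0 if t in query_terms else 0.0 for t in terms])
    q_latent = q @ U[:, :k]
    sims = docs_latent @ q_latent / (
        np.linalg.norm(docs_latent, axis=1) * np.linalg.norm(q_latent) + 1e-12)
    return np.argsort(-sims)

print(rank({"band", "lyrics"}))  # the music docs outrank the PC-fan doc
```

Note that a page scores on concept overlap, not keyword placement - which is consistent with first-page results that only mention the query words scattered around the page.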

Theming
I did notice something very strange in the Aug 10 update. Site 1 didn't seem affected except in one aspect - it disappeared from the results for a particular phrase, 'music widgets', where it had climbed to #11. Now, I get all my traffic from the deep pages, so my main page has never been much of a concern, but out of interest I used a phrase like 'music widgets' in every internal link and in links from my other sites (there are a few others not mentioned) - just as a measure of how my SEO was succeeding, because this phrase is popular but not a money term. If Google was trying to theme the site, it may have taken into account the fact that the home page's ODP entry is not in the music section (for a legit reason - the site offers itself as free content as well). If Google themed it as non-music, maybe that would explain why every page did so badly in music-related searches. This doesn't seem to apply to Sites 2 or 3, because they are pretty well themed in the ODP and through other links.

Related Sites Penalty
Google knows my sites are related - traditionally Site 2 and Site 3 have links to deep pages on Site 1. I set up an experiment about 8 months ago: I invented a band name from 2 uncommon words and created 8 test pages on Site 1. I then put single links to these pages on Site 3 and on some other throwaway sites I have. Obviously, for the full 2-word phrase all my pages were top. By looking at where they stood in a search for just one of the words, I could judge the 'power' of the links. 2 links did well - 1 from a Geocities site I have (in the ODP) and 1 from a very small lyrics site I have on my own ISP's server. BUT the link from Site 3 gave very poor results, about #160 compared to #7 for the two links just mentioned. So I realised that links from 'related' sites were being devalued.

Site 1 doesn't have multiple links out to Sites 2 and 3 (just a single link each). So, for the Related Sites Penalty to be the explanation, it would have to be that BOTH the site linking in and the site it was linking to were being penalised (to differing degrees depending on the update). All the same, I think my small experiment does suggest that Google takes account of the inter-relationship of sites.

No Conclusions but...
One conclusion we can probably reach from what has been happening is that the old mantra that Google is about PAGES is no longer true - more than ever before it is about SITES (excepting the glitch theory).

 

shri
WebmasterWorld Senior Member 10+ Year Member
Msg#: 27860 posted 2:13 am on Feb 7, 2005 (gmt 0)

Very well put. *grin* Looks like we run the same site. ;)

skippy
10+ Year Member
Msg#: 27860 posted 3:18 am on Feb 7, 2005 (gmt 0)

To me this is looking more and more like LSI. To rank under a pure LSI algo you have to do almost the exact opposite of what is considered basic SEO. I think we are looking at that without all of the filters being applied.

If you think about it, it makes sense. Normal SEO means using your keyword in your text multiple times, in your H tags and title, and so on, and then hammering away with anchor text using the same keyword. That type of behavior is a death sentence under LSI.

Some of the areas I watch actually look halfway decent, with the exception that my site is not there :) I can see no reason why it should not be there, at least as a website about the chosen topic. The results all look a little dated, and I think that is just because they have not been SEOed.

I think this thing is being rolled out across sections as computing power allows. That is why you are hearing more 'me too' posts in the update threads. This is to be expected, as everyone here most likely does at least basic SEO. I really think that if you are ranking well in MSN, your site will get murdered in this new change.

WebFusion
10+ Year Member
Msg#: 27860 posted 3:45 am on Feb 7, 2005 (gmt 0)

I really think if you are ranking well in MSN your site will get murdered in this new change.

Nope. Our (2-year-old) site is now ranking well in all 3 engines. I haven't made any changes to the structure/layout of the site in over 9 months, as I didn't want to risk the Yahoo/MSN traffic just to try to find the right "google mix". This seems to be more a case of an easing of the "sandbox" coupled with an algo tweak.

At any rate...I never thought it possible (in the current search environment) to rank well in all 3 engines at once, but as of about 48 hours ago, we are.

Further, a much older site (6+ years) has remained rock-steady in the google serps. Notably, this older site DOES NOT have a dmoz OR Yahoo listing. It does, however, have over 14000 pages of unique articles/content (not counting the forum, which would add another 10000 pages or so if I cared to check).

Frankly, I've had more luck over the last 3 years using Brett's formula for gaining organic traffic (one new targeted page per day, targeted links, linking out, etc.) than with any other "tactic" we've tried. We still use H1 tags, Alt text, and everything else that makes a page user-friendly, so that effectively debunks the folks in the OOP camp as well.

My best advice (for lack of a better word): stop chasing the engine(s), and start chasing the customers. Eventually, the engines will follow. For those whose site has been "lost"... don't panic. Give it 6-8 weeks or so before you start making drastic changes.

skippy
10+ Year Member
Msg#: 27860 posted 5:59 am on Feb 7, 2005 (gmt 0)

WebFusion, glad to hear you are holding up. I wonder what your serps look like. One symptom I am seeing is loss of position for searches on the domain name itself, which would be expected with LSI. The other shows up when looking at the cached pages behind the serps: in the result sets coming from the new algo(?), two-term keyphrases are rarely present together on the page, or appear together only once or twice, though both terms do exist somewhere in the cache. Again, something you would expect with LSI.

I think this is rolling across sections right now, as I see it in effect and then not in effect via the serps and via checking the cache for two-word keyphrases appearing together. So I guess my question is: how is your domain name holding up, and do you see two-word keyphrases together in your serps? The other question is: in your logs, are you seeing keywords that would normally go to page x but are now going to page y?

gmiller
10+ Year Member
Msg#: 27860 posted 7:10 am on Feb 7, 2005 (gmt 0)

A lot of it would depend on what they were using LSI *for*... to compare pages to the search keywords? to determine the weighting of links? or something else?

If I were Google, I'd be looking at the second option. If that's what they did, we'd see sites with lots of links from unrelated guestbooks, message boards, reciprocal links, etc. drop like rocks. We'd see sites without a coherent theme dropping due to devalued internal links, but not as badly as the first group, I'd assume. We'd see sites with a focused theme and only organic links gain ground heavily.
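That second option - using topical similarity to weight links rather than to score pages directly - is easy to sketch. A toy illustration (all pages, word lists, and the scoring function are invented for this example) where each inbound link counts in proportion to the topical overlap between the linking page and the target:

```python
from collections import Counter
import math

def cosine(a, b):
    """Cosine similarity between two word-count vectors (Counters)."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def weighted_link_score(target_words, linking_pages):
    """Sum of inbound links, each weighted by topical overlap with the target."""
    target = Counter(target_words)
    return sum(cosine(Counter(p), target) for p in linking_pages)

target = "indie band review cd music".split()
on_topic = ["indie music cd review site".split(), "band news music".split()]
guestbooks = ["great site thanks visit mine".split()] * 5

print(weighted_link_score(target, on_topic))   # two on-topic links count strongly
print(weighted_link_score(target, guestbooks)) # five off-topic links count for almost nothing
```

Under this kind of weighting, a site propped up by guestbook and off-topic reciprocal links would drop like a rock, while organically linked, tightly themed sites would gain - exactly the pattern described above.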

So does that look like what other folks are seeing?

AjiNIMC
WebmasterWorld Senior Member 10+ Year Member
Msg#: 27860 posted 3:36 pm on Feb 7, 2005 (gmt 0)

I really think if you are ranking well in MSN your site will get murdered in this new change

Not at all - our site is #1 in MSN, #1 in Google, and in the top 10 in Yahoo, and these are very, very competitive financial keyphrases.

Both work on mutually independent algos - Bill and Larry are no relatives.

Have fun
Aji

Iguana
WebmasterWorld Senior Member 10+ Year Member
Msg#: 27860 posted 4:00 pm on Feb 7, 2005 (gmt 0)

I chose to describe my sites because I wanted to point out that Site 2 and Site 3 - totally non-commercial sites, tightly themed, original content, well regarded (and no SEO tricks; after all, they are my hobby sites) - have been hit very hard, as well as my commercial site.

My indie music reviews site (people send me CDs from all over the world) received 5 hits from Google yesterday - the previous month I was getting 600-700 a day. I really think Google is letting searchers down when someone sees one of my reviews quoted by Amazon in their Reviews section, types in 'widget noise' to see what I really said, and gets a load of results about fans in PCs, with the Widget Noise entry at about position #40.

lgn1
WebmasterWorld Senior Member 10+ Year Member
Msg#: 27860 posted 4:52 pm on Feb 7, 2005 (gmt 0)

Since we have a dynamic site (with a complex URL string), Google has only ever indexed our index page, so we optimized this page and have stayed in the top 10 most of the time post-Florida (excluding about 6 weeks after Florida, where we just disappeared). Before Florida we were #1 or #2 for 5 years.

We were hoping that Google would eventually figure out how to spider our dynamic site, and if 'sites' now count more than 'pages', we will need to push on site redesign to fix this.

From looking at the competition, the 'site counts more than page' factor could be a viable explanation for 'Allegra'.

I'm sure our results will rise dramatically with the 'VIAGRA' update :)

webdude
WebmasterWorld Senior Member 10+ Year Member
Msg#: 27860 posted 5:44 pm on Feb 7, 2005 (gmt 0)

Webfusion wrote:
Frankly, I've had more luck over the last 3 years using Brett's formula for gaining organic traffic (one new targeted page per day, targeted links, linking out, etc.) than with any other "tactic" we've tried. We still use H1 tags, Alt text, and everything else that makes a page user-friendly, so that effectively debunks the folks in the OOP camp as well.

My best advice (for lack of a better word): stop chasing the engine(s), and start chasing the customers. Eventually, the engines will follow. For those whose site has been "lost"... don't panic. Give it 6-8 weeks or so before you start making drastic changes.

Couldn't agree more. I have followed Brett's formula too. My sites have their ups and downs, but quite frankly, I have not been much affected by algo tweaks. In fact this last update brought one of my sites from #41 to #1.

You gotta stop chasing the engines. It will drive you nuts. The sites I have that are in the top 10 for all three of the major engines have lots of traffic that is not generated by people searching the engines, but by people finding the sites through other links.

I think, and this is just my opinion on the matter, that if you develop a site for your target audience and not the engines, find sites that would complement your site for recip links, and become an authority on the subject so people seek out your site for information, all the rest kind of falls into place.

Don't know what else to say. I don't fret over metatags and whether or not to use <h> headers too much. I tried that for a couple of years and it drove me nuts. Trust me, if you have an authoritative site, and I mean you are THE authority on a subject, it doesn't matter if you are in the top 10 or the top 200 - the users will come. It always seems to work for me.

WebFusion
10+ Year Member
Msg#: 27860 posted 5:44 pm on Feb 7, 2005 (gmt 0)

So I guess my question is: how is your domain name holding up, and do you see two-word keyphrases together in your serps?

We're #1 for our domain name (and numbers 2-3 as well).

Most of our top 10 positions are for 2-3 keyword phrases. In addition, specific product pages are now showing up #1 for many relevant searches where previously they were in the sandbox/purgatory.

The other question is in your logs are you seeing keywords that would normally go to page x but are now going to page y?

No, we're not seeing that behavior. We are, however, seeing quite a few of our pages as "indented" listings (i.e. 2 pages listed for a specific term), where the most relevant product/category page is displayed first, with our main index page displayed as an indented listing. This has occurred in about two dozen serps.

One thing to note is that we vary our incoming anchor text (from link exchanges/advertising) a great deal, keeping it on-topic, but using synonyms, etc.

Since we have a dynamic site (with a complex URL string), Google has only ever indexed our index page, so we optimized this page and have stayed in the top 10 most of the time post-Florida (excluding about 6 weeks after Florida, where we just disappeared). Before Florida we were #1 or #2 for 5 years.
We were hoping that Google would eventually figure out how to spider our dynamic site, and if 'sites' now count more than 'pages', we will need to push on site redesign to fix this.

We had the same problem at one point (our site used dynamic pages/session IDs), which prevented good, deep spidering. To fix it, we had our developers do the following:

1. Create a system that makes all URLs appear as static HTML.

2. Create an automatic sitemap system that adds the titles/URLs of new product pages to the sitemap every time a product is added.
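Neither step needs to be elaborate. A minimal sketch of both ideas (the URL scheme, function names, and product data here are entirely hypothetical) - in practice the static-looking paths would be mapped back to the dynamic script by the web server, e.g. via a rewrite rule:

```python
import html
import re

def static_url(dynamic_url):
    """Map e.g. '/catalog?cat=cds&id=1234' to a static-looking path."""
    m = re.search(r"cat=(\w+)&id=(\d+)", dynamic_url)
    if not m:
        return dynamic_url  # already static, leave untouched
    return "/catalog/{}/{}.html".format(m.group(1), m.group(2))

def sitemap_html(products):
    """Rebuild the sitemap markup from the full product list; regenerate
    this every time a product is added."""
    items = ['<li><a href="{}">{}</a></li>'.format(static_url(url), html.escape(title))
             for title, url in products]
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

products = [("Widget Noise CD", "/catalog?cat=cds&id=1234")]
print(sitemap_html(products))
```

The point of the sitemap page is simply that every new product page is reachable from a static, crawlable link the moment it exists, so the spider never has to follow a query-string URL to find it.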

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved