
Google SEO News and Discussion Forum

This 425-message thread spans 15 pages; this is page 8.
Dealing With Consequences of Jagger Update
Your site dropped? Lost rankings? What to do now?
reseller

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 32004 posted 8:25 am on Nov 12, 2005 (gmt 0)

Hi Folks

Jagger is winding down and life must go on. If Jagger has been kind to your site, congrats. But for the rest of our fellow members who lost rankings or whose sites dropped out of the index, it's time to do some thinking and decide what to improve or change on the affected websites. Ethical measures are still what interest me most.

Some food for thought.

My site was hit by Allegra (2-3 Feb 2005) and lost 75% of its Google referrals, then was hit a second time on 22 July 2005, ending up with only 5-10% of pre-Allegra Google referrals.
It is now back to around 50% of pre-Allegra Google referrals and growing... until further notice. I say "until further notice" because who knows what the next update or "everflux" will do to my site!

Before my site came back around 19-22 Sept 2005 (very slowly at the beginning), I went through it several times over the months and did the following:

- removed duplicate pages. In my case these were several test pages (some dating back to 1997) which I had simply forgotten on the server.

- removed one or two 100% frame pages.

- removed some pre-sell affiliate program pages with content provided entirely by affiliate program vendors.

- removed a few (affiliate referral) outbound links which were on the menu bar of all pages (so we are arguably talking about sitewide linking).

- on resource pages, reduced the outbound links to fewer than 100.

- made a 301 redirect from non-www to www (thanks to my good Norwich friend Dayo-UK); see the sketch after this list.

- finally filed a reinclusion request in accordance with the guidelines posted on Matt's blog (thanks Mr. Inigo).
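
For anyone who wants to reproduce the non-www to www step, here is a minimal sketch written as a Python WSGI middleware. It is only an illustration of the idea, not how reseller actually set it up (a redirect like this would more typically live in the server configuration); the host name is a placeholder.

    # Sketch only: 301-redirect every non-www request to the www host, so
    # search engines see a single canonical version of each URL.
    # "www.example.com" is a placeholder, not a real site from this thread.
    def force_www(app, canonical_host="www.example.com"):
        def middleware(environ, start_response):
            host = environ.get("HTTP_HOST", "").lower()
            if host and host != canonical_host:
                path = environ.get("PATH_INFO", "/")
                query = environ.get("QUERY_STRING", "")
                location = "http://" + canonical_host + path
                if query:
                    location += "?" + query
                start_response("301 Moved Permanently", [("Location", location)])
                return [b"Moved Permanently"]
            return app(environ, start_response)
        return middleware

Wrapping an existing WSGI application (application = force_www(application)) gives the same effect as doing the 301 in the server configuration: only one hostname ever answers with content.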

Would you be kind enough to tell us how the Jagger update affected your site, and what you intend to do about it?

Thanks!

 

texasville

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 32004 posted 7:54 am on Dec 13, 2005 (gmt 0)

Reseller!..Good morning to you!
Let me tell you a story about my client's site.
I wrote many articles about the safety and usage of his product line.
I have a secondary index for those articles - all original content, and some have been picked up across the web and reprinted. Fine, no problem. They give full credit and a link back (the link back in most cases). All the articles carry our copyright in them and in the meta tags.
On some of them, I just wrote a description with a ...read more
and on some of them I used the first 3-4 sentences and a ...read more
Well, all the ones where I used direct content - the 3-4 sentences - have gone dup content and been relegated to supplemental listings in all the test DCs, and it varies back and forth on my google.com.
So I think this means they have turned the dup content filter up so much that it is kicking even 3-4 sentence repetition into supplemental.
The bad thing about this is that the articles are very pertinent and I get many hits on other SEs from them.
If Google ever gets it together and eliminates the supplementals, all those articles will go down the drain.
But it is proof, because the articles where I used just a description in the sub-index are not supplemental.
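
Google has never published how its duplicate filter works, but the textbook approach is word shingling, and it shows why even a 3-4 sentence verbatim excerpt can trip a near-duplicate threshold. The sketch below is a generic illustration in Python, not Google's algorithm; the widget text is invented for the example.

    # Generic w-shingling sketch (not Google's actual filter): two pages that
    # share a verbatim 3-4 sentence excerpt share many identical 8-word
    # shingles, so their overlap score is high even if the rest differs.
    def shingles(text, w=8):
        words = text.lower().split()
        return {" ".join(words[i:i + w]) for i in range(max(len(words) - w + 1, 0))}

    def overlap(a, b):
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / len(sa | sb) if (sa or sb) else 0.0

    excerpt = ("Always unplug the widget before cleaning it. Keep widgets away "
               "from children. Never use a metal widget near open water. Read "
               "the full safety guide before first use.")
    article = excerpt + " The rest of the article continues with detailed usage tips."
    index_page = "Widget safety articles. " + excerpt + " ...read more"

    print(round(overlap(article, index_page), 2))  # well over 0.5: scores as a near-duplicate

A description-only teaser, by contrast, shares almost no shingles with the full article, which matches the observation that only the pages reusing the first sentences went supplemental.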

nfinland

10+ Year Member



 
Msg#: 32004 posted 8:08 am on Dec 13, 2005 (gmt 0)

I have been battling with this thing (and maybe some others) since August, when my main site was dropped from Google. I say dropped because traffic via G was 7,000 uniques a day and the next morning 50 or something.

I do not know the exact reason, but have done several modifications since that.

- I have a cell-phone-related site and used to comment on press releases when a new phone hit the market. I would quote some key line from the press release and then comment on the new phone with my own editorial material. I don't think that is duplicate content for humans, but for fear of G's filters I stopped doing this.

- I had an affiliate template for selling ringtones as part of my site. There are many other copies of the same affiliate site out there, so again I was afraid of being banned or filtered for duplicate content, and I blocked the template via robots.txt. (I guess I should also put a noindex tag on the template itself, as Yahoo at least is still indexing it even though it is disallowed in robots.txt - see the sketch further down in this post.)

- I have cell phone reviews, and the technical specs of the phones are what they are, which is why similar-looking pages exist on many other sites as well. I do not know how this is treated, but here too I now try to use different words from the original phone specs given by the manufacturer (use "cell phone" instead of "handset", etc.).

I think this rewriting of the tech specs is actually really stupid, but what can you do? Google says make pages for human readers, not for SEs. Right - then I wouldn't rewrite the specs and I would quote the original press release, as I think that is what you do to make a good article (I'm also a journalist, so I think I know something about writing too).

All of the above were quite easy steps to prevent duplicate content. Then came the off-page factors that I'm still working on.
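
On the robots.txt point above: a Disallow rule only stops well-behaved crawlers from fetching a page; it does not reliably keep an already-known URL out of an index, which is why a meta robots noindex tag on the template itself is also needed (and why Yahoo may still be listing it). A quick way to check what a robots.txt actually blocks is Python's standard robotparser; this is just a sketch, and the domain and paths are placeholders, not nfinland's site.

    # Sketch: check which URLs a robots.txt disallows for crawlers.
    # "Blocked" here only means "not crawled"; to keep a page out of the
    # index you still need <meta name="robots" content="noindex"> on the
    # page itself. Domain and paths below are hypothetical.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("http://www.example.com/robots.txt")
    rp.read()

    for url in ("http://www.example.com/ringtones/template.html",
                "http://www.example.com/reviews/new-phone.html"):
        status = "crawlable" if rp.can_fetch("*", url) else "blocked by robots.txt"
        print(url, "->", status)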

In August I had more than 400 incoming links from scraper sites that used my content, or directory sites that listed my URL along with snippets of my site. I guess those sites could somehow be treated as having the same content as my site. And most of them were spammy sites, so links from bad neighborhoods could be an additional problem. Google helped me out with these scrapers too (they have been banned / removed etc.), but there are still many of them. There is not much you can do about this.

Then there are the hijacking and related issues, and in a way you could say (?) they are duplicate content matters as well. There were several sites (and there still are some) that showed my page "within their URL". I'm not sure what techniques were used, but they showed (and still show) my page and content even though the URL is theirs. Some of the sites have been removed from Google thanks to G, and some of them actually removed the redirect "thing" when I contacted them.

I also rewrote pages that I found had been copied elsewhere. Not a nice job - the content was originally written by me, but what can you do when it has been copied to tens or hundreds of sites out there? Can you be sure G knows yours is the original?

Now I have a Copyscape logo on my site and a notice that anyone taking my content will be reported without further notice. I do not know if this helps, but you have to try...

And I'm still not ranking on Google, but I see some odd signs of hope. One page is now ranking at 30+ for a very competitive keyword. All other pages are still nowhere, or buried at 60+ in the searches (they used to be 1-10).

colin_h



 
Msg#: 32004 posted 8:18 am on Dec 13, 2005 (gmt 0)

A similar thing happened to me back in June. A client's financial advice site, mainly for existing clients (not passworded, but nofollow tags inserted), got hit by Google. The only thing I could think of was our use of paid-for syndicated articles from the financial info body that advises firms like these.

I got round it, and back in the charts in August, by using quote marks around my 'read more' phrases and a full copyright notice on each intro page. I then linked these to framed pages which surrounded the basic info page supplied by the financial body. Thus my customers were none the wiser and Google only credited the originators of the document.

Now, a few months on, the page views for this site have not fallen from their pre-June figures.

Regards

Colin

cleanup

10+ Year Member



 
Msg#: 32004 posted 9:46 am on Dec 13, 2005 (gmt 0)

"I aslo rewrote pages that I found had been copied elswere. Not a nice job - the content was originally written by be"

Yes, I have done that to every page on my whole site. It has made absulutely NO difference. Complete waste of time so far.

"Can you be sure G knows yours is the original? "

Google can/does not decide correctly, that is the problem.

"Pages still don't rank but I am hoping"

I think that about sums it up. Many of us are "hoping" that Google can sort this out along with the other Jagger/Sept 22 issues.

I have decided that running around trying to "deal" with the consequences of Jagger is the wrong strategy.

The majority of the issues sites are having are caused by Google bugs (canonical, duplicate, links, etc.). It's impossible, with all the fog, to see the wood for the trees.

These are Google's problems, not ours. IMO, if you run around in circles trying to fix your sites now, you may have to do it all over again in three months when the issues are finally resolved/rolled back.

cleanup

10+ Year Member



 
Msg#: 32004 posted 9:57 am on Dec 13, 2005 (gmt 0)

Hi Reseller from a blindingly sunny (but freezing) day here in Madrid!

Yes, I made some changes for a very small number of pages that I could use to draw conclusions from.

It worked to a degree, i.e. it increased the number of pages that were ranking in the index, so they ranked a bit better.

My problem is not with those few pages that are still in the index, but rather with the hundreds of pages that are now blocked in Google, and with the complete original site lost to an (assumed) duplicate filter.

Remember, we are now nearly three months after Sept 22.

So I say again that we cannot go on running round in circles trying to fix things for Google; they have to clean house a bit first!

Jean Valjean

5+ Year Member



 
Msg#: 32004 posted 11:06 am on Dec 13, 2005 (gmt 0)

" hundred of pages that are blocked now in Google "
Cleanup my advise ,google will never index those pages again but will have them in its daft brain ,so trasfer them to another directory.And wait until they get a new PR.Of cource not at once because you will have the red flag of hundreds of new pages ,plus don't delete the directory , make all the old pages just blanck text only or put images on it or what ever.Remember Google has all data even if is not at there index so to be sure that not after a few months you get another dub penalty dont delete the directory or the pages that MIA.Just change them.

Jean Valjean

5+ Year Member



 
Msg#: 32004 posted 11:14 am on Dec 13, 2005 (gmt 0)

Another piece of advice: don't underestimate Yahoo just because you are obsessed with Google. If your pages have top rankings there, do not delete or change them. If I hadn't had top rankings at Yahoo, I could be starving today.

nfinland

10+ Year Member



 
Msg#: 32004 posted 11:42 am on Dec 13, 2005 (gmt 0)

Hi Reseller from a blindingly sunny (but freezing) day here in Madrid!

You must be kidding - can you really be freezing in Madrid? Merry Christmas from the north (Finland) :-)

Anyway. Could rewriting and a reinclusion request be something worth trying?

Jean Valjean

5+ Year Member



 
Msg#: 32004 posted 12:21 pm on Dec 13, 2005 (gmt 0)

Reseller, as far as I can see the thread has no future; it will be boring like the saga one. Google is ice-dry cold, like the north wind that blows at Nyhavn and Kongens Nytorv while you are skating (do they put the ice down this year?), even if it is a sunny Sunday morning in the middle of December. Hmm... I'd better drink a Glühwein on a frozen Alster in Hamburg instead... Google. What a waste of time for us Western Europeans... if you know what I mean.

cleanup

10+ Year Member



 
Msg#: 32004 posted 12:24 pm on Dec 13, 2005 (gmt 0)

nfinland,
Cold in Madrid? Oh yes, my friend. A continental climate (hot summers and cold winters) plus an altitude of 750m mean that, although it is nearly always bright, winter temperatures are often the same as or lower than in my native London.

Jean Valjean,
"transfer them to another directory.And wait until they get a new PR."

Worth a try I suppose, I will make a some tests. As you say I don't want in anyway to screw with MSN and Yahoo success just for the sake of screwed up Google!.

Cheers.;)

Jean Valjean

5+ Year Member



 
Msg#: 32004 posted 12:25 pm on Dec 13, 2005 (gmt 0)

As far as I can see on that page
[webmasterworld.com...]
Google hates Western Europe - I can see only postings from DK, FI, DE, ES, UK... Funny, innit?

Jean Valjean

5+ Year Member



 
Msg#: 32004 posted 12:28 pm on Dec 13, 2005 (gmt 0)

Cleanup, many thanks, my friend. Don't forget your enemies.

Jean Valjean

5+ Year Member



 
Msg#: 32004 posted 12:42 pm on Dec 13, 2005 (gmt 0)

"Cheers.;)"
Why not salud.;)
Why we keep on loosing our national identities? In favor of PAX ROMANA?

cleanup

10+ Year Member



 
Msg#: 32004 posted 12:51 pm on Dec 13, 2005 (gmt 0)

I have not lost my national identity; I am still a Londoner - despite having lived 12 years in Madrid.
;)

Jean Valjean

5+ Year Member



 
Msg#: 32004 posted 12:53 pm on Dec 13, 2005 (gmt 0)

Unfortunately this forum's administration is not a European administration, so I know my future here. As soon as the insiders wake up back in (ROME) they will ban me for posting here. LOL.
But first they had better read the book
The Last Days of Pompeii.

Jean Valjean

5+ Year Member



 
Msg#: 32004 posted 12:54 pm on Dec 13, 2005 (gmt 0)

cleanup, sorry, I thought you were from Spain.

Pico_Train

5+ Year Member



 
Msg#: 32004 posted 3:15 pm on Dec 13, 2005 (gmt 0)

So, what are we talking about here?

reseller

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 32004 posted 3:39 pm on Dec 13, 2005 (gmt 0)

Hi Folks

Very nice "language" exchange :-)

Let's go back to:

Dealing With Consequences of Jagger Update

Thanks ;-)

Miop

10+ Year Member



 
Msg#: 32004 posted 4:43 pm on Dec 13, 2005 (gmt 0)

I agree about the dupe content now being turned up high.
My site departments are set out so that the user can easily access products (not more than two clicks away from the home page, ISTR someone saying), so we would have body widgetery, and that would be split up into subsections called, for example, gold body widgetery.
One very large section is split into three - metal body widgetery, black metal body widgetery, gold metal body widgetery, etc. All three of those keywords used to rank highly, but now they are all demoted in the SERPs, because when you search for metal body widgetery, G finds all three pages even though two of them are a different kind of metal body widgetery. It seems to be attaching more weight to the keyword on the page.
I have now moved the two X metal body widgetery sections inside the metal body widgetery section to see what happens, but again this makes it harder for the user, plus the PR won't be passed down as the sections are deeper in the site. Most of my third-level sections have been dropped by G, so it may be that they won't get found at all, but being at position 100+ is pretty useless anyway.
It seems to me that New Google places much more importance on the *hierarchy* of the site structure than previously. I could be wrong, but this is a message I am getting on a daily basis now that I am seeing how the update is panning out for my site.

Miop

10+ Year Member



 
Msg#: 32004 posted 4:52 pm on Dec 13, 2005 (gmt 0)

Actually, I'm revising that a bit - G is ignoring the black metal widgetery section totally and returning a completely different page with *one* single item containing that word amongst 30 other items without it!
I still reckon over-optimisation and dupe content factor in highly.

Eazygoin

5+ Year Member



 
Msg#: 32004 posted 5:01 pm on Dec 13, 2005 (gmt 0)

Miop>>

I have a site that contains categories, sub-categories, and then individual items. Some of the sub-cats are broken down into as many as 30 individual titles before you see an item list, and so far it hasn't affected my site whatsoever.
I am mentioning this in case you are changing your site around for the wrong reasons. But beyond that I can't help much more... sorry :-(

Miop

10+ Year Member



 
Msg#: 32004 posted 5:05 pm on Dec 13, 2005 (gmt 0)

Thanks Eazy - I had other issues with the site too, so I don't know exactly, but we have a lot of complex and similar items on the site. I can't see why else G would ignore the main section for that particular kind of widget and just show the page where the word appears only once.
I am just guessing it's an over-optimisation filter.
Our rankings are so pathetic I can't see that I'm doing any harm by making a minor tweak and seeing what happens - if that is the problem, a minor tweak might be all it takes. :)

Eazygoin

5+ Year Member



 
Msg#: 32004 posted 5:57 pm on Dec 13, 2005 (gmt 0)

Miop>>

Sorry I can't be more helpful. You may want to post on the Google public support forum at Google Groups, as there are some good guys on there who post replies. I use it to keep up with certain info.

Pico_Train

5+ Year Member



 
Msg#: 32004 posted 8:20 pm on Dec 13, 2005 (gmt 0)

Why do some results for my pages use page text snippets in the description and others the "meta" description?

Is this good, bad or neither?

trimmer80

10+ Year Member



 
Msg#: 32004 posted 8:34 pm on Dec 13, 2005 (gmt 0)

Pico_Train

Meta descriptions will usually be used as snippets only when the keyword / key phrase is not present on the page (e.g. you rank for the keyword because of link text or a keyword in the URL).

I have, however, seen instances where the meta description is shown as the snippet when the keyword / phrase is on the page. When this occurs, the site never ranks for that phrase in Google.
E.g. I believe I should be ranking for "blue widgets"; I am top in MSN and Yahoo and nowhere in Google. If I put "site:www.example.com blue widgets" into Google, it returns pages, but with the meta description as the snippets.
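
Before drawing conclusions about which text Google picks as the snippet, it can help to check where the phrase actually appears: in the meta description, in the body, or both. The script below is only a rough audit sketch (not something trimmer80 described); the URL and phrase are placeholders.

    # Rough audit sketch: report whether a phrase appears in a page's meta
    # description, its body text, or both. URL and phrase are placeholders.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class SnippetAudit(HTMLParser):
        def __init__(self):
            super().__init__()
            self.meta_description = ""
            self.body_text = []
            self._in_body = False

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and attrs.get("name", "").lower() == "description":
                self.meta_description = attrs.get("content", "")
            elif tag == "body":
                self._in_body = True

        def handle_data(self, data):
            if self._in_body:
                self.body_text.append(data)

    url, phrase = "http://www.example.com/", "blue widgets"
    parser = SnippetAudit()
    parser.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
    body = " ".join(parser.body_text).lower()
    print("in meta description:", phrase in parser.meta_description.lower())
    print("in body text:", phrase in body)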

arubicus

10+ Year Member



 
Msg#: 32004 posted 11:32 pm on Dec 13, 2005 (gmt 0)

Thanks reseller for this topic I believe that there is much to be discussed.

As far as duplicate content goes... I find it hard to believe that a scraper scraping a percentage of content would be held against the originating site. To me that would be asinine. It would actually create another tool in an unscrupulous competitor's arsenal of ways to destroy another's site.

I don't know what it would take for Google to penalize/downgrade for onsite duplication. I would think it would be a page-only thing, not a site-wide thing. Why would they penalize the complete site for, let's say, re-distributed articles, even if it had a good percentage of unique content? Maybe they view this content as a way to push up internal PR, which in turn links to other parts of the site? Why not just discount single pages considered duplicate, and the links from those pages, leaving the original content in place?

Many webmasters worry about using snippets of their own articles/content when linking to the actual articles/content, as in topic sub-sections. How is this viewed by Google? How about RSS feeds to other sites that use snippets of articles?

You know, there are just too many questions along this line. We will never get a straight answer from G anyway, and since there are so many factors in play... even sharing experiences can lead us to false conclusions... still, it would be fun to discuss.

LegalAlien

5+ Year Member



 
Msg#: 32004 posted 11:58 pm on Dec 13, 2005 (gmt 0)

trimmer80,

>>> I have, however, seen instances where the meta description is shown as the snippet when the keyword / phrase is on the page. When this occurs, the site never ranks for that phrase in Google. <<<

That's not true! Pre-Jagger had us 1-10 for around 10 very competitive 2 and 3 word phrases; all 50-200m results, all with the phrases in the meta description and body, every one displayed the meta description.

Post-Jagger we now rank #8 for one of these phrases (60m results). Our meta description is still displayed. This includes the key phrase, as does the body text. Other phrases are down between 2nd and 4th pages, but all (except 1) are still displayed with meta descriptions.

Usually, it's the other way round from what you stated - i.e. if G cannot find the exact phrase in the meta description, it uses a body snippet instead. However, there are occasions where it drops the meta description in favor of either a snippet or, more often, your DMOZ description. I previously believed this happened when the phrase was too prominent in the meta description but not prominent enough on the page itself (also if the meta description was too long), but I now believe it also happens when the phrase is too prominent site-wide (i.e. in title straps, etc.).


trimmer80

10+ Year Member



 
Msg#: 32004 posted 12:49 am on Dec 14, 2005 (gmt 0)

LegalAlien
Just illustrating what I've seen. I have never seen the meta description chosen over the body text for a page ranking 1-10. That doesn't mean it doesn't happen. I would be interested in an example, though.

LegalAlien

5+ Year Member



 
Msg#: 32004 posted 2:19 am on Dec 14, 2005 (gmt 0)

trimmer80,

Just sent you a sticky.

LA

reseller

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 32004 posted 7:14 am on Dec 14, 2005 (gmt 0)

Good morning Folks

>>arubicus

I don't know what it would take for Google to penalize/downgrade for onsite duplication. To me I would think it would be a page only thing not a site-wide thing. <<

In the case of onsite duplication, if it happens on only a few pages, I guess it's mostly a page-level penalty.

arubicus

10+ Year Member



 
Msg#: 32004 posted 8:00 am on Dec 14, 2005 (gmt 0)

"In the case of onsite duplication. If duplications happen on few pages, I guess its mostly a page penalty. "

That would be my guess. Well not a actually a penalty but more like being filtered.

Now the case with re-distributed articles I would have to say any single page that is re-distributed may get filtered rather than site-wide type penalty/exclusion. The effect of course will be on how many of your pages are re-distributed content. Which in my mind I have no problem with as long as the unique content is left undisturbed and only benefiting from links from other unique content on and off site. I can see how using re-distributed articles can be used to "push" up PR internally and in that case Google can just downgrade the effects of such pages. Distributing article elsewhere can also be used to manipulate PR in which G can (and I think does in some instances) downgrade for such things. This enables site owners to share information benefiting directly from site to site but not getting unfair benefit in G SERPS. I could be wrong though.

Now offsite duplication is a bit out of control in my opinion. It remains a full time job for many of us to keep up with all of that. And that is complete duplication of content (Full articles and even design). If G did crank up their dupe filter (off site occurances) it really can make a mess of things which would include many thousands of scraper sites scraping bits of content. (anyone with high ranking sites will surely have a scraper problem). I really have a hard time believing that G would intentionally penalize a site for such occurances. I believe they know that these scrapers are way out of control and that most webmasters cannot keep up with it all.

Now if G did a sudden downgrade of any benefit from links from scrapers can affect sites and make it appear to be a "penalty". Having links from such sites that once were keeping you in high the SERPS when taken away your site falls off the face of the planet.
