
Google SEO News and Discussion Forum

Old HTML redesign to pure CSS = dropped from google
flicky
msg:4160205 - 3:35 pm on Jun 27, 2010 (gmt 0)

Wow, I'm losing faith quickly.

13 year old site. Using 13 year old code.

I decided to modernize using the cleanest of standards. I'm using the STRICT doctype (minimal errors upon validation) ... pure CSS, one stylesheet controlling everything which is minified... all pages compressed with gzip. No header information was changed... same titles, same meta. Basically I know what I'm doing.

I effectively took my average page size from 80k to 20k (before gzip compression) and the site is lightning quick. Visitors have been super happy with the improvement.

Every visitor but googlebot apparently.

The DAY after I rolled this major revision out, I began losing my major keywords. Every keyword was top 10, now I'm gone... not in supplementals - nowhere. Keywords dropped over the last 3 days... today there isn't much left.

My question... how long does it take google to "sort it out"? I'm quite certain that improvement in ranking should come from this work. It's discouraging to see everything gone completely.

What's killing me is that Google is encouraging webmasters to do everything I did to improve speed. So I actually take that to heart and get creamed? Come on, google!

Any insight?

thanks!

marc

 

sf49ers1983
msg:4170183 - 8:23 pm on Jul 14, 2010 (gmt 0)

We actually had a similar experience, losing Google rankings and pages after implementing the "canonical" attribute on our pages. We had noticed that Google had indexed several versions of the same page for many of our pages, each differing in either capitalization or affiliate tagging (for analytics purposes). We also had tons of inbound links pointing at the different versions of the same page, creating a different PageRank for each. Wanting to consolidate those inbound links onto one page and hopefully create stronger pages, we implemented the canonical attribute, which Google suggests doing and which is supposed to tell Google what the real page is.

Within a week, all of our 2,000 or so product pages were dropped from the index and our rankings tanked. The changes were implemented about 7 weeks ago. Since the great and dreadful day of losing all of our indexed pages, we have since been recrawled and indexed. Our rankings are slowly starting to regain their lost ground, though we have not seen any ranking move to a better position than pre-change. So, my advice, although hard to accept, is to probably just let it go for now.
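
For anyone reading along who hasn't used it, the tag itself is a single line in the <head> of every URL variant, pointing at whichever version you want Google to treat as the real page. A minimal sketch only - the example.com URLs and product path below are placeholders, not our actual pages:

```html
<!-- Served on every variant of the same product page, e.g.
     /Widgets/Blue-Widget/, /widgets/blue-widget/?aff=123, etc. -->
<head>
  <title>Blue Widget</title>
  <link rel="canonical" href="http://www.example.com/widgets/blue-widget/">
</head>
```

The idea is that the capitalized and affiliate-tagged versions all declare the same lower-case, untagged URL as canonical, so (in theory) the inbound link credit consolidates there.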

Google, from what I understand, attempts to mimic the real-world business atmosphere: when businesses make large changes, those changes may or may not be good ones. Google then decides whether they are good changes by seeing if links change, if pages can still be found the way they used to be, if content on the pages has changed, etc. So that might also explain the delay.

Planet13
msg:4170227 - 10:01 pm on Jul 14, 2010 (gmt 0)

I find it hard to believe that just doing canonical link tags would decimate rankings like that... did you also do 301 redirects?

The reason I say that is that Google has often repeated that canonical link tags are only "a suggestion" as to which URL they should index.

By the way, how did you decide which was the correct canonical URL? Did you look at which version of the page naturally ranked highest for the keywords you were targeting?

Did you set the canonical URL to the URL that had the highest conversion rate?

tedster
msg:4170233 - 10:28 pm on Jul 14, 2010 (gmt 0)

Several Google spokespeople have said that canonical link tags are taken as a "strong suggestion", not simply "a suggestion". So it pays to be careful when deploying them for the first time, especially if your solution is highly automated. Here are some of the troubles I've seen people get into:

1. Choosing the "www" version of the URL when Google currently has almost all of the site's content indexed without the www. Eventually this works out OK, but there can be a long period of accommodation.

2. We have another thread here where a member accidentally used "example.com.com" - and since com.com is a live CNet property, and since Google now follows cross-domain canonical links, trouble soon followed. IMO it shouldn't have, because the original content and the content at the canonical URL are supposed to be "substantially similar" for Google to accept a website's "strong suggestion" of the canonical URL. Sometimes "should" doesn't enter into the picture.

3. Another war story came from a Drupal-based site that deployed the canonical module incorrectly, having it pick up the Drupal-native URL instead of the web-friendly rewritten URL.

So the canonical link tag can help a site, but you are playing with a high-voltage tool when you first deploy it - it pays not to be casual.

[edited by: tedster at 12:24 am (utc) on Jul 15, 2010]

flicky
msg:4170267 - 11:49 pm on Jul 14, 2010 (gmt 0)

Update:

I want to thank everyone, including WebmasterWorld, for bringing attention to my post. I'm still waiting this out 3 weeks later. I plan to report any new developments as time goes on, in the hope of helping others who are planning a major overhaul of the quality of their code.

Let me recap and give today's status.

1. Site is 13 years old, one of the most established in my vertical and enjoys a very strong inbound link profile.

2. Code was also 13 years old. Table this, table that, font this, font that... you name it... the bloat and waste was there.

3. I've enjoyed top 10 rankings for dozens of premium keywords for years.

4. Ripped my code apart from the ground up, learned CSS, opened notepad and got to work.

5. I decided to follow STRICT doctype because I'm super anal when it comes to "getting it right".

6. Adopted a minimalist approach to my code which included the following:

a. removed all javascript
b. consolidated all CSS into one stylesheet
c. used CSS shortcuts to further trim down size of code
d. minified stylesheet
e. re-used stylesheet elements as often as possible within the page design to keep the code footprint to an absolute minimum.
f. compressed the final HTML page output
g. turned on gzip compression server-side
h. added caching rules to .htaccess (see the .htaccess sketch after this recap)
i. compressed all images further

7. The result of all this optimization is an average page size of 13k after compression versus the original 80k. The CSS file is 3k versus the original 15k.

8. Webmaster Tools shows a huge improvement on the "site speed" graph - as expected. I was slower than 70% of sites; now I'm faster than 70% and climbing daily.

9. Literally the day after I rolled these changes out, my google traffic dropped. Upon further investigation, I had lost all of the premium keywords. I was and still am getting some random long-tail queries. Not everything was gone.

10. My pages are still in the system, however most of them are only accessible when querying "domain.com premium keyword" not when searching the full title of the page.

11. Randomly, one by one, a premium keyword will reappear. This has occurred about 5 times. A few have stuck, but others disappear after about 24 hours. As of today, I have about 3 of these premium keywords back. Dozens are still missing.

12. Forgot to mention that my content stays the same through all of this. I did add a "main menu" that I did not have before. That is the one thing I keep considering removing.
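
For anyone wanting to replicate items g and h, this is roughly the kind of .htaccess snippet involved - a minimal sketch assuming Apache with mod_deflate and mod_expires enabled (my actual rules cover more file types):

```apache
# g. gzip-compress text-based responses on the fly (mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript
</IfModule>

# h. tell browsers how long they may cache static assets (mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
</IfModule>
```

Neither of these touches the HTML that googlebot actually parses, which is part of why the drop is so baffling to me.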

--

This has been an extremely frustrating process. You wonder if things will ever come back... you wonder if more changes are needed, but then again worry that if you change something, it will prolong whatever process "might" be in place.

I understand "googlebot shock". I understand sandboxing a site that has completely changed in order to protect google's index in case that site was hacked. But 3 weeks is a long time for a business to be effectively shut down. Especially in this economy.

Thanks again for all of your help. If anyone wants a peek at this site for research purposes, please IM me.

marc

UserFriendly
msg:4170298 - 1:12 am on Jul 15, 2010 (gmt 0)

If your "main menu" is now on all article pages, and is more than a link or two, then I reckon that is the problem.

I feel that anything more than "go to section index page" and "go to site home page" is excessive, and perhaps Google's current algorithm feels the same way. Google are always banging on about maintaining accurate site map pages, so they might feel that duplicating a chunk of this on every article page is unwelcome.

Reno
msg:4170315 - 2:01 am on Jul 15, 2010 (gmt 0)

If your "main menu" is now on all article pages, and is more than a link or two, then I reckon that is the problem.

I understand the complexity of this situation, but to think that simply putting a "main menu" on primary pages can cause this much havoc to an older, well-established site is very scary, and it drives home the point made many times in this thread about treading lightly when deciding on large-scale changes. The evidence presented by flicky only hardens my resolve not to upset an apple cart that is holding the apples just fine and is attracting apple eaters on a regular basis. Sorry -- it's just too risky.

......................

flicky
msg:4170319 - 2:14 am on Jul 15, 2010 (gmt 0)

Regarding the "main menu"... my site is broken down like this:

Main Categories (menu in place)
Sub pages beneath these categories (no menu)

So this menu only occurs on the actual top level category pages. Not every page of the site.

Regent
msg:4170852 - 4:38 pm on Jul 15, 2010 (gmt 0)

Hi Flicky,

I feel your pain. The circumstance you are describing is a little different than mine and the sites I manage, but I believe the application is the same.

Over and over again, I have seen Google perform what I call "Google Background Check". In my cases, adding new pages or substantially changing existing page content triggers this check. In your case, it seems to have been triggered by code changes relating to site structure.

Nonetheless, the "Google Background Check" (at least in theory) says that if there is substantial change to a page, Google removes the page from rankings and puts the page into their "Background Check" process. This "Background Check" process could include many filters including:

Broken link
Code validation
Near-dupe content
PR contribution check
Malware check
Keyword screening (for government)
Trust level
SPAM check
Etc.

These types of processes are extremely CPU-intensive and could not possibly be performed in real time or on a regular basis. So Google MUST have off-line processing for these CPU-intensive tasks.

It is logical that Google would perform this kind of "Background Check" when they see a new URL. But it is also logical that when an existing URL undergoes significant changes that a "Background Check" must be performed again.

It is to this latter condition that I am theorizing your site has been relegated. In most circumstances, I see new pages or pages with significant changes ranking within 4-6 weeks, but there are cases where I see it taking longer.

It seems that the more commercial a keyword (phrase) the more Google scrutinizes that page. It may be that Google takes a harder look at pages that are in more competitive spaces - just a guess. PR may also affect how Google prioritizes one page over another. Presumably, higher PR pages would receive a higher priority and go through the screening process quicker.

In any case, there is lots of empirical evidence to suggest that Google does perform some kind of "Background Check" process. The evidence suggests that is where your site is right now.

P.S. "You Can't Hurry GOOGLE Love" - a new tune sung by 'many webmasters' ;-)

dgj

sf49ers1983
msg:4171043 - 8:38 pm on Jul 15, 2010 (gmt 0)

"Planet13"

Since this is Flicky's problem, I will keep my post short. Yes, canonical tagging dropped all of our product pages from the index. No other change to the site was made; we have been in business for going on 10 years and have had excellent SERPs for a long time now. Our pages were dropped; I saw it with my own eyes; every URL version of the same page was gone. Nearly all of them have been recrawled and indexed now.

Flicky, once again, this process for us has taken close to 8 weeks now. Our hope is that, given enough time, this canonical implementation will result in higher SERPs, since all of our inbound links will now be pointed towards a single URL for each page. We regain lost ground almost daily. I think the change you implemented is much more far-reaching than ours, so I would possibly expect an even longer waiting period while Google sorts out that change. I have grown to understand and trust Google's word when they say that things done for the site that help a human user (instead of focusing on search engines) will help your rankings. Your changes, I think, will ultimately help your users with faster page load times and possibly better navigation (depending on how that menu thing works out for you). So, you might have to just wait it out...

indyank
msg:4172281 - 4:10 am on Jul 18, 2010 (gmt 0)

flicky, this was an interesting discussion. Just wanted to find out from you whether things have come back to normal for all your pages.

onepointone
msg:4172350 - 9:06 am on Jul 18, 2010 (gmt 0)

I think Regent's post has a lot of merit. From the webmaster's viewpoint, the thinking is: instantly streamlining the code will lead to instant rewards from Google.

But from Google's standpoint, it just sends the whole thing into flux. And re-examination.

Google loves new content, but not changes to the base supporting that content.

londrum
msg:4172355 - 9:52 am on Jul 18, 2010 (gmt 0)

There's something to be said for just doing it anyway and not worrying about Google. If it's quicker, leaner, and better in every way, then what's the problem? I know it's a pain when the days tick by and you don't see any benefit, but stick with it.

In the long run you'll probably get more repeat visitors and quicker backlink growth, and these things are bound to tell in the end.

flicky
msg:4172453 - 6:35 pm on Jul 18, 2010 (gmt 0)

I mentioned this before, but this is becoming a serious pattern here... has anyone seen something similar?

As I mentioned, I've enjoyed top 10 positions for hundreds of very high quality, very competitive keyword combinations over the years. Since the code upgrade all of those were wiped, and what's happening is that from time to time one of them will reappear in a position slightly better than before the change. Within 24-48 hours it's gone again. This happens randomly, and each time it's a different keyword phrase.

The details:

1. a keyword phrase will reappear in google for 24-48 hours
2. during this time a search for the title of this page will return #1
3. once it goes away, a keyword search for the title will NOT return the page at all.

This is maddening! I'm praying someone else has seen such strange behavior.

I've narrowed this down to two possibilities:

1. Google is testing my pages one at a time
2. Google brings my page back, notices something wrong and removes it.

any help?

Planet13
msg:4172493 - 7:58 pm on Jul 18, 2010 (gmt 0)

2. Google brings my page back, notices something wrong and removes it.


Just a thought here...

Is it possible that SERP click through rate is becoming a bigger factor in ranking?

Maybe google is giving your URLs a brief testing period to see what the user click through rate is in the SERPS before deciding where it will rank them on a more permanent basis.

Instead of finding something wrong, google says, "Well, it's time for this URL to get some temporary exposure so we can determine the CTR before assigning it a more permanent value."

And I wonder if it is also looking at domain-wide CTR as a factor in individual page ranking. So if MOST of the URLs on your site have an above-average CTR, then there is a little boost given to each individual URL?

dimitar
msg:4173229 - 8:49 am on Jul 20, 2010 (gmt 0)

flicky: yeah, I can report similar findings across 2 separate e-commerce sites I do front-end dev work on - each running a different backend, markup and product base.

The common thing was that prior to Mayday, they both did very, very well within their niches - including very strong long-tail traffic.

Since Mayday, the long-tails are all but gone. Keywords for particular products that used to land on page 1 now do not show in the first 200 results. There is no logical explanation for this at all.

Landing page traffic (brand-based) has largely remained the same, perhaps with a very slight drop.

Looking at this and trying to combat the long-tail problem, I have made a number of changes that I would consider serious, at least on one of the sites.

Due to product variations (colour versions), certain products were being shown via 2, 3, 4 or even 5 different links that would pre-select a different colour for the user (links from the list view). These variant links are now behind a nofollow, and there is a rel="canonical" on all variants pointing to the default product version instead.

Content-wise, the site has Bazaarvoice reviews (iframe-based), which I have now modded to output the review text inline under a noscript tag (so much more unique content on every product page has become available).
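
Roughly what that looks like in the template - a sketch only; the review markup and URLs here are made up for illustration, not Bazaarvoice's actual embed code:

```html
<!-- JS-enabled visitors still get the hosted review widget -->
<iframe src="http://reviews.example.com/widget?product=12345"
        title="Customer reviews"></iframe>

<!-- Crawlers and no-JS visitors get the same review text inline -->
<noscript>
  <div class="product-reviews">
    <p>"Very comfortable on long trail runs, true to size." - J., verified buyer</p>
  </div>
</noscript>
```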

No change in the SERPs. If anything, pages for certain products that are 2-3+ years old have gone from the SERPs completely (unless you add the domain name to the keywords).

As an experiment, I even published a post on a web development blog mentioning a particular product that used to be #2 on page 1, and it showed up in the SERPs within 10 minutes, some 100+ results ahead of the e-commerce site's page.

I re-organised the linking strategy, which was user-centric (funnelling traffic down by category from the brand landing pages, e.g. Nike > Shoes > Trail running shoes > Mens trail running shoes, etc.), and which left products at a deeper level of linking - instead, the products are now also linked from the main landing pages. All variants which could be construed as duplicate content now have a canonical link to a top-most category.

No change in Google rankings.

Another interesting side effect: the automated breadcrumbs Google was showing for the site's pages have now gone, and it shows the URLs instead.

Yet another thing I am noticing is a severe drop in the crawl rate of the product pages compared to the crawl rates at the end of April/May - I suppose this is relevant in the context of Caffeine. Custom crawl-rate settings have little to no effect - can't hurry love?

If this continues, we are likely to go under - I am totally lost for words and devoid of ideas now.

Our link building has been organic links acquired at a rate of less than 1 a day from relevant sources (not paid), done over the course of 2 years or so, targeting either the home page or the main brand landing pages.

One thing that is puzzling: Google seems to rank old/disabled product URLs that 301 to the parent brand pages really highly (despite repeatedly crawling them from cache and getting the 301). Sigh...

jdancing
msg:4173451 - 4:50 pm on Jul 20, 2010 (gmt 0)

Earlier this year it seems you had some backlink gaming problems in Google's eyes. It seems you fixed things, did a reinclusion request and eventually got your site back in the index.

Then months later you do this massive site rewrite. I am thinking that when you were reincluded in the index, your site was put on probation, and when everything changed overnight you triggered a flag that pulled your site back out of the index, because now Google is wondering what you are trying to do.

Perhaps this code change scheduled your site for a manual review. If there is even one borderline trick you are doing now, it could be a while before you see your former rankings again.

The backlink issues you had in the past definitely could be related to the problems you are having now.

freejung
msg:4173592 - 8:19 pm on Jul 20, 2010 (gmt 0)

I just want to chime in and agree with what pageoneresults said earlier in this thread. It would be a big mistake to conclude from this thread that you shouldn't redesign your site.

I just did a major redesign a couple of months ago that included replacing tables with divs, major changes to the nav structure (recategorization of some subcategories, implementation of new meta-categories, other nav changes), increased text content on category pages, and a number of other major revisions. My rankings didn't even flicker. Indeed, I was surprised at how little difference my changes made to rankings and traffic across thousands of moderately competitive keywords.

Something has gone wrong here. The OP has triggered some sort of filter or penalty somehow. I have no idea why, but I wouldn't advise taking this as an indication that you should not improve your site. It is definitely possible to make major changes without losing rankings or traffic. Make sure that you keep the same URLs or do systematic 301 redirects, don't change your overall theme, and I expect the odds are you'll be just fine.
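
On the 301 point: if any URLs do change shape during a redesign, something along these lines in .htaccess keeps the old addresses alive - a minimal sketch assuming Apache, with made-up paths rather than anyone's real structure:

```apache
# One-off moves (mod_alias)
Redirect 301 /old-page.html http://www.example.com/new-page/

# Pattern-based moves when a whole section changes shape (mod_rewrite)
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteRule ^articles/(.+)\.html$ /blog/$1/ [R=301,L]
</IfModule>
```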

flicky
msg:4173770 - 1:40 am on Jul 21, 2010 (gmt 0)

If there was some penalty in place, why would "some" keywords come back in favorable positions? I can tell you that when I experienced a penalty in the past, EVERYTHING went away. This is much different.

Stas
msg:4174061 - 2:16 pm on Jul 21, 2010 (gmt 0)

I think some kind of sandbox for on-site changes has been implemented very recently. So there's no point relying on experience from months ago here.

2-3 months ago I was making sitewide changes every day, and no page ever disappeared from Google. As I've said earlier, I had the month and year in an <h2> tag on every page of my website; I was changing it monthly, never a problem. Until I changed it on the 1st of July - very product-specific pages disappeared immediately. Still not coming back. I get a few Google hits to them from pages 160-200...

Pages coming back to the first pages and then disappearing again seems like the same sandbox effect as with new sites/domains.

Now this puts me off doing any on-page SEO. Maybe that's what Google wants? Or maybe it just thinks page headings (h tags) shouldn't really change - and sends the page to the sandbox.

jdancing
msg:4174081 - 2:57 pm on Jul 21, 2010 (gmt 0)

True Story:

Two and a half years ago I had a content site that was getting ~3,000 Google search hits a day. Then one day I went in and made the privacy policy and TOS links nofollow. The NEXT DAY I went down to ~25 Google search hits a day.

This site was bringing in $1-2K a month, so I really wanted that traffic back. For about 12 months I tried all kinds of things to get the site ranking again, because I knew what it could do, but I eventually gave up because nothing made a difference. I put the site on the back burner and paid almost no attention to it for another year.

Then about five months ago, after two years in the Google black hole, I was looking at a StatCounter overview for all my sites and noticed that this site had had zero traffic for 3 days.

While it had lost most of its traffic back then, it had never lost 100% of it. I went to the site and it had completely crashed. The site was so far off my radar that it took me 3 days to notice. I rebooted the site later that day, and the NEXT DAY the site started getting 3,000 hits per day again.

Moral of the story: the last thing you did is not always the cause of the effect. Every time a site gets dropped, filtered, or banned, chances are you had just made some kind of change, if it is a fairly active site that you are always trying to improve. So there will frequently be an "I did this and my site got hit" story.

The weird thing is I did nothing to my site for a year except have it crash for 3 days and it came back. I write this off to pure coincidence.

(Or you could always try turning off your site for 3 days and turn it back on to magically get all your traffic back :) )
