
Google SEO News and Discussion Forum

Old HTML redesign to pure CSS = dropped from google
flicky




msg:4160205
 3:35 pm on Jun 27, 2010 (gmt 0)

Wow, I'm losing faith quickly.

13 year old site. Using 13 year old code.

I decided to modernize using the cleanest of standards. I'm using the STRICT doctype (minimal errors upon validation)... pure CSS, one minified stylesheet controlling everything... all pages compressed with gzip. No header information was changed... same titles, same meta. Basically I know what I'm doing.
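To give a rough idea (a simplified skeleton, not my actual markup - I'm showing HTML 4.01 Strict here, and the file names are made up), each page now boils down to something like:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<title>Same title as before</title>
<meta name="description" content="Same description as before">
<link rel="stylesheet" type="text/css" href="/style.min.css">
</head>
<body>
<div id="content">
...content marked up with CSS, no layout tables...
</div>
</body>
</html>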

I effectively took my average page size from 80k to 20k (before gzip compression) and the site is lightning quick. Visitors have been super happy with the improvement.

Every visitor but googlebot apparently.

The DAY after I rolled this major revision out, I began losing my major keywords. Every keyword was top 10, now I'm gone... not in supplementals - nowhere. Keywords dropped over the last 3 days... today there isn't much left.

My question... how long does it take google to "sort it out"? I'm quite certain that improvement in ranking should come from this work. It's discouraging to see everything gone completely.

What's killing me is that Google is encouraging webmasters to do everything I did to improve speed. So I actually take that to heart and get creamed? Come on, google!

Any insight?

thanks!

marc

 

Swanny007




msg:4166641
 4:49 pm on Jul 8, 2010 (gmt 0)

flicky, I feel for you, I really do. I have a 10 year old site that has about a 7 year old design and now I'm terrified to do a table-less CSS layout. Sure I use tables and it's slow but clearly Google likes that. I'm definitely not going to make any big changes in the foreseeable future thanks to this thread.

JAB Creations




msg:4166722
 6:49 pm on Jul 8, 2010 (gmt 0)

Google likes HTML5 because the elements are more descriptive. In a previous post it was revealed that Google wanted class attributes on every element to understand the context of the content (e.g. class="menu" class="sidebar").

At the same time Google likes more content and less code.

Google has also done lots of research to understand what elements, attributes, and values are most common.

The best way to handle this is to strike a good balance between using unique id's and classes, and understanding how CSS 1.0 works... yes, way too many people don't even understand how basic CSS works.

In general I give id's to a lot of elements to make working with JavaScript much easier, though it also makes the markup more descriptive for robots indexing each page.

The key I think when determining the names of unique id's and classes is to choose very common values such as "side" or "aside" (if HTML5 is still using that as an example).

Also, having valid XML markup that won't break if it were served as application/xhtml+xml should be a high-level goal. There is no doubt in my mind that Google engineers have spent time figuring out where the end tag for the following element would be on a page on a site such as mine...

<div id="content">

Since Google loves a low code-to-content ratio, using div elements with CSS1 effectively is important. Often you have to use two divs to achieve certain styling goals, but by knowing how to use selectors effectively (e.g. #side div {margin: 8px;}) you won't need to add excessive classes or id's.
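A quick sketch of what I mean (the id is only an example):

<div id="side">
 <div>First block</div>
 <div>Second block</div>
</div>

#side div {margin: 8px;}   /* styles every div inside #side - no per-element classes or id's needed */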

In the end once your site recovers I think that you'll find it overwhelmingly worth your effort and time. I'm currently working on the 29th version of my site and I've changed the HTML and then XHTML code so much over the years but the site has a reasonable PR even without strong external references.

My biggest advice, since your site does not make changes like this frequently (it's been the same for 13 years), is to not freak out and put it back, because that would make the site look like it's trying to game Google's rankings, which makes it look spammy or untrustworthy. Anyone worth giving good PR to on Google is going to update their code to keep their site from reeking of the 90's. I'm sure that, just as other web designers despise working with IE6, Google engineers probably don't enjoy working with older and often broken HTML code when trying to determine a site's PR.

- John

Atomic




msg:4166737
 7:10 pm on Jul 8, 2010 (gmt 0)

About two years ago I redesigned several of my sites so they were tableless, pure CSS etc. My rankings climbed steadily. I also redesigned a site last year that was made in 1996 so that it was pure CSS with a tableless design. It dropped a little bit for a few keywords, but most still rank as well as they ever did. Many of these keywords outrank Wikipedia for common, popular terms.

I see no problem with CSS if it's done right.

dvduval




msg:4166766
 7:52 pm on Jul 8, 2010 (gmt 0)

How many navigation links does a spider now have to wade through before it would reach the unique core content of the page? How does that compare with the previous version?


In my case, I reduced the number of navigation links significantly. I still hold the opinion (barring a better explanation) that making major changes to a site switches on a protective mechanism, or triggers the need for a complete site reindexing. I don't have enough evidence to support my claim 100%, but that is the best theory I have right now. With a month having passed, I am seeing some improvement now, but only slight.

tedster




msg:4166767
 7:53 pm on Jul 8, 2010 (gmt 0)

Thank you, Atomic, I think that's a very important counter-point.

Whatever is going on with the re-coding that flicky did, the experience of an extended period of lowered rankings is not at all a common report. If a ranking drop was the common experience, all the forums would be lit up with complaints, all over the internet. So I suggest that others not overreact or feel total paralysis about making positive changes across their site.

Something has gone wrong in flicky's case. It's hard to say whether it's something about the site itself that is so far undetected, or some crazy bug on Google's side.

Reno




msg:4166774
 8:06 pm on Jul 8, 2010 (gmt 0)

all the forums would be lit up with complaints... I suggest that others not overreact or feel total paralysis about making positive changes across their site.

Good advice, but it must be said also that in fact forums all over the web -- including this one -- regularly see the thread "I changed my site and dropped in the rankings". It happens so often that the old adage "where there's smoke there's fire" seems awfully relevant.

Upon closer examination, could it be determined that changing the site was not the bottom-line cause, but that other factors actually triggered this drop? Possibly, but we don't examine each other's coding (here or anywhere else for that matter), so that question is mostly unanswerable.

Thus, while I do make regular changes to my sites (especially new content) -- sometimes on a daily basis -- I find that a cautious approach to architecture formatting generally works best. Over the long term the sites see considerable change, but not all at one time -- it just seems to me that the Googlebeast is happier that way, and a happy Googlebeast lowers my stress levels, which makes my wife happy too ;)

.........................

maximillianos




msg:4166778
 8:08 pm on Jul 8, 2010 (gmt 0)

This is why I don't make big changes to my big site(s). Too risky. Unless something is broke, I tend to lean towards leaving it.

I use new development techniques and styles on new sections of the site, but the old, ranked pages never get touched.

Just my rule of thumb.

Please google, don't penalize a site for following YOUR suggestions for increasing PAGE SPEED!


I don't think it is a penalty. Your page source changed. The order of things changed. Their algorithm utilizes certain factors of your site that changed. Perhaps it is just a precautionary sand-box effect. When something they have known to be the same for 13 years changes across the board, they need to re-evaluate to make sure they are targeting queries to the right pages again.

Hugene




msg:4166792
 8:24 pm on Jul 8, 2010 (gmt 0)

First, in G's very own words, Page Speed is still just a small factor in the ranking algo. Considering that the algo probably has hundreds of inputs, I really don't think page speed is actually that important. Also, it will never become that important, because of broadband. Back in the 56k days, page speed did matter to the consumer; now it doesn't so much. So why should G really care?

Second, if you still want to jump on the Page Speed bandwagon, just run the Page Speed test in the Firebug Firefox add-on. It will tell you your score, and if it is over 70%, you're more than OK (I think). It will also tell you what you can change. You can gain a lot in the small details, without having to redesign and remove all <table>s. Tables actually don't hurt that much.

Third, flicky, all this is too late for you, but I suggest you go into Webmaster Tools and use the "Fetch as Googlebot" option. For your site to have disappeared like this, googlebot must not be seeing it the way it used to.

What I mean is: the fact that the <tables> are gone is not that important. What about all the:
*links
*headers (h1, h2...)
*title attributes, alt attributes
*meta data in <head>
*site structure in general
*maybe some content is hidden by CSS?

What I suggest is maybe to get a text-only browser, like Lynx or something, and view your page. Maybe some content is hidden to googlebot by the CSS.
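For example, rules like these in the new stylesheet would hide whole blocks from visitors (the selectors here are made up, just to show what to look for):

#old-promo {display: none;}                          /* removed from the rendered page entirely */
.sidebar-extra {visibility: hidden;}                 /* invisible, but still takes up space */
#footer-links {position: absolute; left: -9999px;}   /* pushed off-screen */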

Also, do an HTML validation of your code; maybe there is some error that makes the page display incorrectly to googlebot.

setzer




msg:4166799
 8:39 pm on Jul 8, 2010 (gmt 0)

Maybe some content is hidden to googlebot by the CSS.


That's impossible. Adding display: none to a page element only hides the appearance of content to end users -- it still shows up when viewing source. Since Googlebot simply reads a site's HTML content, it will still see the content.
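To make it concrete (illustrative markup only), the text inside this div never leaves the HTML that Googlebot fetches; only the rendering changes:

<div id="extra" style="display: none;">
This paragraph is invisible to visitors, but it is still right there in the source.
</div>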

Lapizuli




msg:4166807
 8:42 pm on Jul 8, 2010 (gmt 0)

If this turns out to be long term (and I am not thinking it will - just brainstorming), is there a chance that the overhaul actually gave Google more algorithm-pertinent information than they could detect before?

As in: I send a low-resolution photo image to my friend. They're happy; they like it. Then I work hard to take a duplicate high-resolution image and send it to my friend. The friend sees something in the second image that wasn't visible in the first, and gets upset, and never calls me again.

In the case of the CSS overhaul, is it possible Google is more sensitive to CSS signals than HTML signals for "spam" triggers?

And if that's the case, chances are there IS an end to this - for example, if it takes a while for Google to put the site through the various algo filters. Or (worst case scenario) if it goes on long term and gets to the point where you submit your site for reconsideration.

ferfer




msg:4166812
 8:50 pm on Jul 8, 2010 (gmt 0)

Recently I deleted an include from all pages in a section of my old site (general interest articles). That include was just a little JavaScript to show an email address without exposing it to bots, one AdSense unit, and one plain internal link.

Exactly from the day they were re-spidered, almost a week ago, that section lost 66% of google traffic.

P.S. I made the exact same change to another, smaller section, but that one wasn't affected.

alika




msg:4166833
 9:35 pm on Jul 8, 2010 (gmt 0)

I see no problem with CSS if it's done right.


I've been thinking of doing the same thing to my 12 year old site with its 12 year old code (it was done in FrontPage, LOL, and still uses that code). This thread, though, has made me nervous.

My questions are:

1 - Is it better to change slowly, e.g. a few pages at a time, not everything all at once?
2 - What is the right way to CSS a site?

tedster




msg:4166839
 9:41 pm on Jul 8, 2010 (gmt 0)

I already spoke most of my piece above - I would have no such nervousness. I've been involved in hundreds of redesigns over the years, some with content changes and some with only mark-up changes. I never ran into this problem.

I know flicky is reporting a big problem, and I'm not trying to diminish its devastating impact on his business. But I am cautioning against making generalizations based on this.

Lorel




msg:4166840
 9:44 pm on Jul 8, 2010 (gmt 0)

I redesigned my 100-page site a couple of months ago, changing the logo and color scheme and switching from a decade-old table format to CSS, and my rank has remained steady for a multitude of keywords. I didn't move the menu, however (it's horizontal across the top of the page), and I kept the same content and nav links. My traffic has gone down slowly, but I believe this is seasonal, as I get a lot of student traffic during the school year.

I also redesign several other websites per year and sometimes make drastic changes like rewriting file names (along with redirection in htaccess), changing titles and content, moving to another host, etc., and I have never seen a drastic drop like the one described in this thread, although it may take a couple of months for the improvement in rank to show up in Google.

SuzyUK




msg:4166841
 9:44 pm on Jul 8, 2010 (gmt 0)

I think you're all reading way too much into it..

"where there's smoke there's fire" seems awfully relevant.


No, there's always been smoke (CSS-type smoke, that is), but it never gained the fire analogy until it started breathing Google/SEO fire. tedster is absolutely right: the OP's case is likely to be somewhat different. Maybe the site-wide change/"penalty" is nothing to do with CSS at all, but simply the fact that it is a site-wide code change - HTML, CSS, platform or otherwise. Maybe a "wide boy" just took over a recently dodgy site (not saying you are one, flicky.. just trying to make the point that none of us actually knows the full story).

We have no way of knowing what else may actually be a contributing factor.. My next question has to be, for those who are able to get picky over which pages they (smugly?) change or don't change: is it because they know better - because they can tell us, with 100% proof, their reasons - or is it because the "unaffected" changed page content simply didn't warrant a flag? Was it really anything to do with the styling (CSS is only a suggestive styling language, remember!), and if it was, which pages are "broke" and why, as opposed to similarly HTML-coded ones..

Then.. are you hand coding/tweaking, able to write PHP/HTML/CSS? I take it you're not using an open source CMS template, where one change affects all pages and not just the ones you deem "broken"?

That, I feel, is the major issue - more contributory to the shock 'n' awe effect than the CSS alone: a wake-up to a different technology, a change of software platform. What sets your site apart? Are you "really" an authority, or are you a pro/am just slightly under the radar, where every change just might raise a flag? And I take it you expect the "hobby sites" to die a death after they've been unattended for a while? How do you think search engines are going to figure that out without applying an automatic penalty for a site-wide change?

Every JoeSchmoe can use a CMS template, but how many of them know how to make it (or sections of it) different enough from the original "unbroken" amateur site to not raise that big red flag? I honestly think it doesn't matter whether you do it bit by bit or all in one go like the OP; it's going to have an effect, much the same as plain old HTML tweaks. So at this point in time, do you read all the ol' posts and rest on your laurels, or bite the bullet and move with the times? My advice is to look at the big picture and decide what's best to keep up with for the skills/needs you have at this time.

Chrispcritters




msg:4166842
 9:49 pm on Jul 8, 2010 (gmt 0)

I recently moved a 10 year old authority site out of a CMS package.

Changed the order of the html code (nav moved from top of HTML to bottom), changed every URL (301'd every page), went extensionless, cleaned every single page's html to be strict (no errors at all), most pages saw changes in title and description, and saw no drop in ranking.

You mention there are "minimal" html strict errors. You might want to fix those. Have you noticed any changes in google webmaster tools? Did your css id/classes include your keywords? (over optimization?)

setzer




msg:4166845
 9:51 pm on Jul 8, 2010 (gmt 0)

A while back, our forums did drop from the SERPs after a site-wide code change. This affected around 12,000 pages. After a week, rankings picked up and things went back to normal, though. I suppose Google's crawlers need time to adjust to any major changes, especially ones that impact navigational structure.

I'd wager a guess that the time to "rebound" may be dependent on certain factors, such as the amount of inbound links pointing to your site, whether or not it is an authority in your niche, etc.

physics




msg:4166846
 9:53 pm on Jul 8, 2010 (gmt 0)

Have faith. If you do what's good for users you will no doubt rank high in Google. I can't find the sarcasm key on this keyboard right now...

Trav




msg:4166863
 10:19 pm on Jul 8, 2010 (gmt 0)

Some interesting replies here; my feeling is that there's more to the story here. Re-factoring to a CSS-based layout just doesn't cause this in and of itself, at least not on any of the sites I've updated.

I'm using the STRICT doctype (minimal errors upon validation)


I basically stripped the pages down completely and created a CSS-only structure.


and,
You mention there are "minimal" html strict errors. You might want to fix those. Have you noticed any changes in google webmaster tools? Did your css id/classes include your keywords? (over optimization?)


I'd start here - either fix the errors or use a transitional doctype. Then make sure you've used semantic elements for your content (h1's, p's, etc.). Then make sure you haven't moved a whole bunch of less-important stuff higher up in the source code. Finally, have you taken a look at your robots.txt lately?
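Very roughly, that kind of structure looks like this (the element names and ids are only illustrative):

<body>
<div id="content">
  <h1>Main topic of the page</h1>
  <p>The unique core content, kept as high in the source as practical.</p>
  <h2>Subtopic</h2>
  <p>More supporting content...</p>
</div>
<div id="nav">
  <ul>
    <li><a href="/section-1/">Section 1</a></li>
    <li><a href="/section-2/">Section 2</a></li>
  </ul>
</div>
</body>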

Ujang




msg:4166916
 1:22 am on Jul 9, 2010 (gmt 0)

Hi flicky

IMO, you could try adjusting the googlebot crawl rate - say, 10 seconds between requests, or even higher.

idolw




msg:4166985
 6:19 am on Jul 9, 2010 (gmt 0)

the original post sounds scary, but I am with tedster, who believes there must be some "innovation" that is taken as spammy by the algo.
You wrote the new menu changes the way content is organized. Could you elaborate?

zett




msg:4167005
 7:35 am on Jul 9, 2010 (gmt 0)

Thanks, flicky, for the message. I am in a similar situation with my main site (which is 9 years old) and I have been thinking to "upgrade" to modern HTML and CSS. However, your report has made me re-think. I won't "upgrade" anytime soon.

Recently, I have become very suspicious when I read "Google says" or "Google recommends". Often, doing the very opposite of what Google says or recommends is good for me. Go figure.

justguy




msg:4167170
 1:20 pm on Jul 9, 2010 (gmt 0)

Second, if you still want to jump on the Page Speed bandwagon, just run the Page Speed test in the Firebug Firefox add-on. It will tell you your score, and if it is over 70%, you're more than OK (I think). It will also tell you what you can change. You can gain a lot in the small details, without having to redesign and remove all <table>s. Tables actually don't hurt that much.


Ah ... don't you love Google. Following MayDay our ecommerce traffic dropped 75% and main site traffic 60%.

However, back on the load speed topic. Prior to 2 weeks of dev work, we averaged 3.5 secs or so load speed according to GWT.

So, after 2 weeks of dev work, Firefox Page Speed gives us a 97% score and ... yup ... our page load time has gone up to 4.5 secs. Every change we make seems to increase it.

Oh well ....

pageoneresults




msg:4167171
 1:25 pm on Jul 9, 2010 (gmt 0)

Seen this happen before thanks to all the CSS nuts pushing everyone to upgrade.


What's that word that all the old g33ks use around here? FUD, FUD, FUD

Bottom line, if it ain't broke, don't fix it.


That type of attitude is why we still have sites today that use HTML 4.01 Transitional DOCTYPEs and are built using only <table>s. If you listen to the above advice, you'll be extinct soon enough.

I already spoke most of my piece above - I would have no such nervousness. I've been involved in hundreds of redesigns over the years, some with content changes and some with only mark-up changes. I never ran into this problem.


Same here.

Please, don't get suckered into this hype. There is definitely more here than meets the eye. Maybe the OP will put their site up in the Review Forum and we'll find out exactly what is wrong. I seriously doubt it is the CSS unless you have a boat load of negative coordinates sitting in there and are hiding stuff that you shouldn't be. That would most likely have invoked a manual review, at which point, if the CSS trickery was present, you'd get whacked.

But I am cautioning against making generalizations based on this.


Me too! The last five sites I've assisted with this type of conversion all went as planned and expected. In fact, the very last one is a WebmasterWorld Member who saw improvements beyond anything they had done previously. And, this all happened prior to and during the so called Mayday Update.

I say put the site up for review in the Review My Site Forum and put this one to rest. I'd be willing to place a large wager that there is a technical glitch somewhere in the process. 9 out of 10 times, it's technical.

I have a testimonial from a WebmasterWorld Member dated 2010-06-16 that will pretty much refute this entire Elmer topic. ;)

P.S. I'm really surprised this made Front Page at WebmasterWorld. Oh wait, no I'm not. This site is long overdue for its CSS overhaul and someone is wanting to use this as an excuse not to. I've seen it happen time and time again. ;)

[edited by: pageoneresults at 1:29 pm (utc) on Jul 9, 2010]

yaix2




msg:4167172
 1:26 pm on Jul 9, 2010 (gmt 0)

Finally, have you taken a look at your robots.txt lately?


That would be my first guess too, some small detail, maybe in robots.txt or www/non-www duplicates, meta tag, etc. There are many possibilities.

flicky




msg:4167213
 3:03 pm on Jul 9, 2010 (gmt 0)

Update:

One of my best keywords reappeared yesterday for most of the day in a favorable position slightly better than before the redesign. It's gone today, however.

Strange that ONE keyword would reappear. It gives me faith that things are being "sorted out".

Regarding the navigation links... yes, I did add a site-wide navigation bar that did not exist before. It contains about 20 links to each section of my directory-based site. I don't see how that could trash things though. If anything, it makes it easier for googlebot to discover the other important sections of my site. Each section has always enjoyed its own favorable results within Google.

marc

honestman




msg:4167214
 3:17 pm on Jul 9, 2010 (gmt 0)

Seen this happen before thanks to all the CSS nuts pushing everyone to upgrade.

Bottom line, if it ain't broke, don't fix it.


This has been my experience, but it runs counter to all my training and instincts having been in software/web development for many years.

I think that it is good that there is consistency on Google's side, but one should not be afraid to make substantial upgrades as the technology improves for web development (often exponentially since the early 90's.) This really puts web developers in a tough bind. To take a chance to improve user experience or not?

chicagohh




msg:4167217
 3:20 pm on Jul 9, 2010 (gmt 0)

I did add a site-wide navigation bar that did not exist before.

On your 2000 page site you just added 40,000 links to your internal link profile. That's a significant change from just upgrading the CSS and could cause your site to drop. Hopefully, it will come back, but you made a major change to the internal link structure.

Case closed.

[edited by: chicagohh at 3:24 pm (utc) on Jul 9, 2010]

netmeg




msg:4167218
 3:22 pm on Jul 9, 2010 (gmt 0)

I've done a lot of site relaunches too, and never experienced what you have (when it was done to my spec, that is). I bet you pop back in eventually. I usually tell people to expect three to six months of bouncing around before the dust settles, even if all the content is exactly the same and it's just a structural overhaul.

ponyboy96




msg:4167238
 4:02 pm on Jul 9, 2010 (gmt 0)

Hey flicky,
Just a couple of quick checks and questions for you:
1. Have you viewed your robots.txt file? Did it change?
2. What about your meta robots tag? Is there a noindex?
3. When you redesigned the site, did the URLs change?
4. Is your site still listed in Yahoo? Bing?

I've been around the block on these things as well and been doing SEO for some very big companies for some time. It has been my experience when there is a drop, to try and eliminate the simple things first. I cannot see the changes that you made dropping you out of the rankings like that. There has to be more to it.

[edited by: Robert_Charlton at 7:03 pm (utc) on Jul 9, 2010]

Sylver




msg:4167248
 4:13 pm on Jul 9, 2010 (gmt 0)

I think the idea behind sandboxing a site-wide change is to prevent the situation where someone sells a web domain to someone else who changes the website completely and therefore takes undue advantage of the existing links.

To you, the content might look the same, but as you went with css, I expect that the sequence of items is significantly different, and that could be enough for Google to decide that the content is too different to warrant the same rankings as before.

You can test that by grabbing one of your old pages, stripping the code, then taking the equivalent new page, stripping the code, and comparing both with a difference engine.

Try the (excellent) Google difference engine based on Myers' diff algorithm, google-diff-match-patch ([code.google.com] - click on the diff demo).

This should give you some insight on how different your new pages are, from Google's viewpoint.

I don't know if Google Search uses this same difference algorithm (who knows?), but it is considered to be a very efficient way to compute differences and I wouldn't be surprised if it did.

Google handles a crazy volume of data and they can't afford to use processing-intensive algorithms. Implementing an algorithm to recognize differences the same way a human does would be prohibitive in terms of computing time.

In a classic diff algorithm, difference can be measured by the number of characters that need to be changed in the old document to get the new document. Take a sample text and scramble the order of the paragraphs. You will see that the result is considered completely different in the google-diff-match-patch demo.

My point is that even though the content is "the same" from a human's viewpoint, it could easily be computed as being mostly different or even totally different depending on the way the code was rearranged.

Test several of your pages and see how different they are from what they were previously. If they are very different, try to rework the code sequence to make it more similar to what it was. CSS gives you a lot of freedom with regard to what goes where in your code, so you should be able to change things to match pretty closely. It is a lot of work, but it might be worth it.
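As a rough sketch of that freedom (the ids and widths are only examples), you can keep the main content first in the source - matching the old page order - and still display a left-hand sidebar purely with CSS:

<div id="wrapper">
  <div id="content">Main content stays early in the source, as it was in the old layout...</div>
  <div id="side">Navigation and extras come later in the source...</div>
</div>

#wrapper {position: relative; padding-left: 200px;}   /* reserves room on the left for the sidebar */
#side {position: absolute; top: 0; left: 0; width: 180px;}   /* sidebar displays on the left despite coming last in the source */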
