
Google SEO News and Discussion Forum

Old HTML redesign to pure CSS = dropped from google
flicky




msg:4160205
 3:35 pm on Jun 27, 2010 (gmt 0)

Wow, I'm losing faith quickly.

13 year old site. Using 13 year old code.

I decided to modernize using the cleanest of standards. I'm using the STRICT doctype (minimal errors upon validation)... pure CSS, with one minified stylesheet controlling everything... all pages compressed with gzip. No header information was changed... same titles, same meta. Basically I know what I'm doing.
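For the curious, each page now boils down to a skeleton along these lines (the stylesheet filename is just a placeholder, and I'm showing HTML 4.01 Strict here - the XHTML Strict doctype works the same way). Titles and meta stayed exactly as they were:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<title>Same title as before</title>
<meta name="description" content="Same description as before">
<link rel="stylesheet" type="text/css" href="/css/site.min.css">
</head>
<body>
<!-- content marked up with plain semantic tags, no layout tables, no inline styles -->
</body>
</html>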

I effectively took my average page size from 80k to 20k (before gzip compression) and the site is lightning quick. Visitors have been super happy with the improvement.

Every visitor but googlebot apparently.

The DAY after I rolled this major revision out, I began losing my major keywords. Every keyword was top 10, now I'm gone... not in supplementals - nowhere. Keywords dropped over the last 3 days... today there isn't much left.

My question... how long does it take google to "sort it out"? I'm quite certain that improvement in ranking should come from this work. It's discouraging to see everything gone completely.

What's killing me is that Google is encouraging webmasters to do everything I did to improve speed. So I actually take that to heart and get creamed? Come on, google!

Any insight?

thanks!

marc

 

Sylver




msg:4167248
 4:13 pm on Jul 9, 2010 (gmt 0)

I think the idea behind sandboxing a site-wide change is to prevent the situation where someone sells a web domain to someone else who changes the website completely and therefore takes undue advantage of the existing links.

To you, the content might look the same, but since you went with CSS, I expect the sequence of items in the source is significantly different, and that could be enough for Google to decide that the content is too different to warrant the same rankings as before.

You can test that by grabbing one of your old pages, stripping the code, then taking the equivalent new page, stripping the code, and comparing both with a difference engine.

Try the (excellent) Google difference engine based on Myers' diff algorithm, google-diff-match-patch ([code.google.com ] - click on the diff demo)

This should give you some insight on how different your new pages are, from Google's viewpoint.
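If you'd rather script the comparison than paste text into the demo page, the same project ships a JavaScript version of the library. A rough sketch (the script path and the sample strings are placeholders - you would feed in the stripped text of your old and new pages):

<script type="text/javascript" src="diff_match_patch.js"></script>
<script type="text/javascript">
// compare the stripped text of the old and new versions of one page
var oldText = 'Widget guide. Prices, reviews and specs for blue widgets...';
var newText = 'Prices, reviews and specs for blue widgets... Widget guide.';

var dmp = new diff_match_patch();
var diffs = dmp.diff_main(oldText, newText);
dmp.diff_cleanupSemantic(diffs);   // group the changes into readable chunks

// a rough measure of how much editing separates the two versions
document.write('Levenshtein distance: ' + dmp.diff_levenshtein(diffs));
</script>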

I don't know if Google Search uses this same difference algorithm (who knows?), but it is considered to be a very efficient way to compute differences and I wouldn't be surprised if it did.

Google handles a crazy volume of data and they can't afford to use processing-intensive algorithms. Implementing an algorithm to recognize differences the same way a human does would be prohibitive in terms of computing time.

In a classic diff algorithm, difference can be measured by the number of characters that need to be changed in the old document to get the new document. Take a sample text and scramble the order of the paragraphs: the google-diff-match-patch demo will report the result as almost completely different.

My point is that even though the content is "the same" from a human's viewpoint, it could easily be computed as being mostly different or even totally different depending on the way the code was rearranged.

Test several of your pages and see how different they are from what they were previously. If they are very different, try to rework the code sequence to make it more similar to what it was. CSS gives you a lot of freedom with regard to what goes where in your code, so you should be able to change things to match pretty closely. It is a lot of work, but it might be worth it.

pageoneresults




msg:4167272
 4:57 pm on Jul 9, 2010 (gmt 0)

Sylver, nice tip on the Diff Match and Patch tool.

flicky, the site that you are referring to, is it the same one you've had challenges with recently?

Could mass "nofollow" addition be the cause of my penalty?
2010-02-10 - [WebmasterWorld.com...]

Penalized site re-included for 24 hours, then removed. Any thoughts?
2010-02-08 - [WebmasterWorld.com...]

"Re-inclusion request processed" response taking longer than normal?
2010-01-07 - [WebmasterWorld.com...]

I've lost 80k hits a day from Google since this problem began.
2009-12-07 - [WebmasterWorld.com...]

There appears to be a pattern of challenges that are fairly recent. When you look at the crawl activity of your site in GWT, is it very erratic - lots of peaks and an overabundance of dives? You've got a Google Black Cloud hanging over that site. :|

flicky




msg:4167284
 5:24 pm on Jul 9, 2010 (gmt 0)

Regarding the site-wide navigation that I added.

This only appears on the actual category pages. Not all 2000 pages of the site.

So each category has a link to choose an additional category. Pretty standard navigation. Once you dig deeper into the site, the navigation does not exist.

In essence I added this to only a handful of pages.

flicky




msg:4167289
 5:31 pm on Jul 9, 2010 (gmt 0)

flicky, the site that you are referring to, is it the same one you've had challenges with recently?

YES, same site. The issues I reported earlier were completely resolved via a re-inclusion request.

There appears to be a pattern of challenges that are fairly recent. When you look at the crawl activity of your site in GWT, is it very erratic - lots of peaks and an overabundance of dives? You've got a Google Black Cloud hanging over that site. :|

Yes, crawl activity is very erratic.

I do feel like I have a black cloud over this site!

flicky




msg:4167290
 5:33 pm on Jul 9, 2010 (gmt 0)

Hey flicky,
Just a couple of quick checks and questions for you:
1. Have you viewed your robots.txt file? Did it change?

nothing changed

2. What about your meta robots file? Is there a no-index?

all good there... nothing changed

3. When you redesigned the site, did the URLs change?

No, all stayed the same.

4. Is your site still listed in Yahoo? Bing?

My site has seen a SIGNIFICANT increase in both bing and most noticeably yahoo over the last two weeks.

ponyboy96




msg:4167368
 7:18 pm on Jul 9, 2010 (gmt 0)



Wow! 99% of the time, that is the issue. Pageone's post definitely shed some light though. You may have been on thin ice before the redesign and that was enough to put you through. I'm definitely interested in hearing about what you find out.

jexx




msg:4167397
 8:31 pm on Jul 9, 2010 (gmt 0)

You may have been on thin ice before the redesign and that was enough to put you through. I'm definitely interested in hearing about what you find out.


Is there a pattern here - have others who saw this also had similar problems in the past, or has this happened to sites without the kind of history this domain has had with Google?

13 year old site. Using 13 year old code.


flicky, does that mean this was a mainly static site with minimal content changes?...

IF SO, as opposed to Chrispcritters and Atomic:

I recently moved a 10 year old authority site out of a CMS package.

Changed the order of the html code (nav moved from top of HTML to bottom), changed every URL (301'd every page), went extensionless, cleaned every single page's html to be strict (no errors at all), most pages saw changes in title and description, and saw no drop in ranking.


About two years ago I redesigned several of my sites so they were tableless, pure CSS etc. My rankings climbed steadily.


my thinking here is that PERHAPS if both content and code remained the same, or close to the same, for extended periods, the algo would easily be able to detect the magnitude of changes through diffs over time... then all of a sudden (even though content remained the same), there's an increase in incremental diffs (algorithmic assumptions here) between old and new pages.

thus, as opposed to Chrispcritters (assuming that use of a CMS meant he changed content continually?), your "change" would appear out of the norm for your 13-year-old, relatively unchanged site, while other sites with continual changes (code or content) would have a history of incremental changes..

so, in other words... methinks it's not about the slow or fast pace of changes, but about the history of what you've done in the past and not diverging from that pattern significantly.

jrivett




msg:4167411
 8:59 pm on Jul 9, 2010 (gmt 0)

Maybe some content is hidden to googlebot by the CSS.

That's impossible. Adding display: none to a page element only hides the appearance of content to end users -- it still shows up when viewing source. Since Googlebot simply reads a site's HTML content, it will still see the content.

And yet Google says: "Hiding text or links in your content can cause your site to be perceived as untrustworthy since it presents information to search engines differently than to visitors. Text (such as excessive keywords) can be hidden in several ways, including:

* Using white text on a white background
* Including text behind an image
* Using CSS to hide text
* Setting the font size to 0"

Link to source: [google.com...]

blend27




msg:4167426
 9:29 pm on Jul 9, 2010 (gmt 0)

And yet Google says: * Using CSS to hide text


There is nothing wrong with that when it comes to CSS MENUS or MULTIPLE TABS used to organize the content in an aesthetic way. I have several sites that have ranked for a couple of years now for text that is not shown in the first loaded tab.

BTW, that statement was on that page long before YouTube, Facebook and Twitter existed, so don't overreact to it.

jrivett




msg:4167438
 9:49 pm on Jul 9, 2010 (gmt 0)

There is nothing wrong with that when it comes to CSS MENUS or MULTIPLE TABS to Organize the content

You're probably right, but it's best to be careful since Google is clearly wary of hidden text. And it's certainly something the OP can check.

BTW, that statement was on that page long before YouTube, Facebook and Twitter existed, so don't overreact to it.

If you say so. All I know is that the page was last updated on 6/10/2009 and it's been much referenced elsewhere.

Chrispcritters




msg:4167441
 9:52 pm on Jul 9, 2010 (gmt 0)

jexx,

No, once content was added to the CMS it generally did not change. Just additional pages being added.

ferfer




msg:4167458
 10:55 pm on Jul 9, 2010 (gmt 0)

I just discovered I made a mistake and my previous post was wrong - I messed up the stats; the traffic has not changed for the section...

Recently I deleted an include used on all pages in one section of my old site (general interest articles). That include was just a little javascript to show an email address without exposing it to bots, one adsense unit, and one other plain internal link.

Exactly from the day those pages were re-spidered, almost a week ago, that section lost 66% of its Google traffic.

PS. I made the exact same change to another, smaller section, but that one wasn't affected.

johnnie




msg:4167487
 12:21 am on Jul 10, 2010 (gmt 0)

Have a look at your redirects. Are you redirecting from http to https using a 302 status code? Are you redirecting from '/' to '/home/' using a 302 status code?

If so, change the 302 into a 301, or even better... avoid the redirect altogether. It can make all the difference.

mlemos




msg:4167850
 10:37 pm on Jul 10, 2010 (gmt 0)

I suspect Google does not parse your external CSS. Therefore, if you changed tags like

<font size=2>keywords</font>

to

<span class="font-size-2">keywords</span>

it is natural that you would lose ranking for the "keywords".

It may be too late for you, but it would be better if you changed to:

<h2 class="font-size-2">keywords</h2>

UserFriendly




msg:4167916
 2:18 am on Jul 11, 2010 (gmt 0)

I think it's probably the moving of the navigation links to the "left" (I assume you mean that the actual markup for the links now comes before the main text, whereas before it was after the main text.)

Have you tried moving the navigation box back to a position after the main text? You can use CSS to position it wherever you like the look of it, but remember to take a look at your pages in "user styles mode" so that you can see how your page reads to users without graphics and CSS enabled (usually people with sight impairment). Perhaps offer a link down to the nav links for such users, and then use CSS to hide that link for CSS-enabled visitors.
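Something along these lines, for example (the class and id names are only placeholders):

<!-- near the top of the page, roughly where the nav links used to sit -->
<p class="skip"><a href="#sitenav">Jump to the site navigation</a></p>

<!-- main text ... -->

<!-- navigation now comes after the main text in the source -->
<ul id="sitenav">
<li><a href="/about/">About</a></li>
<li><a href="/contact/">Contact</a></li>
</ul>

and in the stylesheet:

.skip { display: none; } /* hidden for CSS-enabled visitors, still there in the text-only view */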

And is your main text nicely broken up with suitable h2 and h3 headings? (I'm pretty sure Google gives keywords in these headings greater weight.)

flicky




msg:4168093
 3:33 pm on Jul 11, 2010 (gmt 0)

Noticed the following this morning... any meaning to it?

When doing a search for the entire title of my page, I do not show up in the serps.

When doing a search for the following, I do show up #1 as expected:

"keyword domain.com"

the weirdness continues

Planet13




msg:4168137
 7:06 pm on Jul 11, 2010 (gmt 0)

When doing a search for the entire title of my page, I do not show up in the serps.


Can you elaborate a bit?

is this the home page we are talking about? If not, how many clicks away from the home page is it?

How many words are in the title of that page? Are those words particularly unique?

When you search for the page title, what DOES google return instead?

Planet13




msg:4168139
 7:10 pm on Jul 11, 2010 (gmt 0)

Not to hijack this thread, but...

@UserFriendly

Have you tried moving the navigation box back to a position after the main text? You can use CSS to position it wherever you like...


Would you have a link to a tutorial on how to do this?

tedster




msg:4168142
 7:30 pm on Jul 11, 2010 (gmt 0)

Here's an overview of one approach. The details will vary according to the page template you are modifying.

1. Leave empty space in the layout flow, where the navigation div will finally appear on screen.

2. Place the code for the navigation div at the bottom of the body element.

3. Use the CSS absolute positioning rules to display that navigation div "on top of" the blank space you left in the regular flow.
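A bare-bones sketch of what that can look like (the ids, widths and link URLs are placeholders; the details depend on your template):

<div id="page"> <!-- position: relative, so the nav can be positioned against it -->
<div id="content">
<!-- main content comes first in the source -->
<h1>Page heading</h1>
<p>Main text...</p>
</div>
<div id="nav"> <!-- navigation comes last in the source -->
<ul>
<li><a href="/section-one/">Section one</a></li>
<li><a href="/section-two/">Section two</a></li>
</ul>
</div>
</div>

and the matching CSS:

#page { position: relative; }
#content { margin-left: 200px; } /* step 1: leaves empty space in the normal flow */
#nav { position: absolute; top: 0; left: 0; width: 180px; } /* step 3: drops the nav into that space */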

You can research it here with our Site Search [webmasterworld.com] by looking for "source ordered content". If you want to track down one member, pageoneresults has been a frequent poster on the topic.

I don't think the "source ordered content" approach is nearly the SEO help that it was a few years ago, by the way. I stopped using it, routinely at least, about two years ago.

<added>
This could make for a good new discussion in our CSS Forum [webmasterworld.com] so we don't hijack this Google SEO thread too far off topic.

[edited by: tedster at 8:10 pm (utc) on Jul 11, 2010]

Planet13




msg:4168159
 8:09 pm on Jul 11, 2010 (gmt 0)

Thanks for the tips, Tedster!

SuzyUK




msg:4168172
 8:42 pm on Jul 11, 2010 (gmt 0)

"source ordered content"


It does work, but perhaps not to the extreme that some might want or think is best for SE bots

the reason such extreme "source ordering" might not "work" any more is that it doesn't make sense to a USER to stick your "nav" outside the <body> or on the extremes of it.. never mind a SE bot

content first = yes, as that's what a user would expect to see, as a basic right - as opposed to a 300 link deep menu

if your nav menu has dropdowns holding all 300 links but they're not visible on the immediate screen/page load, it might well *p* off a text/phone USER, as well as an SE bot, to have to "read" through all that - especially if they're viewing on a device that makes them scroll through it, i.e. a non-screen device!
This is increasingly common and I believe it is a factor in the algo, Google's especially - hence the warning/FUD/OMG reports about CSS/Javascript visibility. CSS "hiding" is not illegal, it never has been; it's there for use (and abuse) as a right of valid coding. It is just common sense on where, when & how to use it - no one I know ever got "banned" for using W3C-valid VISIBILITY/DISPLAY properties PROPERLY

I'll say it again: look at your page in plain text view, then imagine that is what you were 'seeing' when trying to access it on your phone (a cheap phone, or a browser without Javascript or CSS, a text reader, etc.) - now, what do you want to "see" first? (and no, this isn't a trick accessibility question, it really is a USER question!)

to me, it's a few basic jump/skip links first, i.e. give me the facility to skip to the nav (if I want to!), then I want content, as that's probably why I came to your site in the first place, whether through an SE search or after adding you via RSS feed! Then below that the 300-link-deep site menu - I might investigate that if the content did its job. Then feel free to stick your footer info/links in just in case, and perhaps you might want to stick in your adverts and CSS-position THEM visually "above the fold" for PC users. I am sorry that may not be what many would want to hear, but it is from a USER point of view - and what do you think G is doing? They're using/reading CSS the way it was always meant to be used, so while it can be abused I think they know that by now ;)

today's sites make it easy to "subscribe"; they've bred a new generation of followers based on loyalty, not SEO. In the same vein, those who actually know how to "subscribe", know what twitter is, have feed readers and "know how to use 'em" ;) probably already know how to avoid the "little SEO tweaks" (OP, did your site suddenly gain an RSS feed?). At most your subscribers/readers/SE-attracted visitors will want to make A QUICK SCAN/SKIM OF YOUR CONTENT, but you should still be able to optimise as best you can (if you're flexible enough) for the screen readers, by placing your ads "in their face", as it's no longer a "hotspot" world ;) .. and that is where CSS "source ordering" can come into its own: one template that is flexible from the start will ensure you never have to change the order, and you can CSS-position "bits of content" to your heart's content..

leadegroot




msg:4168418
 9:47 am on Jul 12, 2010 (gmt 0)

When doing a search for the entire title of my page, I do not show up in the serps.

I think you should be looking beyond the CSS change.
Is the webmaster tools unstuck yet?
If not, do your logs show googlebot crawls with a 200 result and the correct filesize?
I'm wondering if something is stopping the bot getting in there - it can happen :(

idolw




msg:4169103
 9:12 am on Jul 13, 2010 (gmt 0)

Regarding the navigation links... yes, I did add a site-wide navigation bar that did not exist before. It contains about 20 links, one to each section of my directory-based site. I don't see how that could trash things though. If anything, it makes it easier for googlebot to discover the other important sections of my site. Each section has always enjoyed its own favorable results within Google.


This ends the case in my opinion. Some of your pages (including the home page) were probably ranking based on lots of internal link love from other pages. Now that you have added 19 more links (I bet optimised for anchor text), the home page receives way less link juice.
I would try to re-think the navigation to limit the number of new links, so that link juice is not spread so thinly.
Or wait for Google to figure everything out as Matt Cutts suggests.

Hugene




msg:4169245
 2:03 pm on Jul 13, 2010 (gmt 0)

To add some fire to the <table> vs <div> formatting debate: I just read a technical post by the BBC about the semantic engine they built for their World Cup pages ([bbc.co.uk ]), and the thing is so complicated it makes my little index.php-powered pages look like BASIC "Hello World" programs (and I didn't understand much of how it actually works).

Yet, the actual page that is semantically generated: [news.bbc.co.uk ], still uses <table> for formatting.

If it ain't broke, don't fix it.

Incremental changes and fixes are much better IMHO. Fix little things first, it can be done quicker and it gives G time to adjust. And if you have to back out a little change, that's way easier.

londrum




msg:4169253
 2:29 pm on Jul 13, 2010 (gmt 0)

i had a look at one of those BBC pages and it loads up 10 different CSS stylesheets, plus a couple of optional ones for IE6 and IE7, plus a big chunk of on-page in-the-head CSS, followed by 5 javascripts... just in the <head>.
the page itself has 12 separate <tables> on it, 9 of which appear to be layout tables.

the page does look good, but it seems that for every simple-looking list of links or data table they add to the page, like 'leading scorers', they are forced to include an extra CSS stylesheet and javascript in the head.

fabulousyarn




msg:4169483
 8:38 pm on Jul 13, 2010 (gmt 0)

Yes, especially if there were added navigation menus put in front of the content. I'd like to hear about this too - I just did the same - popped a little ole CSS menu in there, in addition to my side nav. It was post-Mayday and it didn't seem to affect my rankings - but I'm keeping an eye on it.

Regent




msg:4169565
 10:35 pm on Jul 13, 2010 (gmt 0)

I don't typically participate in forums - rather just read them lots for information, but this Google issue seems to be an exception as I have seen this condition several times over the past several years.

Although it is just a theory, you can make of it what you want.

I started noticing this kind of Google reaction to pages that had significant content changes around the time Google started to implement their "near-duplicate" content filtering. If the page changes were modest and the keyword changes relatively modest, Google would not blink. Google would re-rank the page within a few weeks depending on PR value. Higher PR values usually responded faster. But when there were more substantial changes (my experience is limited to copy changes rather than structure changes), Google would kill the page for a period of time.

My theory is that more substantial changes cause Google's near-dup content filter to re-assess the page as if it were from scratch. And as a safeguard, Google kills the page while this near-dup processing takes place. Once the near-dup processing is completed and Google is satisfied that the pages have unique content, rankings return.

Remember, this is just a theory but it has served me well for the past few years.

If Google's pattern is consistent, I would expect rankings to return within the 2-4 week period as it seems Google applies a higher priority to their near-dup content filtering process for pages with higher PR value.

But there could be complications in Google's near-dup content process. Based on some of the older Google patents covering near-dup content, PR "may" be used to decide which page receives ranking over another found to be in the same "cluster". If the URLs have changed and Google is re-assessing PR values for each new URL, then other content on the web with near-dup content could outrank the new URLs. In this case, Google could take longer to sort everything out and apply 'correct' ranking value to the new pages.

DISCLAIMER: causality and correlation are two different things. Just because I observe some correlation does not mean there is causation. Frogs and umbrellas have a strong correlation: they both come out in the rain. Frogs don't cause umbrellas, and vice-versa.

:-)
dgj

jimorandy




msg:4169587
 11:26 pm on Jul 13, 2010 (gmt 0)

Simple.
The drop in rankings is because the order of content/links on the page has changed. If you take the engine from the front of your car and stick it in the trunk, it may work better, but it sure will take some getting used to. I'm pretty sure your positions will come back, but not before Google re-spiders your whole website and recalculates PR for it. Even then, the PR (link) profile for some pages will change. Therefore, the power of those pages will change, and they will not necessarily rank for the same things they did before...

webdevfv




msg:4169900
 10:10 am on Jul 14, 2010 (gmt 0)

I agree with Regent regarding duplicate content penalty.

If you have any affiliates or shopping comparison sites etc. who carry any of your code/content, it may now appear to Google that they are older / more trusted / the originators of the content, so be careful when making big changes to a website.

I'm sure something similar happened to one of my sites and it's never truly recovered all the rankings it had even though a number have come back as I've cut back and made changes to our product feeds.

tootricky




msg:4169909
 11:16 am on Jul 14, 2010 (gmt 0)

I'd check that you aren't using display:none; in a non-accessible way. I've seen many times that Google can completely ignore content contained in display:none; block elements. They hate it.

If you want to hide content (for tabbed content etc.), ensure the content is shown on load and is only hidden with JS afterwards, or something similar.
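A rough sketch of that pattern (the ids are placeholders) - both blocks are present in the HTML and visible by default, and the secondary one is only hidden once the page has loaded:

<div id="tab-overview">Overview content...</div>
<div id="tab-reviews">Reviews content...</div>

<script type="text/javascript">
// hide the secondary tab only after load; with JS off it simply stays visible
window.onload = function () {
document.getElementById('tab-reviews').style.display = 'none';
};
</script>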

Basically as someone has already said, if js is off, does the page load as a user (including the big G) would want?

sf49ers1983




msg:4170183
 8:23 pm on Jul 14, 2010 (gmt 0)

We actually had a similar experience, with lost Google rankings and pages, after implementing the "canonical" attribute on our pages. We had noticed that Google had indexed several versions of the same page for many of our pages, each differing in either capitalization or affiliate tagging (for analytics purposes). We also noticed we had tons of inbound links pointing at different versions of the same page, creating different PageRank for each. So, wanting all of those inbound links to count toward a single, stronger page, we implemented the canonical attribute, which Google suggests doing and which is supposed to tell Google what the real page is. Within a week, all of our 2,000 or so product pages were dropped from the index and our rankings tanked. The changes were implemented about 7 weeks ago. Since the great and dreadful day of losing all of our indexed pages, we have been recrawled and re-indexed. Our rankings are slowly starting to regain their lost ground, though no ranking has yet improved on its pre-change position. So my advice, although hard to accept, is to probably just let it go for now.
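For anyone unfamiliar with it, the canonical attribute is just a link element in the head of each duplicate variant, pointing at the preferred URL (placeholder URL here):

<link rel="canonical" href="http://www.example.com/product/blue-widget">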

Google, from what I understand, attempts to mimic the real-world business atmosphere: when businesses make large changes, those changes may or may not be good ones. Google then decides whether they are good changes by seeing if links change, if pages can still be found like they used to be, if content on the pages has changed, etc. So that might also explain the delay.
