
Boogled...

Google is dropping my pages from search engines after I changed the design

   
3:37 pm on Jul 21, 2004 (gmt 0)

10+ Year Member



I would really appreciate anyone's suggestions regarding rapidly losing top positions in Google. Pages that used to rank #1 are no longer appearing at all.

There are a few factors that could be causing this:
1. We just changed our site design. The content has remained the same. The new design was constructed carefully to ensure no violations of Google's rules and to be standards compliant.
2. Many of the URLs for site pages changed. However, all the old addresses now return a 301 (Moved Permanently) redirect pointing to the new URL (a minimal example of such a redirect follows this list). This actually netted us duplicate entries on Google for about two weeks (with both the old and new pages showing). However, now that Google appears to have removed the 301'd pages, the new pages are also disappearing - AAAAGH!
3. We removed a sitemap page, choosing instead to put links to all the product pages in the footer of the homepage.
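For reference, the redirects are of this general shape (a simplified sketch, assuming Apache's mod_alias in an .htaccess file - both paths are placeholders, not our real URLs):

    # 301 (Moved Permanently): old flat URL -> new descriptive URL.
    # "Redirect permanent" is Apache mod_alias syntax; the paths
    # below are invented for illustration.
    Redirect permanent /productname http://www.example.com/products/radars/productname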

Any suggestions are greatly appreciated.

[edited by: agerhart at 3:46 pm (utc) on July 21, 2004]
[edit reason] removed URL [/edit]

1:17 pm on Jul 22, 2004 (gmt 0)



I would suggest not having a sitemap is a bad move.
2:15 pm on Jul 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>I would suggest not having a sitemap is a bad move.

Not necessarily; the home page now doubles as a sitemap:

>>We removed a sitemap page, choosing instead to put links to all the product pages in the footer of the homepage.

Doesn't have to be labeled a sitemap to actually be one. Works great for this site [webmasterworld.com]
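Something as simple as a plain block of text links in the homepage footer does the job (a hypothetical sketch - the page names are invented):

    <!-- Footer link block doubling as a sitemap: one plain <a href>
         per product page, crawlable with no JavaScript required. -->
    <div id="footer-links">
      <a href="/products/radars/widget-a.html">Widget A</a> |
      <a href="/products/radars/widget-b.html">Widget B</a> |
      <a href="/products/radars/widget-c.html">Widget C</a>
    </div>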

2:39 pm on Jul 22, 2004 (gmt 0)

10+ Year Member



301s can take some time to take effect. I would suggest continued patience.

Another factor could be moving all of your links to your footer. Proximity matters and having all your links as the last thing on your page might be causing you trouble. Sometimes it's a really good idea to put your links at the bottom of a page, other times not. But I'd wait another few weeks for the 301s before doing anything radical, particularly if your pages have "disappeared" rather than just suffered a drop in the SERPs.

3:16 pm on Jul 22, 2004 (gmt 0)

10+ Year Member



Thanks Karmov,

That is what was most shocking. Many pages literally just dropped off Google entirely. As I said in my first post, first we had both the old and new pages on the results page, then just the new, and now nothing.

The odd thing is that the old site was getting great positions (many #1's for key keywords). It also had some HTML code that I would have considered dubious (for example, a transparent <div> that covered the entire page content, used purely for a JavaScript feature to detect where the user's mouse was, which I thought could still be problematic). The new site gets rid of that <div> and attempts to be as straightforward and standards-based as possible.
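For the curious, the old markup was along these lines (a reconstruction from memory, not the exact code - the names are hypothetical):

    <script type="text/javascript">
      // Track the cursor position for the JS feature.
      function trackMouse(e) {
        e = e || window.event;            // IE vs. standards event model (2004-era)
        var x = e.clientX, y = e.clientY;
        // ...the feature used the coordinates here...
      }
    </script>
    <!-- Transparent overlay covering the entire page content,
         existing only so the mousemove handler fires everywhere. -->
    <div style="position:absolute; top:0; left:0; width:100%; height:100%;"
         onmousemove="trackMouse(event)"></div>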

I will wait a while and pray the listings return. I'm just frightened that we may have unwittingly done something that got us blacklisted!

[edited by: DaveAtIFG at 7:58 pm (utc) on July 24, 2004]
[edit reason] URL removed [/edit]

3:30 pm on Jul 22, 2004 (gmt 0)

10+ Year Member



Welcome to WebmasterWorld, erk01.

You might want to validate your HTML; the answer to your problem may lie there. Also, if you want people to see your page, just put the URL in your profile.

Good Luck

4:25 pm on Jul 22, 2004 (gmt 0)

10+ Year Member



I have to disagree about HTML validation being a factor - my sites are a mess. But despite not dotting the t's and crossing the i's (or is that the other way round ;), my Google positions are great, and the pages look fine on users' screens.

...and take a look at those high ranking university pages - do you think the prof cares (or even knows) about HTML validation?

Making good validation a part of the ranking algorithm would play directly into someone's hands (that someone being *us* - SEO types!)

So let's not propagate another old Google myth.

5:38 pm on Jul 22, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



HTML validation is not a "must" to get indexed or to get high rankings. I know it hurts the purists to hear this, but it is the truth.

As long as there isn't something horrible that stops a spider from crawling, everything will be fine.

It is a matter of function over form.

5:44 pm on Jul 22, 2004 (gmt 0)

10+ Year Member



I've used the Lynx viewer to test crawlability... is there a better method/tool?
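(For the record, what I've been running is roughly this, assuming the command-line Lynx - the domain is a placeholder:)

    # Dump the page as text and list every link Lynx can see;
    # a link missing here is likely invisible to a text-only crawler.
    lynx -dump -listonly http://www.example.com/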
5:56 pm on Jul 22, 2004 (gmt 0)



I have done this a few times. You have to be patient with the 301 redirects. There is nothing you can do but wait; it usually took Google two to four weeks to sort out my sites.
6:16 pm on Jul 22, 2004 (gmt 0)

10+ Year Member



When you changed the URLs, you lost all the PageRank on those pages, so it is normal for the new ones not to rank well.
6:27 pm on Jul 22, 2004 (gmt 0)

10+ Year Member



I don't disagree that the validation thing is a myth. I guess my point was that if you drop out of the SERPs after changing the HTML, there may be coding problems that make the pages problematic for Googlebot.

I too have sites that are just a mess and rank fine. That was my only point about validation.

6:39 pm on Jul 22, 2004 (gmt 0)

10+ Year Member



Google frowns upon sitemaps?
10:29 pm on Jul 22, 2004 (gmt 0)

10+ Year Member



erk01: Lynx is a pretty good way to check what the bots see; it's what I use. Also, since I missed it the first time:

Welcome to WebmasterWorld!

kosar: I've seen no evidence whatsoever that would suggest Google frowns upon sitemaps. Google tends to promote things that make sites easier for people to use, and sitemaps are often very useful to visitors browsing a site.

10:38 pm on Jul 22, 2004 (gmt 0)

10+ Year Member



In reference to the validation: though it's true that valid code is not required by Google, I think the idea is that if you've written valid code, you're less likely to have written the kind of tag soup that could trip up a bot and make it miss/skip your links. I'm in firm agreement on that point, and it's one of the reasons I use valid code on my site. There are more, but that's another topic :)
10:39 pm on Jul 22, 2004 (gmt 0)

10+ Year Member



Well, turn off JavaScript and try browsing the site -- there is no left menu, and I don't understand how you can say it's crawlable in Lynx. It's not. I think the JavaScript-only navigation is killing it.

Googlebot thinks that your internal pages are not linked from anywhere and has rightfully kicked them out of the index.

12:09 am on Jul 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Actually, the JS nav isn't the ONLY nav tool there; bottom-of-page text links exist. HOWEVER... if one has JS disabled, it's rather difficult to really WANT TO navigate with the text links, which seem to have little "coherence". It's a nice page, attractively laid out - but something needs to be done with the mish-mash of grey text links at the page bottom.

A somewhat larger font size would help, perhaps with a mouse-over change (from the static as-presented style to ALL CAPS BOLD, maybe). A more obvious mouse-over color change would help. A change from the "|" (pipe symbol) to a different symbol in a different color would help.

A CSS left menu might be the best solution. It circumvents the "no-JS" situation, as well as the "fade-to-nonentity" grey text links.
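A bare-bones sketch of what I mean (page names hypothetical, styling trimmed to the essentials):

    <!-- Plain HTML list styled as a left menu: crawlable with no
         JavaScript, and still readable if the CSS never loads. -->
    <style type="text/css">
      #leftmenu { float: left; width: 160px; }
      #leftmenu ul { list-style: none; margin: 0; padding: 0; }
      #leftmenu a { display: block; padding: 4px; color: #333; }
      #leftmenu a:hover { background: #eee; color: #000; }
    </style>
    <div id="leftmenu">
      <ul>
        <li><a href="/products/radars/widget-a.html">Widget A</a></li>
        <li><a href="/products/radars/widget-b.html">Widget B</a></li>
      </ul>
    </div>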

5:14 am on Jul 23, 2004 (gmt 0)

10+ Year Member



Many of the URLs for site pages changed. However, all the old addresses now return a 301 (Moved Permanently) redirect pointing to the new URL.

This is your mistake, erk. You should have put the new content on the old URLs and left the addresses alone. Depending on 301 redirects has never been a wise decision. What now:

1) Either accept that you will have to re-optimize your pages and wait some time before they get back to where they were, or
2) Reinstate your old URLs and put the new text there.

6:03 am on Jul 23, 2004 (gmt 0)

10+ Year Member



>>> This is your mistake, erk. You should have put the new content on the old URLs and left the addresses alone. Depending on 301 redirects has never been a wise decision.

Totally agree with Webnewton - this is most likely the main problem that caused your site to drop so dramatically.

2:01 pm on Jul 23, 2004 (gmt 0)

10+ Year Member



Bummer. I did this because I didn't want Google getting annoyed that I had two pages with the same content (the old URL and the new URL). If I reinstate the old URLs, how does that help me, since Google thinks they are permanently moved?

The new URL is (I think) better for SEO (e.g.: the old URL was example.com/productname [the product name being non-descriptive]; the new URL is example.com/products/radars/productname [radars being descriptive]).

Karmov: Thanks for the welcome!

vkaryl: The bottom nav is a bit arcane; however, 99.9% of our users have JS enabled. That being said, we did consider putting a cloned static menu under where the JS menu is rendered (so it would look better, etc.). However, we were concerned that having that menu under the JS menu might be considered devious. And I really don't know whether Google can detect a div that was created by JavaScript; if not, doesn't that seem to be a flaw in their method of detecting whether someone is hiding content under a div?

BTW: All this feedback is superb - thanks!

[edited by: ciml at 5:49 am (utc) on July 24, 2004]
[edit reason] Examplified [/edit]

3:22 pm on Jul 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



To find out whether search engines can follow your linking structure, go to:

[searchengineworld.com...] > SE Tools > Sim Spider

Type in your URL and then see if the pages indexed are the ones you'd expect.

I'd have to say kill the JavaScript menus to be honest.

I also notice you have no backlinks?!

[edited by: DaveAtIFG at 7:02 pm (utc) on July 24, 2004]
[edit reason] Linked URL [/edit]

3:45 pm on Jul 23, 2004 (gmt 0)

10+ Year Member



petehall,

By "kill the JavaScript menus", do you mean completely eliminate them? This was a huge feature request, to allow visitors to get around quickly from any page on the site. Couldn't I get away with just adding a site map as well?

Also, by back links do you mean that as I go deeper in the page hierarchy, I can also traverse back up the hierarchy (eventually all the way to the homepage)? If that is what you mean, I do have that in the footer links. Well - I started to say that - however, it appears the last step (to the homepage) is missing! Thanks for that catch!

5:35 pm on Jul 23, 2004 (gmt 0)

WebmasterWorld Senior Member caveman is a WebmasterWorld Top Contributor of All Time 10+ Year Member



What webnewton said. Those 301's can give people fits, even when brilliantly executed. We've lost traffic for anywhere from two to six months when making changes of the type you describe. Be patient, and just ensure that other potential issues are addressed in the meantime. Good luck.
6:14 pm on Jul 23, 2004 (gmt 0)

10+ Year Member



Oh, the shame! On a positive note, almost none of these pages were getting their positions from PageRank (the reason we felt OK going with the 301). We are now in a strong push to get our resellers to link from their sites to our product pages. Hopefully that will help.

I'm going to see how we can deal with the site map/validation/menu in the meantime.

11:38 pm on Jul 23, 2004 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Running the pages through [validator.w3.org...] isn't about making the code perfect; it rarely is. It is only to convince yourself that the code is not so bad that it is unspiderable, and to catch any major errors you may have missed - errors that might still render OK in a browser like IE, which doesn't care at all what mistakes you make in the code, but which might stop the spiders from indexing your site.
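A hypothetical example of the sort of error I mean - harmless-looking in IE, potentially fatal to a stricter parser (the page name is invented):

    <!-- The closing quote on href is missing. IE guesses and renders
         the link anyway; a stricter parser may swallow everything up
         to the next quote mark, so the link is never seen. -->
    <a href="/products/radars/widget-a.html>Widget A</a>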
12:13 am on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



[Beforehand caveat: I don't do SEO/SEM etc., so I don't know how this stuff falls out for that.]

Considering the stated caveat though, you should honestly be considering the USABILITY OF YOUR SITE - FIRST! Search engines etc. should come AFTER you make sure that your clients/potential customers/fly-by visitors are able to fully utilize your site.

Which is why I posted the stuff about your text-link setup. As far as usability goes, I really think you'd be better off to "fish" for whether the browser has JS disabled and, if so, swap the menu section for a text-link version set up as much like the JS one as possible. In other words, if the browser has JS disabled, load "textmenu" instead of "jsmenu" in that portion of the page.
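A minimal sketch of the idea - in fact the standard <noscript> element does the "fishing" for you (menu IDs and page names are hypothetical, and the script stands in for the real DHTML menu):

    <!-- Browsers with JS run the script-built menu; browsers with JS
         disabled render the plain links inside <noscript> instead. -->
    <div id="jsmenu">
      <script type="text/javascript">
        document.write('<a href="/products/radars/widget-a.html">Widget A</a>');
      </script>
      <noscript>
        <a href="/products/radars/widget-a.html">Widget A</a>
      </noscript>
    </div>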

11:23 am on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Those 301's can give people fits, even when brilliantly executed.

I would second that. Six months on, still waiting for Google to discover the 301!

Just relieved to see the Lynx viewer showed all my page links are crawlable. Sim Spider had indeed put me in a fix, for it showed that none of the links were crawlable.
Not sure which one to believe, though (?)

Mc

4:38 pm on Jul 24, 2004 (gmt 0)

10+ Year Member



Five years' experience:

Validation is simply not an issue - but obviously the bot has to be able to crawl! (Doh!) But this has as much to do with validation as a locked bank vault has to do with being a member of the Institute of Locksmiths.

JS Links - the jury is *apparently* out on whether they are crawled or not - but in my experience, they are NOT crawled - use standard links.

Robots.txt - don't bother unless you have a purpose for it.

Send all further Google urban myths to an urban myth website. The key lies in content and semantics (and in clicking on your competitors' stupid money-down-the-drain AdWords - sorry, just a joke ;-)

BTW, I concur: 301s - forget them - too dangerous; use a temporary redirect instead. The bloody Sun might implode before Google recognises your 301. In the meantime, your site has been de-listed and, given the timescale, you have died anyway.

6:58 pm on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Six months on, still waiting for Google to discover the 301!
Why wait? Tell Googlebot [google.com] where to find it.
 
