I would suggest not having a sitemap is a bad move.
>>I would suggest not having a sitemap is a bad move.
Not necessarily, the home page now doubles as a sitemap:
>>We removed a sitemap page, choosing instead to put links to all the product pages in the footer of the homepage.
Doesn't have to be labeled a sitemap to actually be one. Works great for this site [webmasterworld.com]
301s can take some time to take effect. I would suggest continued patience.
Another factor could be moving all of your links to your footer. Proximity matters and having all your links as the last thing on your page might be causing you trouble. Sometimes it's a really good idea to put your links at the bottom of a page, other times not. But I'd wait another few weeks for the 301s before doing anything radical, particularly if your pages have "disappeared" rather than just suffered a drop in the SERPs.
That is what was most shocking. Many pages literally just dropped off Google totally. As I said in my first post, first we had both the old and new pages on the results page, then just the new, and now nothing.
I will wait a while and pray the listings return. I'm just frightened that we may have unwittingly done something that may have blacklisted us!
[edited by: DaveAtIFG at 7:58 pm (utc) on July 24, 2004]
[edit reason] URL removed [/edit]
Welcome to Webmasterworld, erk01.
You might want to validate your HTML. The answer to your problem may lie there. Also, if you want people to see your page, just put it in your profile.
I have to disagree about HTML validation being a factor - my sites are a mess. But despite not dotting the t's and crossing the i's (or is that the other way round ;), Google position is great, and they look fine on the users' screens.
...and take a look at those high ranking university pages - do you think the prof cares (or even knows) about HTML validation?
To make good validation a part of the ranking would play directly into someone's hands (that someone being *us* - SEO types!)
So let's not propagate another old Google myth.
html validation is not a "must" to get indexed or get high rankings. I know it hurts the purists to hear this, but it is the truth.
As long as there isn't something horrible that stops a spider from crawling, everything will be fine.
It is a matter of function over form.
I've used lynx viewer to test the crawlability... is there a better method/tool?
I have done this a few times. You have to be patient with the 301 redirects. There is nothing you can do but wait, it usually took google 2-4 weeks to sort out my sites.
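For anyone setting this up, the thread doesn't say what server is involved, but assuming Apache with mod_rewrite, a permanent redirect of the kind described is a short sketch like this (the paths are hypothetical, for illustration only):

```apache
# Hypothetical paths; assumes Apache with mod_rewrite enabled.
RewriteEngine On

# Send an old flat product URL to its new location with a
# permanent (301) redirect, one rule per moved page.
RewriteRule ^productname$ /products/radars/productname [R=301,L]
```

The `R=301` flag is what makes the redirect permanent; without it, mod_rewrite issues a temporary 302 by default.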
When you changed the urls you lost all the pagerank on those pages so it is normal for the new ones not to rank well.
I don't disagree that the validation thing is a myth. I guess my point was that if you drop out of the SERPs after changing the HTML, there may be coding problems that make it problematic for googlebot.
I too have sites that are just a mess that rank fine. That was my only point about validation.
google frowns upon sitemaps?
erk01: Lynx is a pretty good way to check what bots see; that's what I use. Also, since I missed it the first time:
Welcome to Webmasterworld!
kosar: I've seen no evidence whatsoever that would suggest that Google frowns upon sitemaps. Google tends to promote things that make sites good for people to use and sitemaps are often a very useful tool that people use while browsing the web.
In reference to the validation, though it's true that valid code is not required for Google, I think the idea is that if you've written valid code you're less likely to have written some tag soup that could trip up a bot and have them miss/skip your links. I'm in firm agreement on that point and it's one of the reasons why I use valid code on my site. There are more, but that's another topic :)
Googlebot thinks that your internal pages are not linked from anywhere and has rightfully kicked them off the index.
Actually, the js nav isn't the ONLY nav tool there. Bottom-of-page text links exist. HOWEVER... if one has js disabled, it's rather difficult to really WANT TO navigate with the text links, which seem to have little "coherence". It's a nice page, attractively laid out - but something needs to be done with the mish-mash of grey text links at page bottom.
A somewhat larger font-size would help, perhaps as a mouse-over change (from the static as-presented to an ALL CAPS BOLD, maybe). A more obvious mouse-over color change would help. A change from the "|" (pipe symbol) to a different symbol in a different color would help.
A CSS left menu might be the best solution. Circumvents the "no-js" situation, as well as the "fade-to-nonentity" grey text-links.
|Many of the URLs for site pages changed. However, all the old addresses were replaced with a 301 - permanently moved error - and a new URL provided. |
This is your mistake, erk. You should have put the new content on the old URLs only. Depending on 301 redirects has never been a wise decision. What now:
1) Either accept that you'll have to reoptimize your pages and wait for some time before they get back to where they were before.
2) Or reinstate your old URLs and put the new text there.
>>> This is your mistake, erk. You should have put the new content on the old URLs only. Depending on 301 redirects has never been a wise decision.
Totally agree with Webnewton - this should be the main problem that caused your site to drop dramatically.
Bummer. I did this because I didn't want Google getting annoyed that I had two pages with the same content (the old URL and the new URL). If I reinstate the old URL, how does this help me since Google thinks they are permanently moved?
The new URL is (I think) better for SEO: the old URL was example.com/productname (the product name being non-descriptive); the new URL is example.com/products/radars/productname (radars being descriptive).
Karmov: Thanks for the welcome!
BTW: All this feedback is superb - thanks!
[edited by: ciml at 5:49 am (utc) on July 24, 2004]
[edit reason] Examplified [/edit]
To find out if search engines can follow your linking structure go to:
[searchengineworld.com...] > SE Tools > Sim Spider
Type in your URL and then see if the pages indexed are the ones you'd expect.
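If you'd rather run a quick local check than rely on a web tool, a bare-bones spider can be sketched in a few lines of Python using the standard-library HTML parser. It only reports plain `<a href>` links, which is roughly what a text-mode crawler sees; the sample markup below is invented for illustration and shows why a link written by JavaScript never gets picked up:

```python
from html.parser import HTMLParser

class LinkSpider(HTMLParser):
    """Collect href targets roughly the way a text-only spider would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record only plain anchor tags that carry an href.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Invented sample page: two plain links plus one written by JavaScript.
sample = """
<html><body>
<a href="/products/radars/alpha">Alpha</a>
<script>document.write('<a href="/hidden">JS-only link</a>')</script>
<a href="/about">About</a>
</body></html>
"""

spider = LinkSpider()
spider.feed(sample)
spider.close()
print(spider.links)  # the JS-written link never shows up
```

The parser treats everything inside `<script>` as raw text, so the spider reports only the two real anchors - the same blind spot real crawlers of the day had with js navigation.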
I also notice you have no backlinks?!
[edited by: DaveAtIFG at 7:02 pm (utc) on July 24, 2004]
[edit reason] Linked URL [/edit]
Also, by backlinks do you mean that as I go deeper in the page hierarchy, I can also traverse back up the hierarchy (eventually all the way to the homepage)? If that is what you mean, I do have that in the footer links. Well, I started to say that; however, it does appear the last step (to the homepage) is missing! Thanks for that catch!
What webnewton said. Those 301's can give people fits, even when brilliantly executed. We've lost traffic for between 2 and 6 months when making changes of the type you describe. Be patient and just ensure that other potential issues are addressed in the meantime. Good luck.
Oh the shame! In a positive light, almost none of these pages were getting position based on Page Rank (the reason we felt ok to go with the 301). We are now in a strong push to get our resellers to link from their sites to our product pages. Hopefully that will help.
I'm going to see how we can deal with the site map/validation/menu in the meantime.
Running the pages through [validator.w3.org...] isn't meant to ensure that the code is perfect, as it rarely is. It's only to convince yourself that the code is not so bad that it's unspiderable, and to find any major errors you may have missed - errors that might still render OK in a browser like IE, which doesn't care at all what mistakes you've made, but which might stop the spiders from indexing your site.
[Beforehand caveat: I don't do seo/sem etc. So I don't know how this stuff falls out for that.]
Considering the stated caveat though, you should honestly be considering the USABILITY OF YOUR SITE - FIRST! Search engines etc. should come AFTER you make sure that your clients/potential customers/fly-by visitors are able to fully utilize your site.
Which was why I posted the stuff about your text-link setup. I really think as far as usability, you'd be better off to "fish" for whether the browser has js disabled and if so, move the menu-section to a text-linked one set up like the js one as much as possible. In other words, if the browser has js disabled, load "textmenu" instead of "jsmenu" in that portion of the page.
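Picking up that "textmenu"/"jsmenu" idea, one common way to sketch it is to keep the plain text menu in the HTML by default and let a script swap in the fancy menu only when it actually runs. The markup below is hypothetical (the ids come from the post above, the link targets are invented); spiders and no-js visitors always get the crawlable version:

```html
<!-- Hypothetical markup: the text menu is always present in the HTML. -->
<div id="textmenu">
  <a href="/products/radars/alpha">Alpha</a> |
  <a href="/products/radars/beta">Beta</a>
</div>

<!-- The scripted menu starts hidden; its contents are omitted here. -->
<div id="jsmenu" style="display: none;">
  <!-- js menu goes here -->
</div>

<script type="text/javascript">
// Runs only when JavaScript is enabled: hide the text menu
// and reveal the scripted one in its place.
document.getElementById("textmenu").style.display = "none";
document.getElementById("jsmenu").style.display = "block";
</script>
```

This inverts the usual failure mode: instead of "fishing" for disabled js and falling back, the crawlable links are the default and js is the enhancement.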
|Those 301's can give people fits, even when brilliantly executed. |
I would second that. 6 months on, still waiting for Google to discover the 301!
Just relieved to see lynx viewer showed all my page links are crawlable. Sim Spider had indeed put me in a fix, for it showed none of the links were crawlable.
Not sure which one to believe, though (?)
5 years' experience:
Validation is simply not an issue - but obviously the Bot has to be able to crawl! (Doh!) But this has as much to do with validation as a locked bank vault has to do with being a member of the institute of locksmiths.
JS Links - the jury is *apparently* out on whether they are crawled or not - but in my experience, they are NOT crawled - use standard links.
Robots text - don't bother unless you have a purpose for it.
Send all further Google urban myths to an urban myth website. The key lies in content and semantics (and clicking on your competitors' stupid money-down-the-drain AdWords - sorry, just a joke ;-)
BTW, concur, 301 - forget it - too dangerous, stick with your temp redirect. The bloody Sun might implode before Google recognises your 301. In the meantime, your site has been de-listed and, given the timescale, you have died anyway.
Why wait? Tell Googlebot [google.com] where to find it.
|6 months on, stil waiting for Google to discover the 301! |