
Forum Moderators: Robert Charlton & goodroi


Changed from OSC to Magento - All subpages gone from serps

     
12:13 pm on Apr 21, 2015 (gmt 0)

New User

joined:Apr 21, 2015
posts:2
votes: 0


Hi there,

I've read around a bit about this already, but my question concerns one specific point that I haven't found much information about.

About five weeks ago we switched from osCommerce to Magento, as OSC wasn't really doing the job anymore and we want to use software that syncs with eBay etc.

Anything targeted in the header tags of the home page is still holding the same rank, but every subpage has dropped completely out of the SERPs.

For example

cheap shoes
cheap shoe
shoes canada
cheap mens shoes

have all pretty much retained their rank from before; it's the home page that is still attracting traffic from these searches,


whereas

pink shoes
green shoes
dark blue shoes

have all dropped from the top 10 to outside the top 100.


From everything I've read so far, these should come back in time, but the timeframe generally quoted was around four weeks tops.

Unfortunately I slacked on doing the 301 redirects, as I didn't quite get the gist until it was too late, so now I am waiting for these pages to be reindexed as they were before.

Will this happen 'in time', as I've read?


The worry I have is that my url structure has changed a bit.


Before we had website.com/mens-shoes.php

Now it's more like website.com/shoes/mens.html



The thing I question (my designer is not an SEO expert) is that we don't have any pages that open at the /shoes/ level.

For example, you cannot click directly on the shoes menu, and nothing resides at www.website.com/shoes


Hovering over the shoe menu, the link is javascript:void(0), so it cannot be clicked, but the submenus pop out on hover.



So my big question is: will this interfere with the SEO of the site?


Thanks in advance for your help.

Tina
7:09 pm on Apr 21, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:13007
votes: 222


Um, these are things that should all have been figured out and mapped out before you launched. You have a lot going on there. You need to get your redirects done. You probably should be using rel=canonical. There are a number of pages that you should noindex or block in robots.txt (for example, in Magento I block anything to do with customer accounts or the shopping cart, search results pages, tags, reviews, and the sorting). You probably want to go into Google Webmaster Tools and define your URL parameters. You will want a Google sitemap, but you need to make sure it doesn't include any pages that you are blocking elsewhere.
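As a sketch of the rel=canonical suggestion (the URL here is a made-up placeholder): a filtered or sorted variant of a category page would carry a link in its head pointing back at the clean URL, so Google consolidates the duplicates onto one page:

```html
<!-- In the <head> of a hypothetical filtered URL like /shoes/mens.html?dir=asc -->
<link rel="canonical" href="http://www.example.com/shoes/mens.html" />
```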

In short, there's a lot to do with Magento. There are some SEO type extensions that will get some of it done for you. But first you better fix your redirects.

The last time I did an OSC -> Magento launch, I think it took about 3-4 weeks to get most things properly reindexed and settled down, and by eight weeks out it was all good. I left the 301s in permanently, because Google has never met a URL it really forgets, and the old ones still pop back into GWT every now and then. Probably from links from scraper sites.
11:38 am on Apr 22, 2015 (gmt 0)

Preferred Member from GB 

10+ Year Member Top Contributors Of The Month

joined:July 25, 2005
posts:404
votes: 16


slacked on doing 301 redirects as I didn't quite get the gist until it was too late

It's never too late to do 301s, and you may still be able to recover. Not only will they help Google better understand your new site; by getting the 301s in place you also get to keep your old 'link juice', which you would otherwise lose.
Say somebody used to link to this page: example.com/mens-shoes.php. That link would no longer help you unless you do the 301s.
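For example, a 301 in Apache's .htaccess might look like this (the paths are placeholders; the real mapping depends on your old and new URL structures):

```apache
# Hypothetical old-osCommerce-URL -> new-Magento-URL mapping
Redirect 301 /mens-shoes.php http://www.example.com/shoes/mens.html
```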

Some awesome tips from @netmeg, I'd just add a little trick I've used on a Magento site I recently finished.
To keep the sorting/options/filters pages out of the bots' reach, you can add this to robots.txt:
Disallow: /*?*=*

In my experience, the filter pages are the biggest Magento stumbling block. One moment you create 50 product pages, and the next moment you've got 300 pages of duplicate content :)
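To sanity-check what a wildcard rule actually catches before deploying it, you can emulate Google-style pattern matching. (Python's built-in urllib.robotparser does not understand `*` wildcards, so this is a hand-rolled sketch; the URLs are made up for illustration.)

```python
import re

def robots_match(pattern: str, path: str) -> bool:
    """Google-style robots.txt matching: '*' matches any run of
    characters, '$' anchors the end, and rules match as prefixes."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

# The suggested rule blocks any URL carrying a key=value query string
print(robots_match("/*?*=*", "/shoes/mens.html?dir=asc"))  # True: filtered page
print(robots_match("/*?*=*", "/shoes/mens.html"))          # False: clean page
```

In practice the most specific (longest) matching rule wins between Allow and Disallow, but for a pure blocklist like this, a simple prefix-with-wildcards check is enough to preview the effect.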

Also, about your hierarchy: so you're saying example.com/shoes/ is blank?
Your category pages are usually very important from the customer-journey point of view. Look at how big retailers use their main category pages to display compelling content and entice people to browse further.
You can create static blocks of custom content (think big images and compelling textual content)
CMS -> Static Blocks
and then apply the Static Blocks via
Catalog -> Manage Categories

the shoe menu the link is javascript:void(0)

Not ideal. Have you given Google an alternative route to be able to navigate and understand your site?
Sounds like you'll have to invest a little bit of $$$ in a custom 'Skin'
12:37 pm on Apr 22, 2015 (gmt 0)

New User

joined:Apr 21, 2015
posts:2
votes: 0


Love the replies, you guys... thanks so much.

I did all the 301s I could muster (basically the colour pages and category pages) from the old site to the new, plus the ones that were gone. I also did some site: searches and found URLs that I could still map products to.

Regarding the disallows, I added this section to robots.txt immediately:


## MAGENTO SEO IMPROVEMENTS

## Do not crawl sub category pages that are sorted or filtered.
Disallow: /*?dir*
Disallow: /*?dir=desc
Disallow: /*?dir=asc
Disallow: /*?limit=all
Disallow: /*?mode*

## Do not crawl the second home page copy (example.com/index.php/). Use only if you have activated Magento SEO URLs.
Disallow: /index.php/

## Do not crawl links with session IDs
Disallow: /*?SID=

## Do not crawl checkout and user account pages
Disallow: /checkout/
Disallow: /onestepcheckout/
Disallow: /customer/
Disallow: /customer/account/
Disallow: /customer/account/login/



I guess it doesn't hurt too much to put my site in here? Maybe it would help you better understand what I mean regarding the hierarchy.

I do like your idea of making it clickable with compelling custom content, but then it gets a bit tricky on the iPad, as users would go to the category instead of opening the submenu, unless I can code the menu separately for the iPad.

> Also, about your hierarchy. So, you're saying example.com/shoes/ is blank?


Yes, customers can't access it, but the bots can, I guess? A silly mistake, now that I think about it!
6:58 pm on Apr 22, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15313
votes: 707


Disallow: /customer/
Disallow: /customer/account/
Disallow: /customer/account/login/
If that's someone else's recommended ruleset, I recommend getting your rules from someone else in the future. The subdirectory /customer/account/ is already included within the directory /customer/. And /customer/account/login/ is included within /customer/account/, which in turn is included within /customer/. A robots.txt Disallow applies to the entire directory tree, not just to files at its top level.
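In other words, the three customer rules collapse into a single one; the parent-directory prefix already covers everything beneath it:

```
User-agent: *
Disallow: /customer/
```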
8:36 pm on Apr 22, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:13007
votes: 222


(Don't put your site in here, it's against the TOS)

Here's a snip out of one of the Magento sites I oversee. Some of this stuff won't apply to you because it's custom to the site. I can't use adder's suggestion because we do have some pages that are configured with ?= parameters from an add-on someone custom-built.

# global
User-Agent: *
Disallow: /catalogsearch
Disallow: /checkout
Disallow: /customer
Disallow: /review
Disallow: /sales
Disallow: /wishlist
Disallow: /category
Disallow: /index.php
Disallow: /upload
Disallow: /facebook
Disallow: /report
Disallow: /product_compare
Disallow: /sitewideimages
Disallow: /onlinecatalogs

# wildcard
User-Agent: *
Disallow: /*page
Disallow: /*shipping_time
Disallow: /*rsort
Disallow: /*cat=
Disallow: /*price=
Disallow: /*SID=
Disallow: /*low_price
Disallow: /*option_id
Disallow: /*limit=
Disallow: /*dir=
Disallow: /*order=
Disallow: /*sendfriend
Disallow: /*mode=
Disallow: /*origin=
10:01 pm on Apr 22, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:13007
votes: 222


(Oops, that should say out of the robots.txt for one of my Magento sites. My fingers aren't as quick as my brain.)
10:48 pm on Apr 22, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15313
votes: 707


If your robots.txt has two separate blocks naming the same user-agent (in this case *, "everyone else"), can you be certain that ordinary compliant robots will heed both?
12:20 am on Apr 23, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:13007
votes: 222


So far they seem to, but you're probably right. I've just combined the two into one block.
 
