Forum Moderators: Robert Charlton & goodroi
> I set up redirects through .htaccess to keep the back links and page rank.

What did you do for this, exactly? Sometimes technical mistakes block search engines or send them in the wrong direction.
> Out of curiosity, have you noticed if your home page has been replaced with an inner page, and that page appears maybe somewhere around page 7 (no, not because I love that number!) in the SERPs?

Not that I can see.
> What did you do for this, exactly? Sometimes technical mistakes block search engines or send them in the wrong direction.

I have a lot of redirects in my .htaccess file because I had three landing pages, and I have another URL pointing to this domain, so it's a bit complicated. I think the .htaccess file is okay because I got the hosting company to do it for me, but I am not 100% sure.
If it was as simple as you've described - from subfolder to the root of your domain - you should be able to achieve it with a single line in your .htaccess.
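For example, if the old pages lived under a directory such as /subfolder/, a single mod_alias rule like this would 301 everything to the equivalent path at the root (a sketch - "subfolder" is a placeholder, not your actual directory name):

```apache
# Hypothetical example: permanently redirect /subfolder/anything
# to /anything at the domain root, preserving the rest of the path.
RedirectMatch 301 ^/subfolder/(.*)$ /$1
```

If the rule works, each old URL should answer with a single 301 hop straight to its new location; chains of redirects are where PageRank tends to leak.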
> Also, have you checked what G says for your robots.txt in WMT?

I had a look and asked WMT to fetch the file; it said "Crawl postponed because robots.txt was inaccessible". I don't understand this, as it is a basic robots.txt file and is just sitting there in my root directory waiting to be spidered! So I ran the "test" function, and this is the result.
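For comparison, a minimal robots.txt that allows all crawling looks like this. If Googlebot reports even a file like this as "inaccessible", the cause is usually server-side - a 5xx response, a firewall or bot-blocking rule, or file permissions - rather than the file's contents:

```
User-agent: *
Disallow:
```

An empty Disallow line means "disallow nothing", i.e. crawl everything.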
> I had a client a few months ago who was in a similar situation to yours. It turned out he had enabled rel canonical on some pages, but it was in fact canonicalising all pages to one page on the site, causing a massive site de-index. Such a simple thing was overlooked for many months.

This is something I have to do too - canonicalise. Any advice or a tutorial on this would be appreciated.
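As the cautionary tale above shows, the key is that each page's canonical tag must point to that page's own preferred URL, never to one page sitewide. For a hypothetical product page the tag goes in the <head>, for example:

```html
<!-- On /product_123.html (illustrative URL), pointing to itself -->
<link rel="canonical" href="http://www.example.com/product_123.html" />
```

A quick sanity check after rollout: view source on several different pages and confirm each one's canonical href matches its own address.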
[edited by: Robert_Charlton at 5:16 pm (utc) on Nov 5, 2012]
[edit reason] removed domain name, fixed paragraph formatting [/edit]
> On further inspection, it looks like your CMS is duplicating some content. I won't post the full URLs as mods will likely snip all references, but:
>
> 1) Is product URL - /product_###.html
> 2) Is index.php page - /index.php?_a=viewProd&productId=###&review=write
> 3) Is variation on product URL - /product_###.html?review=read

I had no idea this was happening. Also, I don't know how you found this out, but thanks. I did disable the review feature in Cubec.rt a while ago by editing some of the files. Do you think this is what has caused these additional URLs? If so, and I change the files back to allow reviews, do you think that will resolve the issue? If I do change the files back, could you check by whatever methods you used to find this issue in the first place? How did you do it, by the way?
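One common way to keep variants 2 and 3 out of the index while leaving the clean product URLs alone is a pair of robots.txt patterns like the following (a sketch based on the three URL forms above; note the * wildcard is a Googlebot extension, not part of the original robots.txt standard):

```
User-agent: *
Disallow: /index.php?_a=viewProd
Disallow: /*?review=
```

Blocking crawling does not by itself remove already-indexed duplicates, though; a rel=canonical from the ?review= variants back to the plain /product_###.html URL is the more thorough fix.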
I checked my sitemap and got an error: "Missing "charset" attribute for "text/xml" document". I did a bit of research and was even more confused as to whether to use an internal DTD, an external DTD, or something else.
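That particular warning usually means the server is sending the sitemap as text/xml without declaring a character set, so it is worth checking the response headers before worrying about DTDs. On Apache, one possible fix (an assumption - this presumes .htaccess overrides are allowed on your host) is to declare UTF-8 for .xml files:

```apache
# Assumption: Apache, with AllowOverride permitting FileInfo directives.
# Makes .xml responses go out as e.g. "Content-Type: text/xml; charset=UTF-8".
AddCharset UTF-8 .xml
```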
> It's possible that the change in your structure has caused Google to reevaluate your site...

Usually when you have a dramatic change in site structure there will be a full recrawl, and hence more discoveries by more savvy bots.
> Is it XML? It should look like this, minus the {braces}. Note the first line. The question marks are part of the format.

Thanks for that. Yes, that is exactly what I have, but GWT is still saying "no sitemaps found for this site".
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="{blahblah}">
<url>
<loc>http://www.example.com/</loc>
{ <changefreq>monthly</changefreq> }
</url>
{ et cetera }
</urlset>
blahblah = http: // www.sitemaps.org/schemas/sitemap/0.9 (without the spaces). I thought this was the people that made my sitemap, but it's the exact format from Google's how-to page. NOT your own site name here! That goes in place of "example.com".
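If GWT keeps reporting "no sitemaps found", it can help to verify the file locally before resubmitting. A quick sketch in Python (the sample data and the set of checks are illustrative, not an official validator) that confirms the file is well-formed XML, starts with the XML declaration, and uses the sitemaps.org namespace Google expects:

```python
# Minimal sitemap sanity check (illustrative, not exhaustive).
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def check_sitemap(xml_text: str) -> list[str]:
    """Return a list of problems found; an empty list means no obvious issues."""
    problems = []
    if not xml_text.lstrip().startswith("<?xml"):
        problems.append("missing XML declaration on the first line")
    # Raises ParseError if the document is not well-formed XML at all.
    root = ET.fromstring(xml_text)
    # ElementTree reports namespaced tags as {namespace-uri}tagname.
    if root.tag != "{%s}urlset" % SITEMAP_NS:
        problems.append("root element is not <urlset> in the sitemaps.org namespace")
    if not root.findall("{%s}url/{%s}loc" % (SITEMAP_NS, SITEMAP_NS)):
        problems.append("no <url>/<loc> entries found")
    return problems

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
</urlset>"""
print(check_sitemap(sample))  # an empty list means the basics look fine
```

If the file passes checks like these locally, the "no sitemaps found" message is more likely a fetching problem (see the robots.txt accessibility issue above) than a format problem.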
> deborahbaker - I've edited one of your earlier posts to remove a link to your site

Sorry, it slipped in when I copied and pasted the results.
> When I took a look, I saw that much of the content on your site is not unique

Yeah, you are right, and I know about unique content, but I think due to the age etc. of this site it has not been an issue. I have been gradually replacing it. You know, when this site was developed Google did not even exist - it was just Yahoo - and since its birth it has had number 1 placements. You are right, I did copy some of the content, because duplicate content was not an issue then. But a lot of it is my original content, which has since been duplicated by other sites. Also, my site gets plenty of new unique content: every time a product is uploaded, a huge description is added.

I really don't think this is the reason for its recent problems, and that is one of the reasons I haven't bothered too much about the content (if it ain't broke, don't fix it - which I really wish I hadn't done six months ago, as I wouldn't be in this position now). I am sure it is one of the other issues above that I am trying to resolve, but I do take your point and I am going to change all the content. Incidentally, since I changed the robots file my site is climbing back up the rankings. But I want to resolve all my above issues, so please, if anyone can help, post here.
> One thing also: some pages you should "noindex, follow" - for example, your search results pages and your "tell a friend" pages. (If the site I see is yours, I see about 4000 or more tell-a-friend pages in the index.)

Thanks, I hadn't thought of this. I really have opened up a whole can of worms I now have to deal with.
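For reference, the tag that does this goes in the <head> of each page you want kept out of the index while still letting crawlers follow its links:

```html
<!-- e.g. on search-results and tell-a-friend pages -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt Disallow, this lets Googlebot fetch the page and see the directive, which is what actually drops it from the index over time.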