|or would it be safe enough to point every link at the full URL of my main site |
I'm afraid I don't quite follow you.
But to avoid duplicates, a 301 is the better option, because it also passes any backlinks the duplicate site has on to your main site. It can take a lot of time to take effect, though.
If the backlink benefit isn't significant, consider removing the duplicate site from Google's index using the Google URL console.
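For anyone wondering what that 301 looks like in practice, here is a minimal sketch, assuming Apache with mod_rewrite enabled and .htaccess allowed (the domain names are just placeholders from this thread):

```apache
# .htaccess at the root of mysite.net
RewriteEngine On
# Send every request on mysite.net (with or without www) to the same
# path on mysite.com with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.net$ [NC]
RewriteRule ^(.*)$ http://mysite.com/$1 [R=301,L]
```

On other servers (IIS, etc.) the mechanism differs, but the idea is the same: one permanent redirect rule covering the whole duplicate domain.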
Sorry, I wasn't clear enough. Rather than 301 redirecting mysite.net to mysite.com, could I just make all the relative links on mysite.net absolute, pointing at mysite.com?
e.g. With both sites running the same web app from the same location on the server, just have all the anchors fully qualified as http://mysite.com/page. The side effect would be that every page is available on mysite.net, but following any link would take you to mysite.com.
I'm just wondering if this would cause duplicate content problems compared to a 301 redirect.
If you could also put a <meta name="robots" content="noindex, follow"> tag on all of the pages of mysite.net then that would work very well indeed.
That would be possible if the pages were on a different server OR if the pages were generated using a script.
You should also put that noindex tag on other served versions of a page, such as "print-friendly" versions, and so on.
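A minimal sketch of what that looks like in the page markup (the title is a placeholder; only the meta tag matters here):

```html
<!-- In the <head> of every page served on mysite.net -->
<head>
  <title>Page title</title>
  <!-- Tell spiders not to index this copy of the page,
       but still follow the links on it -->
  <meta name="robots" content="noindex, follow">
</head>
```

If both domains are served by the same script, you would emit this tag conditionally, based on which hostname the request came in on.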
I started a reorganisation of my site. Not having the option to 301, I put a noindex, follow on oldtopic/oldfolder/oldpage.htm, with a link on the page to newfolder/newpage.htm.
The first folder I moved was reindexed and the pages regained their PR within a few days. The next folder was slower to be indexed, and two months down the line the pages still have no PR (white bar). The new pages remained where the old pages had been in the SERPs and have fared even better since the recent updates. The old pages have almost all disappeared. Does the 0 PR pose a problem, or should I continue to finish the spring cleaning in this way?
[edited by: acemi at 7:59 pm (utc) on April 11, 2005]
Make a list of all the old folders that you do not want indexed. Add each one as a "disallow" instruction in the robots.txt file.
Sign up for the Google "URL console" service and submit the URL of the robots.txt file to that.
They will remove the old pages from their index in a day or two. Hopefully they will then index the new pages, if you get other links to them.
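A sketch of the robots.txt described above, with placeholder folder names based on this thread's example (substitute your own old folders):

```text
# robots.txt at the root of the domain, e.g. mysite.com/robots.txt
User-agent: *
Disallow: /oldtopic/oldfolder/
Disallow: /oldtopic/otherfolder/
```

Each Disallow line blocks that folder and everything under it; the URL console can then act on those blocked URLs.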
|Sorry, I wasn't clear enough. Rather than 301 redirecting mysite.net to mysite.com, could I just make all the relative links on mysite.net absolute, pointing at mysite.com? |
rescendent - I'm still not quite sure what you're asking. Are you saying that you have a site that shows identical content on two domains... mysite.com and mysite.net... and that you want to figure out a way of handling this to avoid the duplicate content problems of this kind of setup?
Please clarify, and we can go from there.
I have had a 301 redirect from my non-www to my www version for 4 months, but Google continues to list both (the non-www pages as URL-only listings with no description).
Would removing the page using the Google URL console work to get rid of the non-www one, or is that dodgy?
It would remove both versions. Don't do that.
It is likely that Google isn't spidering the URLs that you want to drop and is therefore unaware of the redirect that is now in place.
Make a sitemap page that lists all the URLs that you want removed, and get a friend to put that page on their site for a few weeks. That will fix it.
Make sure that all internal links point to the correct version. For any link that points to an index file in a folder, do NOT include the actual file name; just end the URL at the folder name, always followed by a trailing / . The trailing / is very important.
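Putting the two pieces of advice together, here is a minimal sketch assuming Apache with mod_rewrite (domain names are placeholders): the redirect catches any spidered non-www URL, and the link style avoids creating index-file duplicates.

```apache
# .htaccess at the root of www.mysite.com
RewriteEngine On
# Permanently redirect mysite.com/anything to www.mysite.com/anything
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```

And in the pages themselves, link to folders like `<a href="/folder/">` rather than `<a href="/folder/index.html">`, so only one URL per folder ever gets spidered.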
I find it strange that Google counts them as different sites, but if I want to delete one of them they will delete both... muppets!