1- I created a sitemap and submitted it through Google Sitemaps.
2- I changed all my relevant internal URLs to absolute URL paths.
3- I changed my parameter from ?Id=346 to ?property=346.
4- I redirected my non-www URLs to www with a permanent 301 redirect.
5- I have more backlinks than I had before (not many, but still more than before).
6- I am sure I am not doing any black hat SEO on my pages.
7- I see Unreachable URLs under the Sitemaps Diagnostics area, but I already changed those URLs more than a month ago and they no longer exist.
8- I see URLs timed out under the Sitemaps Diagnostics area; again, those links no longer exist, neither in the sitemap nor on our site.
<<< I changed my parameter from ?Id=346 to ?property=346 >>>
I believe that means all your URLs have changed.
If I am correct, Google considers all your pages new and you are experiencing a kind of sandbox.
It would probably be better to 301-redirect the old URLs to the new ones.
All the internal PR accrued to the old pages through internal linking is lost for the time being too.
If I were in your place, I would make an immediate switch back to the old site structure and URLs.
I made this same change years back, and even then I suffered for the better part of a year.
NEVER change indexed URLs unless you are prepared to permanently redirect all old URLs to their new counterparts. Sometimes an impossible task, especially on Windows.
When I changed the URL parameter, it changed for almost 3,000 property URLs. How is it possible to pick them all and do a 301 for each one? Is there a way of doing this with a piece of code, all in one go?
sample old url
New url is
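One way to handle all 3,000 in one go, sketched here for classic ASP (the parameter names come from your example; the domain is a placeholder, so adjust for your site): since every old ?Id= URL hits the same property page script, a single check at the very top of that page can 301 each request to its ?property= counterpart. This is a hedged sketch, not tested against your setup:

```vbscript
<%
' Sketch: 301-redirect old ?Id=NNN URLs to the new ?property=NNN form.
' Must run at the very top of the property page, before any output is sent.
Dim oldId
oldId = Request.QueryString("Id")
If Len(oldId) > 0 Then
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", _
        "http://www.mysite.com" & Request.ServerVariables("PATH_INFO") _
        & "?property=" & Server.URLEncode(oldId)
    Response.End   ' stop here so the page body is never sent with the redirect
End If
%>
```

One block covers every old URL because they all map to the same script; there is no need to enumerate the 3,000 URLs individually.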
The old URLs are showing "URLs timed out" and "Unreachable URLs" in Google's Sitemaps diagnostics, and they still show as found in my submitted sitemap, which I changed and re-submitted with the new URLs 2 months ago. Since then Google has been downloading my sitemap every day.
Google for a tutorial, or browse the Servers sub-forum here at WebmasterWorld. This code has been posted many times before.
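For reference, the Apache version of this redirect usually looks something like the following .htaccess fragment. This is a sketch assuming mod_rewrite is enabled and "www.mysite.com" stands in for the real domain; it does not apply on Windows/IIS:

```apache
# 301-redirect old ?Id=NNN URLs to the new ?property=NNN form
RewriteEngine On
RewriteCond %{QUERY_STRING} ^Id=([0-9]+)$
RewriteRule ^(.*)$ http://www.mysite.com/$1?property=%1 [R=301,L]
```

The RewriteCond captures the old Id value into %1, and the explicit "?" in the substitution replaces the old query string rather than appending to it.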
Any idea for a Windows server?
I am already using this code for redirecting non-www URLs:
<%
' Force the canonical www host with a 301 redirect (classic ASP / VBScript)
Dim Domain_Name, HTTP_PATH, QUERY_STRING, theURL
Domain_Name = LCase(Request.ServerVariables("HTTP_HOST"))
If Domain_Name <> "www.mysite.com" Then
    HTTP_PATH = Request.ServerVariables("PATH_INFO")
    QUERY_STRING = Request.ServerVariables("QUERY_STRING")
    theURL = "http://www.mysite.com" & HTTP_PATH
    If Len(QUERY_STRING) > 0 Then
        theURL = theURL & "?" & QUERY_STRING
    End If
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", theURL
    Response.End   ' stop processing so the page body is not sent with the redirect
End If
%>
[edited by: dede_dublin at 11:36 am (utc) on June 23, 2006]
<<< NEVER change indexed URLS unless you want to permanently redirect all old URLs to their new URL counterparts. Sometimes an impossible task, especially on Windows.>>>
MSN and Yahoo picked up the new URLs and dropped the old ones fairly quickly; I don't know why Great Big Technology Giant Google can't do the same.
"I have tons of incoming links from other sites!"
Well, let's take a look at those "links" and whether they are helping you or, as we find 100% of the time, hurting you.
But the problem is, what kind of neighborhood are these incoming links to your site coming from? Would you use Timothy McVeigh as a reference on your resume, for example?
On the Google Sitemaps forum, where they do allow you to post your site, I have checked numerous websites from people with this same problem, and 100% of the time each page they claim is Supplemental has only a couple of links coming in from obvious loser sites, often PR 0 and often Supplemental themselves.
Add to that the fact that even though they all claim "I'm doing everything right, I'm not violating Google's rules!", when you look at their site it's so cheesy, the layout stinks, and there are AdWords all over the place; no human editor would spend more than 5 seconds on the page before rejecting it.
I think Google has been clamping down on links from all these loser sites recently, and as they make more pages Supplemental in the index, they probably don't trust any links coming from a Supplemental page, so bottom line is you probably don't want incoming links from Supplemental pages of other web sites.
This is why I do NOT do cross-linking even though websites ask me daily! Furthermore, many webmasters are scum who show you that they link to your site but then use a nofollow attribute in the HTML code.
Rather, you want links coming from high-PR pages on authority sites with REAL CONTENT (.ORG, .EDU, and .GOV sites), which are much higher quality incoming links than ones from a scraper site, a SERP page, or a cheesy-looking site.
One link of a higher quality type can be better than 50 links combined from Supplemental pages, which I try to avoid whenever possible.
Try doing other searches in Google such as link:, allinurl:, inanchor:, and see what other sites are linking to you.
We found over 90,000 of them a month ago, mostly scraper and bogus SERP pages, some doing 302 redirects on your incoming links, to try to make you look like a black hat SEO.
Welcome to the party!
Agreed on your point about Yahoo and MSN. But this is the way it is with Google.
Even 301-redirecting old pages to new ones works easily only on an Apache server. I have Windows sites, and the rule of thumb here is: never change a URL. Google cares about the age of a page and the links coming to it, and if suddenly that page vanishes and a bunch of fresh pages appear, they are treated as fresh pages. Switch back, if Google was crawling your earlier pages without any problem.