
Forum Moderators: Robert Charlton & goodroi


Year-old 301'd domains reappearing in SERPs

     
2:34 pm on Apr 22, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Sept 14, 2004
posts:56
votes: 0


I have a lot of domains that I 301'd over a year ago reappearing in Google SERPs. These domains hadn't appeared for at least 10 months. Anyone else seeing this? I wonder if this has anything to do with the recent work on 302 issues.
6:28 pm on Apr 22, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 12, 2004
posts:961
votes: 0


I saw this on March 23, coinciding with an >80% drop in referrals from Google. The domain had been gone for probably a year at least.
7:04 pm on Apr 22, 2005 (gmt 0)

Senior Member

joined:Oct 27, 2001
posts:10210
votes: 0


I don't know about year-old pages (since I wasn't redirecting anything a year ago), but pages that I moved and redirected with a 301 recently are showing up twice: once at their old URLs, and again at the new URLs.
12:04 pm on Apr 23, 2005 (gmt 0)

New User

10+ Year Member

joined:Apr 5, 2004
posts:21
votes: 0


Hi, same here. I had mysite.com and anothersite.com pointing to the content of my-site.com (a very bad idea). I did start with the 301 redirect to my-site.com a few months ago, though. About a week ago all versions reappeared, now showing a lot of supplemental listings. These supplemental listings' cache is dated April 2004, though. I hope Google is still processing whatever it has to process and will get rid of those old listings very soon. After that, I hope it will do a new crawl and get all listings up to date.

One question: would it be a good idea to use the removal tool and get rid of these two other versions once and for all? Or will that kick my actual page out of the index, now that I'm 301 redirecting the ones to be deleted?
At the moment my site seems to be taking a huge hit in the SERPs, although it again got a PR7 with the latest PR update. I wonder if all these supplemental listings are harming my actual site.

12:16 pm on Apr 23, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:June 22, 2004
posts:67
votes: 0


I had the same issue, with old pages showing up with a cache date of Nov 04, as well as a whole bunch of other old pages from a site upgrade.

G just came through and did a full deep crawl; within 2 days G now shows only the new pages with fresh tags, and all 1200 old pages are gone.

It appears that it cleaned everything up. Side note: I did not notice any toolbar PageRank updates.

My thoughts are that they added an old index to bump up the total indexed-page numbers when MSN launched their new search engine, and are now going through and cleaning everything up.

6:21 pm on Apr 23, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Oct 31, 2004
posts:43
votes: 0


I have a related 301 problem: a 301 to redirect non-www to www pages. It has been in place for more than 6 months, yet the non-www pages continue to appear. Testing the redirect in a browser and reviewing the logs indicate it is working properly; however, Google doesn't update. The www pages are being spidered regularly.
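For what it's worth, the thing a browser test doesn't show clearly is whether the redirect is a 301 specifically: a 302 (temporary) redirect looks identical in a browser, and 302 handling is the very issue mentioned at the start of this thread. A minimal sketch of the check in Python, with `www.example.com` standing in for the real canonical host (hypothetical names throughout):

```python
from urllib.parse import urlparse

def is_canonical_redirect(status, location, canonical_host="www.example.com"):
    """True only for a *permanent* redirect pointing at the canonical www host.

    A 302 passes a casual browser test just as well as a 301, which is why
    checking the actual status code in the server logs matters.
    """
    return status == 301 and urlparse(location).netloc == canonical_host
```

So `is_canonical_redirect(301, "http://www.example.com/page")` passes, while a 302 to the same URL, or a 301 that lands back on the non-www host, fails.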

Mito99:

>One question: would it be a good idea
>to use the removal tool and get rid
>of these two other versions once and
>for all?

Be very careful with the removal tool. I just nuked a bunch of my www pages trying to remove the non-www ones, only to find out there's a 6-month wait before reinclusion. See this thread: [webmasterworld.com...] msg #:232

6:37 pm on Apr 23, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


>> I have a related 301 problem: a 301 to redirect non-www to www pages. It has been in place for more than 6 months, yet the non-www pages continue to appear. Testing the redirect in a browser and reviewing the logs indicate it is working properly; however, Google doesn't update. <<

Make a list of all the URLs that you do not want in the index. Make that list into a page of links and get that loaded onto another site somewhere. It is likely that Google isn't crawling the old URLs and therefore has not seen the redirect. A page of links to crawl will get them started. In the short term the number of rogue pages in the index might rise, but will then fall to almost zero. It takes at least a few weeks to sort out.

Check your internal links to folders. Make sure they all end in a trailing / on the link. This avoids the redirect from entered-domain.com/folder to $default-domain-name$.com/folder/.
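As a point of reference for the setup being discussed, this kind of canonical-host 301 is typically done in Apache with mod_rewrite. A minimal .htaccess sketch, assuming Apache and the hypothetical domain example.com (adjust the host names to your own):

```apache
# Hypothetical sketch: permanently redirect non-www requests to the www host.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

A quick way to confirm the server really answers with a 301 (and not a 302) is `curl -I http://example.com/somepage` and reading the status line and Location header.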

7:07 pm on Apr 23, 2005 (gmt 0)

Senior Member

joined:Dec 29, 2003
posts:5428
votes: 0


I wonder if they are calculated when G does the dupe/penalty checks? What do you guys think?
2:30 pm on Apr 25, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Oct 31, 2004
posts:43
votes: 0


g1smd - thanks for your reply:

Make a list of all the URLs that you do not want in the index. Make that list into a page of links and get that loaded onto another site somewhere. It is likely that Google isn't crawling the old URLs and therefore has not seen the redirect. A page of links to crawl will get them started. In the short term the number of rogue pages in the index might rise, but will then fall to almost zero. It takes at least a few weeks to sort out.

This is done a/o 5/23. Placed the list on the main page of another (unused) domain and submitted that URL to G. Will wait to see how long it takes, either for a visit or for the 301 removals.

Check your internal links to folders. Make sure they all end in a trailing / on the link. This avoids the redirect from entered-domain.com/folder to $default-domain-name$.com/folder/.

I think I am ok here, but will recheck.

This should be a start; now if I could only get rid of those nasty "supplementals" (:

3:59 pm on Apr 25, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:Mar 31, 2003
posts:386
votes: 0


This is the exact same thing that's happened to me as of 3/23. Phrases that I'm used to monitoring now return "supplemental results" that rank *very* poorly, and my newer pages are nowhere to be found. However, if I request "more results from www.mydomain.com" my newer pages show up in the results.

My feeling is that my newer pages are in the index but have not yet had PageRank transferred to them properly, or that my newer pages aren't in the index at all. It is conceivable that my pages are in the "per-domain" index file, but not the regular results.

Also, my old supplemental pages haven't been crawled en masse by Googlebot for quite some time: at least several months. I only see one or two being 301'd here and there in my logs.

11:44 pm on Apr 26, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Oct 31, 2004
posts:43
votes: 0


This is the exact same thing that's happened to me as of 3/23. Phrases that I'm used to monitoring now return "supplemental results" that rank *very* poorly, and my newer pages are nowhere to be found. However, if I request "more results from www.mydomain.com" my newer pages show up in the results.

I'm coming at it from a different approach: not the SERPs, but using site: or allinurl: to get an idea of the total in the index. I'd estimate perhaps 90% of my pages are either non-www, URL-only, not cached, or supplemental.

Also, my old supplemental pages haven't been crawled en masse by Googlebot for quite some time: at least several months. I only see one or two being 301'd here and there in my logs.

Well, I thought mine were being crawled en masse. I'm getting +/-6000 Googlebot requests per month (it's a small site), yet some of the cached files have dates as old as March 04 and Oct 04. The files in question are certainly being requested, but as g1smd mentions above, evidently the non-www URLs are not. Just checking the logs now, I don't see many 301s associated with Googlebot, though I do with a few other bots.

One question I have on the supplemental, URL-only, etc. issue is page title and meta content. My page 1 for blue-widget has the same <title>, meta keywords, and description as pages 2 and up. The head section is generated dynamically with a PHP .inc file for each product category. So blue-widget.php is indexed, cached, etc. all fine, but page 2, which is blue-widget.php?offset=10, is not. I'm code-challenged, so I'd have to hire someone to make any programming changes. Would the simple addition of "Page 2" to the <title> and some different keywords/description make a difference?
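The change being asked about is small. A sketch of the idea (hypothetical names, and in Python rather than the site's actual PHP): derive the page number from the offset parameter and append it to the title, so each paginated view gets a distinct <title>.

```python
# Hypothetical sketch: give each paginated view of a category its own
# <title>, keyed off the ?offset= parameter (10 products per page, so
# offset=10 is page 2).
PER_PAGE = 10

def page_title(category_name, offset=0):
    """Build a per-page title so blue-widget.php?offset=10 no longer
    shares its <title> with page 1."""
    page = offset // PER_PAGE + 1
    return category_name if page == 1 else f"{category_name} - Page {page}"
```

For example, `page_title("Blue Widgets", 10)` yields "Blue Widgets - Page 2", while the unparameterized page 1 keeps the plain category title.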

As an aside, none of the above issues occur with MSN: pages are in the cache, cache dates are fresh (often less than a month old), new pages appear quickly, etc.

1:59 am on Apr 27, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


Don't forget that a page might not actually be a "supplemental result" for all search queries that the page can be returned for.

Once a page is a supplemental result for a particular search query then the title and snippet are not updated again (for that search query).

You can change the content on the page and it will rank for the new content, and the new content will be cached; but the page will also continue to appear in SERPs for searches on the old content, even if that content is no longer actually on the page or in the cache.

4:07 am on Apr 27, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:Mar 31, 2003
posts:386
votes: 0


That's assuming the new content has the same URI as the old content.

Supplemental results occur a lot from orphaned (i.e. non-linked-to) pages, so if you update a page it never really becomes a supplemental result. :)

5:35 pm on Apr 27, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


Even if you do update a page it can still become a supplemental result for search terms that used to be on the page but are no longer on the page.

This can happen even if Google does update the cache for the page.

A page can be a normal result for searches based on current content, but can be a supplemental result when making a search that includes words that used to be on the page but no longer are.

6:55 pm on Apr 27, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:Mar 31, 2003
posts:386
votes: 0


Hmmm, dunno if that's right :)

I've *never* seen that happen with my site. Ever.

8:36 pm on Apr 27, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0



Look harder. I see it all over the place. :-)
4:47 pm on Apr 30, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Oct 31, 2004
posts:43
votes: 0


Following up on this. Submitted a reinclusion request per GG's instructions in the 302 thread: [webmasterworld.com...] (for those of us who removed our www pages along with the non-www).

Got a reply back from support: ". . . Please note that we searched for example.com and found that it is currently included in our search results. . ." Unfortunately, they made no mention of the underlying problem with the non-www pages, or of getting the www pages I inadvertently nuked back into the index before the 6 months are up.

g1smd: If you're still following this thread, an update on your suggestion re linking to the bad (non-www) URLs from elsewhere. Submitted the URL on 4/23 (not 5/23 as I previously posted). Yahoo found the site on 4/27 and again on 4/29; Googlebot hasn't been around yet. Will continue to wait.

[edited by: ciml at 7:27 am (utc) on June 13, 2005]
[edit reason] Examplified [/edit]

6:07 pm on Apr 30, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


For the 150-page site where we sorted out duplicates by making sure that all folder links ended with a trailing /, and then resolved www vs. non-www with a 301 redirect in mid-to-late March (not sure of the date; somewhere between the 18th and 25th, I think), the site originally had about 50 www and 70 non-www URLs listed, many of each without title and description.

Within a few days of sorting out both the links and the redirect, nearly all of the non-www pages whose URL had a trailing / gained a title and description, but there were still many of the other 3 variations with a title and description too. All of the non-www pages with a trailing / were now in the index. There were also a large number of www URLs, both with and without a trailing /, still listed, but mostly without title and description.

Within another week, all non-www URLs with / had a title and description, and only about 10 of the non-conforming URLs still had one. The number of URLs (the ones we wanted removed) without title and description shrank to a few dozen, and then got "stuck". I then put the "external sitemap" in place, and the number of URLs without title and description started falling again, then a few days later rose to well over 100. After a few days the number started falling again, and has continued to fall: it drops by 6 to 8 URLs every 2 to 3 days, and is now down to 14. All of the non-www URLs with / continue to be fully indexed, and every one of those pages is indexed. Everything else is without title and description, and is fast disappearing.