Google SEO News and Discussion Forum
Supplemental Results
Google supplemental result listings.
wiseapple
msg:773221
3:46 pm on Jul 28, 2005 (gmt 0)

Greetings,
Using the "site:" command on Google, I have found that a ton of our pages are marked as supplemental results. All of the cache dates are from Nov. 27, 2004, Dec. 1, 2004, or Feb. 21, 2005.

Does anyone have a similar problem?

Googlebot has visited these pages tons of times over the past few months. However, the cache remains out of date.

Thanks.

 

zeus
msg:773251
3:40 pm on Aug 14, 2005 (gmt 0)

promis - I did that 4 months ago; still nothing.

promis
msg:773252
4:21 pm on Aug 14, 2005 (gmt 0)

Zeus, I did something to "encourage" Googlebot, as I noticed it was not checking the non-www pages, and the cached pages dated back to September 2004. For just a couple of days, I pointed in-site non-www links at all the pages in the supplemental index. Googlebot visited them all and verified the 301 headers. Mind you, I never had incoming or in-site non-www links in the first place, so I have no clue how Google got those 38 pages into the supplemental index. One thing I noticed: my SERPs are still identical to yesterday's, before the removal from the supplemental index. Hope this helps.
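If you want to check those 301 headers yourself before Googlebot does, here is a minimal sketch in Python. The file name urls.txt is made up, it assumes plain http:// URLs, and it is only an illustration, not anything promis actually ran:

# Sketch: confirm each non-www URL answers with a 301 and a www Location
# header. Assumes "urls.txt" holds one absolute http:// URL per line.
import http.client
from urllib.parse import urlsplit

with open("urls.txt") as f:
    for url in (line.strip() for line in f if line.strip()):
        parts = urlsplit(url)
        conn = http.client.HTTPConnection(parts.netloc, timeout=10)
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        # A correct setup prints e.g.: 301 http://www.example.com/page.html
        print(url, "->", resp.status, resp.getheader("Location"))
        conn.close()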

g1smd
msg:773253
4:29 pm on Aug 14, 2005 (gmt 0)

>> I did that 4 months ago; still nothing <<

Set up a "fake sitemap" page that points to all of the pages that you do NOT want listed, and host that page on another site. Google will spider the page, see the links, and when it follows them it will pick up the 301 status for every one of them.
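A minimal sketch of building such a page, assuming the unwanted non-www URLs are already collected in a text file (both file names here are made up):

# Sketch: write a bare-bones "fake sitemap" HTML page linking to every
# non-www URL you want Googlebot to revisit so it can see the 301s.
from html import escape

with open("non_www_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

items = "\n".join('<li><a href="%s">%s</a></li>' % (escape(u), escape(u))
                  for u in urls)

with open("fake-sitemap.html", "w") as out:
    out.write("<html><head><title>Link list</title></head>\n"
              "<body><ul>\n%s\n</ul></body></html>\n" % items)

Upload the resulting page to the other site and link to it once so it gets spidered.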

.

If your internal links do not include the domain name, and you also have no redirect, then a single incoming link to the non-www version is enough to get most of the site indexed under the wrong hostname. There is nothing in place to make the correction and add the www anywhere on the site. The 301 redirect will correct that within a few months, or weeks if you are lucky.

zeus
msg:773254
4:35 pm on Aug 14, 2005 (gmt 0)

I have now pointed a link from another site at the non-www pages and the supplemental pages (which are not only non-www in my case), and I have removed the non-www pages through the removal tool. Let's see what happens.

zeus
msg:773255
7:22 pm on Aug 14, 2005 (gmt 0)

I found over 400 non-www pages on 64.233.161.104. I have now linked to them from 2 sites, split up over 4 pages.

I have had trouble getting Googlebot to really spider my sites again after I was hit by the 302 googlebug and hijackers. Since then I have not been present in the Google SERPs; once a month I see a single hit for my main keyword, but then it's gone. Maybe Google spidered the non-www pages while I was hijacked, and now I'm filtered because of those 400 non-www pages. Just a theory.

zeus
msg:773256
11:35 am on Aug 17, 2005 (gmt 0)

The removal of the supplemental result pages from my site: search shows as complete, but nothing has changed: when I do a site:mydomain.com they are still there. This could mean that the supplemental results DB is in a way not part of the ranking, and those supplemental results are not ranked whatsoever, but they may still mess up formerly hijacked sites because of old caches from the bad 302 links.

wiseapple
msg:773257
12:17 pm on Aug 17, 2005 (gmt 0)

If a single non-www link is good enough to get the site indexed under the non-www version... should a single non-www link from another site, after doing a 301 redirect from non-www to www, be enough to get the non-www pages unindexed?

zeus
msg:773258
12:34 pm on Aug 17, 2005 (gmt 0)

wiseapple - I would think so, but I had to read your question 2 times. I have just added 400 links to non-www pages indexed in Google; I did that 4 days ago and nothing has changed yet.

g1smd
msg:773259
1:24 pm on Aug 17, 2005 (gmt 0)

>> Should a single non-www link from another site after doing a 301 redirect from non-www to www be enough to get the non-www unindexed? <<

No. Google will need to directly visit each and every non-www page in order to "see" the redirect for that page. That is why you need to set up a fake sitemap page listing every page that you want removed: one link to each page that you no longer want indexed.

Once the redirect is in place, asking for any non-www page forces the browser or bot over to the www version of that page: it is impossible to navigate around the site as non-www.

Before the redirect was put in place, and for all pages linked using relative links, entry at any point of the site (any page, and any version: www or non-www) caused the whole site to be spidered as whatever the first page was (www or non-www). If the internal links all contained www.domain.com/... then this would force the site to be indexed as www; but, to be clear, it would not stop any non-www pages linked from external sites from also being indexed, because the non-www URL would still serve content when asked (duplicate content, too!). The redirect forces the canonicalisation, on a per-page basis, whether the internal links are relative or absolute.
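To make the per-page mechanics concrete, here is a minimal sketch of that hostname-canonicalising 301 written as a toy Python handler; a real site would do this in the web server configuration instead, and www.example.com is a placeholder:

# Sketch: answer any request on the wrong hostname with a per-page 301
# to the same path on the canonical www host.
from http.server import BaseHTTPRequestHandler, HTTPServer

CANONICAL_HOST = "www.example.com"  # placeholder hostname

class CanonicalRedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.headers.get("Host", "") != CANONICAL_HOST:
            # Each URL redirects individually, preserving its path.
            self.send_response(301)
            self.send_header("Location",
                             "http://%s%s" % (CANONICAL_HOST, self.path))
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"canonical page content")

if __name__ == "__main__":
    HTTPServer(("", 8080), CanonicalRedirectHandler).serve_forever()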

wiseapple
msg:773260
1:52 pm on Aug 17, 2005 (gmt 0)

Here is my problem...

I have 21,000 pages, and I do not know whether they are indexed as www or non-www. The page counts under "site:" are way inflated: Google reports that we have 83,400 pages. I am not sure how this happened. So...

- Should I create a 200-page site map (100 links per page), hosted on another site, listing every URL in the non-www format? (See the sketch at the end of this post.)

- If I do this, will the other site be penalized for having 20,000 links pointing to this one? Will both domains be penalized?

- I have been submitting a Sitemap to Google. Why does this not correct the issue?

All thoughts are appreciated.
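For the first question, a minimal sketch of splitting a long URL list into 100-link pages, extending the fake-sitemap sketch earlier in the thread (all file names are made up):

# Sketch: break ~21,000 non-www URLs into link pages of 100 URLs each,
# written out as links-001.html onwards (names hypothetical).
from html import escape

with open("non_www_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

PER_PAGE = 100
for n, start in enumerate(range(0, len(urls), PER_PAGE), start=1):
    chunk = urls[start:start + PER_PAGE]
    items = "\n".join('<li><a href="%s">%s</a></li>' % (escape(u), escape(u))
                      for u in chunk)
    with open("links-%03d.html" % n, "w") as out:
        out.write("<html><body><ul>\n%s\n</ul></body></html>\n" % items)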

Lorel
msg:773261
3:33 pm on Aug 17, 2005 (gmt 0)


Zeus,

>> Lorel - No, that would not fix it. I think those 2 pages are totally unique; they fit the topic of my site, and with my writing they could not get more unique. No, I think I know what's wrong: it's former hijack pages whose cache is still in Google's DB, so there are extra versions of those pages out there. It's a shame they don't update this supplemental DB. <<

If someone has hijacked your pages, then changing the text on those pages by at least 15% should remove the penalty, as the pages will no longer be duplicates of each other.

>> I also have this theory that many formerly hijacked sites, or sites that were hit by the 302 googlebug, still have not reappeared in the SERPs because of all those old caches floating around. If the hijacker and 302 links that duplicated the original site still have the old cache listed in Google, then the original site still has trouble. <<

It is my theory that even though Google "claims" to have removed those penalties for the 302 redirects (because it no longer lists them in the site: command), that doesn't mean it's so. It's just harder to find evidence of the 302 redirects now.

zeus
msg:773262
3:53 pm on Aug 17, 2005 (gmt 0)

You are right; maybe I should start doing that, but I always hoped things would fix themselves, because I NEVER had any trouble with ANY update for 3 years: my site stayed at no. 1. Then I got hit by hijackers and the 302 googlebug, and the party was over.

I think I'm now filtered for duplicates because of all those old caches that are still in the SERPs; also because 1-2 times a month I see a single hit from my main keyword in the logs.

Leosghost
msg:773263
4:34 pm on Aug 17, 2005 (gmt 0)

The oldest supplemental page that I've seen in "G" was yesterday: it dates to 1999 and 404's just beautifully. It is supposed to be a PDF ;) ...

Maybe telling apart "fact", "fantasy" and "fell through the cracks" is getting too much for all these search engines. Chasing the "largest number searched is with us" award is not the way to inspire confidence if this is the result...

BTW, the page was not an old hijack, but from an impeccable Italian site dealing with realtime news and financials...

For one of my static sites, "G" has almost as many pages in its supplemental results as the Wayback Machine does. Maybe we have stumbled on another "to be used later" resource here... or maybe they have their eye off the ball again.

webdude
msg:773264
5:51 pm on Aug 17, 2005 (gmt 0)

I am seeing the same on one of my sites: lots of supplemental listings for pages that don't exist anymore. I have been trying to use the removal tool to get rid of them, with no results: "Request Denied." Has anyone else been seeing this?

zeus
msg:773265
7:05 pm on Aug 17, 2005 (gmt 0)

webdude - I added those supplemental results to robots.txt and used the removal tool. It says complete, but nothing has changed in the SERPs. I don't think it's possible to remove those with the tool, because they are listed in the supplemental results DB.
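For reference, the robots.txt side of that approach is just one Disallow line per supplemental URL path; a minimal sketch of generating them (file names are hypothetical):

# Sketch: append one Disallow line per supplemental URL path to robots.txt,
# the prerequisite for asking the removal tool to drop those URLs.
from urllib.parse import urlsplit

with open("supplemental_urls.txt") as f:   # hypothetical input file
    paths = [urlsplit(line.strip()).path for line in f if line.strip()]

with open("robots.txt", "a") as robots:
    robots.write("User-agent: Googlebot\n")
    for path in paths:
        robots.write("Disallow: %s\n" % path)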

g1smd
msg:773266
7:10 pm on Aug 17, 2005 (gmt 0)

Beware! If you use the removal tool to try to remove domain.com/somepage.html or www.domain.com/somepage.html, it will automatically remove both of them for three months.

promis
msg:773267
7:14 pm on Aug 17, 2005 (gmt 0)

"I have 21,000 pages which I do not know if they are indexed as www or non-www"

wiseapple, a search for -www inurl:yoursite.com should list up to 1,000 of the non-www pages indexed.

zeus
msg:773268
7:16 pm on Aug 17, 2005 (gmt 0)

g1smd - yes, I know, but I get 1-2 visits from Google these days, so I don't care; before, I had 32,000 uniques a day.

zeus
msg:773269
8:40 pm on Aug 17, 2005 (gmt 0)

I just looked at inurl:mydomain, which I now use to check for the 302 googlebug, and I noticed more pages have come back with caches from May 2004. Jesus, why can't they delete those? What's going on? I think since they added all that "site" stuff, things have gone totally nuts in the SERPs; not the ranking especially, but weird stuff: URL-only listings, supplemental results, 2004 caches, the 302 googlebug. They must know about this, so what are they working on that is SO important that they let this stay in the SERPs/DB?

reseller
msg:773270
9:34 pm on Aug 17, 2005 (gmt 0)

g1smd

>>Beware! If you use the removal tool to try to remove domain.com/somepage.html or www.domain.com/somepage.html it will automatically remove both of them for three months.<<

If I recall correctly, GG mentioned that it will be removed for 6 months!

steveb
msg:773271
9:44 pm on Aug 17, 2005 (gmt 0)

Linking to a non-www Supplemental that 301s to the www version will not get the Supplemental removed. Normal listings get removed, but not Supplementals. Those stay, well, permanently until we see differently.

It's one of Google's silliest, most inept problems these days.

wiseapple
msg:773272
11:58 pm on Aug 17, 2005 (gmt 0)

Promis -

I executed the following query: -www inurl:yoursite.com

It reported the following: "Results 1 - 3 of about 2,190 for -www".

It will only list 3 results, two of which do not belong to our site. They are all URL-only.

Why won't it list the other 2,187?

Anyone have similar problems?

promis
msg:773273
5:54 am on Aug 18, 2005 (gmt 0)

Wiseapple, it shows all of them in my case. Try also -www site:yoursite.com. By the way, they removed all my non-www pages from their supplemental index at 64.233.161.104.

alorinagirl
msg:773274
5:02 pm on Aug 20, 2005 (gmt 0)

My site was previously banned in the 7/27 update. A week after I submitted a reinclusion request, they responded that they were sending it to their engineers to determine whether they could reinclude my site. They did, a week to the day later; however, I am now finding that 99% of the listings are just the individual product page URLs, and they are listed as supplemental. It appears they are reading my Sitemap, because there are a few pages listed from it, but the rankings are terrible compared to before. Additionally, most of the Sitemap pages aren't indexed at all. My PageRank was restored to PR6, as it was before. Anyone have any ideas what this means?

g1smd
msg:773275
6:21 pm on Aug 20, 2005 (gmt 0)

Do you see differences in various datacentres, like newer indexes in [216.239.59.104...] compared to older indexes in [216.239.37.99...] and so on?

wiseapple
msg:773276
6:39 pm on Aug 20, 2005 (gmt 0)

>>>>Do you see differences in various datacentres, like newer indexes in [216.239.59.104...] compared to older indexes in [216.239.37.99...] and so on?

Yes. When I use the query -www inurl:yoursite.com, it reports only 11 results on 216.239.59.104.

On 216.239.37.99, it reports 2,190 results. However, it will only show me three of those results; I am not sure what the other 2,187 are.

g1smd
msg:773277
7:23 pm on Aug 20, 2005 (gmt 0)

I helped a site that was a mess in Google (duplicate content www vs. non-www, many URL-only pages, and many supplemental results). The 301 redirects were set up months ago, and the listings were eventually fixed (there are at least a dozen posts about this site over the last 4 months in this forum).

Then, at least a month ago, several datacentres (like [216.239.37.99...] etc.) reverted to indexes from 7 or 8 months ago. I have no idea why they are hanging on to such old data, but I suspect it has something to do with fixing the 302 redirect problem.

wiseapple
msg:773278
9:52 pm on Aug 20, 2005 (gmt 0)

Another issue with Supplemental Results... I do a search on our site using site:oursite.com keyword. Many of the results show up as supplemental, with cache dates of Sept. and Nov. 2004. Googlebot has been by to pick up these pages tons of times. Does anyone have a hypothesis on why Google is not updating the cache even though Googlebot has fetched the pages many times? Why are cache dates from 2004 still showing up in the index?

g1smd
msg:773279
10:37 pm on Aug 20, 2005 (gmt 0)

They have several datacentres with very old results like that, and other datacentres with very new results.

Compare [216.239.59.104...] and [216.239.37.99...] for example. Are they both very old, or just one?

wiseapple
msg:773280
2:13 am on Aug 21, 2005 (gmt 0)

>>>Compare [216.239.59.104...] and [216.239.37.99...] for example. Are they both very old, or just one?

Both have very old SERPs: stuff back from Nov. 1, 2004. However, .104 has stuff from Mar. 28, 2004. That content no longer exists on our site; it is basically 404.

Anyone else have stuff from 2004?

Lorel
msg:773281
3:46 pm on Aug 21, 2005 (gmt 0)

Leosghost,

>> BTW, the page was not an old hijack, but from an impeccable Italian site dealing with realtime news... <<

Ahhhh, there you have it. Any site that posts news gathered from other sites is likely to receive a duplicate content penalty.
