Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 88 message thread spans 3 pages; this is page 3.
Supplemental Results
Google supplemental result listings.
wiseapple




msg:773221
 3:46 pm on Jul 28, 2005 (gmt 0)

Greetings,
Using the "site:" command on Google, I have found that a ton of our pages are marked as supplemental results. All of the cache dates are either Nov. 27 2004, Dec. 1 2004, or Feb. 21 2005.

Does anyone have a similar problem?

Googlebot has visited these pages tons of times over the past few months. However, the cache remains out of date.

Thanks.

 

Lorel




msg:773281
 3:46 pm on Aug 21, 2005 (gmt 0)

Legoshost


BTW the page was not from an old hijacked site but from an impeccable site in Italian dealing with realtime news. . . .

Ahhhh, there you have it. Any site that posts news gathered from another site(s) is likely to receive a duplicate content penalty.

texasville




msg:773282
 3:58 pm on Aug 21, 2005 (gmt 0)

>>>Ahhhh, there you have it. Any site that posts news gathered from another site(s) is likely to receive a duplicate content penalty. <<<

Ahhh..think again Lorel - not true... A site called Topix does nothing but reprint news from other sources, and they have a PR of 7! Of course, getting 10,000 bogus listings from the ODP probably boosts them pretty well.

Romeo




msg:773283
 6:27 pm on Aug 21, 2005 (gmt 0)

The results at [216.239.37.99...] do not seem old to me.
It has all my existing pages. On a site:example.com command it lists just the www.example.com pages I want in the SERPs: no ancient supplementals at all, all pages cached between 2005-08-10 and 2005-08-20 (= yesterday), and no non-www pages.
SERPs for my main keyword show about 10% fewer total results, which could be explained by kicking out old supplementals. I am pleased with this.

The results at [216.239.59.104...] hold new and OLD stuff. Besides my existing pages with recent cache dates up to yesterday, there are many old supplementals, some of them cached back on 2004-09-28 (!), and many old non-www pages, which have been redirected to www by long-standing 301s since 2004-09.
These results are very similar to those I get when calling www.google.com/.

Since I have "your visit" date/time stamps on every page it is easy to check the true age of cached data.

Google, pls roll out the 216.239.37.99 results, it will work for me ...

Regards,
R.

Leosghost




msg:773284
 8:50 am on Aug 22, 2005 (gmt 0)

Lorel ...actually you misunderstood there. The site is similar to the Financial Times or CNN's money programme on the web, only for Italians ...not at all a "gatherer" ..it's their own content, generated in real time ...the supplemental result (I was looking for info on a finance/investment company client ...before saying yes or no to the brief) was pointing to a PDF file that went back that far ..and of course wasn't in existence anymore (except in "G"'s imagination ;)

awebguy




msg:773285
 9:52 am on Aug 22, 2005 (gmt 0)

I discovered a Supplemental Result for a page that does not exist, and it was cached at 31 Oct 2004 17:40:20 GMT.
Does it mean that Google can't maintain its auxiliary index?
If that is true, then Supplemental Results are just spam produced by Google.

wiseapple




msg:773286
 12:17 pm on Aug 22, 2005 (gmt 0)

The 2004 issue - this is correct; many of our supplementals are from the Nov and Dec 2004 range. We also have supplementals as far back as Mar 2004.

webdude




msg:773287
 12:33 pm on Aug 22, 2005 (gmt 0)

zeus - you wrote:
I added those supplemental results in a robots.txt and used the removal tool; it says complete, but nothing in the serps. I don't think it's possible to remove those with the tool, because those are listed on the supplemental results DB.

Ah... I wish I had known that. Is that why I get "denied" whenever I try to remove these pages? The problem I am having is that these results are pages that don't exist anymore.

Lorel




msg:773288
 3:52 pm on Aug 22, 2005 (gmt 0)

Texasville,

>>>Ahhhh, there you have it. Any site that posts news gathered from another site(s) is likely to receive a duplicate content penalty. <<<

Ahhh..think again Lorel - not true... A site called Topix does nothing but reprint news from other sources, and they have a PR of 7! Of course, getting 10,000 bogus listings from the ODP probably boosts them pretty well.

PR has nothing to do with it. If they are reprinting news and not getting a supplemental results penalty for it, then they must be adding at least 12% more text to the page to prevent that penalty.

See Brett's post re this being the amount needed to bypass that penalty.

zeus




msg:773289
 5:12 pm on Aug 22, 2005 (gmt 0)

webdude - I bet you have a space between the words or elsewhere in the URL you are pasting into the removal tool. Remember, Google's SERPs show a space in the URL.

webdude




msg:773290
 5:38 pm on Aug 22, 2005 (gmt 0)

Sorry zeus,

I don't understand what you are saying. There are no spaces in my URLs. Should there be?

zeus




msg:773291
 6:03 pm on Aug 22, 2005 (gmt 0)

sometimes it looks like this in the serps:

TITLE BLA BLA
description bla bla
www.domain.com/ text.htm

see the space between (/ text.htm)? Sometimes users forget to close that space when they copy and paste from Google's SERPs.

webdude




msg:773292
 6:14 pm on Aug 22, 2005 (gmt 0)

Oh, I get it...

No, I don't see any of my listings with spaces in the URL.

zeus




msg:773293
 6:37 pm on Aug 22, 2005 (gmt 0)

webdude - well, anyway, it's no use trying to remove supplemental results, so let's just hope that Google updates the supplemental results soon

webdude




msg:773294
 6:49 pm on Aug 22, 2005 (gmt 0)

I agree. I have old pages dating back to February from a site that was redesigned from the ground up, including file names. It has been a hassle, since these 404s still get hit quite a bit. I ended up redirecting the more popular ones to existing pages.

g1smd




msg:773295
 6:58 pm on Aug 22, 2005 (gmt 0)

I have removed supplemental results before... just submitted each 404 URL to the removal tool in the "remove an outdated URL" category or whatever it was called.

webdude




msg:773296
 7:04 pm on Aug 22, 2005 (gmt 0)

That has not been working for me. I have been trying it for the past several months. I keep getting "Request Denied." Not sure why. I had others check the pages and headers. They do return 404s. I ended up recreating the pages and adding 301s.

steveb




msg:773297
 1:27 am on Aug 23, 2005 (gmt 0)

Sorry if I'm not getting this but... you just put up a page in the dead slot, use the URL removal tool, then just delete the page. The listing will be gone in hours.

zeus




msg:773298
 1:08 pm on Aug 23, 2005 (gmt 0)

I have also noticed that every time we talk about a topic like the huge increase in URL-only listings, supplemental results, and the ridiculously quick omitted results, there is no comment from Google.

webdude




msg:773299
 1:58 pm on Aug 23, 2005 (gmt 0)

steveb

I have tried this on the site for the past several months. I cannot get it to work. I started a whole thread about it. I even joined Google Groups and asked there. I have also emailed Google but just got the standard response. I have had the missing pages checked by multiple different header checkers and have also had members here check to make sure that they were coming back as 404 pages. They are. I have tried everything to get rid of these pages in the supplemental results and I just can't get it done. Every time I use the removal tool, I get a "Request Denied." I am totally baffled as to why I cannot get this to work.

Any help or reason on this would be greatly appreciated. The other thread is...

[webmasterworld.com...]

webdude




msg:773300
 2:01 pm on Aug 23, 2005 (gmt 0)

There are several other threads on this that I follow too. Didn't want to give out too much info, just the thread that I started which is pretty buried right now.

zeus




msg:773301
 2:49 pm on Aug 23, 2005 (gmt 0)

webdude, even if it accepts your request, I don't think it will make any changes in the SERPs. When it says supplemental results, I don't think you can do anything, because those are listed on another server; that is just my theory.

steveb




msg:773302
 9:08 pm on Aug 23, 2005 (gmt 0)

webdude, I still don't get what you are trying to do. You don't want the pages to come back as 404. What is it that you are doing?

Put a page in the slot. Use the tool. The listing will be removed in hours. This has nothing to do with a page being 404, and the page should not show as a 404.

zeus




msg:773303
 11:32 am on Aug 24, 2005 (gmt 0)

I ran a little test. First: a new site was also listed as non-www, then I made a 301, and 3 weeks later the non-www version was gone from the SERPs, as it should be.

The site worst hit by hijackers and googlebug 302 also had a 301, but that was 4 months ago and still nothing has changed.

About omitted results, which also appear extremely quickly these days: I had a site where I deleted a lot of content just so it REALLY did not look the same (not only the 15% some say is enough), and then all pages were listed with descriptions like in the good old days. Also remember to change your meta description.

About supplemental results: as soon as you see them you have to be alert. It could be that you have been hit by googlebug 302, that scrapers have too much of your content, or maybe hijackers, but remember it does NOT have to be that dramatic.
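The non-www-to-www part of a test like zeus's can be verified from a script instead of eyeballing the SERPs. A minimal sketch in Python; the host names and the `is_canonical_301` helper are illustrative, not from the thread:

```python
from http.client import HTTPConnection

def fetch_redirect(host, path="/"):
    """Request a page WITHOUT following redirects; return (status, Location header)."""
    conn = HTTPConnection(host, timeout=10)
    conn.request("GET", path)
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    return status, location

def is_canonical_301(status, location, www_host):
    """True only for a permanent (301) redirect that lands on the www host."""
    return status == 301 and location is not None and www_host in location

# Usage sketch (placeholder hosts):
# status, location = fetch_redirect("example.com")
# print(is_canonical_301(status, location, "www.example.com"))
```

A 302 here would be the "googlebug 302" situation the posters worry about, so the helper deliberately accepts only a 301.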

webdude




msg:773304
 12:18 pm on Aug 24, 2005 (gmt 0)

Steveb,

I am doing exactly what the instructions tell me to do on the Google site. I am going to "Remove an outdated link". At the top of that page, it states, "Enter the URL of your page. We will accept your request only if the page no longer exists on the web". In other words, it has to return a 404. I already verified this on other threads. To use this part of the tool, the page MUST return a 404 or this part of the tool will not work. I am then clicking the radio button that says "Remove anything associated with this URL." I wait a day or two, it shows the request as "Pending." After the wait, I get a "Request Denied."

If I am doing this incorrectly, please tell me step by step exactly what I am supposed to do. Are you saying that the pages have to exist before they can be removed from the index?

webdude




msg:773305
 12:22 pm on Aug 24, 2005 (gmt 0)

Also direct from the Google website:
Remove an outdated ("dead") link

Google updates its entire index automatically on a regular basis. When we crawl the web, we find new pages, discard dead links, and update links automatically. Links that are outdated now will most likely "fade out" of our index during our next crawl.

Note: If you believe your request is urgent and cannot wait until the next time Google crawls your site, use our automatic URL removal system. We'll accept your removal request only if the page returns a true 404 error via the http headers. Please ensure that you return a true 404 error even if you choose to display a more user-friendly body of the HTML page for your visitors. It won't help to return a page that says "File Not Found" if the http headers still return a status code of 200, or normal.
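The "true 404" requirement above can be checked from a script as well as with a header-checker tool like WebBug: what matters is the status code in the HTTP response, not whether the body says "File Not Found". A minimal sketch in Python, using a placeholder URL rather than any site from the thread:

```python
from http.client import HTTPConnection
from urllib.parse import urlparse

def fetch_status(url, timeout=10):
    """Send a HEAD request and return only the integer HTTP status code."""
    parts = urlparse(url)
    conn = HTTPConnection(parts.netloc, timeout=timeout)
    conn.request("HEAD", parts.path or "/")
    status = conn.getresponse().status
    conn.close()
    return status

def is_true_404(status):
    """Google's removal tool wants a real 404 status line.
    A friendly error page served with status 200 is a 'soft 404' and will fail."""
    return status == 404

# Usage sketch: is_true_404(fetch_status("http://example.com/gone.html"))
```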


g1smd




msg:773306
 6:58 pm on Aug 24, 2005 (gmt 0)

Use WebBug to confirm that the returned status code really is a 404.

steveb




msg:773307
 7:53 pm on Aug 24, 2005 (gmt 0)

I see, I've never used that option.

As I suggested above, just use the "Remove a single page using meta tags" option. It is as simple as can be. Add a blank page with the correct meta tag, enter it in the urlconsole, delete the page, and it's gone in hours. Repeat as many times as necessary to get rid of all the pages.
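A sketch of the placeholder-page step in that approach, assuming the robots noindex meta tag is the "correct meta tag" meant; the filename is a hypothetical stand-in for whichever dead URL slot you are clearing:

```python
# Write a blank placeholder page carrying a robots noindex meta tag into
# the dead URL's slot, so the removal tool has a live page to verify.
# "old-page.html" is a hypothetical filename, not one from the thread.
placeholder = """<!DOCTYPE html>
<html>
<head>
  <meta name="robots" content="noindex">
  <title>Removed</title>
</head>
<body></body>
</html>
"""

with open("old-page.html", "w") as f:
    f.write(placeholder)
```

Once the removal request goes through, the placeholder file itself can be deleted, as steveb describes.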

webdude




msg:773308
 11:58 am on Aug 25, 2005 (gmt 0)

I'll give that a try. Thanks

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved