joined:Dec 29, 2003
Major corporate site is nowhere in the results; HotBot shows a search extract from the top of the company's page.
Two months ago a 301 redirect was set up from www to non-www, and all internal links to folders (every page of the site is an index page in a folder) had a trailing / added too. At the same time all the external links with error 404 were tidied up, and several old content pages were deleted, taking the total number of pages back to 111.
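The canonical-URL rule described above (bare non-www host, trailing slash on every folder link) can be sketched as a small helper. This is only an illustration of the mapping, with a made-up `example.com` domain, not the poster's actual redirect configuration:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Map any variant of a folder URL to the canonical
    non-www, trailing-slash form described in the thread."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    # The 301 target uses the bare domain, not the www host.
    if netloc.startswith("www."):
        netloc = netloc[len("www."):]
    # Every page is a folder index, so links get a trailing slash.
    if not path.endswith("/"):
        path += "/"
    return urlunsplit((scheme, netloc, path, query, fragment))

# All four variants of one page collapse to a single form.
print(canonicalize("http://www.example.com/products"))   # http://example.com/products/
print(canonicalize("http://example.com/products/"))      # http://example.com/products/
```

This is why, once the 301s were in place, each page had exactly one URL left for Google to keep: the other three variants all resolve to it.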
Within days Google was listing all the URLs that were redirected TO, and most soon had a title and description. It took 6 weeks for the other three versions of each URL to drop out of the SERPs. They slowly sank from about 60 to about 20, then rose to over a hundred before declining again at a rate of about 6 to 8 every 3 or 4 days.
Finally about 10 days ago, all the other versions disappeared from the SERPs. Of the 111 pages listed with the correct URL format (that is "non-www with trailing / on link"), 109 have title and description and two are URL only.
A few days ago, the URL of a page deleted from the site a long time ago (like about 18 months ago) suddenly appeared in the SERPs. Today, two URLs for pages deleted two months ago have appeared again. All are www pages and are shown as URL only. They all produce error 404 when clicked.
I have no idea why Google would suddenly re-add pages that, as of two months ago, it knew no longer exist.
"site:oursite.com" has a short list of pages but "site:oursite.com keyword" has much longer lists that reflect pages with the keyword.
Do sites in this thread still use 302 redirection? We are seeing, once again, the phantom pages show up that are links to affiliate sites, even after they were zapped with the exclusion tool and robots.txt changes. It's not clear whether Google is indexing our OLD pages or whether these are stray old links on the site.
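Whether a robots.txt change actually excludes those phantom affiliate URLs can be checked offline with Python's standard `urllib.robotparser`. The rules and paths below are hypothetical, just the shape of exclusion being described:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt of the kind described in the thread:
# block the old affiliate-redirect folder from all crawlers.
robots_txt = """\
User-agent: *
Disallow: /affiliates/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under the wildcard rule here.
print(parser.can_fetch("Googlebot", "http://example.com/affiliates/out.html"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))           # True
```

Note that a Disallow rule only stops crawling; a blocked URL that still has external links pointing at it can linger in the index as a URL-only listing, which may be exactly the phantom behavior being seen.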
It's sad it has to be that way, but maybe we are too white hat now, and Google is only interested in quick content and does not care about old hand-made HTML sites.
Don't bother sending a sticky asking how I deal with scrapers; I won't tell, because I hate that I have to do this.
joined:Nov 1, 2003
Why worry about your white hat site that has disappeared from Google if you can be more successful with a scraper site?
I find them helpful sometimes when I'm searching for general keywords. They do a better job of sorting out the crap than the search engines do in most cases.
As for my sites that got hijacked or whatever, I think making too many big changes is why they got penalized. Just before my sites dropped in ranking they were doing really well, and I was always working on them, trying to make them better.
I didn't know google would start penalizing for making site wide changes until I read their patent recently. So now I just leave the sites alone and every month they are gradually doing better.
It's slow going but it seems to be working so all I know is to make small changes if any at all. Because google is very sensitive these days.
The last few notes I carefully sent via google.com/support/ came back saying, sorry, "you must use google.com/support/".
Maybe G is taking support lessons from MS?
Also, what are all those old sites/pages doing in the SERPs with caches from over a year ago, since all the new pages were added? There is something wrong with the Google search engine; it can't be that they actually want old caches and URL-only listings...
joined:Nov 1, 2003
In order for a site to be included in any update, it is a given that the site has been indexed by Google. What we are talking about here are sites that have been stuck in the supplemental index for 6 months or more, sites that no longer get spidered.
Normally Googlebot just comes by 2 times every week, then 30-40 times, but it never does a full spidering, which in my case would be about 2,300 fetches. I also noticed today a site that had totally copied my whole site and put its own banners on it, so things are just great. The good thing is it's not listed on Google, just Yahoo.
joined:Nov 1, 2003
YES! My sites may get spidered once or twice per month.
I had five well established sites fall out of Google's index and into the supplemental results back in January; these sites are now listed by URL only. I attribute my situation to a domain name server problem with my host that prevented Google from spidering my sites for over 2 weeks.
However what I don’t understand is that my sites have been in supplemental results for over 5 months. They get spidered only about one or two times each month now. Why is it taking Google so long to get them back into the index? Each site has over 1000 incoming links.
Someone copied your entire site Zeus and put their own banners on it?
Maybe you are both out of google for dupe content
1. Send them a cease and desist e-mail from your 'legal dept'.
2. Contact Yahoo
3. Contact their banner providers.
Usually if you raise some trouble for them they will move on to someone else who won't notice.
Everyone relies on a provider of some sort - even for an IP#.
If those first steps get no results then
1. Find out their host and threaten them with Spamhaus reports and other blacklists.
2. If they are their own host, go to their IP registrar.
Nobody is immune and nobody wants to be on those lists.
send a DMCA to Yahoo.
Sometimes there is a fine line but if they just copied your whole site they are way past the line.
My situation with the site is that I got hijacked by a site with a "no no for googlebot" metatag. It replaced my site in the SERPs. Then I noticed 14 302 links to me that were each counted as a unique page by Google. Everything was cleaned up 3 months ago, and there is still no reappearance in the Google SERPs; only 7-10% of the site's pages are listed.
Because they know they are doing something wrong, so why bother?
Look in the [webmasterworld.com...]
you will find others who had the same problem and what they did.
Sometimes it's unintentional [webmasterworld.com...]
although in your case I doubt it because they are running adsense on it.
Why doesn't Googlebot visit for real? It comes by a few times, and sometimes it goes through 3-5% of your site.
I think it's maybe because we have got some kind of filter, because other hijackers and the Google 302 bug have copied our site.
Why aren't we getting back into the index after removing all the redirects and hijackers?
I think it's because of the filter AND because Google has not updated the supplemental DB for 6 months. They did an update 1-2 months ago and the supplemental results were gone, but after a few days the old cached pages were back as supplemental.
I don't think we will be back before a real update of the supplemental DB.
What are your thoughts?
Look at every listing in site: (the whole site may not be listed) and look for anything strange. An old cache is a flag that the bot can't update that page for some reason; other listings will suffer from it, losing their title and description or falling supplemental.
I'm guessing that Googlebot requests oldest-cache-first (naturally), so if it can't get past that oldest cache, the other listings will suffer. Then the whole site will suffer.
I see this in the logs sometimes, every 4-6 weeks: someone has typed my main keyword on google.com and got my site, as in the good old days. But what can we learn from that? Is it a clear sign of a filter, a ban, a Google mess, or what? I want to understand this.