Forum Moderators: Robert Charlton & goodroi

My site was the first of its kind, and now it has vanished from Google

sabine7777

6:35 am on Sep 20, 2005 (gmt 0)

10+ Year Member



For the past year I have experienced periodically being completely dropped off Google. My site was the FIRST of its kind and appears in the first spot of all the natural search results. I'm just a small business, but since September 2004 I have been vanishing off of Google every 6 weeks or so--recently it has been more often and for longer periods. Does Google discriminate against older sites? Are they doing it so that we will advertise with them? Any help, advice, or comment for a desperate single mother of 4!

reseller

4:18 pm on Oct 2, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi Folks

Do you really believe that ALL the changes we have been witnessing since 22nd Sept 2005 are due to [FILTERS] deployment? And how many kinds of filters have we seen since that date?

It is not because I'm "trying to figure out why did you drop that glass" (Copyright steveb 2005, All rights reserved) ;-) , but it will help a lot to find out what the affected sites should change after this update/no-update is over in order to recover (or before filing the next reinclusion request).

And suppose, for "political" reasons, the folks at Google wouldn't admit that they have been testing algo changes (possibly in addition to filters); shouldn't we ourselves decide whether it's an update or just filters?

europeforvisitors

4:55 pm on Oct 2, 2005 (gmt 0)



it will help a lot to find out what the affected sites should change after this update/no-update is over in order to recover (or before filing the next reinclusion request).

If your sites adhere to the Google Webmaster Guidelines and you don't have any obvious technical problems (such as non-www URLs not being redirected to www versions or vice versa), then why change anything at all? Wouldn't it make more sense to file a reinclusion request and let Google sort out its problems?

Making changes for the sake of change (a common knee-jerk reaction to loss of rankings) is a bad idea for three reasons, IMHO:

1) It's likely to be a waste of time;

2) It may compound the problem;

3) It introduces new variables that make it harder for SEs to troubleshoot their own problems.
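For anyone unsure about the non-www/www redirect europeforvisitors mentions, here is a minimal Apache .htaccess sketch of the usual fix (example.com is a placeholder for your own domain, and this assumes mod_rewrite is enabled on your host):

```apache
# Permanently (301) redirect non-www requests to the www hostname,
# so the engines see only one canonical version of each URL.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The [R=301] flag matters: a permanent redirect tells the engines which hostname to keep, while a temporary (302) redirect can make the duplicate-hostname problem worse.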

pescatore

7:25 pm on Oct 2, 2005 (gmt 0)

10+ Year Member



"If your sites adhere to the Google Webmaster Guidelines and you don't have any obvious technical problems (such as non-www URLs not being redirected to www versions or vice versa), then why change anything at all? Wouldn't it make more sense to file a reinclusion request and let Google sort out its problems?

Making changes for the sake of change (a common knee-jerk reaction to loss of rankings) is a bad idea for three reasons, IMHO:

1) It's likely to be a waste of time;

2) It may compound the problem;

3) It introduces new variables that make it harder for SEs to troubleshoot their own problems"

europeforvisitors seems to admit that Google has a problem. Or am I wrong?

FattyB

7:50 pm on Oct 2, 2005 (gmt 0)

10+ Year Member



Well, nothing is perfect, so I guess he is saying it's better to wait to be rescued than to head off into the desert looking for help...

I know I will not be changing anything on our site as a result of the recent traffic drop, unless Google issues some new guideline or such. Bar concentrating a bit more on repeat visitors...

No point trying to catch bubbles, especially when a lot of it is hearsay. Plus you might change a bunch of things, then they fix whatever they had wrong regarding your site, but to no benefit, since you now have some other thing flagged or whatever.

I am just hopeful things will resolve, whether that takes a week or 6 months remains to be seen.

What I don't like is lack of transparency, though I understand that is open to abuse.

steveb

7:52 pm on Oct 2, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"then why change anything at all?"

The answer should be obvious if you understand the problem. Adhering to the guidelines and doing things right yourself is not the issue here. The problem is Google has gone a step further and is creating problems.

Google keeps data on deleted pages for at least two years. This data can reassert itself at any time and reappear as a supplemental result. It seems clear that if the content of the two-year-old page is now on a different page, you could have a problem, even though you did everything right and sensible in deleting the page two years ago.

People should be working on getting rid of their Supplemental listings. Using the remove URL tool doesn't work. Redirecting a supplemental page doesn't work. Deleting a page and linking to the 404 location doesn't work. Personally I'm trying something else, which I think has a good chance of working.

The point is simple: webmasters have to do things to make it less likely that Google will do something blunderingly stupid.
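steveb doesn't say what his own fix is, so this is not his method -- but one technique often suggested for stubborn dead URLs is answering requests for them with an explicit 410 Gone instead of a 404, since a 410 says the page was removed on purpose rather than merely lost. A minimal Apache sketch, with /old-widget-page.html as a made-up example path:

```apache
# mod_alias: answer this long-deleted URL with "410 Gone"
# rather than the default "404 Not Found".
Redirect gone /old-widget-page.html
```

Whether Google drops a Supplemental listing any faster for a 410 than for a 404 is exactly the kind of thing this thread is arguing about, so treat it as an experiment, not a cure.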

cleanup

7:56 pm on Oct 2, 2005 (gmt 0)

10+ Year Member



FattyB - No, you are not wrong (IMHO, etc.): Google is very broken.

I was about to ask: what is the best guess as to when Google might get around to fixing these problems?

It seems the fix is not at all trivial, or they would have done something about it by now.

So are they really on a downward slope to oblivion? Things don't usually work like that... so when? How long might it take for them to regroup, fix the filter (or whatever) and get things moving again?

If (recent) past experience is anything to go by, it could be one or two months... although I really hope not!

europeforvisitors

8:09 pm on Oct 2, 2005 (gmt 0)



europeforvisitors seems to admit that Google has a problem. Or am I wrong?

No search engine is perfect, and any change to algorithms or filters is likely to have at least some undesirable side effects. Furthermore, algorithms and filters must evolve--with or without side effects--simply to cope with the exponential growth of junk pages that have no value to anyone but their owners.

Disclaimer: I'm not suggesting that "problems at Google" are responsible for every penalty or loss of rankings. In some cases, Google obviously wants to purge types of content that may have squeaked through in the past, and for good reasons.

reseller

9:32 pm on Oct 2, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi Folks

Here are few DCs moving in the right direction ;-)

64.233.161.99

216.239.37.99
216.239.37.104

216.239.39.99
216.239.39.104

It would be very nice to wake up tomorrow morning and see that the results of those 5 DCs have propagated to the rest of the DCs. Who knows!

Good night....

walkman

10:17 pm on Oct 2, 2005 (gmt 0)



If I search for my "domain name", my site is not in the top 100. It was #3 as recently as last week. My guess is that Google has tightened filters regarding too-similar anchor text, and I've been caught in it. I probably only have 4-5 inbound links with "domain name" as anchor, but dozens with "domain.com," since many people link to me that way. Plus, the domain name is descriptive (very generic).

My domain name was registered about 8-9 years ago, and it's a very good one. No dashes or anything.

Janiss

11:13 pm on Oct 2, 2005 (gmt 0)

10+ Year Member



>>Hi Folks
Here are few DCs moving in the right direction ;-)

64.233.161.99

216.239.37.99
216.239.37.104

216.239.39.99
216.239.39.104<<

Hmmm... they ain't doing a thing for my sites.

BillyS

11:19 pm on Oct 2, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Same here - a spot or two...

MrSpeed

2:13 am on Oct 3, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm not seeing a big difference between filter=0 and regular results.

AndyA

2:43 am on Oct 3, 2005 (gmt 0)

10+ Year Member



My guess is that Google has tightened filters regarding too-similar anchor text, and I've been caught in it. I probably only have 4-5 inbound links with "domain name" as anchor, but dozens with "domain.com," since many people link to me that way.

And since linking to a site by using the domain name is perfectly normal, NATURAL, and makes good sense, WHY would Google filter out results like that?

If you're going to link to Google, chances are you're going to have "Google" or "Google.com" in the anchor link; most likely it will not be "Search Engine", "Search Directory", "Feeling Lucky?", or something else similar.

If Google is in fact filtering sites because too many are linking to them with their domain name, then Google is 100% solely responsible for this mess, as well as being ignorant, short-sighted, and out of touch with how people truly link.

To filter a site because too many of its anchor links are similar is an INVITATION to manipulation on the part of Google. If it's a widget site, it's perfectly normal, and it should be expected, that the word widget will appear in most of the links.

I have no idea if this is really happening or not; I'll leave that to the more knowledgeable people here. But I will say that I've probably heard more ridiculous things than this -- I'm just not sure when!

theBear

3:12 am on Oct 3, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Gee, I guess I do it all wrong: I always link to widget sites using Google as part of my link text. I'll try linking to Google using widget instead.

Thanks for helping me see the light ;-).

PhattusCattus

4:45 am on Oct 3, 2005 (gmt 0)

10+ Year Member



Anchor text in internal links within the site also seems to play a role...

My directory site has a list of names and a link saying "see more info about (name)", which goes to a detailed data page.

The title and metatags in the detailed data page include the (name) used in the anchor text...

All those detailed data pages got crushed...

reseller

5:09 am on Oct 3, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Good morning Folks

Unfortunately I see that the 5 DCs I mentioned in my previous post have changed this morning in the "wrong direction".

Can't those kind folks at Google leave anything in peace anymore? ;-)

However, I still see a few "reseller friendly" DCs left:

72.14.207.99

72.14.207.104

Kimkia

5:52 am on Oct 3, 2005 (gmt 0)

10+ Year Member



Anchor text links on my own site are killing me. That and repetitive keywords describing pages. Some repetitiveness is necessary and part of normal navigation. But some is not, and I believe Google has lowered the boom on keyword density and I've been caught in the trap.

I'm busy rewriting over 300 hand-coded shtml pages that have worked just fine for three years. Them's the breaks. I'm taking the opportunity to update poor coding, working on each page individually, reducing keyword density, cleaning up generally and hoping for the best. I shudder to think what will happen a month from now when Google decides my keywords are too infrequent... :(

It's not enough to offer solid content. I do that in spades. Each original page on my site represents a project that I have designed, constructed, photographed, written up and coded so that others can reproduce the project (for personal use, only, of course).

That's a lot of hard work, folks... and it's so disheartening when you're hit with www and non-www woes; 302 hijackers; and those thieving people at a well-known network who adore frames and are now ranking higher than me for the search terms of my original projects - grrrrrr......

I've just started adding the frame-breaking code to the head of my pages. It's very satisfying to click on the links that are supposed to be "about" my subject and get directly to my site for once. HAH!
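The "frame-breaking code" Kimkia mentions is normally a couple of lines of script in the page head. The exact code on her pages isn't shown here, but the common pattern looks like the following sketch, wrapped in a function only so the logic is easy to see (real pages just inline the if-statement in a script tag):

```javascript
// If this page has been loaded inside someone else's frameset,
// replace the whole browser window with this page.
function bustFrames(win) {
  // win.top is the outermost window; win.self is the current one.
  // They differ only when the page is being framed.
  if (win.top !== win.self) {
    win.top.location.href = win.self.location.href;
  }
}
```

In a browser you would call bustFrames(window); frame-happy networks can't show your page inside their chrome once it runs.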

One unique circumstance added to my troubles. Months ago my web host had a server failure; the backups were very stale. I ended up with antique htm pages that I had thought were deleted being restored to my server, resulting in duplicate content when I uploaded newer shtml pages. I thought I had deleted all the restored htm pages but apparently missed a few. Thanks to Google, I'm now finding and exterminating these dinosaurs.

My point is: even if you have a valuable content-based site, chances are not all this mess is Google's fault. I think they are making a genuine effort to rid their index of scrapers and you gotta applaud that. I just wish it wasn't costing me so much in terms of time, money and ego.


zoth

5:53 am on Oct 3, 2005 (gmt 0)

10+ Year Member



As far as I can see, &filter=0 is not working anymore.

My site is filtered completely.

No need to pray anymore... the Gods of Google have made their decision.

I think we don't have too many options:
1.) Wait 3 or 6 months; perhaps we will get back.
2.) Remove the latest indexed links from Google.
3.) Contact Google until somebody gives an answer.

Or?

reseller

6:04 am on Oct 3, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



zoth

>>No need to pray anymore... the Gods of Google have made their decision.<<

Don't lose faith... it isn't over yet ;-)

taps

8:28 am on Oct 3, 2005 (gmt 0)

10+ Year Member



Yep &filter=0 does not bring my site back on the DCs mentioned here.

Optimistic approach: Maybe the penalty is over and now the pages are starting to get back into the SERPs. At least I hope so.

almar

10:22 am on Oct 3, 2005 (gmt 0)

10+ Year Member



I notice that my Google Sitemaps .xml file has not been spidered in 2 days, instead of the usual daily crawl. Has anyone else noticed something weird with Sitemaps?

disspy

10:50 am on Oct 3, 2005 (gmt 0)

10+ Year Member



I've noticed that Googlebot has been making about 1/4 of its usual visits over the past month -- exactly the same ratio as my SERP traffic drop.

taps

10:53 am on Oct 3, 2005 (gmt 0)

10+ Year Member



almar,

Yes, some of my sitemaps are set to "pending". I think they're doing some kind of update for Sitemaps. However, I doubt it has anything to do with the filter stuff.

anttiv

11:02 am on Oct 3, 2005 (gmt 0)

10+ Year Member



filter=0 no longer works, which means even less traffic, as some search sites use Google with the filter set to 0 as the default.

It can't get any worse than this so hopefully there will be a real major update soon.

Dayo_UK

11:41 am on Oct 3, 2005 (gmt 0)



>>>>I've noticed that Googlebot has been making about 1/4 of its usual visits over the past month -- exactly the same ratio as my SERP traffic drop.

Yes, Googlebot decreases with the Googlebug.

So what happens is: you don't get crawled by Google; scrapers crawl you faster than Google does; the scrapers get crawled by Google; therefore the scrapers get rewarded as though your content were originally theirs -- and Google makes money off the scrapers.

Google win win win.

Don't know if it's deliberate - certainly sickening.

Google - how many more times - work out the canonical URL of my site, please. FFS - what do I have to do? The 301 is done!!!! You know it is there - so what have you done, sandboxed it or something?

rytis

11:47 am on Oct 3, 2005 (gmt 0)

10+ Year Member



Looks like Google is struggling to fight spammers and scrapers to the point that, in order to keep the SERPs at least somewhat useful, a higher and higher collateral-damage percentage is becoming acceptable. If this continues (and I can't see it going the other way), the day will come when 50% collateral damage is acceptable.

reseller

12:07 pm on Oct 3, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi Folks

Rumour has it that the folks at the plex have agreed NOT to resolve the canonical issue before Dayo_UK posts all his 100 "Listen Google" calls.

I.e. Dayo_UK & Co. have to wait until Xmas Eve ;-)

thecityofgold2005

12:15 pm on Oct 3, 2005 (gmt 0)

10+ Year Member



I have also noticed very slow sitemap downloading for the past 4 days.

Dayo_UK

12:20 pm on Oct 3, 2005 (gmt 0)



Lol Reseller.

I just wish we webmasters and Google could cooperate on sorting out this problem.

I assume there is someone at the plex trying to troubleshoot - but a bit of two-way communication on canonical URLs would probably get it fixed much, much sooner.

I have tried to explain the symptoms - but I get nothing back - am I just wasting time and energy?

reseller

12:49 pm on Oct 3, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Dayo_UK

>>I have tried to explain the symptoms - but I get nothing back - am I just wasting time and energy? <<

I guess the situation at the plex is: whenever the folks on the Google Search Quality Team hear about the "canonical issue", they get curious and ask: is it a kind of "bacon polenta"? Mmmm. Bacon-y goodness! ;-)

This 1014 message thread spans 34 pages.