
Google SEO News and Discussion Forum

This 1014-message thread spans 34 pages; this is page 5.
My site has been first, now vanished from Google
My site has been the first of its kind, and I keep dropping off Google
sabine7777




msg:760006
 6:35 am on Sep 20, 2005 (gmt 0)

For the past year I have periodically experienced being completely dropped from Google. My site has been the FIRST of its kind and sits in the first spot in all the natural search results. I'm just a small business, but since Sept of 2004 I have been vanishing from Google every 6 weeks or so -- recently it has been more often and for longer periods. Does Google discriminate against older sites? Are they doing it so that we will advertise with them? Any help, advice, or comments for a desperate single mother of 4!

 

shri




msg:760126
 1:39 am on Sep 24, 2005 (gmt 0)

Looks like this is a directory-wide or site-wide filter (not too sure about this). Pages added in the last 3 or 4 days -- which never surfaced into the top 30 -- are in the top 30 with filter=0.

So, I don't think this is a dupe content filter like a couple of people suggested. filter=0 seems to just disable the new thingy that Google rolled out.

aliszka




msg:760127
 1:41 am on Sep 24, 2005 (gmt 0)

I added the &filter=0 parameter.

Sure enough, my site is back at the top. Very strange: several old sites have made a comeback and most sites that were up top have fallen. It must be a ploy to spark paid advertising, in my opinion.
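
For anyone trying this, the parameter is just appended to the normal Google results URL; the query below is a made-up example (the site and keywords are placeholders, not taken from this thread):

[code]
http://www.google.com/search?q=blue+widget+reviews&filter=0
[/code]

Setting filter=0 asks Google not to apply its result filtering to that query, which is why filtered sites reappear in their old positions.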

steveb




msg:760128
 1:48 am on Sep 24, 2005 (gmt 0)

Yes, pages created four days ago that have no duplicates in any way are banished to very low positions, but appear first for searches using &filter=0... so it is a sitewide penalty that may have to do with duplicates in general (a site has too many duplicates on the Internet) but does not have anything to do with the page in question having a duplicate itself.

aliszka




msg:760129
 1:55 am on Sep 24, 2005 (gmt 0)

I think this is related in some way to links; Google is trying to put the people selling text links out of business, in my opinion!

walkman




msg:760130
 2:10 am on Sep 24, 2005 (gmt 0)

Many sites got hit last year on September 23rd. Anyone else remember it?

nsqlg




msg:760131
 2:15 am on Sep 24, 2005 (gmt 0)

"I think this is related in some way to links; Google is trying to put the people selling text links out of business, in my opinion!"

I don't agree, because that would be a shot in the foot. Obviously any company tries to increase profits, but Google would do this the right way; they have more money than they can spend. It's not like a small business fighting to stay alive. EarnMoney4Fun, heh.

andrea99




msg:760132
 2:51 am on Sep 24, 2005 (gmt 0)

My site, banned 7/28, is back 9/22...

Unfortunately the traffic is NOT back but it does seem to be building slowly.

I think Google is making some improvements which have had an unintentional effect and they are now faced with cascading errors.

They whack one mole and another appears. It's a good thing they just loaded $4 billion into the back door; if they didn't have such huge resources I'd say they were certainly headed for a massive meltdown.

But money changes everything. :)

nanotopia




msg:760133
 2:54 am on Sep 24, 2005 (gmt 0)

This latest update totally devastated my traffic. I have a super white-hat optimized website that's almost 10 years old. I have lots of high-quality original content, and I've experienced very high SERPs for many keywords and phrases. Now I'm nowhere to be found. The same thing happened during the last phase of Bourbon. This is so frustrating. Do everything right, put an insane amount of time into a quality family resource, and bam! It's all gone from Google.

theBear




msg:760134
 3:14 am on Sep 24, 2005 (gmt 0)

nanotopia,

If I had a site as both a .com and a .org with the same content on both, something somewhere would give.

mgpapas




msg:760135
 3:39 am on Sep 24, 2005 (gmt 0)

Something I just now noticed; it may be related, it may not be (likely it is, though in what way I'm not sure). I've been listed on Hotscripts for a year; my detail page there has the same title as my home page (which, as I mentioned, has dropped off the Google radar for the primary key phrase), and the description is the same as the opening paragraph of that page.
Yesterday, for no reason and without notification, they removed me completely from their index.

Is it because they feared, or knew, that minimal duplication of my site's content hurt them or could hurt them? The category I am in had 23 listings and now suddenly has 14. My listing was not recategorized; it was completely removed.

mgpapas




msg:760136
 3:45 am on Sep 24, 2005 (gmt 0)

I wanted to add that I was #1 in that category due to the high profile I gave them on my site and the resulting reciprocal visitors they received, so removing me for not sending significant reciprocal hits was not the reason.

Gavolar




msg:760137
 4:28 am on Sep 24, 2005 (gmt 0)


"filter=0 seems to just disable the new thingy that google rolled out. "

I don't think so. When I add filter=0 to a search for my very original "domain name", it only comes in 6th, but lists every single page of my 55-page site from 6th onwards (1,700 results).

nsqlg




msg:760138
 4:31 am on Sep 24, 2005 (gmt 0)

Nanotopia, I quickly looked at the site in your profile; it seems very authentic.

I just found these issues:

- The dot-org version looks like trouble: old stuff is cached, and it 302-redirects to the wrong address (adding an extra slash after the domain). Maybe the main URL was "hijacked" by the .org because it ranks higher now.

- More backlinks from medium/small sites could help too.
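
If you want to see what that .org redirect is actually sending back, one quick check (a sketch; example.org is a placeholder for the real domain) is to fetch only the response headers:

[code]
curl -I http://www.example.org/
[/code]

A 302 status with a Location: header pointing at a malformed URL (for instance, one with an extra slash after the domain) would match the behaviour nsqlg describes.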

Rick_M




msg:760139
 4:49 am on Sep 24, 2005 (gmt 0)

This all seems to be the same phenomenon discussed in the thread almost exactly 1 full year ago that was titled:

"22-23 september Google traffic dropped dramatically"

[webmasterworld.com...]

which no longer appears publicly available.

The filter 1 year ago was the initial filter that hurt many of my sites. I have been convinced that my particular sites were hurt because I had a lot of datafeed driven pages that acted as doorway pages for affiliate content - like the Amazon Product Feed script. The effect seemed to be that my sites, even though many years old, were "sandboxed" with no results in the top 20 for any search terms, even the site's name. I am not sure whether the filter is triggered by having many pages with duplicate content, or some other structural features of the sites that use datafeed driven content.

And yes, 1 year ago, the sites would show up where they used to by using the &filter=0 string on searches.

My sites had finally returned in August of this year, but again took a dive on September 22nd this year.

I'm already dreading what September 22nd will bring next year.

ct2000




msg:760140
 7:34 am on Sep 24, 2005 (gmt 0)

can confirm "&filter=0" does bring back my site too

what are they doing?

taps




msg:760141
 7:37 am on Sep 24, 2005 (gmt 0)

&filter=0 works like a charm. My site is completely back when applying this parameter.

A few things had been changed on my site between Bourbon and now:

- Added a meta description that contains the first paragraph

- My provider moved, and so my IP address changed

- Using longer text snippets for article lists in different categories

Reading the previous post, I think that maybe the lists are a problem. Because many articles can be found in different categories, there is some kind of redundancy. Maybe Google does not like that.

But I cannot see a reason why my site is banned completely.

One other thing: a few articles written between Allegra and Bourbon can still be found at #1. Everything else has vanished. Why? And why is this penalty site-wide?

As mentioned before: This site consists of > 3000 self written articles. Sticky me for the URL if you are interested.

pescatore




msg:760142
 7:39 am on Sep 24, 2005 (gmt 0)

"many sites got hit last year on September 23rd. Anyone else remebers it?2
I do , one of my sites was hit again like now and come up again Decemper ,lost after a couple of months later come up after a few months, lost again now .I don't know but in the future i will concentrate making pages for Yahoo and MSN ,and thanks God i earn my living from Yahoo and MSN traffic.I don't care anymore for every crazy update they do ,what is the point ,they update every few months ,new crap comes in ,good staff goes away ,they realise after a few months the damage and here we go again on circles.Why they dont make a universal filter or algo or anything that will block the crap,and just live it for ever ,live the sandbox and let the old and new good informative and content sites to rise up normaly ,not to make those updates because a few billions of crap have been added daily ,every day new domains with k's of pages coming up they get spidered and added to G's index even though they never show up to the top 1000 but still creating an overload to the engine ,that as well can be a proble with all those updates.But why clean and old pages with tones of great content have to pay the prise every few months .That has become somehow rediculous ,i am not gona let Google bother my life ,google is a temporary fashion ,the web and the webpages they will be alive out there after Google will be only a memory .

zoth




msg:760143
 8:22 am on Sep 24, 2005 (gmt 0)

Hi all,

I got hit on Thursday. I lost more than 65% of my visitors.

My site is still in the top 10 if I use the "&filter=0" parameter.

Without "&filter=0" my site is somewhere around 180th place,
and that is the default SERP.

I really don't know what I have to do now ...

But I found some interesting things ...
When I checked the number of indexed pages with the "site:"
command, I got back more than 183,000(!) pages.
I have only about 28,500 pages!

Another interesting thing:
I have printable versions of the original pages. These shtml pages don't contain graphics, only text.

As I see it, Google indexed these shtml pages as well ... perhaps Googlebot thinks this content is duplicated ... BUT in robots.txt I disallowed crawling of the entire shtml directory to prevent indexing, back last year ...

It seems something really screwed up ...
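
For reference, a minimal robots.txt sketch of the kind of rule zoth describes, assuming the printable copies live in an /shtml/ directory (the path is a stand-in, not taken from the thread):

[code]
User-agent: *
Disallow: /shtml/
[/code]

Note that Disallow only stops compliant crawlers from fetching those pages; URLs that are already in the index, or that are linked from elsewhere, can still show up as URL-only entries.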

mcavill




msg:760144
 8:25 am on Sep 24, 2005 (gmt 0)

>>"&filter=0"

yeah that brings mine back - so what does that mean?

it seems a bit wrong to think I need to tweak my content that's been copied by others, or let Google know mine is the original, to try to get my old ranking back?

hopefully the update / non-update is still cooking :-S

reseller




msg:760145
 8:55 am on Sep 24, 2005 (gmt 0)

zoth

>>I really don't know what I have to do now ...<<

Best thing to do now is not to make any changes on your site, but just wait till the update is over, IMO.

cleanup




msg:760146
 9:10 am on Sep 24, 2005 (gmt 0)

&filter=0 brings back my site too.

My site: original content, first registered in 1998, semi-informational (travel), but not very competitive keywords, mostly towns and villages that no one has ever heard of.

Results now shown: autogenerated snippets from my site.

What is happening?

taps




msg:760147
 9:23 am on Sep 24, 2005 (gmt 0)

I have print and mail versions of my articles too.

I've excluded them via robots.txt and with noindex,nofollow.

However, these files are indexed too and may cause a dupe content penalty. When searching for site:www.widget.com printxyz.php I do see all these print versions as URL-only links.

These days I also see Googlebot crawling these mail and print versions of my articles. What's going on there? Is my robots.txt defective? Did I do something wrong with noindex,nofollow?

[Added]
One more thing:
When invoking Google's URL removal console I saw something remarkable: after submitting my robots.txt to the console, I saw only "removing image xyz.php".

The possible cause for that: the URL console does not seem to interpret a robots.txt with
[code]User-agent: *[/code]
properly.
I simply duplicated the exclude list and put a
[code]User-agent: Googlebot[/code]
in front of it.

After resubmitting the robots.txt to the URL console, Google shows a "removing file" in the status list.

My theory: maybe you'll have to add a
[code]User-agent: Googlebot[/code]
explicitly in your robots.txt.

Anyone with duplicate print and mail versions: What's in your robots.txt?
[/Added]
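
A minimal sketch of the workaround taps describes, assuming the print and mail copies live under /print/ and /mail/ (hypothetical paths; substitute your own): the exclude list is simply repeated under an explicit Googlebot section.

[code]
# Rules for all crawlers
User-agent: *
Disallow: /print/
Disallow: /mail/

# Same exclude list, addressed to Googlebot explicitly
User-agent: Googlebot
Disallow: /print/
Disallow: /mail/
[/code]

The repetition matters because a crawler that finds a section naming it specifically uses only that section and ignores the generic one, so the Googlebot block has to carry the full exclude list on its own.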

JuniorOptimizer




msg:760148
 11:15 am on Sep 24, 2005 (gmt 0)

My site is back with filter=0 also. I found about 10 complete copies of the site on other people's servers. I'm sending stern emails to them. I'll probably get on the phone and blow off some steam with some of these thieves later.

What really sucks is how a cloud is now cast on my domain. Even adding new material won't help.

mcavill




msg:760149
 11:39 am on Sep 24, 2005 (gmt 0)

hehe - I think I've found the possible cause of my filtering.

There are about 50 .ru scraper sites that have their scraped results, and then at the bottom of each page the entire text content of my home page! I don't think emails will do me much good in this situation; I think I'll wait and see how this works out.

reseller




msg:760150
 12:48 pm on Sep 24, 2005 (gmt 0)

mcavill

You may wish to submit a spam report to Google at once:

[google.com...]

I know...I know...

Many fellow members post that they have reported spam to Google with no results at all.

However, from reading GoogleGuy's and Matt's recent posts I sense that the "Google Search Quality Team" is paying more attention to spam reports.

You have everything to win and nothing to lose.

I hope this helps.

JuniorOptimizer




msg:760151
 1:16 pm on Sep 24, 2005 (gmt 0)

It's so annoying to see copies of my web pages with "Ads by Google" on top of them. The mentality of people who would copy an entire website is beyond me.

theBear




msg:760152
 1:18 pm on Sep 24, 2005 (gmt 0)

I will repeat my post of yesterday in an expanded form.

There is at least one site running an IP delivery script using a DMOZ dump as a data source.

If your site is in DMOZ then you are in danger of having duplicated data in Google's index. This duplication is not excerpts but entire pages.

I have found 3 sites that duplicated our home page using such scripts, there may be many more such sites out there.

This situation has been reported to Google through multiple channels.

Like a lot of folks here, adding &filter=0 brings our pages back in the SERPs.

JuniorOptimizer




msg:760153
 1:22 pm on Sep 24, 2005 (gmt 0)

It looks like a newer generation of scraper sites that still have the 10 snippets, but have added entire text copies of the high-ranking websites at the bottom of their pages for their targeted keyphrase.

This is truly preposterous. I've already found all the contact info for one of these people, and he has a dedicated server at the same place I do. At least it's a fellow small-time webmaster giving it to me this time.

cleanup




msg:760154
 1:28 pm on Sep 24, 2005 (gmt 0)

"preposterous" Yes. Good word.

How stupid. Not only are we competing with our own content, but to add insult to injury we are penalized for it! And then not even found when people search for it.

Is there nothing we can do?

ct2000




msg:760155
 1:38 pm on Sep 24, 2005 (gmt 0)

I just wish GoogleGuy would show up and explain what's going on with this "update/non-update".

mcavill




msg:760156
 1:38 pm on Sep 24, 2005 (gmt 0)

JuniorOptimizer, thanks for the suggestion - I'll contact Google if it's not sorted out over the next couple of days - I'll probably just go for the [google.com...] option.

I tried to call the number listed in their WHOIS to give them a friendly warning, but of course, it didn't work.

I've still got my fingers crossed that not all the data for this update is folded in, and it'll work out OK.
