Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 1014 message thread spans 34 pages; this is page 19.
My site has been first; now it has vanished from Google
My site has been the first of its kind, and I drop off Google
sabine7777




msg:760006
 6:35 am on Sep 20, 2005 (gmt 0)

For the past year I have periodically experienced being completely dropped from Google. My site has been the FIRST of its kind and has held the first spot in all the natural search results. I'm just a small business, but since Sept 2004 I have been vanishing from Google every 6 weeks or so; recently it has been more often and for longer periods. Does Google discriminate against older sites? Are they doing it so that we will advertise with them? Any help, advice, or comments welcome from a desperate single mother of 4!

 

Iguana




msg:760546
 6:22 pm on Sep 28, 2005 (gmt 0)

Aug 10 2004
Aug 25 2004
Sept 23 2004
Dec 16 2004
Feb 01 2005
(some others here but I don't have dates)
July 22 2005
and now Sep 20 2005

These were all dates when Google applied their Filter Updates (Allegra was also an algo update). Some sites dropped out of sight, some reappeared.

I'm very tempted to say "where have you all been for the last year?" There have been threads that have discussed this phenomenon after each filter update. I don't think anyone has figured out exactly what the filter is (we all have our pet theories), but every thread has talked about the devastation to traffic of dropping 100+ places, even on your site's own name.

If this is the first time you have been hit then you have been very lucky up until now. Some of us have been buried for a long time and have come back into Google over the past few filter updates. Whether our re-appearance is related to whatever actions we have taken is uncertain.

stargeek




msg:760547
 6:23 pm on Sep 28, 2005 (gmt 0)

"Aug 10 2004
Aug 25 2004
Sept 23 2004
Dec 16 2004
Feb 01 2005
(some others here but I don't have dates)
July 22 2005
and now Sep 20 2005"

How about May 20th?

Iguana




msg:760548
 6:30 pm on Sep 28, 2005 (gmt 0)

Thanks for reminding me - I stopped counting after a while! I was existing purely on Yahoo traffic then.

stargeek




msg:760549
 6:32 pm on Sep 28, 2005 (gmt 0)

We should compile a list of what the filters probably targeted on all those dates.

modemmike




msg:760550
 6:34 pm on Sep 28, 2005 (gmt 0)

Index count numbers are all over the place today... from 10,100 to 782, which is a lot more accurate.

I see new counts on:
64.233.167.104
216.239.57.104
216.239.57.98
216.239.53.99

Never mind - the numbers went back as soon as I posted this. I checked them 4 times before posting, too!
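For anyone who wants to repeat the cross-datacenter comparison modemmike describes, the check can be scripted. This is only a sketch: the IPs are the ones he lists above (2005-era datacenters, long since retired) and the query is a placeholder.

```python
from urllib.parse import quote_plus

# Datacenter IPs from the post above (2005-era; long since retired).
DATACENTERS = ["64.233.167.104", "216.239.57.104", "216.239.57.98", "216.239.53.99"]

def dc_search_urls(query):
    """Build a direct /search URL against each datacenter IP so the
    reported result counts can be compared across DCs."""
    return ["http://%s/search?q=%s" % (ip, quote_plus(query)) for ip in DATACENTERS]
```

Fetching each URL and scraping the "of about N" count would then show whether the DCs disagree, as modemmike saw.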

steveb




msg:760551
 6:44 pm on Sep 28, 2005 (gmt 0)

The URL removal tool is just a URL hide tool. It's pointless to use, unless you like putting a rug over a hole in the floor.

Google has had multiple technical failures the past few years, some catastrophic and some small. In some cases, Google employees were clueless about the problems until webmasters more or less forced them to see, with Google Guy's unfortunate comments about 302s six months ago as the best example. They didn't know how screwed up they were, despite it being obvious.

The Supplemental index was a stupid idea that has some merits, similar to getting out of dusting the living room by setting fire to the house. However, unlike other things, the Supplemental index is seldom seen by the public, so the humiliation it brings Google is limited to people in the search-conscious community. Eric doesn't tell CNET how proud they are to have listings for pages deleted two years ago; pages a webmaster has told Google more than once to delete because they don't exist; pages Google has crawled and seen return a 404 literally 100+ times; etc.

Google's problems include mistaken notions, but it's mostly ineptness. No one at Google sat down before the recent mess and said, "Hey, let's look incompetent by making stupid decisions that anyone can see by adding &filter=0 after our search-page URLs."

This isn't an easy job they have, and sometimes they do a terrible job at it, even if they do better than their competition.

There will always be tin hat conspiracy posts here, but the fact of the matter remains that dropping a glass on the floor is not "updating" it.

[edited by: steveb at 6:47 pm (utc) on Sep. 28, 2005]

Iguana




msg:760552
 6:47 pm on Sep 28, 2005 (gmt 0)

The filter(s) seem to be applied to "search phrases" and so only hit a proportion of the SERPs. That doesn't mean they have anything to do with theming/keywords; just that that is the method of application.

Of course everyone affected has been searching for the answer: 302 redirects (everyone has a few pages with 302s pointing at them), www/non-www issues and/or duplicate content, adding large numbers of pages, AdSense, affiliate links.

I closely examined 3 very good, non-spammy sites that were hit, as well as my own. In all cases (including my own sites) I could see a quite large proportion of links coming from a few domains. So, as a veteran of the PR0 cross-linking penalty, I tend to favour linking patterns as the trigger.

texasville




msg:760553
 6:53 pm on Sep 28, 2005 (gmt 0)

64.233.187.104 is the only one I checked. I'll try the others. I had to go back and look at the cache for this one, and I have moved up to #202 on the search. Interesting.

wiseapple




msg:760554
 6:57 pm on Sep 28, 2005 (gmt 0)

We are fighting this "&filter=0" penalty.

There are a few other threads that have discussed this, even one with a note from GG.

[webmasterworld.com...]
[webmasterworld.com...]

Has anyone ever recovered? Any hints at what could be causing the penalty? (E.g., adding "&filter=0" onto any term you used to rank for puts you back in the right place.)

My bet is on scrapers causing this issue.
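The &filter=0 check wiseapple mentions is just a query-string tweak. A minimal sketch of building the two URLs to compare (the query is a placeholder):

```python
from urllib.parse import urlencode

def filtered_and_unfiltered(query):
    """Return the normal Google result URL and the same URL with &filter=0
    appended, which bypasses the duplicate/result filtering being discussed."""
    base = "http://www.google.com/search?" + urlencode({"q": query})
    return base, base + "&filter=0"
```

If a page ranks in the second URL's results but not the first, the site is being held back by the filter rather than simply not ranking.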

texasville




msg:760555
 7:06 pm on Sep 28, 2005 (gmt 0)

I am #191 on 216.239.57.104 - hmmm, a lot better than not being in the top 1000.

Iguana




msg:760556
 7:11 pm on Sep 28, 2005 (gmt 0)

wiseapple

So you think this filter isn't related to all those other filter updates? Certainly I've had one site come crashing back into the SERPs with this update after being missing since February. I just think that this regular pruning of search results has to be deliberate and stems from a filter (probably with multiple factors, applied differently each time), and that this update is just the latest in the line. Having &filter=0 to spot the filter is useful, though.

stargeek




msg:760557
 7:21 pm on Sep 28, 2005 (gmt 0)

"My bet is on scrapers causing this issue. "

what do you mean causing this issue?
do you think these filters are designed to catch scraper sites and other sites are getting caught?

if so, how do you think these filters work?

stargeek




msg:760558
 7:27 pm on Sep 28, 2005 (gmt 0)

Is anyone else seeing that sites with no Google cache are being dropped?

reseller




msg:760559
 7:42 pm on Sep 28, 2005 (gmt 0)

Hi Folks

To those fellow members who keep posting "This is not an update":

Would you be kind enough to post your own definition of an update?

And let's discuss it.

Thanks.

Dayo_UK




msg:760560
 7:48 pm on Sep 28, 2005 (gmt 0)

Reseller

I don't know - as you know, though, I think it is another attempt at a fix for canonical URLs - they still have not got it right, though :(

Seriously - this should be the main priority at the plex at the moment (IMO, of course).

Listen, Google - I will say this only another 96 times: the canonical URL for my site is the homepage with the www. I have done the 301; this is the page with the most backlinks; it is the page that should rank for the company name search. Etc.
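For what it's worth, on Apache the www canonicalization fix Dayo_UK describes (301 the non-www host to www) is usually a couple of mod_rewrite lines like these. This is a sketch only; "example.com" is a placeholder for the real domain.

```apache
# .htaccess sketch: 301-redirect any non-www request to the www canonical host.
# "example.com" is a placeholder for your own domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With that in place, every request to example.com/page returns a 301 pointing at www.example.com/page, so crawlers see a single canonical host.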

wiseapple




msg:760561
 7:48 pm on Sep 28, 2005 (gmt 0)


---------------------------------------------------
"My bet is on scrapers causing this issue. "
what do you mean causing this issue?
do you think these filters are designed to catch scraper sites and other sites are getting caught?

if so, how do you think these filters work?
---------------------------------------------------

Scrapers are rampant. They copy titles and meta descriptions. If I look at one of our articles, there are at least 100 different scraper sites with the same title and meta description as the article. On our site, the meta description is the first few lines of the article. We also have index pages that link the articles together; these index pages are made up of the titles and meta descriptions of the articles.

I imagine that Google uses some type of Bayes classifier for the filters. In order for these filters to work, they must be manually fed a list of bad sites, or sites that they would like to penalize or rid the SERPs of.

Since the scrapers contain the same title and meta description tags, it's not a stretch to see that our site would get caught up in the filter.

stargeek




msg:760562
 7:57 pm on Sep 28, 2005 (gmt 0)

"manually fed a list of bad sites"

Two points:

1) Google has said they prefer algorithmic ways to detect spam.

2) That idea seems like something a building full of PhDs would see right through, as well as any half-decent spammer.

reseller




msg:760563
 8:23 pm on Sep 28, 2005 (gmt 0)

Dayo_UK

>>Reseller

I dont know - as you know though I think it is another attempt at a fix for canonical urls - they still have not got it right though :( <<

You might be right!

However, I see the problem in a simpler manner:

Google keeps listing 1000's of duplicate pages while at the same time removing the "original pages" as part of its deduplication process.

And it's hurting...

pescatore




msg:760564
 9:04 pm on Sep 28, 2005 (gmt 0)

If anyone wants evidence of what kind of spam is in the top 10 for the most competitive terms, just sticky me for the URL.

steveb




msg:760565
 9:04 pm on Sep 28, 2005 (gmt 0)

reseller please stop trying to hijack the thread.

This is not an update thread.

This is not a "let's define what an update is" thread.

The thing is long and cluttered enough without this extra crap.

Start a new thread, or contribute to one of the other similar threads, if you want to discuss what an update is.

reseller




msg:760566
 9:40 pm on Sep 28, 2005 (gmt 0)

steveb

In my post msg #:86 on this thread, I wrote:

[webmasterworld.com...]

==========================================
reseller
Preferred Member

joined:Feb 6, 2005
posts:460
msg #:86 3:18 pm on Sept 23, 2005 (utc 0)

Hi Folks!

Just like any other previous update, it's gonna be very tough as the update proceeds. Google updates aren't something for weak souls. During Allegra and Bourbon, some of our fellow members couldn't take it any more and did a very wise thing: they took a break ;-)

..........etc

=====================================

And I have nothing to add ;-)

FattyB




msg:760567
 10:10 pm on Sep 28, 2005 (gmt 0)

Hmm,

Well, I got a reply from Google Search via AdSense, who forwarded my enquiry. They said that we were not being penalized for anything on the site and that, as they add new sites and content, positioning moves about...

So I am not sure what to make of the huge drop, though I guess they are always cagey about updates. At least it looks like it's not due to anything on our site, which is good.

I guess more wait and see then.

djmick200




msg:760568
 10:14 pm on Sep 28, 2005 (gmt 0)

Have any of you guys taken any steps to change anything about your sites?

After drying up the puddle of tears below my desk and waiting for my keyboard to dry out, I made a list of alterations I would set about making.

I first listed anything different I'd done to my site within the past two months. Only 1 major thing, so that was easy.

I gave up checking what the #1-10 ranking sites for my keywords were doing that I was or wasn't, because I could find no sense or consistency there.

I used the removal tool to take down 2 folders of pages that were recently added. I also added noindex, nofollow tags to these and added them to my robots.txt file.
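As a side note on syntax, the tags are written as single words. A sketch of what the robots.txt entries for two blocked folders look like; the folder names here are hypothetical, not djmick200's actual paths:

```
# robots.txt sketch -- "/newfolder1/" and "/newfolder2/" are hypothetical names
User-agent: *
Disallow: /newfolder1/
Disallow: /newfolder2/
```

The per-page tag he refers to goes in each page's head as `<meta name="robots" content="noindex,nofollow">`.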

The following had always existed on the site, but I decided it was now bad practice, where before I found it worked very well: plentiful cross-linking to another site I own on a similar theme, and likewise back.

e.g.
site 1 (big widgets) interlinked with
site 2 (small widgets), and the reverse.

Site 2, funnily enough, has made a comeback. It hasn't been updated since May, and I'd meant to rework it into something else and never found the time. I still intend to at some point.

One other thing I'm gonna do is use site 1 under a different domain, juggle it around a bit, ban Googlebot from the entire site, and use it over at Yahoo, where the original does very badly.

After a little work on a separate site, Yahoo seems a lot easier to rank in. Older techniques still seem to work there.

Should I ban yahoo from the original version?

EDIT: I just checked whether my removed-pages request was pending or complete. Answer: complete.
Try it on G, a three-word search:
Results 1 - 1 of about 44
shows only 1 result in the SERPs though.
I add the &filter=0:
Results 1 - 10 of about 18
shows all 18.
There were only 16 related pages in the removed folder that relate to this search.
18+16=44 - it does when you use the calculator sponsored by Big G - lol!

[edited by: djmick200 at 10:27 pm (utc) on Sep. 28, 2005]

BillyS




msg:760569
 10:25 pm on Sep 28, 2005 (gmt 0)

Well, I got a reply from Google Search via AdSense, who forwarded my enquiry. They said that we were not being penalized for anything on the site and that, as they add new sites and content, positioning moves about...

Sounds like the standard Googlease to me.

FattyB




msg:760570
 10:33 pm on Sep 28, 2005 (gmt 0)

Billy, well, I think that too. I wrote them back with a specific example showing the results are very odd, given they list parts of our site that are not very busy, with few updates, way above those that are busy and updated all the time.

I should say we use subdomains for all our sections, so this may further complicate things.

Also, they mentioned getting more high-quality links, yet we probably have hundreds of them added every month, on everything from major news sites to blogs. I doubt many sites get as many so quickly, in fact. Yet we seem to have been degraded despite these, which are all organic links to articles.

So will see what they say.

FromRocky




msg:760571
 10:44 pm on Sep 28, 2005 (gmt 0)

18+16=44

?

djmick200




msg:760572
 10:51 pm on Sep 28, 2005 (gmt 0)

FromRocky

The 18+16=44 was a slight jibe at Google's inflated pages-per-site count.

I used the removal tool to take away 16 pages that would contain 'big red widget'.

There remain 18 pages on my site that contain the text 'big red widget'.

The two together = 34.

When I did the initial search, it said 1-10 of 44.

10 extra. Nowhere to be found. Nonexistent.
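djmick200's arithmetic can be stated as a tiny check. This is a sketch; the function name is mine, but the numbers are the ones from his posts:

```python
def phantom_results(reported_total, unfiltered_results, removed_pages):
    """Pages visible with &filter=0 plus pages taken out via the removal
    tool should account for Google's reported total; whatever is left
    over is unaccounted for ("phantom" results)."""
    return reported_total - (unfiltered_results + removed_pages)

# His numbers: 44 reported, 18 visible with &filter=0, 16 removed.
```

With 44 reported, 18 visible, and 16 removed, the leftover is 10: the "10 extra, nowhere to be found" he describes.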

steveb




msg:760573
 11:13 pm on Sep 28, 2005 (gmt 0)

Playing around with the site:example.com type search, I tried:

site:example.com -word1, where word1 is on 14 of the site's 1000+ pages... it returned 9000+ results.

site:example.com -word2, where word2 is on 19 pages... it returned 997 results.

After a few dozen checks: if a word appears on 15 or fewer pages of the site, the count comes back at over 9000 results; if the word appears on 16 or more pages, the count comes back at 1000 or less. This could be because I have 1015 pages on the site (I don't really have an accurate count), or it may be because 15 is somehow mystical to Google, but it is interesting that I am able to get an accurate count via -someword.
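steveb's exclusion trick amounts to a simple estimate: when the excluded word is common enough to deflate Google's count to something plausible, the real site size is roughly that count plus the pages the word excluded. A sketch, with the function name mine and the numbers from his post:

```python
def estimated_site_size(excluded_count, pages_with_word):
    """Search "site:example.com -word" for a word known to appear on
    `pages_with_word` pages. When the word is on 16+ pages the reported
    count deflates to something plausible, so the true size is roughly
    that count plus the pages the word excluded."""
    return excluded_count + pages_with_word
```

His word2 example (997 results, word on 19 pages) gives an estimate of 1016, in line with his rough figure of 1015 pages.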

modemmike




msg:760574
 12:09 am on Sep 29, 2005 (gmt 0)

Weird. Sitting here refreshing my site with the site: command, and the count is bouncing back and forth... from 10,100 to 937 and then back to 10,100, then back to 937... a rolling change across DCs, with each refresh hitting a different DC? Everflux has never been this odd.

tmartini




msg:760575
 12:29 am on Sep 29, 2005 (gmt 0)

Hard to believe, but things appear to be getting worse. I'm getting two totally different sets of results when I search via the Google toolbar versus searching at google.com. One set (via google.com) is the crappy results we've seen for the past week. The other set (via the toolbar) is even worse and seems to be mostly link farms and doorway pages.

A suggestion to the engineers at Google working on this problem. Back out of this trainwreck, whether it be an update or a new filter or whatever. You can't save it by trying to tweak it a little bit more. Back out and start over again once you actually have a solution that's been tested!

On the plus side my Yahoo traffic is rising as it appears that people are getting as fed up with Google's results as I am. My Yahoo referrals have doubled in the past two days and it's not because my rankings have changed. It's because more people are using Yahoo rather than Google. If this drags on much longer I think Google is risking large scale defections to the competition, which come to think of it, is a good thing! ;-)

-- T

BillyS




msg:760576
 12:52 am on Sep 29, 2005 (gmt 0)

This could be because I have 1015 pages on the site.

steveb - I've started a new topic on this very subject. I think we've got something in common here from what you're describing. My topic is still under review.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved