
Google News Archive Forum

Google's New Year Resolution
3 months later
Zapatista
msg:77899
 9:02 am on Mar 27, 2003 (gmt 0)

I recall reading GG's mention that Google's New Year Resolution for 2003 is to communicate better with webmasters.

Many here feel that the spam report is useless (even when you mention GG and WebmasterWorld), as is emailing Google with questions or concerns.

Three full months into the New Year, I wonder how Google is doing. From my limited experience, not well. I emailed GG directly about a spammer, which he confirmed receiving, and the spammer is still there.

What is the opinion/experience of everyone else?

I still have lots of respect for Google. This is not an "anger"-based thread, but one of disappointment and growing concern in light of Brett's Entropy Part Two thread, which most people missed.

What does everyone else think? And what does GG have to say?

 

tigger
msg:77900
 9:05 am on Mar 27, 2003 (gmt 0)

Still another 9 months left :)

JudgeJeffries
msg:77901
 9:11 am on Mar 27, 2003 (gmt 0)

GG has already said that they don't much go for hand-to-hand spam fighting, so perhaps they use all the info in the spam reports to prioritise strategy for algo amendments.

Tor
msg:77902
 9:41 am on Mar 27, 2003 (gmt 0)

Maybe this [webmasterworld.com] will solve some of your frustration, Zapatista? ;)

Zapatista
msg:77903
 10:19 am on Mar 27, 2003 (gmt 0)

Not bad, not bad at all.

mipapage
msg:77904
 11:07 am on Mar 27, 2003 (gmt 0)

I'm waiting for the result of a blatant spam issue myself, but what tigger said seems like a good call. Let 2003 be the year of better communication with webmasters on many fronts, but also the year of cleansing!

Looking forward to seeing what happens to my spam case after the next dance!

cwebb
msg:77905
 11:21 am on Mar 27, 2003 (gmt 0)

...and after the next dance and the next and the next...

Being patient means waiting more than one or two updates; obviously, it could take years.

Fiver_321
msg:77906
 1:54 pm on Mar 27, 2003 (gmt 0)

Oh - Brett wrote about Entropy? In what context? Where's the thread, anyone?

I definitely missed it.

Buckley
msg:77907
 2:34 pm on Mar 27, 2003 (gmt 0)

If I had GG's private e-mail address, I wouldn't be saying anything negative about support. That's like having a deck of "get out of jail free" cards :)

mack
msg:77908
 2:40 pm on Mar 27, 2003 (gmt 0)

I think Google has only one option when it comes to fighting spam, and that is to work totally on scalability. There is no point in banning one site when they could work on neutralising spamming methods. As they find out how the spammers are doing it, they can tweak the algo and these sites will be gone. I think spam reports are not used to pinpoint individual sites, but rather to work out which methods of spamming are popular or widespread within the index. Slowly but surely they are getting there. From a user's point of view, Google do a far better job than most other mainstream search engines at keeping their index clean, but they will never be able to totally remove all offending sites.
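Sketched as code, that use of spam reports might look something like this - the reports and technique categories below are invented for illustration, not anything Google has described:

from collections import Counter

# Hypothetical spam reports: (reported_site, technique) pairs.
reports = [
    ("example-a.com", "hidden text"),
    ("example-b.com", "keyword stuffing"),
    ("example-c.com", "hidden text"),
    ("example-d.com", "cloaking"),
    ("example-e.com", "hidden text"),
]

# Count how often each technique is reported, ignoring which
# individual site was named in the report.
technique_counts = Counter(technique for _site, technique in reports)

# The techniques reported most often are the best candidates for
# an algo tweak, rather than banning sites one at a time.
for technique, count in technique_counts.most_common():
    print(f"{technique}: {count} reports")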

cwebb
msg:77909
 2:43 pm on Mar 27, 2003 (gmt 0)

Well, I've seen so many complaints over hidden text, which should be fairly easy to catch (so what if Google ignores the no-index on the CSS files to catch those guys?), but nothing has changed so far as I can see...

netguy
msg:77910
 2:45 pm on Mar 27, 2003 (gmt 0)

Zapatista... I've had the same experience. A competitor has been stuffing dozens of links into non-existent image maps (the ole 'invisible image map' trick) for more than six months, and despite Google reports, they still hold positions #1, #3, and #4 on page 1 for my primary keywords.

I've come to the conclusion that I have to get along with my 'neighbor on the hill' and hope prospective customers will stop by my poor neighborhood occasionally...
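For what it's worth, that trick looks mechanically detectable. A toy sketch, assuming the spammer's <map> is never attached to a real image (hypothetical HTML, not any real crawler's code): flag any <map> whose name no <img usemap> references, since the links inside it can never be seen.

from html.parser import HTMLParser

class ImageMapAudit(HTMLParser):
    """Collect <map> names and the usemap references on <img> tags,
    so maps that no image actually displays can be flagged."""
    def __init__(self):
        super().__init__()
        self.map_names = set()
        self.used_maps = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "map" and "name" in attrs:
            self.map_names.add(attrs["name"])
        elif tag == "img" and "usemap" in attrs:
            self.used_maps.add(attrs["usemap"].lstrip("#"))

page = """
<map name="ghost"><area href="/page1"><area href="/page2"></map>
<img src="real.gif" usemap="#realmap">
<map name="realmap"><area href="/home"></map>
"""

audit = ImageMapAudit()
audit.feed(page)
# Maps never attached to a visible image -- their links can't be seen.
print("suspect maps:", audit.map_names - audit.used_maps)  # {'ghost'}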

rankboy
msg:77911
 9:57 pm on Mar 29, 2003 (gmt 0)

These posts are always so funny to me. The truth is that no one here has the authority to classify anything as "spam". The only people who have that authority are the folks over at Google. If Google does not classify a site as spammy, then guess what: no other force on this earth has the power to say that it is. The buck stops at Google (or whatever other SE you are talking about). Last time I checked, there was no federal mandate requiring or disallowing certain SEO tactics. If a site is reported, checked, and comes up clean by Google, then there is nothing to complain about, because the decision has been reached.

rankboy
msg:77912
 9:59 pm on Mar 29, 2003 (gmt 0)

P.S. The most important thing to Google is that their end users go to a relevant site when they do a query. Thinking about "spam" in this reverse way explains a bit of why Google is hesitant to pull the trigger on sites.

AllEyes
msg:77913
 10:24 pm on Mar 29, 2003 (gmt 0)

Zapatista, nothing has changed.

You can put in all of the spam reports you want and name drop WebmasterWorld, GG, and everyone who is supposed to make a difference, but it's a bitter waste of time.

I'm not siding with Google, either. In my opinion, this is a clear disgrace to what was once a great search engine.

Having said that, I'd be the first to salute Google and GoogleGuy if (and when) they actually take the spam reports seriously.

It's almost April of 2003, and Google still has a major job to do.

I find it ironic and somewhat sad that the one aspect that made them what they are today is the same thing that's causing fierce dissent. This one aspect? Relevant results, which Google cannot have when cheap, low-end spam techniques still reign as king.

Perhaps I'd hold Google in a lesser degree of contempt if the spam problem were something more high-tech, such as cloaking or whatever else would require large, lengthy algo adjustments.

In this case, it's amateur night and they are winning.

Not because they’re good, but because Google has become lazy.

At least we have GG here at WebmasterWorld to converse with, and I personally hope GG steps up to the plate. It'd be fantastic to correct this problem through open discussion and debate instead of having the lawyers, media, and other SEs deal with it on an unmerciful "big business" platform.

I have a glimmer of hope that Google will fix the problem so we can all get back to square one, whether you're an SEO or Surfer Joe.

Tapolyai
msg:77914
 10:30 pm on Mar 29, 2003 (gmt 0)

Could there be a mechanical or "programmatic" way to work with reports?

Say, a separate system that does a more intense review of the site? This would allow the regular crawlers to do their jobs faster but shallower, while the abuse-checking system(s) do a more detailed, deeper, but slower review of reported site(s).

This is still open to abuse - I could report someone several times, from various sources. Some protection can be built in, like dropping all other abuse reports if the domain is already in the queue.

Repeat reporting of the same domain from the same address (be it IP and/or e-mail) could be reviewed as "reverse-abuse".

Reports could be accepted only from fixed IPs. There could be a time delay, i.e. a cool-down of, say, 15 days, forcing some IPs to expire; if the report could not be "reverified" by the submitter, then drop it. No throwaway e-mail addresses could be used (yahoo.com/hotmail.com/etc.).

Showing how much (not necessarily what) is in the abuse review queue would also "placate" some of us that the process is running.

All in all, some code could be thrown at the problem to reduce the overhead of hand-verifying each and every site that is reported in an irate e-mail. Even just making the reporting go through a non-free-form form (i.e. radio buttons/checkboxes and pick-lists only, no textbox to describe the issue) would definitely accelerate the resolution process.
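A toy sketch of those queueing rules - the function names, thresholds, and throwaway-domain list are all invented here:

import time

THROWAWAY_DOMAINS = {"yahoo.com", "hotmail.com"}  # per the suggestion above
COOL_DOWN_SECONDS = 15 * 24 * 3600                # the 15-day cool-down

queue = {}               # domain -> timestamp when it entered the queue
reporter_history = set() # (domain, reporter IP) pairs already seen

def accept_report(domain, reporter_email, reporter_ip, now=None):
    """Return True if the report enters the queue, False if dropped."""
    now = now if now is not None else time.time()
    email_domain = reporter_email.rsplit("@", 1)[-1].lower()
    if email_domain in THROWAWAY_DOMAINS:
        return False  # no throwaway e-mail addresses
    key = (domain, reporter_ip)
    if key in reporter_history:
        return False  # repeat report from the same source: "reverse-abuse"
    reporter_history.add(key)
    if domain in queue:
        return False  # domain already queued: drop the duplicate
    queue[domain] = now
    return True

def expire_unverified(now=None):
    """Drop queued reports that were never re-verified in the cool-down."""
    now = now if now is not None else time.time()
    for domain, queued_at in list(queue.items()):
        if now - queued_at > COOL_DOWN_SECONDS:
            del queue[domain]

print(accept_report("spam-example.com", "a@isp.example", "10.0.0.1"))  # True
print(accept_report("spam-example.com", "b@hotmail.com", "10.0.0.2"))  # False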

Whatever... I'll stop ranting now.

AllEyes
msg:77915
 10:34 pm on Mar 29, 2003 (gmt 0)

Rankboy, with all due respect, it is attitudes like yours that have led us into the situation we're all in right now.

Google has defined what is spam and what isn't.

Google's SERPs, at this very moment, are flooded with sites that are in violation of these Google defined spam provisions.

Said sites as a whole are not using proverbial ray guns here, either. All of the spam is low-tech and could be found by a twelve-year-old kid, much less a multi-million dollar algo.

Bottom line: if Google had never taken an anti-spam stance, both in their TOS and here at WebmasterWorld where the so-called enemy sleeps, we wouldn't be having this discussion today.

And we'd also be discussing an entirely different SE.

Google is being called out on the carpet for not enforcing the very rules we're supposed to conduct ourselves by.

If you can't handle it, then get out of the way so others who want to make a difference can. Then again, if our detractors are employing spam techniques, they have nothing to argue about, do they?

anallawalla
msg:77916
 11:14 pm on Mar 29, 2003 (gmt 0)

I have reported the practice of what I call "Freshbot riding" (is there an established term for it?). There is one site on the SERP for my term (abc consultant) that is not a competitor - in fact they sell a product that just might be of interest to abc consultants and possibly to people looking for such consultants.

That term competes for relevancy in a page full of valid but irrelevant hits, e.g. jobs, resumes, articles about abc consultants.

I have filled out a spam report and a Dissatisfied report at least twice. There were two such sites hogging the top 4 spots. One was banned and I now rank #1 and #2.

This Freshbot rider employs these practices:

* Human visitors with Flash get whisked away to a product page served by a software vendor from another domain.

* Crawlers and others get gibberish text including the phrase "abc consultant" and links to other pages that lead to the same kind of cloaked page with the Flash ad.

* Re-saving the page daily to get a fresh timestamp, thus exploiting the Freshbot effect to stay in the first 10.

As the form complaints didn't work, I tried the advice here of writing to GG. Hard to know who replied, but they advised me as follows:

"Thank you for your note. We appreciate your help in maintaining the quality of our index. Please submit this report at [google.com...] .

Regards,
The Google Team"

I had written asking them to address the algorithm, since it is reasonable that they don't have the time to tackle individual problems in an obscure category - e.g. if the crawled copy is identical to the cached copy, then do not give a Freshbot boost. It is possible that the two data sets are on different machines and this could slow things down.
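That check is easy to sketch: fingerprint the crawled copy and the cached copy, and withhold the freshness boost when nothing but the timestamp changed. A toy illustration, not how Freshbot actually works:

import hashlib

def fingerprint(html):
    """Hash the page body; a re-saved but unchanged page fingerprints
    identically despite its new Last-Modified timestamp."""
    return hashlib.sha1(html.encode("utf-8")).hexdigest()

def deserves_freshness_boost(crawled_html, cached_html):
    # Only treat the page as fresh if the content actually changed.
    return fingerprint(crawled_html) != fingerprint(cached_html)

print(deserves_freshness_boost("<p>same page</p>", "<p>same page</p>"))  # False
print(deserves_freshness_boost("<p>new copy</p>", "<p>same page</p>"))   # True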

I have replied saying that I have filled out a spam report at least twice. If I have to fill it out again, it could mean that the older ones were just deleted.

I will give it a rest for a while, but if others are seeing examples of Freshbot Riding, please report the practice.

rfgdxm1
msg:77917
 11:43 pm on Mar 29, 2003 (gmt 0)

>The most important thing to Google is that they're end users go to a relevant site when they do a query. Thinking about "spam" in this reverse way explains a bit of why Google is hesitant to pull the trigger on sites.

I mentioned this in another post, and it is something many don't get. If through search engine trickery I get a site about purple penguins to the top for "blue widget sales" THAT is spam because someone searching with that doesn't care about purple penguins. However, if you sell blue widgets, the searcher doesn't care if your "spammy" competitor is #1 on the SERP ahead of you.

anallawalla
msg:77918
 1:48 am on Mar 30, 2003 (gmt 0)

I mentioned this in another post, and it is something many don't get. If through search engine trickery I get a site about purple penguins to the top for "blue widget sales" THAT is spam because someone searching with that doesn't care about purple penguins. However, if you sell blue widgets, the searcher doesn't care if your "spammy" competitor is #1 on the SERP ahead of you.

I find this widget terminology confusing because my editorial mind equates it to a tangible object. Let's pick a service example.

Say I am looking for an "astrologer" and nearly all the hits on the page are for such people, even if some are resumes, some are job ads and some are newsletter articles. I have no problem with that.

But my complaint is about a reseller of a horoscope generating software tool that an astrologer might use, or an end user might use. This reseller uses the keyword for the occupation to get a SERP result (replete with the gibberish snippet), which might get a few clickthroughs to the software product. There is nothing here for a person who is looking for the service provider.

I am referring to the misuse of phrases to trick the SEs, not an accidental combination of words.

Other examples:

SEO consultant > points to mass-submission software

Fitness trainer > points to fitness software

Brain surgeon > points to a book "Brain surgery for dummies"

Does Google call this spam?

(My complaint is about riding the Freshbot by updating the time stamp and not changing content - which happens to use the above trickery)

rfgdxm1
msg:77919
 1:56 am on Mar 30, 2003 (gmt 0)

>Does Google call this spam?

I'm not sure it is, because there is some relevance to the search term. In my purple penguin example, I was thinking about putting up something like a doorway page targeted at "blue widget sales" to lure these people to visit my totally unrelated site about penguins.

GoogleGuy
msg:77920
 8:43 am on Mar 30, 2003 (gmt 0)

Let's go back and find the original comment I made. [GoogleGuy roots around with the site:webmasterworld.com syntax on Google.] Okay, here it is. :)

"My two wishes (resolutions? ugh, I hate that word) would be: better communication with webmasters when they've run into problems, and closing up loopholes so that webmasters can worry about good content and not waste time trying to do tricks."

Those were my personal wishes, not Google resolutions. I don't get to decide what Google does each year. :) So the first wish was to improve communication for penalized sites. In the last couple months, I've tried to let people know when penalties were about to expire, etc. We also introduced a process where webmasters that have cleaned up their site can request reinclusion. The process is pretty simple: write to webmaster@google.com with the subject "reinclusion request" and explain what happened and how you cleaned up the site. We have more resources on user support to deal with those requests too. Overall, I feel pretty good about how we're doing on this wish so far this year.

The other resolution was to close loopholes. We've made solid progress on that too, both on things behind the scenes (improving scoring and link analysis) as well as some things such as expired domains that we've communicated to webmasters. When I look back over the last few months, I'm pretty happy with the improvements we've done in both these areas already this year, and I think we'll get even better over time. Maybe I shouldn't have listed how I wanted Google to improve--I don't think most search engines engage in a dialogue like that--but I think it's healthy to get out there with a couple goals to keep things focused. Zapatista, I appreciate the three-month status check. :)

Okay, so now let's talk about the spam report form. :) I want to set expectations about spam reports at the right level. Spam reports give us different views of how to improve our quality. Many reports (e.g. off-topic porn for an innocent search) are addressed manually to get a quicker turnaround. For the most part, however, the only way to handle spam in a fair, scalable way is to program a computer to do it. We use the data from spam reports to help determine what the biggest issues are, and as training data to make sure that our algorithms work well. We tackle projects in the order that we think will improve our quality the most, but often those scoring changes aren't easily visible to the outside world. One thing about Google is that even when it looks like things are quiet, we're still working to make sure that we improve our quality.

Shew! Hope that provides a little more information and my personal take. Maybe next year I'll play it safe and just resolve to lose a couple pounds, eh? :)

le_gber
msg:77921
 9:58 am on Mar 30, 2003 (gmt 0)

better communication with webmasters when they've run into problems

My problem is that I have nightmares when the update is late ... Could Google improve the communication on that? ;)

I've tried to let people know when penalties were about to expire

I second that; I've read thread(s) where you mentioned the comeback of sites that were penalized.

Many reports (e.g. off-topic porn for an innocent search) are addressed manually to get a quicker turnaround

I think everybody will agree that this sort of spam report is more important and must be dealt with more quickly than reports of a competitor filling up hidden <div>s with meta keywords, i.e. some of my competitors :(

the only way to handle spam in a fair, scalable way is to program a computer to do it

Yep, that seems the only way. Why manually 'rerank' site abc for something it did that was against Google's TOS, when you can 'rerank' all the sites using these techniques, i.e. ... hidden divs :).
By 'rerank' I mean change the algo so the site is ranked more fairly, according to the content, not spammy techniques.

I've got one question though. I have read that many people have reported this 'hidden div' technique. Will Google and/or the Google algo take hidden divs into account at some point? Not that I intend to stuff my site with them if you don't ;)
But it seems to me, down here at ground level, that this 'hidden div' should be fairly straightforward to deal with. Could you (not you GG, but Google - well, if YOU can do it then feel free :) modify the algo so it says:
'OK div, are you visible? If yes - nothing to do; if not, check whether the div contains h1s, h4s, links, keyword stuffing, or very small text, and if it does, relativize the content (weight it less than visible content - or ignore it).'
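Spelled out as a toy filter over inline styles only - keyword-stuffing and tiny-text checks are omitted, and external stylesheets make this much harder, as noted below:

import re

# Inline styles that hide content; real detection would also need the
# external stylesheets mentioned below.
HIDDEN_STYLE = re.compile(r"display\s*:\s*none|visibility\s*:\s*hidden", re.I)
SUSPECT_CONTENT = re.compile(r"<(h[1-6]|a)\b", re.I)  # headings or links

def weight_for_div(div_html):
    """Full weight if the div is visible; zero if it is hidden AND
    stuffed with headings or links."""
    style = re.search(r'style="([^"]*)"', div_html, re.I)
    hidden = bool(style and HIDDEN_STYLE.search(style.group(1)))
    if hidden and SUSPECT_CONTENT.search(div_html):
        return 0.0  # ignore it, per the logic above
    return 1.0

print(weight_for_div('<div style="display:none"><h1>widgets</h1></div>'))  # 0.0
print(weight_for_div('<div><h1>widgets</h1></div>'))                       # 1.0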

Or maybe it takes time because you are trying to get rid of the CSS hidden spec altogether? I realize that you have to be able to access the stylesheet, and that robots can be disallowed from it ... Damn! Seems a lot harder all of a sudden ;)

Leo

Zapatista
msg:77922
 12:22 pm on Mar 30, 2003 (gmt 0)

Good post, GG. Status checks keep you on your toes, don't they? :-)

I speak for everyone when I say we appreciate GG's presence.

The spam problem still seems to be there, but after reading so many posts on this here at WebmasterWorld, I really think we need to redefine what spam is and make it required reading. Many people hit the SPAM ALERT button whenever a competitor is ranking higher than them.

Google doesn't like to issue penalties by hand. I wonder if they could issue "warnings" to wayward webmasters. It's not a hand penalty, but it would probably have the desired effect. It might work as a compromise between those who submit spam reports and Google's goal of not issuing hand penalties.

If I got a "warning" from Google, I guarantee it would have the desired effect.

Just An Idea

Zapatista

vitaplease
msg:77923
 4:53 pm on Mar 30, 2003 (gmt 0)

I agree on the warning, Zapatista.

There are many respected companies around using the wrong kind of SEM/SEO. I am sure that whilst they acknowledge that Google provides them with an immense amount of referrals, they have absolutely no idea what strange or hidden things have happened to their site.

A warning from an anonymous source would probably be cast aside as a jealous competitor's whining. A polite warning from Google would probably wake up all the right responsible people (the problem is, Google would have to add a telephone number to make it authentic).

GoogleGuy
msg:77924
 12:56 am on Mar 31, 2003 (gmt 0)

Zapatista, I like the idea of a friendly notification from Google. To be sure it had an effect, you might have to remove a site for a few days at the same time. Maybe the notification could say "we'll automatically reinstate this site in a few days, but there may be a spot check later to make sure the (hidden text, hidden links, whatever) is completely removed." I like the idea of something like that. We could automate that on Google's side, so it would be fair, but it would also get the message across loud and clear to the site owner without doing permanent harm.
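A toy model of that workflow - the day counts below are invented, since GG only says "a few days":

import datetime

SUSPENSION_DAYS = 3   # "a few days" -- placeholder
SPOT_CHECK_DAYS = 30  # hypothetical delay before the follow-up check

def notify_and_suspend(site, violation, today=None):
    """Suspend briefly, state the reason, promise auto-reinstatement,
    and schedule a later spot check."""
    today = today or datetime.date.today()
    reinstate_on = today + datetime.timedelta(days=SUSPENSION_DAYS)
    spot_check_on = today + datetime.timedelta(days=SPOT_CHECK_DAYS)
    notice = (
        f"{site} was temporarily removed for: {violation}. "
        f"It will be automatically reinstated on {reinstate_on}. "
        f"A spot check around {spot_check_on} will verify the "
        f"{violation} is completely removed."
    )
    return {"site": site, "reinstate_on": reinstate_on,
            "spot_check_on": spot_check_on, "notice": notice}

print(notify_and_suspend("example.com", "hidden text")["notice"])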

GrinninGordon
msg:77925
 1:01 am on Mar 31, 2003 (gmt 0)

Hi GoogleGuy

Is there any way Google could give an update as to what constitutes spam? Also, would you encourage webmasters to report spam, either to you or via the form, if they find it in other areas (not related to them)?

Zapatista
msg:77926
 1:30 am on Mar 31, 2003 (gmt 0)

Glad to be of help. Does it earn me a Google coffee mug?
:-)

Livin on the web
msg:77927
 2:11 am on Mar 31, 2003 (gmt 0)

Ok, here's an idea.

Each Google update, a list of the bad factors Google has discerned about each site might be generated. It could look like this:

A 'bad technique' listing, keyed by number, would be available to site operators:
[code] [description] [action]
1 - excessive dead links (warning)
2 - no text on page (warning)
3 - linking to bad neighborhood (fatal)
4 - robots.txt prevents crawl (fatal)
etc...

And then a page-by-page listing of what Google found to be the case on the site's pages:

mysite.com: 1,2,5,37,678a
mysite.com/innerpage.htm: 2,36,37
mysite.com/innerpage2.htm: 678a

Any pages which would be '100% OKAY' would not show up.
They could still leave out factors to keep everyone wondering why their PR dropped (wouldn't want to stop all the fun).

Google could maintain these pages online and, for security purposes, only allow the IP that the domain is hosted on to retrieve the information.

This would give everybody an equal playing field to know whether they have 1) site problems or 2) done something stupid that got them banned - and allow them to correct things and become wonderful netizens.
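A minimal sketch of generating such a listing - the codes and crawl findings below are placeholders:

# Hypothetical issue codes, in the spirit of the listing above.
ISSUE_CODES = {
    1: ("excessive dead links", "warning"),
    2: ("no text on page", "warning"),
    3: ("linking to bad neighborhood", "fatal"),
    4: ("robots.txt prevents crawl", "fatal"),
}

# What the crawler supposedly found, page by page; pages that are
# 100% okay are simply absent, as suggested above.
findings = {
    "mysite.com": [1, 2],
    "mysite.com/innerpage.htm": [2],
    "mysite.com/innerpage2.htm": [3],
}

for page, codes in findings.items():
    labels = ", ".join(f"{code} ({ISSUE_CODES[code][1]})" for code in codes)
    print(f"{page}: {labels}")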

Just a thought.

MetropolisRobot
msg:77928
 3:47 am on Mar 31, 2003 (gmt 0)

To quote le_gber:

My problem is that I have nightmares when the update is late ... Could Google improve the communication on that? ;)

Why? I'm not sure I have ever read anywhere that there is a set date for Google to update. The problem for Google is that if they set an expectation then people will get riled when that expectation is not met.

Hence there is an expectation that Google will update/dance at some point, but it could be any time.

A small dose of randomness actually helps the topic matter of this thread. If the update/dance time were known then people would work to that, including the spammers etc. The fact that the update/dance time is variable is actually a good thing.
