Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
Matt Cutts Interview: Google Search Quality and Web Spam
engine




msg:4001335
 11:44 am on Oct 5, 2009 (gmt 0)

Matt Cutts Interview: Google Search Quality and Web Spam [businessweek.com]
In a recent interview for my story on how Google’s trying to stay ahead of rivals in search, Cutts provided insight not only into how Google tries to reduce Web spam but also into the search quality process at large.

This is an interesting interview on BusinessWeek and reveals some aspects not mentioned previously.

 

tedster




msg:4001365
 12:52 pm on Oct 5, 2009 (gmt 0)

Well, the Web spam team does have the ability to say this result is spam, so it should be demoted or penalized or pushed down in some way. But we don't have any ability to say for this query ...we think that this page should rank No. 1.

We've made a deliberate choice that we don't want to.

Interesting point of view to take, and I agree that it makes sense. I wonder about two things:

1. OK, the Web spam team doesn't have the ability to say "make that url the #1 result for this query". Does any other part of the Google team?

2. How about "make this result #4, or #11"?

JoeSinkwitz




msg:4001401
 1:41 pm on Oct 5, 2009 (gmt 0)

Way too much moral hazard to give any team the ability to preset a #1.

#4/#11 is more of a ceiling than a floor, so I don't think it applies.

whitenight




msg:4001408
 1:51 pm on Oct 5, 2009 (gmt 0)

As always, premised with the theory that one never needs MC to announce anything if they follow the algo diligently

reveals some aspects not mentioned previously

Like? What in this interview was new (to you)?

1. OK, the Web spam team doesn't have the ability to say "make that url the #1 result for this query". Does any other part of the Google team?

We already know that certain Top 3 results are handpicked.
Both you and I have sites or clients' sites that are beneficiaries of this, so whether it's hand-picked or algo-picked, it has the same end result.

------------------

I love this quote
If you go to senior citizens and ask, “What do you like about Google?” they’ll say “Clean, fast, and relevant.”

lol really?
Is that the key demographic now?
Talking to a lot of senior citizens at the 'Plex?
Let's see...

Fast - yea, i throw my computer out the window when i load up ASK, Yahoo, or Bing. <rollseyes>

Clean - umm, you mean the actual google.com page, not the SERPS, right?
With all the Universal search, images, videos, onebox, and related searches, Google SERPS are more congested than a personal Yahoo! portal page.

Relevant - 4 straight months of Bing gradually growing marketshare might point to a different story.
Oh yea, and the real "experts" here in these forums saying otherwise.

--------

Back to other threads related to word choice and syntax, MC's comments on Caffeine are telling.
(Trial lawyers, NLP specialists, and "Lie to Me"/"The Mentalist" fans will note this)

"Caffeine was primarily an infrastructural change"

2 points in the first sentence.

He uses the word "was", NOT "is". or "will be".
Why?

He also uses the word "primarily", not "exclusively" or "only".
It obviously is NOT just an infrastructure change.
He has continually used the word "PRIMARILY" indicating it is/was also more.

"That was a huge undertaking over many months from the crawl and indexing team"

Again, he's obviously talking about Caffeine in past tense

Hmm, what took "many months" that had much of the forum confused over the summer?
"Long National Nightmare Update?"

"To most of the world, they probably wouldn’t be able to tell the difference. Maybe just a few search experts can really tell any kind of a difference at all."

Again, he's talking about Caffeine as if it's ALREADY out.
Why would anyone be able to notice anything if it hasn't been implemented yet?

-------------

but in, say, Germany, they have a lot more dashes on average.

This settles the "how many dashes are flagged as spam" debate.

It's always been more of a branding issue.
Telling someone your site is "cheap dash blue dash widgets dash online dot com," isn't great for branding, but doesn't hold any inherent "spamminess" for the algo.

----------------

I think Google was in the mood to have someone rake us over the coals a little bit, and Anil’s post came at the perfect time to remind us our purpose is to make the Web better, our purpose is to return the best search results we can. Our purpose is not to be closed to outside feedback.

Well, see Goog. You're quickly going down a dark, slippery path much more insidious than a "Microsoft moment", and it's the employees who are the real voice that need to stand up and collectively say,

"ENOUGH! This isn't what we were sold on when we came to work here.
Can we forget about the profits and data collection for a moment and just work on ORGANIZING ALL THE INTERNET'S INFORMATION without dollar signs being attached to it?"

[edited by: whitenight at 2:19 pm (utc) on Oct. 5, 2009]

Wicketywick




msg:4001529
 4:11 pm on Oct 5, 2009 (gmt 0)

"Caffeine was primarily an infrastructural change"

Through my project I noticed a significant change in the results since Caffeine. The main change is that google.com with German settings and google.de now produce very similar results, whereas before the SERPs were very different.

MrSavage




msg:4001880
 3:46 am on Oct 6, 2009 (gmt 0)

Okay, wow. How is it that one person, or a group of persons at Google, can shove a site down in rankings or off page 1 just because they decide to? Now, let's see here. That doesn't concern anyone? In some circles that would be considered communist rule. A stretch, yes, but when a few people can control information to that extent it's scary. It's scary because of Google's market share. I go back to China being the example: if we only use Google then we are trusting that they aren't manipulating results. A group of people pooping on a site, to me, is manipulating the results. That group or person is deciding what is or isn't a good result. It may have been fine in the past, but not when they own >80% of the internet information that the world sees. Now, this may be fine if you trust these people making those decisions, but what if they had another agenda? A bias? What if they didn't like you and had the ability to poop on your site? Well, sounds like they do have the ability to poop on your site. What does this mean? In cyberspace, it's called playing God. If I could control what you do or don't see, I'm essentially God.

tedster




msg:4001902
 4:28 am on Oct 6, 2009 (gmt 0)

MrSavage, if there is to be any set of results at all, then there must be an algorithm and someone, or a group of someones, will write that algorithm. Those criteria cannot be anything but subjective to a degree. There is no such thing as a #1 url or # anything url, in absolute terms.

Yes, that is a lot of power, but how else can it possibly be? Any search engine must "manipulate" results - that's their business. How else could there be any results at all, should they just be in alphabetical order, like a phone book?

If you think this through a bit more, and learn a bit more about the science of Information Retrieval [IR], you may find yourself in a better position to build websites that do well in acquiring search traffic.

MrSavage




msg:4001915
 5:22 am on Oct 6, 2009 (gmt 0)

I may not have made a clear point. It's the hand-picking of sites that I'm talking about. Of course the algo is human-powered. I've never seen search engines as a panel of humans who comb over the results and press a button to manually remove, shift, or squash sites at will.

"Well, the Web spam team does have the ability to say this result is spam, so it should be demoted or penalized or pushed down in some way. But we don't have any ability to say for this query ...we think that this page should rank No. 1. "

Think about that for a moment. This is not something that I, with my limited intelligence, realized was actually happening. I can understand adjusting the algo, but hand-picking sites to demote is startling to me. For me, and obviously not to you, having a person or a group of people hand-picking sites to push down is a bit scary. It's not scary if you trust who is doing it. As a webmaster, how can you even get a grasp on what is happening with your rankings? Do ugly sites get pushed down? What if you are on page #1 and are just learning about designing graphics? Ugly color schemes? I always say, what constitutes spam exactly? Eye of the beholder?

Again, look at that quote again. It's nuts. If you can push a site down off the page, then you can essentially cause the other surrounding sites to rise up. If you push 5 down, jeez, that #6 site just went to #1. I'm baffled by the logic in the comment.

People can be fine with that, that's great. I'm not so trusting. My concern is the hand-picking. That's news to me, so maybe I'm just a complete idiot who knows nothing about SEO and search engine companies. Maybe they all hand-pick sites to squash. I have no idea. This is the first I've learned of it happening. It sounds like Mr Cutts is part of a panel that judges what can or cannot show up. They have the power to hit the delete button. Perhaps that's why Google is #1. I've just never thought that search results were open to judgement by a panel or a small group of people. In communism, it's the government that decides what the people should see. They think it's the right thing for them to see, therefore it is. I would be a lot more comfortable with penalties, but when you are open to a panel judging you, I'm not really okay with that. With their dislike of Microsoft, don't you ever wonder about that? Hmm, a slight bias can't enter into a panel that judges websites? Isn't it human to be biased?

I'm having a bit of fun with this. I thought penalties were a way of dealing with results on Google that were suspect. I thought that was the role of the spam team. I didn't know that they could manually squash websites. You would think they could at least send you an email letting you know that your site sucks. At least with a penalty, there is a path you can follow to lead yourself back to rankings and/or back to the index. That info may explain why there are hundreds of "what happened to my site rank, I've dropped from 2 to 10!" threads. I think, folks, you now have your answer.

Perhaps Google wants to be more like a social networking search where everyone has a say in what ranks and what doesn't rank. That's great. The big difference is, I would trust a search result that is based on the opinions of 15 million people, rather than what, 10 or 20 Google employees.

[edited by: MrSavage at 5:46 am (utc) on Oct. 6, 2009]

tedster




msg:4001922
 5:44 am on Oct 6, 2009 (gmt 0)

Thanks for clarifying - now I understand a bit better where you're coming from. You may find this thread informative about how these human website raters work:

Google Patent - human editorial opinion [webmasterworld.com]

TheMadScientist




msg:4002073
 12:44 pm on Oct 6, 2009 (gmt 0)

I can understand adjusting the algo, but hand-picking sites to demote is startling to me. For me, and obviously not to you, having a person or a group of people hand-picking sites to push down is a bit scary. It's not scary if you trust who is doing it. As a webmaster, how can you even get a grasp on what is happening with your rankings? Do ugly sites get pushed down? What if you are on page #1 and are just learning about designing graphics? Ugly color schemes? I always say, what constitutes spam exactly? Eye of the beholder?

I think that was addressed in the answer to the question following your quote: (Emphasis Mine)

A: That's correct. We've made a deliberate choice that we don't want to. Because if you think about it, those kinds of choices tend to get stale, it's not very scalable, it doesn't work very well in other languages.

But in our group, we vastly rely on algorithms. We try to write new techniques and algorithms. But if someone writes in and says I typed in "Rob Hof" and got "#*$!", they're really unhappy if the reply is well, we think we'll have a new algorithm to deal with that in about six to nine months, so check back and the "#*$"! may be gone maybe by the end of the year. So we'll take action. Even then, we try to do it in a scalable way.

I think sometimes we look at things on too small a scale compared to what they have to do... Think about managing just 1,000,000 pages (not sites, pages) and it's your job to find a way to do it by yourself.

The only way is, when you 'push down' one site, to review that site's 'pattern' heuristically and then 'match' sites exhibiting the same characteristics on a large scale: not based on ranking position, but on the 'pattern of unwanted results', no matter where they rank.
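That "review one site's pattern, then match lookalikes at scale" idea can be sketched as a toy similarity check. Every feature, signal name, and threshold below is invented for illustration; this is not Google's actual method:

```python
# Toy sketch of "demote one site, then catch lookalikes at scale".
# Features and the threshold are invented for illustration only.

def features(page: dict) -> dict:
    """Reduce a page to a few coarse spam-related signals."""
    words = page["text"].lower().split()
    spammy = {"cheap", "free", "viagra", "casino"}
    return {
        "spam_word_ratio": sum(w in spammy for w in words) / max(len(words), 1),
        "link_count": page["links"],
        "hyphens_in_domain": page["domain"].count("-"),
    }

def similarity(a: dict, b: dict) -> float:
    """Crude inverse-distance similarity between two feature dicts."""
    dist = sum(abs(a[k] - b[k]) for k in a)
    return 1.0 / (1.0 + dist)

def find_lookalikes(flagged: dict, corpus: list, threshold: float = 0.5) -> list:
    """Return domains whose pattern resembles the hand-flagged page."""
    target = features(flagged)
    return [p["domain"] for p in corpus
            if similarity(target, features(p)) >= threshold]
```

The point of the sketch is the scaling property: one manual judgement ("this page is spam") becomes a reusable pattern applied to the whole corpus, regardless of where any matching page currently ranks.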

It seems like you are forgetting the job of the Google Search Quality and Web Spam Team is to keep the entirety of the index (results) clean, not to clean out sites from the index one at a time.

You say, it's ok if you trust the people doing it, but what about the way a site looks, could it get pushed down? I would say the Search Quality and Web Spam Team is much less likely to try and do this than you might think, because it's not something they can detect heuristically and do on a large scale. They are much more likely to find a 'spam site' (a site (or page) showing in a result set it should not show in, if it should show at all) and find a way to detect that site (or page) over a large scale regardless of appearance, because that's their job.

As far as it being scary to you with regard to Google showing or not showing a site based on look, feel, etc., you must remember Google is a website, and they have every right to base their results on whatever they feel like. So if you don't think your websites look as high quality as the others in the surrounding results, and you think they might 'filter you out' as a result of the look and feel presented, then don't worry about it: get a nice shiny new template and put your mind to rest, because it's their website, and whom they choose to link to and why is up to them, much the same way whom you choose to link to and why is up to you.

Do you have a bias on what sites you link to from your website(s)? Would you link to a site you thought was off topic based on what someone was looking for on your website? Would you link to, and possibly send your visitors to, a site you thought looked like garbage? Google's a website (with way more traffic than yours or mine), but a website nonetheless, and they have no obligation to anyone except their shareholders. Period. Not you. Not me. Not anyone. It's not bound or obligated to webmasters. It's bound and obligated to shareholders.

Here's one for you:
Do you always notify and inform webmasters when you remove a link to their website? Do you take into account how removing their link will directly impact their level of income by no longer sending them any visitors? Do you take into account how the removal of the link will indirectly impact their income by causing a change in the inbound link count and inbound anchor text to the website? If you don't, why should they?

If I said it was scary that you don't do any of the preceding, or that you could 'hand pick' sites to show on your links page(s), would you think I was nuts? Yep, Google generates and sends websites way more traffic than yours does, but it's just another website, and where's the threshold? When does a site become busy enough that it should be obligated to the people it links to?

Seriously, they have less, much less, control over the websites they link to than anyone here and you think that's scary and they have too much control over what they show in their index?

What's more scary to me is how so many webmasters have allowed themselves to become dependent on another website which is out of their control, and think somehow that website is obligated to them in some way, shape or form because of their dependence on it... Talk about scary.

Also, if you think Google should be obligated to you or anyone else 'because of all the free content', disallow GoogleBot in robots.txt and be done with it. Everyone has the same legal ability to access and display the content Google uses for its index, including you, so how is it possible Google is obligated to another site owner when you are not? If you link to another website using the title and description from other websites, you are using exactly the same 'free' content Google is, on a smaller scale... Scale up if you feel the need, but they are no more obligated to you than you are to the webmaster you got the free page title and description from.
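"Disallow GoogleBot and be done with it" really is a two-line robots.txt change. Python's standard-library parser can confirm how a crawler would read a file that blocks only Googlebot while allowing everyone else:

```python
# Check how a robots.txt that blocks only Googlebot is interpreted,
# using Python's standard-library parser. The rules below mirror the
# "disallow GoogleBot in robots.txt and be done with it" suggestion.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Googlebot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "/any/page.html"))    # Googlebot is blocked
print(parser.can_fetch("SomeOtherBot", "/any/page.html")) # everyone else may crawl
```

Note that robots.txt is advisory: well-behaved crawlers like Googlebot honor it, but nothing technically enforces it.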

[edited by: tedster at 6:06 pm (utc) on Oct. 6, 2009]
[edit reason] fixed character set problem [/edit]

MrSavage




msg:4002075
 12:57 pm on Oct 6, 2009 (gmt 0)

When you monopolize internet search, the game changes. It's not just some small issue. It's not their fault they have >80% market share of the free world. I'm not exaggerating by much. When we as a society are becoming more and more search- and internet-dependent, the thought of a few being able to manually manipulate what you see or don't see should be troubling to you. That is, unless you have a fairyland view of Google (a corporation) and think that they could never have bias. Um, like disliking/competing head on with Microsoft. You can't dumb this down and say, oh, it's their search, they can do what they want with the results. If necessary, reconsider what >80% of search means. That's owning the internet and what you see. What if they liked affiliate sites? There is certainly a bias against anyone running a site with affiliate programs. Right there you have bias in search results.

TheMadScientist




msg:4002078
 1:11 pm on Oct 6, 2009 (gmt 0)

It's not their fault they have >80% market share of the free world.

Yes, it is...
How is it possibly not their fault?
They do a better job than anyone else.
It's absolutely their fault.

That is, unless you have a fairyland view of Google (a corporation) and that they could never have bias.

Nope. I have a brutal reality view of the situation and the brutal reality is...

...oh it's their search they can do what they want to the results...

Reality Bites.

There is certainly a bias against anyone running a site with affiliate programs. Right there you have bias in search results.

File a law suit...
See where it gets you, because the reality is:

...oh it's their search they can do what they want to the results...

##### @ ##### @ #####

But to play along, how do you suggest it be changed?

Robert Charlton




msg:4002329
 6:32 pm on Oct 6, 2009 (gmt 0)

...the thought of a few being able to manipulate manually what you see or don't see, should be troubling to you.

As TheMadScientist just explained quite well, they are not manipulating results manually... they are adjusting their algorithm overall to filter out what they perceive as low quality results.

There is certainly a bias against anyone running a site with affiliate programs. Right there you have bias in search results.

As has been discussed numerous times in this forum, there is no bias against affiliate sites per se. Google does like and value original content, though, which its users also value, and Google tends therefore to filter out pages which are identical to thousands of other pages, all competing for the same keywords, that simply copy affiliate feeds, with no original content or value added.

This dedication to search quality is one of the reasons why Google is doing well and why many cookie cutter affiliate sites are not doing well in Google.

Some of the best quality and highest ranking sites on Google, we should note, are affiliate sites. They are sites, though, where lots of work has been done to make them very, very good.

TheMadScientist




msg:4002466
 9:46 pm on Oct 6, 2009 (gmt 0)

There is certainly a bias against anyone running a site with affiliate programs. Right there you have bias in search results.

I would also like to add: in my opinion, when you analyze the data Google has at a nearly 'microscopic' level, the main bias for anything is based on Visitor Behavior. This is a quote from the interview with Amit Singhal, Head of Google's Core Ranking Team.

Please, find the full thread and link to the article here: [webmasterworld.com...]

A few years ago, our engineers were noticing that on acronyms, we were returning lots of good results but the bolding on the page was not sufficient. If you type CIA, it could mean Central Intelligence Agency, or it could mean Culinary Institute of America. If we did not highlight that this result will go to the government agency, and that result is related to food, users were taking more time to click, wasting more of their time. Could we shave off 30 or 40 milliseconds off in their reaction time? Within a few weeks, we had an experiment, users were liking it, the clickthrough rates were great, and their response times were down.

The above is slightly OT insomuch as it is from another interview with a different member of the Google team, but it is on topic with regard to highlighting the fact that they analyze data at a nearly microscopic level to make decisions, and that just because a site or group of sites fitting a pattern disappears from the index (results), it Does Not indicate it was Matt Cutts and the spam team 'pushing down' sites...

The most likely culprit is actually Google's Visitor Behavior over a period of time. That would lead to a 'conclusion', which would be tested in a result-set and would have to prove itself beneficial to Google's Visitors based on their continuing behavior, where the determination of 'beneficial to Google's Visitors' is guided by constant monitoring and studying of Google's Visitor Behavior at a level most could only dream of.

If you want to blame a 'bias' or 'perceived bias' on anyone, blame the stinking visitors Google caters to for their 'biased and unconscionable behavior' within the index (results) Google provides for them to click on, because somehow I'm sure if Cutts and his team or Singhal and his team received a quantity of e-mails and/or 'help us improve' suggestions from Their Visitors requesting the affiliate sites which were 'oh, so valuable' to Their Visitors be placed back in the results, they would have taken action.

Seriously, who do you think cares if an affiliate site ranks, besides the webmasters hoping to capitalize on Google's traffic? Do you really think the 'missing affiliate sites' provide a better visitor experience for Google's Visitors than the sites you see as replacing them based on a 'bias' at Google? (I'd wager they have data to back up their position if there is a 'bias' of any type.)

<added>
I always say, what constitutes spam exactly?

The answer, according to Google is:
A site violating Google's Quality Guidelines:

If you believe that another site is abusing Google's quality guidelines, please report that site at https://www.google.com/webmasters/tools/spamreport.

[google.com...]

Check out the actual Quality Section of the page, as it seems to directly address affiliate sites and domains with duplicate content among other things.
</added>

physics




msg:4002494
 10:17 pm on Oct 6, 2009 (gmt 0)

Interesting read. I guess I'm in a funny mood, though, because the main things I noticed were a couple of quotes that seemed pretty amusing to me.

“Given this URL, how spammy do we think this URL is?” And we might use dozens of signals—what sort of spammy words do they use, the backlinks to this URL, how spammy do those look. All of those blend together into a master ranking algorithm.

Sounds like every website is assumed guilty until proven innocent :p Basically he's saying every URL is spam, just more or less spammy than others.
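The "dozens of signals blended into a master ranking algorithm" description is, at its core, a weighted scoring function. A minimal sketch, with signal names and weights invented for illustration (not Google's actual signals):

```python
# Minimal sketch of blending per-URL spam signals into one score.
# Signal names and weights are invented for illustration.

SIGNAL_WEIGHTS = {
    "spammy_words": 0.5,      # fraction of known spammy terms on the page
    "spammy_backlinks": 0.3,  # fraction of inbound links from flagged hosts
    "keyword_stuffing": 0.2,  # repetition of the target phrase
}

def spam_score(signals: dict) -> float:
    """Weighted blend of signals, each already normalized to [0, 1]."""
    return sum(SIGNAL_WEIGHTS[name] * value for name, value in signals.items())

clean = spam_score({"spammy_words": 0.0, "spammy_backlinks": 0.1, "keyword_stuffing": 0.0})
shady = spam_score({"spammy_words": 0.9, "spammy_backlinks": 0.8, "keyword_stuffing": 0.7})
# Every URL gets *some* score: "more or less spammy", never provably zero risk.
```

Which matches the "guilty until proven innocent" reading: the output is a continuous spamminess estimate for every URL, not a binary spam/not-spam verdict.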


So there’s a bunch of stuff going on in the background where Google is querying itself sometimes to make sure we’re returning the right results.

Google Googles itself!

tedster




msg:4002495
 10:22 pm on Oct 6, 2009 (gmt 0)

Basically he's saying every URL is spam, just more or less spammy than others.

Since Matt is the head of the spam fighting team, I guess that viewpoint comes along with the job - or at least it takes over after a few years :)

TheMadScientist




msg:4002496
 10:23 pm on Oct 6, 2009 (gmt 0)

I'm in a funny mood though

Me too, because in reading your post I just had this 'opposing viewpoints' thought with Cutts in one office and Singhal in another office each looking at the same URL and one repeatedly pushing the 'spammy' button and the other repeatedly pushing the 'quality' button at the same time, and I keep wondering who would win...

AG4Life




msg:4003799
 4:31 pm on Oct 8, 2009 (gmt 0)

I think Google is losing the battle against spammers in a big way. It's using penalties to fight against unscrupulous operators that don't really care about penalties, not when they can just set up a chain of new sites once the old ones are taken down. And they use risky techniques that often do work to get their rankings up, even if it is just for a relatively brief moment before the sites get taken down.

I actually came onto this forum just now because I just had a really bad experience in relation to spam sites on Google. I just did a search for "chipotle bleed" - yes, I just watched the latest episode of South Park - and it was malware sites galore on the search results - the top 5 all appeared to be hacked sites with scripts added to launch malware - luckily, my anti-malware software was there to spring into action and prevent any serious damage.

All the pages seemed to have been new, probably set up after the new South Park episode aired (about 12 hours ago). This is probably Google's new live search algorithm on show, bringing me the freshest content for the hottest topics. Shame then that it can't tell the difference between a malware site and a legitimate one, and I fear this could be a very successful exploit for spammers in the future.

Reno




msg:4003818
 4:52 pm on Oct 8, 2009 (gmt 0)

It's using penalties to fight against unscrupulous operators that don't really care about penalties, not when they can just set up a chain of new sites once the old ones are taken down.

Bullseye. Google is constantly fighting the latest battle in a never-ending war. I wish they would significantly increase the value (in the algo) of those sites that have genuine "historical authority" -- that is to say, those sites that most fulfill the query AND have been around for a while. Whether those sites trade links with a thousand other sites is irrelevant; whether they pay for outside links is irrelevant; whether they include affiliate banners is irrelevant; whether the site owner has numerous other websites is irrelevant. The only thing that matters is this: Do they fulfill the query, and have they shown their long-term commitment (and thus have trust & authority)? That is mostly what people care about.

..........................................

tedster




msg:4003836
 5:01 pm on Oct 8, 2009 (gmt 0)

Reno - I'm with you. Google's need to deliver a "smackdown" instead of just ignoring the hoped for effect of those link building methods is hurting the results.

TheMadScientist




msg:4003873
 5:42 pm on Oct 8, 2009 (gmt 0)

Bullseye. Google is constantly fighting the latest battle in a never-ending war. I wish they would significantly increase the value (in the algo) of those sites that have genuine "historical authority" -- that is to say, those sites that most fulfill the query AND have been around for a while. Whether those sites trade links with a thousand other sites is irrelevant; whether they pay for outside links is irrelevant; whether they include affiliate banners is irrelevant; whether the site owner has numerous other websites is irrelevant. The only thing that matters is this: Do they fulfill the query, and have they shown their long-term commitment (and thus have trust & authority)? That is mostly what people care about.

I've got to agree with almost everything you say...

The real question is:
How do you algorithmically (heuristically) determine the preceding?

IMHO, Google's fundamental theory is flawed: Links are *not* the answer, because I think spam gets much easier to fight when you can't game the system with inbound links...

dstiles




msg:4003997
 9:57 pm on Oct 8, 2009 (gmt 0)

A lot of the malware sites are on established IPs or hosted with known malware-friendly hosts or on known bad DNS servers. Does that suggest a way out for google?

Ok, a lot of them may not be, but it's a good start. And it shouldn't be beyond them to detect poisoned DNS, surely?

Malware and spam aren't necessarily the same thing, although they can overlap. I've mentioned a few times that I'm also in favour of google dropping the useless (but spammer-friendly) links system. I'm sure it's only retained as a prestige thing now.

tedster




msg:4004009
 10:36 pm on Oct 8, 2009 (gmt 0)

A lot of spam today is based on parasite hosting served from hacked websites - edu pages and others. There are major networks of this type, interlinked, supporting each other, appearing and disappearing. Very hard to catch "on a dime", although within a short period of time, most are detected and whacked.

dstiles




msg:4004518
 9:26 pm on Oct 9, 2009 (gmt 0)

That is, of course, true, but surely google could do something to detect their subversion. Some botnets are still relatively predictable.

Blocking dynamic IP ranges could easily catch hijacked domestic machines: there is no reason why a serious web site should be hosted on broadband. Admittedly most of these trojans are more concerned with spam and hacking but one wonders if some are used to cloak feeds to google.
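Spotting a broadband/dynamic IP is often approximated by inspecting its reverse-DNS (PTR) name for ISP-pool tokens. A toy sketch: the token list is illustrative, and a real check would first do the PTR lookup itself (e.g. via `socket.gethostbyaddr`), which is elided here:

```python
import re

# Toy heuristic for "does this reverse-DNS name look like a dynamic
# consumer broadband line?". The token list is illustrative only; a real
# check would perform the PTR lookup and maintain far better patterns.
DYNAMIC_TOKENS = re.compile(r"(^|[.-])(dyn|dsl|adsl|cable|pool|dhcp|ppp)([.-]|\d)", re.I)

def looks_dynamic(rdns_name: str) -> bool:
    """True if the hostname contains a common ISP dynamic-pool token."""
    return bool(DYNAMIC_TOKENS.search(rdns_name))
```

Requiring a separator or digit around each token keeps names like "dynamicsolutions" from triggering a false positive.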

It would be interesting to see if browsers catch on to DNS checking in addition to relying on "bad sites" lists. It could be erratic to begin with since many sites running on virtual servers have no proper rDNS but it could certainly check for sites where DNS is commonly hijacked (eg banks and google itself), either by false DNS or by local modification of Hosts or trojan-controlled substitution.

tedster




msg:4004520
 9:44 pm on Oct 9, 2009 (gmt 0)

The hijacked .edu pages I've seen this year were served from major university servers, not someone's dorm room or home. And yes, sometimes a server hack also involves cloaking for googlebot - the site owner can have a devil of a time understanding why their rankings fell through the floor. Or, in some cases, the student pages are no longer maintained - they're just sitting there like over-ripe fruit that didn't fall off the branch.

dstiles




msg:4004928
 8:34 pm on Oct 10, 2009 (gmt 0)

There are still ways to get around this, though. Google sends out enough non-bot bots from their own IPs and it's been suggested that they also send from other services' ranges so a little bit of extrapolation on their part using non-google IPs...

Although since they seem to be getting block info from a third party I would have thought that was an even better way around it.

I don't believe there isn't some way of detecting and killing / not listing malware and well-known black hat sites. There are certainly publicly available lists of open proxies and zombie machines - does google use these in any form?
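Those publicly available lists are typically published as DNS blocklists (DNSBLs): you reverse the IPv4 octets, prepend them to the list's zone, and do an ordinary A-record lookup, where any answer means "listed". A sketch of the query-name construction; the default zone is just one well-known example, and the network lookup itself (e.g. `socket.gethostbyname`) is elided:

```python
# Build the DNS name used to query a public blocklist (DNSBL) for an
# IPv4 address: reverse the octets, then append the list's zone.

def dnsbl_query_name(ip: str, zone: str = "zen.spamhaus.org") -> str:
    octets = ip.split(".")
    if len(octets) != 4 or not all(o.isdigit() and 0 <= int(o) <= 255 for o in octets):
        raise ValueError(f"not an IPv4 address: {ip!r}")
    return ".".join(reversed(octets)) + "." + zone

# e.g. checking 192.0.2.99 against the example zone queries
# "99.2.0.192.zen.spamhaus.org"; an A-record answer means "listed".
```

So consuming these lists is cheap: one DNS query per candidate IP, which is exactly why mail servers already use them at scale.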

tedster




msg:4005005
 11:38 pm on Oct 10, 2009 (gmt 0)

It's easier to find malware, I think, than it is to find parasite hosted content and links that are used to boost a real offering. Even so, Google does find that junk after a while.

There are certainly publicly available lists of open proxies and zombie machines - does google use these in any form?

I can't say for sure, but I assume they do.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved