
Forum Moderators: open


what does google think about programmed queries like link builders?

Thinking about getting a link builder. Yes or No?

     
10:08 pm on Nov 25, 2002 (gmt 0)

New User

10+ Year Member

joined:Nov 4, 2002
posts:31
votes: 0


I'm thinking about trying Zeus and/or OptiLink to increase links to our clients' sites. Does anyone have any experience with these programs, good or bad? And does anyone know what Google or the other major SEs think about using these programs?
I'm looking for a way to find a good number of relevant links to our clients' sites. If the above programs are not the way to do this, I'm open to suggestions.
10:17 pm on Nov 25, 2002 (gmt 0)

Administrator

WebmasterWorld Administrator rogerd is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Aug 2, 2000
posts:9687
votes: 1


Hello, anoryu. Google will hand out PR penalties when it identifies artificial linkage for PR enhancement. Search for "PR0" and "zeus" here to read past discussion. I don't know much about OptiLink, but if it involves dodgy linkage, stay away.

I'd recommend getting quality links the old-fashioned way: submitting to relevant directories, contacting owners of related sites, etc.

10:21 pm on Nov 25, 2002 (gmt 0)

New User

10+ Year Member

joined:Nov 4, 2002
posts:31
votes: 0


The only problem is that we have 1800+ sites that we are trying to get links for. We're not trying to spam or get into lists that have nothing to do with our products. We're just looking for an efficient way to get quality links for our sites.

Woz

10:31 pm on Nov 25, 2002 (gmt 0)

Senior Member

WebmasterWorld Senior Member woz is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Aug 13, 2000
posts:4823
votes: 0


The challenge is not what is done, but rather how the engines interpret what is done.

Getting links really comes down to legwork, or should we say fingerwork. There is great advice, and there are good ideas, in the Reciprocal Links [webmasterworld.com] forum on the best ways to go about this.

Your first step should probably be to visit the relevant categories in the Google directory and work your way down the lists there.

Used properly, Zeus is actually a good program for "finding" relevant sites that may not be listed in the Google Directory or are otherwise a little off the radar. However, using the built-in email and publishing systems in Zeus can often lead to disaster. Rather, I would use Zeus to collect data and then hand-process from there.

Onya
Woz

11:43 pm on Nov 25, 2002 (gmt 0)

Preferred Member

10+ Year Member

joined:Aug 26, 2002
posts:535
votes: 0


A direct quote from Google on the use of zeus...

"Like any other program, Zeus is a tool that can be used or misused. Google judges the quality of a site partly by the quality of the pages that site links to. If a webmaster links to poor-quality or spammy sites, that can affect his or her site's ranking. As a program that actively engages in searching out links, Zeus can amplify that factor. Webmasters who use Zeus should be extremely careful - adding links to sites tagged as spam can lead your site to be tagged as spam, and Zeus can clearly play a role in that process."

Matt Cutts

Google Software Engineer and Spam Czar

If you only use Zeus to find potential link partners you aren't at any risk. Automated emails are rarely appreciated by webmasters and have been proven to be less effective than personalized ones.

As for allowing Zeus to actually manage your directory of links... do so at your own (hearty) risk.

12:20 am on Nov 26, 2002 (gmt 0)

Senior Member

WebmasterWorld Senior Member jomaxx is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 6, 2002
posts:4768
votes: 0


How can anybody do SEO for 1,800+ sites? That's insane.

I don't see how the manual approach is going to be feasible, but expect to get your clients in trouble if you use Zeus.

1:43 am on Nov 26, 2002 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 30, 2001
posts:1739
votes: 0


>How can anybody do SEO for 1,800+ sites? That's insane.

Surely not as insane as anyone thinking they have enough content for 1,000 sites!

2:24 am on Nov 26, 2002 (gmt 0)

New User

10+ Year Member

joined:Nov 26, 2002
posts:8
votes: 0


anoryu,

One of my customers suggested I answer your question concerning OptiLink.

OptiLink does not find or make links, but instead analyzes the links you do have and the ones you are considering asking for.

It is a standalone desktop program built around a highly specialized browser, but the easiest way to explain what it does is to describe how to do the same thing manually with a standard browser and some office software.

1. Using your URL, do a search at your favorite engine for link:<myurl>. This will return a list of every page that links to yours.
2. Next, visit every one of those pages, find where the page links to you, and copy the link text that it uses to a spreadsheet.
3. When you're done with that, divide the links up into separate words and use your spreadsheet program to tally what percentage of links use each word.

About now you're probably wondering why? Because this is pretty close to what the search engine is already doing to gauge the relevance of your site.
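
For anyone who would rather script that tally than grind through it in a spreadsheet, here is a rough sketch of the same three steps in Python. The site URL, the list of referring pages, and the regex-based anchor extraction are all placeholders of mine for illustration; this is not OptiLink's code, just the manual process applied to pages you have already collected.

    # A minimal sketch of the manual anchor-text tally described above.
    # The URL, the referring pages, and the regex extraction are
    # illustrative placeholders, not OptiLink's method.
    import re
    import urllib.request
    from collections import Counter

    MY_URL = "http://www.example.com/"          # hypothetical site
    REFERRING_PAGES = [                         # pages already known to link to it
        "http://www.example.org/links.html",
        "http://www.example.net/resources.html",
    ]

    word_counts = Counter()
    total_words = 0

    for page in REFERRING_PAGES:
        html = urllib.request.urlopen(page).read().decode("latin-1", "replace")
        # Step 2: find each link to MY_URL and keep its anchor text.
        pattern = r'<a[^>]+href="%s"[^>]*>(.*?)</a>' % re.escape(MY_URL)
        for anchor in re.findall(pattern, html, re.I | re.S):
            text = re.sub(r"<[^>]+>", " ", anchor)   # strip any nested tags
            # Step 3: split the anchor text into words and tally them.
            for word in text.lower().split():
                word_counts[word] += 1
                total_words += 1

    for word, count in word_counts.most_common():
        print("%-20s %5.1f%%" % (word, 100.0 * count / total_words))

The output is simply the percentage of your inbound anchor text made up of each word, which is the figure the spreadsheet exercise above produces by hand.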

Google is currently "conflicted" concerning OptiLink, having taken a stand, retracted, and gone back again, but this is only an issue for people _promoting_ the product and will soon be resolved one way or another anyway.

Selling is always a very visible activity and prone to getting you shot at, but using OptiLink is completely invisible to both search engines and the linking sites you access, so what you do in the privacy of your own office is pretty much no one else's business but your own.

2:31 am on Nov 26, 2002 (gmt 0)

Full Member

10+ Year Member

joined:July 10, 2002
posts:232
votes: 0


Anoryu,

My experience from looking at a lot of Zeus pages is that Google gives PR0 to the links pages when it can. A links page with no pagerank lacks a certain je ne sais quoi!

But Google doesn't otherwise seem to penalize those sites -- only the links pages seem to be PR0'ed. There may be exceptions of course.

Me? I don't even link to Zeus sites when they ask me to. I tell them why and I suggest they move to a more conventional linking system.

2:49 am on Nov 26, 2002 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Oct 4, 2002
posts:666
votes: 0


1800 sites? When do you sleep?
3:45 am on Nov 26, 2002 (gmt 0)

Administrator

WebmasterWorld Administrator rogerd is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Aug 2, 2000
posts:9687
votes: 1


Meditation Man, I don't know how penalties are determined by Google, but I'm aware of at least one site that was penalized (as a whole) due to the presence of a Zeus links directory.
4:20 am on Nov 26, 2002 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Oct 8, 2001
posts:2882
votes: 0


Someone said that "Google is currently "conflicted" concerning OptiLink..."

I think I can solve that conflict. Our Terms of Service do not permit programmatic queries without permission. Someone who uses OptiLink may have their IP addresses banned or their domains removed from our index. It's that simple.

This stance should be pretty clear from our webmaster guidelines, but I'll check with folks to make sure that the next time we update our webmaster guidelines, we make it more clear.

Added: just to clarify on one point, Windrose states that "using OptiLink is completely invisible" to search engines. As always, you may want to take such claims from the creator of a program with a grain of salt. Other people have claimed their programs were completely invisible before, and such claims have also been proven false before.

In all seriousness: please do not use programs to send any automatic queries (of any kind) to Google. Such programs take server resources away that should be used to serve queries to actual searchers.

6:53 am on Nov 26, 2002 (gmt 0)

New User

10+ Year Member

joined:Nov 26, 2002
posts:8
votes: 0


Dearest GoogleGuy,

Google's Terms of Service are at best ambiguous. ALL queries are "automated" to the extent that programs are the only means to make queries. A query from IE is an "automated" query. If anyone doubts this, they should try doing one by hand. Or if you can't figure out how to get your hand stuck into an RJ11 jack, then use telnet. Mechanically that's less painful; of course telnet is a program (read: automated), but in trying to query Google with nothing but telnet you will soon find out just how much today's browsers "automate".
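
To make that concrete, here is roughly what a supposedly "manual" query boils down to once the browser is stripped away: a short Python sketch of the same raw request you would otherwise type into a telnet session. The search term is made up, and this is my illustration of the point, not anything OptiLink does.

    # Roughly the raw HTTP exchange behind a single "hand-made" search;
    # the query term is a placeholder.
    import socket

    request = (
        "GET /search?q=widgets HTTP/1.0\r\n"
        "Host: www.google.com\r\n"
        "User-Agent: hand-typed-telnet-session\r\n"
        "\r\n"
    )

    sock = socket.create_connection(("www.google.com", 80))
    sock.sendall(request.encode("ascii"))

    response = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        response += chunk
    sock.close()

    print(response[:200].decode("latin-1", "replace"))   # status line and headers

Every browser performs some elaboration of exactly this exchange on every query; the only difference is how much typing it saves you.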

What does Google imagine separates a standard browser, which presumably you would not classify as automated, from programs that you presume the ToS bans? Programs that run unattended or run from server farms might arguably qualify, but what about programs that will not operate w/o the end user pressing the start button from the comfort of his own desk chair? OptiLink does nothing more nor less than fetch the same result page IE does when doing a link query and provides a very similar graphic to the user. This is in some way more automated than IE? A strange notion indeed.

But no more strange than Google's notion of what constitutes appropriate countermeasures, which in most legal contemplation would be instantly deemed both arbitrary and vindictive. Blocking an IP due to server load, a variation of denial of service, is reasonable and justifiable. But banning an unrelated domain? Or what about banning a site for linking to a banned site? Or banning a domain unrelated to the "offensive software" just because the whois data shows that it is owned by someone who has made a public endorsement of said product? What about doing this three times in one month? I suspect that you know first hand that these are more than just theoretical possibilities. Who knows, we might find out through discovery.

And speaking of that, to clarify your point of clarity as to what "Windrose states", my name is Leslie Rohde. Just who are you?

Talk is cheap. It's easy to level threats of retribution, and to create fear and doubt, shielded all the while by anonymity, but can you prove what you say? I don't think so.

In all seriousness(?). Compared to what? Banning someone's business without notice for 5 months because of a 3 sentence endorsement on someone else's sales letter? That's not serious?

OptiLink produces no "automated queries" -- assuming that you could even define what that means -- a conclusion reached by even the most cursory technical analysis.

"Such programs" take far less server resources than used just to serve the Google logo. If you could show any legally cognizable damage I would withdraw my argument, but I have not heard one yet. Maybe the ninth circuit can help us.

With the very best regards,

Leslie Rohde
creator of OptiLink

7:23 am on Nov 26, 2002 (gmt 0)

Administrator from US 

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 21, 1999
posts:38258
votes: 115


Of course it's difficult to lay out any specifics. It has to stay generic. As stated above, it is impossible to deduce what is and what isn't a human or an acceptable tool accessing any web site. It really doesn't matter. The only qualifier that matters is: is the usage abusive? Any site has to take security steps to thwart abuse the best way it can. Programmed or automated querying represents a potential for abuse that few of us experience on our own sites. I don't think any of us can blame G for taking whatever steps are necessary to protect the operation and integrity of their site - we'd do the same. That is in no way a comment about the quality or creativity of any program.
8:27 am on Nov 26, 2002 (gmt 0)

New User

10+ Year Member

joined:Oct 4, 2002
posts:31
votes: 0


With regards to Zeus, I just got this in an email from them:

<snip>

The part about not misusing Zeus I can believe, but '...Google is happy with the quality...'? Can this be true or has Mr. Cutts been 'creatively quoted'?

Any thoughts?

[edited by: Brett_Tabke at 8:43 am (utc) on Nov. 26, 2002]
[edit reason] see tos - please, no email quotations at all - thanks [/edit]

10:05 am on Nov 26, 2002 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Oct 8, 2001
posts:2882
votes: 0


Windrose, I think our terms of service are pretty straightforward. The essence behind them goes something like:
- we intend for our searches to be invoked by hand by a human, not invoked by any program
- programmatic queries use server resources that cost us money
- therefore, programs such as optilink are not welcome to query on Google

I'm trying to be as unambiguous as I can and give a good-faith heads-up. You're welcome to take that good-faith heads-up however you want. I just wanted to clarify that queries from optilink are not welcome at Google, because they are triggered from a program.

Anyway, in case no one else has said it: welcome to WebmasterWorld! There's always lively discussion around here, and I hope you decide to stick around and take part. :)

10:44 am on Nov 26, 2002 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 29, 2000
posts:1133
votes: 0


Brett >> Programmed or automated querying represents a potential to abuse that few of us experience on our own sites

Yeah, like that Zeus spider abusing our sites!

11:51 am on Nov 26, 2002 (gmt 0)

Administrator from US 

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 21, 1999
posts:38258
votes: 115


An interlude,

I used to run programmed queries against Google for rank checking.

Then after WebmasterWorld took off here in '00, I started to get an old-fashioned, first-hand education in the real-world effect of programmed requests.

It wasn't bad at first, with a couple thousand requests a day at most. Then our content started growing by multiples and the rogue bots started increasing. In '01, the worst of it wasn't just a couple thousand a day, it was tens of thousands. By mid-June of this year, it was a couple hundred thousand at some points. The scope of the problem can't be overstated. There were a few days when it was over 10 gig in bandwidth just for bots. It would slow the site down and it even knocked us offline a few times.

That's just another form of DOS attack. This is where the lightbulb started to go on and I gained a newfound sympathy for anyone else who'd experienced the same.

You'd have to take our experiences times a million to even begin to get a handle on the scope that Google has to deal with. When you throw in actual programs that were designed to do just that sort of thing, who could fault Google for looking at these programs in the same vein as the latest DDOS program bandied about on IRC channels?

I spend an hour a day here analyzing logs for those types of problems. I've worked on a detailed set of passive and aggressive countermeasures. In some cases, it has caused our members to become innocent bystanders [webmasterworld.com] and it has obviously hurt the operation of the site.
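
For the curious, the passive side of that log analysis is nothing exotic. A crude sketch, assuming an Apache-style access log where the client address is the first field (the file name and threshold here are made up), is enough to surface the worst offenders:

    # Count requests per client address in an Apache-style access log
    # and print anything over a threshold. File name and threshold are
    # placeholders for illustration.
    from collections import Counter

    LOG_FILE = "access.log"      # common/combined log format
    THRESHOLD = 5000             # requests in one log that smell like a bot

    hits = Counter()
    with open(LOG_FILE) as log:
        for line in log:
            ip = line.split(" ", 1)[0]    # client address is the first field
            hits[ip] += 1

    for ip, count in hits.most_common():
        if count < THRESHOLD:
            break
        print("%-15s %8d requests" % (ip, count))

The aggressive side - deciding what to actually do about each address - is where the innocent-bystander problem comes in.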

I also had a problem with the ambiguous nature of Google's TOS for a while. Then I tried to write one here to address the same situation of programmed bots. You spell it out and ten minutes later someone has figured out a way around it. Even this very post is a threat to the system, and I have no doubt it will result in some bot querying just to probe the edges and/or tweak me - whichever comes first.

It's not as if Google is without alternatives for those who would like to use its data, either. They put the API in place and let you have at it, up to 1k queries a day, and they offer both free and paid search services.

On the other end, they've taken every step they can to thwart the bots (DDOS attacks). When a program comes along that specifically targets their service, we can't blame them for doing everything in their power to protect their system. They are no different than us in that regard (just bigger, with much bigger pipes).

I guess that concludes my education.

Please excuse me for the next hour - instead of answering member email and messages, I'll be off studying log files for bots and paying bandwidth fees.

12:58 pm on Nov 26, 2002 (gmt 0)

New User

10+ Year Member

joined:Sept 24, 2002
posts:31
votes: 0


This might be a very simple question (or it's being asked by a simple person). Who gets penalised for an 'automated' query directed at Google - the URL being searched for or the IP which originates it?

If the URL gets punished then surely all your competitor has to do is use automated tools to query your site positions. If the IP gets punished then just do all your ranking queries from a dialup account.

Or am I missing something?

1:03 pm on Nov 26, 2002 (gmt 0)

Administrator from US 

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 21, 1999
posts:38258
votes: 115


It can only be the IP, ISP, or agent. There are many who spend a lot of time looking at the competition's sites.
1:10 pm on Nov 26, 2002 (gmt 0)

New User

10+ Year Member

joined:Sept 24, 2002
posts:31
votes: 0


The more I read, the more I realise that I have to get my tiny site off of a virtual server and onto something where I can be absolutely certain that any penalties incurred by the site will be due to my own actions and not somebody else's.

Now if only hosting was as cheap in South Africa as it is in the States... ;)

1:47 pm on Nov 26, 2002 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 29, 2001
posts:2145
votes: 0


People have to realize that when you visit a website (no matter how big or important) you are a guest. Just like your home: if the owner of a home does not want you doing something in their home, they can ask you to leave. Or change the locks.
I hope that I am clear with that analogy.
I had a certain company visit one of our sites hundreds of times a day. I wrote, phoned, and ratted them out to their ISP. It did not stop. Finally I hunted down the owner's e-mail address, and those of all the company's directors, and repeatedly blanketed them with e-mail. When the president of the company finally contacted me, he asked what my problem was; I told him his bot was using my bandwidth. He told me he could do anything he wanted to my site.
Suffice to say I fixed him and his little bot.

So I am in total agreement with Google's policy on automated queries.
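
For anyone fighting the same battle, "changing the locks" on a single misbehaving bot usually comes down to a few lines of Apache configuration. A minimal sketch from the Apache 1.3/2.0 era (the user-agent string and address are made-up placeholders, not the actual offender):

    # .htaccess: refuse requests from a misbehaving bot by user-agent or address.
    # "NastyBot" and 192.0.2. are placeholders.
    SetEnvIfNoCase User-Agent "NastyBot" bad_bot

    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
    Deny from 192.0.2.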

3:49 pm on Nov 26, 2002 (gmt 0)

New User

10+ Year Member

joined:Nov 4, 2002
posts:31
votes: 0


I'm glad that I sparked such a heated debate. I think that we are all in agreement that I shouldn't use Zeus (it was a half-hearted thought anyway). I was hoping that someone would have an idea to replace Zeus, because I really don't want to do 1,800+ sites manually.

>How can anybody do SEO for 1,800+ sites? That's insane.
That's what I have been finding out... but it's now my job and I have to do my best.

>Surely not as insane as anyone thinking they have enough content for 1,000 sites!
Coming up with the content for the sites is surprisingly easy. Optimising it, now that's the hard part.

>1800 sites? When do you sleep?
Sleep, what's that? I seem to remember something from a former life but it's all clou...zzzzzzzzz....dy.

So Google doesn't like automation. I'm good with that; our company doesn't like automated things hitting our sites either. But I would like to pose this to Google: in order to stave off automated queries to the Google engine, why don't they offer reporting? It could be a pay service. I can guarantee that SEOs all over the world would pay through the nose for that service, more so than for AdWords or any other service currently provided. It would pay for more servers and more bandwidth, and it would make everyone happy. Google would be able to stop worrying about automated queries, SEOs would be able to stop running automated queries, and everyone would get the information they are looking for. The reports would allow Google to have more accurate information, and SEOs wouldn't have to piece things together; we could just bring up a page with all the information.

4:27 pm on Nov 26, 2002 (gmt 0)

Senior Member

WebmasterWorld Senior Member jomaxx is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 6, 2002
posts:4768
votes: 0


>> why don't they [Google] offer reporting?

Great idea! Webmasters and SEO's would eat that up. You could pay a certain amount and register one domain, and Google would report links to you, positions for search terms, EXACT pagerank of your pages, impressions & maybe even clickthroughs, permit you to run WebPosition and similar software, etc.

None of this information would really make the slightest difference in terms of actually getting visitors, but there's a huge curiosity factor and people love to crunch numbers.

4:53 pm on Nov 26, 2002 (gmt 0)

New User

10+ Year Member

joined:Nov 4, 2002
posts:31
votes: 0


>None of this information would really make the slightest difference in terms of actually getting visitors...

I think it would make a difference in getting visitors because it would allow us to know if our SEO efforts are working, and if not, we can tweak them in order to get higher on the list, thus bringing in more visitors.

It would also allow us to go back to our customers with the reports and say "see, SEO is important".

I think that all search engines should implement reporting. Or, barring that, have a company that does search engine reporting; someone could set up a company that has direct access to the SEs so that they can run queries at low times, or run the queries on a backup database so as not to interrupt normal server usage.

Everyone wins (except the poor programmer who actually has to write the reporting engine), but with all the money this would bring in, the powers that be should be able to reward said programmer well.

All I ask is that when someone makes billions from this idea, they give credit for the idea to that guy on the WebmasterWorld forum. (And maybe a car.)

4:54 pm on Nov 26, 2002 (gmt 0)

Administrator from US 

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 21, 1999
posts:38258
votes: 115


It would expose the algo to manipulation - that's really the only thing they have that separates them from everyone else.

Google's data is proprietary - how the rankings look, how they are ordered, which excerpts they use, which sites are included - that's their ball game. Open that up to possible algo crackers, and it's all over but the finger pointing, the crying, and the tell-all book by GoogleGuy.

Lastly, who says there will be anything like permanent rankings from Google in the future? We've all watched how "everflux" has slowly walked down the PR scale, and freshbot activity is now constant and increasing. We might see a perpetual update and the end of the monthly updates very soon. That would mean rankings could fluctuate all the time.

5:17 pm on Nov 26, 2002 (gmt 0)

New User

10+ Year Member

joined:Nov 4, 2002
posts:31
votes: 0


>It would expose the algo to manipulation - that's really the only thing they have that separates them from everyone else.

It wouldn't expose the algo; all it does is report on your position, links to your site, etc. You're right that we could refine things until we had a good guess at what the algo is, but Google has the ability to change that whenever they want. It would force both sides to be the best at what they do, and in the end the people who are searching would get excellent results. It would keep Google at the top of the SE pile. Users would come back because of the excellent returns; we would still be trying to get better by giving Google the ability to find the correct information to serve up, and the algo would be tighter in order to bring up the correct results, making Google's reputation even bigger. And the users would get accurate search results, which is what we all want in the end.

>Googles data is proprietary - how the rankings look - how they are ordered - which excerpts they use - which sites are included - that's their ball game. Open that up to possible algo crackers, and it's all over but finger pointing, the crying, and the tell all book by GoogleGuy.

Everything would still be proprietary; we would be fitting into what they [Google] want, again allowing Google to serve relevant information to the users.

>Lastly, who says there will be anything like perm rankings from google in the future? We've all watched how "everflux" has slowly walked down the pr scale and fresh bots activity is now constant and increasing. We might see a perpetual update and the end of the monthly updates very soon. That would mean rankings could fluctuate all the time.

All the more reason for the reporting. As it is now, every month Google gets slammed with webmasters attempting to see what their PR is and then trying to figure out why the PR is what it is. With the report it would be one hit, compared to hundreds per site.
If the rankings are going to fluctuate constantly, there would be hundreds of hits per site per day, whereas with the report it would be at most one hit per site per day.

5:42 pm on Nov 26, 2002 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 29, 2001
posts:2145
votes: 0


Google is a private company, their website is private property. Some think it is their right to rank well and have access to Google's private property. Wrong!

Why would they want to encourage SEO, when I would guess that it is SEOs that cause most of Google's problems?

6:54 pm on Nov 26, 2002 (gmt 0)

New User

10+ Year Member

joined:Nov 4, 2002
posts:31
votes: 0


>Some think it is their right to rank well and have access to Google's private property. Wrong!

I agree, it's not a right to rank well; it's a privilege, and if you follow their rules, SEOs only strengthen Google's results.

>Why would they want to encourage SEO, when I would guess that it is SEO's that cause most of Google's problems.

Google would want to encourage SEOs because it allows them to serve up more accurate results to their users. The problems come with poor SEO practices (i.e. spamming, keyword stacking, automated searching, etc.). If they encouraged proper SEO practices they would only get stronger. Please see my post above (27).

4:10 am on Nov 27, 2002 (gmt 0)

New User

10+ Year Member

joined:Nov 26, 2002
posts:8
votes: 0


Brett_Tabke,

The Google API is for "non-commercial use", which as far as I can tell from my contact with Google means any use other than as a hobby. It was expressly and explicitly ruled out, by Google, as an approach for OptiLink to take. The API is the best alternative technically, but has been withheld by Google.
