I'd recommend getting quality links the old-fashioned way: submitting to relevant directories, contacting owners of related sites, etc.
Getting links is really down to legwork, or should we say fingerwork. There is great advice and there are good ideas in the Reciprocal Links [webmasterworld.com] forum on the best ways to go about this.
Your first step should probably be to visit the relevant categories in the Google directory and work your way down the lists there.
Used properly, Zeus is actually a good program for "finding" relevant sites that may not be listed in the Google Directory or are otherwise a little off the radar. However, using the built-in email and publishing systems in Zeus can often lead to disaster. Instead, I would use Zeus to collect data and then hand-process from there.
"Like any other program, Zeus is a tool that can be used or misused. Google judges the quality of a site partly by the quality of the pages that site links to. If a webmaster links to poor-quality or spammy sites, that can affect his or her site's ranking. As a program that actively engages in searching out links, Zeus can amplify that factor. Webmasters who use Zeus should be extremely careful - adding links to sites tagged as spam can lead your site to be tagged as spam, and Zeus can clearly play a role in that process."
Google Software Engineer and Spam Czar
If you only use Zeus to find potential link partners, you aren't at any risk. Automated emails are rarely appreciated by webmasters and have been proven to be less effective than personalized ones.
As for actually allowing Zeus to manage your directory of links... do so at your own (hearty) risk.
One of my customers suggested I answer your question.
OptiLink does not find or make links, but instead analyzes the links you do have and the ones you are considering.
It is a standalone desktop program built around a highly specialized browser, but the easiest way to explain what it does is to describe how to do the same thing manually with a standard browser and some office software.
1. Using your url, do a search at your favorite engine
for link:<myurl>. This will return a list of
every page that links to yours.
2. Next, visit every one of those pages, find where the
page links to you, and copy the link text that it uses
to a spreadsheet.
3. When you’re done with that, divide the links up into
separate words and use your spreadsheet program to tally
what percentage of links use each word.
About now you're probably wondering: why? Because this is pretty close to what the search engine is already doing to gauge the relevance of your site.
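For illustration, the tally in step 3 can be sketched in a few lines of Python. The anchor texts below are hypothetical stand-ins for the link text you'd collect in step 2:

```python
from collections import Counter

# Hypothetical anchor texts collected by hand in step 2
anchors = [
    "blue widgets",
    "cheap blue widgets",
    "widgets",
    "click here",
]

# Split each link text into words and tally them
words = [w.lower() for a in anchors for w in a.split()]
counts = Counter(words)
total = len(words)

# Percentage of link words accounted for by each word
for word, n in counts.most_common():
    print(f"{word}: {100 * n / total:.0f}%")
```

The spreadsheet does the same thing; this just makes the "tally the percentages" step concrete.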
Google is currently "conflicted" concerning OptiLink,
having taken a stand, retracted, and gone back again, but
this is only an issue for people _promoting_ the product
and will soon be resolved one way or another anyway.
Selling is always a very visible activity and prone to getting you shot at, but using OptiLink is completely invisible to both search engines and the linking sites you access, so what you do in the privacy of your own office is pretty much no one else's business but your own.
My experience from looking at a lot of Zeus pages is that Google gives PR0 to the links pages when it can. A links page with no pagerank lacks a certain je ne sais quoi!
But Google doesn't otherwise seem to penalize those sites -- only the links pages seem to be PR0'ed. There may be exceptions of course.
Me? I don't even link to Zeus sites when they ask me to. I tell them why and I suggest they move to a more conventional linking system.
I think I can solve that conflict. Our Terms of Service do not permit programmatic queries without permission. Someone who uses OptiLink may have their IP addresses banned or their domains removed from our index. It's that simple.
This stance should be pretty clear from our webmaster guidelines, but I'll check with folks to make sure that the next time we update them, we make it even clearer.
Added: just to clarify on one point, Windrose states that "using OptiLink is completely invisible" to search engines. As always, you may want to take such claims from the creator of a program with a grain of salt. Other people have claimed their programs were completely invisible before, and such claims have also been proven false before.
In all seriousness: please do not use programs to send any automatic queries (of any kind) to Google. Such programs take server resources away that should be used to serve queries to actual searchers.
Google's Terms of Service are at best ambiguous. ALL queries are "automated" to the extent that programs are the only means to make queries. IE is an "automated" query. If anyone doubts this, they should try doing one by hand. Or if you can't figure out how to get your hand stuck into an RJ11 jack, then use telnet. Mechanically that's less painful; of course telnet is a program (read: automated), but in trying to query Google with nothing but telnet you will soon find out just how much today's browsers "automate".
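To make the telnet point concrete, here is roughly what a "manual" query looks like at the protocol level: the raw HTTP request a browser builds and sends on your behalf. This sketch only constructs the request (it does not send anything), and the query string is a made-up example:

```python
# What a browser "automates" for you on every search:
# building and transmitting this request. Querying via telnet
# means typing exactly these lines, character by character.
query = "link:example.com"  # hypothetical example query

# Minimal percent-encoding for this particular query
encoded = query.replace(":", "%3A").replace(" ", "+")

request = (
    f"GET /search?q={encoded} HTTP/1.0\r\n"
    "Host: www.google.com\r\n"
    "User-Agent: telnet-by-hand\r\n"
    "\r\n"  # blank line ends the request headers
)

print(request)
```

Whether the characters are produced by fingers on a telnet session, by IE, or by any other program, the bytes on the wire are the same.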
What does Google imagine separates a standard browser, which presumably you would not classify as automated, from the programs that you presume the ToS bans? Programs that run unattended or run from server farms might arguably qualify, but what about programs that will not operate without the end user pressing the start button from the comfort of his own desk chair? OptiLink does nothing more nor less than fetch the same result page IE does when doing a link query, and it presents a very similar graphic to the user. This is in some way more automated than IE? A strange notion indeed.
But no more strange than Google's notion of what constitutes appropriate countermeasures, which in most legal contemplation would be instantly deemed both arbitrary and vindictive. Blocking an IP due to server load, a variation of denial of service, is reasonable and justifiable. But banning an unrelated domain? Or what about banning a site for linking to a banned site? Or banning a domain unrelated to the "offensive software" just because the whois data shows it is owned by someone who has made a public endorsement of said product? What about doing this three times in one month? I suspect that you know first-hand that these are more than just theoretical possibilities. Who knows, we might find out through discovery.
And speaking of that, to clarify your point of clarity as to what "Windrose states", my name is Leslie Rohde. Just who are you?
Talk is cheap. It's easy to level threats of retribution, and to create fear and doubt, shielded all the while by anonymity, but can you prove what you say? I don't think so.
In all seriousness(?). Compared to what? Banning someone's business without notice for 5 months because of a 3-sentence endorsement on someone else's sales letter? That's not serious?
OptiLink produces no "automated queries" -- assuming that you could even define what that means -- a conclusion reached by even the most cursory technical analysis.
"Such programs" take far fewer server resources than are used just to serve the Google logo. If you could show any legally cognizable damage I would withdraw my argument, but I have not heard of one yet. Maybe the Ninth Circuit can help us.
With the very best regards,
creator of OptiLink
The part about not misusing Zeus I can believe, but '...Google is happy with the quality...'? Can this be true or has Mr. Cutts been 'creatively quoted'?
[edited by: Brett_Tabke at 8:43 am (utc) on Nov. 26, 2002]
[edit reason] see tos - please, no email quotations at all - thanks [/edit]
I'm trying to be as unambiguous as I can and give a good-faith heads-up. You're welcome to take that good-faith heads-up however you want. I just wanted to clarify that queries from OptiLink are not welcome at Google, because they are triggered from a program.
Anyway, in case no one else has said it: welcome to WebmasterWorld! There's always lively discussion around here, and I hope you decide to stick around and take part. :)
I used to programmatically query Google for rank checking.
Then after WebmasterWorld took off here in '00, I started to get an old-fashioned first-hand education in the real-world effect of programmed requests.
It wasn't bad at first, with a couple thousand requests a day at most. Then our content started growing by multiples and the rogue bots started increasing. In '01, the worst of it wasn't just a couple thousand a day, it was tens of thousands. By mid-June of this year, it was a couple hundred thousand at some points. The scope of the problem can't be overstated. There were a few days when it was over 10 gigs of bandwidth just for bots. It would slow the site down and even knocked it offline a few times.
That's just another form of DOS attack. This is where the lightbulb started to go on, and I gained a newfound sympathy for anyone else who'd experienced the same.
You'd have to multiply our experience by a million to even begin to get a handle on the scope that Google has to deal with. When you throw in actual programs that were designed to do just that sort of thing, who could fault Google for looking at these programs in the same vein as the latest DDoS program bandied about on IRC channels?
I spend an hour a day here analyzing logs for those types of problems. I've worked on a detailed set of passive and aggressive countermeasures. In some cases, that has caused our members to become innocent bystanders [webmasterworld.com] and obviously hurt the operation of the site.
I also had a problem with the ambiguous nature of Google's TOS for a while. Then I tried to write one here to address the same situation of programmed bots. You spell it out, and 10 minutes later someone has figured out a way around it. Even this very post is a threat to the system, and I have no doubt it will result in some bot querying just to probe the edges and/or tweak me - whichever comes first.
It's not like Google is without alternatives for those who would like to use its data, either. They put the API in place and let you have at it up to 1,000 queries a day, and they offer both free and paid search services.
On the other end, they've taken every step they can to thwart the bots (DDoS attacks). When a program comes along that specifically targets their service, we can't blame them for doing everything in their power to protect their system. They are no different than us in that regard (just bigger, with much bigger pipes).
I guess that concludes my education.
Please excuse me for the next hour - instead of answering member email and messages, I'll be off studying log files for bots and paying bandwidth fees.
If the URL gets punished, then surely all your competitor has to do is use automated tools to query your site's positions. If the IP gets punished, then just do all your ranking queries from a dialup account.
Or am I missing something?
Now if only hosting was as cheap in South Africa as it is in the States... ;)
So I am in total agreement with Google's policy on automated queries.
>How can anybody do SEO for 1,800+ sites? That's insane.
that's what I have been finding out... but it's now my job and I have to do my best.
>Surely not as insane as anyone thinking they have enough content for 1,000 sites!
Coming up with the content for the sites is surprisingly easy. Optimising it, now that's the hard part.
>1800 sites? When do you sleep?
Sleep? What's that? I seem to remember something from a former life, but it's all clou...zzzzzzzzz....dy.
So Google doesn't like automation. I'm good with that; our company doesn't like automated things hitting our sites either. But I would like to pose this to Google: in order to stave off automated queries to the Google engine, why don't they offer reporting? It could be a pay service. I can guarantee that SEOs all over the world would pay through the nose for that service, more so than for AdWords or any other service that is currently provided. It would pay for more servers and more bandwidth, and it would make everyone happy. Google would be able to stop worrying about automated queries, SEOs would be able to stop running automated queries, and everyone would get the information they are looking for. The reports would allow Google to have more accurate information, and SEOs wouldn't have to piece things together; we could just bring up a page with all the information.
Great idea! Webmasters and SEO's would eat that up. You could pay a certain amount and register one domain, and Google would report links to you, positions for search terms, EXACT pagerank of your pages, impressions & maybe even clickthroughs, permit you to run WebPosition and similar software, etc.
None of this information would really make the slightest difference in terms of actually getting visitors, but there's a huge curiosity factor and people love to crunch numbers.
I think it would make a difference in getting visitors, because it would allow us to know whether our SEO efforts are working, and if not we could tweak them in order to get higher on the list, thus bringing in more visitors.
It would also allow us to go back to our customers with the reports and say "see, SEO is important".
I think that all search engines should implement reporting. Or, barring that, have a company that does search engine reporting: someone could set up a company with direct access to the SEs, so that they can run queries at low-traffic times or against a backup database, so as not to interrupt normal server usage.
Everyone wins (except the poor programmer who actually has to write the reporting engine), but with all the money this would bring in, the powers that be should be able to reward said programmer well.
All I ask is that when someone makes billions from this idea, they give credit for it to that guy on the WebmasterWorld forum (and maybe a car).
Google's data is proprietary - how the rankings look, how they are ordered, which excerpts they use, which sites are included - that's their ball game. Open that up to possible algo crackers, and it's all over but the finger pointing, the crying, and the tell-all book by GoogleGuy.
Lastly, who says there will be anything like permanent rankings from Google in the future? We've all watched how "everflux" has slowly walked down the PR scale, and freshbot activity is now constant and increasing. We might see a perpetual update and the end of the monthly updates very soon. That would mean rankings could fluctuate all the time.
It wouldn't expose the algo; all it does is report on your position, links to your site, etc. You're right, we could refine things so that we would have a good guess as to what the algo is, but Google has the ability to change that whenever they want. It would force both sides to be the best at what they do, and in the end the people who are searching would get excellent results. It would keep Google at the top of the SE pile. Users would come back because of the excellent returns, we would keep trying to get better by giving Google the ability to find the correct information to serve up, and the algo would be tighter in order to bring up the correct results, making Google's reputation even bigger. And the users would get accurate search results, which is what we all want in the end.
>Googles data is proprietary - how the rankings look - how they are ordered - which excerpts they use - which sites are included - that's their ball game. Open that up to possible algo crackers, and it's all over but finger pointing, the crying, and the tell all book by GoogleGuy.
Everything would still be proprietary; we would be fitting into what they [Google] want, again allowing Google to serve relevant information to the users.
>Lastly, who says there will be anything like perm rankings from google in the future? We've all watched how "everflux" has slowly walked down the pr scale and fresh bots activity is now constant and increasing. We might see a perpetual update and the end of the monthly updates very soon. That would mean rankings could fluctuate all the time.
All the more reason for the reporting. As it is now, every month Google gets slammed with webmasters attempting to see what their PR is and then trying to figure out why the PR is what it is. With the report, it would be one hit compared to the hundreds per site.
If the rankings are going to fluctuate constantly, there would be hundreds of hits per site per day, whereas if they had the report it would be at most one hit per site per day.
Why would they want to encourage SEO, when I would guess that it is SEO's that cause most of Google's problems.
I agree. It's not a right to rank well, it's a privilege, and if they follow the rules, SEOs only strengthen Google's results.
>Why would they want to encourage SEO, when I would guess that it is SEO's that cause most of Google's problems.
Google would want to encourage SEOs because it allows them to serve up more accurate results to their users. The problems come from poor SEO practices (i.e. spamming, keyword stacking, automated searching, etc.). If they encouraged proper SEO practices, they would only get stronger. Please see the above post (27).
The Google API is for "non-commercial use", which as far as I can tell from my contact with Google means any use other than as a hobby. It was expressly and explicitly ruled out, by Google, as an approach for OptiLink to take. The API is the best alternative technically, but has been withheld by Google.