Welcome to WebmasterWorld
If the purpose is to serve alternate pages to different human users, based on locality, browser, machine type etc., we do not consider that cloaking.
That's really splitting hairs. Reads to me like the SEs reserve the right to split those hairs based on the intent of the page. To put it another way, they can't come up with a working definition of cloaking either. There are just too many types of cloaking these days to say which is OK and which is not, so they have to base it on the intent of the page.
It seems like this is all coming full circle here, because that is exactly the way we feel about spam in the forums. We can't come up with a definition of spam, because ten seconds later someone figures out a legit way around the definition.
This one is very welcome:
Q: Noting that you do not want multiple sites which have the same content, I registered many domain names, for instance our name in .com, .net, .org and .co.uk. I pointed them all to our main Web server, so each domain serves exactly the same content. Is this OK?
A: Yes, this practice is common and considered reasonable.
What is considered "spam"?
Inktomi defines spam as an inappropriate use of Inktomi's search engine: any effort to deceive the search engine into returning a result that is unrelated to the query, or whose position has been artificially inflated in the result set.
From the "Examples of Spam" section:
Cloaking/doorway pages that feed Inktomi crawlers content that is not reflective of the actual page
To go from that statement to one that basically says Ink doesn't want any pages that differ from what a human sees, seems like a pretty major shift in direction to me.
So the question in my mind is what does this mean for their partners? Their original statement on cloaking always read to me like it was intentionally written to give a few of their partners some wiggle room, but the new statement has clearly removed it.
Do they give their cloaking partners a secret pass, or are they really going to force them to clean up their acts?
Yes, that statement seems straightforward to me as well. I can definitely see the difference between designing pages for optimal layout (pages that people actually see, created to enhance their user experience) as opposed to creating pages that "the web public" never sees, served up only to the spidering bots specifically to affect search engine placement.
That's a hell of a brush to be painting with...
So, in other words, if I do all the right things for a client and get them to rank in some reasonable position below the directory and Go-verture listings, Ink may consider that spam, because it was "artificially done"???
And I get to pay for that opportunity???
>Yes, that statement seems straightforward to me as well.
Every page on this site is dynamically generated. With simple logging, we can detect:
ip address, domain name, browser make & model, referring string, and cookies
That gives a site the ability to generate pages based upon:
Country of origin, some speed info in domain names (dsl/adsl/dialup/ethernet/cable), text, no text, css support level, plugins support, encoding support, language support, screen size, graphic level (png/gif/accept), activity level, referring string, time on site, path through site, time between pages, and cookie status.
That also gives us a chance to deliver pages based upon native spoken language.
We can feed IE, NN, Opera, Lynx, and WAP in HTML 3.2, HTML 4.0, and XML flavors, as well as several different spoken languages using server-based translation (which is installed).
Umm, which one of those pages should I give the search engines? Or should I just do the right thing, add a generic "search engine spider" entry to the agent list, and feed them the Lynx page?
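Just to make the point concrete, here is a minimal sketch of the kind of user-agent based variant selection being described. All the names, variant labels, and spider tokens here are hypothetical (the posts don't show any actual code); a real setup would also look at IP, Accept headers, cookies, and so on, as listed above.

```python
# Hypothetical variant picker: classify the request by User-Agent and
# return a label for which flavor of the page to serve.
KNOWN_SPIDERS = ("googlebot", "slurp", "msnbot")  # example spider tokens only

def pick_variant(user_agent, accept=""):
    """Choose a page variant from the User-Agent (and Accept) headers."""
    ua = (user_agent or "").lower()
    if any(bot in ua for bot in KNOWN_SPIDERS):
        # per the post above: feed spiders the plain Lynx-style page
        return "lynx-html32"
    if "opera" in ua:            # check Opera before Mozilla; Opera UAs
        return "opera-html40"    # historically start with "Mozilla"
    if ua.startswith("mozilla") and "msie" in ua:
        return "ie-html40"
    if ua.startswith("mozilla"):
        return "nn-html40"
    if "lynx" in ua:
        return "lynx-html32"
    if "wap" in ua or "vnd.wap.wml" in accept:
        return "wap"
    return "generic-html32"      # safe fallback for anything unknown
```

The irony, of course, is that the "do the right thing" option in the post, treating spiders as just another low-capability browser, is implemented with exactly the same mechanism a cloaker would use; only the intent differs.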
Seems muddier than ever to me. There is no working definition of cloaking that can be used; it's all about intent. Hence the opening statement, "If the purpose is to serve alternate pages."
But.. I will tell you more when we start with that! :)
So, is it possible that INK sees the PFI pages as an SEO thing, so that a little shift in the algo means those SEO pages (that were ranking) are now in the toilet?
And do they see IC as the heavy hitters, so those pages have now been moved to the top?
Inktomi's current treatment of pages paid for in good faith has got to be the best thing that has ever happened to Overture.
None of this leaves me with that "reach for my visa card feeling".