| 3:41 am on Oct 19, 2008 (gmt 0)|
This clarification is certainly more detailed than the short blurb on Google News Help [google.com].
The first-click-free program seems like a pretty solid idea to me, especially because it's opt-in for the publisher. I am surprised that every page of a multi-page article needs to be made available to visitors who arrive from Google search, but that's a judgment call, and I could see it going either way.
I'm curious about sites that are using the program - whether it increases registration or not. My guess is that the raw number of registrations would increase, but the percentage of visitors who register would decrease. Am I right?
| 4:13 am on Oct 19, 2008 (gmt 0)|
"Again, if a user comes in from Google multiple times a day, he should be able to see the full content of the article (as it is crawled and indexed by Googlebot) every time he comes in from Google. It is not limited to the first time that the user comes in from Google"
Lame... I suspect that means they could basically search your entire site via Google without ever registering.
| 5:15 am on Oct 19, 2008 (gmt 0)|
blaze, some people surely will - but you're assuming people are smart. Many "less smart" people will find an article, say xyz.com/articles/something.html, from Google. Then they bookmark it, or try to go back to that page directly rather than through Google, and they hit the registration requirement - and many of them will register.
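To make the mechanics concrete: the behavior being discussed boils down to a referrer check on the server. Here's a minimal sketch, assuming a hypothetical helper on a site's backend - the host list and function names are illustrative, not anything Google prescribes:

```python
# Hypothetical first-click-free gate: show the full article to
# visitors arriving from Google (or to registered users), and the
# registration wall to everyone else. Host names are examples only.
from urllib.parse import urlparse

GOOGLE_HOSTS = ("google.com", "google.co.uk")

def referred_from_google(referer: str) -> bool:
    """True if the Referer header points at one of the Google hosts."""
    if not referer:
        return False
    host = urlparse(referer).netloc.lower()
    return any(host == h or host.endswith("." + h) for h in GOOGLE_HOSTS)

def choose_view(referer: str, registered: bool) -> str:
    """Per the quoted policy, every Google-referred visit gets the
    full content, not just the first one."""
    if registered or referred_from_google(referer):
        return "full-article"
    return "registration-wall"
```

Note that this also illustrates the loophole raised above: nothing here counts visits, so a visitor who always clicks through from Google never sees the wall.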
| 10:06 am on Oct 19, 2008 (gmt 0)|
The point being that any content delivered to the googlebot should be available to users, otherwise it's dishonest "cloaking".
| 10:47 am on Oct 19, 2008 (gmt 0)|
We use geo-targeted content delivery and serve zero content - a spicy 403, with fries - to users from several countries where we simply do not do business. And we honestly couldn't give a rat's %$# whether we rank on any Google.tld except .COM and .CO.UK in the US and UK (area).
Let's say we have ranked #1 for "Red Widgets" for the past 5 years. That page has 30 links to individual Red Widget pages. Are those considered "every page of a multi-page article"? Or will G simply treat them as ecommerce catalog pages?
My question is: what is the point of the "Cached" link in the Google SERP then?
| 11:55 pm on Oct 19, 2008 (gmt 0)|
Freebie hunters might be interested in doing a site: search just to get multiple 'first clicks'.
| 12:29 am on Oct 20, 2008 (gmt 0)|
Sure, that might happen - as with any change, you should test the results. The question is, even with freebie hunters, would there still be a valuable increase in conversions for your site? If there isn't, then you drop out of the program and stop googlebot from indexing those pages.
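For reference, dropping out of the program and keeping Googlebot off those pages could look something like this - a sketch only, with an illustrative path:

```
# robots.txt - block Googlebot from the restricted section
# (/articles/ is a hypothetical example path)
User-agent: Googlebot
Disallow: /articles/
```

A noindex robots meta tag on the individual pages is the other common route, and unlike robots.txt it also removes already-indexed pages from the SERPs.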
| 2:29 am on Oct 20, 2008 (gmt 0)|
I would like to know what they mean by multi-page as well. What if it is a book? Is the document the chapter or the whole book? Also, what if you serve an image of the article to the user and text to Google?
| 3:07 am on Oct 20, 2008 (gmt 0)|
The second part of your question is the easier of the two. It seems clear to me that a site should serve Googlebot the same thing it serves to other visitors; Google would consider anything else a deceptive form of cloaking.
The first-click-free program was originally conceived for news sites like the NY Times that require registration or, in some cases, subscription to view content. So the language that Google uses is focused around "articles".
How the policy applies to other types of multi-page content is not so clear, but the intent is not hard to discern. I doubt Google expects that access to one page of a book should imply access to the entire thing.
| 3:12 pm on Oct 20, 2008 (gmt 0)|
I don't completely understand the benefits of this but the downside seems clear:
If you don't do what Google wants, they will deliberately promote other pages above yours?
This isn't just SEO; this is Google dictating the rules. Is no one going to point that out?
Mission creep, I'm just saying...
ps. Does Google in stealth mode always send a referer? I don't see that consistently.
| 4:45 pm on Oct 20, 2008 (gmt 0)|
Sites want Google to view the entire site so that every page they have has the potential to rank in Google. On the other hand, they don't want visitors to have the same kind of access without a free or paid subscription.
The reason I mentioned showing an image to visitors and text to Google was so that people cannot easily download your content and steal it.
One more question: does this change what Matt Cutts said about the whole WebmasterWorld cloaking issue? I have based what I do now on that.
| 5:32 pm on Oct 20, 2008 (gmt 0)|
I see it as a way for sites with restricted content to get that content indexed and enjoy the free Google traffic - without cloaking. If a site doesn't want to allow Googlebot access, then it doesn't have to.