1script - 7:14 pm on Nov 26, 2012 (gmt 0)
Sarge, thanks for the tip! I have done that in the past, though I admit I have never been diligent enough with this technique. I did, however, reason that if I visit my own post a split second after it's been published AND the post has AdSense code on it, the Mediapartners bot will do the same job for me. The question is how quickly the bot shares the data with the main indexing engine, and whether the date stamp actually survives. That said, I have never had problems with quick indexing/caching of the page per se; it's either the AdSense code triggering an immediate Mediapartners bot visit, or the simple Feedburner ping for the new RSS item (I haven't tried the RSS delay yet).
Anyhow, thanks for your tips, guys, but the reason I started this thread was not to ask whether there's *any* way to ping Google about a new page (I've been using submission forms, and later sitemap and RSS pings, since last millennium) but rather what on Earth that "fat ping" is that MC talked about (and tedster later echoed), and how it could possibly help with establishing authorship.
The particular issue I'm struggling with is not the fact that you ping an open hub (I assume the hub retains the date stamp, so your post will carry an older date stamp than the scraper's copy) but rather what benefit there is in sending "fat" pings (XML with the full content) as opposed to "slim"(?) pings carrying the URL only. Can Google now index a page before even visiting it, based solely on the content of the fat ping? I sincerely doubt it; they don't trust self-created signals. What other benefit could there possibly be?
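To make the distinction concrete, here's a minimal sketch of what the two kinds of ping might carry, assuming a PubSubHubbub-style hub. The hub endpoint is out of scope here; the post URL, title, and timestamp below are hypothetical placeholders, not anything from an actual hub's spec:

```python
# Sketch: "slim" ping vs "fat" ping payloads (hypothetical post URL).
from urllib.parse import urlencode

POST_URL = "https://example.com/blog/new-post"  # hypothetical

# Slim ping: carries only the URL. The receiver (hub or search engine)
# still has to crawl the page before it can see any content or author info.
slim_ping = urlencode({"hub.mode": "publish", "hub.url": POST_URL})

# Fat ping: the notification carries the full content inline as Atom XML,
# so the receiver has the text, link, and publication timestamp
# before (or without) any crawl of the page itself.
fat_ping = """<?xml version="1.0" encoding="utf-8"?>
<entry xmlns="http://www.w3.org/2005/Atom">
  <title>My new post</title>
  <link href="{url}"/>
  <published>2012-11-26T19:14:00Z</published>
  <content type="html">&lt;p&gt;Full post body travels with the ping.&lt;/p&gt;</content>
</entry>""".format(url=POST_URL)

print(slim_ping)
print(fat_ping)
```

The point of the question above is exactly this gap: the fat payload gives the receiver content plus a timestamp up front, but only if the receiver actually trusts and uses that self-declared data instead of re-crawling.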