I've no idea Brett. All I do know is that it's about time we started getting nervous (it's been 3.5 months since the last update), and that whatever they do update, the 301 and 302 problems are here to stay. :(
|Why not use a plain-vanilla link? |
The reason is that I have had a large and successful site up and running since the nineties and have never had my site stolen -- despite the fact that my site gets hit numerous times every day by consumer spiders (Teleport pro, Wget, etc.) and by spiders that masquerade as Internet Explorer. I've had individual pages copied, but never very much because the text isn't of much value without the links.
I'm confident that my collection of tens of thousands of web resources, organized and annotated, is copyright protected as a unique database of information, but I have no wish to get into trying to enforce that after the data has been stolen, in a grey area of the law, probably in a foreign country.
(I'm saying this for GoogleGuy's benefit as much as yours. I really hope that Google never takes it into its head to penalize redirects.)
[edited by: jomaxx at 3:57 pm (utc) on Sep. 8, 2005]
|nsqlg, still doesn't make sense. What about a noindex meta tag? If you really want to keep Google off the page surely that's better than making all the links rel=nofollow. |
1) don't waste our bandwidth/CPU
2) don't keep Googlebot busy with pages that don't need to be indexed, instead of keeping the indexable pages fresh.
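(The bandwidth point is the key difference: a noindex meta tag only saves index space, not bandwidth, because Googlebot still has to fetch the page to see the tag. A robots.txt disallow stops the fetch itself. A minimal illustration, with hypothetical paths:

```
# robots.txt -- keeps compliant bots from fetching these areas at all
User-agent: *
Disallow: /print/
Disallow: /search/
```

whereas `<meta name="robots" content="noindex">` still costs a full page fetch on every visit.)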
jomaxx, I wrote a Perl script that calculates a request-rate average (with some rules: 10s, 30s, 1m, 5m, 30m...) and bans the IPs of these robots through the firewall. It works very well: we ban about 30 IPs a day ripping our site, with only 1 or 2 false positives. It's very fast too because it runs in the background, parsing access_log on-the-fly. If you like Perl I can help you.
(It really is very hard to communicate when writing bad English... sorry)
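For anyone curious, the rate-average banning idea described above can be sketched roughly like this (a hypothetical Python sketch, not the actual Perl script: the window/limit numbers and class names are invented for illustration, and the firewall call is only hinted at in a comment):

```python
from collections import defaultdict, deque

# Window/limit pairs are illustrative; the real script's rules differ.
WINDOWS = [(10, 20), (60, 60)]  # (window in seconds, max requests)

class RateBanner:
    def __init__(self, windows=WINDOWS):
        self.windows = windows
        self.hits = defaultdict(deque)   # ip -> recent request timestamps
        self.banned = set()

    def record(self, ip, ts):
        """Record one request at time ts; return True if ip is banned."""
        if ip in self.banned:
            return True
        q = self.hits[ip]
        q.append(ts)
        longest = max(w for w, _ in self.windows)
        while q and ts - q[0] > longest:     # drop stale timestamps
            q.popleft()
        for window, limit in self.windows:
            if sum(1 for t in q if ts - t <= window) > limit:
                self.banned.add(ip)
                # the real script would add a firewall rule here, e.g.
                # iptables -A INPUT -s <ip> -j DROP
                return True
        return False
```

Feeding this from a tail of access_log (IP plus timestamp per line) gives the same "ban rippers, spare normal visitors" behavior described above.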
I'd add a spider trap to that algo if you aren't using one. Nonetheless it sounds like you're seeing the same thing I see -- a site hammered by bots of unknown purpose coming from all over the world. I think I'll keep using the protection I have for as long as possible.
Maybe it was just a 3 hour tour of the algo? ;-)
Yup, maybe it'll really start when the "weather" starts getting rough?
I am waiting on Mary Ann to bring me a coconut creme pie.
All I do know is that it's about time we started getting nervous (it's been 3.5 months since the last update).
From the site I watch that keeps a record (and my own research confirms this), the last PR update was on July 14, i.e., almost 2 months ago.
Previous PR updates were every 3-4 months as follows:
Apr 22, 05
Jan 1, 05
Oct 6, 04
Jun 22, 04
Am I wrong about July 14? Hope so, because that would mean another one is imminent.
If memory serves me, the April PR update was the last one. I don't watch those too much anymore, but I certainly don't remember any PR changes in July.
The last PR update was on 14 July.
@Lorel: Yup, 3 months. We're talking algo update (not flux). Why is everyone confusing that with a PR update? For those jumping in without reading previous pages - there is no algo update going on at present. Both GG and BT have confirmed that so there's a good chance it's true.
Just woken again in New Zealand. Same question: why is there still an update headline on the main page when there isn't one, or is naming backlink updates the new standard? It seems a bit sensationalist and confusing for people.
Just in case you don't read oddsod's post
THIS IS NOT AN UPDATE >> LOL
Same question DaveN, why does the headline still say it is?
I really don't wish to be critical, but by the time some people get to this thread, your and GG's definitive statements will be 5 pages back.
Hello GoogleGuy, nice to see you here.
Just received a reply from the Google Team that my message has been passed to Google's Engineering Team for further investigation. This follows an exchange of messages regarding a sitewide penalization of my site that won't allow it to appear in the first two pages of search results, even for search strings exclusive to my site. I explained what I did to remedy the situation: mainly redirecting non-www to www, re-validation of HTML code, removal of interlinking that might seem excessive, and a few other minor issues. Any idea how long I should expect to wait for my site to be examined by your engineers? Thanks a lot.
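(In case it helps anyone with the same non-www problem: the usual Apache way to do that fix is a 301 via mod_rewrite in .htaccess. A sketch with a placeholder domain, assuming mod_rewrite is enabled; adjust to your own setup:

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The important part is R=301 rather than the default 302, since the temporary-redirect problems are exactly what this thread keeps coming back to.)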
|Hey, joeduck, nice to see you. Things at the plex are the same as usual (crazy busy) |
...and please tell Dr. Schmidt, who I had the honor of meeting at the Google Dance, that I know he can take Ballmer in the upcoming one on one / hand to chair combat match
They should have known that an update called Gilligan would be messed up. Where is the Skipper when you need him?
I am seeing a relaxation in the duplicate content "filters". I know of several sites that have loaded exactly the same content onto two domains (or that have two domains pointing at the same webserver). One set of pages has always been filtered out of the SERPs, but now both appear, with one set of pages marked as Supplemental. I see this for several such sites. This is new, sometime today.
> But just to repeat: there's no algo changes going on right now.
It is time we asked the question, then what is an update?
- new serps
- new backlinks
- new numbers of results
- new pr
- fresh filters applied
If that isn't an update, then what is?
Where is the Tipping Point?
Has the term "update" become irrelevant?
...If not for the courage of the fearless crew, The Minnow would be lost
I'm with the Duck!
Jeez. I can see pages from www.google.es in the main Google SERPs; even one like: www.google.es/ie?hl=es&lr=&output=sch&q=related:***********.***
What is going on?
|I am seeing a relaxation in the duplicate content "filters" |
I sure hope so but not for our site yet - I think we've suffered from that filter since Allegra due partly to the filter's ruthlessness but also due to our own configuration mistakes (302s rather than 301s, multiple domains, thin affiliate pages, etc.)
My latest notion is that large directory sites like ours quite reasonably have many very different pages for a city/state, but the filter assumes (reasonably in most cases, but not ours) that a large number of pages from the same site for the same term indicates spamminess.
... and then of course it could just be the wrath of the invisible evil algorithmic demons
|...then what is an update? |
I say SERP results.
Since Brett indicated he's seeing SERP changes but GoogleGuy says "no update", I'm conflicted ... and sad...and...and...I just can't handle this anymore!
I'll go with DaveN's 2 weeks - he calls a lot of shots right on.
|...If not for the courage of the fearless crew, The Minnow would be lost |
I agree Brett - a very important philosophical observation.
[edited by: joeduck at 7:39 pm (utc) on Sep. 8, 2005]
>If that isn't an update, then what is?
Quite right, Brett Tabke, it is technically an update... it's just that when you give something a name, one usually expects something big to happen, like major SERPs changes.
Don't worry though, let's just call it Brenda and forget about it (waves to Brenda as she leaves the building)
Is it an update or not? Who really knows, but the important question still remains:
Ginger or Mary Ann?
In his blog, Matt Cutts writes "Technically Update Gilligan is just backlink/PageRank data becoming visible once more, not a real update". Straight from the horse's mouth (I mean, not that Googleguy is a horse).
My Brenda was ginger :)
An update is big changes in the SERPs, nothing else. New backlinks can give a hint that we are close to a new update, but this now is no update.
Brett - recommending you post a link to the excellent summary of "Google update" at MC's blog.
:-( : Update...false alarm Sept 2005
What *is* an Update?
That's very interesting. I still think that Brett and GoogleGuy were describing two sides of the same coin.
Brett called it an update while GG said it's just everflux and no algo changes are going on right now.
IMHO, we are witnessing a new kind of Google update, which doesn't necessarily include algo changes.
Actually if you study GoogleGuy's posts in connection with Allegra and Bourbon and now, you discover that he has been telling us that what we usually understand as an update has changed to something new, which he prefers to describe as "everflux": it doesn't include algo changes, while an "update" must.
That means that today everflux actually covers the changes we used to see in the old updates, for example:
- New SERPs (but they are ever-changing)
- Sites drop or gain in ranking
- New backlink count