I'm sure the "domain-name renewal metric" was mentioned in the goofy "trustrank" paper referenced here a while ago. (That's when I stopped reading it.)
Hmm, something has changed. My #3 site for the last year (occasionally #4) is suddenly #2 in approx 125 million results.
Well, registered in the past year is a whole different kettle of fish. We already know that's an issue.
Single year renewals OTOH are not a place to look for site penalties, or anything of the sort.
For the past 3 days the serps in my sector have not budged. For those days my site has had the same ranking on every datacenter...as have all of the others in the top 20.
Is anyone else seeing this calm? Or is there movement elsewhere?
I think it's a result of the patent discussed at [webmasterworld.com...] . Google did become a domain registrar some time ago, so presumably they have ready access to domain records.
|Where you guys getting this domain renewal thing from? |
They COULD be using this data as an indicator of a site's quality but I doubt it would be given significant weight. It could also be used as a "tie breaker" in a hand review I suppose.
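Just to make that speculation concrete: here's a rough Python sketch of how a registration-length signal could be folded into a score. The five-year cap, the scaling, and the function itself are all invented for illustration; nobody outside Google knows whether such a signal exists at all, let alone what weight it carries.

```python
# Hypothetical sketch only: a made-up "renewal signal" in [0.0, 1.0],
# NOT anything from the patent or Google's actual algorithm.
from datetime import date

def renewal_signal(registered: date, expires: date, today: date) -> float:
    """Return a small score bump for domains paid up far in advance."""
    years_paid = (expires - today).days / 365.25
    # Cap at 5 years: assume longer terms add nothing beyond that.
    return min(max(years_paid, 0.0), 5.0) / 5.0

# A one-year renewal yields a low signal, a five-year one nearly the maximum:
one_year = renewal_signal(date(2004, 6, 1), date(2006, 6, 1), date(2005, 6, 6))
five_year = renewal_signal(date(2004, 6, 1), date(2010, 6, 1), date(2005, 6, 6))
```

Even if they do compute something like this, a one-line tie-breaker like the above would explain why it could never be more than a minor factor.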
And as expected, changes on the serps always happen when I'm on my way to bed.
Again hitting another set of serps on my google.com. This time my beloved DC 188.8.131.52 and you may call it "The Mother Of All DCes" ;-)
And that was the end of The 0.5 Monday.
On April 19th 2005, in a thread called “Google's 302 Redirect Problem”, [webmasterworld.com...] , you gave instructions on how webmasters can contact Google to apply for re-inclusion requests because their sites had been hit with 302 redirect and/or duplicate penalty problems.
Everyone that followed your instructions and applied for re-inclusion just received a canned, automated, and irrelevant email response!
Q. What progress has been made regarding re-inclusion requests?
Interesting, when I go to 184.108.40.206 at the bottom is this
Go to Google.com [google.com]. I've never seen that before.
Google Update Bourbon Part 4
Is it possible, in the age of minimalism? Is there a least common denominator to these 4 posts?
Make it simple, as simple as possible, but no simpler.
[edited by: kgun at 11:46 pm (utc) on June 6, 2005]
Thanks Dave, interesting.
Man, there's a dreadful assumption there, that just because domains are not expensive, people with quality sites would necessarily renew their domains for umpteen years at a time.
The only businesses I know of that don't care about cash flow are large corporations and overfunded start-ups. Wait a minute, those companies care about it too. Maybe it's only those writing the algos that think cash flow doesn't matter. ;-)
There is of course some wisdom in renewing important domains for multiple years, just to be safe. The issue here is in assuming that all legit businesses necessarily do that, and assigning a negative score of any kind to those that don't.
|trashed registering your domain names for more than 1 year on each renewal? |
My larger site is registered 5 years at a time and it's doing fine, the smaller one is yearly and has really been hurt by Bourbon.
heavy crawling right now. This might mean that G is done tweaking at least for a while. Anyone see any SERP changes?
Walkman: "heavy crawling right now. This might mean that G is done tweaking at least for a while. Anyone see any SERP changes?"
Well, GG posted about something or other: [webmasterworld.com...]
I'm a slight bit lost as to what it means to "turn on the datacenters this week?" Are there changes which aren't yet visible to us and yet will be in the next week?
Also he'd mentioned how the final change should be less noticeable? I don't know but that sounds so very final. Does it mean that folks who got hit very hard don't have much to look forward to in the coming weeks?
My site position and links have rolled back to their previous condition.
|My site position and links have rolled back to their previous condition. |
Their condition Pre-Bourbon?
I haven't seen any improvement here at all.
My pages are being indexed but not being ranked at all. I have sent in a reinclusion request (dup. content fixed with a 301). I'm getting frustrated...
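For anyone wondering what the 301 fix looks like in practice: the usual duplicate-content case is the same site answering on both the www and non-www hostname, and the cure is to permanently redirect one onto the other. A minimal sketch (the canonical hostname below is a hypothetical example; in practice you'd do this in your server config rather than application code):

```python
# Sketch of the 301 duplicate-content fix: collapse duplicate hostnames
# onto one canonical host so crawlers only ever see a single URL per page.
# "www.example.com" is a placeholder, not anyone's real site.
def redirect_for(host: str, path: str, canonical: str = "www.example.com"):
    """Return a (status, Location) pair for a request on a duplicate host,
    or None if the host is already canonical and no redirect is needed."""
    if host == canonical:
        return None
    # 301 (Moved Permanently) tells crawlers to update their index,
    # unlike a 302, which says the move is only temporary.
    return (301, f"http://{canonical}{path}")
```

The same idea applies to any duplicate URL pattern, not just hostnames: pick one canonical form and 301 everything else to it.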
|I have sent in a reinclusion request |
Where do I send a reinclusion request? Could someone post the URL please.
I asked that before, but even though it was mentioned here, nobody has responded to me.
I found a link in an old post, but that's no longer working. :(
I think reinclusion requests should go to google.com/support/ with "reinclusion request" in the subject line
annej, we got a handful of sites with domains that renew annually, and got hit also. Then again, we got a bunch of sites that renew domains annually that are hotter 'n ever right now...including some neglected sites. Annual domain renewal means nothing. Can anyone really imagine a sophisticated SE, or even a stupid SE for that matter, basing any meaningful part of their algo on domain renewal patterns?
When such a day comes, the monkeys will be running the zoo. ;-)
Walkman says he is seeing heavy crawling. I am not seeing that just yet. I saw deep crawls on two subsequent days about 10 days ago...and that's it.
After reading billions of posts on Bourbon I am now coming to my first post regarding this topic.
First of all: My sites have been hit very hard. Not one site in the TOP30 and no ranking for any pages of the sites.
I have noticed some "things":
I have uploaded a new database with new content to one of my domains, which led Google to index 120,000 pages instead of the 40,000 before. All pages were indexed in a short time and everything seemed to be okay, but then I noticed traffic from Google decreasing while I was expecting to see it rise, due to the new content I added. Now I can't find any page of this site in the TOP300. It has come to my mind that it may not be such a good idea to add tons of content to your site at once.
On the one hand it could be a good factor to target spammers, because they will also upload tons of content in a very short period of time, but if it affects all other sites too, it is not a good filter.
Maybe we can hear a clarification from GG regarding adding lots of content to a site? :-)
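If a spam filter like that exists, it would presumably look something like the toy sketch below: flag any site whose indexed page count jumps past some growth factor in one crawl cycle. The threshold is pure guesswork on my part; the point is just how blunt such a filter is, since a 40k-to-120k jump from legitimate new content trips it exactly the same way a spam dump would.

```python
# Toy sketch of a hypothetical "content spike" filter. The 2x threshold
# is invented; nothing here is known to be what Google actually does.
def spike_ratio(before: int, after: int) -> float:
    """Growth factor of the indexed page count between two crawls."""
    return after / before if before else float("inf")

def looks_like_spam_dump(before: int, after: int, threshold: float = 2.0) -> bool:
    # Flag any site that more than doubles its page count in one cycle.
    # Note the obvious flaw: legitimate sites adding a big database, or
    # sites that launch big on day one, get caught just the same.
    return spike_ratio(before, after) > threshold
```

A 40k-page site tripling to 120k overnight would be flagged, while the same site growing to 45k would pass, which is why a raw growth ratio on its own makes for such a crude signal.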
Was the content you uploaded original content? Or was it from an affiliate database?
And regarding the original 40K pages of the site... had they been ranking well previously? And were they original content pages or also from a database?
" Walkman says he is seeing heavy crawling. I am not seeing that just yet. I deep crawls on two subequent days about 10 days ago...and thats it."
We saw the same thing. If I remember right someone posted of a deep crawl on their site and a day or two later we got a quick but deep crawl. I would like to see it hit us hard again real soon.
Good morning Folks
I see the same serps on my Google.com this morning. I'm hitting 220.127.116.11 and the serps look like those of late yesterday night my local time.
I read a very interesting line that our good friend GoogleGuy wrote this morning:
>We'll turn on one datacenter, and then the rest of the datacenters over the course of the next week or so. After the other changes that went out, this last change should be less noticeable.<
Maybe we should keep an open eye on those gorgeous rock n roll DCes for the next week or so.
Word on the street is that GoogleGuy told the folks at the Googleplex: leave reseller's site alone during Bourbon, because we already took away 75% of his referrals during Allegra and reseller has rent to pay, you know ;-)
Wish you all a great sunny day
I've read that those spikes in growth can lead to a sort of penalty, and from the experience and cases shown here it seems to be true to a certain extent.
But, now, guys, please, tell, don't you think that's the most idiotic of all penalties?
I mean, really, come on, now are they going to dictate the pace at which a site may grow!?!?
Let's say a big retailer, e.g. Walmart, puts 3 extra pages for each of its products up overnight; are they going to be penalized? Well, of course they will not, since they seem to be on a sort of whitelist.
But, for the average joe, is that just not possible?
What about a site that is born big? What if that site, instead of having 40k pages and then adding 80k, is released on the first day with all 120k pages? Will that be okay, then?
That doesn't make any sense at all!
I looked at my stats and I see deep crawls on May 27 and May 28. Hit all pages mostly from the Mozilla/5.0 Googlebot. Spider activity has lessened since then.
A dumb question, but how do you differentiate between a "deep crawl" and one that only gets to your index page?
Billy: I look at my access_log files. Your host ISP has them someplace or another.
For me, a deep crawl is when G or Y fetches many or most of my pages. The opposite is when they just look at my robots.txt and index files, then go away, or else don't show up at all. -Larry
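To put that rule of thumb into something you can run against your logs: count the distinct URLs the bot fetched per day. A day with many URLs is a deep crawl; a day with only robots.txt and / is a shallow visit. This is a rough sketch assuming the standard combined log format your host's access_log usually uses, and it matches the bot by a simple substring in the user-agent.

```python
# Rough sketch: distinct URLs fetched per day by a given bot, from
# combined-format access_log lines. Matching on a user-agent substring
# is naive (it can be spoofed), but fine for eyeballing crawl depth.
import re

def crawl_depth_by_day(log_lines, bot="Googlebot"):
    """Map each day (e.g. '27/May/2005') to the number of distinct URLs
    the bot requested that day. Many URLs = deep crawl; one or two
    (robots.txt and the index page) = shallow visit."""
    seen = {}
    pat = re.compile(r'\[(\d{2}/\w{3}/\d{4})[^\]]*\] "GET ([^ ]+)')
    for line in log_lines:
        if bot not in line:
            continue
        m = pat.search(line)
        if m:
            day, url = m.groups()
            seen.setdefault(day, set()).add(url)
    return {day: len(urls) for day, urls in seen.items()}
```

Feed it `open("access_log")` and compare the per-day counts against your total page count to see whether a visit was a deep crawl or just a robots.txt check.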
Just seen about 400 more of our lost pages show up in some of the datacenters. Bad news is that many of them that appeared are showing Dec/Jan/Feb cache dates.
18.104.22.168 and 22.214.171.124 are a couple datacenters showing this.
The database is a kind of affiliate database.
But you won't find this kind of db anywhere else on the web. It's like a combination of several affiliate dbs, because our website compares products, and for several products we write our own quality reviews, which are unique.
With the 40k pages the site ranked very well.
Another thing I have noticed for several months, maybe years:
My site is written in German and targets the German-speaking audience, meaning Germany, Austria and Switzerland. My site is a .info domain. Now Google displays the "Translate this page" link on google.de, which means that Google thinks my page is NOT German.
I then checked google.com in English and voila, my page ranks at #12 while it is at #151 on google.de. It should be the other way round! I had noticed this before, but there was never such a difference in ranking. I tried everything to remove this effect (meta language, ISO, just everything) but nothing helps; Google keeps thinking my site is English, and maybe that is one cause for the drop in ranking and, accordingly, the difference in ranking on .de and .com.
GG, what is the best way to make it clear to Googlebot that a site is in German, not in English?
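Since the meta tags aren't being honored, the crawler is presumably classifying the text itself. One common approach is a stopword count: tally how many of the page's words come from each language's most frequent function words. The tiny word lists below are my own sample, not anything Google publishes, but a sketch like this shows why short pages, or pages mixing German product names with English affiliate-feed text, can get misclassified.

```python
# Naive stopword-based language guess, to illustrate how statistical
# language detection can misfire. The word lists are tiny illustrative
# samples, not a real detector's vocabulary.
GERMAN = {"der", "die", "das", "und", "ist", "nicht", "mit", "ein", "eine"}
ENGLISH = {"the", "and", "is", "not", "with", "a", "an", "of", "to"}

def guess_language(text: str) -> str:
    """Return 'de' or 'en' based on which stopword list matches more words.
    Pages with little running text give weak evidence either way."""
    words = text.lower().split()
    de = sum(w in GERMAN for w in words)
    en = sum(w in ENGLISH for w in words)
    return "de" if de > en else "en"
```

If something like this is what the bot runs, the practical fix would be more actual German prose on each page (reviews, descriptions) rather than more metadata, since template-heavy pages full of product names look language-neutral to a stopword counter.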
Anyone seen any re-inclusion requests go through?
And also I guess the next thing is to guess which DC may be the future.
[126.96.36.199...] - looks a bit different to me. I am sure it will become clearer as the days unfold.