Forum Moderators: Robert Charlton & goodroi
No, that is a zero-sum game. The most useless posts here are from people saying the serps on some datacenter suck or are good because their own stuff ranks badly or well on that datacenter. Not only does nobody else care, there is someone out there thinking the exact opposite because of how their own stuff is ranking.
In any case (repeating my mantra from the past several updates), a lot of folks should consider that screw-ups are not deliberate policies. Google has been a technical mess for more than a year now - just over two years, really. Allegra was just a blip of an update, but it was a huge technical disaster. Google also has a horrible time figuring out canonical pages, particularly when webmasters deliberately do inconsistent things.
This update seems to me to be another minor bit of shuffling, with the added "bonus" of a lot of anomalies, most caused by lazy or uninformed webmastering (meaning if you have been reading WebmasterWorld and haven't had a 301 in place for non-www and www since at least last summer, you have only yourself to blame).
I see almost no changes in my niches, except... a HUGE increase in straight redirect domains. This tactical trash gets discovered fairly quickly, but apparently a new tactic has been found and needs to be squashed. Authority sites are performing the same as recently; sites still in the sandbox have been dumped back to pre-Allegra levels, while sites that got out of the sandbox with Allegra are doing a bit better.
Spreading from where? - the 7s - I have not seen that spread.
One thing that I have noticed: a site that definitely had a problem and has not been crawled recently. The problem was fixed 6-8 weeks ago and the homepage started ranking OK; however, it was a low-PR site and was still not deep-crawled.
Yet this site is starting to get traffic from supplemental pages crawled at the beginning of November. These supplemental pages are beating competition from non-supplemental pages.
Is anyone else seeing supplemental pages appearing higher up in the serps than you would imagine?
I'm repeating myself... I was #1, Bourbon made me #145.
Here we go.
When I looked at the general "free widget plans" in allinanchor, allintitle, and allintext, my site didn't show up when Bourbon started.
Until now. Now it's in first place for these commands.
It's not showing up in the rankings, but I hope it will, soon.
Is Google gradually resetting everything to zero, and building up from scratch?
How long will it take for the update to finish?
We can all speculate; nobody really knows but Google, and I'm not so sure about that either. My observation, based mostly on an odd pattern of googlebot visits, and secondarily on rankings: the update is nowhere near complete; maybe approaching mid-cycle.
On a very small site (< 1,000 pages) this morning, the 25th, googlebot has visited about 700 times. On the 23rd, in a 9-hour period, it visited the index page 27 times in succession without requesting other pages, then went back to its usual random spidering. Total visits for May: around 4,800.
That said, despite what seems like rather heavy spidering for such a small site, allinurl continues to report: URL-only results, non-www (301 in place), and cache dates on some pages from Oct '04 and Mar '04. Pages which were accidentally nuked by the non-www issue discussed a few weeks ago are not back, despite my following GoogleGuy's reinclusion instructions, though googlebot is spidering those pages. As for serps: down in the hundreds. The one 3-keyword search which ranked about #3 in the past was for a page/section which got nuked.
Just my $0.02. Comments?
Pre-update DCs
66.102.9.99
66.102.9.104
Updated DCs
216.239.63.99
216.239.63.104
216.239.57.99
216.239.57.104
216.239.39.99
216.239.39.104
72.14.207.99
72.14.207.104
66.102.7.99
66.102.7.104
64.233.161.99
64.233.161.104
Updated DCs + without sandbox filter
216.239.59.99
216.239.59.104
64.233.163.10
64.233.167.10
66.102.11.99
66.102.11.104
Don't think it is as simple as that.
I think some DCs are updated in some ways (e.g. backlinks),
other DCs are updated in other ways (e.g. crawled data),
and other DCs in other ways still (301 pick-up for some non-wwws),
etc.
etc.
And it looks like they may still be adding other things to the mix.
I would put the 72s by themselves still and not call 66.102.9.* pre-update.
I might say that 66.102.7.* and 216.239.53.* sometimes look pre-update too.
I have not got enough information to look at sites which may or may not be in the sandbox.
OK, is no one else seeing the supplemental results coming up high?
Just did a search for a product and top 10 was all supplemental with the following cache dates.
Pos 1 - 30th April 2004
Pos 2 - 27th May 2004
Pos 3 - 7th May 2004
Pos 4 - 18th May 2005
Pos 5 - 9th May 2004
Pos 6 - 27th April 2005
Pos 7 - 1st Nov 2004
Pos 8 - 17th April 2004
Pos 9 - 2nd Nov 2004
Pos 10 - 18th May 2005 (the only non-supplemental result - but it is a redirect)
As you might imagine, there are a few 404s and redirects to homepages in all that, as products are no longer available.
I have to admit, I don't even know why this is necessary, but if it helps Google crawl my site, or if somebody searches for me without the www, I guess this is supposed to do the trick.
If I am incorrect on any of this, please let me know and/or correct me in a subsequent post.
Here's what I know: the code below is correct as far as I know. Discussion on a couple of points follows.
Below is the code I am using to redirect non-www to www on an Apache server (Unix).
The following should be placed in a plain text file named ".htaccess". That file should go in your root directory, which I'm assuming for my purposes is my www directory. I have the following structure in my ftp setup: / -- home -- mysitename -- www -- subdirectories.
Added: OK, now I'm really confused. I am on a shared server. In directory / are all kinds of files I don't mess with like bash_history, bin, boot, etc. In directory "home" are everybody's sitenamed folders, including mine. In mysitename directory are more files I don't mess with, bash_history, .pinerc, .redirect (I have to look at that one), etc. plus my www folder. I now have .htaccess file in both the mysitename and www directories. Somebody please help me out here?
I am assuming that the www is what most of you are calling "root". Please correct me if I'm wrong.
-------------
Options +FollowSymLinks
AllowOverride FileInfo
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
--------------
The first two lines were suggested by others here and seem OK, though I have little idea what they do. "example" must be replaced by your actual site name (what's in between www. and .com). I did not know this at first, but questioned it and realized it made sense. I learned to write articles assuming that the reader knows nothing. The same should apply to passing on important code, because chances are, the reader knows only a little. No offense to Powdork, who passed me the code to begin with (I'm actually very grateful to him), but I did throw the thing up with the word "example" intact instead of my site name. I don't believe it made any difference.
Lines 4 and 5: I have seen various iterations of these lines, but what's important, I believe is that if you want to redirect non-www to www, you have it the way I do above.
I have seen on line 4,
www.example\.com
and
www\.example\.com.
Narrowly speaking, I think if you had www in both lines 4 and 5, this would redirect only www to www, accomplishing nothing. I may be wrong.
Also, from my limited knowledge of Perl, I would think the 2nd iteration, with the backslash after the www, would be correct, and you would use that on line 4 if you want to do the reverse, i.e., redirecting www to non-www. In that case, you would leave out the www on line 5.
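For illustration, here is a sketch of that complete reverse (www to non-www) redirect, again using the hypothetical example.com - as always, substitute your own domain. The [NC] flag and trailing $ are my additions: [NC] makes the host match case-insensitive, and $ anchors the pattern so only that exact hostname matches.
-------------
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule (.*) http://example.com/$1 [R=301,L]
-------------
If I have this right, a request for www.example.com/page.html should come back as a 301 pointing to example.com/page.html.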
I hope I'm correct in my assumptions and we can work together here to develop a code that works for everyone.
[edited by: fearlessrick at 1:46 pm (utc) on May 25, 2005]
What feedback do you need? Your code looks correct if you are trying to redirect from non-www to www.
If it is not working you are best to ask in the Apache Forum:-
[webmasterworld.com...]
The guys in there really know their stuff (and have been answering related questions a lot recently - you might find what you need in recent posts in that forum).
RewriteCond %{HTTP_HOST} !^www\.site\.com
RewriteRule ^.*$ http://www.site.com%{REQUEST_URI} [R=301,L]
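One refinement worth considering, keeping the same hypothetical site.com: anchoring the host pattern with $ and adding the [NC] flag makes the condition case-insensitive and a bit stricter, so Site.COM and other capitalizations get redirected too.
-------------
RewriteEngine on
RewriteCond %{HTTP_HOST} !^www\.site\.com$ [NC]
RewriteRule ^.*$ http://www.site.com%{REQUEST_URI} [R=301,L]
-------------
You can confirm it is working by requesting the non-www URL with a server header checker and verifying that you get a 301 back, with a Location header pointing at the www version.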
Just a thought, which may or may not be relevant. I've noticed my traffic being VERY slow in the AM, picking up through the day, but getting progressively better in the evening. I am on the East Coast.
What this is (maybe) telling me is that the datacenters in Europe are not treating me very well, and if there's an east coast / west coast bias, the west is best for me.
Traffic is still off roughly 70%, but it keeps improving daily. Keeping hopes up and working on non-Google traffic. I really think they are screwing themselves with continual changes to their algo. Business people like consistent results and their search and Adsense results have been anything but.
My site has been hit (again) by Bourbon, removing most of the traffic recovered after Allegra. But my site dropping only means our household doesn't have as much disposable income - which is sad for me but leaves the rest of the world unscathed. However, when big companies get their traffic decimated, people lose their jobs and many lives are drastically altered. I think in that regard if Google doesn't start to act more responsibly in regard to the vast power they wield they will end up being regulated.
*By big, I mean in partnership with the BBC and with an Alexa rating under 20,000.
Although we are not that big - we have links from the BBC, The Times, etc., and rank about 30,000-50,000 on Alexa - I have already stopped any non-essential spend, including Google ads, until the position becomes clearer.
I guess I won't be taking on a techie this year after all.
By that, I mean whenever an economic decision is to be made, the moral hazard has to be entertained. Examples run the gamut from individuals to governments, as in the following:
Should I buy new shoes or a week's worth of groceries?
Should the US invade Iraq or keep status quo?
Should GM build bigger SUVs or smaller hybrids?
etc....
In Google's case, their moral hazards in any kind of major change (additions to Adsense, Adwords, changes to index, etc.) involve many questions. I don't think they weighed these:
Should we give preference to Adsense publishers or Adwords advertisers in search results? (In fairness, they should not, but from a practical standpoint, it would be good business. They have boxed themselves into a conflicting position)
How will dramatic algo changes affect our business relationships? (I honestly don't think they took this into account very professionally. From the looks of things, they harm some, help others, but overall, results vary widely from update to update. It's all very inconsistent.)
etc.
I believe they've blown it here, trying to do too much while yielding a great deal of power over many, many small and large web publishers and advertisers.
As with any business service provider, if the service is not consistently providing excellent results, the users will look elsewhere, and I believe that is happening now (it is in my case, at least. I have no other choice).