Forum Moderators: Robert Charlton & goodroi
[webmasterworld.com...]
My website has plenty of outbound links, but they are on relevant pages. The problem my site has always had was a lack of "inbound links." I got tired of searching for people to link to me (with all the spammy sites around) and gave up. So my pages have acquired some links naturally, I guess (and I'll bet I still don't have more than 30 inbound links for the whole site). Still have a PR4, which I've had since it disappeared in Nov.
[edited by: Brett_Tabke at 8:54 pm (utc) on May 27, 2005]
The wrong thing to do is to start thinking about doing a 301 to resolve the non-www.
When you convince Yahoo!, Microsoft, Apple, WalMart, Barnes & Noble, Amazon, etc. that they should not resolve to www, I will remove mine and read the rest of your post. Until then...
Next time, please start a new thread so I can either skip or reply more thoroughly, without going too far off topic.
Justin
[edited by: jd01 at 6:39 pm (utc) on June 3, 2005]
and I am a frustrated webmaster ;) (slightly)
I finally got around to initiating the 301 redirect the other day after discovering (weeks ago) hundreds of my pages in the SERPs as WWW and NON-WWW.
Anyhow, I thereafter contacted Google and they explicitly told me that I'd had no penalties on my site whatsoever... or so they say.
[google.com...]
Looking into my crystal ball I foresee another long thread...
activeco,
Not knowing about Google's motives is being naive. Are you afraid of airing your view?
Japanese, I am 44 years old and still very rebellious, usually going against the stream.
But surely enough, not without a reason. I have expressed my view, I don't think they manipulate serps for financial reasons. That's it.
I don't support them unconditionally; their algos still have, IMO, a tendency to prefer the big guys (but I attribute that to probable Hilltop usage for very competitive terms), and I have always claimed they have a senseless discriminatory policy in their AdSense EPC distribution.
Bottom line: talk about it = off topic? No, it is very on topic.
Bad guys? Still not.
[edited by: activeco at 7:13 pm (utc) on June 3, 2005]
Changed your mind jd01 ;)
cyberfyber
I don't think they would class it as a penalty in the traditional sense - after all, your pages are probably still in the index.
Depends how they check these things out.
It is the duty of the webmaster or hosting company to resolve the non-www to the canonical www version before a new site gets up and running.
It is also the duty of the hosting company to ensure that as many URL errors as possible resolve to the canonical URL - a missing trailing slash, a stray dot in the URL before the slash, etc. etc. etc.
On a poorly constructed or poorly maintained server, all of these errors invariably produce a 302 Found (temporarily moved) response. Some later Apache versions handle some of these cases with a 301 by default.
God help you if you use an IIS or similar Windows server that a competitor can target. All he has to do is submit your site here and there, or place [yourwidgets.com....] or [yourwidgets.com....] or [yourwidgets.com...] links, all of which Google will pick up as totally separate URLs, poisoning your website's PageRank. It is when Googlebot follows these into your server and attributes your internal pages to any of the above that dilution of your PageRank occurs.
A simple set of mod_rewrite rules will fix these issues if put in place before Googlebot has access to your website and its internal pages. Root-level .htaccess will cause a loop, so it must be done in the configuration files of Apache or the Internet control panel of IIS.
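For what the server-config approach might look like: a minimal sketch, assuming Apache with mod_alias, a canonical host of www.example.com, and a document root of /var/www/example (all of these names are illustrative, not from the posts above). Putting the redirect in httpd.conf rather than .htaccess keeps it out of per-directory processing entirely:

```apache
# Sketch only: bare-domain requests get a 301 to the www host.
<VirtualHost *:80>
    ServerName example.com
    # mod_alias: permanent (301) redirect of every path to the www host
    Redirect permanent / http://www.example.com/
</VirtualHost>

# The canonical host serves the actual content.
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/example
</VirtualHost>
```

Because the two hosts live in separate VirtualHost blocks, the redirect can never match a request that already arrived on the canonical host, so there is no way for it to loop.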
Google will punish any site using PHP redirects in its root folder to put out 301s.
Hosting companies are also to blame for not helping website owners resolve the non-www at the root level, in the DNS A records. All of the above problems are avoided if this is done before the site goes live. When you purchase a name, the first thing to do is resolve the non-www to the www version, working around hosting companies' inexperience in looking after your URL.
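One caveat worth making explicit: DNS itself cannot issue a redirect, it can only point names at hosts. A hypothetical zone-file sketch of the usual setup (names and the IP are illustrative) - both names resolve to the same server, and the web server then does the 301:

```
; Hypothetical zone fragment: bare domain and www point at the same
; host; the HTTP server, not DNS, performs the 301 to the www version.
example.com.      IN  A  203.0.113.10
www.example.com.  IN  A  203.0.113.10
```

So "resolving the non-www at the DNS level" really means making sure both names reach a server that is configured to canonicalize them.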
So the previously aggressive guy who claimed I am against 301s is talking a lot of cr*p. I wholly endorse resolving a website, but not in the middle of a major update such as Bourbon, where "Clint's website" has gone into total oblivion.
? Are you saying a 301 redirect in .htaccess causes a loop?
eg:-
RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} !^www\.example\.com [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
or
RewriteCond %{HTTP_HOST} ^example\.com
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
We might be talking at cross purposes
[edited by: Dayo_UK at 7:33 pm (utc) on June 3, 2005]
That's what I've got too.
Could not decide whether to go to Babelfish or Dictionary first :)
I hope to be of help, not to confuse.
If you run a server with the root folder index file containing 301 php redirects to subfolders within the same server is what I meant. This is a big subject and off target in this thread.
Can you provide any evidence that it was not detrimental, and explain how you conducted your array of 301 redirects, so that we too can avoid being penalized for it? Just like on your many successful sites that used this method?
If we only could get some real answers we could get on with building and writing content...
"? Are you saying a 301 redirect in .htaccess causes a loop?"
The htaccess script you present will not take http://www.example.com and redirect it with a 301 to http://www.example.com/
If you try to do that with htaccess you may indeed create a loop.
This is all he is saying.
You can fix the trailing slashes in non-root directories, but index versions (directory/index.htm, directory/index.html, directory/index.php, etc.) still cause a problem. Again, you run into loop problems.
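To illustrate the loop being described: a minimal sketch, assuming a root .htaccess and index.html as the DirectoryIndex (the hostname is a placeholder). The naive rule loops because after the 301 the server's DirectoryIndex maps the directory URL back onto index.html internally, and the per-directory rewrite rules are applied again to that mapped filename:

```apache
# Naive rule (shown commented out): can loop, because the internal
# DirectoryIndex subrequest for index.html matches it again.
#RewriteRule ^(.*)index\.html$ http://www.example.com/$1 [R=301,L]

# Keying on THE_REQUEST avoids the loop: it tests only the literal
# request line the client sent, which internal subrequests never change.
RewriteCond %{THE_REQUEST} ^.*/index\.html
RewriteRule ^(.*)index\.html$ http://www.example.com/$1 [R=301,L]
```

This is the same THE_REQUEST technique that appears in theBear's fuller example later in the thread.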
[edited by: arubicus at 7:56 pm (utc) on June 3, 2005]
The results are bad now. Only half the update's elements have been applied. It's like pointing at a half-finished car on an assembly line and calling it a butt-ugly, useless car. It isn't finished. Perhaps that is bad for your business, but it still isn't finished in any case.
If these results -- littered with redirects, blog comment spam sites, etc... -- were the end results that would be one thing and Google would have obviously done things wrong. But there is simply no point in complaining that they haven't put the engine in a car because they haven't gotten around to it yet.
Instead of complaining, some folks ought to be using this time to be working on their site construction and related issues. We won't know how good or bad the engine is until they put it in!
Are you claiming it will not? I sense a challenge?
I am not here to give lessons on 301. It is off topic.
Write to claus, a genius on htaccess. I am sure he will confirm what I said.
Back to Bourbon and the demise of thousands of websites.
notawebmaster,
In a few days I will show up at number 1 in the SERPs for that colorful metaphor, and as the first to speak out on behalf of angry webmasters who lost their bread and butter, like Clint, who was unfairly treated. Pity I could not make it an anchor link.
Of course, the majority of us want to redirect the non-www to the www (only).
Don't want people being put off doing it - that's all. :)
[edited by: Dayo_UK at 8:07 pm (utc) on June 3, 2005]
Surely if a browser is clever enough not to be confused by errors, then Google is too. Remember, Google wants to add pages to its index; there is no reason for it to punish you for syntax errors. Look at the trouble they go to in order to find URLs on a page so that they can say they've indexed more pages than anyone else.
Also, when considering whether this is a reason for being dropped, think about whether the syntax error is recent. It is very unlikely that the new algo has introduced some sort of syntax check on purpose. It's not impossible, just unlikely.
Far more likely to be something else.
One set (set A) of DCs will rank me at 9 while the others (set B) at 23.
Then on another shift the DCs will switch... set A at 23 and set B at 9.
This does seem rather curious... is anyone else noticing the same? If so, would this add validity to the theory of rotating algos?
Hope this is not an indication of things to come. GoogleGuy alluded to the possibility of "everflux" being a part of the new algo.
RewriteEngine on
# Because we need rewrite
Options +FollowSymLinks -Indexes
# In particular, -Indexes prevents the server from providing a list of files to a bot or visitor when there is no index file in that directory
RewriteCond %{THE_REQUEST} ^.*\/index\.shtml
RewriteRule ^(.*)index\.shtml$ http://www.example.com/$1 [R=301,L]
# This stops the exposure of the index file and prevents someone from linking to both the directory and the default index file, causing a duplicate content problem.
RewriteCond %{HTTP_HOST} !^www\.example\.com [NC]
RewriteCond %{HTTP_HOST} !^$
RewriteRule ^(.*) http://www.example.com/$1 [L,R=301]
# This set redirects all valid server aliases for the site, including the IP address if allowed.
# Please note the general rewrite rules for joining the aliases are technically correct only for servers on port 80 (the default for http)
As japanese said, it is best to deal with this before going live, and better to deal with it before getting hit - but deal with it, and clean up the mess left behind.
The best way of dealing with this is to have only one valid server alias.
In addition, some rules that work in the httpd.conf files need work before they will function in an .htaccess file.
Please note the index file being talked about in the above example is index.shtml; this has to be handled for each index file type you allow.
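Rather than repeating the pair of lines once per extension, the handling of multiple index file types can be collapsed into one rule. A sketch, assuming index.html, index.htm, index.shtml and index.php are the enabled types (adjust the alternation to match your own DirectoryIndex setting):

```apache
# One rule covering several index file types at once, still keyed on
# THE_REQUEST so internal DirectoryIndex subrequests cannot loop.
RewriteCond %{THE_REQUEST} ^.*\/index\.(html?|shtml|php)
RewriteRule ^(.*)index\.(html?|shtml|php)$ http://www.example.com/$1 [R=301,L]
```

The capture $1 is the directory path, so /dir/index.php is 301-redirected to /dir/ just as in the single-extension version above.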
[edited by: theBear at 8:09 pm (utc) on June 3, 2005]