Forum Moderators: Robert Charlton & goodroi
A site of ours has seen decent results for over 2 years; all of a sudden today, all of our SERPs dropped by about 4-10 pages at once.
This is very odd - we have worked very hard to put out decent, relevant content.
Any reasons for this? Is there some sort of weird dance taking place?
We adjusted the page two days ago to make it better for the reader, and it seems we got slammed very badly for it.
The Google fresh date (on the SERPs) shows April 24, 06, but clicking the "Cached" link on the results page shows a date of April 18, 06.
Curious what the webmasters out there think of this.
----
G1, or anyone - does internal linking cause problems too? Example: I want to link to the page www.site.com/stuff.html, and I currently use a href="stuff.html". Is that fine, or do I also need to use a href="http://www.site.com/stuff.html"?
I only say this because I recently dropped AdSense from half of the pages on my site, and a few weeks later I lost roughly half of my pages from the Google index.
If Google is relying on more than one bot to trawl the web, maybe that explains why some people are doing better out of this than others. After all, if you run an AdSense site and the AdSense bot is working fine, then it is not going to be affected.
Thank you for your reply. We understand your concern and have passed your message on to our engineering team for further investigation.
We appreciate your patience.
Regards,
The Google Team
!PLEASE TELL ME THEY ARE GOING TO FIX ME!
Thank you for your reply. We understand your concern and have passed your message on to our engineering team for further investigation.
Wouldn't get too excited over this, da95649. It is a common response, and unfortunately in my case I got no reply in a previous situation. However, you never know, and I wish you good luck. Hopefully it IS a problem with your site, and they pass on to you an inkling of what has offended their algorithm, so that you can make corrections.
However, I am sure there is a common denominator here. Question is what? Why have some sites been penalized in a similar fashion, yet other sites are untouched? Would others agree with JimLahey's view (and mine) that it can't be a canonical issue? da95649 had a subdomain that was duplicating his site. So did I, but was only duplicating a portion of the site. Anyone else have duplication issues?
Google are obviously trying to improve quality (and from the end-user's point of view have succeeded).
Duplicating a site on a sub-domain so that Google can index it sounds like a Canonical issue to me.
Silent_Bob
The advice has always been to do the 301 - however, don't expect Google to fix things if you do.
As I do not think this is the cause of all the drops, I would like to understand the correct thing to do.
[webconfs.com...]
Above is a link to search competitors and indeed your own site.
My question is, if my main site is [domainname.co.uk...] which picks up the index.php, should I have a 301 redirect set up for:
[domainname.co.uk...]
[domainname.com...]
[domainname.com...]
to [domainname.co.uk...]
My second question: I have purchased many other domains that contain keywords. What's the best way to divert these domain names to my main site - would this be via DNS or a 301 redirect?
At the moment, when I go to www.keyworddomain.com my site appears; however, www.keyworddomain.com stays in the address bar until you click somewhere on the site.
I've looked at Matt Cutts' blog, and he uses 301 redirects, so it must be a good thing?
php_flag session.use_trans_sid off
Options +FollowSymlinks
RewriteEngine on
RewriteBase /
RewriteCond %{HTTP_HOST} !^www\.domainname\.co\.uk$
RewriteRule ^(.*)$ http://www.domainname.co.uk/$1 [R=permanent,L]
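If the extra keyword domains just stay in the address bar while showing your main site's content (as described above), they are probably being served as mirrors rather than redirected - which is exactly the duplicate-content situation to avoid. A minimal sketch of a server-level 301 for each spare domain, assuming Apache with mod_alias (the domain names are placeholders from this thread, not real sites):

```apache
# Hypothetical virtual host for one purchased keyword domain:
# instead of serving the same content under a second name,
# permanently redirect every request to the main site.
<VirtualHost *:80>
    ServerName www.keyworddomain.com
    ServerAlias keyworddomain.com
    Redirect permanent / http://www.domainname.co.uk/
</VirtualHost>
```

DNS alone only points the name at a server; the 301 is what tells Google (and the address bar) which domain is the one that counts.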
Would others agree with JimLahey's view (and mine) that it can't be a canonical issue?
Well, I'm among those who doubt that.
If I understand it correctly, the "canonical issue" is based on the "duplicate penalty": G sees exactly the same page under two different URLs (like the www and non-www URLs). This, however, can't cause a penalty on an entire website, and it doesn't - the second (third, etc.) URL is simply put in as a supplemental result. I think so.
If I were wrong, then lots of sites would have been penalized, especially those based on PHP scripts, where you can access the same page in a variety of ways. Look at phpBB forums - there are at least two ways to get to a given topic, and so two different URLs pointing to the same content. So if you have a forum with hundreds of topics/posts, then you also have hundreds of "duplicates" (from the robot's point of view).
Another example: theoretically, I can put up links to my competitor's website looking like this: www.mycompetitorweb.com/?var1=something
and another one:
www.mycompetitorweb.com/?var2=something_else
and hundreds of links like this. If the PHP script installed on that website doesn't care about the "var.." parameter, then it simply shows the content of the page. Always the same.
If the "duplicate content" penalty theory were true, then G would punish that website - because from G's point of view there would be hundreds of duplicate pages (many URLs, the same content).
So, if my assumption that the "canonical problem" is based on the "duplicate content problem" is correct (is it?), then it simply can't be the reason to penalize an entire website.
Of course there can be something specific about the www/non-www issue, but did anybody actually see proof of that? I have a few more websites, none of them uses a redirection to the "www" version of the URL, and none of them has a problem similar to the one I'm worried about. I hope G will respond to you, da95649, and that you won't forget to tell us what it was :)
And please forgive my English.
This, however, can't cause a penalty on an entire website, and it doesn't - the second (third, etc.) URL is simply put in as a supplemental result.
It won't cause a penalty, but it may reduce the ranking of a page. If Google sees www.domain.com/page1 and domain.com/page1 as two separate pages, PageRank may be split between them, since some incoming links will use the "www" and others won't.
-- Roger
So maybe it's not needed YET for Google, although this board seems to think it is - why wouldn't you just make sure that your site only shows under either the non-www or the www version? It can only be a matter of time before it IS needed, IMO.
Konrad.
I agree for a site that shows duplicate content on:
www.domain.com/desc/product1.html
www.domain.com/shop/product1.html
www.domain.com/uk/product1.html
Google will filter these out at search level, or not index all formats, as they are on the same domain (or sub-domain).
However, when you get indexed under both domain.com/desc/product1.html and www.domain.com/desc/product1.html, then you get issues with PageRank etc. across sub-domains.
Quick example of what might happen:-
You have www.domain.com with 150 BLs and a PR6 - Google comes along and finds 1 or 2 links to domain.com which would perhaps give it a PR1 (although prob near PR0).
With duplicate content on domain.com and www.domain.com, Google would try to consolidate the listings/PR etc. - now this is where Google can easily get confused.
What penalty is applied to the domain is unclear. A duplicate content penalty across domains, a sub-domain penalty - or just a very confused Google.
But we are not just talking about the homepage.
If you have www.domain.com as a PR5 and an internal page such as www.domain.com/product1.html with PR4, what will happen when Google finds domain.com is that it splits off a new site - so domain.com/product1.html is now a new unindexed page in Google, which gets issued a PR0 until it is crawled under the non-www version.
So yes, it affects the whole site.
Before the canonical error, queries to the following pages would see a PR4:-
www.domain.com/product1.html & domain.com/product1.html
www.domain.com/product2.html & domain.com/product2.html
Once Google indexes domain.com separately, those domain.com/product1.html queries refer to an unindexed page with a PR0.
Yes, I have seen sites like that - it makes me wonder if Google prefers the domain.com version and tries to over-ride the www version with the domain.com version......
Now say the domain.com version has virtually no BLs and PR, and the www version has loads and a PR6 - Google preferring the domain.com version would cause all sorts of problems, surely? And all backlinks to www would be wasted.
It won't cause a penalty, but may reduce the ranking of a page
However, when you get indexed under both domain.com/desc/product1.html and www.domain.com/desc/product1.html, then you get issues with PageRank etc. across sub-domains.
Ahh yes - I do understand that www/non-www can cause some problems, that the PR is split, and so on, and that it is good to do the redirection.
But what happened is not "some" problem - it's a tragedy. When we search for our own domain names we are on page 4 or 5, and at higher positions there are websites with really low PR.
Nevertheless, of course I'll remove this problem from my website.
By the way, Matt Cutts doesn't do anything about that - there are www and non-www versions, with no 301 - but he's got a higher PR, on the other hand :)
Everyone ask yourselves, does Google care?
How hard can this be to fix? I used to work on the QA team for the Ingram Micro worldwide sites, for all countries and all formats of the web.
If we had a problem like this it would take less than a month to fix, for sure. It's a joke!
What I think is they are just blowing smoke up all our __s's and doing what they want when they want.
Oh but do not forget, give them all they ask us webmasters for so they have all our information and all our data and all our VALUABLE complaints so they can make more money and care less about you.
Analytics - they know everything.
Sitemaps - they know all your managed sites at once (yeah, they don't use that info for anything, I'm sure). Remember Enron - you think anyone is honest or even cares about you?
Send E-mails to Matt Cutts - Get real, what proof does anyone have that Googleguy or Matt ever directly fixed something and pointed it out in detail.
Who knows what the hell is going on. That's the way they want it, but (as they always request) keep feeding them all the information!
Google - please lose the search battle asap! I can not wait! I really wish webmasters would do something about it for once, like in the old days. We have power, folks - get off your ___, stop putting up with the Google )(*&.
He says it is an experiment (so obviously there is something to look at)... besides, his homepage used to get dropped all the time not long ago.
>>>>But what happened is not "some" problem - it's a tragedy.
Indeed.
There are many types of duplicate content: www vs. non-www (use 301 redirect), multiple domains (use 301 redirect), same title and/or meta description (make them all different), same on-page content for multiple URLs, (e.g. parameters in a different order, or extra parameters, just don't, right!), and so on. The last is more work to fix, and needs certain versions of the page to be excluded using the robots meta tag, or maybe using the robots.txt file.
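For the "extra parameters" case, one server-level option (rather than per-page robots tags) is to 301 requests carrying known junk parameters back to the clean URL. A rough sketch for Apache mod_rewrite - the parameter names var1/var2 are just the made-up examples from earlier in this thread:

```apache
# Hypothetical: if the query string starts with a parameter the site
# doesn't actually use, redirect to the same path with no query string
# (the trailing "?" on the target drops the query string).
RewriteEngine on
RewriteCond %{QUERY_STRING} ^(var1|var2)=
RewriteRule ^(.*)$ /$1? [R=301,L]
```

That way a competitor linking to yoursite.com/?var1=something can't create a new "duplicate" URL in the index.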
As webmasters we build websites. Those websites are prone to duplicate content because of X number of factors.
They are prone to Canonical problems due to X number of factors.
They are prone to being scraped by X number of websites.
Webmastery is not an exact art, and as such design and coding are not standard.
Many web designers ... probably the majority of web designers that have pages out on the web are people that have very limited technical skills. 301 means nothing to them. Canonical is surely related to the church. etc etc.
So that would suggest there are numerous sites out there with limited technical management that have all these possible holes and no resources to fix them or even know about them. But the information on those pages is no less valid than the information on my pages or on your pages.
So here is the madness. We provide the words on the page, and the search engines do what they do to make those words available to people that search for them. But we all know that Google will miss loads of that data because of ITS technical limitations.
So here we are...we have to fix issues that no other search engine has so that google can present a similar set of results. Why are we fixing issues at our end when google should be addressing the challenge at their end?
One to Many solutions are far better and more accurate to implement than Many to One.
We are all running around chasing our tails when in fact the solution to the problem lies with Google. It is they who need to fix their search to cope with the challenges that providing valid results on the internet presents.
Otherwise the only people that rank will be those in the know technically, and that is not providing the best search results.
MADNESS!
Make sure that every page of the site has a unique title and meta description.
Make sure that every page of the site links back to "/" and to the main section indexes.
Make sure that all domain.com accesses are redirected to the same page in the www.domain.com version of the site.
If you have multiple domains, then use the 301 redirect on those such that only one domain is indexed.
If you have pages that say to bots "Error. You Are Not Logged In", for example "newthread", "newreply", "editProfile" and "sendPM" links in a forum, then make sure the link has rel="nofollow" on it, and the target page has <meta name="robots" content="noindex"> on it too.
If you have a CMS, forum, or cart that has pages that could have multiple URLs, then get the script modified to put a <meta name="robots" content="noindex"> tag on all but one "version" of the page.
Use the site: search to see what you have indexed, and work to correct these issues. The presence of Supplemental Results, URL-only entries, or hitting the "repeat this search with omitted results included" message very quickly are all indications that you have stuff that needs fixing.
It is a sad fact that systems like vBulletin, PHPbb, osCommerce, and a whole range of popular scripted sites, have a large number of SEO-related design errors built in to them. The designers are clever programmers, but have no clue about SEO or how their site will interact with search engines; and the situation isn't getting any better.
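If modifying a packaged script like vBulletin or phpBB to add those meta tags is impractical, the same noindex signal can be sent as an HTTP response header instead. A sketch, assuming Apache with mod_setenvif and mod_headers enabled - the URL fragments are the forum "action" pages mentioned above:

```apache
# Hypothetical: flag forum action URLs (the "Error. You Are Not
# Logged In" pages) and send a noindex header for them.
SetEnvIf Request_URI "(newthread|newreply|editProfile|sendPM)" NOINDEX_PAGE
Header set X-Robots-Tag "noindex" env=NOINDEX_PAGE
```

Google honours the X-Robots-Tag header the same way as the equivalent robots meta tag, and it works for URLs where you can't edit the HTML at all.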