This 236 message thread spans 8 pages.
Big SERP Drops - Odd Fresh Dates
Very Odd - Big Drops in SERPS Today April 26, 06
A site of ours has seen decent results for over two years; all of a sudden today, every SERP has dropped for us by about 4-10 pages, all at once.
This is very odd, as we have worked very hard to put out decent, relevant content.
Any reasons for this? Is there some sort of weird dance taking place?
We adjusted the page two days ago to make it better for the reader, and it seems we got slammed very badly for this.
The Google fresh date (on the SERPs) shows April 24, 06, but clicking the "Cached" link on the results page shows a date of April 18, 06.
Curious what the webmasters out there think of this.
All you can do is add the redirect, make sure that all internal links go to the correct version, check your site with Xenu LinkSleuth and fix all problems found. Google might take a few months to work things out, but they will eventually get it right.
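For anyone unsure how to "add the redirect": on an Apache server, a minimal non-www to www 301 sketch in .htaccess might look like this (example.com is a placeholder for your own domain, and mod_rewrite is assumed to be available):

```apache
# 301-redirect any request on a non-www hostname to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```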
G1, or anyone, does internal linking cause problems too? For example, I want to link to the page www.site.com/stuff.html and I currently use a href="stuff.html". Is that fine, or do I also need to use a href="http://www.site.com/stuff.html"?
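For what it's worth, relative and absolute internal links resolve to the same address once the page is only ever served from one hostname. A sketch, with example.com as a placeholder and the linking page assumed to sit at the site root:

```html
<!-- on a page served as http://www.example.com/index.html,
     all three of these resolve to http://www.example.com/stuff.html -->
<a href="stuff.html">relative</a>
<a href="/stuff.html">root-relative</a>
<a href="http://www.example.com/stuff.html">absolute</a>
```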
For my one single-word keyword (non-commercial theme), the top results now have
#6: Open Directory, dmoz
#16: Yahoo Directory
#17: Google Directory
At least three of those are new, or far higher than before. Directories are doing well!
Sorry if this has been said before, but do you think all of this has something to do with Google merging results from all of its different bots together?
I read that Googlebot, the AdSense bot and the image bot now pool all of their data together, which saves a lot of money for Google, as they no longer have to send each bot to each and every site.
I only say this because I recently dropped AdSense from half of the pages on my site, and a few weeks later I lost roughly half of my pages from the Google index.
If Google is relying on just one or more bots to trawl the web, maybe that explains why some people are doing better out of this than others. After all, if you run an AdSense site, and the AdSense bot is working fine, then it is not going to be affected.
See Matt Cutts post on his own blog, about the new Google Crawl Caching Proxy, about two weeks ago.
I use Google Adsense, and it is killing me.
Guys, good news for me I think. I put in a reinclusion request last week and have just received an email:
Thank you for your reply. We understand your concern and have passed your message on to our engineering team for further investigation.
We appreciate your patience.
The Google Team
!PLEASE TELL ME THEY ARE GOING TO FIX ME!
Just noticed my Google hits have dried up completely also. Very strange.
Well, that may be a strong indicator that you do have a penalty. Can't know for sure, however. When I became worried about a domain I recently registered (there were a handful of weird links out there and it wasn't getting indexed), I emailed Google and within two days received an email assuring me that there was no penalty in effect.
I'll keep this post updated as soon as I hear more news.
Regardless of all the talk on www/non-www etc., I did have a subdomain that was duplicating my site, so hopefully this was the problem.
Will keep you posted.
|Thank you for your reply. We understand your concern and have passed your message on to our engineering team for further investigation. |
Wouldn't get too excited over this, da95649. It is a common response, and unfortunately in my case I got no reply in a previous situation. However, you never know, and I wish you good luck. Hopefully it IS a problem with your site, and they pass on to you an inkling of what has offended their algorithm, so that you can make corrections.
However, I am sure there is a common denominator here. Question is what? Why have some sites been penalized in a similar fashion, yet other sites are untouched? Would others agree with JimLahey's view (and mine) that it can't be a canonical issue? da95649 had a subdomain that was duplicating his site. So did I, but was only duplicating a portion of the site. Anyone else have duplication issues?
|Google are obviously trying to improve quality (and from the end-users point of view have succeeded) |
Not sure about that Spruce2000. My top terms are throwing up WAY more junk than before, and that is not a subjective view.
I ran the site:mysite.com -www command and it shows 28 non-www supplemental listings, all for pages that no longer exist.
Presumably, even if I implement 301 redirects on my site, I won't get rid of those supplementals anyhow? Is it still worth me doing it?
>>>>>Would others agree with JimLahey's view (and mine) that it can't be a canonical issue? da95649 had a subdomain that was duplicating his site. So did I, but was only duplicating a portion of the site.
Duplicating a site on a sub-domain so that Google can index it sounds like a Canonical issue to me.
The advice has always been to do the 301 - however, don't expect Google to fix things if you do.
Hmmm. I'm not convinced that it's a canonical issue. I just ran the same search on the sites that have benefited most since 26th April, and they have more results from 'site:domain.com -www' than I do.
DayoUK - what's the best/easiest way to do a 301 redirect for a site on a shared Windows server with a third-party hosting company? I did just do a site search but didn't find an easily understandable answer.
Not sure - I am not on a windows server.
I have just done some interesting research on the whole 301 redirect issue.
As I do not think this is the cause of all the drops, I would like to understand what to do which is correct.
Above is a link to search competitors and indeed your own site.
My question is: if my main site is [domainname.co.uk...], which picks up the index.php, should I have a 301 redirect set up for:
My second question: I have purchased many other domains that hold keywords. What's the best way to divert these domain names I own to my main site - would this be via DNS or a 301 redirect?
At the moment, when I go to www.keyworddomain.com my site appears; however, www.keyworddomain.com stays in the address bar until you click somewhere on the site.
I've looked at Matt Cutts' blog, and he uses 301 redirects, so it must be a good thing?
|DayoUK - what's the best/easiest way to do a 301 redirect for a site on a shared Windows server with a third-party hosting company? I did just do a site search but didn't find an easily understandable answer. |
You'll need to get them to set up the redirect in IIS (it's really easy to do).
I've just edited my .htaccess, which looks like it's done the job:
Options +FollowSymLinks
php_flag session.use_trans_sid off
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=permanent,L]
[edited by: tedster at 12:12 pm (utc) on May 10, 2006]
[edit reason] Note: add a space before ! in line 4 [/edit]
Would others agree with JimLahey's view (and mine) that it can't be a canonical issue?
Well, I'm among those who doubt that.
If I understand it correctly, the "canonical issue" is based on the "duplicate penalty". G sees exactly the same page under two different URLs (like the www/non-www URLs). This, however, can't cause a penalty on an entire website, and it doesn't - the second (third, etc.) URL is simply put in as a supplemental result. I think so.
If I were wrong, then lots of sites would have been penalized, especially those based on PHP scripts, where you can access the same page in a variety of ways. Look at the phpBB forums - there are at least two ways to get to a certain topic, and so two different URLs pointing to the same content. So if you have a forum with hundreds of topics/posts, then you also have hundreds of "duplicates" (from the robot's point of view).
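To illustrate the phpBB point: in a stock install, the same topic is reachable via its topic ID and via any of its post IDs (the IDs below are made up):

```
# two standard phpBB URLs that return the same topic page
viewtopic.php?t=123
viewtopic.php?p=4567
```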
Another example: theoretically I can put links to my competitor's website looking like this: www.mycompetitorweb.com/?var1=something
and another one:
and hundreds of links like this. If the PHP script installed on that website doesn't care about the "var..." parameter, then it simply shows the contents of the page. Always the same.
If the theory of a "duplicate contents" penalty were true, then G would punish the website - because from G's point of view there would be hundreds of duplicate pages (many URLs - the same contents).
So, if my assumption that the "canonical problem" is based on the "duplicate contents problem" is correct (is it?), then it simply can't be the reason to penalize an entire website.
Of course there could be something specific about the www/non-www issue, but did anybody actually see proof of that? I have a few more websites; none of them uses a redirection to the "www" version of the URL, and none of them has a problem similar to the one I'm worried about. I hope G will respond to you, da95649, and that you won't forget to tell us what it was :)
And please forgive my English.
[edited by: konrad at 2:09 pm (utc) on May 9, 2006]
|This however can't cause a penalty on an entire website and it doesn't - simply the second (third.. etc) url is put as supplemental result. |
It won't cause a penalty, but it may reduce the ranking of a page. If Google sees www.domain.com/page1 and domain.com/page1 as two separate pages, PageRank may be split between them, since some incoming links will use the "www" and others won't.
FWIW, I implemented a 301 from non-www to www over 6 months back, as Yahoo had listed most of my pages under domain.com. I have never linked, or had links going, to domain.com, and that's when I thought a duplicate issue "could" arise, so I did the redirect, and Y picked it up and corrected it.
So maybe it's not needed YET for Google, although this board seems to think it is. Why wouldn't you just make sure that your site only shows for either non-www or www? It can only be a matter of time before it IS needed, IMO.
Yes, as per Jadebox.
I agree: for a site that shows duplicate content, Google will filter it out at search level, or not index all formats, as they are on the same domain (or sub-domain).
However, when you get indexed under both domain.com/desc/product1.html and www.domain.com/desc/product1.html, then you get issues with PageRank etc. across sub-domains.
Quick example of what might happen:-
You have www.domain.com with 150 BLs and a PR6 - Google comes along and finds 1 or 2 links to domain.com, which would perhaps give it a PR1 (although probably near PR0).
With duplicate content on domain.com and www.domain.com, Google would try to consolidate the listings/PR etc. - and this is where Google can easily get confused.
What penalty is applied to the domain is unclear. A duplicate content penalty across domains, a sub-domain penalty - or just a very confused Google.
But we are not just talking about the homepage.
If you have www.domain.com at PR5 and an internal page such as www.domain.com/product1.html at PR4, what happens when Google finds domain.com is that it splits off a new site - so domain.com/product1.html is now a new, unindexed page in Google which gets issued a PR0 until it is crawled under the non-www version.
So yes, it affects the whole site.
Before the canonical error, queries to either of the following would hit a PR4 page:-
www.domain.com/product1.html & domain.com/product1.html
www.domain.com/product2.html & domain.com/product2.html
Once Google indexes domain.com separately, those domain.com/product1.html queries refer to an unindexed page with a PR0.
Here is my similar dilemma.
I have a site, over 1 year old, that is now a PR6 only on the homepage, the rest of the pages don't show any rank.
The site has never (as far as I've seen) ranked for anything other than a long quote.
Now when I do a site: search, the only page that comes back is my non-www version homepage. Nothing else. No www versions, no other pages.
This I checked on DC 18.104.22.168
I don't have a 301 in place and 99% of my links use the www version.
Before this current problem, a site: query would return most of my site in the supplemental index.
I'm at a loss. I've contacted google via sitemaps reinclusion request but have not heard back.
Yes, I have seen sites like that - it makes me wonder if Google prefers the domain.com version and tries to over-ride the www version with it...
Now, say the domain.com version has virtually no BLs and no PR, and the www version has loads and a PR6 - Google preferring the domain.com version would surely cause all sorts of problems? And all the backlinks to www would be wasted.
[edited by: Dayo_UK at 2:36 pm (utc) on May 9, 2006]
It won't cause a penalty, but may reduce the ranking of a page
However when you get indexed under domain.com/desc/product1.html and www.domain.com/desc/product1.html then you get issues with page rank etc accross sub-domains.
ahh yes - I do understand that www/non www can cause some problems, and that the PR is split, and so on, and that it is good to do the redirection.
But what happened is not "some" problem - it's a tragedy. When we search for our own domain names we are on page 4 or 5, and at higher positions there are websites with really low PR.
Nevertheless, of course I'll remove this problem with my website.
By the way, Matt Cutts doesn't do anything about that - there are www and non-www versions, with no 301 - but he's got a higher PR, on the other hand :)
What a joke!
Everyone ask yourselves, does Google care?
How hard can this be to fix, I use to work on the QA team for the Ingram Micro worldwide sites, for all countries and all formats of the web.
If we had a problem like this it would be less than a month to fix it for sure. It's a joke!
What I think is they are just blowing smoke up all our __s's and doing what they want when they want.
Oh but do not forget, give them all they ask us webmasters for so they have all our information and all our data and all our VALUABLE complaints so they can make more money and care less about you.
Analytics - they know everything.
Sitemaps - they know all your managed sites at one time (Yah they dont use that info for anything I am sure) - Remember Enron, you think anyone is Honest or even cares about you?
Send E-mails to Matt Cutts - Get real, what proof does anyone have that Googleguy or Matt ever directly fixed something and pointed it out in detail.
Who knows what the hell is going on. That's the way they want it, but (as they always request) keep feeding them all the information!
Google - please loose the search battle asap! I can not wait! - I really wish webmasters would do something about it for once, like in the old days. We have power folks, get off your ___, stop putting up with the Google )(*&.
>>>By the way, Matt Cutts doesn't do anything about that - there are www and now-www versions, with no 301, but, he's got higher PR on the other hand
He says it is an experiment (so obviously there is something to look at)... besides, his homepage used to get dropped all the time not long ago.
>>>>But what happened is not "some" problem - it's a tragedy.
To whoever asked earlier: yes, the internal linking of your site has a big effect on how you are indexed. Run Xenu LinkSleuth and see what it says.
There are many types of duplicate content: www vs. non-www (use a 301 redirect), multiple domains (use a 301 redirect), same title and/or meta description on every page (make them all different), same on-page content under multiple URLs (e.g. parameters in a different order, or extra parameters), and so on. The last is more work to fix, and needs certain versions of the page to be excluded using the robots meta tag, or perhaps the robots.txt file.
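For the multiple-URLs case, a sketch of the meta-tag exclusion just described - the print=1 parameter and page.php name are made-up examples:

```html
<!-- emitted only on the duplicate version of the page, e.g. page.php?print=1;
     the canonical version of the page omits this tag entirely -->
<meta name="robots" content="noindex,follow">
```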
Has anyone just stood back and looked at the complete madness of all this?
As webmasters we build websites. Those websites are prone to duplicate content because of X number of factors.
They are prone to Canonical problems due to X number of factors.
They are prone to being scraped by X number of websites.
Webmastery is not an exact art and as such design and coding is not standard.
Many web designers ... probably the majority of web designers that have pages out on the web are people that have very limited technical skills. 301 means nothing to them. Canonical is surely related to the church. etc etc.
So that would suggest there are numerous sites out there with limited technical management that have all these possible holes and no resources to fix them or even know about them. But the information on those pages is no less valid than the information on my pages or on your pages.
So here is the madness. We provide the words on the page, and the search engines do what they do to make those words available to people who search for them. But we all know that Google will miss loads of that data because of ITS technical limitations.
So here we are...we have to fix issues that no other search engine has so that google can present a similar set of results. Why are we fixing issues at our end when google should be addressing the challenge at their end?
One to Many solutions are far better and more accurate to implement than Many to One.
We are all running around chasing our tails when in fact the solution to the problem lies with Google. It is they who need to fix their search to cope with the challenges that providing valid results on the internet presents.
Otherwise the only people that rank will be those in the know technically, and that is not providing the best search results.
If you understand the few simple technical issues surrounding all of this then the fixes are very simple. They are basic SEO and design.
Make sure that every page of the site has a unique title and meta description.
Make sure that every page of the site links back to "/" and to the main section indexes.
Make sure that all domain.com accesses are redirected to the same page in the www.domain.com version of the site.
If you have multiple domains, then use a 301 redirect on those such that only one domain is indexed.
If you have pages that say to bots "Error. You Are Not Logged In", for example "newthread", "newreply", "editProfile" and "sendPM" links in a forum, then make sure the link has rel="nofollow" on it, and the target page has <meta name="robots" content="noindex"> on it too.
If you have a CMS, forum, or cart that has pages that could have multiple URLs, then get the script modified to put a <meta name="robots" content="noindex"> tag on all but one "version" of the page.
Use the site: search to see what you have indexed, and work to correct these issues. The presence of Supplemental Results, URL-only entries, or hitting the "repeat this search with omitted results included" message very quickly are all indications that you have stuff that needs fixing.
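A sketch of the nofollow/noindex pairing for the "Error. You Are Not Logged In" pages mentioned above - the newreply.php URL and thread ID are hypothetical:

```html
<!-- in the topic page template: don't pass link weight to the action URL -->
<a href="newreply.php?t=123" rel="nofollow">Post Reply</a>

<!-- in the <head> of the page newreply.php outputs: keep it out of the index -->
<meta name="robots" content="noindex">
```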
It is a sad fact that systems like vBulletin, PHPbb, osCommerce, and a whole range of popular scripted sites, have a large number of SEO-related design errors built in to them. The designers are clever programmers, but have no clue about SEO or how their site will interact with search engines; and the situation isn't getting any better.
|It is a sad fact that systems like vBulletin, PHPbb, osCommerce, and a whole range of popular scripted sites, have a large number of SEO-related design errors built in to them. The designers are clever programmers, but have no clue about SEO or how their site will interact with search engines; and the situation isn't getting any better. |
This is true... but it's the reason we do so well!
What is worrying about all this is that it's clear to me a lot of people on this board are now losing faith in Google.