The particulars: the site is 5 years old (so no sandbox issue), has lots of relevant incoming links (many more than most of the sites ahead of mine in the listings), is well optimized (natural web copy with plenty of keywords), has no duplicated copy between pages or elsewhere (I caught a couple of thieves; one removed the material, I reported the other, and I rewrote my web text so there would be no duplication issue), has a G-friendly sitemap, and has had all coding errors corrected.
I have also asked G about this (specifically whether the two dots in my site URL cause issues, whether a few pages having spaces in the URL--%20 in the code--cause issues, and so on), but I get no answer. So I am stumped. The site ranked really well for literally years, and now this.
So, what have I missed that might cause such a quick and extraordinary drop? Or what is the next step in figuring out how to "fix" my site?
If a majority of your results are supplemental, that may be part of the problem. Also look for listings with no title/no description. Then, if your site is on shared hosting, check to make sure your site hasn't been blacklisted because someone else on the same server was penalized for spam.
Only one supplemental listing, which is /cgibin.sitestats...
There are in excess of 1,000 sites set up with the same domain arrangement (mysite.outfit-I-purchased-the-site-from-years-ago.com), and I don't know how to check whether one has been penalized for copying without visiting each, so a suggestion would be greatly appreciated.
The site at outfit-I-purchased-the-site...com is coming up as not found: is that a problem?
I have a sitemap I thought was working fine, but I redid it and resubmitted it, and there are three errors I don't understand at all: "url not under sitemap path" for myhomepage.com, myhomepage.com/, and .../links.html.
All other pages are fine, including /index.html (presumably the same as homepage.com) and other .html instances.
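For what it's worth, the Sitemaps protocol only lets a sitemap list URLs that sit at or below the sitemap file's own location, and the hostname has to match exactly, so a www/non-www mismatch between the sitemap's submitted URL and the listed pages can produce exactly that "url not under sitemap path" error. A minimal sketch of the rule (the helper function and hostnames are hypothetical examples, not part of any Google tool):

```python
# Sketch of the Sitemaps protocol path rule: a sitemap at
# http://host/dir/sitemap.xml may only list URLs that begin with
# http://host/dir/ -- the hostname must match exactly, so
# www.example.com and example.com count as different sites.
def under_sitemap_path(sitemap_url: str, page_url: str) -> bool:
    base = sitemap_url.rsplit("/", 1)[0] + "/"  # directory of the sitemap
    return page_url.startswith(base)

# Same host, page under the sitemap's directory: allowed.
print(under_sitemap_path("http://example.com/sitemap.xml",
                         "http://example.com/links.html"))      # True
# www vs non-www mismatch: rejected ("url not under sitemap path").
print(under_sitemap_path("http://example.com/sitemap.xml",
                         "http://www.example.com/links.html"))  # False
```

If the sitemap was submitted under one hostname form while the pages are listed under the other, that alone would explain the errors.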
I doubt it; it has never affected me and I don't recall it affecting anyone else - until the domain expires, that is.
You should renew your domain as soon as you can, though, to avoid problems with the domain expiring. I think I remember a discussion on here a few years ago about this which ended up with the above being the general consensus, but there is no search and I am not going to go looking for it :-P
You need to change all your internal site links to match - add the www on the front.
Personally, I would have gone the other way on a long subdomain like that and left the www off, but it's too late now as you have thousands of backlinks all including www (and Google has cached it this way), so match your site's internal ones to those.
Your main host's domain is working fine for me, and has PR etc., and Google banning an entire IP is debatable (especially since both you and your host have good PR).
You also have good PR and content, so I would start by just changing those internal links. Maybe add some new content (a new page or so) just to get the 'freshness' factor up a bit if you haven't modified it much in the last couple of years.
I wouldn't worry about the %20 in that one URL, as Google seems to handle it OK, though if starting out it's best to use a - instead.
If possible, set up a 301 redirect from the non-www domain to the www domain as well (you might need to contact your host to do this), as this will ensure Google doesn't go and stuff up your site with the famous canonical issue mentioned in almost every thread in this forum.
Basically, consolidate it down so that everyone inside and outside your site uses the www.yoursubdomain.yourhostsdomain.com format.
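As a sketch, that 301 can usually be done with mod_rewrite in .htaccess on Apache; the hostnames below are placeholders for the real subdomain, and this assumes the host allows mod_rewrite (otherwise they would need to set it up for you):

```apache
# Hypothetical .htaccess sketch: 301 the non-www hostname to the
# www one so only a single canonical form is ever served.
# Replace yoursubdomain.yourhostsdomain.com with the real hostname.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursubdomain\.yourhostsdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursubdomain.yourhostsdomain.com/$1 [R=301,L]
```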
I am off for some sleep now, but sticky me the search terms when you get a chance and I can look at those other sites ahead of you and try to figure out what they have that you don't.
Oh, and maybe somebody else could have a look and confirm my findings before you change anything?
My site will work with or without the www, and in fact, when I look at all my site files (on my admin pages), they are listed without the www. So, how can I get the sitemap file loaded with a www? Also, do I need to figure out how to have all my files listed for my site as www (I assume all I can do is talk to the IP)?
And I will forward you the search terms via sticky as you suggested.
But one question and one request: many of my backlinks are www but my site is non-www in G. Does this mean I am not getting "credit" for my backlinks when it comes to search results ranking?
Also, could the moderators refer me to a good past thread about the canonical issue, and maybe to a thread about how redirects work (how to do one and what not to do to annoy G)? I guess I am wondering if a 301 might help me with lots of issues (like shortening my domain and excluding that second dot), including this www thing.
After a while, the Google index can get filled with bad urls that are all duplicate copies of the custom page or Home Page. This can really wreak havoc. I have even heard of a darker practice that goes looking for this issue and intentionally helps Google to "find" lots of those bad urls.
Even without someone intentionally feeding googlebot those bad urls, the spider seems to be quite creative in requesting some funky addresses all on its own. If you don't feed it a 404 server header, you're off and running.
Any doubts about your custom 404 set up? You can check the server header [searchengineworld.com] for your should-be-404 url on our sister site, SearchEngineWorld.
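If you would rather check from your own machine, a small script can report the raw status code a should-be-404 URL returns, without silently following redirects; this is a sketch only, and the example URL is hypothetical:

```python
# Sketch: report what status code a "should-be-404" URL really returns.
# A custom error page is fine only if the server still sends a 404
# status; a 302 or a 200 is the trap described above.
import urllib.request
import urllib.error

def status_of(url: str) -> int:
    """Return the HTTP status code for url, reporting redirects as-is."""
    # Returning None from redirect_request makes urllib refuse to follow
    # the redirect, so a 302 surfaces as an HTTPError with code 302.
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None

    opener = urllib.request.build_opener(NoRedirect)
    try:
        with opener.open(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 4xx/5xx (and unfollowed 3xx) arrive here

# Usage (hypothetical URL): status_of("http://example.com/no-such-page")
# should come back 404 on a correctly configured server.
```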
many of my backlinks are www but my site is non-www in G.
Does this mean I am not getting "credit" for my backlinks
If you use a 301 redirect to the no-www urls, the credit should flow right over -- backlinks, anchor text influence etc. If you just resolve both urls with a 200, then you certainly may not be getting the "credit" you hoped for and instead it is being split into two piles. Sometimes the two urls will even show different toolbar PR.
If you see a drastic shift in your ranking, you should have a quick reference at hand to see what things you did recently. I would confine this particular log to major changes - acquired a couple new backlinks? No biggie. Doubled your backlinks in one week? Record that. Log ANY change to your htaccess, mod_rewrite or parallel changes on IIS.
I always suggest looking at changes that YOU'VE made on the site itself before looking for an issue at Google. Yes, sometimes there is a significant shift in the ranking algo and you might be able to respond in some way once you get a clue about what's going on at Google. But if you lost rank and the SERP did not change much in general, then always start at home, where you are most in charge.
Sudden drop in rankings
Look at any new changes that you made to the site--check for SEO Overkill or an Over-Optimization Penalty (OOP).
Analyze your links and make sure you're not participating in a bad neighborhood that just got tanked.
make sure you're not participating in a bad neighborhood
With the way domains get sold and repurposed these days, a regular review of backlinks is important -- and not just to cull the link rot (although that can matter if it grows too large). I've seen the "cleanest of the clean" domains expire and then go to the dark side very quickly. Too many links like that and your domain can definitely suffer.
Make sure you didn't add any new links from other domains hosted on the same class C block of IPs
Do we know this to be an absolute fact, or has it become an accepted truth simply because it gets repeated over and over?
I'm not poking a pointy stick at anyone on this, but I can't recall seeing any authoritative statements that it is a fact.
I've always had difficulty in getting my head around the logic that would do this... the potential for unwarranted collateral damage is huge.
Sorry if this falls into the 'bleeding obvious', but I have found that Google quickly recognises that the pages have been 'updated' and in some cases the rankings have returned.
This doesn't need to be wholesale replacement of content, just a slight tweak here and there - even better if it makes the site more current/interesting to the target audience.
Sorry if you have already tried this - just a suggestion.
Here's one I've seen hurt lots of sites -- make sure your "404" responses REALLY return a 404 HTTP header. Many sites try to redirect their 404 traffic to a custom page or their Home Page. But when they set it up, they end up returning a 302 header or even *shudder* a 200.
I've been suspecting this issue for some time now. I lost three PR 4-5 domains after Jagger for no apparent reason (clean sites, good incoming links and content). Guess what they all have in common: an .htaccess 404 (and 500) redirect to the home page.
Question asked: how do you return a valid 404 while still using a custom 404 page?
Do we know this to be an absolute fact, or has it become an accepted truth simply because it gets repeated over and over?
If you take a small-town hosting company, then it would make sense that the majority of sites in that town would be a) hosted with the local company and b) linked to some of the other sites of businesses in that town. Would a penalty in that situation make sense? No.
Precisely... and I would hazard a guess that a very significant percentage of websites are hosted outside of the big cities and towns and look for on-topic links in their own geographic areas (communities). Hell, isn't that what webmasters and SEO folks tell them to do, not to mention Google emphasising the importance of on-topic links?
Sorry if we seem to be going off topic, but if linking between sites on the same IP block is a no-no, then it needs to be in a checklist; if it is unsubstantiated guesswork, then it should be seen as just that.
If this community has the knowledge, then put it up there for peer scrutiny.
On 12/01, my small biz site (<10 pages) dropped from pages 1-3 (frequently in the first five listings) for many search terms to pages 13-15 for all of those same terms (and is still buried).
I witnessed almost the exact same thing as you in the first week of December. My site went from top rankings with major keywords, to nowhere (well page 10 and up...).
For the site you have seen your drop with, do you recall engaging in a lot of reciprocal links with sites about a year or so ago?
I sometimes think Google is too big for itself, and changes they make are reflected months later now, no longer weeks later.
Lots of scrapers added it too
ErrorDocument 404 /include/404error.shtml
In your root .htaccess file
Make certain that there are no other such directives in any .htaccess files, in the httpd.conf file, or in any configuration files included by the httpd.conf file.
Note in particular that the ErrorDocument doesn't have any [domainname...] in it.
A lot of folks include the [domainname...] part, and that sets up an internal 302 to the page, so Google never ever sees a 404. Thus you get a 302 redirect followed by a 200 Found, and the error page becomes the content (or did) that Google has for the URL -- which is one of the reasons that a 302's content should always be placed into the index at the target page. Use the homepage and you get (or got) duplication of the homepage in the index.
I haven't checked to see if this condition still exists. That is how it works for Apache.
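To illustrate the difference described above (the paths and domain are hypothetical examples; the second directive is the mistake and is shown commented out):

```apache
# Right: site-relative path -- Apache serves the custom page itself,
# and the response keeps its true 404 status.
ErrorDocument 404 /include/404error.shtml

# Wrong: a full URL makes Apache issue a 302 redirect to the page,
# which then returns 200, so a real 404 is never sent.
# ErrorDocument 404 http://www.example.com/include/404error.shtml
```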
I do have a theory that if too high a percentage of your backlinks are from scrapers then your rankings will drop.
I don't think we are penalized for scrapers linking to us but I think they are somehow skewing the Google results. Maybe it's their linking methods or the fact there are just so many of them.
Maybe I'm wrong but I can't help but think the explosion of scraper sites is part of the problem. Too many good well ranked sites or even individual pages have had big problems since there has been so much scraper activity.
This happened to a client of mine. The culprit had copied the home page word for word and posted it on several of their pages focusing on different keywords for each page. The owner of the site had no contact data so we contacted the host. They required a DMCA report. The site disappeared within a week. We also sent a DMCA to Google and sent in a reinclusion request explaining what happened. The PR came back within a few weeks but it took 3 months to get the keyword rank back.