"Losing" a site can be a painful and frustrating experience. To help ease the pain, perhaps a starting list of potential issues might help. I'll probably miss more than I'm catching with this list, but at least it's a start.
Do a site search at the SE in question to determine if all or some of your pages are gone. Some think that their site has vanished, when in fact an algo update or tweak has occurred, causing their pages to drop. Or, individual pages have been filtered or penalized, but not entire sites:
If *all* of your pages are gone (search on URL's to check that), then perhaps:
your server was down at an inopportune time.
you have a robots.txt problem.
you've been removed from the index based on a perception of bad behavior (not good).
If only some pages are gone, or if your pages have simply dropped badly in the SERP's, then perhaps:
you have some other technical issue not noted above (e.g., badly executed redirects),
the algo changed,
you've done something recently that the SE did not like, or,
the algo changed and something that was previously "OK" is now being filtered or penalized.
Here are some specific things to look at:
Start with the basics: Was your server down recently?
Server failure is always a good item to check off your list when searching for problems. No need to start remaking your site if all that happened was a temporary problem.
Are you using a robots.txt file [webmasterworld.com], and if so, has it changed? Is the syntax correct [searchengineworld.com]?
There are a variety of potential problems that can be caused by improper code in robots.txt files, or placement of the robots.txt file in the wrong location. Search WW on this topic if you're not sure what you're doing. Use the WW Server Header Checker [searchengineworld.com]. At worst, a robots.txt file can tell a SE to go away, and you really don't want that. ;-)
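As a quick sanity check, Python's standard library can parse a robots.txt and tell you exactly what a rule blocks. The file contents below are a made-up example, not anyone's real robots.txt:

```python
from urllib import robotparser

# Hypothetical robots.txt contents; a stray "Disallow: /" under
# "User-agent: *" would tell every well-behaved bot to skip the whole site.
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "/index.html"))      # True: crawlable
print(rp.can_fetch("*", "/private/a.html"))  # False: blocked
```

If the first check comes back False for your homepage, you've found your problem.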
Have you more aggressively optimized recently?
Internal changes that can lead to potential problems include:
More aggressive kw optimization, e.g., changes to Titles, META's, <Hx> tags, placement and density of kw's, etc.
Link structure changes, and especially link text changes. Updates to link text or structure, if done for optimization reasons, can push a site into filter/penalty territory. Look in particular for overuse of kw's.
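For a rough feel of "too aggressive," you can compute a crude keyword density. This is only an illustrative proxy (the sentence and phrase are invented, and real engines weigh placement, markup, and links far more subtly than a word count):

```python
def keyword_density(text, phrase):
    """Share of words in `text` that belong to `phrase` -- a crude proxy only."""
    phrase_words = set(phrase.lower().split())
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in phrase_words)
    return hits / len(words)

# A hypothetical, over-stuffed sentence: 4 of its 7 words are the target phrase.
d = keyword_density("cheap widgets and more cheap widgets here", "cheap widgets")
print(round(d, 2))  # 0.57
```

There's no magic threshold; the point is to notice when a page reads like the example above.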
Have you added redirects?
The SE's *can* sometimes become confused by redirects. Assuming that the changes are intended to be permanent, use 301's, not 302's. Be especially careful about large-scale changes. If done properly, redirects are important tools. Done without proper knowledge, they can lead to short-term pain, often on the order of 1-6 months.
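A minimal sketch of the 301/302 distinction a crawler cares about (status-code semantics only; the moral is just that permanent moves should announce themselves as permanent):

```python
def redirect_kind(status):
    """Classify an HTTP redirect status the way a crawler might read it."""
    if status == 301:
        return "permanent"   # old URL is gone for good; index the new one
    if status in (302, 307):
        return "temporary"   # old URL may stay indexed; for short-lived moves only
    return "not a redirect"

print(redirect_kind(301))  # permanent
print(redirect_kind(302))  # temporary
```

You can check what your server actually returns with the WW Server Header Checker mentioned above.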
Do you have a significant number of interlinking sites?
If ever there was a strategy that might be summed up as: "Here today, gone tomorrow..." interlinking is it. You can succeed with this strategy. But if you add too many sites or links to the mini-net you're creating, or interlink too aggressively, it can catch up to you. Penalties can range from soft filters to complete manual removal in rare cases. Even with no recent changes to your sites, the SE algo's can change, making something that squeaked by yesterday illegal today.
Are you linking to sites in "bad" neighborhoods?
If ever there was a strategy that might be summed up as: "Gone today..." linking to "bad" sites is it. If you think that you might be linking to the dark-side, lose that link instantly, if not sooner.
Could you be suffering from a duplicate content penalty [webmasterworld.com]?
Some practices or occurrences that can cause problems in this regard include:
Use of a single, site-wide template
Use of one template across multiple sites
Competitors stealing or mirroring your content
Redirects from an old domain to a new one
Over-reliance on robots.txt files to exclude bots from content areas you don't want exposed.
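One hedged way to gauge how "duplicate" two pages look is word-shingle overlap, a simplified version of the near-duplicate detection described in the research literature. The sample pages are fabricated:

```python
def shingles(text, k=3):
    """Set of k-word shingles from a page's visible text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b):
    """Jaccard overlap of two pages' shingle sets (0.0 disjoint, 1.0 identical)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "widgets for sale cheap widgets shipped fast"
page_b = "widgets for sale cheap widgets delivered fast"
print(round(similarity(page_a, page_b), 2))  # 0.43
```

Nobody outside the SE's knows their exact thresholds, but pages that score very high against each other are worth a second look.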
Are you cloaking?
Some cloak merely to deliver "accurate" pictures of sites/pages to the SE's. Examples of this are sites with lots of graphics and little text. But if you're a mainly text based site that is delivering one set of content to the SE's while users are seeing something less...umm...optimized...then there's always the risk that you've been caught.
Are you using AdWords?
This is pure speculation on the part of some seniors here, but some do seem to firmly believe that if you place highly with an AdWords listing, it might actually hurt your position in the SERP's. Don't shoot me. I'm just the messenger.
If, OTOH, the only issue is that you're not as high in the rankings as you'd like, then a better place to start would be Brett's 26 Steps to 15K a Day [searchengineworld.com].
Best of luck! ;-)
You may want to dig through some of the recent discussion in the Yahoo Forum [webmasterworld.com] to learn more about their algorithm and what works well there. It's definitely a different beast than Google's algorithm. Good luck.
I have a fairly new site that we had gotten up to a PR5. We then had a ton of server problems causing our site to be down about 20% of the time. We dropped to a PR4.
It took us about a month to move it to a new server which we have been on for two weeks with no issues.
Do you have any idea how long it should take before Google give us our PR5 back?
Thanks a million! This thread was very helpful for me.
But questions about Google are best asked in Google News [webmasterworld.com] where you will find the real experts in that field.
(Sorry for being slow at adding this)
Let's try to keep this thread on topic (e.g., by adding to the list of things that might be the cause of a site disappearing from the SERP's at a SE). The point is to have a solid checklist of potential issues to start investigating, if and when your site suddenly seems to vanish.
I know I've been out of the loop for a couple of months, but aren't the above the basic tenets of SEO? Or have the rules suddenly been turned on their heads - have I really missed something recently?!? Isn't the first step to optimise the content of your pages for the keywords, and the second step to get incoming links with the same anchor text to those pages?
I hope this helps you guys.
Also, if you have two sites targeting the same keywords on the same server, only one of them ranks, even if the other has higher PR.
This is exactly one of the things I was thinking about, even if you're not targeting the same words, but the sites are related in subject matter and tend to run in the same neighborhoods.
caveman - I'm eager to see what you post on the IP number issue. It's a topic that might also need its own thread.
Incidentally, when your server is down, Google will try again before dropping you. I don't know what kind of time window you have, but GoogleGuy has said this explicitly on the forums here, and Matt Cutts has said it at SES.
Meta Refresh leads to ...
... Replacement of the target URL!
There's also a thread about this on Yahoo. Same deal.
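If you suspect a meta refresh is the culprit, the standard-library HTML parser can flag the tag and its target. The sample page below is hypothetical:

```python
from html.parser import HTMLParser

class MetaRefreshFinder(HTMLParser):
    """Collect the target URLs of any <meta http-equiv="refresh"> tags."""
    def __init__(self):
        super().__init__()
        self.targets = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("http-equiv", "").lower() == "refresh":
            content = a.get("content", "")
            if "url=" in content.lower():
                # content looks like "0; url=http://example.com/new-page"
                self.targets.append(content.split("=", 1)[1])

page = ('<html><head><meta http-equiv="refresh" '
        'content="0; url=http://example.com/new"></head></html>')
finder = MetaRefreshFinder()
finder.feed(page)
print(finder.targets)  # ['http://example.com/new']
```

A non-empty list on a page you meant to rank is a red flag: the SE may index the target URL instead.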
Do You Have Hundreds of Domains on the Same IP?
Some of the items in this list have to do with degree. "The IP question" is one of those. Is having a bunch of sites on a single IP bad? No. But what if all those sites are on the same topic? What if they are all interlinked? Point being, if a major SE smells a spammy network, then 340 interlinked sites on a single C block are a strong bit of evidence that something spammy this way comes.
There's another piece too, having to do with static versus virtual IP's. Some believe, as I do from experience, that SE's handle static IP's better than virtual IP's. However, you *will* hear arguments to the contrary. My personal bottom line: It's all a matter of risk/reward. How much is that static IP worth to you?
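Whether a set of sites shares a C block is easy to check from the dotted-quad addresses themselves. The IPs below are from the documentation ranges, purely illustrative:

```python
from collections import Counter

def c_block(ip):
    """First three octets of an IPv4 address -- the 'C block' discussed above."""
    return ".".join(ip.split(".")[:3])

# Hypothetical IPs for a handful of sites you control.
site_ips = ["192.0.2.10", "192.0.2.77", "198.51.100.5"]
counts = Counter(c_block(ip) for ip in site_ips)
print(counts.most_common())  # [('192.0.2', 2), ('198.51.100', 1)]
```

A count of two is nothing; hundreds of interlinked, same-topic sites piling up in one block is the pattern described above.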
I've heard by word of mouth that the same whois data may have triggered some drops on other sites, but my guess is that, on those particular sites, there may have been other factors that prompted the inspection.
In the case of IP's for example, I don't believe that just having 10 sites on an IP or in the same C block, even if they are all linked, necessarily gets you booted. In fact, I know it doesn't. We run a network, try hard to play by the rules, and so far, so good. ;-)
IP-related site problems, in my view at least, are often intertwined with other potential issues. For example, as I said, I know you can have a network of 10 sites, and link them. But what if there's a lot of common text across all 10 sites, or a single template across all 10, or the content is too similar, etc. So much is co-dependent, at least WRT a site dropping suddenly from the SERP's.
In the case of IP's (site ownership, really), any of these could be red flags to a SE:
too many on a C block (I can't define "too many")
too many on a virtual IP
too many with the same WHOIS info
having a site on a C block where a major spammer has been identified
etc., etc., etc.
It all boils down to this: If an SE sees a pattern that leads to a conclusion that someone is trying to artificially inflate PR or otherwise 'game the system,' then that someone *may* be in trouble. Going back to the premise of the first post, when the issue is that a Webmaster has just awakened to find a site missing, what does s/he begin to consider? Certainly, IP and ownership data are among the things that should be on that list.
Doing the detective work to identify the problem with a high degree of confidence is another matter. And so is evaluating the changes one makes to address problems, and drawing conclusions about the effects of those changes.
this is pure speculation. there is no evidence whatsoever that these are true.
Use of a single, site-wide template
The majority of sites I've seen use site-wide templates... usually one for the home page and one for the rest. WebmasterWorld uses a basic template for all its forum posts. So, I'm not sure what you mean by "template," but I'm guessing it must be something different than the kind of template I'm thinking about.
No, I'm afraid it certainly is not pure speculation. But that's irrelevant to us who are singing along here, since it's a "check-list."
>>there is no evidence whatsoever that these are true.
There most certainly is evidence that it *can* be true, beyond a shadow of a doubt!
Any and all things that are and have been mentioned in research and academic documents as having bearing on the particular algorithm being discussed, unless they're proprietary and/or patented, can most certainly apply to and be used by any and all search engines in their scoring.
There is a big difference between nay-saying and doubt-mongering, and sincere investigation into possibilities. A world of difference!
Well, it seems that the site-wide templates point has generated a reaction.
Obviously, most sites need to operate with templates. Not only do templates make site creation and maintenance easier, they typically enhance the user experience.
Generally speaking, the issue with templates has to do with churning out too many pages without adding much real value to a site. I could be mistaken, but I believe that this problem has accelerated over the past six months, possibly as a response to Florida and subsequent updates. And I don't think the problem is confined to larger sites...some smaller ones are feeling it too.
As Marcia and others have noted elsewhere in WW, the major SE's have been pretty clear about their feelings that duplicate content and pagemill sites have gotten out of control.
Unfortunately, what may also be happening is that some very legitimate sites are being caught in the nets.
<Come to think of it, pretty soon Brett is going to have to start banning all these new "What happened to my site..." threads, just to avoid a duplicate content penalty. Geez, ya can't win for losin'! ;) >
Yes they sure do. But they don't necessarily need to be linking back to the homepage from the top heading graphic with the main keyword phrase in the alt attribute, plus that same keyword phrase in the alt attribute of a graphic or anchor text in a side navigation column, plus that same keyword phrase as the anchor text of bottom navigation - all linking back to the homepage.
I've seen several sites having some problems with the top heading graphic done that way. In some cases that alt text plus the same text on top sitewide are *all* that shows in the description snippets for all pages on the site.
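A rough way to spot that pattern on your own pages: count how many homepage links carry the same phrase. The HTML below is a fabricated example, and a real audit would use a proper parser rather than a regex:

```python
import re

page = '''
<a href="/"><img src="logo.gif" alt="blue widgets"></a>
<a href="/"><img src="nav.gif" alt="blue widgets"></a>
<a href="/">blue widgets</a>
'''

phrase = "blue widgets"
home_links = re.findall(r'<a href="/">.*?</a>', page, re.S)
repeats = sum(1 for link in home_links if phrase in link)
print(len(home_links), repeats)  # 3 3
```

Three sitewide homepage links all carrying the identical phrase, multiplied across every page, is exactly the kind of repetition described above.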
Conclusive? No. Worth looking into if there's a problem? Yes. That's what check-lists are for.