"Losing" a site can be a painful and frustrating experience. To help ease the pain, perhaps a starting list of potential issues might help. I'll probably miss more than I'm catching with this list, but at least it's a start.
Do a site search at the SE in question to determine if all or some of your pages are gone. Some think that their site has vanished, when in fact an algo update or tweak has occurred, causing their pages to drop. Or, individual pages have been filtered or penalized, but not entire sites:
If *all* of your pages are gone (search on URL's to check that), then perhaps:
• your server was down at an inopportune time.
• you have a robots.txt problem.
• you've been removed from the index based on a perception of bad behavior (not good).
If only some pages are gone, or if your pages have simply dropped badly in the SERP's, then perhaps:
• you have some other technical issue not noted above (e.g., badly executed redirects),
• the algo changed,
• you've done something recently that the SE did not like, or,
• the algo changed and something that was previously "OK" is now being filtered or penalized.
Here are some specific things to look at:
Start with the basics: Was your server down recently?
Server failure is always a good item to check off your list when searching for problems. No need to start remaking your site if all that happened was a temporary problem.
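One quick way to run that check is to fetch a page and look at the HTTP status code. A minimal sketch, assuming Python is available — the URL is a placeholder and the advice strings are my own mapping onto the cases above, not anything from the post:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def fetch_status(url):
    """Return the HTTP status code for url, or None if unreachable."""
    try:
        req = Request(url, headers={"User-Agent": "availability-check"})
        with urlopen(req, timeout=10) as resp:
            return resp.status
    except HTTPError as e:
        return e.code
    except URLError:
        return None

def classify(status):
    """Map a status code onto the troubleshooting cases above."""
    if status is None:
        return "unreachable - check your hosting first"
    if 200 <= status < 300:
        return "server OK - the problem is likely elsewhere"
    if status in (301, 302):
        return "redirecting - make sure it's intentional (and a 301)"
    return "error response - fix the server before worrying about the SE"

# print(classify(fetch_status("http://example.com/")))  # substitute your own URL
```

If the server answers with anything other than a 200 when the bot visits, that alone can explain missing pages.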
Are you using a robots.txt file [webmasterworld.com], and if so, has it changed? Is the syntax correct [searchengineworld.com]?
There are a variety of potential problems that can be caused by improper code in robots.txt files, or placement of the robots.txt file in the wrong location. Search WW on this topic if you're not sure what you're doing. Use the WW Server Header Checker [searchengineworld.com]. At worst, a robots.txt file can tell a SE to go away, and you really don't want that. ;-)
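For the syntax side, Python's standard library happens to include a robots.txt parser, so you can sanity-check a file before it costs you. A small sketch — the file contents below are a made-up example of exactly the worst case mentioned above:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt illustrating the worst case: "Disallow: /"
# under "User-agent: *" tells every well-behaved crawler to go away.
bad_robots = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(bad_robots.splitlines())

# If this prints False for your own pages, the SE's are being told to leave.
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))  # False
```

Run the same check with your real robots.txt contents and the bots you care about.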
Have you more aggressively optimized recently?
Internal changes that can lead to potential problems include:
• More aggressive kw optimization, e.g., changes to Titles, META's, <Hx> tags, placement and density of kw's, etc.
• Link structure changes, and especially link text changes. Updates to link text or structure, if done for optimization reasons, can push a site into filter/penalty territory. Look in particular for overuse of kw's.
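There's no published threshold for "overuse," but you can at least measure what you're doing before and after a change. A rough density check — the sample text and the word-splitting rule are mine, not anything the SE's have confirmed:

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in text that are the given keyword (very rough)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

sample = "cheap widgets cheap widgets buy cheap widgets today cheap"
print(f"{keyword_density(sample, 'cheap'):.0%}")  # 44%
```

If a single kw is a large fraction of your copy, titles, and link text all at once, that's the kind of pattern to dial back.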
Have you added redirects?
The SE's *can* sometimes become confused by redirects. Assuming that the changes are intended to be permanent, use 301's, not 302's. Be especially careful about large-scale changes. If done properly, redirects are important tools. Done without proper knowledge, they can lead to short-term pain, often on the order of 1-6 months.
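One way to catch the 302-where-you-meant-301 mistake is to check the status code your server actually returns for a moved page (e.g., with the Server Header Checker mentioned earlier). A hedged sketch — the advice strings are my own wording:

```python
from http.client import responses  # standard table of HTTP status names

def audit_redirect(status, permanent_intended=True):
    """Flag the redirect pitfalls discussed above."""
    name = responses.get(status, "unknown")
    if status == 301:
        return f"301 {name}: correct for a permanent move"
    if status == 302 and permanent_intended:
        return f"302 {name}: switch this to a 301 for a permanent move"
    return f"{status} {name}: not a permanent redirect"

print(audit_redirect(302))  # 302 Found: switch this to a 301 for a permanent move
```

Feed it the status each old URL returns and you'll spot accidental 302's quickly.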
Do you have a significant number of interlinking sites?
If ever there was a strategy that might be summed up as: "Here today, gone tomorrow..." interlinking is it. You can succeed with this strategy. But if you add too many sites or links to the mini-net you're creating, or interlink too aggressively, it can catch up to you. Penalties can range from soft filters to complete manual removal in rare cases. Even with no recent changes to your sites, the SE algo's can change, making something that squeaked by yesterday illegal today.
Are you linking to sites in "bad" neighborhoods?
If ever there was a strategy that might be summed up as: "Gone today..." linking to "bad" sites is it. If you think that you might be linking to the dark-side, lose that link instantly, if not sooner.
Could you be suffering from a duplicate content penalty [webmasterworld.com]?
Some practices or occurrences that can cause problems in this regard include:
• Use of a single, site-wide template
• Use of one template across multiple sites
• Competitors stealing or mirroring your content
• Redirects from an old domain to a new one
• Over-reliance on robots.txt files to exclude bots from content areas you don't want exposed.
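To check whether two of your pages (or a competitor's copy) look identical once the template chrome is stripped away, a crude fingerprint can help. This is only an illustrative stand-in — nobody outside the SE's knows how their duplicate detection actually works, and the tag-stripping regex and sample pages here are mine:

```python
import hashlib
import re

def content_fingerprint(html):
    """Strip tags and whitespace, then hash what's left. Two pages with the
    same fingerprint carry identical visible text - a rough stand-in for
    the duplicate detection discussed above."""
    text = re.sub(r"<[^>]+>", " ", html)           # drop markup (the template)
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.md5(text.encode()).hexdigest()

page_a = "<html><body><h1>Widgets</h1><p>Buy our widgets.</p></body></html>"
page_b = "<html><body class='v2'><h1>Widgets</h1>  <p>Buy our widgets.</p></body></html>"
print(content_fingerprint(page_a) == content_fingerprint(page_b))  # True
```

Different templates, same fingerprint: the visible content is a duplicate even though the markup differs.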
Are you cloaking?
Some cloak merely to deliver "accurate" pictures of sites/pages to the SE's. Examples of this are sites with lots of graphics and little text. But if you're a mainly text based site that is delivering one set of content to the SE's while users are seeing something less...umm...optimized...then there's always the risk that you've been caught.
Are you using AdWords?
This is pure speculation on the part of some seniors here, but some do seem to firmly believe that if you place highly with an Adwords listing, it might actually hurt your position in the SERP's. Don't shoot me. I'm just the messenger.
If, OTOH, the only issue is that you're not as high in the rankings as you'd like, then a better place to start would be Brett's 26 Steps to 15K a Day [searchengineworld.com].
Best of luck! ;-)
Couple of points (without trying to nitpick):
A commonly spread misconception is that if Google detects duplicate content, it will boot one out of the index. That is not the way it works.
Google "down grades" the second page found that matches a prior page found. That just means, it will be buried so deep, that you will be lucky to find it under its own url.
Well, about 6 months back a site I optimized was doing well!
Then the company got carried away and created more!
They then decided to link to each other with similar content, for example mywebsite.com and mywebsite.net, etc.
Suddenly they vanished! (no surprise to me)
I then go to work on their site again, new contract :)
After a couple of months the site started to come back, but not by much. I was confused, so I checked by copying a block of text and searching on it in Google, and bang! I found another 2 web sites they had but didn't tell me about.
So I got them to take those down. Now, from 5 sites down to 1, I checked all the external links within the site and got them down from 15 to the 2 which were worth linking to.
Within another month they were back on the 1st page of Google.
When sites disappear it can be like a needle in a haystack. You have to have a checklist, and caveman's is a good one!
Just became a preferred member, off out tonight drinks on me ;)
[edited by: lasko at 3:04 pm (utc) on May 19, 2004]
Google "down grades" the second page found that matches a prior page found
Do you think it's simply a matter of when the bot finds the duplicate? Because I can think of a scenario where a copy created after the original is hit by the bot before the original. In that case, the original would be downgraded.
I have one question though: if you have made changes in the kw density for optimization, what kind of parameters are used to determine "overuse of kw's"?
"• Link structure changes, and especially link text changes. Updates to link text or structure, if done for optimization reasons, can push a site into filter/penalty territory. Look in particular for overuse of kw's."
caveman, you're gonna get a lot of gratitude from new members in the future!
[edited by: troels_nybo_nielsen at 8:18 pm (utc) on May 19, 2004]
Realdude, if you search on the exact URL of a page at most SE's and cannot find your page that way, you can be pretty sure that the page is not in the index. Another method is to search on a very unique string of text from a page, which has the advantage of also uncovering content duplication at other sites.
>>• Use of a single, site-wide template
I've never understood what the problem is with this one.
For my site I use the same HTML file for almost all of the 4000+ pages. When I add new content I just update the H1 and add the content to the <p>'s.
I don't have the skill or interest to do design; I just put the content up.
Isn't a uniform look something that users like? It reminds them of which site they're on, makes navigation easier, etc.
If your site is completely missing for no reason:
- Is your site in competition with Yahoo's commerce?
- Did you fail to pay for XML feeds?
- Did you do anything unreasonable such as launch a website, add a page, remove a page, place content or ads on the website?
- Do your total number of pages add up to an even number, not an odd?
- Did you walk under a ladder on your way to work?