Forum Moderators: mademetop
"Losing" a site can be a painful and frustrating experience. To help ease the pain, perhaps a starting list of potential issues might help. I'll probably miss more than I'm catching with this list, but at least it's a start.
Do a site search at the SE in question to determine if all or some of your pages are gone. Some think their site has vanished when in fact an algo update or tweak has occurred, causing their pages to drop. Or, individual pages have been filtered or penalized, but not entire sites:
If *all* of your pages are gone (search on URL's to check that), then perhaps:
• your server was down at an inopportune time.
• you have a robots.txt problem.
• you've been removed from the index based on a perception of bad behavior (not good).
If only some pages are gone, or if your pages have simply dropped badly in the SERP's, then perhaps:
• you have some other technical issue not noted above (e.g., badly executed redirects),
• the algo changed,
• you've done something recently that the SE did not like, or,
• the algo changed and something that was previously "OK" is now being filtered or penalized.
Here are some specific things to look at:
Start with the basics: Was your server down recently?
Server failure is always a good item to check off your list when searching for problems. No need to start remaking your site if all that happened was a temporary problem.
Are you using a robots.txt file [webmasterworld.com], and if so, has it changed? Is the syntax correct [searchengineworld.com]?
There are a variety of potential problems that can be caused by improper code in robots.txt files, or placement of the robots.txt file in the wrong location. Search WW on this topic if you're not sure what you're doing. Use the WW Server Header Checker [searchengineworld.com]. At worst, a robots.txt file can tell a SE to go away, and you really don't want that. ;-)
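If in doubt, the syntax can also be sanity-checked offline with Python's standard library robots.txt parser. A minimal sketch (the rules and user-agent names below are invented for illustration):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt body. Note the classic mistake in the
# second block: "Disallow: /" tells every other crawler to skip the
# entire site -- exactly the "go away" scenario described above.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot may fetch public pages, but not anything under /private/
print(parser.can_fetch("Googlebot", "/index.html"))     # True
print(parser.can_fetch("Googlebot", "/private/x.html")) # False

# Every other bot is blocked from everything by the wildcard block
print(parser.can_fetch("SomeOtherBot", "/index.html"))  # False
```

If the last line surprises you on your own file, that may be your answer.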
Have you more aggressively optimized recently?
Internal changes that can lead to potential problems include:
• More aggressive kw optimization, e.g., changes to Titles, META's, <Hx> tags, placement and density of kw's, etc.
• Link structure changes, and especially link text changes. Updates to link text or structure, if done for optimization reasons, can push a site into filter/penalty territory. Look in particular for overuse of kw's.
Have you added redirects?
The SE's *can* sometimes become confused by redirects. Assuming the changes are intended to be permanent, use 301's, not 302's. Be especially careful about large-scale changes. Done properly, redirects are important tools. Done without proper knowledge, they can lead to short-term pain, often on the order of 1-6 months.
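For instance, a permanent move on Apache can be declared with mod_alias (a sketch; the filenames and domain are placeholders, and this assumes mod_alias is loaded):

```apache
# 301 (permanent): tells the SE the move is final, so link credit
# should flow to the new URL
Redirect 301 /old-page.html http://www.example.com/new-page.html

# 302 (temporary): tells the SE the old URL will return.
# Avoid this for moves that are actually permanent.
Redirect 302 /holiday-sale.html http://www.example.com/sale.html
```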
Do you have a significant number of interlinking sites?
If ever there was a strategy that might be summed up as: "Here today, gone tomorrow..." interlinking is it. You can succeed with this strategy. But if you add too many sites or links to the mini-net you're creating, or interlink too aggressively, it can catch up to you. Penalties can range from soft filters to complete manual removal in rare cases. Even with no recent changes to your sites, the SE algo's can change, making something that squeaked by yesterday illegal today.
Are you linking to sites in "bad" neighborhoods?
If ever there was a strategy that might be summed up as: "Gone today..." linking to "bad" sites is it. If you think that you might be linking to the dark-side, lose that link instantly, if not sooner.
Could you be suffering from a duplicate content penalty [webmasterworld.com]?
Some practices or occurrences that can cause problems in this regard include:
• Use of a single, site-wide template
• Use of one template across multiple sites
• Competitors stealing or mirroring your content
• Redirects from an old domain to a new one
• Over-reliance on robots.txt files to exclude bots from content areas you don't want exposed
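No one outside the SE's knows exactly how their duplicate detection works, but conceptually it boils down to a text-similarity measure. A toy illustration using Python's difflib (purely illustrative, not any SE's actual method; the page text is invented):

```python
from difflib import SequenceMatcher

# Two near-identical pages (think template-driven content) and one
# genuinely different page
page_a = "Widget tours: 10 days through the mountains, full board, expert guides."
page_b = "Widget tours: 12 days through the mountains, full board, expert guides."
page_c = "A completely different page about blue widget maintenance and repair."

def similarity(a, b):
    # Ratio in [0, 1]; 1.0 means the texts are identical
    return SequenceMatcher(None, a, b).ratio()

print(round(similarity(page_a, page_b), 2))  # high -- near-duplicates
print(round(similarity(page_a, page_c), 2))  # low -- distinct content
```

The point of the checklist above is that templates, mirrors, and scraped copies all push pairs of URLs toward the high end of a measure like this.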
Are you cloaking?
Some cloak merely to deliver "accurate" pictures of sites/pages to the SE's. Examples of this are sites with lots of graphics and little text. But if you're a mainly text-based site that is delivering one set of content to the SE's while users are seeing something less...umm...optimized...then there's always the risk that you've been caught.
Are you using AdWords?
This is pure speculation on the part of some seniors here, but some do seem to firmly believe that if you place highly with an AdWords listing, it might actually hurt your position in the SERP's. Don't shoot me; I'm just the messenger.
IF OTOH, the only issue is that you're not as high in the rankings as you'd like, then a better place to start would be Brett's 26 Steps to 15K a Day [searchengineworld.com].
Best of luck! ;-)
I wrote an article for a well known PR8 (homepage) site. They published it with a link to my site, and Google picked it up. Two weeks later (in October) I posted the same article on my site. Now I come up first and they come up second for search terms specific to that article. It may be that the links to us in the original article helped; I don't know. Just sharing my experience in case anyone else is interested in whether there are exceptions to the dup content rules.
I can A/B test a certain pharma affiliate banner link over and over and see the same result: put the link in and disappear. Take it out and pop back up in the SERPs.
Ditto for AdBright on one particular page. Not sure yet why, but put it in and lose the page; take it out and the page comes back. The page has AdSense showing alt ads only.
Are you a victim of (badly constructed or malicious) redirects to your site from external sites?
A widely discussed, multi-dimensional problem: [webmasterworld.com...]
Have you been VERY aggressive in adding external links to your site?
WARNING: Potentially controversial.
I believe that we’ve seen strong evidence of aggressive link building campaigns leading to G dropping sites/pages way down in the SERP’s.
In one case, a competitor’s site that was in the midst of a very aggressive link building program (and no other discernible changes) vanished from G’s visible SERP’s...after climbing dramatically for a short period. Since then, some of the site’s pages have returned, but only way down in the rankings. The site continues to benefit from its link building program in Y’s SERP’s. We can find no other reason to explain its fate, and we've heard through the grapevine that they've concluded the same.
1) “Unnatural growth” filters are now in place (a G-only phenomenon).
2) The less overall strength/quality a site exhibits, the more likely it is that ‘unnatural behavior’ - perceived by G as too much SEO - will lead to negative consequences.
3) Since the Florida update, sites have pretty obviously needed to clear certain 'hurdles' (quality indicators) to do well in G's SERP's. Those hurdles recently got higher.
4) The better a site is at clearing hurdles, the less likely it is to be filtered for unnatural activity, to the point that some sites are almost bullet proof.
Have you been NOT aggressive enough in building links?
We recently saw the first notable changes in both Y and G that we’ve seen in a while. The G changes led to significant shifts in some SERP’s. Sites with an insufficient number of quality backlinks dropped substantially. Lazy webmasters repent! (I have.)
Another instance of what we now see as G’s ‘higher hurdles’ philosophy, where the hurdles just got higher.
Have you been VERY aggressive in adding pages to your site?
WARNING: Very controversial, and quite possibly wrong, but I'll include it anyway since this thread is a checklist, and some believe this hypothesis may have merit.
Background: Several months ago, two of our own sites got badly hurt in the G SERP’s after adding a large amount (+35-40%) of new pages all at once. These sites were the only two of ours that had large amounts of pages added simultaneously, and the only two that got hit within weeks. I might have chalked it up to coincidence, but a thread on the subject [webmasterworld.com] seemed to suggest that others had similar experiences.
This is admittedly unproven. And without question, other webmasters have confirmed that they added large amounts of pages with no ill effect. But, it is our belief as noted above that much of how G operates these days has to do with ‘hurdles,’ and it is possible that those sites to which we added many new pages at once did not have a sufficiently large number of backlinks to avoid being filtered for perceived ‘unnatural growth,’ whereas others who added pages in bulk (and had no problem) enjoyed the advantage of a much larger number of backlinks - and that is what kept them above water.
An alternate explanation is that this issue could have had something to do with dup filters (i.e., the addition of those pages put too many pages on the site that looked like other pages). However, I doubt this, since it was mainly the page structures that were similar, not the text/content. And I'm not sure why this would have taken the entire site(s) down.
One more reason that this point is controversial: Some new sites are clearing the sandbox by employing linking strategies that are massively unnatural. Their success however, could be a function of having cleared some hurdles by *so much* that the site becomes nearly immune to filters. :-)
Bottom line: Enough webmasters have posted, stickied or emailed on this to make me think it is at least a possibility.
Our overall conclusions: It has never been more important to follow Brett's rules for site building [searchengineworld.com] (what else is new?)...or, to learn how to blog spam (not recommended). :-)
What a lot of it might point to, both internally within sites and with inbound links, is looking carefully at percentages of identical anchor text.
Also, if sites drop way down in spite of having lots of IBLs, particularly if some are sitewides, look at what portion of the page those links reside on.
It bears repeating, and even more so now - in fact, it's critical now. If there's more than one domain name pointing to a site, make double sure they don't all return a 200. With all the page scraping going on, all it takes is one scraper picking up that secondary domain name and linking to it. That link CAN appear on the same scraped page as the link to the *real* domain name, and then both domains will be identified and crawled, each serving identical content.
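One common fix is to 301 every secondary hostname to the canonical one, so only one domain ever answers with a 200. A mod_rewrite sketch for Apache (the domain names are placeholders; assumes mod_rewrite is enabled):

```apache
RewriteEngine On
# Any request not arriving as www.example.com gets a permanent
# redirect to the canonical hostname, preserving the requested path
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With this in place, a scraper linking to the secondary domain just hands its credit to the real one.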
Yeah I think that this may be an overlooked issue for many. Doesn't apply to the examples I noted above (for our sites, or for the competitive example cited), but we know it's an issue, especially since Florida. I vaguely alluded to it in the very first post I made in this thread ("Have you more aggressively optimized recently? -> Link structure changes, and especially link text changes"), but it is worthy of more specific attention.
At this point we frequently keep coming back to the mantra of avoiding unnatural activity: Too much of any identical thing, including identical kw anchor links, can be a problem (particularly the external ones, since it's obviously *not* organic if tons of sites suddenly link to you within a period of just a few weeks, using identical anchor text). Doh!
IMO, this applies to a lot of SEO elements, at least as far as G is concerned. Personally, for example, I'd never use the same exact kw phrase in: Title, beginning of main META's, first text on page, and <H1/2/3> . But that's just me. Also, some of this has less to do with dropped sites than with poorly performing sites. ;-)
Just a quick question regarding duplication (or perceived duplicaton in this instance).
I run a travel website with hundreds (and growing) of itinerary pages. In other words, pages that look very similar in shape and form (I purposely wanted to keep all pages similar for ease of reading, and especially PRINTING), but each page is a separate, totally different itinerary. Some are 2 days and others are 40 days long.
Surely, if the above threads are correct, this will be seen as duplication!? I am not doing well in the SE's at all - ESPECIALLY G. MSN and Yahoo, on the other hand, are acceptable.
Could the scenario described above be the problem? And if so, how on earth can I change it? I cannot, for each and every new itinerary we create, make a page that looks much different from the others.
Please, if anyone can share some thoughts it will be great.
Let's try to confine the posts in this thread to general issues that can lead to sites being dropped. Specifics are best used when reinforcing or clarifying points being made.
Requests for individual help, or questions related to specific situations, probably belong in other threads. Focus helps keep each thread in WW sharper and easier to digest. :-)
It's all speculation, speculation, speculation.
Yes it is! And very well educated speculation backed up by tons of experience at that!
Great thread, guys. One of the best in a very long time, with loads of meat and potatoes for all to enjoy. Keep them coming and let's stay on track!
Google "down grades" the second page found that matches a prior page found. That just means it will be buried so deep that you will be lucky to find it under its own URL.
Not to nit pick Brett, but I had content "borrowed" by our own Tourist Board. Their page beat mine using my own original copy because their page has way more inbound links than my page does. Where I was ranked #1, the tourist board is now #1 and I am now #2.
So, although there is duplicate/borrowed content and all things being equal ... PR wins. At least, that's how I see it in this one case.
other webmasters have confirmed that they added large amounts of pages with no ill effect. But, it is our belief as noted above that much of how G operates these days has to do with ‘hurdles,’ and it is possible that those sites to which we added many new pages at once did not have a sufficiently large number of backlinks to avoid being filtered for perceived ‘unnatural growth,’ whereas others who added pages in bulk (and had no problem) enjoyed the advantage of a much larger number of backlinks - and that is what kept them above water.
This seems like a very good guess, and would account for the different results you've seen reported. I think Google may have added a couple of new flags to their algo over the last year, and this could be one of them.