Hello Herb, and welcome to the forums.
A couple of things may help. First, that kind of penalty is often related to gaming backlinks - something the earlier SEO may have done. Since Google does not usually show a full backlink report, I'd suggest seeing what you can learn through Yahoo! Site Explorer.
Second, you may want to have a look through the Hot Topics [webmasterworld.com], which is always pinned to the top of this forum's index page. There is a lot of meat in there about various kinds of problems and how people have resolved them.
Thanks for the warm welcome. I have been a reader since this site started, but have now found a good reason to post.
I checked Yahoo! and found about 7 inbound links so far that need attention. Yahoo! shows 600+ inlinks pointing to my homepage, and these 7 are among them. I suspect that when I look for links to pages other than my homepage I may find a whole new batch. I hope not - and if I don't, it seems horrid that a penalty would be applied based on 1% of a site's inbound links.
5 are gaming-style sites, 1 is a lingerie site and 1 is an enlarge-your-willy site.
My problem has been contacting these folks and asking for removal. About 4 of the 7 offer no way to contact them: either the domain registration is shielded through the host, the email bounces, or there is some other WALL. I have sent requests to the rest, but I doubt I will get very far.
This is a crappy situation. Some of these sites carry links that neither I nor the company ever requested - it's almost as if someone else requested the link using our name. They use a form for adding and modifying the links, and the modify link is broken, of course. (What a great way to hurt your competition.)
This is the main flaw with attaching a penalty to a site based on inbound links. One guy with an issue can damage a site.
If I can't get these links pulled, does that mean the penalty will stay in place? Does that seem fair?
What do you folks think about another re-inclusion request to Google, pointing out the links and explaining that I have tried to get them removed but have not had luck? Do you think they would actually do something about it? Is there any way to verify this is the penalty causing my trouble?
Very frustrated.. (But happy to have you folks helping me out.)
Herb (pulling hair out)
[edited by: tedster at 5:25 pm (utc) on Aug. 1, 2008]
A couple of things jump out at me:
|and shows 1100 indexed urls via the sitemap, but when I do a site: search google is only indexing about 400 of the 1100 pages. |
Have a look at the pages that don't show in the index. If only 400 of 1100 pages are indexed, roughly 700 pages - about 64% of your site - are missing. If these pages have no TBPR then it's possible they have insufficient link juice.
They may also have insufficient unique content. If 64% of the site is considered of no value, that is enough to suck the site out of the results, and may act like a penalty.
|we had some duplicate domain and content issues, but those were resolved in April of this year |
Not all duplicate content is obvious to the eye. WMT's recent [ albeit flawed ] reporting may give you some clues.
There's one thing I've noticed over the years when Google returns sites to the SERPs with regard to duplicate content: it seems to trawl the whole site, score it, and then reposition it out of the results if it continues to have duplicate content issues.
So I'm not sure if "retiring old penalties" applies to duplicate content problems.
Thanks for the response. When I first noticed the indexing problem I found that only 378 pages of my site were in the index. I compared my meta descriptions and at that time found that SEVERAL were duplicates. I ran the numbers and found that I had exactly 378 unique meta descriptions.

That same week the WMT content analysis caught up to the new cart and host and showed 450+ duplicate descriptions. I found that this was only a small chunk of the duplicates on the site; the actual number was close to 900. I then corrected the error and made sure that each of the product pages had a unique meta description.

Since then the WMT content analysis has almost caught up to my changes and today reports about 20+ duplicate meta descriptions over about 50 pages. These are mostly category pages that take more than one page to list the products - index.html and index2.html are the same category, just the second page of products. I am working to fix that, but it's less than 5% of the whole site.
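For anyone wanting to run this kind of audit themselves, here is a minimal sketch of how duplicate meta descriptions could be flagged across a set of saved pages. The URLs and page snippets below are made up for illustration, and a real crawl should use a proper HTML parser rather than a regex.

```python
# Sketch: find duplicate meta descriptions across a set of HTML pages.
# The sample pages are invented; a regex is used only to keep this short.
import re
from collections import defaultdict

META_RE = re.compile(
    r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
    re.IGNORECASE | re.DOTALL,
)

def meta_description(html):
    """Return the page's meta description, or None if absent."""
    m = META_RE.search(html)
    return m.group(1).strip() if m else None

def duplicate_descriptions(pages):
    """Map each meta description to the URLs that share it,
    keeping only descriptions used by more than one page."""
    seen = defaultdict(list)
    for url, html in pages.items():
        desc = meta_description(html)
        if desc:
            seen[desc].append(url)
    return {d: urls for d, urls in seen.items() if len(urls) > 1}

pages = {
    "/widgets/red.html": '<meta name="description" content="Buy widgets online">',
    "/widgets/blue.html": '<meta name="description" content="Buy widgets online">',
    "/about.html": '<meta name="description" content="About our company">',
}
print(duplicate_descriptions(pages))
```

Running the report against every crawled URL (not just the homepage's children) is what surfaces the paginated-category duplicates described above.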
My other duplicate content issues revolved around other domains redirecting to the main domain, yet Google crawled and indexed them. It took some time to clear that, but I was able to add the domains to WMT, kill the forwarding and ask to have the domains removed. After a few weeks they were gone from the index. I had a post at the Google help groups about it.
As to PR, this last PR update has started to give my category pages PR. Last week they had the grey bar; now they float between 2 and 3, mostly 2. My sub-categories still have the grey bar, as do many products.
This is a good site, it sucks if we have been penalized based on links pointing to us. Links that we did not request.
The real shame is that I have worked HARD over the last 18 months and gotten nowhere, and may be in a position where I won't be able to correct this problem if it's based on inbound links. It's very frustrating.
[edited by: tedster at 11:41 pm (utc) on July 31, 2008]
[edit reason] removed specifics [/edit]
|The real shame is that I have worked HARD over the last 18 months and gotten nowhere, and may be in a position where I won't be able to correct this problem if it's based on inbound links. It's very frustrating |
I don't know how Google historically handles links that have breached their guidelines and cannot be removed - does anyone know?
Elsewhere I saw someone say something like:
*** Until this week, I haven't had any Google traffic since Google's 2005 October Jagger Update. ***
Now, that is a long time in the wilderness.
Over the last 6 months we finally started to get traffic back ourselves. Jagger was bad, but Big Daddy was a cruel punch for an ecomm site that never bought links or did any dirty SEO. We did have some duplicate content that was fixed in early 2006... but some is just unavoidable when talking about similar products.
One thing I suspect is that Google doesn't forgive some duplicate content issues easily or quickly.
A site I saw had stayed under since 2005; its only offence was inadvertent and poor administration of duplicate content, resulting in internal duplications. 99.99% of webmasters don't understand this, so why such a heavy penalty? There were no issues with links.
Then the site came out of penalty around mid 2006 and excitement was created. A whole bunch of redirects were sent from another site into the released site to reposition the content correctly.
4 weeks later it disappeared again and has only managed a minus penalty status in recent months.
What I am saying is that Google seems to invoke penalties on sites whose owners have absolutely no idea why they are being penalised, impose an unreasonable amount of absence from the SERPs, and do so in circumstances where the "offence" is seemingly "normal" or "trivial".
Some of these penalties come about because folks have no means of knowing how to administer their sites correctly in relation to the big G. Duplicate content is but one aspect that's been spoken about for years and years, and we still don't know the full implications [ albeit it got a lot better over the last couple of years ].
Is there justification for this extended penalisation for pure "housekeeping" issues?
Doesn't G owe some level of communication responsibility to assist with the administrative difficulties of site owners, and to open up more? I don't think any of this need interfere with SERP quality or issues of "secret soup" protection.
|The google webmaster tools are not showing any problems and shows 1100 indexed urls via the sitemap, but when I do a site: search google is only indexing about 400 of the 1100 pages. |
Any suggestions? How can I identify the penalty in place, what do you suggest I do to get it removed?
Herb, first off, having 1100 pages indexed but only 400 showing is fairly normal. It usually shows like that when you go to the end of the results returned and click for "similar pages" - what I refer to as the similarity filter.
That isn't the same thing as garden-variety penalties for other reasons, but the first thing to do is find out which pages are being evaluated as too similar and what's causing it. Oftentimes it's on-site search script results, which really aren't "pages" as such. Same thing with some shopping carts.
On the other hand, if a site has a large number of pages, decent IBLs and homepage PR, but only a small handful aren't Supplemental, the internal navigation and PR distribution have to be seriously re-evaluated.
|On the 28th and 29th our site finally ranked at position 39 for < one target keyword > but on the 30th we went back to position 580. We have been floating between 500 to 900 forever and no matter what we have done the site has not cracked 100 until just a couple of days ago. Its obvious that a penalty is in place of some sort but I cant figure out what its for. |
That isn't for similarity; it's a different issue.
[edited by: Marcia at 6:39 am (utc) on Aug. 1, 2008]
|Then the site came out of penalty in around mid 2006 and excitement was created. A whole bunch of redirects were sent off another site into the released site to reposition the content correctly. 4 weeks later it disappeared again and has only managed a minus penalty status in recent months. |
We need to be wise when creating 301 redirects. There are definitely trust problems and penalties for using 301's in a way that Google considers manipulative rather than merely informative. The use of redirects is an even more sensitive area than backlinks.
|definitely trust problems and penalties for using 301's in a way that Google considers manipulative |
Are there any definitive threads that cover how to handle this properly? I couldn't find anything.
Does Google let go of old penalties related to this?
*** My other duplicate content issues revolved around other domains redirecting to the main domain, yet Google crawled them and indexed them. It took some time to clear that, but I was able to add the domains to the WMT, kill the forwarding and ask to have the domains removed. After a few weeks they were gone from the index. ***
If those other domains used a 301 redirect from the very beginning then they would not have been indexed at all.
My guess is that you used a 302 redirect, hence the problem. The 302 allows duplicate indexing.
However, there is another way that URLs with a 301 redirect stay indexed. This occurs when the URL first changes from being 200 OK with content to become a 301 redirect. The redirected URL is then added to the Supplemental Index and stays indexed for anything up to a year. That is NOT a bad thing because those listings will still send you traffic, and those listings are NOT treated as being Duplicate Content, because a 301 redirect is NOT *CONTENT*.
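The distinction above only matters if you know which status code your server is actually sending, and hosts sometimes serve a 302 when you asked for a 301. Here is a small sketch of how one might check the raw status code without letting the client follow the redirect. The `example.net` URL in the comment is a placeholder, and this is just one way to do it with the standard library.

```python
# Sketch: see what status code a URL really returns, without following
# the redirect, so a 302-masquerading-as-301 can be spotted.
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the raw status surfaces."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # refuse to follow; urllib then raises HTTPError

def classify(status):
    """Plain-language meaning of the status codes discussed above."""
    return {
        301: "permanent redirect - consolidates signals to the target",
        302: "temporary redirect - can leave the old URL indexed",
        404: "not found",
        410: "gone",
        200: "OK - page served directly",
    }.get(status, "other")

def raw_status(url):
    """Return the first status code the server sends for this URL."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as e:
        return e.code  # redirects and errors land here once unfollowed

# raw_status("http://example.net/") would reveal 301 vs 302 for a
# forwarded domain; network errors other than HTTP errors propagate.
print(classify(302))
```

If the forwarding host had been serving a 302 rather than a 301, that alone could explain the duplicate indexing described earlier in the thread.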
I note you have stopped the redirects from working. That is likely a bad thing, as now all those old links and old bookmarks send the visitor to a dead end. You have now cut off some of your traffic sources. The 301 redirect is absolutely the right thing to have done in this situation.
I am not sure if you misread what was happening with Google listings, or whether the redirects were botched in some way.
I don't think we've ever had a dedicated thread about 301 abuse - but it sure comes up at PubCon site review panels. I'm particularly remembering this one with Matt Cutts and Greg Boser [mattcutts.com] plus an all-star line-up.
Several times Greg began recommending various 301 actions to a webmaster and Matt was over at the end of the table suddenly doing silent, mock calisthenics. When he got the floor, Matt issued very strong cautions about using 301's only to improve rankings. He said that Google had even built an internal tool for tracking 301 activities that they called the "Greg Boser" tool.
When the general webmaster/SEO community started to learn about 301 redirects, some went quite wild, throwing 301s around like confetti - and then getting smacked down hard. It was like a new toy on the market and it became "all the rage."
The potential for 301 abuse is well beyond that offered by link manipulation - and so Google really gives 301 redirects a trust check-up. I'm sure that this is one of the reasons that changing to a new domain can be so difficult.
The webmaster knows when they are placing a 301 (or a chain of redirects) only to manipulate rankings - and when they are using it in an informative, intended fashion. Too much 301 action, especially placing them and then switching them around, or chaining them in with other kinds of redirection, can definitely cast a pall over a domain... or a network of domains.
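Chained redirects are also easy to accumulate by accident over several site moves. As a rough illustration, here is a sketch that walks a redirect map and reports the full chain; the map here is simulated (in practice you would build it from real HTTP responses, e.g. with the status-checking approach discussed above), and the URLs are invented.

```python
# Sketch: detect long or circular redirect chains from a simulated
# {source: target} map. Real audits would populate this map by
# requesting each URL and recording its Location header.
def follow_chain(url, redirects, max_hops=10):
    """Follow url through the redirect map and return the list of
    hops, stopping at a loop or after max_hops redirects."""
    chain = [url]
    seen = {url}
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        if url in seen:
            chain.append(url)  # record the hop that closes the loop
            break
        chain.append(url)
        seen.add(url)
    return chain

redirects = {
    "/old-product.html": "/products/old-product/",
    "/products/old-product/": "/category/widgets/old-product/",
}
print(follow_chain("/old-product.html", redirects))
```

Any chain longer than two URLs (one hop) is worth collapsing so each old URL points straight at its final destination.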
Maybe some of these historical 301 penalties are part of those old penalties that are now being forgiven - I can't say for sure. But I can say that the 301 redirect is a kind of power tool and it should be used only as the instruction manual intends - essentially, pointing to a new location for previously published material.
And given that "cool urls don't change" I personally recommend limiting use of the 301 redirects. There are times it is exactly the right tool, but many times its use has become very casual and abusive.
I wonder if that is why a URL that returns 404, and then becomes a 301 at some later date, doesn't get credit for being a 301, and still stays marked as a 404 in WMT data [webmasterworld.com]? Perhaps they see that as an attempt to divert PageRank. In my case, the previously-404 URL never contained content, the incoming link is a typo; and that URL now returns a 301 redirect to the correct URL.
Good question - that kind of 301 is certainly appropriate and natural enough with typos on external links, but I guess it might get caught up in some kind of automated detection routine. But somehow I don't think Google intends to make trouble for a 301 that's used to capture backlink typos. After all, that's a good thing for the user.
Wow, great posts guys.
|"I note you have stopped the redirects from working. That is likely a bad thing, as now all those old links and old bookmarks send the visitor to a dead end. You have now cut off some of your traffic sources. The 301 redirect is absolutely the right thing to have done in this situation." |
Let me clarify: one of the two domains was pointing to the old cart content as a reference - our mistake, which led to it being crawled. The other domain was a .net we used to develop the x-cart store. I had a noindex tag in place, but I think it was a little late, as the site was still crawled and indexed by Google. I set up a redirect from .net to .com and then took steps to get the site de-indexed.
In order to remove the domains, I had to verify them in WMT, then make sure they reported 404 in order to be removed from the index using the removal tool.
|Make sure the content is no longer live on the web. Requests for the page or image you want to remove must return an HTTP 404 (not found) or 410 status code. Block the content using a meta noindex tag. Block the content using a robots.txt file. |
I had a 301 in place on the .net that had to be removed in order to get the site pulled from the index under Google's rules. I was able to get our previous host to put a noindex tag on the .com duplicate, which allowed it to be removed. I could not do this with the .net because it was a single hosting account that did not allow a separate .net and .com - they both pointed to the same content, so if I put a noindex in place it would have shown on my primary domain as well. Kind of a catch-22 situation.
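The removal-tool rules quoted above boil down to a simple eligibility check: the URL must return 404/410, carry a meta noindex, or be blocked by robots.txt - and, as the catch-22 shows, a 301 alone does not qualify. Here is a small sketch of that decision as a function; the inputs are passed in directly rather than fetched, and the function name is my own, not Google's.

```python
# Sketch: does a URL meet Google's stated removal-tool requirements?
# (404/410 status, a meta noindex tag, or a robots.txt block.)
# Fetching the status code and HTML is left to the caller.
def eligible_for_removal(status, html="", robots_blocked=False):
    """True if at least one of the quoted removal conditions holds."""
    if status in (404, 410):
        return True
    if robots_blocked:
        return True
    lowered = html.lower()
    # crude check for <meta name="robots" content="noindex">
    return 'name="robots"' in lowered and "noindex" in lowered

# A redirecting URL fails all three tests - hence the catch-22:
print(eligible_for_removal(301))
```

This is why the 301 on the .net had to come down first: the redirect satisfied none of the three conditions, so the removal request could not go through while it was in place.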
|"Maybe some of these historical 301 penalties are part of those old penalties that are now being forgiven - I can't say for sure. But I can say that the 301 redirect is a kind of power tool and it should be used only as the instruction manual intends - essentially, pointing to a new location for previously published material. " |
This is exactly what we had to do: point to the new location. We had all the old pages in a format that was not compatible with x-cart, so we had to switch out all of our URLs when the site moved. Instead of productname.html, the new site used /category/subcategory/product-name/ - the full path to that category. I can see how this could 'LOOK' like I am trying to manipulate the 301's to increase the keywords in the URL, but since I was forced to use 301's I tried to be smart about it. We HAD to use 301's to redirect the content, and this category-based system allowed for internal tracking and better stats on sub-categories. x-cart uses a naming system that sucks, putting numbers and other crap in the URL that I did not want.
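For a migration like that, the safest shape is a flat old-to-new map with exactly one 301 per moved page, each pointing straight at its final URL with no chaining. A minimal sketch of generating Apache-style rules from such a map (the paths are illustrative, echoing the examples above):

```python
# Sketch: emit one permanent-redirect rule per moved page from an
# old->new URL map, in Apache RedirectPermanent form. Each old URL
# points directly at its final destination, so no chains are created.
def redirect_rules(url_map):
    """Return one 'RedirectPermanent old new' line per mapping."""
    return [
        f"RedirectPermanent {old} {new}"
        for old, new in sorted(url_map.items())
    ]

moves = {
    "/productname.html": "/category/subcategory/product-name/",
    "/otherproduct.html": "/category/subcategory/other-product/",
}
for line in redirect_rules(moves):
    print(line)
```

Keeping the map in one place also makes it easy to verify later that every old URL still resolves in a single hop.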
For the record, when we found the duplicate content issues, we fixed them and sent a re-inclusion request outlining exactly what happened. Of course we got no response but a message was sent using wmt.
If it's a 301 penalty then it has been applied unfairly, because we used the 301's to direct the old content to the new content.
If it's a link penalty then I am screwed there as well. Just unfair.
On a brighter note I have been able to get 2 of the bad links removed since yesterday, I am tracking down the rest. I suspect the final count of links that I wont be able to change will rest around 3 or 4.
How would I word a re-inclusion request to get Google to see that I am not trying to game the 301 system, nor responsible for the inbound links? Any suggestions?
Thanks guys, this info has been very helpful.
[edited by: tedster at 11:27 pm (utc) on Aug. 2, 2008]
[edit reason] made the example urls anonymous [/edit]
|Too much 301 action, especially placing them and then switching them around, or chaining them in with other kinds of redirection, can definitely cast a pall over a domain... or a network of domains. |
Have you seen if this results in penalties and if so do these penalties lapse over time?
|Maybe some of these historical 301 penalties are part of those old penalties that are now being forgiven - I can't say for sure. |
.... and if not, would this be something worth a reconsideration request or another remedy?
Sometimes these things happen with innocent intent.
Sorry - I hadn't read this post [webmasterworld.com...], which carries on this issue.
bunnyburybaby, I would say there are serious issues with trust on the site, and it may take a while for the penalty to be lifted. I would suggest beginning to target long-tail searches as best you can, with content and links going to internal pages on the site.
Forget the tough one-word terms; go after the real money in the long-tail terms for your vertical. Not near as much traffic, but it converts. I know it is frustrating, but when improper things are done it takes time to get back in. Use this as a means to better yourself by finding other ways to get traffic to the site - I promise it will make you a much better webmaster.
I was under the filter for 2 years. Never give up - just get smarter.
My site is back to page one (#6) for a single KW that had been gone for a long time. The first time we hit page 1 was 1/15/2005. We steadily rose to #1 over the next 2 months and stayed in the top 5 until Nov 2006, when we fell off the map. It bounced in and out of the top 200 on occasion but never showed any real strength, until 7/26/2008, when we popped up to #21.
I had pretty much written it off as a loss, but now I think it might be time to start adding more products.
long time no post.
tedster, I see you are still beavering away - you must be the world's no. 1 authority on Google penalties by now (outside Google)!
Just had to pop by to say - rather, SHOUT - that after three years, three months and three days, thousands of man-hours and more money than I can shake a dead rat at, on July 4th 2008 my site came out of the penalty zone.
Who knows why or how - neither we nor countless 'SEO experts' ever found out. Now I just have to kick this dormant volcano back to life again...