| 3:47 pm on Jan 9, 2007 (gmt 0)|
>> 78,000 new, unique content pages in the past 21 days! Site is now 95% new, unique content--something Google says they love.
No offense jwc, but how did you manage to do that (78000 new pages) in 21 days? If you search for a sentence, will it show up on other sites as well?
| 4:15 pm on Jan 9, 2007 (gmt 0)|
We used XML to feed content into the pages. If you search for a specific sentence, that by itself may not come back as unique; however, the page as a whole will. Hopefully, that will pass muster with Google.
The remaining 2,165 pages on our site are completely unique.
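For anyone curious about the mechanics, feeding XML content into a fixed set of page URLs can be as simple as filling templates from a feed. A minimal sketch (the element names, template, and file names here are all hypothetical, not jwc's actual setup):

```python
import xml.etree.ElementTree as ET
from string import Template

# Hypothetical page shell; the real site's markup is unknown.
PAGE = Template("<html><head><title>$title</title></head>"
                "<body><h1>$title</h1><p>$body</p></body></html>")

def render_pages(xml_text):
    """Produce one HTML document per <item> in an XML content feed."""
    pages = {}
    for item in ET.fromstring(xml_text).iter("item"):
        pages[item.findtext("slug") + ".html"] = PAGE.substitute(
            title=item.findtext("title"),
            body=item.findtext("body"))
    return pages

feed = """<feed>
  <item><slug>widgets</slug><title>Widgets</title><body>All about widgets.</body></item>
</feed>"""
pages = render_pages(feed)
```

A script like this can rewrite tens of thousands of existing URLs in one pass, which is why the page count alone says nothing about how the content was produced.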
| 4:31 pm on Jan 9, 2007 (gmt 0)|
*Hopefully, that will pass muster with Google*
Sounds like an MFA...
| 4:54 pm on Jan 9, 2007 (gmt 0)|
78,000 new pages in 21 days will trigger an alarm. I don't believe those are the types of "unique content" pages that Google would like..
I put out approximately 1-2 hand-written articles a week.. nothing is computer generated.
| 5:04 pm on Jan 9, 2007 (gmt 0)|
78,000 "quality pages" in 3 weeks...
With 10 people, that's 7,800 pages each in 3 weeks.
That's 2,600 each a week.
That's 520 pages a day (assuming a five-day work week).
If I get 2 quality pages out in a day, I am exhausted, my arm hurts, my eyes are dry and my wife is mad at me...
| 5:27 pm on Jan 9, 2007 (gmt 0)|
Ok guys, you are reading more into the unique content portion of my post than you should be. The other details are much more important so please reread.
FYI, the 78,000 pages of new content was achieved via using the same URLs as before. Instead of dumping the content into new webpages with new URLs, we dumped the old content, all of which was duplicate content due to the stupid programming firm I hired in the fall of 2005, and replaced it with content fed via the XML.
So I don't think Google will see this as more than they can or want to chew. We haven't done anything except what they tell us to do--make content-rich sites with compelling content.
And since Google has aggressively crawled since we uploaded the sitemaps--sometimes they download all the sitemaps at least a couple times a day--and cleaned out the old crap, then I think what we did was positive.
What bothers me is that we have done all they said to do and Googlebot crawls very aggressively which is a very positive sign. But the -30 penalty is still assessed. I honestly think that my site has dropped into some black hole at Google and won't get out until someone finally wakes up and realizes that "...this site shouldn't be penalized..." That is what happened at MSN.
That is why Google's communication, while a lot better than 12 months ago, is still poor. Any site penalized for a year needs a second chance.
Lastly, pia's like Sussie get attention from both Adam and Matt while honest, hard-working webmasters get ignored.
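As an aside, the sitemaps Googlebot keeps re-downloading are plain XML in the sitemaps.org format, so regenerating them after a content swap is trivial. A minimal sketch (the URLs are made up):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemaps.org-format <urlset> document."""
    root = ET.Element("urlset",
                      xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        # Each URL gets a <url><loc>...</loc></url> entry.
        ET.SubElement(ET.SubElement(root, "url"), "loc").text = u
    return ET.tostring(root, encoding="unicode")

sitemap = build_sitemap(["http://www.example.com/page1.html",
                         "http://www.example.com/page2.html"])
```

Large sites typically split these into multiple files under a sitemap index, which is why Google may fetch "all the sitemaps" several times a day.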
| 6:18 pm on Jan 9, 2007 (gmt 0)|
you said: "Lastly, pia's like Sussie get attention from both Adam and Matt while honest, hard-working webmasters get ignored. "
Where have they addressed this problem?
| 7:02 pm on Jan 9, 2007 (gmt 0)|
Put a Goog noindex on those 78k pages and see if you can get in with the 2,000+ you say are truly original.
Oh, and mention that fact to Goog when resubmitting.
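For reference, the noindex directive suggested above is just a robots meta tag in each page's head. A throwaway script to stamp it onto a batch of generated pages might look roughly like this (a sketch; file handling omitted, markup hypothetical):

```python
import re

NOINDEX = '<meta name="robots" content="noindex">'

def add_noindex(html):
    """Insert a robots noindex tag right after <head>; skip pages that have one."""
    if NOINDEX in html:
        return html
    return re.sub(r"(<head[^>]*>)", r"\1" + NOINDEX, html, count=1, flags=re.I)

doc = "<html><head><title>t</title></head><body></body></html>"
tagged = add_noindex(doc)
```

Run over the 78k generated pages, this keeps them live for visitors while telling Google to drop them from the index, leaving only the original pages in play.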
| 1:07 pm on Jan 10, 2007 (gmt 0)|
I got my number one place back within 36 hours (from realizing that I had the minus-30 penalty). I will try to explain my case in detail below to help others.
1. I have a site with the domain keyword1-keyword2.com. The site had been in first place for the search "keyword1 keyword2" from 2004 until 2006-07. I lost about 80% of my traffic and 95% of my affiliate income. The site is a PR5 site with very good, relevant inbound links - no link farms etc.
2. As I thought it was a new algorithm in which domain names are less important, I did not think I could be suffering from a penalty. I have never worked with (according to myself) any blackhat links or techniques. I did not even consider the thought.
3. 36 hours ago I read more about this issue in this thread and realized that when searching for www.keyword1-keyword2.com I came up on page 4 (place 31). Then I realized I had been hit by the minus-30 penalty.
4. I directly changed the following in my index.htm file:
A. Changed the title (removed my affiliates' brands from the title).
B. Removed all links to 100% affiliate-based hotel sites (relevant, but as I figured out in this thread, hotel affiliate sites seem to be a risk).
C. I had some links to my other sites that were not 100% relevant to the topic. I removed all these links.
D. I had a list in Arial 8 at the bottom of index.htm listing all the products displayed on the other pages of the site. I removed all these words.
5. Then I added the site to Google Webmaster Tools and asked for a reinclusion. 36 hours have passed and I am now at place one again, both for www.keyword1-keyword2.com and for a regular "keyword1 keyword2" search.
So hang in there, and good luck to the rest of you!
| 1:09 pm on Jan 10, 2007 (gmt 0)|
I must add that I have almost identical sites in different languages that have not been penalized.
This makes me think that the penalty is manually applied after a spam report is filed by someone!
Maybe it's time to file spam reports on our competitors ;-)
| 6:23 pm on Jan 10, 2007 (gmt 0)|
This is amazing.. 36 hours for some and 9+ months and counting for others. It's crazy to think that I have fewer things wrong than you fixed and am still saddled with a penalty.
If this penalty is indeed manually applied, then it's impossible to guess what the Google staff are looking for when all the guidelines are followed (as in my case).
| 6:32 pm on Jan 10, 2007 (gmt 0)|
I ran into a potential new client yesterday and checked out his sites. He too has a -31 penalty... Here are the details:
Roughly 200 pages
No optimization ever (Trust me on this, the guy isn't smart enough to do it)
Has the www - non www issue
Site titles are different throughout, no meta tags
Typical sitewide links... Home, faq, articles, products... About...
The product catalog is common around the internet; however, this individual wrote all the copy instead of using the boilerplate manufacturer stuff.
1/3rd of site is this catalog, rest of the site is original articles regarding his radio show topics and a few request forms for various consulting work.
On the homepage there is a link to another domain he owns dealing with the same industry but a completely different catalog and set of articles. It too has the -31 penalty
He has 6 backlinks from some reputable websites. Both sites have a PR 4
Why on earth would this guy have a penalty? There is literally nothing wrong with these sites.
I'd love to hear from Google on this one.
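On the www vs. non-www point: the usual fix is a site-wide 301 redirect to a single canonical host. Server configurations vary, but the underlying logic is just this (hosts here are hypothetical):

```python
CANONICAL = "www.example.com"  # hypothetical canonical host

def canonical_redirect(host, path):
    """Return a (status, location) 301 redirect if the request host isn't the
    canonical one, else None. This mirrors what an Apache RewriteRule or an
    equivalent server-level redirect does."""
    if host != CANONICAL:
        return 301, "http://%s%s" % (CANONICAL, path)
    return None

canonical_redirect("example.com", "/faq")      # -> (301, "http://www.example.com/faq")
canonical_redirect("www.example.com", "/faq")  # -> None
```

Consolidating both hostnames onto one stops Google from splitting link credit between two copies of every page.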
| 7:06 pm on Jan 10, 2007 (gmt 0)|
>> I'd love to hear from Google on this one.
submit a reinclusion form then--assuming what you said is true.
| 7:35 pm on Jan 10, 2007 (gmt 0)|
James45, keep us posted please. I'd love to see if other innocent sites are being freed from the penalty!
| 8:28 pm on Jan 10, 2007 (gmt 0)|
I haven't followed exactly how much traffic you lost, but if it was like mine, 85-90% down, there is very little to lose by trying some major fixes!
Why not try making a totally new index.htm - very few outgoing links and few (or no) affiliate or hotel links. Save your old index.htm. Submit a reinclusion request and wait for a week.
If it gets included again - you can start adding things slowly ;-)
Just an idea!
| 9:26 pm on Jan 10, 2007 (gmt 0)|
thanks for the reply.
Yes, I have lost 90%+ of my traffic from Google. I have actually done some dramatic things since April that could be considered major - checked and implemented ALL the things people have done to get out of the penalty. It seems there are very few of us left who have been penalized for this long.. most stay in for a few months, post here, get out and never return! :)
Perhaps I will try a new index, however I am doubtful that it will do anything.
| 9:40 pm on Jan 10, 2007 (gmt 0)|
Does Google keep a record of re-inclusion requests? This would make it much easier when it comes to re-typing a list of things that have been changed in the past 9 months, etc, etc.
Does the same person at the receiving end of re-inclusions get it or is it a different person each time?
Pretty much just speculation.. but what do you guys think?
| 3:48 pm on Jan 11, 2007 (gmt 0)|
Today it's exactly 6 months since Google hit 3 of my 4 sites with the -31 penalty.
The smallest one seems to be back since yesterday. I have made no changes to it.
The most important domain has been completely new for 3 months. I've done 4 reinclusion requests. Nothing happens.
Traffic is down from 6,000 Google hits a day to 30.
| 3:54 pm on Jan 11, 2007 (gmt 0)|
shredder, sounds like a familiar drop! Hurts.
For the one site that has come out of the penalty without any changes to it (clear proof that the -30 penalty is flawed), what is the theme of the site?
[edited by: AustrianOak at 3:55 pm (utc) on Jan. 11, 2007]
| 5:40 pm on Jan 11, 2007 (gmt 0)|
This weekend I shall submit my first re-inclusion request and I will let everyone know. I feel that I meet and exceed all of Adam's prerequisites. Many, many pages have been deleted and I am in the process of checking the rest one by one manually and beefing them up. When in doubt, they will be removed and re-added as time goes on.
| 6:23 pm on Jan 11, 2007 (gmt 0)|
My one site is still (last 3 consecutive days) showing a 33% INCREASE in Google Traffic!
1. Since Dec 26th, I have been changing meta titles and meta descriptions.
HOWEVER - I have been tracking and recording (daily) the pages Google search is returning in the SERPs, and 99% of them are NOT pages I have recently changed!
2. I have noticed (I wasn't paying attention before) that PR changes [PR4 to a new PR5] have been occurring on different datacenter C-blocks over this same timeframe. Yesterday, at one point, there were 42 C-blocks reporting the main page as a PR5. Just now I checked again and only 4 C-blocks are reporting a PR5 - the rest are reporting a PR4.
NOTE: This site's index page started out as a PR5 (last January) and at some point [kick self in ass for not documenting] was DEMOTED to PR4! All the while (2006) the site continued to increase its total backlinks [but PR decreased] as reported by Google.
3. I see in another thread, here at WW, that Google 'tweaked' something in the last few days that has wreaked havoc with some other folks' sites.
To what do I attribute my few days of increased traffic?
- I am leaning towards Observation #2 or #3.
#2 - My page/site has gained strength due to linking and additional PR weight.
#3 - This is just some recent weird Google fart and this traffic increase will be fleeting...
#4 - None of the above...
What remains to be seen is IF this increased traffic 'trend' continues.
| 11:31 pm on Jan 11, 2007 (gmt 0)|
caryl, perhaps it is seasonal. Does your site relate to New Years resolutions in any way? Fitness/Diet? Etc.
[edited by: AustrianOak at 11:31 pm (utc) on Jan. 11, 2007]
| 12:08 pm on Jan 12, 2007 (gmt 0)|
No, my site is not seasonal and is 'service' oriented rather than product. Last Jan 11, 2006 I recorded 1169 Google search referrals to the site. Yesterday Jan 11, 2007 I had 22!
I read a post on another thread here at WW that is adding 'support' for the #2 observation.
|...yesterday, Matt Cutts made this comment on his blog: If you used to have pages in our main web index and now they're in the supplemental results, a good hypothesis is that we might not be counting links to your pages with the same weight as we have in the past.|
Almost ALL of my sites have dropped from PR5 to PR4, while over the course of the last year the backlinks pointing to them have virtually doubled.
PS - Yesterday's stats were still 'good' but NOT as robust as the previous 3 days'.
| 12:56 pm on Jan 14, 2007 (gmt 0)|
My Google hits seem to have returned to their previous, paltry trickle.
At the same time, Google seems to have settled back down as far as PR goes, and all but 5 datacenter C-blocks are once again displaying a PR4 for my main site.
I have been in this field and hanging around the SEO forums for nearly seven years now. I do not remember a time when a PR 'update' caused SERP changes. But I have read in different threads here at WW that others are seeing the same odd phenomenon.
As far as I can remember, PR 'updates' have always been just a periodic 'snapshot' of the PR calculated for a page, while Google actually calculated/adjusted those values on more of a 'real-time' basis. Any SERP changes would happen at the time Google actually re-calculated the PR/backlink values.
Since Big Daddy, however, it seems that Google has slipped from being innovative in its approach to search to being clunky and cumbersome.
It would not surprise me to find that they have now reverted to a much slower process of calculating PR in large 'batches', then 'pushing' the new calculations with one of their "Data Refreshes".
I think a 'link building' campaign is in my future...
"Content is KING": it was, pre-Big Daddy.
Now it seems that content is more like a Cadillac - you may have the best, but you'd better have gas (PR) or you won't get it out of the driveway (won't come up in the SERPs). :)
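For context on why PR could plausibly be recalculated in slow batches: PageRank is an iterative computation over the entire link graph, which is expensive at web scale. A toy power-iteration version on a made-up three-page graph:

```python
def pagerank(links, damping=0.85, iters=50):
    """Toy power-iteration PageRank over a dict {page: [outlinked pages]}."""
    pages = list(links)
    n = len(pages)
    pr = dict.fromkeys(pages, 1.0 / n)  # start uniform
    for _ in range(iters):
        # Each page keeps a base share, plus damped shares from its inlinks.
        new = dict.fromkeys(pages, (1 - damping) / n)
        for page, outs in links.items():
            for target in outs:
                new[target] += damping * pr[page] / len(outs)
        pr = new
    return pr

graph = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
ranks = pagerank(graph)  # "a", with two inlinks, ends up ranked highest
```

Iterating this over billions of pages rather than three is exactly the kind of job that gets run periodically and then 'pushed' out, matching the batch-update behavior described above.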
| 1:22 pm on Jan 14, 2007 (gmt 0)|
|78,000 new pages in 21 days will trigger an alarm. |
Really? We recently allowed Google to crawl one of our article archives (we were previously afraid of scrapers). There must be over 200,000 pages in there. Will I be penalised for allowing them access to these pages all at once? Should I be spoon-feeding the Google bot-children or even bottle-feeding them?
Big news sites add hundreds if not thousands of articles a day. Big software/shareware sites add hundreds if not thousands of pages a day. Forums add hundreds of pages a day. There are many, many legitimate reasons why some sites grow really, really fast. They will all be penalised?
| 4:22 pm on Jan 14, 2007 (gmt 0)|
|Does Google keep a record of re-inclusion requests? This would make it much easier when it comes to re-typing a list of things that have been changed in the past 9 months, etc, etc. Does the same person at the receiving end of re-inclusions get it or is it a different person each time?|
AustrianOak, I have read that we must assume that it is a different Google person each time and that therefore it is advisable to repeat any previous problem diagnosis/fix pairings in each reinclusion request.
Personally, I think that no human (and probably not even an automaton) at Google reads my requests. Ever. I see nothing happen, nor do I ever receive an acknowledgement. It just sucks.
| 12:52 am on Jan 15, 2007 (gmt 0)|
dangerman, I agree it's very frustrating.
| 4:23 pm on Jan 15, 2007 (gmt 0)|
Hey fellow -30 sufferers, I have heard from Mt. Olympus, so please take note. Perhaps you can benefit from the reply I got.
Last week I added a comment to Matt Cutts' blog post entitled "infrastructure-status-january-2007". He answered my comments, mentioning me by name (I add this so you can find my post and his easily), and nailing me for several things I am not even guilty of. He suggested that I ask for a site review at Google Groups and get "some tough love".
This morning I did just that. That website review request can be found at Google Webmaster Help --> Webmaster Tools and is entitled "Matt Cutts Said to Review My Site and Give Me Tough Love".
So check out his blog and see if you can gain any insight from his comments. Hopefully, it will help you solve your problem.
If you have time, I sure would appreciate you doing a site review.
| 4:43 pm on Jan 15, 2007 (gmt 0)|
Google is dinging you for being an affiliate type of site.