I had been working on making changes (and still am... I found quite a bit of link-rot, and I have NO meta descriptions), but none of the changes were posted by the time the rankings came back, so this was 100% on Google's side. The site they dropped in June is exactly the same site that's back now.
I expect it to go back to barren after exactly 4 weeks, with the next data refresh.
This would just mean that a few parameters are being turned up or down in every refresh, and sites cross a line and lose and gain rankings.
The next thing then would be to figure out what those parameters are.
In my case, I suspect they are: similar, identical, or missing meta descriptions; a duplicate-content penalty on a 20-30 link 'related stories' box that was repeated across all stories in a section; and occasional titles repeated across pages due to oversight.
Other potential suspects: Press releases published on the site (obviously not original), but as a news site, I need to keep press releases on the site, so I am not going to touch that.
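The suspected duplicate titles and meta descriptions can be hunted down mechanically before the next refresh. Here is a minimal sketch in Python, assuming the pages are available as saved HTML strings (the file names are hypothetical):

```python
from collections import defaultdict
from html.parser import HTMLParser

class HeadScanner(HTMLParser):
    """Collects the <title> text and the meta description of one page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def scan(pages):
    """pages: {filename: html}. Returns {(kind, text): [filenames]}
    for every title or description that appears on more than one page."""
    seen = defaultdict(list)
    for name, html in pages.items():
        p = HeadScanner()
        p.feed(html)
        if p.title.strip():
            seen[("title", p.title.strip())].append(name)
        if p.description.strip():
            seen[("description", p.description.strip())].append(name)
    return {k: v for k, v in seen.items() if len(v) > 1}
```

Any key that comes back from `scan` is a page group that shares a title or description and should get unique text.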
Duplicate content is not an issue with my site as I write all the content. I'm hoping the site will recover with the next data refresh but who knows. I already have a 301 redirect in place from non www to www and unique meta tags.
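For anyone setting up the same non-www to www canonicalisation, the mapping the 301 implements is simple. Here is a hypothetical sketch of that logic (`example.com` is a placeholder; the actual redirect would live in the server config, not in application code):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_host(url):
    """Return the www form of a URL, i.e. the target a non-www -> www
    301 redirect would send the browser to. Path and query are kept."""
    parts = urlsplit(url)
    host = parts.netloc
    if host and not host.startswith("www."):
        parts = parts._replace(netloc="www." + host)
    return urlunsplit(parts)
```

The point is that every bare-domain URL resolves to exactly one www URL, so Google never sees the same page under two hostnames.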
On August 17th, 95% of my listings came back to "normal".
On September 17th, I lost my heavy-competition 2-word phrase, for which my home page usually ranks about #3 or #4.
The past few days I am back for that phrase as well.
In late August, AFTER I recovered, I cleaned up my outbound links, deleting dead links and links to crappy websites. I've added two new pages deep within the site, and that's about it.
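Outbound-link cleanup like this can be scripted: collect the off-site links first, then re-check each one. A sketch, assuming saved HTML and that only off-site links need review (the actual fetching/status-checking step is omitted, since it needs network access):

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit

class LinkCollector(HTMLParser):
    """Gathers every href target on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def outbound_links(html, own_host):
    """Return only the links that point off-site -
    the ones worth re-checking for link-rot."""
    p = LinkCollector()
    p.feed(html)
    return [u for u in p.links
            if urlsplit(u).netloc not in ("", own_host)]
```

Feeding each page through `outbound_links` gives a de-duplicatable worklist of external URLs to test for dead or low-quality destinations.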
From my viewpoint, though, my site has essentially recovered with little help from me. The late-August changes I made occurred after I recovered. Whether they helped the recent recovery of my home page keyword, I'll never know.
I have taken the attitude the last few years of trying to "weather the storm" whenever these hurricanes hit and not do anything drastic. It's generally worked for me, as I have held top rankings in a competitive sector by sticking to my strategy of good content, well optimized, with good backlinks.
-- that there are elements in the site which trip a filter when it's turned up or down a notch or two.
This doesn't mean that your site is spammy. Google's algo cannot be all-knowing, and it makes mistakes. So it's up to us to find those factors that get it to penalise your site and remove them. It might look like G is broken - after all, it is illogical that a normal site can rank at #1 today, #300 tomorrow, and #1 again after 30 days. But broken or not, I think it's within our control to identify the problem areas which throw G off-track.
Nevertheless it pushed me to see where the SEO could be improved. So, all in all, a good lesson.
Plus, and this is the best thing, I had started working on improving my AdSense placements around the same period the downfall took place. Nevertheless I managed to increase revenue, and now that the traffic is back up where it used to be, things are going well.
But to be honest, I don't think any of those changes are related to the recent improvements in my SERPs.
My site has no spam whatsoever, has lots of good external deep links, and has unique content. Why did Google punish me for three months? ... no clue at all.
During those 3 months strange things happened: my site is in Spanish, and for some of my most important keywords (in Spanish, obviously), several Polish sites started to show up on the first page... something was definitely rotten during this Google tweak. I hope it doesn't revert! ... I really do!
While I don't know whether any of the things I did had any effect, here is a list of things I did do:
1) terminated some questionable advertisers;
2) corrected some 302 vs. 301 redirect issues I uncovered;
3) eliminated some potentially duplicate pages;
4) resolved some URL issues so that arbitrary query strings can no longer be appended to URLs, preventing them from creating duplicate content.
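Point 4 amounts to stripping any query parameter the site doesn't recognise, so junk like `?foo=bar` can't mint a "new" duplicate URL for the same page. A minimal sketch, with the whitelist of real parameters as an assumption:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical whitelist of parameters the site actually uses.
ALLOWED = {"id", "page"}

def canonical_url(url):
    """Drop every query parameter not on the whitelist, so any
    appended junk collapses back to the one canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in ALLOWED]
    return urlunsplit(parts._replace(query=urlencode(kept)))
```

In practice the same rule would be enforced server-side (ideally with a 301 to the cleaned URL), but the mapping is the same.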
Within this period, the changes I made are:
- Redesigned the whole website's link structure and look. This is not due to the recent change in SE position - the site design is simply rather old - but the drop in SE ranking does push me to work faster.
- Bought lots of one-way links. From what I see in the new SERPs, it seems Google is rewarding those who bought a number of site-wide one-way links. Although I feel buying text links is a waste, I can't deny the fact that my website is losing its rankings due to a low number of links, so I spent a few hundred buying homepage as well as site-wide links.
- Trimmed off reciprocal links. For one whole week, all I did was send out notification mails and delete ALL the unrelated reciprocal links.
To date, there are still no significant changes to my ranking. I am slightly worried that the sudden boost in my inbound links will make the situation worse, but I guess I have nothing much to lose anymore. It's good to have a thread like this, as it still gives me hope that my website might come back one day.
Most recently I had 64 pages linked for a three-week period from the end of September, but that went down to just four pages 10 days ago.
I have done nothing to my site except add three or four news stories every week day. I am confident the pages will return to Google, but equally confident they will disappear again shortly afterwards until the underlying issue is sorted out.
Virtually every month I’ve been tweaking - typically the same areas, but on different groups of pages each month.
Still no success in avoiding the filter.
I think I’ve got it narrowed down to a few likely issues that are intervening in the filtering process:
1) site popularity filter
-PageRank // but not really… the new formula for the concept in use seems to (possibly?) be based on Google’s directory/ a modification of DMOZ, but a custom counterpart with up to date ‘popularity’ ranking. After a certain day of the month, if you haven’t made the popularity cut you are outta there – (around Monday, 10-9-06 this month for me)
-(so here is the confounding factor – it has to be the main keywords and phrases for the site, i.e. the keywords that match the category of listing in the Google directory – can anyone else concur on this?) (in fact, my bouncing inclusion pattern is only for the #1 phrase and its plural equivalent, i.e. my Google directory category; everything else is stable, I believe)
-sort of an aside, but another popularity-type issue – will syndicating in the Google Content Network help me ‘officially’ gain popularity?
2) sitewide spam filter
-as mentioned in an earlier post by wanderingmind – it seems like a very bad idea to have any duplicate titles or key meta tags; they are a short-cut flag into the duplicate-content pile. So I’ve been working on this and just finished up the last few pages, but it doesn’t seem to matter for this particular issue, since I am still bouncing in and out
-I’ve also had to deal with a spam-laden footer – I was finally able to update it, but did a ton of writing first. So – I don’t think that was the problem... however, again, I’ve only done this for some, not all, pages
-Third idea – does anyone have experience with comment tags and their impact? Are they truly ignored by Google? My inherited site has them on every page, packed with keywords.
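Whether or not Google ignores comment content, keyword-stuffed comments are easy to audit mechanically. A minimal sketch for listing and removing them (the regex approach assumes ordinary, well-formed `<!-- ... -->` comments):

```python
import re

# HTML comments: <!-- ... -->, non-greedy, possibly spanning lines.
COMMENT = re.compile(r"<!--.*?-->", re.DOTALL)

def find_comments(html):
    """Return the comment bodies so keyword-stuffed ones can be reviewed."""
    return [m.group(0)[4:-3].strip() for m in COMMENT.finditer(html)]

def strip_comments(html):
    """Remove every HTML comment outright."""
    return COMMENT.sub("", html)
```

Running `find_comments` over the inherited pages would show at a glance how much keyword text is hiding in them before deciding whether to strip it.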
3) automated tool filter
-yes, I once erred and used WebPositionGold on the specific phrase (but I very, very rarely use the tool at all; now I have my Google code plugged in). So maybe that is the problem. I don’t have much to compare this to. Seems weird that they could do this, however – I am not technically the site owner.
Can anyone help me tease out the situation further? Thanks in advance.
Sorry to confuse - it is PageRank - but the accurate version of it, which seems findable in the Google directory.
You can browse it here:
Check out the categories themselves - they will show the Pagerank order.
I say the accurate version - because for a long time I was relying on the toolbar. We lost a point, but it only shows in the directory, not the toolbar.
Whether this figure has any impact on SERPs is doubtful - surely a more up-to-date and correct version is used.
(Not saying it is impossible - but directory PR is just a very old figure which G apparently has not bothered to update for a while (it could be a base figure they use, of course - but I personally feel that is unlikely))