The only thing that distinguishes this site from my other sites, which are doing well in all search engines, is that at the end of 2007 the CGI script used for the forum was vulnerable to attacks by spam sites. A lot of spam messages with bad links were posted, and numerous bad external links to the CGI script appeared (several thousand of them). By July 2008, visits from Google had dropped tenfold.
The site was heavily reconstructed recently (about half a year ago), and a reconsideration request was sent about two months ago, but Google still doesn't show the site's pages (the exceptions are the index page and a few rare keywords that rank on the first page of results). Images are also doing well in Google Images for the same keywords (first page of results). All the pages are indexed and have PR 2-3.
I would be grateful if you could share your ideas on how to solve this problem. Thanks.
It sounds like you did a lot of the right things. Often when a site is hacked, all kinds of spammy backlinks also show up. Have you checked for those and mentioned them in your reconsideration request? Also, have you made 100% sure that there is now no vulnerability on your server? No backdoors installed, no parasite links that are cloaked to the regular visitors and only shown to googlebot - that kind of thing?
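One rough way to check for that kind of cloaking is to fetch a page both as a normal browser and with Googlebot's user-agent string, then compare the responses. A minimal sketch (example.com/page.html is a placeholder, not the poster's actual URL; this needs network access, and it won't catch cloaking keyed on Googlebot's IP addresses rather than its user-agent):

```shell
# Fetch the same page twice: once as Googlebot, once as a regular browser.
curl -s -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
  http://example.com/page.html > as-googlebot.html
curl -s -A "Mozilla/5.0" \
  http://example.com/page.html > as-browser.html

# Any diff output means the server is serving different content to Googlebot.
diff as-googlebot.html as-browser.html
```

If the two files differ in anything beyond timestamps or session IDs, that's worth investigating before sending another reconsideration request.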
Also, you may just have some bad timing here, since as you can see from other recent discussions, Google is behaving quite strangely at the moment. There are lots of guesses about why that is, especially theories that Google is having trouble rolling out its new Caffeine infrastructure.
If there is an official announcement about Caffeine being fully live and your site is still not visible, that's when I'd go into high gear. Until then, do your research but it may just have to be a waiting game.
Unfortunately I didn't include the exact list of links (I just wrote "numerous bad external links to the cgi script"). On our side, the vulnerable script was deleted, all target URLs of those links were blocked in our robots.txt, and a request to remove them from Google's index was sent through Webmaster Tools.
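For anyone reading along, the robots.txt blocking described above might look something like this (a minimal sketch; /cgi-bin/forum.cgi is an assumed example path, not the poster's actual script):

```
User-agent: *
# Hypothetical path of the removed vulnerable forum script.
# The prefix match blocks every spammed target URL under it,
# e.g. /cgi-bin/forum.cgi?topic=..., from being crawled.
Disallow: /cgi-bin/forum.cgi
```

Note that Disallow only stops crawling; combining it with the URL removal request in Webmaster Tools, as done here, is what actually gets the URLs out of the index.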
Do you think I should submit a second request with all the details? As for the current version of the site, I am sure there are no vulnerabilities.