| 9:18 am on Dec 19, 2006 (gmt 0)|
Hello, and welcome to the forums nevyan.
|I've pointed all internal links to no follow |
Do you mean "external" links here? It would also help our discussion to know how old your website is.
| 12:57 pm on Dec 19, 2006 (gmt 0)|
I mean the internal links - in order not to cause repeated linking. The website is almost 5 years old.
| 1:13 pm on Dec 19, 2006 (gmt 0)|
Just a thought: you refer to your site being on page 199 of a 200-page SERP, or page 728 of a 729-page SERP.
How are you checking the SERPs? That's a lot of pages to scan manually.
I am guessing here, but are you using automated scanning software of some sort?
If you are, well, you should read Google's webmaster guidelines and other terms of service.
| 1:26 pm on Dec 19, 2006 (gmt 0)|
I was doing a normal manual Google query, paging through to 799 or the last query result in that case.
| 1:50 pm on Dec 19, 2006 (gmt 0)|
|I mean the internal links - in order not to cause repeated linking. |
This doesn't sound right at all to me. There's no reason to use rel="nofollow" on internal links. You may have created some problems with that -- I would suggest removing the nofollow attribute from your internal links. It tells Google 'I don't trust the target page of this link; do not send it any PageRank from this page.'
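For clarity, rel="nofollow" is a per-link attribute. A normal internal link versus a nofollowed one looks like this (the URL and anchor text are made-up examples):

```html
<!-- Normal internal link: passes PageRank and anchor-text relevance -->
<a href="/widgets.html">Widgets</a>

<!-- Nofollowed link: tells Google not to credit the target page -->
<a href="/widgets.html" rel="nofollow">Widgets</a>
```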
| 3:08 pm on Dec 19, 2006 (gmt 0)|
I'll restore the internal linking. I would like to say that it doesn't seem to have caused SERP problems, at least for this website.
| 4:46 pm on Dec 19, 2006 (gmt 0)|
If you have just "nofollow", Google will index that page, but it won't follow or "crawl" the links on that page.
If you have something like this on the home page:
menu-bar HOME WIDGETS BLUE-WIDGETS ABOUT CONTACT
then only the homepage is crawled and indexed, but not WIDGETS, BLUE-WIDGETS, ABOUT or CONTACT.
In order to get the whole site spidered you need to remove the nofollow.
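That page-level variant lives in the document head and affects every link on the page, unlike the per-link rel attribute. A minimal sketch:

```html
<head>
  <!-- Page itself is indexed, but none of its links are followed -->
  <meta name="robots" content="index, nofollow">
</head>
```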
| 5:12 pm on Dec 19, 2006 (gmt 0)|
The website is fully indexed. All the webpages appear in Google's primary index.
Even when doing a keyword search for unique website content, the website shows at the end of all of Google's results for those specific keywords.
| 5:25 pm on Dec 19, 2006 (gmt 0)|
| 5:29 pm on Dec 19, 2006 (gmt 0)|
No, the results appear in the primary index as normal ones, and there is almost no difference with and without &filter=0.
| 7:19 pm on Dec 19, 2006 (gmt 0)|
Hi Nevyan and welcome!
Do you have any backlinks of importance?
| 8:47 pm on Dec 19, 2006 (gmt 0)|
Well, the website doesn't have many backlinks indexed by Google.
| 2:27 am on Dec 20, 2006 (gmt 0)|
Then try Yahoo.
Also, even if those results are displayed as normal, if they are shown after supplemental results from other sites... they're practically supplemental themselves, waiting for the tag to be stamped on them. (Might not be the case though.)
More interestingly... if you don't have sitemaps submitted, a good hint would be the last cache date of the pages.
I'm with tedster on this: you should not have put the nofollow on the links. Even if you have G sitemaps and thus let the bot know where to look for the pages, all the relevance coming from the internal links would eventually be dropped, not to mention that PR will be choked right at the homepage. Sooner or later all pages will drop to zero.
Meaning you WILL have your pages indexed in G normally (until they get marked as supplemental for not having any PR)... but they won't come up for anything, and even if they do, a PR1 page will outrank them.
You'll need to weed out which pages to index and which not, and put the NOINDEX tag in the header of the latter. And/or disallow them in robots.txt. But nofollow WITHIN a site's internal navigation sounds pretty counterproductive if you ask me ;)
Not that I'm telling you all this from experience; I'm just improvising based on what I've read here... and am interested in your results ;D
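As a sketch of those two mechanisms (the path is hypothetical): NOINDEX is a per-page meta tag, while robots.txt blocks crawling by URL pattern:

```html
<!-- In the <head> of a page you want kept out of the index -->
<meta name="robots" content="noindex, follow">
```

```
# robots.txt: block crawling of a whole section (hypothetical path)
User-agent: *
Disallow: /printer-friendly/
```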
| 8:42 am on Dec 20, 2006 (gmt 0)|
Some of the website history: it appeared for a long time in its normal search positions, until the last 2 years, when it was sandboxed. Last year I managed to get out of the sandbox completely after a reinclusion request.
Then this summer I decided to remove all the meta keyword and description tags, and I also found several sites linking to mine with 302 redirects, which led to supplemental inclusion of most of the website. I've fixed that too - now the website is fully indexed, but the results keep showing on the last Google query pages.
The website has a valid submitted Google sitemap. The nofollow links have been in place for 2 weeks and I am not noticing a change in Google results. The root domain has a PR of 5 and the inside pages have PR from 0 to 4. I've restricted the indexed webpages to article pages only.
| 11:24 am on Dec 20, 2006 (gmt 0)|
|Then this summer I've decided to remove all the meta keyword and description tags |
You mean there are NO description tags?
That might be a problem.
Also, the PR you see now is not live; it's from sometime in late August.
It might be completely different by now.
There's no "sandbox"; it's the effect of having no trust, whether from a penalty keeping you from gaining it or from not being linked from trusted sites. A reinclusion request might have prompted someone from Google to re-evaluate your site and lift a penalty, but then again the penalty might still be there (if there was any), hence the low positions. Or it might have been issued again because of an offsite factor they didn't see just by looking at your site. You should double-check any links you have, in and out.
There's no mirror site, is there?
If you've gone over the following checklist:
- remove the nofollow from internal links
- add unique meta description and title to ALL pages
- make up your mind on canonical URLs and do the 301 redirects
- from www to non-www or the other way around
- from index.? to /
- if there are any dupe pages that can be disallowed by a pattern, disallow them in robots.txt
- if there's no pattern use NOINDEX in the html header
- go through your incoming and outgoing links ( remove bad sites )
- remove massive footers, especially with tons of anchor text links
- internal navigation should be consistent with the words it uses
- remove sitewide outbound links
- remove sitewide inbound links to you from other sites
- remove unnecessary syndicated content, blog rss, news, temperature, whatever :P
- see if anyone is scraping you or not ( are those articles ONLY on your site? )
- see if the pages are being cached with the changes
- allow one or two weeks for the links to be refreshed in the database
... once this is done and STILL nothing...
Then it might be safe to say that you ARE doing something that calls for a penalty in G's eyes :P ( or the site is constantly being filtered by a quality filter, which is not a penalty )
(i haven't seen your site... besides how would i know ;)
Also, if the 302s came AFTER the reinclusion request that got you your trust back, that might be the reason for losing it again.
Okay... too many options.
What was the cache date of the pages?
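For the canonical-URL items on that checklist, a minimal Apache .htaccess sketch (assuming Apache with mod_rewrite enabled; example.com and the index filename are placeholders):

```
# Redirect non-www to www with a permanent 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Redirect /index.html to / (adjust to your actual index page)
RewriteRule ^index\.html$ / [R=301,L]
```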
| 4:48 pm on Dec 20, 2006 (gmt 0)|
Today I found 4 more sites linking with 302 to mine so I took action against them.
Also, at the time of writing I did several Google queries:
1) site:http://mywebsite.com -> 340 results, no supplemental
2) [mywebsite.com...] -> only 2 results, all others as supplemental
Next step will be probably to remove the whole website from google's index and submit it again later for reinclusion in order to regain trust.
| 6:22 pm on Dec 20, 2006 (gmt 0)|
What is a 302, and how do you know if they're linking to your site?
| 6:24 pm on Dec 20, 2006 (gmt 0)|
Er... nevyan... in your case you should do this:
site:subdomain.domain.tld inurl:directoryname
And you'll see how it really is.
The site: command won't show supplementals for subdirectories.
btw. you can't remove a site and then get it indexed again... well, you can, but most parameters will remain the same. But as far as I can see, it's not trust that's causing your problems...
| 6:41 pm on Dec 20, 2006 (gmt 0)|
A form of redirect by your webserver saying the page has moved TEMPORARILY, as opposed to a 301, which says the page has moved PERMANENTLY.
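One way to see which status code a linking page returns is to request it without following redirects and read the raw code. A minimal Python sketch; the local test server here is just a stand-in for a real linking site, so you can point check_redirect() at any host and path you want to inspect:

```python
# Inspect the raw HTTP status of a URL (301 vs 302) without
# following the redirect. The tiny local server below stands in
# for a site "linking" to yours via a temporary 302 redirect.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_HEAD(self):
        # Answer every HEAD request with a temporary redirect
        self.send_response(302)
        self.send_header("Location", "http://example.com/")
        self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

def check_redirect(host, port, path="/"):
    """Return the raw HTTP status code of a HEAD request, redirects not followed."""
    conn = http.client.HTTPConnection(host, port, timeout=5)
    conn.request("HEAD", path)
    status = conn.getresponse().status
    conn.close()
    return status

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
print(check_redirect("127.0.0.1", server.server_port))  # 302 = temporary
server.shutdown()
```

A 301 here means the linking page is handing its weight to the target permanently; a 302 is the "temporary" variant discussed in this thread.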
| 9:33 pm on Dec 20, 2006 (gmt 0)|
nevyan, I very strongly suggest you add meta descriptions back to the pages, stat.
Keyword meta is of little/no importance, but the description is of utmost, MAJOR importance with Google.
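A unique, page-specific description tag looks like this (the title and content are made-up examples):

```html
<head>
  <title>Blue Widgets - Acme Widget Co</title>
  <meta name="description" content="Hand-built blue widgets with free shipping. Specs, photos and pricing.">
</head>
```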
| 7:42 am on Dec 21, 2006 (gmt 0)|
|site:subdomain.domain.tld inurl:directoryname |
gives 369 out of 477 Google results - so around 100 are in the supplemental index. The whole article database of the website is in the primary index.
|I very strongly suggest you add meta descriptions back to the pages, stat. |
That was done.
|But as far as i see it's not trust that's causing you problems... |
I am wondering that too.
| 2:52 pm on Dec 21, 2006 (gmt 0)|
The same problem happened to us: we used to rank on the first page of Google for most of our keywords, and we just vanished. We are now on the last page. I went to Google and typed intitle:mytitle and found a scraper that had copied my whole site and was using it for AdSense. Anyone with a similar problem? What should we do apart from reporting the site?
| 1:17 pm on Dec 24, 2006 (gmt 0)|
I guess that this Google ban/hold problem will be resolved with a detailed solution later. It definitely needs time to be broadly observed. For now we just have to wait.
| 12:07 am on Dec 25, 2006 (gmt 0)|
This specific site seems to have several self-inflicted problems, but the end-of-results penalty is a common one, also basically mentioned here [webmasterworld.com...]