We’re seeing this happen to more and more of our clients with small sites, but this one is not that small and it does get updated fairly often.
Worse there are only 3 pages indexed in Google’s database and two of them are in Supplemental Results. (There’s a fair number of pages indexed in the Yahoo Index and Teoma: 248 pages in Teoma).
There are also a fair number of links to this site. In fact Google itself shows over 50 other pages that contain their domain name. Of course there’s no Google PR or backlinks. A quick link check shows at least 30 sites linking to them.
Any suggestions on how to figure out why this is happening? (I do have a robots.txt file, but I think it’s ok. I copied it below.)
PS – They do have a small Adwords campaign running
Here’s the robots.txt file
The symptoms are typical, such as:
1) inward links don't seem to matter
2) site ranks well initially for a month or two, and then vanishes from say, top 100 results or more
3) indexed pages (other than homepage) start showing up as supplemental links.
If the site is new, here's what seems to be working for one of my sites hit by the sandbox effect:
1) Keep adding links. Say, 2-4 new links a week is brilliant. You will notice that the links actually show up in the SERPs, even though the page they link to (your site) is nowhere to be seen. Funny in a way.
2) Keep modifying the site. Change something, anything at all. Interchange keywords in title, add some alt text, remove the alt text, change a 'last updated' date, anything will do. This will keep Googlebot visiting the index page everyday.
You will have to keep doing this for a month, and then you get the next deep crawl by Googlebot, and in a few days or a week you are back in the SERPs. If not in the top results, at least somewhere visible. From then on, normal SEO activities will work.
You can just sit tight too. The problem with that is, you don't get daily Googlebot visits, Google assumes your site is practically static, and the next deep crawl comes much later. The new links and site modifications tempt Googlebot back faster.
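Step 2 above (keep modifying the site) can be sketched as a tiny script; the file contents, the date format, and the `touch_last_updated` helper are illustrative assumptions, not anything from the thread:

```python
import re
from datetime import date

def touch_last_updated(html: str, today: date) -> str:
    # Rewrite the 'Last updated: YYYY-MM-DD' stamp so the page changes
    # on every run, giving Googlebot a reason to keep revisiting.
    stamp = f"Last updated: {today.isoformat()}"
    return re.sub(r"Last updated: \d{4}-\d{2}-\d{2}", stamp, html)

page = "<p>Last updated: 2004-01-15</p>"
print(touch_last_updated(page, date(2004, 3, 1)))
# prints "<p>Last updated: 2004-03-01</p>"
```

Run from a daily cron job, something like this gives the index page a small, harmless change every day.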
Just my two rupee :)
wanderingmind, you misunderstood the sandbox effect. Nothing to do with pages or sites removed from the index or the directory. Nothing to do with supplemental results.
New sites getting their pages indexed but not ranked. They can be found for site:example.com searches but don't rank for their keywords.
>no longer had a page rank
>We’re seeing this happen to more and more of our clients
Doesn't sound too good. Are the different clients' sites all linked from the same places? Could they be identified as related to the same creator?
That's what I meant. This particular site I mentioned was created in January. Ranked well in SERPs through February.
In March, site disappeared from SERPs. When I say 'disappeared' I mean the pages are indexed, will show up for an allinurl: search, but will not appear in top results for any keyphrase search, even for non-competitive ones.
The extra news is that when I do an allinurl:www.mysite.com search, the pages show up as supplemental results.
Yes, the site can be identified by its creator easily, I designed it, and a link to my site is there on their website. But apart from that, there is nothing that's approaching black hat techniques.
After doing all the steps I mentioned, the homepage started ranking better for certain searches. In the SERPS, between 10-30. I just had a deep crawl too - and I expect the site to be back with decent SERPs in a week.
I cannot see any explanation at all for the indexed pages showing up as supplemental results. The server was never down, so it can't be that.
If it is the sandbox effect, how does it really affect a site? Your pages are indexed, rank well for searches, then disappear from the top SERPs, and come back when the site is out of the sandbox period. The only extra symptom I see is the 'supplemental results'. The rest looks the same as the sandbox effect to me... :(
About the only thing these sites have in common, linking-wise, is our own site. We develop sites as well as do marketing, so we have the typical “web site designed by” link coming at us, and we have a list of clients’ sites on our site with links to some of them. I’ve often wondered if there’s a penalty on our own site, as we have no ranking anymore either, but I checked with Google when this first happened, probably 2 years ago, and they said they saw no penalty. I checked with them again within the last 4 months or so, and their answer this time was boilerplate, along the lines of “just make sure you follow the guidelines published at..”. PS: we are strictly “white hat”.
I’ve thought about removing the links on our site and on clients’ sites, but this would really hurt, as we get prospects from these links.
As I mentioned, I have a number of small sites that have also been removed from the Google directory and had their pages moved to Supplemental Results too. But most people are suggesting this is because the sites are small and don’t change very often. If that’s the case, it’s a real disservice to many small business sites in similar situations. For example, one of the small sites is a timber frame builder: fewer than ten pages, unchanged for probably a year. But they are just as “relevant” in a search on “timber frame builders” as they have ever been, yet now they don’t appear at all.
Are others seeing this with small sites that don’t change much?
Afaik, this should read:
Disallow: /store/storebnr.htm
Just fixed this one. Thanks
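For reference, a minimal well-formed robots.txt in the shape being discussed; apart from the /store/storebnr.htm line quoted above, the paths are placeholders, not the poster's actual file:

```
User-agent: *
Disallow: /store/storebnr.htm
Disallow: /cgi-bin/
```

Each Disallow rule takes exactly one path prefix per line, with no brackets or other trailing punctuation.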
keyword keyword2 keyword3 -asdf -asdfg -asdfgh -asdfghj... etc.
The actual number of exclusion params required may depend on the number of keywords. Also, in the past, Google has closed similar testing methods.
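The trick above amounts to appending nonsense "-term" exclusions to a normal query. A sketch of building such a search URL; the `exclusion_query` helper and the keyboard-run junk terms are assumptions for illustration:

```python
from urllib.parse import urlencode

def exclusion_query(keywords, n_exclusions=4):
    # Build -asdf, -asdfg, -asdfgh, ... by taking longer and longer
    # prefixes of a keyboard run, as in the searches quoted above.
    base = "asdfghjkl"
    junk = [base[:4 + i] for i in range(n_exclusions)]
    q = " ".join(keywords + ["-" + j for j in junk])
    return "http://www.google.com/search?" + urlencode({"q": q})

print(exclusion_query(["keyword", "keyword2", "keyword3"], 3))
# prints "http://www.google.com/search?q=keyword+keyword2+keyword3+-asdf+-asdfg+-asdfgh"
```

As noted, the number of exclusion terms needed may vary with the number of keywords, and Google has shut down similar testing tricks before.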
I had the same problem with a friend's site. It's been on again, off again with both the Google Directory listing and PR.
When PR updates, it gets its PR and a directory listing. Shortly thereafter, it gets greybarred and loses its spot in the directory.
This has happened after the last three(?) PR updates.
I wrote to Google, asking if there was a penalty - the site author writes for an online mag that allows him to reproduce his articles on his site once they begin to become stale.
Google assured me that there was no penalty, and that they are constantly tweaking and this should come out in the wash.
Similar to the missing index page problem that Google had?
I think it was the missing index page problem where, after many members started complaining, GoogleGuy had a look and sure enough something was up. I wonder if this is the same; something is up, even though they replied to me that all was good. Perhaps they're missing something...
Let me explain. It seems to work, and my 'sandboxed' sites rise to the top from nowhere immediately.
However, I also notice that all the directories and yellow pages that crowd the SERPs more or less vanish when I use the -asdfghj search.
So I believe -asdfgh turns off not only the sandbox effect but a host of other filters. Maybe it turns off everything except on-page keywords!
Kaled and Yidaki, am I right?
The missing pages/titles/snippets, etc that have been much reported are most likely (in my opinion) due to a bug or bugs in the first stage or in the spidering. My money would be on spider bugs.
PS Another thread is talking about an update. I see no signs myself but I was expecting something this weekend.
We will never know what secret parameters turn off filters unless Google officially publishes them. I don't know if the -asdfgh search proves anything - I never tested it, since the results would confuse me even more - so why bother. ;)
One of my sites that was showing indexed pages as supplemental results has started ranking high once again, and all the supplemental-result tags have been removed from the SERPs.
If there was a sandbox effect in action, it has been removed for this site.
Approx timeframe the alleged sandbox effect was in action - 2 months.