|8 months of crap from Google for my new domains|
| 12:42 pm on Sep 10, 2006 (gmt 0)|
I'm pretty new here and rarely post, so I haven't done a lot of research on these threads.
I just DON'T understand why Google has been ABSOLUTELY horrendous this year with caching and ranking new sites.
I purchased 3 new domains in January and optimized these sites fully, plus increased their page rank. They were in the index within 1 month, but wouldn't come up under any keywords; if they did, it was under some really odd keywords (2 hits a day or something like that).
These 3 new domains, by the way, were keyword-targeted for unpopular keywords. When I have done this in years past with other domain names, I ranked high in Google (for those keywords) within 2 to 3 months. Now they're not even in the top 20 PAGES of Google. It's driving me absolutely crazy. I keep telling myself that their cache must be screwed up and that they're really behind on everything... but I could NEVER imagine 8 months behind on ranking.
The keywords I'm targeting are IN the domain name, are IN the title, and are optimized throughout the site... and these are NOT popular keywords. I'm doing everything correctly, and I've been doing this for 5+ years. Could someone please tell me what the heck is going on with Google regarding new domains?
Any info would be VERY MUCH appreciated.
Goin crazy here in jersey,
| 2:18 pm on Sep 10, 2006 (gmt 0)|
There is no problem with Google regarding new domains. I know that because I participate in an SEO contest in which one participant registered a new domain, got more than 40 IBLs within 2 weeks, and now takes the lead.
All you have to do is get more IBLs from other related sites.
| 2:22 pm on Sep 10, 2006 (gmt 0)|
> purchased 3 new domains in january
I don't claim to truly understand what factors constitute a site being buried in the sandbox. However, it's frequently reported here that a domain name under a year old can stay bogged down deep for close to, or even over, a year.
It's also, though less frequently, reported that someone's new domain managed to miss out on the sandbox.
I dunno what makes the difference for those sites.
| 3:00 pm on Sep 10, 2006 (gmt 0)|
|...optimized these sites fully, plus increased their page rank. |
I think the old line by Starkist Tuna applies here:
"Sorry Charlie, we don't want tuna with good taste, we want tuna that tastes good."
In other words, it's not necessarily enough to have "increased their page rank", because what the toolbar shows you is meaningless; it hasn't been a reliable SEO metric for a few years now. Also, a "fully optimized" site, depending on what you mean by optimized, may be the last thing Google wants in its index.
That is the difference between tuna with good taste, and tuna that tastes good.
| 3:35 pm on Sep 10, 2006 (gmt 0)|
I have to say that my experiences are very similar to yours, and that trying to achieve anything on Google is very difficult, to say the least. Spending time trying to discover the "magic" secret formula to get seen on Google is, in my opinion, a complete waste of time nowadays.

My original business site still ranks very well on Google, and I have set a few new ventures going over the past couple of years, all built using the same principles of layout and design and in much less competitive areas. All rank well on MSN and, to a lesser extent, on Yahoo. It is easy to get Google to visit your site, and I have logs with many thousands of spider visits to what I consider well-formed pages. So, what does Google do with them? I do not know, and I don't think the so-called Google experts have any better idea of what is going on than I do. When you see pages crawled a week ago VISIBLE in the search results, then replaced with the same pages crawled months ago, it defies comprehension.
I am now taking a different tack. 10% of my main traffic comes from MSN, so instead of concentrating on one new site I will build 10, which should give me the same result as trying to build one site for Google. In the long term this has to be the way to go, as I have wasted so much time on Google. My stance is now to ignore Google as EVERYONE is simply guessing what is going on.
All of this raises the question: just how many people could state, hand on heart, that they have registered a domain within the last three months and now have the pages fully indexed, visible, ranking and non-supplemental?
If you have, why not share how you achieved it? If you have tried and failed, post what fails. Between the two we might find out what ACTUALLY works. It would also be interesting to keep a count of successes versus failures, and I am pretty sure I know how that will stack up.
Get posting, folks.
| 3:52 pm on Sep 10, 2006 (gmt 0)|
|...registered a domain within the last three months and now have the pages fully indexed, visible, ranking and non-supplemental? |
That hasn't been an easy job for a few years -- you're describing the much discussed "sandbox effect". Actually, in recent months, it seems a bit easier to get out from under those filters, but that is "easier" and still not "easy".
In the past year I've heard several long-established SEOs talk about the need to manage new client expectations about Google ranking. In my opinion, Google is not the place to look for an instant online business success.
One last comment -- if a domain were "fully indexed, visible and ranking", I wouldn't care whether it had a Supplemental Result tag or not.
| 5:55 pm on Sep 10, 2006 (gmt 0)|
Thank you kevinpate for this straightforward observation...
|I dunno what makes the difference for those sites. |
and thank you confuscius...
|My stance is now to ignore Google as EVERYONE is simply guessing what is going on. |
and thank you tedster.
|That hasn't been an easy job for a few years |
The Google of 2002-2003 is not the Google of 2006. It is no longer fairly predictable, it is not fully understandable, and some would say it may even be somewhat dysfunctional. Given that there are now LOTS of people adding millions of pages to their sites -- to maximize their own AdSense $$ income -- it is no wonder the algo is having trouble keeping up. So if you're doing everything right and still not getting indexed, the honest answer is: it's probably not you...
| 6:53 pm on Sep 10, 2006 (gmt 0)|
This year I haven't had any problem getting new sites listed, just keeping position with the older ones.
With my new sites I quite often approach the local press websites, who are quite obliging about giving me links from their topical pages. This usually results in my site being picked up (first parse) within a week, listed by its index page for a month, and then completely cached and listed as individual pages within a 3-month period. So it might be an idea to enrol some local press support; they usually enjoy helping local firms.
Also, I get a lot of help from articles on bbc.co.uk; they are gold dust at getting the bots knocking at your door ... but you can't show any adverts on the page they link to.
| 8:25 pm on Sep 10, 2006 (gmt 0)|
This forum is the first place where I've found other people thinking what I've been thinking all summer. Something seems to have gone wrong with the Google cache.
I followed this Google Group for a few months until I got tired of it:
Its distinguishing feature is hundreds of people trying to find rational explanations and rational responses to Google's gyrations, under the assumption that Google is a rational system that is merely mysterious. No -- for the past few months it has been acting like an irrational system operating in "random" and messed-up mode. There is no rational response, and you cannot even gauge whether a change in your rankings or SERPs was the result of some action you took, because the system itself is so "noisy". Random fluctuations drown out any real signal.

Yesterday they had something like 80 of my pages indexed. Today it's 229. Previously, I might have been tempted to think it was "something I did", but I know better now because I've watched it for so long. On any given day it could be any number of pages. And that's relative *stability* compared to earlier this year; my site's more stable now than it's ever been. And what happened to my site is really trivial compared to what's happened to other people.
I agree with the advice to just try to build a good site and ignore Google's troubles. I believe they'll likely get fixed eventually, but trying to figure out how to get your site on a good Google footing at this time is just a waste of effort that could be better spent developing the site itself while waiting for Google to fix itself.
I know this is really hard on people who rely on income from Google search results. Read a hundred or so posts in the forum linked above. It's heartbreaking. It has done real damage.
| 9:13 pm on Sep 10, 2006 (gmt 0)|
We have domains less than a year old that are ranking top 10 for their keywords. The more good-quality links you have, the better off you are. AdSense seems to help with indexing, since the bots crawl more.
| 9:36 pm on Sep 10, 2006 (gmt 0)|
|tuna with good taste, and tuna that tastes good |
Which is the same thing, is it not... huh?
I would have gone with: optimized farmed tuna, full of added nutrients exactly as the consumer demands... against organic, natural tuna...
| 10:05 pm on Sep 10, 2006 (gmt 0)|
|>>>tuna with good taste<<< which is the same thing is it not..huh? |
That was a reference to a commercial from twenty or so years ago. That must have flown twenty miles over your head (and possibly many others).
Starkist Tuna used to have a commercial where a tuna fish would cultivate himself with fancy clothes and fancy music on a fancy stereo, acquiring all the outward signals of sophistication, otherwise known as Good Taste. But Starkist isn't looking for tuna with signals of sophistication; Good Taste, as used in the context of that commercial, is not the same as tasting good.
That is analogous to an SEO creating a web page with the keywords in the title and H1, and keywords bolded and italicized elsewhere on the page. Otherwise known in some circles as "properly optimized."
In my opinion it's sending the wrong signal: having good taste, but not tasting good.
Google is not looking for pages that are well optimized for specific keywords. Google is looking for trusted sources of information that will answer a query. Sending out signals that a page is optimized for specific keywords is not the same as sending out signals that you're a trusted source of information.
So instead of focusing your attention on optimizing your site for keywords, try focusing your attention on becoming a trusted source of information. So ask yourself, what are signals of a trusted source of information?
<edited on martinibuster's request>
[edited by: tedster at 4:04 am (utc) on Sep. 11, 2006]
| 3:46 am on Sep 11, 2006 (gmt 0)|
|we don't want tuna with good taste, we want tuna that tastes good. |
This would be cute, except that Google in the last 12 months has gorged itself on a lot of very bad-tasting tuna.
It's been almost a year now since Google's PR machine was espousing the coming of BigDaddy to sort everything out.
12 months on, and no one can convince me their SERPs are any better. Far from it.
Sums it up in a word.
| 4:02 am on Sep 11, 2006 (gmt 0)|
|The Google of 2002-2003 is not the Google of 2006. |
And neither is the web of 2000 the web of 2006. The blog explosion, social networking, and many other sea changes. Sometimes I feel like we want Google to be perfect no matter what gets thrown at it. Yes, I agree that some of the bugginess in past months is a bit hard to handle, and the public avoidance of certain critical issues seems worthy of a candidate for high office.
But I really do understand. A lot of sites flow past my eyes in a week, and I'm amazed that the SERPs are as good as they are. I remember search engines in the 1990s -- the results really SUCKED. If I did a search on AV, sometimes I had to be happy if there was even one URL in the top ten that was anywhere near relevant.
If I were to sum up my current disposition, I would say that Google came on the scene and blew us all away. Since then, they keep dazzling us here and there with their further promise, but delivering on that promise is proving to be a big challenge. They (and the web) may be approaching a nearly unmanageable level of complexity.
Hey, at least they've got network TV using Google Earth for graphics!
I was talking with a friend last night about complex systems, and our discussion got around to honeybees and their hives and behavior. Any given bee begins its work with a nearly random flight, and sooner or later it usually finds a decent source of nectar. In future flights, this worker bee then makes a "beeline" for that source a lot of the time, but it never cuts out random flights from its pattern.
There is great natural wisdom here for both webmasters and Google alike. Any source of nectar will eventually dry up -- so we need to keep exploring other flight paths if we intend to stay around. A little randomness needs to be built in.
[edited by: tedster at 2:31 am (utc) on Sep. 20, 2006]
| 5:02 am on Sep 11, 2006 (gmt 0)|
I really wonder why some sites can copy pages of content from other sites word for word and rank well (no supplemental), while others with completely original content that can't be found anywhere else get thrown into supplemental results.
Searching for an article on rewrites gives me a page of top-ten results with exactly the same content from ten different sites. What is the point of that, Google? Shouldn't those duplicates be in the supplemental index?
| 6:19 am on Sep 11, 2006 (gmt 0)|
I am seeing the same thing on a domain I purchased in December. I had no more than 2 IBLs when they did the PR update, and ended up with a PR of 1 (I know, not that important).
Since then I have gotten more than 50 IBLs, and my Sitemaps account shows that everything is working, including the updated PR distribution etc. It also tells me that no pages from my site are in Google's index. This is obvious, because if I do a site: search on my domain nothing comes up... there is not a single page from my domain in the Google index. This is not even a matter of ranking in the SERPs. I have not even done any SEO.
There is something fishy going on and Google's Webmaster Center is doing jack to help webmasters in this situation.
/end rant that has been brewing for 9 months.
| 1:10 pm on Sep 11, 2006 (gmt 0)|
|And neither is the web of 2000 the web of 2006. The blog explosion, social networking, and many other sea changes. |
Agreed, which was the point I was trying to make by saying that the people adding millions of pages (per site) to cash in on AdSense are making crawling that much more difficult. And as you said, when you add blogs to that, plus MySpace et al, anyone can see the magnitude of the challenge.
I am particularly troubled by these programs that auto-construct huge megasites for the express purpose of generating AdSense revenue. These kinds of websites are mostly clogging the pipeline with recycled content, but at the same time Google benefits tremendously from the revenue, so one can wonder whether they have any real incentive to deal with the situation. It strikes me as bordering on conflict of interest (better crawling and cleaner results vs. meeting quarterly revenue estimates), but because there's a lot of $$ involved, they may not even see a problem.
Which raises the question -- did going public (and thus the need to take care of stockholders) help bring on this scenario? Perhaps the Google of 2002 would have simply ignored all the auto-constructed megasites, which means the rest of us might be seeing better SERPs in 2006. Don't know, and it's a moot point, but it makes me wonder.
| 2:17 pm on Sep 11, 2006 (gmt 0)|
Based on what I've read in this thread, would it be reasonable to hypothesize that the importance of a DMOZ listing, plus listings in other acknowledged, top-of-the-line paid directories, is increasing even now? Especially given the two-man companies boasting 5-million-page websites: if they read/edited 40 pages each a day, they would need 62,500 days per site edit; that's roughly 171 years. Believable, eh? :-)
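As a quick sanity check of that back-of-the-envelope arithmetic (using the figures from the post above: 5 million pages, 2 people, 40 pages each per day):

```python
# Could two people hand-edit a 5-million-page site
# at 40 pages per person per day?
pages = 5_000_000
editors = 2
pages_per_editor_per_day = 40

days_needed = pages / (editors * pages_per_editor_per_day)
years_needed = days_needed / 365

print(f"{days_needed:,.0f} days, or about {years_needed:.0f} years")
# → 62,500 days, or about 171 years
```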
The other thought being that Google, Yahoo and MSN have themselves introduced a strong random element to the SERPs. If they hadn't, you could conceivably have the SERPs reflecting a Who's Who of SEO, i.e. the best-SEO'ed sites, the longest-standing SEO'ed sites. That would be a formula for stasis; newer entrants would never get a look in.
I dunno if the above is what you were referring to, Tedster.
| 3:38 pm on Sep 11, 2006 (gmt 0)|
3 months ago I changed many page names on a website of mine. I did the 301 redirects in my .htaccess, and MSN showed the new pages in about 2 weeks. Google did the same with some of these pages, but only for a while; then it started showing the old pages again, and today the new pages are still not indexed. My point here is: how come MSN can do it and Google can't?
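For reference, 301s of that kind are usually just a few mod_alias lines in `.htaccess`. This is only a sketch -- the page names below are hypothetical, not the poster's actual files:

```apache
# Permanently (301) redirect each renamed page to its new URL,
# so search engines transfer the old page's standing to the new one.
Redirect 301 /old-page.html /new-page.html
Redirect 301 /products.htm /products.html
```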
| 2:41 am on Sep 19, 2006 (gmt 0)|
Thanks, guys, for all the replies. Sorry it took me so long to reply back; I actually forgot about this thread for a week and then remembered it again. I must have Google syndrome or something.
I agree with everyone who says there are serious ranking issues with Google on new domains. 2 of the domains I registered earlier this year are BOTH #1 on Yahoo for their main targeted keyword, but nowhere to be found on Google... I searched 20+ pages.
I had never heard the term Sandbox Effect before... but it sounds like what I have. Amazing that the best search engine in the world could have such serious issues. Anyone know how to notify them of these errors? There used to be an email address for their webmaster department... but it's impossible to find now.
| 3:11 am on Sep 19, 2006 (gmt 0)|
|Anyone know how to notify them of these errors? |
Google is well aware of the "sandbox effect" that their filters have been creating -- there's no need to let them know. They don't consider it an error, it's something they do in an attempt to protect the quality of their search results against spammers launching boatloads of temporary, disposable domains. Many website owners don't feel that the final effect is in line with Google's original plans and intentions, but they certainly do know about it.
| 3:53 am on Sep 19, 2006 (gmt 0)|
|it's something they do in an attempt to protect the quality of their search results against spammers launching boatloads of temporary, disposable domains |
I have no intention of making them disposable domains. If that is the case then why would my domain fall in the category?
| 4:49 am on Sep 19, 2006 (gmt 0)|
|If that is the case then why would my domain fall in the category? |
The length of time a domain is registered may be considered an indicator of quality. For instance, a domain regged for ten years can be assumed to be more likely to be in it for the long term.
That's one signal of quality out of many possible signals.
Tuna that tastes good, etc...
| 3:55 pm on Sep 19, 2006 (gmt 0)|
I dunno, I have been seeing the "sandbox" effect for a few of my sites (and client sites) where either it's a completely brand new site, or else a complete reworking of an old one. I can't really get a handle on any one thing; the results all seem kind of random. If there's a rhyme, reason or pattern in any of it, I sure can't see what it is.
I added a couple of new pages to my personal domain some months ago, and they still haven't shown up in the index - and I've had my domain name with a website on it since 1994.
I did a brand new site for a band earlier in the year on a new domain, and all the pages showed up in Google for about a week, and then it went back to just the home page, and there it still sits at just the home page.
A client came to me with a new site (that I didn't design) and asked me to help them get indexed - I changed all the page titles and meta descriptions, and all the pages were picked up by Google in less than a week. Who knows if they'll stay?
On the other hand, I rebranded another website for a client who got bought by another company, and the switchover for all the pages showed up in less than a week - and stayed.
I've changed two sites over from tables to CSS - both have been around for quite a while. One of them had all the new pages picked up immediately, the other hasn't, even though Googlebot comes daily.
I've gotten to where I'm almost afraid to make any changes to anything whatsoever, because the sites I do the least with are not only maintaining, but doing better.
(I had one site that was completely "sandboxed" for almost three years, until May 20, 2005 when it miraculously recovered under the update that day, and has been at #1 ever since. It was great - but just as mystifying, since I couldn't predict it or put a finger on what conditions had changed)
| 4:29 pm on Sep 19, 2006 (gmt 0)|
Ted's approach seems to me to have the kind of solid good sense that would be borne up even by the most rigorous technical analysis.
Google HAS to look random in its treatment of any single site; any failure to do this, and it becomes subvertible. And in today's web, that subversion means the death of a billion spammer-cuts.
So every post that says "I can't understand what Google is doing with this site" is a testimony to Google's continued success at defending its results against systematic subversion. (A perfect defense against SHOTGUN subversion is, of course, an impossible dream.)
Again, Google HAS to maintain that unpredictability. Even if that requires purely random, unjustified drops in site ratings, it would absolutely be worth it (from Google's perspective, and in the long run from the surfer's perspective).
Think of Google as a rabbit in a meadow, completely surrounded by lines of men with semi-automatic shotguns. What MUST it do? Move to avoid the highest concentrations of firepower, learn to survive ingesting a quantity of lead equivalent to the average density of pellets, and still find something to eat.
The hunters are already doing a fairly good job of filling the entire space with randomly fired pellets, which is THEIR optimum approach: because if they concentrate fire in some places, that's less free-flying lead somewhere else, which is where Google will attempt to go.
Ted's "signals of optimized keywords" are zones of high concentrations of pellets, the rabbit is well-advised to be elsewhere, no matter how many good clover patches it misses out on.
| 1:59 am on Sep 20, 2006 (gmt 0)|
|I've gotten to where I'm almost afraid to make any changes to anything whatsoever, because the sites I do the least with are not only maintaining, but doing better. |
I have the same exact feeling -- "leave well enough alone" -- which of course is in many ways the antithesis of what the web should be, which is to say, dynamic.
So for some of us, our fear of upsetting the great GoogleGod means mostly standing still, and that is not a particularly healthy environment to be in.
| 10:07 am on Sep 20, 2006 (gmt 0)|
Hutcheson, that is brilliant.