
Google News Archive Forum

This 52 message thread spans 2 pages; this is page 2.
Bad results on Google for my new site
Almost no hits from Google, and I'm for sure no newbie.

 7:42 pm on Apr 17, 2004 (gmt 0)

Hmm, this is a little strange. I have a site that, when it got spidered, got about 1,000 uniques a day for maybe a week. That's OK for the beginning, but now I get almost none from Google or Yahoo. That's weird, but also interesting.

The site is pure HTML with PR5 (so not banned), but maybe there's some filter. The site is also made the usual SEO way, but still moderate of course, and is very user friendly. I have another with almost the same layout but with different content and it's doing OK (could be better), so I just don't get what's wrong with this site.

Any suggestions as to what could be holding the site down in the rankings?




 7:50 am on Apr 18, 2004 (gmt 0)

>In a niche where literally hundreds of full sites of pure
>garbage go up every week, lots of them are ranking
>very well right away.

steveb, that's something I'm seeing too. But I guess these sites are not new. They have probably been waiting for rankings for months and are now starting to rank. No proof that these are new pages.

I recently launched a new site (March 20) that is deep crawled now (two full weeks between the first crawl of the index page and the deep crawl). The site has a DMOZ listing and two handfuls of on-topic backlinks. Result: no rankings besides some very obscure queries with just a few dozen results. Even the domain's keyword pair widget-example doesn't return the page higher than position 10, though it does return all the sites that link to it.

What is even stranger is that new pages I launch at established sites don't rank either, but they influence the rankings of established pages that are linked from the new pages. I just did one new page at my biggest, most established site. It's a kind of site map covering a sub-topic, linking to and describing the various sections of the site that deal with that topic. These linked pages (sections) changed their rankings (no improvement, just changed; one page even dropped). But the new page (the site map) is nowhere ...

OTOH, I launched a classifieds section at my established site on March 10. Rankings are brilliant. Doh!

Although there are exceptions, I'm pretty sure Google has a freshness delay ...


 7:54 am on Apr 18, 2004 (gmt 0)

>So what are the distinguishing characteristics of those that go into quarantine and those that don't?

There is no proof of a "quarantine", so why indulge this concept? Maybe little green men are blocking the site too? There is every proof that Google has a money-word filter that jumps around and is inconsistent. To blame Google's inconsistencies and their nutty-acting money-word filter on a quarantine is a little over the edge IMO. It's more likely that they change their filter depending on how much revenue they need from AdWords this week/month.


 8:02 am on Apr 18, 2004 (gmt 0)

>In spite of what we do *know* and figure out, there's obviously something going on that we haven't the slightest notion of. And I really think it's time for the crepe-hangers to go silent and stop spreading their whiney, negative words of gloom and doom around like a dark shroud.

Very poor, very....

There are people out here losing BIG money and you want to sit on a high horse and call them "whiney" and "negative"? Maybe you should suggest they all commit suicide next, so as not to disturb your fantasy reading further.


 8:06 am on Apr 18, 2004 (gmt 0)

>>There is every proof that Google has a money-word filter that jumps around and is inconsistent.

With all due respect, we've yet to hear any substantiation of that theory. We're waiting to hear what the proof is, but frankly, patience is running out.

What IS the proof, and what is the testing that's been undertaken to verify and substantiate this so-called "proof" we keep hearing about?

Per the forum charter, we value analysis, so we're still waiting for a response.

Apologies for pushing the point, but whining only raises more questions; it doesn't attempt to come up with answers beyond simplistic parroting that "Google is Broke."

How about we go back to the very first message in this discussion and, taking the focus off ourselves, try to focus on the issues the original poster brought up?

So what are the distinguishing characteristics of those that go into quarantine and those that don't?

Are there or are there not distinguishing characteristics that we can extract a pattern from?


 9:07 am on Apr 18, 2004 (gmt 0)

OK, back on topic.

I put a site up about five or six weeks ago that sells surplus widgets. It has about five or six inbound links from PR4 and PR5 sites. It was nowhere to be seen; then on Friday, bang! There it was, right up there with all the rest of them. It had SERPs that were really encouraging and even managed a couple of number ones for less popular search phrases.

Foolishly I let my client know about this, and by the time he got round to checking the results the site had disappeared! Trying to be objective, I would say that the site was performing about where it should have been, or perhaps a little better. But it was only there for a few hours before disappearing again. Is this part of the same effect that is being discussed here? If so, what happens next?


 11:07 am on Apr 18, 2004 (gmt 0)

"But I guess these sites are not new."

I considered that, and I suppose some "new looking" sites may be appearing out of quarantine, but still, besides those, I see sites that obviously weren't even plausible two months ago -- the equivalent, for example, of something based on a news event.

What do these instant stars have in common? That's an excellent question, but probably impossible to tell considering everything. "Lots of links", as in several hundred minimum, is one suspect.


 11:17 am on Apr 18, 2004 (gmt 0)

BeeDeeDubbleU, well, that is something like the site in question. Maybe there is something about new sites. I always get my sites spidered after two days when I make a new site or page, but like others here, my site had rankings for three days and then it was gone, though still with PR5 and good SEO. Let's take the theory of a new-site filter/block: I could imagine that a filter/site hold could take about three days to take effect in the search results, so that's interesting.

What about the other site I made two months before? It ranks great, like I expected; there is no hold there and never was. So there goes the theory again. So where are we?

I know there's a lot of bad Google talk, but frankly I can't say much bad. Every update my main site has performed the same. The only change I see is if you search for products, then the search is bad, but that I think has something to do with their AdWords. THE END

Marcia, where are we now in our theory? And don't forget it's also not performing well on Yahoo, so my five cents is that there is something I have maybe overlooked in the layout of this site -- some boundary of the SEO law that has been crossed, maybe just a bit.



 11:57 am on Apr 18, 2004 (gmt 0)


I had the exact same problem. Here's something for you to try. For the site that is not doing well in Google, put some links to the top three or four ranking sites on your index page.

Someone recommended this to me and it pulled my site up to the first page (#8) for my main search term. Not sure how or why.



 12:05 pm on Apr 18, 2004 (gmt 0)

wellzy, thanks, but I don't think that's the problem.



 12:13 pm on Apr 18, 2004 (gmt 0)

I'm certain it's due to being 'new'. I have two new sites in two relatively competitive areas. I had no input in one except to give the designer, text writer and SEO broad guidelines, and the other is completely home grown. Everything is different on them both, down to registrant, hosting, links, etc. They were both launched on the same day and they are both now dust.

Patrick Taylor

 12:25 pm on Apr 18, 2004 (gmt 0)

I launched a new site on 1st February - 60 pages. Each of my 10 or so designed-for keywords and phrases is searched about 4,000 to 5,000 times a month (according to Overture), so I suppose you would say they're not very competitive, but all those keywords and phrases (that the site is about) add up to account for about 40,000 searches per month.

The main pages were indexed in about two weeks and the rest followed in stages; by the end of March all 60 had been indexed by Google. I have maybe three decent backlinks, one of which is a DMOZ category with a page PR of, I think, 4. All my pages now have a PR of 4 or 5.

As far as my SERPs go, I can't complain, with about 400 page views per day on average, and that's ever since the pages were indexed. Some of the pages rank in the top ten for their principal keyword or key phrase, some are nowhere, but I've seen no evidence from this (rather modest) site that there is a quarantine. All I can say is that some of the pages have dropped a little from where they began, but nothing dramatic.

<added>... er, no quarantine, unless, of course, my pages are destined for future stardom.</added>


 3:22 pm on Apr 18, 2004 (gmt 0)

Hi All,

I have witnessed a "phenomenon" that wasn't on a new site or new page, but it did indeed lead me to believe there was some sort of pre-stored data used to help expedite delivery of the search results, especially on results returning millions of pages.

I had placed a link, with specific anchor text, on my main page to a new small website. After a while I noticed that MY SITE was starting to come up in the results for a search on the keywords I used in the anchor text to that site (the small site did not realize the same benefit).

Well, I made changes to my main page in an attempt to improve my own placement for my keywords. One change I made was the wording in its title. While I was at it, I removed the link/anchor text from that page (since it did no good and I did not want to target that phrase).

The changes were picked up and also showed up in Google's cache of the page. The odd thing was, even though the title was changed, Google displayed the old title in the results, while the cache for the page showed the new title.

Seeing this, I did a search for the removed anchor text and sure enough, the page still came up in the results even though the text appeared nowhere on the site nor in any anchor text to the site.

I did see the same page, new version, accurately returned with the new title displayed for a more defined search with far fewer results (i.e. 600,000 as opposed to 2,000,000).

It was obvious that Google was using an older stored cache version, or some pre-stored data, that resulted in still ranking that page in the results of the much less targeted search returning over 2,000,000 pages.

This "phenomenon" lasted almost three weeks before things were truly updated.


 7:54 am on Apr 19, 2004 (gmt 0)

So what are the distinguishing characteristics of those that go into quarantine and those that don't?

I almost hate to answer because the answer seems simplistic (and some details are proprietary), but what seems to set our more successful (and recently launched) sites apart from those that went into quarantine are these relatively simple and previously discussed traits:

Less use of Targeted Keywords
Also, we believe that kw appearance patterns need to vary depending upon pages/levels within the site.

More User Friendly Text
Makes a difference. Period.

Sites with More Outbound Links
Many have said it recently; we've always done it; we believe it is important; especially on the homepage.

No <META> Tags
Go ahead, throw stones. Why did we lose the METAs? It was an easy and universal way to reduce "SEO" activity (i.e., it reduces the number of possible infractions that might lead to a penalty when there are many different factors at play). In a company with multiple developers at work, this sort of simplicity may not be sophisticated, but it is simple and relatively effective. Of course, remove too many SEO'd elements and you'll have a site that doesn't rank well for the concepts it's supposed to rank well for.

I can't say that any one of these is a provable explanation, but I must say that when I look at the list, and the details of what we've done, it does seem to support:
- theories relating to Over Optimization and PR hoarding being problems ...
- the notion that linking to related / relevant sites is a plus, (goes back to comments from Brett on site building).

Of Particular Note
One of the new sites that's doing really well has only *two* significant backlinks, both from PR5 sites (not owned by us). Go figure. I would have bet against this site doing really well so fast, but sound linking strategies and strong content strategies and less attention to SEO appear to have overcome any issue related to too few backlinks.

The new anti-SEO site. :-)


 8:31 am on Apr 19, 2004 (gmt 0)

>>There are people out here losing BIG money and you want to sit on a high horse and call them "whiney" and "negative"?

If they are just that, we call it like it is. ;)

We're dealing with algorithmic search, which intrinsically has to be based upon observing patterns and predictable parameters of relevance indicators. It'll serve us all well to examine patterns and consistencies, and will do us absolutely no good to bash or whine. Like it or not, that's how it is.

I find caryl's and caveman's new posts quite interesting; there are a couple of very interesting observations there.

The new anti-SEO site. :-)

hehe... that about says it, doesn't it?


 8:43 am on Apr 19, 2004 (gmt 0)

cabbie, austtr and marcia - a special effect for location-related searches? Yes, I believe so...

Imagine you live in a small town and are at the local town meeting to discuss the budget deficit. Everyone wants to say their piece and it looks like the meeting is going to drag on into the small hours. Your eyelids start to slip, and you are only jerked out of your doze by a few speeches. Who are those speakers?

> the town mayor
> the representative of the state bank
> the head of the chamber of commerce
> the philanthropist who lives in the mansion on the hill at the back of town

Maybe with location-based searches there is an element of having to be qualified to speak for an area?

Town Mayor = tourist office/local government site
Representative of the state bank = section of a larger encompassing site a level above (e.g. directories, Amazon)
Head of the Chamber of Commerce = well-linked and well-connected sites from others within that area
Philanthropist = pure info/interest site giving and attracting links to and from disinterested parties

Dealing with an area, it used to be relatively easy to get small specific sites to punch above their weight - "widgeting in Anytown", even "widgeting in Anystate" if you were lucky. Now the formula seems to be to look for sites at an Anytown level or above and then look for "widgeting" within those sites, rather than for pages about "widgeting in Anytown".

So maybe get qualified to talk for your area.

How to do that? No hard and fast ideas, but some of these might be worth trying:

> local links in and out (see other threads relating to this)
> expanding the coverage of the site (up a local government level, or out to neighbouring parishes?)
> look closely at related directory categories (IMO, certain people have been far too quick to validate their prejudices and celebrate the "demise" of the ODP - look at a throwaway comment by Mike Grehan in the Yahoo interview recently about the worth of directory categorisation)

Caveman and liane (in another thread) - de-optimisation? I don't see it and here's why:

One area I have been working in is developing pages about a new(ish) sport in various languages. In some languages I have no idea what the "money phrases" for this subject are because of the newness and the unfamiliarity with the nuances of language in this case. So the pages were just "thrown at the wall" to see what worked and what the logs held. So, no optimization - but the same effects as for optimized pages on the same site.

In other words, if the site ranked for optimized pages it ranked for non-optimized. If it was lost for optimized pages, it was lost for the non-optimized.

Lastly, a lot of people working in non-regional areas will be wondering what this has to do with them. Since most of my stuff is regional, I'm not sure - but the principle could be the same.

Spark plug is to engine is to car is to automobile industry?


 1:47 pm on Apr 19, 2004 (gmt 0)

I think that to add "value" to .edu and .gov sites might be overcomplicating the issue.

To simplify: they also tend to share the fact that they are sites with thousands of pages.

If you reduce this to a size factor, you now let in forums, review sites, and enormous retail chains.

I will not ask anybody to just believe what I am about to say, you can verify this easily enough for yourself.

Make a chart: put in your site's position, the total number of pages Google has listed for your site, the total number of backlinks Google has listed for your site, and the position of the site in an allinanchor: lookup in Google (list all the results across).

Now do the same for the top ten (or more) sites listed for the same search.

You will start to get the picture.
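The chart described above can be sketched as a small script. This is just an illustration with hypothetical, hand-collected figures (the site names and numbers below are made up, except where they echo the examples given later in this thread) - at the time, each value would have come from manually running the search plus site:, link: and allinanchor: lookups:

```python
from dataclasses import dataclass

@dataclass
class SiteRow:
    name: str             # label for the site (hypothetical examples below)
    serp_position: int    # rank for the search phrase in the normal results
    pages_indexed: int    # total pages Google lists for the site
    backlinks: int        # total backlinks Google lists for the site
    allinanchor_pos: int  # position of the site in the allinanchor: lookup

# Your site plus the top results for the same search, one row per site.
rows = [
    SiteRow("your-site.example", 42, 120, 15, 38),
    SiteRow("competitor-a.example", 1, 31, 28, 2),
    SiteRow("big-forum.example", 5, 14500, 0, 60),
]

# Print the chart with aligned columns so patterns stand out.
print(f"{'site':<22}{'pos':>5}{'pages':>8}{'links':>7}{'anchor':>8}")
for r in rows:
    print(f"{r.name:<22}{r.serp_position:>5}{r.pages_indexed:>8}"
          f"{r.backlinks:>7}{r.allinanchor_pos:>8}")
```

Reading down the columns for ten or so sites is the point: if the top positions correlate with pages indexed rather than with backlinks or allinanchor: position, that supports the size-factor idea.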

I have seen two occasions just yesterday where the mere mention of a keyword phrase on a PR0 page of a LARGE forum with high PR actually made that forum come up in the top 5 positions for a search in Google for that phrase.

The forum is not an authority on the topic. The forum is an SEO forum and the two phrases are commercial products. No links or even anchor text were involved.

The #1 site returned, in one case, had 31 pages, and 28 backlinks. This forum site has 14,500 pages, 0 backlinks.

As far as searches involving location, Google needs to find a way to make the Location secondary and the other keywords primary.

But, then again, maybe their introduction of "local search" is an attempt to compensate for this problem.


 1:40 am on Apr 20, 2004 (gmt 0)

stever, did anyone here say that de-optimization and "money words" have anything to do with one another? I didn't, anyway.

We have definitive proof that de-optimization works, if you know what to do ... or guess right. Not sure which one we are. :-)

Alterations we applied after Florida brought sites back.

How do we know it wasn't just later changes in the algo, which are known to have brought back a good number of sites that made no changes? After one of the sites that we altered post-Florida rebounded ... we changed it back to its old style, and it disappeared again. Changed it back using post-Florida assumptions, and boom, it's back.

Good enough for me. Of course, I like simple-minded things. For those who prefer their solutions to be more complex, I can offer little advice. ;-)


 6:00 am on Apr 20, 2004 (gmt 0)

stever, did anyone here say that de-optimization and "money words" have anything to do with one another? I didn't, anyway.

I wasn't referring to previous posters linking the two concepts, but talking about not knowing what the popular searches were in a foreign language for a new subject, and thus not doing any normal optimization to the page.

In many non-English languages, the case and gender can make differences to words used. Thus I was using "natural" copy provided by a translator, which would seem to bear great similarities to the use of "stemming" or associated words that other people have been talking about - or at least not using the traditional forms of on-page optimization.

After one of the sites that we altered post-Florida rebounded ... we changed it back to its old style, and it disappeared again. Changed it back using post-Florida assumptions, and boom, it's back.

Good enough for me.

I like that off-on-off-on explanation a lot better than others which just say "I changed it and it came back."

If you don't mind answering, on a scale of 0 (completely unoptimized) to 100 (most optimized), where were your previous pages and what are they now?


 6:35 am on Apr 20, 2004 (gmt 0)

It appears that if your PR is high enough you don't get hit by the filter. Sites that have PR7 rank well, but similarly optimized sites at PR5 and PR6 aren't ranking.



 9:42 am on Apr 20, 2004 (gmt 0)

I would call it a network identification filter. Can you sticky me the network of your site, zeus?


 12:15 pm on Apr 20, 2004 (gmt 0)

webnewton, what do you mean? That all the sites are on the same server? I don't think that's the problem. I have a feeling that I'm bumping against some limit in the algo, like too little text or something like that, and then some sort of filter kicks in, because I also get almost no hits from Yahoo, only 5 a day.

The situation is very interesting: PR5, backlinks from internal sites and outside, and 3 days where it got 1,000 uniques a day. That time frame could be the time a filter takes to become active across the whole index, like an update. The site is NOT banned and doesn't have any illegal stuff.

P.S. If GoogleGuy is watching, this is no complaint, it's just a little SEO talk.


 12:26 pm on Apr 20, 2004 (gmt 0)

A site I released late February suffered from having homepage PR while all internal pages showed blank PR. The homepage came up in searches; content pages didn't.

However, over time Google has slowly indexed more and more of the site. At the moment it indexes just about all of it (2,000-ish pages) and internal pages also have PR - some the same as the homepage, others less. All internal pages are now ranking well.

The site has around 15 backlinks, most of which are from high-quality authority sites, including ODP and the BBC.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
Home ¦ Free Tools ¦ Terms of Service ¦ Privacy Policy ¦ Report Problem ¦ About ¦ Library ¦ Newsletter
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved