
Google SEO News and Discussion Forum

    
Why do new sites become Supplemental Results?
experienced




msg:3373148
 6:39 am on Jun 20, 2007 (gmt 0)

I have observed many small sites with completely true and original content, no promotional language or keyword stuffing, a limited number of links on every page, and fresh, genuine content on each page. Such a site gets indexed in Google, but only with supplemental results. I am sure content and the number of links are not the only factors behind supplemental results, but when we are certain a site is genuine by every measure, why do supplemental results still appear for a new site? A new site cannot collect many links within its first month, so what rule pushes a new site into the supplemental results? It's very disappointing. I can bet Google that there is no duplicate content or anything else that could lead to supplemental results.

I would like to discuss this if it gets published.

thanks a lot.

 

Jakpot




msg:3373397
 12:32 pm on Jun 20, 2007 (gmt 0)

One reason web pages become supplemental is because they would rank high and Google does not want them to for corporate financial reasons.

centime




msg:3373411
 12:38 pm on Jun 20, 2007 (gmt 0)

Google, via its representatives, has stated that a lack of links is the primary reason why pages are placed in the supplemental index.

And it does stand to reason that new sites cannot immediately rank at the top, no matter how good their owners may think they are.

Links are a vote from one web page to another, and as such create a form of peer review of the site being voted for.

Simply put, new sites have fewer votes. As time moves on, depending on the level of promotion and the quality of the site, it accrues more votes and therefore rises in the Google search results.

It's a successful ranking system that does appear to be used by all of the big three search engines, albeit with different emphases.

b2net




msg:3373517
 2:36 pm on Jun 20, 2007 (gmt 0)

But why do some new sites get all their pages into the main index, only to see them start dropping one by one into the supplemental index while the number of backlinks to the site stays the same or increases?

I think there is too much emphasis on links as some types of sites do not get natural backlinks no matter how good they are.

SEOPTI




msg:3373626
 4:02 pm on Jun 20, 2007 (gmt 0)

It's simple: their crawling process needs only low PR to crawl a page, but far more PR, i.e. quality links, is needed to stay in the main index.

Jakpot




msg:3373640
 4:17 pm on Jun 20, 2007 (gmt 0)

The backlinks concept is academic baloney and only provides a rationale for penalizing perfectly sound web pages. Coat it with intellectual honey and sugar and it's still a poorly conceived "dream" from a PhD thesis. Follow the money!

b2net




msg:3373676
 4:50 pm on Jun 20, 2007 (gmt 0)

Before the June update I didn't have any trouble staying in the main index with new sites that had a couple of PR4-5 backlinks. Now those links will get a site crawled and the root page indexed, but all of the subpages are either supplemental or not in the index at all.

Is this a new kind of sandbox where all links are considered bought and valued at zero by default, and only start to help the new site after several months?

fishfinger




msg:3373710
 5:15 pm on Jun 20, 2007 (gmt 0)

1) Be SURE that your content is unique. Don't just check Copyscape (it can be blocked); take 20 words here and there from the middle to the bottom of the page and search for them in quotes (a rough script for building such a quoted search is sketched after this list).

2) Be honest with yourself. I'm not saying this is you, but so many people whinge about how original their site is when they've effectively rewritten someone else's. So many people say they have a 'quality' site when it's a piss-poor MFA, thinly disguised affiliate, reseller or cookie-cutter drop-shipping front.

3) Check that you've built your site properly and that the nav is clean and spiderable. Make sure you're not doing anything silly like using the same description tag on every page. Make sure your 'optimisation' isn't 1990s-style. How much original, unique content do you have on every page? How much is template/boilerplate?

4) Get over yourself. Again, I don't mean to be rude, but you must remember GOOGLE DOES NOT OWE YOU A LIVING. Stop whining and start working. There are usually hundreds, if not hundreds of thousands, of sites doing exactly what you do. Many will be better and will have been around longer. Google is not some sort of slot machine set to pay out after a few minutes. It's a multi-million pound business. If there's money to be made on the internet then people will invest in their sites, right? You wouldn't expect free print, TV or radio advertising, would you?

5) Accept that if you want to rank in Google you need to play it their way. Yes, you need links to rank. Yes, you can pay for them and spam for them - but you can also get them for free by offering quality: articles, press releases, swaps. All the information you need to learn how to rank sites and make money is out there. This forum is a very good place to start. Search for lists of free directories that will list your site for nothing, or find lists of places to submit articles and press releases to. There are moderators on this forum and others regularly writing huge "20 steps to a successful site" threads for your benefit.
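As a rough illustration of the quoted-snippet check in point 1, here is a minimal Python sketch. The page URL is a hypothetical placeholder, and the 20-word sample and regex-based tag stripping are simplifying assumptions; it only builds a Google query URL for you to open by hand, it is not a Copyscape replacement.

# Minimal sketch of the "20 words in quotes" duplicate-content spot check.
# Assumptions: the page is publicly fetchable HTML; the URL below is a placeholder.
import re
import urllib.parse
import urllib.request

def snippet_query(page_url, words=20):
    """Grab a run of words from the middle of a page and build a quoted Google search URL."""
    with urllib.request.urlopen(page_url) as resp:
        html = resp.read().decode("utf-8", errors="ignore")
    text = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)  # drop scripts and styles
    text = re.sub(r"(?s)<[^>]+>", " ", text)                   # crude tag stripping, not a real parser
    tokens = text.split()
    mid = len(tokens) // 2                                      # sample from mid-page, per the advice above
    phrase = " ".join(tokens[mid:mid + words])
    return "https://www.google.com/search?q=" + urllib.parse.quote('"%s"' % phrase)

if __name__ == "__main__":
    # Hypothetical deep page; replace with one of your own.
    print(snippet_query("http://www.example.com/widgets/blue-widget.html"))

Open the printed URL in a browser; if other domains come back for that exact phrase, the content is not as unique as you think.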

Halfdeck




msg:3377801
 5:02 am on Jun 25, 2007 (gmt 0)

"the number of backlinks to the site remains the same or increases?"

PageRank shifts constantly. Just because you have the same sites linking to you, it doesn't mean they're passing the same value.

A new site, no matter how original your content, will be supplemental without strong, trustworthy backlinks. If you do a bunch of reciprocal link exchanges with a TBPR 10 site, it might help you for a while but if Google catches on, your site may end up completely supplemental as Google loses trust in the way you gain your links.

Also, you make your site more supplemental-prone as you add more pages. A 2-page site needs just one or two weak links to get fully indexed; a 100,000-page site needs a whole lot more backlinks for the majority of its pages to stick.

That's just the nature of the beast.

[edited by: Halfdeck at 5:03 am (utc) on June 25, 2007]

Rufus_dog




msg:3377938
 7:55 am on Jun 25, 2007 (gmt 0)

When Google first developed the supplemental index, pages went into the supplemental results because of duplicate content (within the site), a lack of unique titles and descriptions, and so on. But later on, they started to put more and more inner pages into the supplemental results.

Even if your homepage has a lot of backlinks, and its PR is transferred to the first-level inner pages, those inner pages can still be put into the supplemental results if no sites link to them directly. The higher your website's PR, the more pages Google will let into the main index. If you have a new site with few backlinks, your homepage can easily get a PR3, but even with original content most of your inner pages will be in the supplemental results. If two websites both have 200 pages, the one with PR6 can have more pages in the main index than the one with PR3.

Anyway, these are just my observations.
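To put a rough number on the dilution described above, here is a back-of-the-envelope Python sketch using the textbook PageRank split (damping factor 0.85, PR divided evenly among outlinks). It is only an illustration; Google never published the actual thresholds that decided what went supplemental.

# Toy illustration of PR dilution: a home page with a fixed amount of PageRank
# passes roughly DAMPING * PR / N to each of the N inner pages it links to.
DAMPING = 0.85   # damping factor from the original PageRank paper
HOME_PR = 1.0    # arbitrary units; only the relative split matters

for n_inner_pages in (10, 200, 100000):
    per_page = DAMPING * HOME_PR / n_inner_pages
    print("%7d inner pages -> ~%.6f units of PR each" % (n_inner_pages, per_page))

The same homepage PR spread over 200 pages, or 100,000, leaves each inner page with far less, which matches the observation that bigger sites need proportionally more backlinks to keep their inner pages in the main index.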

fishfinger




msg:3377967
 8:30 am on Jun 25, 2007 (gmt 0)

With a medium-sized site you will have different 'layers':

.home (trunk)
.sub (branch)
.sub sub (twig)

Links to the home page have to filter down to the end of the tree. The same number of links pointed at the 'branches' or 'twigs' will have FAR more effect in getting pages into the regular index than if all of the links went to the home page.
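As a toy demonstration of that point, here is a small iterative PageRank run in Python (standard textbook formulation, not Google's production system) over a trunk/branch/twig site, comparing one external link aimed at the home page with the same link aimed at a 'twig'. The site layout and the every-page-links-home navigation are assumptions made for the sketch.

# Toy PageRank comparison: does one external link help a deep page more when it
# points at the home page or directly at the deep page?
DAMPING, ITERATIONS = 0.85, 50

def pagerank(links):
    """Plain iterative PageRank over a dict of page -> list of outlinks."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    pr = dict.fromkeys(pages, 1.0 / len(pages))
    for _ in range(ITERATIONS):
        new = dict.fromkeys(pages, (1 - DAMPING) / len(pages))
        for page, targets in links.items():
            if targets:
                share = DAMPING * pr[page] / len(targets)
                for t in targets:
                    new[t] += share
        pr = new
    return pr

# Trunk/branch/twig layout; every page links back home (a common nav pattern, assumed here).
site = {
    "home": ["sub1", "sub2"],
    "sub1": ["twig1", "twig2", "home"],
    "sub2": ["twig3", "twig4", "home"],
    "twig1": ["home"], "twig2": ["home"], "twig3": ["home"], "twig4": ["home"],
}

for target in ("home", "twig1"):
    graph = dict(site)
    graph["external_page"] = [target]   # the one outside link, aimed at home vs. a twig
    pr = pagerank(graph)
    print("external link -> %-5s : twig1 PR = %.4f, home PR = %.4f"
          % (target, pr["twig1"], pr["home"]))

Run it and the twig gets noticeably more PageRank when the external link points at it directly, which is the 'deep link' effect described above.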

selomelo




msg:3378496
 6:46 pm on Jun 25, 2007 (gmt 0)

Recently I registered a new domain and launched the site with an index page consisting of only a query form, a single link to an internal page, and a copyright notice. There are only four words and a single internal link on the page. Right now there are no backlinks, as the site has just launched and is still in the development phase.

Google was quick to find and crawl the page. Upon checking, I noticed a striking behavior. When I search for the domain name ("mydomain"), Google shows a normal result. But when I use the "site:mydomain.com" operator, Google shows a supplemental result.

For the time being I have no idea as to the potential implications of this observation, but I think I will be able to test one or two parameters. First, I will add some "relevant content" to the index page and wait for the next crawl. Then I will get some backlinks and again wait for the subsequent crawls. I think in this way I will be able to speculate about which factor (content vs. backlinks from authority sites) is critical in a new site's going supplemental. My initial opinion is that backlinks carry more weight.

fishfinger




msg:3378947
 8:17 am on Jun 26, 2007 (gmt 0)

The site: operator is pretty unreliable lately - try inurl: and see what you see then.

selomelo




msg:3379492
 7:26 pm on Jun 26, 2007 (gmt 0)

Another interesting addition to the above post of mine:

When I search with Firefox (2.0), I get supplemental. But when I search with IE (6), I get normal results. I checked the datacenters and saw that it is the same IP in both cases. Really bizarre for the site: operator.

SEOPTI




msg:3379620
 9:40 pm on Jun 26, 2007 (gmt 0)

I think the site: operator is there to fool webmasters.

tedster




msg:3379766
 1:21 am on Jun 27, 2007 (gmt 0)

I don't go as far as SEOPTI does - I don't feel there's any intentional deception. But I also don't think Google realized, when they set up these functions, just how SERIOUSLY webmasters would watch every report they can get -- that includes the site: operator, GWT reports, and so on. The reports we get have been in the area of "ballpark" or "best guess" rather than the laser precision we hope for.

Giving us high accuracy on reports is just not Google's core business; it's a sideline. Still, their efforts in this direction are very welcome, and often better than the other engines'. It's just so frustrating at times to be dealing with "almost accurate" instead of having confidence that "they nailed it".

incrediblehelp




msg:3382628
 1:47 am on Jun 30, 2007 (gmt 0)

" One reason web pages become supplemental is because
they would rank high and Google does not want them
to for corporate financial reasons"

Now this is funny. I needed a laugh today.

gibbergibber




msg:3382753
 8:36 am on Jun 30, 2007 (gmt 0)

-- The backlinks concept is academic baloney and only provides a rationale for penalizing perfectly sound web pages. --

It's the worst system.... except for all the other systems.

Ideally Google would get a human expert to examine each site in detail, but given the billions of sites out there the only way to do things is through automation.

Machines cannot understand content in any meaningful way, so the only option open to a search engine is to see how many other sites link to that site. The backlinks method is (in theory) a way to indirectly bring in human experts to rank sites.

theBear




msg:3382869
 1:32 pm on Jun 30, 2007 (gmt 0)

Good grief, SEOPTI, I wish you wouldn't do that ;-).

You have me all confused; I thought that was the function of the link: command.

A site that I know quite well currently shows 1,060, and Yahoo's Site Explorer says over 57,000.

Essex_boy




msg:3382977
 4:31 pm on Jun 30, 2007 (gmt 0)

"One reason web pages become supplemental is because they would rank high and Google does not want them to for corporate financial reasons" - may not be as funny as you claim.

I always thought it was due to indexing space in the main SERPs, in that you can only go so high with your indexing reference numbers before you run into problems.

I suspected (and I guess I'll get laughed down) that the Google index had grown so large that they required two indexes, which have effectively become an in/out door for sites that are failing or on the way up.

I just can't recall the exact reasoning behind the number problem, but it was like Microsoft thinking that the most memory a computer would ever require was 256K: memory addressing addresses blocks of 256K but can only go so high.
