I would like to discuss this if it gets published. Thanks a lot.
And it does stand to reason that new sites cannot immediately rank at the top, no matter how good their owners may think they are.
Links are a vote from one webpage to another, and as such create a form of peer review of the site being voted for.
Simply put, new sites have fewer votes. As time moves on, depending on the level of promotion and the quality of the site, the site accrues more votes, and therefore rises in the Google search results.
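To make the "votes" idea concrete, here's a toy Python sketch of link-based ranking, loosely modelled on the published PageRank formula. The pages, the link graph and the 0.85 damping factor are all made up for illustration; Google's real system is obviously far more complex:

    # Toy link-vote ranking, loosely based on the published PageRank idea.
    # Everything here (pages, links, damping) is illustrative only.
    links = {
        "hub-site": ["site-a", "site-b", "new-site"],
        "site-a":   ["hub-site", "site-b"],
        "site-b":   ["hub-site", "site-a"],
        "new-site": ["hub-site"],   # only one page votes for the new site
    }

    damping = 0.85
    rank = {page: 1.0 / len(links) for page in links}

    for _ in range(50):             # iterate until the scores settle
        new_rank = {}
        for page in links:
            votes = sum(rank[src] / len(outs)
                        for src, outs in links.items() if page in outs)
            new_rank[page] = (1 - damping) / len(links) + damping * votes
        rank = new_rank

    for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
        print("%-8s %.3f" % (page, score))

The new site does get a score, it's just the lowest one until more pages vote for it, which is exactly the accruing-votes behaviour described above.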
It's a successful ranking system that does appear to be used by the big 3 search engines, albeit with different emphasis.
I think there is too much emphasis on links, as some types of sites do not get natural backlinks no matter how good they are.
Is this a new kind of sandbox where all links are considered bought and valued at zero by default, and will only start to help the new site after several months?
2) Be honest with yourself. Not saying this is you, but so many people whinge about how original their site is when they've effectively rewritten someone else's. So many people say they have a 'quality' site when it's piss-poor MFA, thinly disguised affiliate, reseller or cookie cutter drop-shipping front.
3) Check to make sure you've built your site properly and that the nav is clean and spiderable. Make sure you're not doing anything silly like using the same description tag on every page (there's a rough checker sketched just after this list). Make sure your 'optimisation' isn't 1990s style. How much original, unique content do you have on each page? How much is template/boilerplate?
4) Get over yourself. Again, I don't mean to be rude but you must remember GOOGLE DOES NOT OWE YOU A LIVING. Stop whining and start working. There are usually hundreds if not hundreds of thousands of sites doing exactly what you do. Many will be better and will have been around longer. Google is not some sort of slot machine set to pay out after a few minutes. It's a multi-million pound business. If there's money to be made on the internet then people will invest in their sites right? You wouldn't expect free print, TV or radio advertising would you?
5) Accept that if you want to rank in Google you need to play it their way. Yes you need links to rank. Yes you can pay for them and spam for them - but you can also get them for free by offering quality - articles, press releases, swaps. All the information you need to learn how to rank sites and make money is out there. This forum is a very good place to start. Search for lists of free directories that will list your site for nothing - or find lists of places to submit articles / press releases to. There are moderators on this forum and others regularly writing huge "20 steps to a successful site" threads for your benefit.
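On point 3, duplicate description tags are easy to eyeball on a small site, but here's a rough Python sketch for bigger ones. The URLs are placeholders, and it assumes the third-party requests and beautifulsoup4 packages:

    # Quick-and-dirty check for duplicate meta descriptions across pages.
    from collections import defaultdict

    import requests
    from bs4 import BeautifulSoup

    pages = [
        "http://www.example.com/",
        "http://www.example.com/about.html",
        "http://www.example.com/products.html",
    ]

    seen = defaultdict(list)   # description text -> pages using it
    for url in pages:
        html = requests.get(url, timeout=10).text
        tag = BeautifulSoup(html, "html.parser").find(
            "meta", attrs={"name": "description"})
        desc = tag["content"].strip() if tag and tag.get("content") else "(missing)"
        seen[desc].append(url)

    for desc, urls in seen.items():
        if len(urls) > 1:
            print("Shared by %d pages: %r" % (len(urls), desc[:60]))
            for u in urls:
                print("  " + u)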
PageRank shifts constantly. Just because you have the same sites linking to you, it doesn't mean they're passing the same value.
A new site, no matter how original your content, will be supplemental without strong, trustworthy backlinks. If you do a bunch of reciprocal link exchanges with a TBPR 10 site, it might help you for a while but if Google catches on, your site may end up completely supplemental as Google loses trust in the way you gain your links.
Also, you make your site more supplemental-prone as you add more pages. A 2-page site needs just one or two weak links to get fully indexed. A 100,000-page site needs a whole lot more backlinks for a majority of its pages to stick.
That's just the nature of the beast.
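A bit of back-of-envelope arithmetic shows why. The threshold and per-link numbers below are pure guesses (nobody outside Google knows the real values); the scaling is the point:

    # Invented numbers, real scaling: if inbound link equity spreads
    # roughly evenly across a site, the links needed grow with page count.
    equity_per_backlink = 1.0
    indexing_threshold = 0.5   # equity a page needs to stay in the main index

    for pages in (2, 1000, 100000):
        needed = pages * indexing_threshold / equity_per_backlink
        print("%6d pages -> about %g backlinks to keep them all indexed"
              % (pages, needed))

Two pages need roughly one weak link; 100,000 pages need tens of thousands of links' worth of equity before most of them stick.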
Anyway, these are just my observations.
home
.sub (branch)
..sub sub (twig)
Links to the home page have to filter down to the end of the tree. The same number of links pointed at the 'branches' or 'twigs' will have FAR more effect getting pages into the regular index than if all the links went to the home page.
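Here's a toy model of that filtering, assuming a three-level site (home, 10 branches, 10 twigs per branch) where each page keeps half the equity it receives and splits the rest among its children. All the numbers are mine, not Google's:

    # Toy "equity filters down the tree" model. The 50/50 split and the
    # site shape are assumptions purely for illustration.
    BRANCHES = 10
    TWIGS_PER_BRANCH = 10

    def equity_at_twig(links_to_home, links_to_this_branch, links_to_this_twig):
        home = float(links_to_home)
        branch = links_to_this_branch + (home * 0.5) / BRANCHES
        twig = links_to_this_twig + (branch * 0.5) / TWIGS_PER_BRANCH
        return twig

    # 10 one-point links aimed at the home page...
    print(equity_at_twig(10, 0, 0))   # 0.025 reaches each twig
    # ...versus the same 10 links aimed one-per-branch...
    print(equity_at_twig(0, 1, 0))    # 0.05 reaches each twig
    # ...versus one of those links pointing straight at a twig.
    print(equity_at_twig(0, 0, 1))    # 1.0 lands on the twig itself

Same ten links, but a twig that gets one directly ends up with forty times the equity it would see trickling down from the home page.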
Google was quick to find and crawl the page. Upon checking, I noticed striking behavior: when I search for the domain name ("mydomain"), Google shows a normal result, but when I use the "site:mydomain.com" operator, Google shows a supplemental result.
For the time being, I have no idea as to potential implications of this observation, but I think I will be able to test one or two parameters. Firstly, I will add some "relevant content" to the index page, and wait for the next crawl. Then, I will get some backlinks, and again wait for the subsequent crawls. I think in this way, I will be able to speculate about which factor (content vs. backlinks from authority sites) is critical in new sites' going supplemental. My initial opinion is that backlinks carry more weight.
When I search with Firefox (2.0), I get supplemental results. But when I search with IE (6), I get normal results. I checked the datacenters and saw that it is the same IP in both cases. Really bizarre for the site: operator.
Giving us high accuracy on reports is just not Google's core business, it's a sideline. Still, their efforts in this direction are very welcome, and often better than the other engines. It's just so frustrating at times to be dealing with "almost accurate" instead of having confidence that "they nailed it".
It's the worst system.... except for all the other systems.
Ideally Google would get a human expert to examine each site in detail, but given the billions of sites out there the only way to do things is through automation.
Machines cannot understand content in any meaningful way, so the only option open to a search engine is to see how many other sites link to that site. The backlinks method is (in theory) a way to indirectly bring in human experts to rank sites.
I always thought it was due to indexing space in the main SERPs, in that your indexing reference numbers can only go so high before you have problems.
I suspected (and I guess I'll get laughed down) that the Google index had grown so large they required two, which has effectively become an in/out door for sites that are failing or on the way up.
I just can't recall the exact reasoning behind the number problem, but it was something like the old Microsoft assumption that 640K of memory was the most a computer would ever need, so memory addressing only went so high.