I had noticed a while back that pages with just a search form and very little content were doing well.
So what causes these pages to rank well - inbound link text?
There seems to be some flawed logic here: quality content attracts inbound links, therefore rank pages on IBLs rather than on content. I'm not sure about that.
Anyway, I guess the question is does content matter or is it all about IBLs?
Imagine that your Yellow Pages directory returned 50%+ poor results; would that sound excellent to you too?
But they aren't the Yellow Pages, and their services are nowhere close to comparable.
If I do a search and 4 out of the top 10 results are good enough to meet my needs, then I have three more results than I require, i.e. excellent results.
I almost never require "the best" site on the subject, I just need a site that is "good enough". If I get one site that is "good enough" then I am satisfied.
<I almost never require "the best" site on the subject, I just need a site that is "good enough". If I get one site that is "good enough" then I am satisfied.>
Fair enough. Then you just need to keep on using Google for your search after that one "good enough" result, because that is all it can deliver at present :-)
Have a great day and a successful week.
Which content is king?
One could say the first one; it is fine from the user's point of view.
But search engines and webmasters prefer the second one.
Why? The target for a webmaster is the first page of the SERPs. If you have a page with perfect content but it does not appear in the top 10 or 20, it is invisible to most SE users.
As a result, various SEO techniques are used to achieve a high keyword density and an artificially inflated number of IBLs from satellite sites created (automatically generated and optimized) solely for those IBLs.
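For anyone wondering what that "keyword density" figure actually measures: occurrences of the phrase as a share of the page's total word count. A rough sketch of the usual calculation (my own toy illustration; no engine publishes its real formula):

```python
# Toy keyword-density calculation: phrase hits as a percentage of total
# words. My own illustration, not any search engine's actual formula.
import re

def keyword_density(text: str, phrase: str) -> float:
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    if not words or n == 0:
        return 0.0
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == target)
    return 100.0 * hits * n / len(words)  # each hit covers n words

print(keyword_density("blue widgets are the best blue widgets", "blue widgets"))
# -> 57.14... on this 7-word snippet; real pages aim far lower
```

The density games happen on-page; the satellite sites handle the off-page half by inflating the IBL count.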
Modern search engines, including G and Y, are not able to provide a good-quality, personalized service to the particular user, one of hundreds of millions of SE users.
surfgatinho - You're right.
They do take on-page factors into account more than G does.
My concern is that, on the other hand, I don't think they use enough filters to avoid being manipulated by the "gardens" of IBLs that some websites have.
What do you think?
Assaf.
<My concern is that, on the other hand, I don't think they use enough filters to avoid being manipulated by the "gardens" of IBLs that some websites have.>
Content is king because if your visitors reach a worthwhile site (regardless of how they get there), they will buy / come back.
I have, in a database, 12 million pages' worth of unique content. I intend to generate 12 million static pages for my site.
1) What will be the best way to get all this content indexed by Google?
2) Assuming that this is all top-quality, highly relevant descriptive text, how much of it will Google take?
3) How long will Google take to suck it all up?
Does anyone else have any experiences of working with this much content on a single static site?
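On question 1, one approach is a sitemap index: split the URL list into sitemap files of 50,000 URLs each (the per-file cap in the Sitemaps protocol) and point an index file at them. A sketch under assumptions: BASE, the file names, and url_iter() are invented for illustration, with url_iter() standing in for a query against your real database.

```python
# Sketch: chunk a huge URL list into 50,000-URL sitemap files plus an
# index file. BASE, file names, and url_iter() are made up for illustration.
from itertools import islice

MAX_URLS = 50_000
BASE = "http://www.example.com"

def url_iter():
    # Stand-in for a query against the real content database.
    for i in range(12_000_000):
        yield f"{BASE}/page-{i}.html"

urls = url_iter()
part = 0
while True:
    chunk = list(islice(urls, MAX_URLS))
    if not chunk:
        break
    with open(f"sitemap-{part}.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for u in chunk:
            f.write(f"  <url><loc>{u}</loc></url>\n")
        f.write("</urlset>\n")
    part += 1

with open("sitemap-index.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for p in range(part):
        f.write(f"  <sitemap><loc>{BASE}/sitemap-{p}.xml</loc></sitemap>\n")
    f.write("</sitemapindex>\n")
```

12 million URLs works out to 240 sitemap files, comfortably under the 50,000-sitemap cap on the index itself. How much of it Google takes, and how fast, is another matter.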
Wasn't the original phrase "Content is King" coined in regard to what was perceived as the winning strategy for websites to grow their userbase, not to climb the SERPs?
One thing leads to the other; it is a natural progression. Good quality content leads to natural inbound links, which boost your PR, which boosts your visibility in the SERPs.
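That "links boost your PR" step is, at bottom, the PageRank recurrence: a page's score is a share of the scores of the pages linking to it. A toy sketch on an invented four-page graph, using the textbook formula with the usual 0.85 damping factor, not Google's production algorithm:

```python
# Toy PageRank on an invented four-page link graph; textbook power
# iteration, not Google's production code.
links = {  # page -> pages it links out to
    "home": ["article", "links-page"],
    "article": ["home"],
    "links-page": ["home", "article", "partner"],
    "partner": ["article"],
}
pages = list(links)
d = 0.85                                   # damping factor
pr = {p: 1.0 / len(pages) for p in pages}  # uniform start

for _ in range(50):
    pr = {
        p: (1 - d) / len(pages)
           + d * sum(pr[q] / len(links[q]) for q in pages if p in links[q])
        for p in pages
    }

for p, s in sorted(pr.items(), key=lambda kv: -kv[1]):
    print(f"{p}: {s:.3f}")
```

Every new natural inbound link adds another term to that sum, which is exactly why content that earns links feeds visibility.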
We have seen one stunning example recently for 'generic-word blue widgets', where the generic word appears quite a lot, blue appears once, and widget appears once. It's a set of affiliate links with no content.
It appeared a couple of months ago.
Frustratingly, we also have a page of aff. links with plenty of useful surrounding content and guidance about blue-widgets.
The only thing I can think of is that the algo has somehow 'learnt' that someone searching for 'generic-word blue widgets' can only be searching as a buyer, and therefore just wants an immediate list of places to buy them, not information about them.
So if you've got a commercial site, do you just stick the list out there and forget the content? Or do you create two pages - one with the list and little else, and one with the list and content - making sure they don't overlap too much?
<Someone did mention earlier that some sites are ranking highly when they hardly have any of the keywords present at all.>
I have lately seen sites at position #1 where the keyword/keyphrase count ranges from one to five and even 10+!
However, I have noticed that some pages which hold top positions in the SERPs have titles where the "base" keyword is repeated twice. For example:
Widget card processing service. Get Widget cards Online
<Frustratingly, we also have a page of aff. links with plenty of useful surrounding content and guidance about blue-widgets.>
IMHO, this is the most effective way to PRE-SELL. You may wish to group your affiliate links and content into pages with one specific theme per page.
You may wish to take a look at these 2 threads
[webmasterworld.com...]
[webmasterworld.com...]
I have 12,000 products, each with many variables; in the past, each of these variables was documented in paper catalogues. These have now been transferred to a digital format that equates to millions of pages of unique descriptive content.
Does that make me a spammer?
I see so many local-interest sites that dominate the SERPs not because they have better content but because they have been around for 4 or 5 years (some of them don't seem to have been updated since then, either).
Anyway, it's a self-perpetuating situation where they get links because they come high up the SERPs.
<Anyway, it's a self-perpetuating situation where they get links because they come high up the SERPs.>
Exactly! The thing is that lazy webmasters looking for good content to link to don't bother to go any further than the first one or two SERPs to find what they want.
You have to dig deeper to find the truly worthy, "fresh" sites to link to. It's a question of taking the time to hunt them down. Few people bother, and so the problem continues.
...many well ranking sites are page after page of fluff. Search engines have no way to distinguish between good content and lousy content designed to do little more than cover every keyword. In fact the latter tends to rank better. Still, in Google, inbound links are far more important than content, other things being equal.
I'm seeing a lot of fluff (mainly template-based fluff) ranking high in Google lately, but that doesn't mean the situation is permanent or that Google intentionally favors links over content. The importance of PageRank waxes and wanes, but I think most WW members would agree that it isn't as decisive a factor as it was a few years ago.
Still, no search engine can determine quality or even relevance through on-page factors alone, and inbound links remain a useful tool. The trick is to determine which inbound links are meaningful and which ones aren't. Google's TrustRank is a promising concept (especially the idea of human-vetted "seed sites" as benchmarks), and there are other techniques (such as ignoring inbound ROS links, inbound links from networks of affiliate sites, or internal links beyond a certain number) that can help to keep brute-force linking from having undue influence on ranking. I think we can safely assume that Google's engineers are comfortable with complexity, and that they view the search landscape in color or shades of grey and not in black and white.
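For the curious, the published TrustRank idea is essentially biased PageRank: the random jump lands only on the human-vetted seed sites, so trust flows outward from them, and a link farm with no path from any seed ends up with nothing. A toy sketch (graph and seed list invented; this follows the paper's idea, not whatever Google actually runs):

```python
# Toy TrustRank-style propagation: PageRank whose teleport vector is
# concentrated on hand-vetted seed sites. Graph and seeds are invented.
links = {
    "seed-directory": ["good-site", "niche-blog"],
    "good-site": ["niche-blog"],
    "niche-blog": ["good-site"],
    "spam-hub": ["spam-a", "spam-b"],  # link farm, unreachable from seeds
    "spam-a": ["spam-hub"],
    "spam-b": ["spam-hub"],
}
pages = list(links)
seeds = {"seed-directory"}
d = 0.85

# Teleport mass goes only to the trusted seeds.
jump = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
trust = dict(jump)

for _ in range(50):
    trust = {
        p: (1 - d) * jump[p]
           + d * sum(trust[q] / len(links[q]) for q in pages if p in links[q])
        for p in pages
    }

for p, t in sorted(trust.items(), key=lambda kv: -kv[1]):
    print(f"{p}: {t:.3f}")  # the spam cluster scores exactly 0.0
```

However many links the spam cluster trades among itself, its trust stays at zero because no seed ever reaches it; that's the appeal over raw link counting.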
I've just remembered one reason the IBL-to-quality-site relationship is flawed:
Older sites have a huge advantage.
That is a generalization, and not completely accurate.
While an older site may have a large collection of long forgotten links, the site had to be good enough to get those links in the first place.
There are a lot of old sites that don't rank anywhere. That is because they are crappy sites that have never been able to get any links.
In fact, with all the reciprocal linking, it is a lot easier for crappy sites to get links now than it was for them to get them in the past.
It is also reasonable to give any organisation with a longer history more credibility, whether on the web or in the real world. That is why you see things like "Est. 1897" on business signs. Why should a new company get a free pass when competing with a business with 108 years of customer-relations history?
Every real business takes time to build, whether on the corner or on the web.
It has been suggested that outgoing links to "quality" sites may help rankings.
IMO this has compounded the problem, in that webmasters link to sites that already do well in the SERPs rather than looking for quality up-and-coming sites that aren't in the top ten.
So, just to reiterate: I think the bias away from content and towards links is weighted in favour of established sites and against newer sites, which in some cases have more up-to-date content.
Having said all that, the pages that prompted me to start this thread don't fit into the above category. They are not particularly old, have no content, have negligible PR, and don't have a huge number of links.
The one that has wound me up most appears to have only one offsite IBL. That link contains the KWs, and the linking site is ODP-listed, although I wouldn't regard it as a quality site.
Maybe the problem is more to do with how Google weights the links from what it perceives as quality sites.
The site doesn't have a heap of links yet, but it can still rank top 3.
I'm sure Slurp has found lots of spam between March 7th and now, but they seem to be filtering it out better than Google, while at the same time allowing new sites without thousands of votes/links to have a shot too.