|Imagine that your Yellow Pages directory returned 50%+ poor results. Would that sound excellent to you too? |
But they aren't the yellow pages, and their services are nowhere close to comparable.
If I do a search and 4 out of the top 10 results are good enough to meet my needs, then I have 3 more results than I require. In other words, excellent results.
I almost never require "the best" site on the subject, I just need a site that is "good enough". If I get one site that is "good enough" then I am satisfied.
<I almost never require "the best" site on the subject, I just need a site that is "good enough". If I get one site that is "good enough" then I am satisfied.>
Fair enough. Then you just need to keep using Google for your search after that one "good enough" result, because that is all it can deliver at present :-)
Have a great day and a successful week.
There are at least two kinds of content when we are talking about web pages.
The first one is user-oriented content (an article, a product description, a list of useful links with brief comments, and similar).
The second one is content (possibly misleading) optimized for a search engine using various spam techniques (hidden text or "borrowed"/hijacked content, for example).
Which content is the king?
One could say the first one, and that is right from the user's point of view.
But search engines and web masters prefer the second one.
Why? The target for a webmaster is the first page of the SERPs. If you have a page with perfect content but it does not appear in the top 10 or 20, your page is invisible to most SE users.
As a result, various SEO techniques are used to produce high keyword density and an artificially inflated number of IBLs from satellite sites created (automatically generated and optimized) only for those IBLs.
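For what it's worth, the "keyword density" these techniques chase is just a simple ratio: occurrences of the term divided by the total word count of the page. A minimal sketch (my own illustration, not any engine's actual formula):

```python
import re

def keyword_density(text, keyword):
    """Illustrative keyword density: occurrences of the keyword
    divided by the total word count of the page text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

# A toy over-optimized snippet: 3 hits in 10 words.
page = "Widgets for sale. Buy widgets online. Our widgets ship fast."
print(round(keyword_density(page, "widgets"), 2))  # 0.3
```

On the toy snippet the ratio comes out at 0.3, i.e. 30% density - exactly the sort of artificially inflated figure the satellite pages aim for.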
Modern search engines, G and Y included, are not able to provide a good quality personalized service to a particular user, one of hundreds of millions of SE users.
I'd put it like this:
Is content still the King?
In google - sometimes.
In Yahoo!, MSN, AV, ATW.. rarely, unfortunately. I'm under the impression that it is easy to manipulate the latter engines' SERPs with IBLs and more IBLs. And IBLs have nothing to do with content, relevancy, or quality.
|In google - sometimes. |
In Yahoo!, MSN, AV, ATW.. rarely, unfortunately.
It's the reverse for us - one content-rich site with unique articles, about 2 yrs old, dedicated to one topic did well in MSN & Yahoo, but is nowhere to be seen in G.
|In google - sometimes. |
In Yahoo!, MSN, AV, ATW.. rarely, unfortunately
I'd say it was the other way round as these SEs rely more on on page factors - like content!
My site does well in all three ... lots of original content. So go figure!
I agree that the bottom line is "Go figure".. :)
surfgatinho - You're right.
They do take into account on-page factors more than G.
My concern, on the other hand, is that I don't think they use enough filters to avoid being manipulated by the "gardens" of IBLs that some websites have.
What do you think?
|My concern, on the other hand, is that I don't think they use enough filters to avoid being manipulated by the "gardens" of IBLs that some websites have. |
I'm not too sure. I never went in for this approach and have always been selective about sites I exchange links with. Many in my field aren't and have many more links - I still manage to get to the top of a lot of competitive KWs in Yahoo! and MSN using (white hat) SEO.
I also know a guy who does well in Yahoo! who doesn't do any link management so I'm not sure what conclusions to draw.
You don't need that many links in Google. Some well-directed, on topic links could very well push you to the top, especially if they're from well-established, high ranking websites.
Will you be able to shed some light regarding your linking strategy?
What would you define as a good link, as far as SEs are concerned - G, MSN, Y!.
Hope this is not too much of a topic ;)
I'm a firm believer that an ODP link far outweighs most other links, so that's a good start.
I think the rest has already been repeated many times - on-topic links, from reasonably high PR pages which are not stuffed full of irrelevant links.
Also try for one-way - i.e. not reciprocal links - which I guess takes us right back to where we started, which is how do you get one way links? The theory is quality content attracts them.
Wasn't the original phrase "Content is King" coined in regards to what was perceived as the winning strategy for websites to succeed in growing their userbase, not to grow their SERPS?
Content is king because if your visitors reach a worthwhile site (regardless of how they get there), they will buy / come back.
Quality Content = Visitors = Sales = Word of Mouth = Return Visitors = links = better SERPS.
Content is still king and will be for a longtime.
Think about it: if you landed on a blank page, would you stay long?
Ok, I have a question.
I have, in a database, 12 million pages worth of unique content. I am intending to generate 12 million static pages for my site.
1) What will be the best way to get all this content indexed by Google?
2) Assuming that this is all top quality, highly relevant descriptive text, how much of it will Google take?
3) How long will Google take to suck it all up?
Does anyone else have any experiences of working with this much content on a single static site?
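On the indexing question, one practical piece is the Sitemaps protocol, which caps each sitemap file at 50,000 URLs, so a 12-million-page site needs a sitemap index pointing at a couple of hundred individual sitemap files. A rough sketch of the arithmetic (file names are purely illustrative):

```python
# Sketch: plan sitemap files for a very large site. The Sitemaps
# protocol allows at most 50,000 URLs per sitemap file, with a
# sitemap index file listing the individual sitemaps.
MAX_URLS_PER_SITEMAP = 50_000

def plan_sitemaps(total_urls):
    """Return illustrative file names for the sitemaps needed."""
    n_files = -(-total_urls // MAX_URLS_PER_SITEMAP)  # ceiling division
    return [f"sitemap-{i:04d}.xml.gz" for i in range(n_files)]

files = plan_sitemaps(12_000_000)
print(len(files))  # 240 sitemap files behind one sitemap index
```

That only gets the URLs in front of the crawler, of course; how much of it Google will actually keep, and how fast, is a different question.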
|Wasn't the original phrase "Content is King" coined in regards to what was perceived as the winning strategy for websites to succeed in growing their userbase, not to grow their SERPS? |
One thing leads to the other. It is a natural progression. Good quality content leads to natural inbound links which boosts your PR which boosts your visibility in the SERPS.
Someone did mention earlier that some sites are ranking highly when they hardly have any of the keywords present at all.
We have seen one stunning example recently for 'generic-word blue widgets', where the generic word appears quite a lot, blue appears once and widget appears once. It's a set of affiliate links with no content.
It appeared a couple of months ago.
Frustratingly, we also have a page of aff. links with plenty of useful surrounding content and guidance about blue-widgets.
The only thing I can think of is that the algo has 'learnt' somehow that someone searching for 'generic-word blue widgets' can only be a buyer, and therefore just wants an immediate list of places to buy them, not information about them.
So if you've got a commercial site, do you just stick the list out there and forget the content? Or do you create two pages - one with the list and little else, and one with the list and content, making sure they don't overlap too much?
<Someone did mention earlier that some sites are ranking highly when they hardly have any of the keywords present at all.>
I have seen sites at position #1 lately where the keywords/keyphrases appear anywhere from once to five and even 10+ times!
However I have noticed that some pages which have top positions on serps have titles where the "base" keyword is repeated twice. For example:
Widget card processing service. Get Widget cards Online
<Frustratingly, we also have a page of aff. links with plenty of useful surrounding content and guidance about blue-widgets.>
IMHO, this is the most effective way to PRE-SELL. You may wish to group your affiliate links and content into pages with one specific theme per page.
I hate to be a bore, but going back to my 12 million pages, does anyone have any advice? Should I put this on a subdomain?
You may wish to take a look at these 2 threads
Thanks reseller. There are a lot of 'if you're adding huge amounts of content you must be a spammer' posts in those threads.
I have 12,000 products, each with many variables; in the past each of these variables was documented in paper catalogues. These have now been transferred to a digital format that equates to millions of pages of unique descriptive content.
Does that make me a spammer?
Some sites that are TOP 10 with 'no content' may actually be cloaking?
I've just remembered one reason the IBL to quality site relationship is flawed:
Older sites have a huge advantage.
I see so many local interest sites that dominate the SERPs not because they have better content but because they have been around for 4 or 5 years (some of them don't seem to have been updated since then, either).
Anyway, it's a self-perpetuating situation where they get links because they come high up the SERPs.
|Anyway, it's a self perpetuating situation where they get links because they come high up the SERPs |
Exactly! The thing is that lazy webmasters looking for good content to link to don't bother to go any further than the first one or two SERPS to find what they want.
You have to dig deeper to find the truly worthy, "fresh" sites to link to. It's a question of taking the time to hunt them down. Few people bother, and so the problem continues.
Content should be king, and it is for users, but I still doubt that good content really helps one rank. Lots of content helps one rank, as each new page adds internal links as well as showing activity, but many well-ranking sites are page after page of fluff. Search engines have no way to distinguish between good content and lousy content designed to do little more than cover every keyword. In fact the latter tends to rank better. Still, in Google, inbound links are far more important than content, other things being equal.
|...many well ranking sites are page after page of fluff. Search engines have no way to distinguish between good content and lousy content designed to do little more than cover every keyword. In fact the latter tends to rank better. Still, in Google, inbound links are far more important than content, other things being equal. |
I'm seeing a lot of fluff (mainly template-based fluff) ranking high in Google lately, but that doesn't mean the situation is permanent or that Google intentionally favors links over content. The importance of PageRank waxes and wanes, but I think most WW members would agree that it isn't as decisive a factor as it was a few years ago.
Still, no search engine can determine quality or even relevance through on-page factors alone, and inbound links remain a useful tool. The trick is to determine which inbound links are meaningful and which ones aren't. Google's TrustRank is a promising concept (especially the idea of human-vetted "seed sites" as benchmarks), and there are other techniques (such as ignoring inbound ROS links, inbound links from networks of affiliate sites, or internal links beyond a certain number) that can help to keep brute-force linking from having undue influence on ranking. I think we can safely assume that Google's engineers are comfortable with complexity, and that they view the search landscape in color or shades of grey and not in black and white.
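For the curious: TrustRank is essentially PageRank where the random jump lands only on the human-vetted seed pages, so trust flows outward from them along links. A toy sketch on a made-up three-page graph (the graph and page names are my own illustration, not the published algorithm's test data):

```python
def trustrank(links, seeds, damping=0.85, iters=50):
    """Toy TrustRank: power iteration where the random jump always
    returns to trusted seed pages instead of any page uniformly."""
    pages = list(links)
    jump = {p: (1 / len(seeds) if p in seeds else 0.0) for p in pages}
    rank = dict(jump)
    for _ in range(iters):
        new = {p: (1 - damping) * jump[p] for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: its score evaporates in this sketch
                continue
            share = damping * rank[p] / len(outs)
            for q in outs:
                new[q] += share
        rank = new
    return rank

# "seed" and "good" link to each other; "spam" only links to itself
# and receives no links from trusted pages, so it accumulates no trust.
graph = {"seed": ["good"], "good": ["seed"], "spam": ["spam"]}
scores = trustrank(graph, seeds={"seed"})
print(scores["spam"])  # 0.0
```

The point of the sketch is the asymmetry: a brute-force link farm that no trusted page links to never picks up any score, no matter how densely it interlinks with itself.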
|I've just remembered one reason the IBL to quality site relationship is flawed: |
Older sites have a huge advantage.
That is a generalization, and not completely accurate.
While an older site may have a large collection of long forgotten links, the site had to be good enough to get those links in the first place.
There are a lot of old sites that don't rank anywhere. That is because they are crappy sites that have never been able to get any links.
In fact, with all the reciprocal linking, it is a lot easier for crappy sites to get links now than it was for them to get them in the past.
It is also reasonable to give any organisation with a longer history more credibility, whether on the web or in the real world. That is why you see things like "Est. 1897" on business signs. Why should a new company get a free pass when competing with a business with 108 years of customer relations history?
Every real business takes time to build, whether on the corner or on the web.
OK, here's another problem I have with the link popularity model:
It has been suggested that outgoing links to "quality" sites may help rankings.
IMO this has compounded the problem in that a webmaster links to sites that already do well in the SERPs rather than looking for quality, up and coming sites that aren't in the top ten.
So just to reiterate: I think the bias away from content and towards links is weighted in favour of established sites and against newer sites, which in some cases have more up-to-date content.
Having said all that, the pages that prompted me to start this thread do not fit into the above category. They are neither particularly old, have no content, negligible PR and not a huge amount of links.
The one that has wound me up most appears to have only one offsite IBL. That link has the KWs in it and is ODP listed, although I wouldn't regard it as a quality site.
Maybe the problem is more to do with how Google weights the links from what it perceives as quality sites.
Yahoo is doing a great job with the prompt indexing and ranking of new sites with content these days. I've been seeing none of the 1997 style auto-spam that is clogging google lately, and my new site launched March 7 is at least visible.
The site doesn't have a heap of links yet, but it can still rank top 3.
I'm sure Slurp has found lots of spam between March 7th and now, but they seem to be filtering it out better than Google, while at the same time allowing new sites without thousands of votes/links to have a shot too.
Content is still King, but like all Kings, occasionally must fight for his life.