There are about a dozen serious players that I can see, at least in the Directory.
Other than one site with PR8 and 60 backlinks, and one PR0 with no backlinks (although one of its deeper pages is PR6 with about 120 backlinks), nearly every other one I have seen has more backlinks but typically PR3. They all have the same jargon on their pages and much the same layout, e.g. about 50:50 text and pictures.
At least one of these sites never turns up in my searches, although it is PR3. (I started at MSN Search in this example, or else I wouldn't have found it at all.)
I don't have any contact with these companies, and I appreciate that PR is in flux right now (so the PR8 site may well have 5000 backlinks in reality). Has anyone else noticed consistently low PR across an entire industry?
Wrong. I consistently find more information on recent subjects in AV than I do in Google. But Google has the Google Dance, the freshbot, I'm Feeling Lucky, and a large webmaster following that insists that Google is the best engine on the web.
Google drives the most traffic, so webmasters are more concerned with Google than with any other engine. Is it the best? That's subjective. McDonald's sells more burgers than any other burger chain, but the burgers I grill in my backyard are better than any burger McDees shoves in a bag. I won't sell 3 billion burgers this year.
AV won't serve up more SERPs than Google this year. But for a lot of searches, AV consistently serves up a fresher result than Google. FAST won't serve up more results than Google. But as long as everyone keeps telling clients that Google is the priority, and as long as people keep believing what people tell them, Google is going to keep rolling right along, serving up results that are comparable to FAST and AV while everyone buys into the belief that Google's results are superior.
Oh yes, Google indexes more pages. Yep. Well, after the first BILLION results I don't know too many people that keep looking in order to compare. I can index all five thousand books on my bookshelves but only about 800 of them are good books. If I add another 8000 mediocre books will that affect the quality of the listing for those 800 books that are good? Do I need to find 8000 more mediocre books to determine that those 800 gems are good books? If another good book is written do I need to compare it to every book I can find to determine if it is a good book?
Judging from the thousands of results several people have just finished compiling and analysing so that a group of 3 mathematicians and 11 SEOs could form an objective analysis, Google certainly doesn't stand out as a provider of stellar SERPs; in fact, Google falls short in several areas. I have yet to see FAST return a result in which the words used to find the site don't even appear on the damn page, let alone fail to appear anywhere on the entire site.
The Google phenomenon seems to be a product of the rule of three. Repeat something three times and it becomes truth. People have been saying that Google is the best engine on the web for so long they no longer have to provide proof. It's accepted as fact simply because people keep saying it is so.
Spend some time with these other engines and let me know if you find a site that ranks well just because 300 sites linked to it. Some engines actually seem to care about what the site is about rather than what 300 sites say the site is about. Hidden text penalty my arse. At least the keywords are on the page, they might be hidden, but they're there.
FAST doesn't seem to care if I buy an expired domain, create a site that is relevant, and put it up on the web so that people can use it. Google, well, ya might be cheatin'; they can't seem to determine what is actually on the site, ya know, they have to go by past votes. So rather than try to determine what that site may be about, they'll just assume it's deceptive. Yep, the only reason you picked that .com is because you were hoping to get that PageRank benefit from inbound links. Forget about the fact that good .coms are hard to find.
Now there are theories going around that Google might think that if you have too many links using anchor text that accurately describes the page it links to, you're trying to cheat. What kind of BS is that? Oops, perfect anchor text, must be cheating. Sure the site is about blue widgets, but hey, there are too many sites linking to it with blue widgets in the anchor text. Get a few links with red widgets as anchor text. That should clear things right up.
The absurdity of Google's position was driven home tonight when a client found a perfect domain and wanted to buy it. He called me to ask if he should purchase it and I had to tell him to make sure that it wasn't a "bad domain".
Well, he wanted to know how a domain name could be bad. After all, it's just a name, and a rose by any other name... Yeah, yeah, unless you have to worry about Google, and reinclusion requests, and bad neighborhoods and every other absurd thing I can think of that has absolutely nothing to do with the actual purpose and content of the site, which Google can't seem to index or consider important unless there are other sites linking to it, telling Google how important the site is, or how relevant it is...
But yeah, it's all about content... ;)
[Added: Apparently the title and subject matter of this thread changed while I was writing this.]
I must have read it 3 times now, and I hope that we will find something better than G in the near future. This month I have not found a single useful site through G, and I know there are hundreds having the same experience but not being open about it, as they feel GG might get annoyed and give them a PR0 as a gift.
Want to know what went wrong this month between G and me?
Type my URL into G and you would most likely expect to get my site, right? Hey, you're wrong: you will get my affiliate's site, because the PR of the affiliate is better (both rounded to some PR value) and it looks like G handles both sites as one. When looking for backward links, a mix of links to our site and the affiliate site is displayed. A search for cache:www.mysite.com on the engine shows the data for the affiliate. Though this sounds very funny, it is terrible for our business, and I don't think the world's best search engine should end up doing this.
The site of the affiliate is only one page with a meta refresh:
<meta http-equiv="REFRESH" content="0; url=http://www.myaffiliateurl.com/some file?=some value">
For Google, that is apparently reason to treat it as a doorway page and therefore handle the two as one site. It looks like a bug in Google's software. I have written to them, but I don't hold out much hope that they will ever care about it.
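For anyone hitting the same problem, here is a minimal sketch of the logic as I imagine it (my own guess, not Google's actual code): with a delay of 0 the page never shows the visitor anything of its own, so a crawler can treat it as a plain redirect and index the target instead.

import re

# My guess at the logic, not Google's actual code: a zero-second meta
# refresh shows the visitor nothing of its own, so a crawler may treat
# the page as a redirect and index the target URL instead.
META_REFRESH = re.compile(
    r'<meta\s+http-equiv=["\']?refresh["\']?\s+'
    r'content=["\']?(\d+)\s*;\s*url=([^"\'>]+)',
    re.IGNORECASE)

def effective_url(page_url, html):
    """Return the URL a crawler would likely index for this page."""
    m = META_REFRESH.search(html)
    if m and int(m.group(1)) == 0:
        return m.group(2).strip()  # treated as a redirect/doorway
    return page_url

# www.example.com stands in for the affiliate's one-page site:
html = '<meta http-equiv="REFRESH" content="0; url=http://www.myaffiliateurl.com/">'
print(effective_url("http://www.example.com/", html))

If that is what is happening, a proper server-side 301 redirect instead of the meta refresh would state the intent unambiguously and might avoid the doorway interpretation altogether.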
G Thanks for giving us such an amazing search engine :-P
digitalghost, I really liked your post. Other SE's are more relevant and more up-to-date than Google, so why are webmasters still intent on flogging this horse?
I think a lot of this has to do with the fact that Google was seen as totally free. Most SEO's think you have to pay to be in INK and AV... total garbage of course; INK and AV are also free! Why Ink appears in the PFI section at WW is still a complete mystery to me. It may have a PFI 24-hour option, but it also indexes sites within a week for free, so why make the distinction?
Today I personally believe true Ink results are the best, followed by ATW, Teoma, AV, and then Google... oddly, the traffic generated runs Google, Ink, AV, ATW & Teoma.
Google has peaked, now we will see others come in to play, and not a second too soon!
Other SE's are more relevant and more up-to-date than Google, so why are webmaster's still intent on flogging this horse? - percentages
I asked the same question of an adult webmaster, and the answer was: The Google-busting web sites are themselves easy to develop, so don't think of them as individuals. Rather, consider them in groups. A single network of twenty web sites (one full month's work) is all that is necessary to capitalize on your own link popularity and begin receiving large quantities of traffic. Rinse and repeat is the long-term solution.
"Google-busting web sites" means web sites made to rank high on the search engines...
So what we are really saying is that Google is not an SE that produces relevant results but one that is free and allows sufficient manipulation that webmasters fall in love with it purely because of the associated financial gains?
Google is the webmasters'/SEOs' personal plaything: they want to encourage its use for their own benefit? They know they can manipulate results to their own advantage and as such just need to increase the audience size?
Don't know about that theory, but too much anchor text exactly matching too many obvious on-page SEO techniques could be a problem. If this is a new algo, it helps dampen excessive internal linking and obvious SEO tactics. Early days to test, but it makes sense to me.
But thanks, Digitalghost, for an insightful (inciteful? :)) post. In fact I did visit AV today (my earlier favourite in the 90s) after an absence and found more relevant sites. I might visit it more often and recommend it to my readers (I am an editor at a magazine).
I posted elsewhere about my surprise at being taken to the uk.av.com site (I am in .au) as I was checking out US companies. Then I checked my own newish site. I have had scooter visiting for many weeks but wasn't in the AV SERPs. Today, I found my subdomain and one subdirectory off the root, but the other pages including the entry page were a click away in the "More pages from" link. I was searching for the domain name in this case, but the SERP didn't show the root. I don't call that a clean result.
Try searching for www.ibm.com in AV for a similar situation. (Msoft and WW come up cleaner).
Chiyo,
I am talking about a company/companies that have hundreds of links to it from numerous universities, govt labs, etc. A parallel situation could be companies that make SARS related equipment - there are few manufacturers (just a guess) but many potential sites that would link to them.
I haven't heard that definition of PR before: the probability of a random surfer finding a page. Aren't we all random surfers once we have a SERP before us? I'm afraid I missed something there.
- Ash
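For Ash: the "random surfer" definition comes straight from the original PageRank paper: PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)), where the Ti are the pages linking to A, C(T) is the number of outbound links on T, and d is the damping factor, i.e. the chance the surfer keeps clicking links rather than jumping to a random page. A toy sketch of the iteration, with a made-up link graph of my own:

# Toy PageRank iteration per the original formula; illustration only.
# Every page in this toy graph needs at least one outbound link.
def pagerank(links, d=0.85, iterations=50):
    """links: {page: [pages it links to]}"""
    pages = list(links)
    pr = {p: 1.0 for p in pages}  # initial guess
    for _ in range(iterations):
        new = {}
        for p in pages:
            inbound = sum(pr[q] / len(links[q])
                          for q in pages if p in links[q])
            new[p] = (1 - d) + d * inbound
        pr = new
    return pr

# Made-up graph: three pages link to "home", which links back to each.
graph = {"home": ["a", "b", "c"],
         "a": ["home"], "b": ["home"], "c": ["home"]}
print(pagerank(graph))  # "home" ends up with most of the PR

So yes, we are all random surfers in the model, except it assumes we click links at random rather than picking results off a SERP.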
"13.1 Clearly identify the target of each link. [Priority 2]
... Instead of "click here", link text should indicate the nature of the link target, as in "more information about sea lions".....
In addition to clear link text, content developers may specify a value of the "title" attribute that clearly and accurately describes the target of the link.
If more than one link on a page shares the same link text, all those links should point to the same resource. Such consistency will help page design as well as accessibility."
[w3.org...]
I think if Google starts handing out penalties which directly contravene W3C accessibility guidelines, then that would be cause for concern. If that ever happens...
Chris_D
Ask about any other SE and you will find no information...
With Google it's PR, iBLs, anchor, sub-domain, KWD-rich domain, yada yada... Do you have anything like that for AV, Teoma, FAST etc.? Nope, you don't... it's just pure content and nothing else...
SEO's promote Google for their own benefit... here in India, in my early days of SEO, I used to actively promote Google to people who had never heard of it but were happily surfing and searching Yahoo! (Yahoo India uses its own directory as priority rather than Google's data), because I wanted that extra engine in my "TOP 10 RANKINGS IN TOP 10 SEs" list.
Bottom line: it's easy to spam your way to the top in Google, while in other SE's this is not the case, so every webmaster who spams loves it and spreads the knowledge about it, while some here like Jill, me, and hundreds of others doing everything right are hitting annoying bugs in Google.
By the way, I come across tips to rank at the top of Google every now and then; here are a few to enjoy:
1) Don't be afraid to place an important keyword twice in the title in order to compete with other highly relevant web sites.
2) URL and file name are attributes that are a serious nuisance to consider when marketing.
3) Link popularity is the most highly touted of Google's algorithm features. Also known as PageRank, link popularity determines whether the rest of the Internet considers your web site to be important. In other words, you need to make sure that your incoming links use your keyword phrase in their link, ALT, and surrounding body text.
4) Heading text is very important. If you want your web page to rank well for its keyword, the easiest thing to do is add a single <H1> tag at the very top of your document.
5) Body text is an important part of on-page keyword relevance. Make sure that your body text is sufficiently long (several paragraphs is ideal), and keeping the inner-string relevance mentioned at the beginning of this article in mind, try to make your paragraphs begin with the keyword phrase in question.
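Just to show how mechanical those tips are, here is a toy checker of my own (not from the tips article) that tests a page against tips 1, 4, and 5:

import re

def check_tips(html, keyword):
    """Toy check of tips 1, 4 and 5: keyword in the <title>, an <h1>
    present, and paragraphs that open with the keyword phrase."""
    kw = keyword.lower()
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    h1 = re.search(r"<h1[^>]*>", html, re.I)
    paras = re.findall(r"<p[^>]*>\s*([^<]+)", html, re.I)
    return {
        "keyword_in_title": bool(title) and kw in title.group(1).lower(),
        "h1_present": bool(h1),
        "paras_opening_with_keyword":
            sum(p.lower().startswith(kw) for p in paras),
    }

# Made-up page following the tips to the letter:
page = ("<title>Blue Widgets - Buy Blue Widgets</title>"
        "<h1>Blue Widgets</h1>"
        "<p>Blue widgets come in every size...</p>")
print(check_tips(page, "blue widgets"))

A page that scores perfectly on a checklist like this is exactly the kind of "too good to be true" page discussed further down the thread.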
Is this the case? Search something like "online dating" on Google and you will find a showcase of spam sites, affiliate sites, junk porn sites...
If you want, I can sticky you 10,000 highly searched keywords where you will find nothing but spam...
If only some kind webmaster would wave $5k and say, OK, spend it on Ink, and here's $100 for Google SEO.
But often the reverse is true.
Whether you think Google is in decline does not matter; the webmaster, CEO, or whoever sees over 75% of referrals coming from Google. There will be exceptions.
Maybe, and I hope you are right that Joe Surfer will broaden the field; at the moment it is not likely.
Everyone knows how to top Google here... not for online dating they don't ;-)
Thanks for that, interesting stuff.
I realise links in with good anchor text are good, the more the better. However, I suspect that if the anchor text matches the file name, title, h1, italics, li, high keyword density and all the rest of the old tricks, a dampening filter kicks in. Just a hunch from observing the very unstable SERPs, and I think the logic is that a site can be too good to be true, i.e. spam! Good sites, high quality and on theme, probably do not have this degree of optimisation. That is the 'theory', pure conjecture, but it could be a way to deal with keyword domains targeting one phrase.
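If that hunch were right, the filter could be as crude as this sketch (pure conjecture on my part as well; the signals and threshold are made up):

def dampening_multiplier(anchor, signals, threshold=4):
    """Conjecture only: count the on-page fields that repeat the
    inbound anchor text, and dampen the score once too many line up."""
    matches = sum(anchor.lower() in str(v).lower()
                  for v in signals.values())
    return 0.5 if matches >= threshold else 1.0

signals = {"filename": "blue-widgets.html",
           "title": "Blue Widgets",
           "h1": "Blue Widgets",
           "italics": "blue widgets",
           "li": "blue widgets"}
print(dampening_multiplier("blue widgets", signals))  # 4 matches -> 0.5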
the more obtuse or niche the subject matter, the lower the overall average PR for the whole niche
It all sorts out in the end, though, as fewer people are searching for niche terms, so the information is found just as easily. A very low-PR page that is just one of, say, 50 on the topic to be found on the net could rate top in a search.
Which should remind us all that "PR" only really makes sense in the context of a search query. You can't compare apples with oranges.
I haven't done the exercise, but I have seen some of the PR10 site lists. I would guess that the sites with the highest PR are very broad-based popular sites catering to a broad market segment. Only as you go down to PR8/PR7 do you find "niche" sites that are really authorities in a specialised topic.
However, I suspect that if the anchor text matches the file name, title, h1, italics, li, high keyword density and all the rest of the old tricks, a dampening filter kicks in. Just a hunch from observing the very unstable SERPs, and I think the logic is that a site can be too good to be true, i.e. spam!
The problem with this theory is that if it's true, it isn't being applied consistently or evenly.
This means that a search for an obscure keyword combination has a chance even if it's coming from a page that has a PR of close to zero.
Having said that, my response is that this does not redeem the concept of PageRank.
In the first place, PageRank is page-based, not site-based. There are sites where all the visitors from search engines come in directly to deep pages, yet all of the backlinks point to the home page. Therefore, whatever PR the deep pages enjoy has to be derived from the home page. Even using excellent internal linking techniques, an average home page will not pass its PR very well to very many deep pages. That means the deep pages have close to zero PR if you have thousands of them. But if you judged the deep pages on the reputation of the home page, they really deserve better PR than this.
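A quick back-of-the-envelope illustration of that dilution (toy numbers of mine, using raw PR rather than the toolbar's 0-10 scale): if the home page is the only PR source and splits its vote evenly among N deep pages, each one gets PR = (1 - d) + d * PR(home) / N.

d = 0.85          # damping factor from the PageRank formula
pr_home = 1000.0  # hypothetical raw PR of the home page

# Each of N deep pages, linked only from the home page, receives:
for n in (10, 100, 1000, 10000):
    print(n, "deep pages ->", round((1 - d) + d * pr_home / n, 2), "each")

# With thousands of deep pages, the per-page PR sinks toward the
# (1 - d) floor that any indexed page gets, i.e. "close to zero".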
In the second place, a site with average-to-strong PageRank on the home page that has thousands of deep pages, stands an excellent chance of not getting a full deep crawl. This just stinks.
Many sites with lots of deep pages and an average-to-strong home page, would prefer that PageRank be replaced with content analysis. Short of that, they'd do better with a completely random algo than they do with PageRank.
Google's PageRank is spinning down the drain, and it won't be soon enough for many of us.
I have a couple of points...
1. If a site is dependent only on its home page PR for distributed PR to lots of inner pages, maybe they should look at how they can receive incoming links to inner section indexes or inner pages. How can these pages be made more attractive for other sites to link to? Basically, if no one links to one of your inner pages, and it does not get enough PR from your own internal linking, why is it important enough to appear directly in the SERPs? One thing PR can do is suggest whether a page can "stand alone" in the information it offers to searchers over a period of months.
2. >>prefer that PageRank be replaced with content analysis<<
How would this be different from the many forms of content analysis that Google already uses? My own view is that the more varied the factors the better: off-page/on-page, link popularity and PR, theme popularity, etc.
3. Are you coming from the perspective of a commercial (say product-based, maybe database-driven) site, an information site, or a corporate site? I can understand the problems PR creates for the first, but not so much for the latter two cases.
As you are talking about sites with "thousands of pages", I am presuming you are talking about sites which use some sort of templating and database. It would take a long, long time to create thousands of completely original, substantive pages (say at least 500 to 1,000 words of non-duplicate text on each page). There are only relatively few sites on the web that have this, and they usually DO have lots of links to inner pages. In one of our own cases, for example, we have 1,600 independent backlinks to our home page and over 5,000 to our other 1,200 inner pages. I always thought that seemed understandable for info sites.
I liked your post; I just wanted to discuss it further.
I admit that this is not your usual site. But the thing that gets me is that many people search for proper names on search engines. This site has the most important proper names from the last 30 years, worldwide, painstakingly indexed from books and clippings. It escapes me why this site has such a hard time getting its fair share of Google juice. This is not a fluke; it's been happening for 2.5 years now with Google. I've tried everything.
I can understand what's happening in terms of Google's algorithms, but I'm at a loss to understand why this is the fault of our site, and not the fault of Google.
I hate to sound like an analog old-timer, but I think that all search engines ought to hire at least one professional-librarian ombudsman, to consider situations where the search engine's responsibility to the public sector suggests that they consider unique situations such as ours.
This idea is, of course, extremely revolutionary to the geeks in Silicon Valley, who feel that digital algorithms are the beginning and end of reality. So I apologize for my presumption.
However, I guess what they are trying to do is make savings by employing bytes rather than humans!
Could the structure of the site be made more PR friendly by creating some new content on pages describing "categories" in your database? I was pleasantly surprised by how well some pages I put on our sites have done; they are simply a dozen or so well-written original paragraphs with good titles, with just one or two links to "section index" pages of our site, which then have far more links. They have direct links from the home page as "feature summary articles". Maybe you could try that. These pages are very, very simple: no JS, no nav bars, just title, head, <P>s, a few <H>s, bolds, and one or two links. That's all. This also keeps your home page fresh. If the content is topical or original enough, other sites link to it, so you can get a benefit not just from your own internal linking structures (which you say are great anyway), but from people linking to these compelling topical "articles". In one way you are making a perhaps "old" database more compelling by contextualizing it to issues of the day.
My tip. Google likes topical.
And I think other SE's will go the same way.
Could the structure of the site be made more PR friendly by creating some new content on pages describing "categories" in your database?
Been there, done that, still doing it. Broke it down into 90 subject categories. Each category has between several and 25 two-paragraph reviews of relevant books. At the end of each of these 90 pages we have "site map" links to the individual proper-name pages that appeared most frequently in the above reviewed books.
Thanks for the good suggestion though. I suspect by now that it's just a numbers problem. Any site with 100,000 pages ought to be a PR 8 and if it isn't, to hell with it. It's the way Google's universe works. We're a 6 or 7 on our home page. Eat dust. You ever tried going from a high 6 / low 7 to a PR 8? It sure isn't a weekend project!
But never forget, PageRank is "uniquely democratic." (barf)