High rank for key phrase which doesn't appear anywhere on site

phrase doesn't feature ANYWHERE on site. How come?


jimmydubs

9:48 am on Sep 1, 2003 (gmt 0)

10+ Year Member



Incredible! A very small (tiny) competitor of mine is ranking #3 on Google for the key industry phrase "widgets company". But their website appears to have undergone no SEO whatsoever. The phrase "widgets company" doesn't feature anywhere on the site at all. In fact the word "company" doesn't feature anywhere. Their PR is 4 (same as mine), and they have very few backlinks. How on earth can this happen? Anyone got any bright ideas?

fLaMiN

12:50 pm on Sep 1, 2003 (gmt 0)



yes

valeyard

1:11 pm on Sep 1, 2003 (gmt 0)

10+ Year Member



Some possibilities:

- Key phrase in domain name / URLs
- Lots of internal links with key phrase in anchor text, ALT tags, etc
- Lots of low PR backlinks with good link text that don't get shown with the Google link: search. Try ATW.
- Their SEO is bad but the competitors are worse :-)
- They're using some nasty spam technique you haven't spotted
- They got lucky
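The anchor-text possibility in particular is easy to demonstrate with a toy scorer. All page names, link data, and the 3x anchor weight below are invented for illustration; this is a sketch of the idea, not Google's actual algorithm:

```python
# Toy illustration: score pages by inbound anchor text as well as body text.
# A page can win a query purely via the link text of its backlinks, even
# when the phrase never appears on the page itself.

def score(page, query, pages, links):
    """Count query-term hits in a page's body plus anchor text pointing at it."""
    terms = query.lower().split()
    body = pages[page].lower()
    anchors = " ".join(text.lower() for (src, dst, text) in links if dst == page)
    # Anchor-text hits get extra weight, mirroring the anchor-is-king theory.
    return sum(body.count(t) for t in terms) + 3 * sum(anchors.count(t) for t in terms)

pages = {
    "competitor.example": "We make quality widgets in Springfield.",
    "mysite.example": "The best widgets company in the business.",
}
links = [
    ("dmoz.example", "competitor.example", "Widgets Company - manufacturer"),
    ("blog.example", "competitor.example", "a widgets company I like"),
]

ranked = sorted(pages, key=lambda p: score(p, "widgets company", pages, links), reverse=True)
print(ranked[0])  # competitor.example outranks the page that actually contains the phrase
```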

Gus_R

1:20 pm on Sep 1, 2003 (gmt 0)

10+ Year Member



The index page was renamed, but the old version still ranks at the same URL.

David_M

2:09 pm on Sep 1, 2003 (gmt 0)

10+ Year Member



I'd go with the "Lots of low PR backlinks with good link text that don't get shown with the Google link." theory.

I've seen English sites rank for Japanese keywords, simply because the incoming link uses the keyword in the anchor text.

hotice_2002

2:40 pm on Sep 1, 2003 (gmt 0)

10+ Year Member



Yes!
For Google, LINK is king.
Who has ever said content is king?
For Google, content is nothing.

You can find some unrelated sites ranking #1 in Google's search results, purely because of link text. I wish Google would give more weight to web content, not only links.

caine

2:46 pm on Sep 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> For Google, content is nothing.

I think the engineers at the Plex might be a little disheartened that you would think their algo is that simple!

I would certainly look at the alt tags, the structure of the site (which should be terrible if there's been no SEO), and lastly the incoming link text, whether from high or low PR pages.

the_beest

3:09 pm on Sep 1, 2003 (gmt 0)

10+ Year Member



I don't think content is irrelevant to Google, but link anchor text is certainly important. I was searching for something obscure the other day, and at result #3 was a 403 Forbidden page that didn't mention the search term. A look at the cache revealed why: "These terms only appear in links pointing to this page".

Mark_A

3:14 pm on Sep 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



seems this is almost the same as "the blog effect" :-)

jambo

3:27 pm on Sep 1, 2003 (gmt 0)

10+ Year Member



Thanks for these comments guys, obviously apart from FLaMiN. Still scratching my head though. I'd love to sticky someone the details to check I'm not going mad. Any volunteers?

jambo

3:43 pm on Sep 1, 2003 (gmt 0)

10+ Year Member



Well I think I've got the answer. A number of you are suggesting that it's a case of good link text in the backlinks. I've trawled through the backlinks and there's no good link text at all - except in one case: this company DOES have a DMOZ listing, and the search term appears in THAT link text. I'll bet that's the reason for the wildly over-inflated ranking. If so, it's really disappointing, because I've been trying to get a DMOZ listing for months without any success. I think it's time for me to apply to be an editor... uggghhhhhh!

Mark_A

3:51 pm on Sep 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



What PR is the ODP page that the link is on?
And is it also now in the other directories fed by ODP?
Those may also have an influence, but might not show up in a link: search on Google.

Gus_R

4:10 pm on Sep 1, 2003 (gmt 0)

10+ Year Member



Google cares so much about anchor text because it's difficult to spam, unlike page content.
I think this is just an occasional case that doesn't affect average SERP quality.

jambo

4:33 pm on Sep 1, 2003 (gmt 0)

10+ Year Member



The DMOZ page on which the link appears has PR5. Can anyone tell me which other search engines are fed by DMOZ?

4eyes

5:12 pm on Sep 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> I've trawled through the backlinks

Don't trust the Google backlinks - check the backlinks on Alltheweb instead - it will show up all the lower page rank sites.

You can make a big difference with good anchor text and a few hundred low PR links, none of which will show when you check on Google.

Kackle

6:07 pm on Sep 1, 2003 (gmt 0)



From The Anatomy of a Large-Scale Hypertextual Web Search Engine [www-db.stanford.edu]:
"Our compact encoding uses two bytes for every hit. There are two types of hits: fancy hits and plain hits. Fancy hits include hits occurring in a URL, title, anchor text, or meta tag. Plain hits include everything else."

Originally Google was set up with two inverted indexes. One was the fancy index, the other was the plain index. The idea behind this was that many one-word searches could be satisfied by consulting the smaller fancy index, without the need for going on to the plain (full text) index.

I have seen many (too many) examples of pages ranking high when the keywords appear only in anchor text from backlinks, and do not appear at all on the page itself.

What Google has done in their war against spam, it seems to me, is to overhaul their fancy index so that instead of being based on scraping the most important words off of a page, now it's based on scraping the anchor text from backlinks. This makes sense for two reasons: 1) anchor text (especially if the backlink is external to the site) is more immune to spamming, compared to on-page features, and 2) the entire front end of their ranking process was tuned to PageRank, which was link based, and was precomputed irrespective of the content on the page. It would not be that difficult to compile a separate fancy inverted index based on anchor text in links, at the same time that PageRank is computed, since the overall architecture would not have to change that much.

What I think happens is that if they get a very close match based on anchor text from the fancy inverted index, then this is sufficient to satisfy the search query. If they cannot return enough such matches to fill the SERP page with links, then they go on to consult the main (full text) inverted index. That's why you see pages flying to the top of the SERPs based on close matches with anchor text, when the search terms do not even appear on the page.

The entire effort at Google is optimized for speed. The more precomputation you can do before consulting the full text of a page, the faster you can return results that are superficially ranked ("superficial" here means a ranking based on something less than full-text analysis). This is what made Google scalable, and this is the reason why they can handle 200 million queries a day.

In a sense, Google became so big so fast, that they are now a prisoner of their own efficiencies.
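The two-tier lookup Kackle describes can be sketched in a few lines. This is a minimal toy, assuming the process works roughly as described in the thread (small "fancy" index of anchor/title hits consulted first, full-text index only as a fallback); all page data is invented:

```python
# Sketch of a two-tier inverted index in the spirit of the fancy/plain split:
# "fancy" postings (titles, anchor text) are checked before "plain" postings
# (full body text). Data below is made up for illustration.

from collections import defaultdict

fancy = defaultdict(set)   # term -> pages where it appears in anchors/titles
plain = defaultdict(set)   # term -> pages where it appears in body text

def index(page, title, body, anchor_texts):
    for t in title.lower().split():
        fancy[t].add(page)
    for a in anchor_texts:
        for t in a.lower().split():
            fancy[t].add(page)
    for t in body.lower().split():
        plain[t].add(page)

def search(query, want=10):
    terms = query.lower().split()
    # Tier 1: pages matching every term in the fancy index alone.
    hits = set.intersection(*(fancy[t] for t in terms))
    if len(hits) < want:
        # Tier 2: fall back to the full-text index for the remainder.
        hits |= set.intersection(*((fancy[t] | plain[t]) for t in terms))
    return hits

index("a.example", "widget reviews", "lots of text about gadgets", ["great widgets company"])
index("b.example", "home", "we are a widgets company", [])
print(search("widgets company"))  # a.example matches from anchors alone; b.example needs tier 2
```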

johnser

6:15 pm on Sep 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



We may be the "small (tiny) competitor" the first poster mentioned ;)

We've recently taken on a new (PR4) client who is now no. 3 in serps for a very competitive phrase without us having touched the site - yet. (No reference - yet - on the home page to the main phrase)

Achieved through use of optimised anchor text on other decent sites.

Liane

6:29 pm on Sep 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A friend of mine has a site which I described one way when submitting to DMOZ. A year later, I changed the main description but it remained the same in DMOZ. It is still the top site for that particular phrase, yet the phrase hasn't appeared anywhere on that site for nearly two years.

A good link from an important source with good keyword text does wonders! :)

Kackle

6:51 pm on Sep 1, 2003 (gmt 0)



By the way, if some of you have been reading about the massively inflated counts for number of hits returned (these are in a couple of other threads in this forum), I think what might be happening is this:

If Google wants an accurate count of maximum page hits, they would have to add the fancy index hits to the plain index hits and then subtract all the same-page overlaps.

But I suspect it's getting very tricky to figure out the overlaps without incurring too much overhead. Remember, they have to come up with this count once per query! It looks to me like they aren't subtracting the overlaps anymore because it's too hard to compute, and they just hoped no one would notice. It probably only happens on the higher counts, which makes it less noticeable because no one is able to prove differently anyway.

Of course, webmasters who know exactly how many files are in each of their directories can figure out instantly that Google is miscounting. But for every webmaster who says Google is wrong, you have some dimwit media pundit starting out a column with a cheap lead sentence such as, "If you search in Google for blah, blah, you get XXX,000 hits...."

If this doesn't get fixed soon, then Google may have just decided it was easier to take a little flak from WebmasterWorld than to redesign a monster algorithm.
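The overlap arithmetic is just inclusion-exclusion. A small worked example with invented numbers shows how skipping the subtraction inflates the reported count:

```python
# If a query matches 12,000 docs via the "fancy" (anchor/title) index and
# 50,000 via the plain (full-text) index, with 9,000 docs appearing in both,
# the true union is 53,000. Skipping the overlap subtraction reports 62,000.
# All figures are invented for illustration.

fancy_hits = 12_000
plain_hits = 50_000
overlap = 9_000

true_count = fancy_hits + plain_hits - overlap   # |A union B|, with overlap removed
lazy_count = fancy_hits + plain_hits             # the cheap path: no subtraction

print(true_count, lazy_count)  # 53000 62000
```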

fLaMiN

11:37 pm on Sep 1, 2003 (gmt 0)



Thanks for these comments guys, obviously apart from FLaMiN

I just said "yes" as in.. i have some ideas ..

But I can't mention them on here, they are rude words like "doorways" and "cloaking"..

Now that you mention the ODP listing tho, that just MIGHT be the reason. lol.

You'd really need to give the URL and let someone check it out, but those are rude words too "give me url"