Forum Moderators: open
I run the web site for a company that produces software. Our Google performance is poor, mainly - I think - because we haven't got that many reciprocal links. So I did a Google search for <the keyword phrase>, because I thought it might be interesting to see what the successful companies had that we didn't.
Top of the pile is for a product called <snip> (I would post the URL, but I don't think I'm allowed to. You can do the search if you feel inclined.) This page has beaten off competition from some big, big companies, and it looks like it was knocked together in about five minutes. Plus, it's in a frame! Plus, it's got a tiny handful of inbound links!
My question is: why does Google like this page so much?
I'd be interested to hear any comments.
Cheers
Steve
[edited by: Marcia at 11:52 am (utc) on Dec. 17, 2002]
[edit reason] specific removed [/edit]
I'm currently designing what I hope will be a *very* Google-friendly site. The main keywords are very competitive and the competition includes some very big companies. But! A very small company (an individual "widget") is number 2 in Google SERPs for the main keyword phrase. The page is mostly Javascript and CSS to control layers. PR is 6.
The only common threads I've found on the handful of these pages I've looked at are:
--Very light use of keywords in text, one or two times at the most, sometimes just in the TITLE.
--Very light text, sometimes none. The page we looked at a couple of weeks ago just had a graphic -- without an alt tag. It appears that a very high code-to-text ratio is okay.
--Usually about 30 to 40 PR4+ backlinks. Some have the keyword phrase in the link text, some don't. Some have part of the keyword phrase in the links page url, some in the page title, some don't.
--Part of it is freshbot. The competitor's page I've been looking at almost always has a fresh date though the visible content doesn't change.
So, what does this mean for the SERP algo? Are PageRank and link text becoming more important? Keyword density less? Brett always preaches "the smaller the better" for page size, and that seems to hold true for the text itself (though a couple of these pages hit 40 to 50K with all the code thrown in). Or, should we all pay more attention to freshbot?
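For what it's worth, "keyword density" as we usually throw the term around is just occurrences of the phrase over total words on the page. Here's a minimal sketch of that arithmetic -- my own throwaway function and made-up sample text, nothing to do with Google's actual algo:

```python
import re

def keyword_density(text, phrase):
    """Words contributed by `phrase` occurrences, divided by total word count."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return hits * len(phrase.split()) / len(words)

# Hypothetical page copy: 7 words, 2 hits on a 2-word phrase -> 4/7
page_text = "Widget software. Buy widget software for widgets."
print(round(keyword_density(page_text, "widget software"), 2))  # 0.57
```

By that measure, the pages Jim describes (one or two phrase occurrences, sometimes only in the TITLE) would score near zero -- which is exactly why their rankings are puzzling if density still mattered much.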
Jim
I'd assume they are cloaking
Report it to the spam police. Your knowledge of this can be construed as part of what your client is paying you for. INK and FAST are quite concerned about this, and they do take action.
Google is concerned about its relevance like a mother dog about her puppies. But you'll receive feedback that's less personal, because they're so big.
Our Google performance is poor, mainly - I think - because we haven't got that many reciprocal links.
Hey, this works in reverse too.
Their Google performance is great, mainly - I think - because they have so many reciprocal links.
It may not be cloaking after all. It could simply be that their empty site receives so many votes that Google believes it. HERE is where I believe the Google algo may need tweaking: to disallow blank sites, regardless of votes.
Oh well, just another something to keep an eye on.
Jim
Of course, it's an expired domain.
Go to HotBot and use linkdomain:domain.com.
You will find that at least one link from an expired domain is pointing to the site.
This is a never-ending horrible story, with all the expired domains in the Google directory.
Google and DMOZ are unable to change the situation. The only chance you have to make money is to register expired domains yourself; otherwise you have no chance, because all the people playing this expired-domain game have PR 7 and higher.