Forum Moderators: open

Ultimate Pagerank strategy?


richardvanhooijdonk

10:28 am on Sep 21, 2003 (gmt 0)

10+ Year Member


Comments on submission/pagerank strategy

I would like to receive comments on a PageRank strategy I developed based on cases, advice and comments I received from others.
Maybe you learn something from it. Maybe you can give me comments on parts of the plan. If things are wrong I would very much appreciate comments/advice.

Strategy:

1. We develop a mother-page (the general response page)

2. We developed about 400 keywords

3. We are about to register 20 domains

4. We create 20 x 400 subdomains (keyword.domainname.com)

4. We create contentpages (automated) for each subdomain. On each page we have a logo and general copy with our keyword mentioned 5-10 times.
We also create contentpages for the mother-page. For each keyword, we create one page. So 400 pages.
We also create contentpages for each domain (example: http://www.domainname.com/keywordcontentpage.htm)

Doing this we realize:
- 20 x 400 = 6000 unique subdomain pages (keyword.domainname.com)
- 20 x 400 = 6000 unique domain pages (http://www.domainname.com/keywordcontentpage.htm)
- for main/important keywords (about 50) we create extra pages on every subdomain and domain
50 x 20 = 1000
50 x 20 = 1000

5. When a user enters a contentpage of the mother-page, he will be automatically transferred to the mother-page

6. We take care of: link text, page titles, heading tags, ALT tags, domain names, filenames, directory names, keyword density

7. We submit 14,000 pages and monitor 400 keywords with WebPosition Gold (or a home-made application)

8. We submit to dmoz.org

9. We try to get additional links from individual relevant sites.

10. And wait for the next Google-dance
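As a side note, the keyword-density target in step 4 can be sanity-checked with a few lines of Python. This is an illustrative sketch only; the function and the example copy are ours, not any search engine's actual measure:

```python
# Illustrative sketch only: a crude keyword-density check.
# The function and any target density are ours, not Google's.
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of whitespace-separated words equal to the keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

copy = "widgets " * 7 + "generic filler text " * 30
print(round(keyword_density(copy, "widgets"), 3))  # 0.072 (7 of 97 words)
```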

Thank you for your comments.

Yidaki

10:55 am on Sep 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



just a few corrections / additions to your plan:

>- 20 x 400 = 8000 unique subdomain pages
>- 20 x 400 = 8000 unique domain pages

>10. And wait for the next Google-dance

11. and wait for google's cold finger - death awaits your cluster
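the corrected arithmetic is easy to verify; a throwaway Python sketch (and note your step 7 total then becomes 18,000, not 14,000):

```python
# Quick check of the corrected counts (illustrative only):
domains = 20
keywords = 400
major_keywords = 50

subdomain_pages = domains * keywords        # 8000 (the post says 6000)
domain_pages = domains * keywords           # 8000 (the post says 6000)
extra_pages = 2 * major_keywords * domains  # 1000 + 1000 from step 4

total = subdomain_pages + domain_pages + extra_pages
print(subdomain_pages, domain_pages, total)  # 8000 8000 18000
```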

crowthercm

7:29 pm on Sep 21, 2003 (gmt 0)

10+ Year Member



I'm not sure I understand this step:

5. When a user enters a contentpage of the mother-page, he will be automatically transferred to the mother-page

Are you saying you automatically redirect users from the content page to the main page or that the page is linked to it?

rfgdxm1

9:34 pm on Sep 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>11. and wait for google's cold finger - death awaits your cluster

Yep. It is just a matter of when Google would crush this.

Chris_D

9:52 pm on Sep 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi richardvanhooijdonk

Only problem is that you break the published 'rules' - and therefore 'risk' being excluded. You need to read:
[google.com...]

But then again, a very large online Auction house is doing almost exactly the same thing, but more automated - so I guess what 'Google says', and what 'Google does' - are two totally different things...

Chris_D
Sydney Australia

yonnermark

11:11 pm on Sep 21, 2003 (gmt 0)

10+ Year Member



Chris, are you referring to THE online auction house that EVERYONE knows about?

I don't ever remember seeing a single one of their auction pages on google.... so it doesn't really matter what they do.

They buy loads of Adwords though

conor

11:51 pm on Sep 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



sounds like 'the ultimate ... it will all end in tears strategy' to me!

coolcreep

12:19 am on Sep 22, 2003 (gmt 0)



this person must be trying to get rid of competition through webmasterworld

they will have a year of fun, then go away, if they are for real

coolcreep

12:28 am on Sep 22, 2003 (gmt 0)



and where does their math come from
"- 20 x 400 = 6000 "

m, m, m......maybe 8000

Peace All!

[edited by: Marcia at 1:22 am (utc) on Sep. 22, 2003]

wmburke

12:51 am on Sep 22, 2003 (gmt 0)

10+ Year Member




If you're thinking Google will parse/index/cache these subdomains, uh-huh. I have subdomains under some sites that are consistently in Top 10 rankings, and every so often I'll see one pop up for a day or so, then disappear. (However, that seems to have stopped entirely as of late.)

Do some searches for popular words, drill down into the 300s and count how many subdomains you see in the SERPs.

I'd bet maybe one in 300, at best.

AAnnAArchy

1:00 am on Sep 22, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I checked a popular adult term and came up with 14 outta 20. How'd I do? ;)

plasma

1:08 am on Sep 22, 2003 (gmt 0)

10+ Year Member



Yep. It is just a matter of when Google would crush this.

I would like to agree, but I can't.

If they penalize him (in a few months/years or so) he will already have made his revenue.
Afterwards he will get new domains and new IPs and start again.

I have no doubt that he'll succeed with his plan.

- Google doesn't react when spam is reported
- and they can't effectively detect it automatically

sad but true

Oaf357

1:49 am on Sep 22, 2003 (gmt 0)

10+ Year Member



I could go on and on about what a terrible idea this is, but I think enough people already have.

Net_Wizard

3:31 am on Sep 22, 2003 (gmt 0)



Well...

one thing I noticed about the comments in this thread is that most of you are quick to hand this guy the Google death penalty.

Have you guys tried richardvanhooijdonk's method?

As far as I know this technique is doing well in Google's index. In fact, his campaign is small-time compared to some people I know. 20 domains? That's nothing compared to those who are in this type of business.

If this guy is good, the only way Google could catch him/her is by someone reporting the cluster, and that takes time. Time for that someone to track all the domains, find the evidence, and collate it into a report, then send it to Google, which in turn takes additional time.

By the time the cluster is removed, this guy will have made a good profit, one that a lot of webmasters here are struggling to match.

If this person can make really good money doing this, what is to prevent him/her from doing it again, probably bigger the next time around? Or, while he/she is still in the index, turning the profit into another independent cluster, and so on and so forth. If one cluster is taken out, there are more clusters to take its place.

The only way to take out this type of business/practice is for Google to programmatically identify the cluster, not rely on manual reporting. But that's Google's problem, not ours.

Just reality folks, whether you like it or not.

Cheers :)

loganoski

8:17 am on Sep 22, 2003 (gmt 0)

10+ Year Member



" 20 x 400 = 6000 unique subdomain pages (keyword.domainname.com) "

Shouldn't Google understand that they are 20 mirror pages and count them as one?
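In principle it could, e.g. by fingerprinting the normalized text of each page and collapsing exact mirrors into one result. A hypothetical sketch, not Google's actual method:

```python
import hashlib

# Hypothetical sketch, not Google's actual method: fingerprint the
# normalized text of each page; exact mirrors share one fingerprint.
def fingerprint(page_text: str) -> str:
    normalized = " ".join(page_text.lower().split())
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

mirrors = ["Buy   Widgets Here", "buy widgets here", "Buy blue widgets here"]
print(len({fingerprint(t) for t in mirrors}))  # 2 distinct fingerprints
```

Near-duplicates with shuffled boilerplate would slip past an exact hash like this, which is presumably why mirror detection is harder than it looks.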

Oaf357

8:49 am on Sep 22, 2003 (gmt 0)

10+ Year Member



Net_Wizard is right. But I honestly would like to think that the advice given on this forum isn't to help people spam Google but to legitimately improve their rankings in Google.

merlin30

9:26 am on Sep 22, 2003 (gmt 0)

10+ Year Member



Perhaps Mr Vanhoojidonk is having a laugh.

Marketing Guy

10:25 am on Sep 22, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"If this guy is good"

And therein lies the deciding factor! ;)

Scott

Macro

11:35 am on Sep 22, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



We also create contentpages ....We also create contentpages....We also create contentpages

Er, an automated system creating content pages? I think you meant to say "spam pages".

Yeah, it will work. Terrific. Start straight away. In fact, replicate it a few times on different sets of keywords. Take on loads of staff so you can multiply the 8000 (or 6000 if you prefer) by a hundred or two hundred and make millions of subdomains and millions of unique domain pages, all with unique content. Hey, that's what everyone's been waiting for.

Maybe you learn something from it
I hope everyone does, and I hope it's not the lesson you meant them to learn.

Sadguy01

12:33 pm on Sep 22, 2003 (gmt 0)

10+ Year Member



Yeah, I thought I was God when I experimented with this plan about six months ago. Actually I made some mistakes; for example, links from subdomains of the same site are counted only as a single domain link. Subdomains only work as linked-to pages in google, not as link-from.

I ended up getting banned with this method, on two separate non-adult keywords.

If you're going to do it, you're wasting your time; this was a popular method amongst Russian webmasters and consequently it gets you penalised.

P.S. It's not a pagerank strategy ;)

Don't go explaining to the world how to get them banned, it's just plain rude.

2c.

tribal

12:46 pm on Sep 22, 2003 (gmt 0)

10+ Year Member




I hope everyone does, and I hope it's not the lesson you meant them to learn.

Judging by the replies, it is clear this technique is considered unethical. But, as also stated in the "large online auction" post, it does work.
The only thing you really have to worry about is Google being able to recognise the redirect or the keyword spam on the page.

Whether we like it or not, we're probably going to see this more and more, until G finds a way to effectively eliminate these clusters.

Yidaki

12:47 pm on Sep 22, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Subdomains only work as linked-to pages in google, not as link-from.

Welcome [webmasterworld.com] Sadguy01,

sorry, please don't confuse the readers here with false statements - that is simply untrue!

cabbie

12:57 pm on Sep 22, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The original poster's surname suggests he is having a laugh, but the fact is this strategy is working. Clusters of subdomains are dominating some keywords, and for the high-moral-ground takers it seems you can be clean broke or you can be filthy rich. :)

Sadguy01

1:10 pm on Sep 22, 2003 (gmt 0)

10+ Year Member



>sorry, please don't confuse the readers here with false statements - that is simply untrue!

Simple:

Google's backward links database (link:www.etc.) = links affecting the site's position + a sample of all links, even those not contributing to ranking.

For example, guestbooks are sometimes shown in backward links, but are filtered from affecting ranking (they're deemed spam by Google). Links from subdomains of a single domain are in a similar category. It's one of the first filters they applied; surely you remember this?
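The filtering described here could look roughly like this; an illustrative Python sketch, not Google's actual code, using a naive suffix rule that ignores ccTLDs like .co.uk:

```python
from urllib.parse import urlparse

# Illustrative only: collapse backlinks so that many subdomains of one
# registered domain count as a single linking domain, roughly the filter
# described above. The naive two-label rule ignores ccTLDs like .co.uk.
def registered_domain(url: str) -> str:
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

backlinks = [
    "http://widget1.example.com/page.htm",
    "http://widget2.example.com/page.htm",
    "http://other-site.org/links.htm",
]
unique = {registered_domain(u) for u in backlinks}
print(len(unique))  # 2 linking domains, not 3 backlinks
```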

ciml

3:44 pm on Sep 22, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Fortunately there is a real, long term, spam-filter proof Ultimate Pagerank strategy [webmasterworld.com].

Net_Wizard

3:46 pm on Sep 22, 2003 (gmt 0)



But I honestly would like to think that the advice given on this forum isn't to help people spam Google but to legitimately improve their rankings in Google.

Not to argue with you, but IMO whether we are requesting links or optimizing our content, strictly speaking we are spamming the search engines. However, I understand what you mean... that we as webmasters have formed what are acceptable and not acceptable practices/techniques to push our pages up in the serps... acceptable practices we call search engine optimization; not acceptable practices we call search engine spamming. Fact is, they are both the same.

A few years ago, hidden keyword stuffing and doorway pages were acceptable techniques; now if you use them you are labeled a spammer.

Each one of us would like to have an edge over our competitors. Some of us will push the envelope to a certain limit and some of us will push the envelope all the way. It is a question of whether it's worth the risk or not. Ethics and morals have nothing to do with these practices.

Don't get me wrong, I would like to see a perfect search engine too, an engine which rewards you accordingly for your content minus the optimization, but the reality is there will always be some kind of flaw in search engines, and we will always exploit that defect.

However, some of us are quick to take the moral stand against those who are willing to take the risk. Could it be envy, because they are afraid to take the same risk? And because they can't compete on this level, would it be right to call the technique unethical?

Unethical to whom? Is it because the risk-takers don't follow the published search engine guidelines?

See, search engines rely on webmasters to comply with their rules. As I have pointed out in other threads, SEs and webmasters have a conflict of interest. There will always be someone willing to break the rules. That's the problem, and when SEs are slow to enforce their own rules, it just encourages more rule breakers.

In the real world, rule breaking is not acceptable and most often enforcement is swift and heavy; this puts everybody in line with the rules. Fear is a tool; without it, a rule is useless. You can't expect everybody to comply of their own free will.

When Google was in its infancy, we webmasters solidly supported it. We believed in Google's core ideals. We believed that we would be rewarded for our hard work on our rich-content sites. We believed that punishment would be swift and heavy against the rule breakers.

Now Google has matured; it no longer needs us. It has its own interests to take care of, which oftentimes conflict with a lot of webmasters and even with its own core ideals. The things that we care most about...

1. ranking according to site merits(quality of content).
2. swift penalties for rule breakers

are, needless to say, pipe dreams and will remain in the land of idealism.

So, if talk of multiple domains, cloaking, and other 'black' techniques is now taboo within the webmasters' circle, then we are just helping Google to be lazy (for lack of a better word) in dealing with these techniques.

IMO, by talking about it we make Google aware that we know these techniques work, and that if this is not taken care of, there will be an increase in the usage of such practices.

Cheers

dirkz

7:44 pm on Sep 22, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Maybe I'm just not getting it, but ...

5. When a user enters a contentpage of the mother-page, he will be automatically transferred to the mother-page

This should be easy to detect as duplicate content or sneaky redirect.
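One crude way such a redirect could be flagged, sketched in Python; this is a guess at the kind of filter involved, not Google's actual detector:

```python
import re

# Crude sketch (our guess, not Google's actual detector): flag pages whose
# only purpose is an instant meta-refresh to another URL.
META_REFRESH = re.compile(
    r'<meta[^>]+http-equiv=["\']?refresh["\']?[^>]*'
    r'content=["\']?\s*(\d+)\s*;\s*url=([^"\'>]+)',
    re.IGNORECASE,
)

def looks_like_sneaky_redirect(html: str) -> bool:
    m = META_REFRESH.search(html)
    return bool(m) and int(m.group(1)) == 0  # zero-second refresh

page = ('<html><head><meta http-equiv="refresh" '
        'content="0;url=http://mother.example.com/"></head></html>')
print(looks_like_sneaky_redirect(page))  # True
```

A real crawler would also have to catch JavaScript and server-side redirects, which is where detection gets hard.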

8. We submit to dmoz.org

Only if the editor in charge is identical to the "webmaster".

9. We try to get additional links from individual relevant sites.

Hahaha.

11. and wait for google's cold finger

Yeah, that's it :)

dudmembership

2:56 am on Sep 23, 2003 (gmt 0)

10+ Year Member



The original poster's surname suggests he is having a laugh

err.. perhaps I don't get it.. but there's nothing funny about "Van Hooijdonk".

9. We try to get additional links from individual relevant sites.

Indeed, this is probably the hardest part. I wouldn't link to this cluster if Meneer Van Hooijdonk paid me a million dollars :-) But then again, there are plenty of sites out there that will link to such a site without realizing the consequences.

cyberprosper

3:08 am on Sep 23, 2003 (gmt 0)

10+ Year Member



it is sad but true: Google is easy to manipulate, and his strategy will likely work. I see nothing "unethical" about it either. Google can suck up any pages they want or don't want. I also see nothing wrong with his competitor reporting him.

the only "unethical" SEO that I see is that which is dishonest. If you throw up a site optimized for widgets, and instead you direct your visitors to some triple-X porn site, THAT is unethical.

rfgdxm1

3:21 am on Sep 23, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>the only "unethical" SEO that I see is that which is dishonest. If you throw up a site optimized for widgets, and instead you direct your visitors to some triple x porn site.. THAT is unethical.

I agree, so long as this includes not using extreme techniques like link farms to artificially boost PR, etc. As for my widget sites, *of course* I optimize them for searches on widgets. Why on Earth would I not want to optimize my site for searches on what it is about?

This 48-message thread spans 2 pages.