Forum Moderators: open
I would like to receive comments on a pagerank strategy I developed, based on cases, advice, and comments I received from others.
Maybe you will learn something from it. Maybe you can give me comments on parts of the plan. If anything is wrong, I would very much appreciate comments/advice.
Strategy:
1. We develop a mother-page (the general response page)
2. We develop about 400 keywords
3. We register 20 domains
4. We create 20 x 400 subdomains (keyword.domainname.com) and generate content pages (automated) for each subdomain. On each page we have a logo and general copy with our keyword mentioned 5-10 times.
We also create content pages for the mother-page. For each keyword, we create one page, so 400 pages.
We also create content pages for each domain (example: http://www.domainname.com/keywordcontentpage.htm)
Doing this we realize:
- 20 x 400 = 6000 unique subdomain pages (keyword.domainname.com)
- 20 x 400 = 6000 unique domain pages (http://www.domainname.com/keywordcontentpage.htm)
- for main/important keywords (about 50) we create extra pages on every subdomain and domain:
50 x 20 = 1000 extra subdomain pages
50 x 20 = 1000 extra domain pages
5. When a user enters a content page of the mother-page, he will be automatically transferred to the mother-page
6. We take care of: link text, page titles, heading tags, ALT tags, domain names, filenames, directory names, keyword density
7. We submit 14,000 pages and monitor 400 keywords with WebPosition Gold (or a home-made application)
8. We submit to dmoz.org
9. We try to get additional links from individual relevant sites.
10. And wait for the next Google-dance
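The cluster arithmetic in the plan above can be sketched in a few lines. This is only an illustration: the keyword and domain names below are made up, and note that the product of 20 domains and 400 keywords is actually 8000 pages per tier, not the 6000 stated in the plan.

```python
from itertools import product

# Hypothetical stand-ins for the plan's 400 keywords and 20 domains.
keywords = [f"keyword{i}" for i in range(400)]
domains = [f"domain{i}.com" for i in range(20)]
main_keywords = keywords[:50]  # the ~50 "main/important" keywords

# One subdomain page and one domain content page per (domain, keyword) pair.
subdomain_pages = [f"http://{kw}.{dom}/" for dom, kw in product(domains, keywords)]
domain_pages = [f"http://www.{dom}/{kw}contentpage.htm"
                for dom, kw in product(domains, keywords)]

# Extra pages for the main keywords on every domain ("-extra" naming is invented).
extra_pages = [f"http://www.{dom}/{kw}-extra.htm"
               for dom, kw in product(domains, main_keywords)]

print(len(subdomain_pages))  # 20 x 400 = 8000, not 6000 as stated
print(len(domain_pages))     # 8000
print(len(extra_pages))      # 50 x 20 = 1000
```

So the full cluster would be roughly 8000 + 8000 + 2000 = 18,000 pages, not 14,000 — which only makes the footprint easier for a search engine to spot.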
Thank you for your comments.
The only problem is that you break the published 'rules' - and therefore 'risk' being excluded. You need to read:
[google.com...]
But then again, a very large online Auction house is doing almost exactly the same thing, but more automated - so I guess what 'Google says', and what 'Google does' - are two totally different things...
Chris_D
Sydney Australia
They will have a year of fun, then go away, if they are for real.
m, m, m... maybe 8000? (20 x 400 is 8000, not 6000.)
Peace All!
[edited by: Marcia at 1:22 am (utc) on Sep. 22, 2003]
Do some searches for popular words, drill down into the 300's and count how many subdomains you see in the SERP.
I'd bet maybe one-in-300, at best.
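That "drill down and count" check can be approximated with a small script. This is a sketch, not a scraper: the result URLs below are invented, and the subdomain test is naive (it treats any host with more than two labels, ignoring a leading `www.`, as a keyword subdomain, which misclassifies multi-part TLDs like .co.uk).

```python
from urllib.parse import urlparse

def is_subdomain_result(url: str) -> bool:
    """Rough check: does this result's host look like a keyword subdomain?
    Strips a leading 'www.' label, then counts remaining labels."""
    host = urlparse(url).hostname or ""
    labels = host.split(".")
    if labels and labels[0] == "www":
        labels = labels[1:]
    return len(labels) > 2

# Hypothetical sample of SERP result URLs.
results = [
    "http://www.example.com/page.htm",
    "http://widgets.example.com/",
    "http://example.org/widgets",
]
subdomain_hits = sum(is_subdomain_result(u) for u in results)
print(subdomain_hits)  # 1
```

Run over a few hundred real results for a popular term, the ratio this computes is the "one-in-300" figure being estimated above.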
Yep. It is just a matter of when Google would crush this.
I would like to agree, but I can't.
If they penalize him (in a few months/years or so), he will already have made his revenue.
Afterwards he will get new domains, new IPs and start again.
I have no doubt that he'll succeed with his plan.
- Google doesn't react when sites are reported for spam
- and they can't effectively detect it automatically
sad but true
One thing I noticed about the comments in this thread is that most of you are quick to render the Google death penalty to this guy.
Have you guys tried richardvanhooijdonk's method?
As far as I know this technique is doing well in Google's index. In fact, his campaign is small time compared to some people I know. 20 domains? That's nothing compared to those who are in this type of business.
If this guy is good, the only way Google could catch him/her is by someone reporting the cluster, and that takes time. Time for that someone to track all the domains, find the evidence, collate it into a report, then send it to Google, which in turn takes additional time.
By the time the cluster is removed, this guy will have made a good profit that a lot of webmasters here are struggling to match.
If this person can make really good money doing this, what is to prevent him/her from doing it again, probably bigger the next time around? Or, while he/she is in the index, turning the profit into another independent cluster, and so on and so forth. If one cluster is taken out, there are more clusters to take its place.
The only way to take out this type of business/practice is for Google to programmatically identify the cluster, not rely on manual reporting. But that's Google's problem, not ours.
Just reality folks, whether you like it or not.
Cheers :)
We also create contentpages ....We also create contentpages....We also create contentpages
Er, an automated system creating content pages? I think you meant to say "spam pages".
Yeah, it will work. Terrific. Start straightaway. In fact replicate it a few times on different sets of keywords. Take on loads of staff so you can multiply the 8000 (or 6000 if you prefer) by a hundred or two hundred and make millions of sub domains and millions of unique domain pages all with unique content. Hey, that's what everyone's been waiting for.
Maybe you learn something from it
I hope everyone does, and I hope it's not the lesson you meant them to learn.
I ended up getting banned with this method, on two separate non-adult keywords.
If you're going to do it, you're wasting your time. This was a popular method amongst Russian webmasters, and subsequently it gets you penalised.
P.S. It's not a pagerank strategy ;)
Don't go explaining to the world how to get them banned, it's just plain rude.
2c.
I hope everyone does, and I hope it's not the lesson you meant them to learn.
Judging by the replies, it is clear this technique is considered unethical. But, as also stated in the "large online auction" post, it does work.
The only thing you really have to worry about is Google being able to recognise the redirect or the keyword spam on the page.
Whether we like it or not, we're probably going to see this more and more - until G finds a way to effectively eliminate these clusters.
Welcome [webmasterworld.com] Sadguy01,
sorry, please don't confuse the readers here with false statements - that is simply untrue!
Simple
Google's backward links database (link:www.etc.) = links affecting the site's position + a sample of all links, even those not contributing to ranking.
For example, guestbooks are sometimes shown in backward links, but are filtered from affecting ranking (it's deemed spam by Google). Links from subdomains of a single domain are in a similar category. It's one of the first filters they applied; surely you remember this?
But I honestly would like to think that the advice given on this forum isn't to help people spam Google, but to improve their rankings in Google more legitimately.
Not to argue with you, but IMO whether we are requesting links or optimizing our content, strictly speaking we are spamming the search engines. However, I understand what you mean... that we as webmasters have formed what are acceptable and unacceptable practices/techniques for pushing our pages up in the SERPs... acceptable practices we call search engine optimization... unacceptable practices we call search engine spamming. Fact is, they are both the same.
A few years ago, hidden keyword stuffing and doorway pages were acceptable techniques; now, if you use them, you are labeled a spammer.
Each one of us would like to have an edge over our competitors. Some of us will push the envelope to a certain limit and there are some of us that will push the envelope all the way. It is a question of whether it's worth the risk or not. Ethics and morals have nothing to do with these practices.
Don't get me wrong, I would like to see a perfect search engine too, an engine which rewards you accordingly for your content minus the optimization. But in reality, there will always be some kind of flaw in search engines, and we will always exploit that defect.
However, some of us are quick to take the moral stand against those who are willing to take the risk. Could it be envy because they are afraid to take the same risk? And, because they can't compete on this level, would it be right to call the technique unethical?
Unethical to whom? Is it because the risk-takers don't follow the published search engine guidelines?
See, search engines rely on webmasters to comply with their rules. As I have pointed out in other threads, SEs and webmasters have a conflict of interest. There will always be someone willing to break the rules. That's the problem, and when SEs are slow to enforce their own rules, it just encourages more rule breakers.
In the real world, rule breaking is not acceptable and, most often, enforcement is swift and heavy; this puts everybody in line with the rules. Fear is a tool; without it, a rule is useless. You can't expect everybody to comply of their own free will.
When Google was in its infancy, we webmasters solidly supported it. We believed in Google's core ideals. We believed that we would be rewarded for our hard work on our content-rich sites. We believed that punishment would be swift and heavy against the rule breakers.
Now Google has matured; it no longer needs us. It has its own interests to take care of, which are now often in conflict with a lot of webmasters, and even with its own core ideals. The things that we care about most...
1. ranking according to site merits(quality of content).
2. swift penalty of rule breakers
are, needless to say, pipe dreams and will remain in the land of idealism.
So, if talking about multiple domains, cloaking, and other 'black' techniques is now taboo within the webmaster circle, then we are just helping Google to be lazy (for lack of a better word) in dealing with these techniques.
IMO, by talking about it, we make Google aware that we know these techniques work, and that if this is not taken care of, there will be an increase in the use of such practices.
Cheers
5. When a user enters a content page of the mother-page, he will be automatically transferred to the mother-page
8. We submit to dmoz.org
9. We try to get additional links from individual relevant sites.
11. And wait for Google's cold finger
The original poster's surname suggests he is having a laugh.
err.. perhaps I don't get it.. but there's nothing funny about "Van Hooijdonk".
9. We try to get additional links from individual relevant sites.
Indeed, this is probably the hardest part. I wouldn't link to this cluster if Meneer Van Hooijdonk paid me a million dollars :-) But then again, there are plenty of sites out there that will link to such a site without realizing the consequences.
The only "unethical" SEO that I see is that which is dishonest. If you throw up a site optimized for widgets, and instead you direct your visitors to some triple-X porn site... THAT is unethical.
I agree so long as this includes not using extreme techniques like link farms to artificially boost PR, etc. As for my widget sites, *of course* I optimize it for searches on widgets. Why on Earth would I not want to optimize my site for searches on what it is about?