Forum Moderators: martinibuster
A friend of mine asked me the following: if I put up a link to her site, she would put up a link to mine (because she knows Google too). Plus, she'll give me a list of related sites that I could ask to do the same. Thoughtful, huh? Well, not if you realise the list is already 150 websites long...
Because I have to send out 148 email messages with that request. Then, if accepted, I have to put up the link on my page. And I have to check whether they put up my link... and whether they keep it up, not taking it off after one week...
So I thought of the following (maybe reinventing the wheel, so stop me a.s.a.p. if that's the case):
I put up a small server with a database containing links to (related) websites. Every member gets a list of links that they can approach with a request to put up a link to their website. If the other party accepts, they both put up a link to each other.

So far so good: this is nothing new, as there are services offering this already. What those services do is provide you a 'page' on their domain where you add your links etc. In my case it's a page that is part of the original domain, giving the designer full freedom over layout, 'personal' links to be included, etc. AND: search engines will see it as a genuine link page, not as a made-up one stored at some 'link-exchange' scheme/program. I don't know exactly how these spiders work (indexing links), but I guess this is a 'safer' way (for now and the future).
This is what I would like to get an answer on:
('www.linkbank.com' is just a made-up name)
1) What I have in mind is the following: a link on CompA.com to CompB.com would look like this: www.linkbank.com/41/678.
Meaning: linkbank.com is the database website, 41 is the ID of CompA, and 678 is the ID of CompB. Why this way? Because then we can check the links on link pages more easily/faster (I think...) and we can produce statistics about traffic.
This method is needed because I can't access the individual link pages: they are hosted on each website's own server. This way, a visitor clicking on the link is connected to the link server and from there is 'redirected' via the links database to the requested page.
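The lookup step described above could be sketched roughly like this. This is only an illustration of the idea, not an implementation: the table contents, IDs, and URLs are hypothetical, and a real link server would answer with an HTTP redirect (and could log the hit for the traffic statistics mentioned).

```python
# Hypothetical in-memory stand-in for the links database.
# The member IDs and target URLs are made-up examples.
link_db = {
    (41, 678): "http://www.CompB.com/",   # member 41's link to member 678
    (678, 41): "http://www.CompA.com/",   # the reciprocal link
}

def resolve(path):
    """Map a request path like '/41/678' to the stored target URL,
    or None if the pair is unknown (or has been disabled)."""
    parts = path.strip("/").split("/")
    if len(parts) != 2:
        return None
    try:
        source_id, target_id = int(parts[0]), int(parts[1])
    except ValueError:
        return None
    return link_db.get((source_id, target_id))
```

So a request for www.linkbank.com/41/678 would resolve to CompB's real address, and the server would then send the visitor there with a 301/302 redirect.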
But this also provides the option of 'blocking': in case site A takes down the link to site B, it wouldn't be fair for B to keep linking to A. So after testing the links, if one of them turns out to be missing, both links are 'blocked' or disabled.
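The blocking rule could be checked with something like the sketch below. It assumes the two partner pages have already been downloaded (the function just looks at their HTML), and the pair labels and the `disabled` set are hypothetical bookkeeping, not part of any real system.

```python
# Hypothetical sketch of the 'blocking' rule: if either partner's page
# no longer carries the agreed link, disable both directions.
def check_pair(html_a, html_b, link_on_a, link_on_b, disabled):
    """html_a / html_b: the fetched page sources (assumed already downloaded).
    link_on_a: the URL site A promised to carry (pointing at B), and vice versa.
    disabled: a set collecting blocked (source, target) pairs."""
    a_ok = link_on_a in html_a
    b_ok = link_on_b in html_b
    if not (a_ok and b_ok):
        # One side removed its link, so the reciprocal deal is off:
        # block the redirect in both directions.
        disabled.add(("A", "B"))
        disabled.add(("B", "A"))
    return a_ok and b_ok
```

A real checker would of course fetch the pages over HTTP and parse the anchors properly rather than do a plain substring search, but the disable-both-directions logic would be the same.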
2) In case point 1 is a good solution: there should still be a 'normal' link (www.CompB.com) somewhere in the page or link that can be crawled, spidered and indexed. So how do GoogleBots 'read' a link? Do they also look at the name of a link? Or can one include a 'normal' link in a 'composed' link, e.g. 'www.linkbank.com/41/678+www.CompB.com'?
(Meaning that Google should only take the part after the plus (+) sign, or some other code, structure or syntax.)
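Mechanically, splitting such a 'composed' link is trivial, as the sketch below shows. But to be clear about the assumption: this '+' syntax is the poster's own invention, and there is no indication that Google's crawler honours anything like it - a crawler would most likely treat the whole string as one opaque URL on the linkbank domain.

```python
# Sketch of parsing the proposed (hypothetical) 'composed link' syntax:
# a redirect URL with the real destination appended after a '+'.
def split_composed(link):
    """Split 'www.linkbank.com/41/678+www.CompB.com' into the redirect
    part and the plain destination (destination is None if no '+')."""
    if "+" not in link:
        return link, None
    redirect, destination = link.split("+", 1)
    return redirect, destination
```

Note also that '+' is a legal character in URLs (it commonly encodes a space in query strings), which is one more reason a crawler would not treat it as a separator.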
Please let me know:
a: whether I'm reinventing the wheel. If so, where can I find the existing, spinning wheel?
b: where I can find more detailed info about these topics?
c: whether my point 2 (Google link syntax) is an option/workable solution?
d: whether I'm wasting my time, because one won't achieve much better rankings from links on, let's say, 200 websites?
Any help/advice is welcome. We are in a not-too-affluent industry (many small, struggling businesses) in a partly sophisticated country, so this might be a reasonable option for trying to attract some more traffic.
Thanks in advance!
I've explained the 'whole' concept, which is not entirely Google-related, but I thought it would make my questions easier to understand, so don't ostracize me for it...
What you describe is a link farm or a link circle - try the site search here at WebmasterWorld and you'll see that you would be wasting your time with that!
Focus on quality content - first and foremost - and the rest will come by itself! ;)
I wasn't suggesting a 'linkfarm'; she's 'only' putting up many links on her website.
Reading your comments, I'm not quite sure whether her basic method is also a bad idea.
One other thing to avoid is linking to problem sites - if a site has PR0 or is employing any obvious spammy techniques, don't link to it.
I'm sorry for asking a lot, but as I said, I'm really new to this. Even after spending a lot of time reading about it and going through manuals (WebPositionGold), I still have no clear picture of what is advisable - the do's and don'ts. I have my website up and running, but I'm far from sure whether my keywords are the right ones, or whether my ranking could or should be improved. In short: newbie.
Thanks for your patience and understanding :-) And for your replies!
As Marcia just wrote, this is the best site to learn about Google. But it takes a long time :(
I suggest that you first quickly read Google Information for Webmasters [google.com]; it has a brief section on PageRank (PR) to get you started. Also download the Google Toolbar [toolbar.google.com], which allows you to measure (approximately) the PR of a page.
Then look at the top of the WebmasterWorld page, where you will find a site search. Search for PR0 (zero), which is what Google gives to people who cheat.
It is probably best to ask Google-specific questions in the Google Forum, and also to read what is being posted there. WARNING: that is not a low-volume forum :)
Also, the crosslinking issue was brought up, and here is a reference to a discussion I put together:
Crosslinking, Interlinking and Reciprocal Linking [webmasterworld.com] - I need to update it with newer discussions, but it's very helpful for getting you going on what folks have been thinking about the topic.
I hope these help.