| 12:23 pm on Sep 9, 2004 (gmt 0)|
i've seen 1000+ links per page, and that page was NOT considered a spam page.
| 12:45 pm on Sep 9, 2004 (gmt 0)|
Google will very rarely use a single measure (e.g. number of links) to determine whether or not a page is spam. They could, but then they'd be hurting a lot of legitimate pages as well. As a result they probably look at several factors.
For the other part of your question, give it time. Google can take a while to fully crawl a site. Many have suggested that better Page Rank will get more pages on your site crawled faster, so you may want to investigate that option. Otherwise, make sure your pages are linked well so that a spider can find them all and be patient.
| 12:50 pm on Sep 9, 2004 (gmt 0)|
I'll keep it short, since all these questions have been answered before.
Google recommends less than 100 links per page. This is a recommendation and doesn't mean pages with more links won't work.
PageRank determines how much time the Google spider will spend crawling your website and how frequently the crawler will return. This means there is no page limit, but rather a time limit. To get more pages into Google you can either increase your PageRank or make page delivery faster.
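The "time limit, not page limit" point above can be sketched with some made-up numbers. This is only an illustration of the idea; the budget and response-time figures are assumptions, not anything Google has published:

```python
# Hypothetical illustration: if the crawler allots a fixed amount of time
# per visit, faster pages mean more of them get fetched per session.
# All numbers below are invented for the sketch.

def pages_crawled(budget_seconds: float, avg_response_seconds: float) -> int:
    """Rough estimate of how many pages fit into one crawl session."""
    return int(budget_seconds // avg_response_seconds)

# Same hypothetical 10-minute budget, two different server speeds:
slow = pages_crawled(600, 2.0)   # 2.0 s per page -> 300 pages
fast = pages_crawled(600, 0.5)   # 0.5 s per page -> 1200 pages
```

Under this toy model, quadrupling delivery speed quadruples the pages crawled per visit, which is the "make the page delivery faster" advice in numbers.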
| 4:57 pm on Sep 9, 2004 (gmt 0)|
I didn't ask how long it would take to get my links indexed; I'm just curious how many links to put into a sitemap without spamming my page and hurting its SEO. Hasn't anyone really considered this dilemma? In addition, I'm only interested in the first pass by Google. We've run several experiments on how many pages Google will follow, and thought perhaps others in this field might share their experiences. I'm also curious whether placing all these links next to one another might make the page look more spammy. Wouldn't everyone agree that part of Google's algorithm devalues a page (or adds SEO value) based on how many links are on that page, and possibly their proximity to one another?
| 5:04 pm on Sep 9, 2004 (gmt 0)|
>Wouldn't everyone agree that part of Google's algorithm devalues a page (or adds SEO value) based on how many links are on that page, and possibly their proximity to one another?
I would disagree based on the evidence. If Google devalued pages with lots of links on them, I would expect directories like the ODP to rank very poorly in Google. They don't. In fact, people here have been complaining that directory pages tend to dominate in the SERPs over specific sites relevant to the query.
| 6:51 pm on Sep 9, 2004 (gmt 0)|
" would expect directories like the ODP to rank very poorly in Google. They don't. In fact, people here have been complaining that directory pages tend to dominate in the SERPs over specific sites relevant to the query.
I disagree. But what happens to a page you cram 101 links into, and then do this for all your webpages? Also, in general the ODP does NOT put over 100 links on a single page (in most cases nowhere near that many). I think your example fails to justify your assumption. Anyone have direct experience with number of links per page? Or examples of sites with hundreds of links per page that do well?
| 11:32 pm on Sep 9, 2004 (gmt 0)|
I'm just confused: are you guys recommending that it would be OK to put hundreds (thousands?) of links per page, and that this setup would NOT harm that page's SERPs? I completely disagree, and furthermore believe that the number of links per page is somewhat relative to that page's individual SEO strength.
In addition, are you saying that if you did have hundreds or thousands of links on a page, Google would eventually follow every single one of them and index them all? From our experience, neither scenario is optimistic or realistic.
| 12:01 am on Sep 10, 2004 (gmt 0)|
Personally I wouldn't go over 100 links per page. I read somewhere that Google will stop spidering text at 100 kb, and only after it comes back will it pick up the rest of the text on a page it hasn't completely spidered.
I'm not sure if the above statement is completely true, but something else to think about is whether your visitors will like a website that has over 100 links on a page.
I would think that the website would look a little too busy for my taste.
Just something to think about.
| 12:11 am on Sep 10, 2004 (gmt 0)|
giga, if your page has PR 5+ and gets lots of visitors per day, my theory is that Google, like Alexa, notices that via the toolbar. I believe those heavily visited pages show a fresh date in the results every day, mostly in the top 50, and their most-visited subpages show a date a day earlier or later. That suggests Google does indeed crawl all the links on the index or site map of a popular, high-PR page. As for how many links you can put, a good portion is about 150 of your most targeted subpages, while you add other links on your high-PR subpages.
| 2:08 am on Sep 10, 2004 (gmt 0)|
I have a pr5 page that has 152 links plus a 54 link menu. The page doesn't do very well at all, which is how I planned it.
BUT, the 154 pages linked to from this "directory" type page do pretty well: in the range of 2nd-12th out of 800k-1.2 million SERPs, depending on the 3-word search term used.
I wouldn't run more than 100 links per page on a regular basis (just a feeling), but it works from time to time in specific instances.
| 2:53 am on Sep 10, 2004 (gmt 0)|
Now that's the kind of replies and info I was looking for. Good advice, and thanks for sharing your experience.
| 3:27 am on Sep 10, 2004 (gmt 0)|
It's not a great idea IMO to have tons of pages with hundreds if not thousands of links each.
I think everyone here is giving you examples of what they would/wouldn't do in order to create *good* pages but you obviously have a different agenda.
Without explaining your intentions, it's going to be hard for people to put out an answer that really gets at the underlying intent of your question.
If you have thousands of links on multiple pages with <PR6 or so it will:
1. Take a while for these links to be crawled
2. Make the pages too large for general SE page size guidelines
3. Be of little use to most visitors (that I can imagine)
4. Lend very little bonus to the linked-to pages
5. Look pretty spammy
| 4:14 am on Sep 10, 2004 (gmt 0)|
To answer the original question, Google will pick up the first 100 kb of a page (note this is not the first 100 links, but 100 kb of HTML content), even if that consists entirely of links.
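The 100 kb idea above can be sketched in a few lines. This is a toy model only: the cutoff is the figure quoted in this thread, not a documented limit, and real crawlers measure bytes of served HTML, not characters:

```python
# Sketch: only links appearing within the first ~100 KB of HTML would be
# seen on the first pass, per the claim above. Stdlib parser, no network.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def links_in_first_100kb(html: str) -> list:
    parser = LinkCollector()
    parser.feed(html[:100 * 1024])  # truncate at the assumed 100 KB cap
    return parser.links
```

Feeding it a page where a link sits beyond the cutoff shows that link simply never being collected, which matches the "come back later for the rest" behaviour described earlier in the thread.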
The only negative effect I've seen is that it will tend to split the page's outgoing PR up a lot, but that doesn't matter as much, assuming all the linked-to pages send PR back to be recycled within the site in some fashion.
| 5:47 am on Sep 10, 2004 (gmt 0)|
Note message #5 from GoogleGuy and this first thread...
How strict is the 100 links per page concept?
Getting Google to crawl 150,000 pages
What's the best way to get Google to crawl deep?
|Hasn't anyone really considered this dilemma? |
giga - Airportibo's answer is a very good one. Maybe you should read it again. Searching WebmasterWorld on Google is an excellent way to find answers to questions that have been asked a lot, and to see whether they've been asked.
| 6:08 am on Sep 10, 2004 (gmt 0)|
|The only negative effect I've seen is that it will tend to split the page's outgoing PR up a lot, but that doesn't matter as much, assuming all the linked-to pages send PR back to be recycled within the site in some fashion. |
THANK YOU VERY MUCH, excellent points.