We recently got the "smart" idea of expanding our sitemap to individually list and link to every page on our site, using a format like keyword-A, keyword-B, keyword-C... where the same keyword is listed dozens of times. The thinking was that the sitemap could double as a table of contents that visitors could actually use.

Ours is a commercial site of a couple hundred pages (many of them PDF spec sheets) showing the products we manufacture and sell; we do not sell online. The purpose of the site is to inform and to generate requests for quotes. There are six different keywords on the sitemap, each repeated dozens of times with unique trailing qualifying adjectives. The sitemap is linked from the index page and from every other page on the site.

Before the "expansion" it carried a PageRank of 5; now it is 0 (a white bar). Initially the expansion was a success: we maintained PR5, and the sitemap itself ranked fairly high for a number of keywords; in many cases it was our highest-ranked page for a particular keyword.

But did we outsmart ourselves and incur a Google penalty for spam? Would this be an automatic, algorithm-generated penalty, or one that would have to be reported manually?