Google, via Matt Cutts, has stated that it has the ability to discount link value based on many factors, including a link's location within a page.
Similar-pages sections, typically found at the end of an article, can be considered automated, and it's clear that Google isn't a big fan of anything automated in general.
Of course, the best type of interlinking is manual links placed within the body of articles, but on large sites that's not an easy task. There is software that allows a webmaster to specify keywords and have them point to pre-selected pages. Some of these applications are quite advanced, letting you choose how many links per page are included, whether multiple keywords can point to one page from any other page, and so on.
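For what it's worth, the core of that kind of auto-linking software is fairly simple. Here's a minimal sketch in Python of the idea: a keyword-to-URL map, a per-page link cap, and one link per target page. All names here are hypothetical, and a real tool would need to avoid rewriting text that's already inside HTML tags.

```python
import re

def auto_link(text, keyword_map, max_links_per_page=3):
    """Link the first occurrence of each keyword, up to a per-page cap.

    text            -- plain article text (naive: assumes no existing HTML)
    keyword_map     -- dict mapping keyword phrase -> target URL
    max_links_per_page -- cap on total links inserted per page
    """
    links_added = 0
    linked_targets = set()  # enforce one keyword per target page
    for keyword, url in keyword_map.items():
        if links_added >= max_links_per_page:
            break
        if url in linked_targets:
            continue
        # Match the keyword as a whole word, case-insensitively
        pattern = re.compile(r'\b' + re.escape(keyword) + r'\b', re.IGNORECASE)
        new_text, count = pattern.subn(
            lambda m: f'<a href="{url}">{m.group(0)}</a>', text, count=1)
        if count:
            text = new_text
            links_added += 1
            linked_targets.add(url)
    return text

article = "A primer on link building and anchor text for large sites."
links = {"link building": "/link-building", "anchor text": "/anchor-text"}
print(auto_link(article, links))
```

Running this links both phrases in place, e.g. `<a href="/link-building">link building</a>`, which mimics the Wikipedia-style in-article linking discussed below.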
If I switch from using the article-bottom similar-pages section to software that links actual words within articles (à la Wikipedia), and set it up so that the number of links per page stays the same, points to the same places, and so on, should I expect a slight rankings boost over time?
Or do similar-pages sections already get full link value?