Mapping site design to target keywords is a fine art. The goal is a site architecture that organizes keyword-rich content so that it serves users well and ranks well in search engines. One of the key mistakes people make is creating multiple pages that end up competing for the same keywords.
For most sites, this is a serious mistake. When your own pages compete with each other, there is a strong chance that the search engine will see them as duplicate content. In addition, you are dividing your link power between two pages chasing one set of keywords. Best case, you will end up with a double listing (one of those listings where two of your pages are shown together in the search results). Worst case, the search engines will see the pages as too similar (duplicate) and list only one of them.
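One way to catch this problem before it happens is a simple audit: list the keywords each page targets, invert that into a keyword-to-page map, and flag any keyword claimed by more than one page. A minimal sketch in Python (the page URLs and keywords here are invented for illustration):

```python
# Hypothetical keyword-to-page map for a small site.
# Note that "acoustic guitar reviews" is claimed by two pages.
PAGE_KEYWORDS = {
    "/guitars/acoustic": ["acoustic guitars", "acoustic guitar reviews"],
    "/guitars/reviews":  ["acoustic guitar reviews", "electric guitar reviews"],
    "/guitars/electric": ["electric guitars"],
}

def find_competing_keywords(page_keywords):
    """Return {keyword: [pages]} for every keyword targeted by 2+ pages."""
    by_keyword = {}
    for page, keywords in page_keywords.items():
        for kw in keywords:
            by_keyword.setdefault(kw, []).append(page)
    return {kw: pages for kw, pages in by_keyword.items() if len(pages) > 1}

print(find_competing_keywords(PAGE_KEYWORDS))
# → {'acoustic guitar reviews': ['/guitars/acoustic', '/guitars/reviews']}
```

Any keyword that shows up in the output is a candidate for consolidation: fold the weaker page's content into the stronger one, or retarget one page to a different term.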
The problem is that the best case is unlikely to occur. You might be able to make it happen with a lot of forethought and planning. Some sites do this well, such as Amazon and CD Universe (for example, try searching for Nirvana CDs in Google). They have clearly built an entire site architecture that presents related information on the same topic in a well-structured manner.
These companies have access to content-generating machines. With that enormous wealth of content, you can architect a site in a similar fashion.
But for most webmasters, pulling content together takes substantial effort. There is a finite set of keywords you want to write content for. Meeting that initial demand for content, and later expanding the number of keywords you compete for, are much higher priorities than trying to double up on individual keywords.
Broader search-term coverage will usually bring much better results for your site. And, of course, if you are awash in content, you can then indulge in the luxury of doubling up.