Not sure if this is philosophical meandering or a statistical question. (Also not sure if I'm in the right forum, but the next passing Moderator will know.)
I took a closer look at my htaccess files. Plural*, because I've got one directory-specific file, the result of massively rearranging that whole directory. That one's got nothing but unconditional redirects; everything else is in the real htaccess.
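A representative line from that file, with invented filenames (one such line per moved page):

    # mod_alias: the old address is permanently forwarded to the new one
    Redirect 301 /paintings/old-picture.html /paintings/portraits/old-picture.html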
Every single thing in the secondary htaccess, and at least 90% of the main one, exists solely for the benefit of robots. (The other 10% is the no-hotlinking routine and my personal decision to lock out an entire country. Oh, and one picture that I've got linked from some forum, I can't remember which, so I can't edit its address at the source.)
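The no-hotlinking routine is the standard mod_rewrite referer check; roughly this, with example.com standing in for my actual domain:

    RewriteEngine On
    # Let through requests with no referer at all (direct visits, privacy settings)
    RewriteCond %{HTTP_REFERER} !^$
    # Refuse image requests whose referer is some other site
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
    RewriteRule \.(gif|jpe?g|png)$ - [F,NC]

The country lockout is the same idea in bulk: a stack of Deny from lines (or Require not ip on newer Apache) covering that country's IP ranges.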
If there were no robots, the /paintings/ directory would get along handily with its directory-specific 404 page listing the new subdirectories, along with a one-line htaccess drawing attention to it. Within the directory, all links are current and correct.
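The one-line htaccess in question is nothing more than this, with 404.html standing in for whatever the page is actually called:

    # Serve the directory's own 404 page instead of the server default
    ErrorDocument 404 /paintings/404.html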
When I change an illustration's format from jpg to png, or put it into a subdirectory, up goes another block of htaccess. Humans don't need it; the illustrations are called by the page they're on.
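The blocks themselves are short; roughly this, with invented filenames:

    # Format change: the old jpg address now serves the png
    RedirectMatch 301 ^/paintings/sunset\.jpg$ /paintings/sunset.png
    # Relocation: the picture moved into a subdirectory
    Redirect 301 /paintings/harbor.png /paintings/landscapes/harbor.png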
When I make a mistake in a link, pointing people to a nonexistent page, up goes another block of htaccess to intercept any robots that happened to see the incorrect link. (The mistake was only up for a few days and thankfully did not catch the attention of g###, or that particular redirect would have to stay there forever.)
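The intercepting block is nothing exotic either; one invented-filename example:

    # Catch requests for the mistyped address and send them to the real page
    Redirect 301 /essays/hisotry.html /essays/history.html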
Would the world end if everyone said "To ### with it" and just wrote their htaccess for humans?
* Dual, if you speak the appropriate language. I don't count the occasional Options +Indexes one-liners.