"live" to "live". Piped from my Ruby (sorry, I said it was C++ earlier) capture program to ffmpeg. ffmpeg has built-in support to send output to ffserver.
Note, however, that ffserver itself uses a temporary file for buffering. You control where that file lives through ffserver.conf, so you can arrange to store it on a tmpfs (RAM-backed) filesystem.
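For example, assuming your distribution mounts a tmpfs at /dev/shm (the feed name and size cap here are just placeholders), the Feed section of ffserver.conf would look like:

<Feed feed1.ffm>
# keep the buffer file in RAM instead of on disk
File /dev/shm/feed1.ffm
# cap the buffer so it can't eat your RAM
FileMaxSize 5M
</Feed>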
Of course, this was on Linux - it won't work on Windows due to the lack of real piping. The "-i -" parameter tells ffmpeg to take its input stream from STDIN.
./vbc50cap.rb | ffmpeg -an -r 15 -f mjpeg -i - -s cif -r 15 -g 15 -aic -umv -me full -vstats [localhost:809...]
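Any program that writes raw JPEG frames back-to-back on STDOUT can stand in for the capture side. Here is a minimal Ruby sketch, assuming you have some frame*.jpg files on disk to loop over - handy for testing the pipe without camera hardware:

#!/usr/bin/env ruby
# Emits an MJPEG stream on STDOUT: raw JPEG frames written back-to-back,
# which is exactly what ffmpeg's "-f mjpeg -i -" expects.
FPS = 15
frames = Dir.glob("frame*.jpg").sort
abort "no frame*.jpg files found" if frames.empty?

STDOUT.binmode              # don't let Ruby translate the binary data
loop do
  frames.each do |path|
    STDOUT.write(File.binread(path))
    STDOUT.flush            # push each frame down the pipe immediately
    sleep(1.0 / FPS)
  end
end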
Here's the rest of my ffserver.conf, more or less (exact values will vary - tune them for your setup):

Port 8090
# bind to all IPs aliased or not
BindAddress 0.0.0.0
# max number of simultaneous clients
MaxClients 10
# max bandwidth per-client (kb/s)
MaxBandwidth 1000
# Suppress that if you want to launch ffserver as a daemon.
NoDaemon

# FLV output - good for streaming
<Stream webcam.flv>
# the source feed
Feed feed1.ffm
# the output stream format - FLV = FLash Video
Format flv
# this must match the ffmpeg -r argument
VideoFrameRate 15
# generally leave this as a large number
VideoBufferSize 80000
# another quality tweak
VideoBitRate 200
# quality ranges - 1-31 (1 = best, 31 = worst)
VideoQMin 1
VideoQMax 5
# this sets how many seconds in the past to start
PreRoll 0
# webcams don't have audio
NoAudio
</Stream>
You certainly can have multiple copies of ffmpeg running, and ffserver supports multiple streams.
Of course, if you are scaling and/or transcoding, you will be limited at some point by CPU power.
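For instance, a single feed can fan out to several Stream sections at different sizes and bitrates - each extra rescale/transcode is where the CPU goes. Something like this (the stream names are just placeholders):

# full-size stream
<Stream webcam-big.flv>
Feed feed1.ffm
Format flv
VideoFrameRate 15
VideoSize 352x288
VideoBitRate 200
NoAudio
</Stream>

# thumbnail stream rescaled from the same feed
<Stream webcam-small.flv>
Feed feed1.ffm
Format flv
VideoFrameRate 15
VideoSize 176x144
VideoBitRate 64
NoAudio
</Stream>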