|live, streaming video. How?|
To "televise" a series of conference speaker events, I need to get my infrastructure in place for live, streaming video onto the interweb.
- 1 camera... maybe two. Definitely 2 microphones, mixed live.
- a moderate amount of latency is OK. The speakers won't be doing live phone calls with the remote audience or anything like that. But they will be reacting to interactions via a twitter feed w/ hashtags. So it's got to be actually live, not pre-recorded.
- minimalistic hardware please. If I can do this all on a macbook pro and a cloud server, I'll be ecstatic.
I've only done a little bit of research so far, and I'm not finding much helpful information. It seems like most people hire another company to do this, and it's rather costly. I'm more interested in doing it myself, and I'll rent whatever equipment, cameras, servers etc that I need to pull it off.
I'm not a technical newb and I am willing to put a lot of time and effort into getting this set up. I'm just not sure where to start my research.
can you point me the right way?
First you need a media server, and today there are lots of free streaming services. Search for "free conference streaming" and you should find plenty of options. These services are quite sophisticated: they let you create one or more channels, and they provide free encoding software. Their encoders stream to a Flash media server, and you should be able to get the encoder software for Macs, since they use Flash throughout.
Next you will need a camera, and here you have to be careful to check that its output ports suit the encoder's inputs. I used to use a couple of cameras for live events with DV outputs (I don't actually recall the port type), but newer cameras don't have those ports any more. So before you get yourself a camera, download the encoding software and read up on which ports it can use.
Then you'll need a decent Internet connection to stream over. You can lower the bitrate of your stream, but you need consistency. Also, consider your viewers: some may only get 500 kb/s, and even higher-speed networks get overloaded. There's nothing worse than watching a stream that lags.
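To make the encoder side concrete: one free alternative to a service's bundled encoder is ffmpeg. Here's a hypothetical invocation that pushes a MacBook's camera and mic to an RTMP ingest point; the device indices ("0:0"), the bitrates, and the rtmp URL are all placeholders that your streaming service would replace with its own values:

```shell
# Hypothetical sketch, not a drop-in command: capture the Mac's default
# camera (video device 0) and mic (audio device 0) via AVFoundation,
# encode to H.264 + AAC, and push to an RTMP ingest URL.
ffmpeg -f avfoundation -framerate 30 -i "0:0" \
  -c:v libx264 -preset veryfast -pix_fmt yuv420p \
  -b:v 1500k -maxrate 1500k -bufsize 3000k \
  -c:a aac -b:a 128k -ar 44100 \
  -f flv rtmp://live.example.com/app/your-stream-key
```

The `-maxrate`/`-bufsize` pair is what buys you the consistency mentioned above: it caps bitrate spikes so viewers on slower connections don't stall.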
What I do recommend, before committing any cash, is to set up a free streaming account and then borrow a camera. Don't try to save a backup recording while streaming, because your notebook will be lucky to have enough resources just for the stream. Backups can be created on the media-server end.
If you end up with a streaming service that has a free threshold and charges thereafter, be aware that whatever bitrate you provide to one user gets multiplied by every extra user... so 1.5 Mb/s x 100 users = 150 Mb/s!
Frame size, frame skipping, number of colors and to a degree sound quality can be used to dramatically lower the bit rate per user.
Back in the day (CU-SeeMe) frame size was 160x120 px and it took a LOT of users to swamp a T1 connection!
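The arithmetic above is worth internalizing, so here's a quick back-of-the-envelope sketch (function names are mine, just for illustration) of both the per-viewer multiplication and why frame size and frame rate dominate the per-user bitrate:

```python
# Back-of-the-envelope bandwidth math for unicast streaming.
# Illustrative sketch; real encoders complicate this, but the scaling holds.

def server_bandwidth_mbps(per_user_kbps: float, users: int) -> float:
    """Every viewer costs the full stream bitrate on the server's uplink."""
    return per_user_kbps * users / 1000.0

def raw_video_rate_mbps(width: int, height: int, fps: float,
                        bits_per_pixel: int = 12) -> float:
    """Uncompressed 4:2:0 video rate (~12 bits/pixel) before the codec
    shrinks it -- shows why frame size and frame rate matter so much."""
    return width * height * fps * bits_per_pixel / 1_000_000

print(server_bandwidth_mbps(1500, 100))   # 1.5 Mb/s x 100 viewers -> 150.0
print(raw_video_rate_mbps(160, 120, 10))  # CU-SeeMe-size frame at 10 fps -> 2.304
```

Halving the frame dimensions cuts the raw rate by 4x, which is why those 160x120 streams were so cheap to serve.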
Would you say that this isn't a DIY project that I would want to set up on my own? I'm keen to run this show myself, perhaps on a rented cloud server, but... tell me if that idea is insane.
Cameras, I can get - all sorts and types with any connection imaginable. I have connections in the video world - a brother who does video editing & production ;) but he doesn't know anything about streaming online :(
One other thing I know zilch about is how I'd mix the two audio sources (one mic on stage, another in the audience) with the video, and get that going in real time for the broadcast. Oh and it would be amazing if I could also add graphics, like titles or those infobars at the bottom of the screen with people's names on them.
Perhaps I'm getting too ambitious already. First things first.
Learn to walk before trying to run: see it working in its basics first. Mixing mics and cameras will require additional software, and so will overlaying titles. I remember a time when we had to write our own software for live overlays, but there's a lot available today.
I've signed up for Livestream's free plan, and I need look no further; it's everything I wanted and more.
thanks for the advice!
I haven't looked into the Livestream site very deeply... but to answer one of your questions, the device that's needed to allow you to use multiple cameras, as well as graphic inputs, is called a "video switcher"... or video switcher and mixer. Livestream may have all this integrated into one package, along with a web interface, but I'm not seeing exactly where.
Video switchers used to take up a whole studio. Now they can fit into small portable devices. It can't be just software, though there are some ingenious software packages; hardware interfaces of all sorts are still needed to connect the cameras, microphones, and switchers you use.
One camera is many orders of magnitude simpler than multiple cameras, though necessarily also much more limited.
Let me say that there's both art and technology involved in coordinating a live multi-camera video broadcast, or a live video taping session. It's more than a one-person job, and involves properly matched cameras, live color correction capabilities, appropriate technical specs for all pieces to interconnect, audio mixers, and color-calibrated video monitors... along with people who know how to use the tools. Pre-production of graphics is of course necessary.
The live Twitter feed suggests either a separate graphics input (and thus video switching)... or you might try projecting the Twitter feed on a screen and having the speakers interact with it that way... with the camera panning off to the screen. That can be pretty awkward, though, depending on how well you set that up, balance the light levels, etc... and how good the camera is.
You also need to understand the dynamics of video presentation and editing enough to do it in real time. I wouldn't try this with anything important without a lot of practice and preparation. There is a reason people hire professionals, even for the most apparently simple jobs, on such a production. I've found that every shortcut you take in such situations, unless you really know what you're doing (and are lucky), can ultimately be very costly.
Good audio, btw, is critical, and is the common failing of much of the web video I've seen.
That's a lot to learn... luckily I have a sibling who does this professionally and he's offered to help with the production. He's got dozens of cameras, dozens of microphones of every sort, mixing equipment, monitors... it couldn't be more convenient. I've experimented a little with LiveStream already, and I realize that to pull this off is going to take practice and a dozen dry runs.
One thing I think is pure awesome is that from one console I can have a fixed camera pointed at the podium, stream in multiple channels from other cameras, and even "plant" handheld cameras in the audience and on stage by streaming in video from an iPhone.
As you describe them, Livestream's capabilities are impressive, and I'd like to know more. From what I saw on their site, I'm not at all sure how the components hook up or what the actual on-the-frontline control interface is like.
Their online video demo struck me as superficial, almost to the point of being deceptive. I mean, that sailing footage shown in the video was not live-switched by someone with a laptop calmly sitting on the deck. On one yacht-racing shoot I remember vividly, I was hanging on for dear life and my crew was below decks taking turns puking into the galley sink. The Livestream video made all that look way too easy, and I think they must have glossed over the switching operations too. ;)
I got the sense from the video that the Livestream system itself provides software-based switching of visual sources, probably in addition to traditional camera switching handled by a hardware video switcher. Or does Livestream handle the video switching too? I'm thinking the software must convert all video and web-feed inputs to a common streaming protocol, control them all, and ultimately deliver audio and video channels to a streaming server. Does that describe anything close to what it does? Would love some background on how it's set up.
|...luckily I have a sibling who does this professionally and he's offered to help with the production. |
That's very lucky, particularly because, as you described him, he does "video editing & production", and those are key skills here. In a live production, the video director, who calls the moves in real time and maybe does his own switching (real-time editing), is key... and it takes an amazing presence of mind to handle it all. Some of the work is best divided up among several people, with headsets used to coordinate everybody.
|I know zilch about is how I'd mix the two audio sources (one mic on stage, another in the audience) |
I don't see the audio as being particularly difficult, but ideally you should have an audio mixer (i.e., a person listening and running the controls, in addition to the audio-mixing hardware/software). Otherwise you're going to have multiple microphones open, and that's liable to give you some occasionally very odd sound.
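To make the mixer's job concrete: digitally, a two-mic mix boils down to a per-channel gain, a sum, and clipping protection. A toy sketch (the function and gain values are made up for illustration, not any particular product's API):

```python
def mix(stage_mic, audience_mic, stage_gain=1.0, audience_gain=0.5):
    """Mix two mono sample streams (floats in [-1.0, 1.0]) into one.

    A human mixer's job is to ride these gains in real time, e.g. pulling
    the audience mic down while the speaker talks so both aren't wide open.
    """
    mixed = []
    for s, a in zip(stage_mic, audience_mic):
        v = s * stage_gain + a * audience_gain
        mixed.append(max(-1.0, min(1.0, v)))  # hard clip to avoid overflow
    return mixed

print(mix([0.5, 0.9], [0.5, 0.9]))  # [0.75, 1.0] -- second sample clipped
```

Leaving both gains at full is exactly the "multiple open microphones" problem: the summed signal clips, and you get that harsh, odd sound.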
The traditional physical analogy for a multi-camera shoot is that each input is a separate audio or video "track", and you can switch and/or fade between these, or layer them. In real time, with multiple cameras, you use the time during which a camera isn't "live" to reset it for the next shot.
It's helpful to plan the basic coverage in advance, and then improvise around that. Overall, though, everything has to run like clockwork, particularly your graphics sources, which need to be in careful sequence. Plan out every detail, down to things like whether the principal speaker is going to read the questions from Twitter aloud.
I myself wouldn't have a fixed camera on the speaker at the podium, but that's me. Lots of people these days are using fixed cameras on lectures... but, IMO, not having a camera operator forces your main shot of the speaker to be way too wide, and the whole presentation becomes a lot duller. I'd sacrifice the hand-held camera onstage to have an operator on the podium camera.
I think you'll see as you run through it that it may be helpful to have separation of tasks... producing, directing, switching, engineering... depending upon complexity. A live production is much more demanding than one where you record the isolated cameras and edit later. The simpler you can make things, the smaller your crew can become.
When you use iPhones as handheld video sources, btw, try to make their angles distinctive enough that you can get away with the extreme quality mismatch you'll have between those and your main tripod cameras. IMO, you're better off using them for audience coverage and for, say, an extremely wide shot, than you are to try to use them for another angle on your speaker on stage. Pay special attention to "screen direction", and try to keep that screen direction consistent. Your brother can tell you more about that.
When is this going to happen, and what part are you going to handle?