Many-to-many video/audio broadcast with WebRTC in the browser
My latest project required an audio/video call feature where a user can turn audio and video on during the call, and can initiate a call even when the other party is not yet online and waiting for the connection.
After some research I found a WebRTC solution that fits my requirements: RTCMultiConnection.
Here I want to give step-by-step instructions for implementing an audio/video broadcast with stream id sharing via Firebase.
The full source code for this sample can be found on GitHub.
To start using this lib we need to include it. In this demo I'll use the public demo server for signaling, but you can deploy your own server.
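A minimal include could look like this (the CDN paths and versions are my assumptions, not from the article; socket.io is needed for RTCMultiConnection's signaling):

```html
<!-- socket.io client is required by RTCMultiConnection's signaling layer -->
<script src="https://cdn.jsdelivr.net/npm/socket.io-client@2/dist/socket.io.js"></script>
<script src="https://cdn.jsdelivr.net/npm/rtcmulticonnection@3/dist/RTCMultiConnection.min.js"></script>
<script>
  var connection = new RTCMultiConnection();
  // public demo signaling server; deploy your own for production use
  connection.socketURL = 'https://rtcmulticonnection.herokuapp.com:443/';
</script>
```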
To support the Firebase database, also add the required dependencies:
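For example, with the Firebase v8 browser SDK (the version and placeholder config values are my own; take the real config from your Firebase console):

```html
<script src="https://www.gstatic.com/firebasejs/8.10.1/firebase-app.js"></script>
<script src="https://www.gstatic.com/firebasejs/8.10.1/firebase-database.js"></script>
<script>
  // placeholder values; use your project's config from the Firebase console
  firebase.initializeApp({
    apiKey: '...',
    databaseURL: 'https://your-project.firebaseio.com'
  });
</script>
```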
Let's go code. To handle the audio/video state I will use a simple "model" with two fields that reflect the checkbox state.
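A minimal sketch of such a model (the field names are my own, assuming one checkbox for audio and one for video):

```javascript
// Simple "model": two boolean fields mirroring the checkbox state.
var model = { audio: false, video: false };

// Called from the checkboxes' change handlers; returns the updated model.
function updateModel(field, checked) {
  model[field] = checked;
  return model;
}
```

Each checkbox's `change` handler would call something like `updateModel('audio', this.checked)` and then react to the new state.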
On every model change we determine what to do with the sender stream: start it, toggle mic/video, or stop it.
If there is no current stream, create a connection and start it.
If we are already online, change the stream options: stop or start the video or audio track.
If both toggles are off, stop the stream.
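The three cases above can be sketched as follows. `decideAction` is a pure helper of my own, and `applyAction` is only an illustration: the RTCMultiConnection calls (`session`, `open`, `attachStreams`, `mute`/`unmute`) follow the v3 API but are not the article's exact code.

```javascript
// Decide what to do with the sender stream when the model changes.
function decideAction(model, isStreaming) {
  var anyOn = model.audio || model.video;
  if (!isStreaming && anyOn) return 'start';  // no stream yet: open one
  if (isStreaming && !anyOn) return 'stop';   // both toggles off: tear down
  if (isStreaming) return 'toggle';           // already online: mute/unmute
  return 'idle';                              // offline and both off: nothing
}

// Sketch of applying the decision with RTCMultiConnection.
function applyAction(action, connection, streamId, model) {
  if (action === 'start') {
    connection.session = { audio: model.audio, video: model.video, oneway: true };
    connection.open(streamId); // start broadcasting under this id
  } else if (action === 'toggle') {
    connection.attachStreams.forEach(function (stream) {
      model.audio ? stream.unmute('audio') : stream.mute('audio');
      model.video ? stream.unmute('video') : stream.mute('video');
    });
  } else if (action === 'stop') {
    connection.attachStreams.forEach(function (stream) { stream.stop(); });
  }
}
```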
To send information to the other side of the broadcast, we need to share the streamId, which in the demo is just a randomly generated string.
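For example (a helper of my own; any sufficiently unique string works here):

```javascript
// Generate a random stream id: a random base36 chunk plus a timestamp.
function generateStreamId() {
  return Math.random().toString(36).slice(2) + Date.now().toString(36);
}
```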
For sharing it we use the Firebase database. When we get the streamPublished callback, we simply update a ref in the Firebase database.
Make sure to remove the ref on disconnect, so clients can detect the disconnection.
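A sketch of that publishing step, using the Firebase v8 API. The `streams` path name is an assumption, and the database object is taken as a parameter; this would be called from the streamPublished callback mentioned above.

```javascript
// Publish our streamId so viewers can discover it.
// `db` is firebase.database(); the 'streams' path name is illustrative.
function publishStream(db, streamId) {
  var ref = db.ref('streams/' + streamId);
  ref.set({ streamId: streamId, publishedAt: Date.now() });
  // Remove the entry automatically when this client disconnects,
  // so listeners can detect that the broadcast ended.
  ref.onDisconnect().remove();
}
```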
Clients listen to the ref on page load, handle new items appearing in it, and start playing:
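A sketch of the viewer side, with the same assumed `streams` path. In RTCMultiConnection v3 the `onstream` event carries a ready-to-play media element in `event.mediaElement`, which is what is attached to the page here.

```javascript
// Subscribe to new stream ids and join each broadcast as a viewer.
// `db` is firebase.database(); `connection` is an RTCMultiConnection instance.
function listenForStreams(db, connection) {
  db.ref('streams').on('child_added', function (snapshot) {
    connection.join(snapshot.val().streamId); // start receiving this broadcast
  });
  // RTCMultiConnection hands us a ready <video>/<audio> element here.
  connection.onstream = function (event) {
    document.body.appendChild(event.mediaElement);
  };
}
```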