Sending WebRTC MediaStream over Websocket (RTP over HTTP/Websocket)

No, that will not be possible using WebRTC.

WebRTC was built to give browsers three main features:

  1. Capability to access the device's cameras and microphones;
  2. Capability to establish SRTP sessions to send/receive audio and video;
  3. Capability to establish peer-to-peer data channels between browsers;

These features are exposed to web applications via a JavaScript API defined here. To access media devices, you can use getUserMedia(), which gives you a MediaStream to attach to HTML5 audio and video tags. To create an SRTP session, you need to create a peer connection and manage the streams to use.
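A minimal sketch of those first two capabilities, assuming a browser environment and a hypothetical `<video id="localVideo">` element (the element id and function name are my own, not part of any standard):

```javascript
// Capture local media with getUserMedia() and hand the resulting
// MediaStream to an RTCPeerConnection, which transports it over SRTP.
// Browser-only: navigator, document and RTCPeerConnection are browser globals.
async function startLocalMedia() {
  const stream = await navigator.mediaDevices.getUserMedia({
    audio: true,
    video: true,
  });
  // Attach the stream to an HTML5 <video> tag for local preview.
  document.getElementById('localVideo').srcObject = stream;
  // Feed each track to the peer connection; WebRTC handles SRTP internally.
  const pc = new RTCPeerConnection();
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
  return pc;
}
```

Note that the RTP/SRTP packets never surface to JavaScript here: you only ever hold `MediaStream` and `RTCPeerConnection` objects.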

You have to request a media SDP offer from the browser and send it to the other party using any protocol you like (e.g. WebSockets). When the other party receives your SDP offer, it can inject it into its browser, request an SDP answer, and send that back. Once both browsers have exchanged offer and answer, they start the SRTP negotiation using ICE.
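The offer/answer exchange could be sketched like this. The `{ type, sdp }` message envelope, the function names, and the use of a plain WebSocket for signaling are all assumptions; WebRTC deliberately leaves the signaling channel up to you:

```javascript
// Hypothetical signaling helpers: only the SDP text crosses the WebSocket,
// never the media itself.
const makeSignal = (type, sdp) => JSON.stringify({ type, sdp });

// Caller side: ask the browser for an offer and ship it to the peer.
async function sendOffer(pc, ws) {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  ws.send(makeSignal('offer', offer.sdp));
}

// Either side: inject whatever arrives; if it was an offer, answer it.
function handleSignal(pc, ws) {
  ws.onmessage = async ({ data }) => {
    const msg = JSON.parse(data);
    await pc.setRemoteDescription({ type: msg.type, sdp: msg.sdp });
    if (msg.type === 'offer') {
      const answer = await pc.createAnswer();
      await pc.setLocalDescription(answer);
      ws.send(makeSignal('answer', answer.sdp));
    }
  };
}
```

Once both descriptions are set on each side, ICE takes over and the browsers negotiate the actual SRTP transport on their own.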

So, you will not have access to the RTP packets to send them over WebSockets.


Actually, the plan is to support RTCP-mux (RFC 5761) and some form of BUNDLE (still under debate) to merge all streams onto a single port. However, that port will be chosen by ICE/STUN. When needed, it will also use TURN, and eventually support TURN-TCP, which could run over port 80, I believe. Quality will suffer, however.
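If a browser supports TURN-TCP, steering it that way would just be ICE configuration on the peer connection. A sketch, where the server hostname and credentials are placeholders:

```javascript
// Hypothetical RTCPeerConnection config forcing media through a TURN relay
// over TCP on port 80. "turn.example.com" and the credentials are made up.
const config = {
  iceServers: [
    {
      urls: 'turn:turn.example.com:80?transport=tcp',
      username: 'user',
      credential: 'secret',
    },
  ],
  // Restrict ICE to relayed candidates only, i.e. everything goes via TURN.
  iceTransportPolicy: 'relay',
};
// In a browser you would then do: const pc = new RTCPeerConnection(config);
```

Relaying everything through TURN over TCP keeps firewalls happy, but TCP head-of-line blocking is exactly why quality suffers compared to UDP.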