Background

WebRTC is an effort to define JavaScript APIs that can be used to establish audio and video communication between a browser and another entity. Because most of the required elements are built into the browser, no additional plugins are needed to use the APIs. The effort has concentrated on the codecs involved, how media is transported, and how the media target information is declared, and the discussion of these aspects has drawn on many existing specifications and technologies. Each aspect is covered below along with the specification or technology that is used.

The effort has NOT defined how the session information is communicated between the browser and the remote entity; this is left to the individual doing the implementation. The popular choice at this point in time, at least to facilitate testing, is to use WebSocket to exchange the session information between browsers through an intermediary server. Since the signaling method is left undefined, this may change in the future to include SIP, Jingle, and other session protocols.
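
As an illustration of this flow, the following sketch (TypeScript, using the standard browser APIs) creates a session description and passes it to an intermediary server over WebSocket. The server URL and message shape are assumptions, since WebRTC leaves the signaling channel undefined.

// Minimal sketch: generate an SDP offer and relay it over a WebSocket
// signaling channel. The URL and JSON format below are placeholders.
const signaling = new WebSocket("wss://signaling.example.com/session");
const pc = new RTCPeerConnection();

// Request an audio section in the offer so the SDP has something to negotiate.
pc.addTransceiver("audio");

signaling.onopen = async () => {
  const offer = await pc.createOffer();  // build the local session description
  await pc.setLocalDescription(offer);   // apply it locally
  // Hand the offer to the remote browser through the intermediary server.
  signaling.send(JSON.stringify({ type: "offer", sdp: offer.sdp }));
};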

Codecs

Audio codecs currently supported include G.711, G.722, iLBC, and iSAC. Video uses the VP8 codec.

Transport

Media is sent and received using RTP with a preference for SRTP.

NAT

STUN and ICE are used to determine the type of NAT the browser is behind and how it behaves. If direct communication is not possible, a TURN server can be used to relay the media traffic.
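
As a sketch of how this looks from the browser side, an RTCPeerConnection can be pointed at STUN and TURN servers when it is created; the addresses and credentials below are placeholders.

// Minimal sketch: configure ICE with a STUN server and a TURN fallback.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: "stun:stun.example.com:3478" },
    { urls: "turn:turn.example.com:3478", username: "user", credential: "secret" },
  ],
});

// Each candidate ICE gathers can be relayed to the remote side over the
// signaling channel as soon as it is discovered.
pc.onicecandidate = (event) => {
  if (event.candidate) {
    console.log("local ICE candidate:", event.candidate.candidate);
  }
};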

Session Information

JSEP (the specification can be found here) defines the API used to create and process session descriptions. The session descriptions are formatted as SDP and contain codec details, ICE details, and more.
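
As a sketch of the answering side under JSEP, the snippet below applies a received SDP offer and produces an answer; how the offer string arrives and how the answer is returned are left to the signaling layer.

// Minimal sketch: process a remote offer and create a matching answer.
async function handleOffer(pc: RTCPeerConnection, offerSdp: string): Promise<string> {
  await pc.setRemoteDescription({ type: "offer", sdp: offerSdp }); // remote description
  const answer = await pc.createAnswer();                          // local description
  await pc.setLocalDescription(answer);
  // The answer SDP goes back to the offerer over the same signaling path.
  return answer.sdp ?? "";
}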

API

If you would like to view the API, the latest specification is available here.

Implementation Proposal

As WebSocket has become the current method of exchanging session information, I propose implementing a WebSocket server within Asterisk itself. This would give WebRTC users instant access to all the protocols Asterisk has to offer, with minimal work required on their side.
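
As a rough sketch of what the browser side of such a setup might look like, a page could open a WebSocket directly to Asterisk and exchange signaling messages over it. The endpoint URL and message handling below are purely hypothetical; the proposal does not yet define them.

// Hypothetical sketch only: connect to a WebSocket server running inside Asterisk.
const toAsterisk = new WebSocket("ws://asterisk.example.com:8088/ws"); // placeholder endpoint

toAsterisk.onmessage = (event) => {
  // Session descriptions, ICE candidates, and call events would arrive here.
  console.log("signaling message from Asterisk:", event.data);
};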