mediasoup


Communication Between Client and Server

mediasoup does not provide any signaling protocol for clients and server to communicate with each other. It is up to the application to connect them by using WebSocket, HTTP or whichever communication means, and to exchange mediasoup related parameters, requests/responses and notifications between clients and server. In most scenarios this communication must be bidirectional, so a full-duplex channel is usually required. However, the application can reuse the same channel for non mediasoup related message exchange (such as authentication procedures, chat messages, file transfer and whatever else the application wishes to implement).

Check the RTP Parameters and Capabilities section for more details.


Guidelines for mediasoup-client and libmediasoupclient

Let's assume our JavaScript or C++ client side application instantiates a mediasoup-client Device or a libmediasoupclient Device object to connect to a mediasoup Router (already created in the server) and send and receive media over WebRTC.

Both mediasoup-client (client side JavaScript library) and libmediasoupclient (C++ library based on libwebrtc) generate RTP parameters suitable for mediasoup, thus simplifying the development of the client side application.

Signaling and Peers

The application may use WebSocket and associate each authenticated WebSocket connection with a “peer”.

Notice that there are no “peers” per se in mediasoup. However, the application may wish to define “peers”, each of which may identify and associate a specific user account, WebSocket connection, metadata, and a set of mediasoup transports, producers, consumers, data producers and data consumers.
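As a sketch, such an application level “peer” could be a plain record like the following (the Peer class and its fields are illustrative, not part of the mediasoup API):

```javascript
// Illustrative application level "peer" record (not a mediasoup API).
class Peer
{
  constructor(id, socket)
  {
    this.id     = id;     // application defined user/connection id
    this.socket = socket; // e.g. the WebSocket connection
    this.data   =
    {
      transports    : new Map(), // transportId -> mediasoup Transport
      producers     : new Map(), // producerId  -> mediasoup Producer
      consumers     : new Map(), // consumerId  -> mediasoup Consumer
      dataProducers : new Map(), // dataProducerId -> mediasoup DataProducer
      dataConsumers : new Map()  // dataConsumerId -> mediasoup DataConsumer
    };
  }

  close()
  {
    // Closing a transport also closes its producers and consumers in mediasoup.
    for (const transport of this.data.transports.values())
    {
      transport.close();
    }
  }
}
```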

Device Loading

The client side application loads its mediasoup device by providing it with the RTP capabilities of the server side mediasoup router. See device.load().

Creating Transports

Both mediasoup-client and libmediasoupclient need separate WebRTC transports for sending and receiving. Typically, the client application creates those transports in advance, before even wishing to send or receive media.

For sending media, the client creates a send transport via device.createSendTransport(), whose parameters come from a WebRTC transport previously created in the server via router.createWebRtcTransport().

For receiving media, the client creates a receive transport via device.createRecvTransport(), again mirroring a WebRTC transport created in the server.

If SCTP (AKA DataChannels in WebRTC) is desired on those transports, enableSctp must be set in them (with proper numSctpStreams and other SCTP related settings).
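For instance, the server side WebRTC transport might be created with options like these (a sketch; the IPs and stream counts are illustrative values):

```javascript
// Illustrative options for router.createWebRtcTransport() on the server.
const webRtcTransportOptions =
{
  listenIps : [ { ip: '0.0.0.0', announcedIp: '203.0.113.10' } ],
  enableUdp : true,
  enableTcp : true,
  preferUdp : true,
  // Enable SCTP if DataChannels are desired on this transport.
  enableSctp     : true,
  numSctpStreams : { OS: 1024, MIS: 1024 }
};

// const transport = await router.createWebRtcTransport(webRtcTransportOptions);
```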

Producing Media

Once the send transport is created, the client side application can produce multiple audio and video tracks on it.
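A sketch of producing a video track on the send transport (sendTransport and track come from the application; the simulcast encodings and bitrates are illustrative):

```javascript
// Produce a video track with three simulcast layers (illustrative bitrates).
async function produceVideo(sendTransport, track)
{
  const producer = await sendTransport.produce(
    {
      track,
      encodings :
      [
        { maxBitrate: 100000 },
        { maxBitrate: 300000 },
        { maxBitrate: 900000 }
      ],
      codecOptions : { videoGoogleStartBitrate: 1000 }
    });

  return producer;
}
```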

Consuming Media

Once the receive transport is created, the client side application can consume multiple audio and video tracks on it. However, the order here is the opposite: the consumer must first be created in the server.
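That ordering can be sketched like this (signaling.request() and the method names are illustrative application helpers, not mediasoup APIs):

```javascript
// Consume a remote producer: the server side Consumer is created first,
// then replicated in the client via recvTransport.consume().
async function consumeProducer(recvTransport, producerId, rtpCapabilities, signaling)
{
  // 1. Ask the server to create the Consumer (ideally paused until ready).
  const { id, kind, rtpParameters } =
    await signaling.request('consume', { producerId, rtpCapabilities });

  // 2. Replicate it in the client.
  const consumer = await recvTransport.consume({ id, producerId, kind, rtpParameters });

  // 3. Tell the server to resume the Consumer.
  await signaling.request('resumeConsumer', { consumerId: consumer.id });

  return consumer;
}
```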

Producing Data (DataChannels)

Once the send transport is created, the client side application can produce multiple DataChannels on it.

Consuming Data (DataChannels)

Once the receive transport is created, the client side application can consume multiple DataChannels on it. However, the order here is the opposite: the data consumer must first be created in the server.

Communicating Actions and Events

As a core principle, calling a method on a mediasoup instance does not generate a direct event in that instance. In summary, this means that calling close() on a router, transport, producer, consumer, data producer or data consumer will not trigger any event on it.

When a transport, producer, consumer, data producer or data consumer is closed in the client or server side (e.g. by calling close() on it), the application should signal its closure to the other side, which should also call close() on the corresponding entity. In addition, the server side application should listen for the following closure events and notify the client about them:

- transport 'routerclose'
- producer 'transportclose'
- consumer 'transportclose' and 'producerclose'
- data producer 'transportclose'
- data consumer 'transportclose' and 'dataproducerclose'

The same happens when pausing an RTP producer or consumer in the client or server side. The action must be signaled to the other side. In addition, the server side application should listen for the following events and notify the client about them:

- consumer 'producerpause'
- consumer 'producerresume'
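For example, the server side application might forward such consumer events through its own signaling channel like this (notify() and the notification names are illustrative; the event names are mediasoup consumer events):

```javascript
// Forward server side consumer events to the client over the application's
// own signaling channel (notify() is an application defined helper).
function wireConsumerEvents(consumer, notify)
{
  consumer.on('transportclose', () => notify('consumerClosed', { consumerId: consumer.id }));
  consumer.on('producerclose', () => notify('consumerClosed', { consumerId: consumer.id }));
  consumer.on('producerpause', () => notify('consumerPaused', { consumerId: consumer.id }));
  consumer.on('producerresume', () => notify('consumerResumed', { consumerId: consumer.id }));
}
```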

When simulcast or SVC is in use, the application may be interested in signaling preferred layers and effective layers between client and server side consumers.

Guidelines for FFmpeg and GStreamer

Both FFmpeg and GStreamer (and any other similar software) can be used to inject media into a mediasoup router or to consume media from a mediasoup router (for recording purposes, transcoding, streaming using HLS, etc.).

This can be done by creating a server side plain transport (via router.createPlainTransport()) and then calling produce() or consume() on it with the appropriate parameters.

Check the broadcaster example (based on FFmpeg) in the mediasoup demo application.


Producing Media from an External Endpoint (RTP In)

If you wish to produce media in a mediasoup router by using an external tool (such as FFmpeg or GStreamer), or to make mediasoup receive media produced by any other RTP source, create a plain transport in the router (typically with comedia enabled, so mediasoup auto-detects the remote endpoint's address) and call produce() on it with the RTP parameters announced by the external endpoint.

Consuming Media in an External Endpoint (RTP Out)

If you wish to route the media of a producer to an external RTP device or endpoint (such as FFmpeg or GStreamer), create a plain transport in the router, call connect() on it with the endpoint's IP and port(s), and then call consume() on it with the desired producerId.
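These steps can be sketched as follows (the pipeToExternalEndpoint helper and the endpoint address are illustrative):

```javascript
// Route a producer's media to an external RTP endpoint.
async function pipeToExternalEndpoint(router, producer, { ip, port, rtcpPort })
{
  const transport = await router.createPlainTransport(
    { listenIp: '127.0.0.1', rtcpMux: false });

  // Tell mediasoup where the external endpoint listens for RTP/RTCP.
  await transport.connect({ ip, port, rtcpPort });

  const consumer = await transport.consume(
    {
      producerId      : producer.id,
      rtpCapabilities : router.rtpCapabilities,
      paused          : true // resume once the external endpoint is ready
    });

  await consumer.resume();

  return { transport, consumer };
}
```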

Example: Inject Audio and Video using FFmpeg

Let's assume we have a /home/foo/party.mp4 file with a stereo audio track and a video track that we want to inject into a mediasoup router. We run FFmpeg on the server host, so media transmission takes place within the localhost network.

// Create a PlainTransport in the router to receive the audio
// (comedia: mediasoup detects the remote address from the first packet).
const audioTransport = await router.createPlainTransport(
  { 
    listenIp : '127.0.0.1',
    rtcpMux  : false,
    comedia  : true
  });

// Read the transport local RTP port.
const audioRtpPort = audioTransport.tuple.localPort;
// => 3301

// Read the transport local RTCP port.
const audioRtcpPort = audioTransport.rtcpTuple.localPort;
// => 4502

// Create a second PlainTransport to receive the video.
const videoTransport = await router.createPlainTransport(
  { 
    listenIp : '127.0.0.1',
    rtcpMux  : false,
    comedia  : true
  });

// Read the transport local RTP port.
const videoRtpPort = videoTransport.tuple.localPort;
// => 3501

// Read the transport local RTCP port.
const videoRtcpPort = videoTransport.rtcpTuple.localPort;
// => 2989

// Create the audio Producer with the RTP parameters FFmpeg will send.
const audioProducer = await audioTransport.produce(
  {
    kind          : 'audio',
    rtpParameters :
    {
      codecs :
      [
        {
          mimeType     : 'audio/opus',
          clockRate    : 48000,
          payloadType  : 101,
          channels     : 2,
          rtcpFeedback : [ ],
          parameters   : { 'sprop-stereo': 1 }
        }
      ],
      encodings : [ { ssrc: 11111111 } ]
    }
  });

// Create the video Producer with the RTP parameters FFmpeg will send.
const videoProducer = await videoTransport.produce(
  {
    kind          : 'video',
    rtpParameters :
    {
      codecs :
      [
        {
          mimeType     : 'video/vp8',
          clockRate    : 90000,
          payloadType  : 102,
          rtcpFeedback : [ ], // FFmpeg does not support NACK nor PLI/FIR.
        }
      ],
      encodings : [ { ssrc: 22222222 } ]
    }
  });

ffmpeg \
  -re \
  -v info \
  -stream_loop -1 \
  -i /home/foo/party.mp4 \
  -map 0:a:0 \
  -acodec libopus -ab 128k -ac 2 -ar 48000 \
  -map 0:v:0 \
  -pix_fmt yuv420p -c:v libvpx -b:v 1000k -deadline realtime -cpu-used 4 \
  -f tee \
  "[select=a:f=rtp:ssrc=11111111:payload_type=101]rtp://127.0.0.1:3301?rtcpport=4502|[select=v:f=rtp:ssrc=22222222:payload_type=102]rtp://127.0.0.1:3501?rtcpport=2989"

The FFmpeg command line arguments above may not be perfect. This is the mediasoup documentation, not an FFmpeg or GStreamer guide; check the FFmpeg documentation or the GStreamer documentation to use them properly.

In other words: Please do not ask questions about FFmpeg or GStreamer in the mediasoup Discourse Group.

Guidelines for SRTP Capable Endpoints

Since mediasoup 3.5.0, both pipe and plain transports support SRTP. There are endpoints that do not support WebRTC but do support SRTP. Those endpoints can be connected to mediasoup via a plain transport by enabling SRTP for encrypted RTP transmission.

SRTP Endpoint as SDP Offerer

Let's assume the SRTP endpoint supports RTCP-mux and also uses comedia mode (read the PlainTransportOptions API documentation in mediasoup for more information about both).
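The plain transport for such an endpoint might be created with options like these (a sketch; the listen IP is illustrative, while enableSrtp and srtpCryptoSuite are PlainTransportOptions fields):

```javascript
// Illustrative options for router.createPlainTransport() with SRTP enabled.
const plainTransportOptions =
{
  listenIp   : '127.0.0.1',
  rtcpMux    : true, // the endpoint supports RTCP-mux
  comedia    : true, // mediasoup learns the remote address from incoming packets
  enableSrtp : true,
  srtpCryptoSuite : 'AES_CM_128_HMAC_SHA1_80'
};
```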

await plainTransport.connect(
  {
    cryptoSuite: "AES_CM_128_HMAC_SHA1_80",
    keyBase64: "PS1uQCVeeCFCanVmcjkpPywjNWhcYD0mXXtxaVBR"
  });

SRTP Endpoint as SDP Answerer

In this case (i.e. if the SRTP endpoint won't send RTP but just receive it), comedia: true is not valid.

Guidelines for DataChannel termination in Node.js

Related:

Guidelines for node-sctp (SCTP/DataChannel in Node.js)

You probably want to avoid this approach and focus on the Guidelines for DataChannel termination in Node.js instead.

The node-sctp library can be used to send and receive SCTP messages into/from a mediasoup router and, hence, interact via DataChannel with WebRTC endpoints.

mediasoup (also) supports SCTP over plain UDP, which node-sctp supports as well. Therefore a Node.js process can create a DataProducer by opening a UDP socket, running an SCTP socket on top of it via node-sctp, and associating it with a plain transport that has SCTP enabled.

Remember to close the UDP socket (udpSocket.close()) and to call sctpSocket.end() once the SCTP socket is no longer needed, to avoid leaks.

See a complete usage example with both Node.js DataProducers and DataConsumers in the server/lib/Bot.js file of the mediasoup-demo project.