mediasoup does not provide any signaling protocol to communicate clients and server. It's up to the application to communicate them by using WebSocket, HTTP or whichever communication means, and to exchange mediasoup related parameters, requests/responses and notifications between clients and server. In most scenarios this communication must be bidirectional, so a full-duplex channel is usually required. However, the application can reuse the same channel for non mediasoup related message exchange (such as authentication procedures, chat messages, file transfer and whatever else the application wishes to implement).
Check the RTP Parameters and Capabilities section for more details.
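Because the signaling channel is entirely application-defined, even the message format is up to you. Below is a minimal sketch of a request/response envelope such an application might use over WebSocket; the message shapes and method names here are assumptions of this example, not a mediasoup API.

```javascript
// Minimal signaling envelope sketch. mediasoup imposes no message format;
// these shapes and method names are illustrative only.
let nextId = 0;

function createRequest(method, data) {
  return { type: 'request', id: nextId++, method, data };
}

function createResponse(request, data) {
  return { type: 'response', id: request.id, ok: true, data };
}

// Example: the client asks the server for the router RTP capabilities.
const request = createRequest('getRouterRtpCapabilities', {});
// The server would reply with router.rtpCapabilities in the data field.
const response = createResponse(request, { codecs: [], headerExtensions: [] });
// Matching ids let the client resolve the pending request.
```

Any envelope works as long as responses can be correlated with requests; the incrementing id above is one common way to do that.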
Let's assume our JavaScript or C++ client side application instantiates a mediasoup-client Device or a libmediasoupclient Device object to connect to a mediasoup Router (already created in the server) and send and receive media over WebRTC.
Both mediasoup-client (client side JavaScript library) and libmediasoupclient (C++ library based on libwebrtc) generate RTP parameters suitable for mediasoup, thus simplifying the development of the client side application.
The application may use WebSocket and associate each authenticated WebSocket connection with a “peer”.
Notice that there are no “peers” per se in mediasoup. However, the application may wish to define “peers”, which may identify and associate a specific user account, WebSocket connection, metadata, and a set of mediasoup transports, producers, consumers, data producers and data consumers.
The client side application loads its mediasoup device by providing it with the RTP capabilities of the server side mediasoup router. See device.load().
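What loading gives the device is knowledge of the router's media capabilities. As a rough illustration, the check behind device.canProduce(kind) boils down to whether the router capabilities include a codec of the given kind; the canProduce() function below is a hypothetical re-implementation for illustration, not mediasoup-client code.

```javascript
// Hypothetical sketch of the check that device.load({ routerRtpCapabilities })
// makes possible: once the router codecs are known, the device can tell
// whether producing a given media kind makes sense at all.
function canProduce(kind, routerRtpCapabilities) {
  return routerRtpCapabilities.codecs
    .some((codec) => codec.mimeType.toLowerCase().startsWith(`${kind}/`));
}

// Illustrative router capabilities (a real router advertises many more fields).
const routerRtpCapabilities = {
  codecs: [
    { mimeType: 'audio/opus', clockRate: 48000, channels: 2 },
    { mimeType: 'video/VP8', clockRate: 90000 }
  ]
};

canProduce('video', routerRtpCapabilities); // true
```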
Both mediasoup-client and libmediasoupclient need separate WebRTC transports for sending and receiving. Typically, the client application creates those transports in advance, before even wishing to send or receive media.
For sending media: the WebRTC transport must first be created in the server side (router.createWebRtcTransport()) and then replicated in the client side (device.createSendTransport()).
For receiving media: the WebRTC transport must first be created in the server side (router.createWebRtcTransport()) and then replicated in the client side (device.createRecvTransport()).
If SCTP (AKA DataChannels in WebRTC) support is desired on those transports, enableSctp must be enabled in them (with proper numSctpStreams and other SCTP related settings).
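For instance, the server side transport creation might look as follows. This is a sketch: the listenIps values are placeholders and the numeric SCTP settings are only examples, not recommendations.

```javascript
// Server side (illustrative values).
const transport = await router.createWebRtcTransport(
  {
    listenIps      : [ { ip: '0.0.0.0', announcedIp: '88.12.10.41' } ],
    enableUdp      : true,
    enableTcp      : true,
    preferUdp      : true,
    enableSctp     : true, // Required if DataChannels will be used.
    numSctpStreams : { OS: 1024, MIS: 1024 }
  });

// transport.id, iceParameters, iceCandidates, dtlsParameters (and
// sctpParameters if SCTP is enabled) must then be signaled to the client
// so it can replicate the transport there.
```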
Once the send transport is created, the client side application can produce multiple audio and video tracks on it.

- The application obtains a track (e.g. by using the navigator.mediaDevices.getUserMedia() API).
- It calls transport.produce() on the local send transport.
- transport.produce() will resolve with a Producer instance in client side.

Once the receive transport is created, the client side application can consume multiple audio and video tracks on it. However the order is the opposite (here the consumer must be created in the server first).
- It is strongly recommended to create the server side consumer with paused: true and resume it once created in the remote endpoint.
- The client application calls transport.consume() on the local receive transport.
- transport.consume() will resolve with a Consumer instance in client side.

Once the send transport is created, the client side application can produce multiple DataChannels on it.
- The application calls transport.produceData() on the local send transport.
- transport.produceData() will resolve with a DataProducer instance in client side.

Once the receive transport is created, the client side application can consume multiple DataChannels on it. However the order is the opposite (here the consumer must be created in the server first).
- The client application calls transport.consumeData() on the local receive transport.
- transport.consumeData() will resolve with a DataConsumer instance in client side.

As a core principle, calling a method on a mediasoup instance does not generate a direct event in that instance. In summary, this means that calling close() on a router, transport, producer, consumer, data producer or data consumer will not trigger any event on it.
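Since closing an entity emits nothing on the entity itself, the application must tell the other side explicitly. Here is a sketch of such a closure notification; the message shape is an assumption of this example (mediasoup defines no such message).

```javascript
// Sketch: build the notification that signals a closure to the other side.
// The shape is application-specific; mediasoup imposes none.
function makeClosureNotification(entity, id) {
  return { type: 'notification', method: `${entity}Closed`, data: { id } };
}

// E.g. after calling consumer.close() in the server, notify the client so
// it calls close() on its corresponding local consumer too.
makeClosureNotification('consumer', 'c1');
// → { type: 'notification', method: 'consumerClosed', data: { id: 'c1' } }
```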
When a transport, producer, consumer, data producer or data consumer is closed in client or server side (e.g. by calling close() on it), the application should signal its closure to the other side, which should also call close() on the corresponding entity. In addition, the server side application should listen for the following closure events and notify the client about them:

- Transport “routerclose”: the client should call close() on the corresponding local transport.
- Producer “transportclose”: the client should call close() on the corresponding local producer.
- Consumer “transportclose”: the client should call close() on the corresponding local consumer.
- Consumer “producerclose”: the client should call close() on the corresponding local consumer.
- DataProducer “transportclose”: the client should call close() on the corresponding local data producer.
- DataConsumer “transportclose”: the client should call close() on the corresponding local data consumer.
- DataConsumer “dataproducerclose”: the client should call close() on the corresponding local data consumer.

The same happens when pausing an RTP producer or consumer in client or server side. The action must be signaled to the other side. In addition, the server side application should listen for the following events and notify the client about them:

- Consumer “producerpause”: the client should call pause() on the corresponding local consumer.
- Consumer “producerresume”: the client should call resume() on the corresponding local consumer (unless the consumer itself was also paused on purpose).

When simulcast or SVC is in use, the application may be interested in signaling preferred layers and effective layers between client and server side consumers.
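For example, a server application may clamp a client-requested spatial layer to what the producer actually offers before applying it via consumer.setPreferredLayers() (a real mediasoup server method); the clampSpatialLayer() helper below is hypothetical.

```javascript
// Hypothetical helper: clamp a client-requested spatial layer to the number
// of encodings the producer actually sends (simulcast/SVC).
function clampSpatialLayer(requested, numEncodings) {
  return Math.max(0, Math.min(requested, numEncodings - 1));
}

clampSpatialLayer(5, 3); // → 2 (highest available spatial layer)
clampSpatialLayer(1, 3); // → 1

// The clamped value would then be applied on the server side consumer:
//   await consumer.setPreferredLayers({ spatialLayer, temporalLayer: 1 });
// and the consumer 'layerschange' event reports the effective layers,
// which can be signaled back to the client.
```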
Both FFmpeg and GStreamer (and any other similar software) can be used to inject media into a mediasoup router or to consume media from a mediasoup router (for recording purposes, transcoding, streaming using HLS, etc.).
This can be done by creating a server side plain transport (via router.createPlainTransport()) and then calling produce() or consume() on it with the appropriate parameters.
Check the broadcaster example (based on FFmpeg) in the mediasoup demo application.
Some useful resources:
If you wish to produce media in a mediasoup router by using an external tool (such as FFmpeg or GStreamer) or make mediasoup receive media produced by other RTP source:

- Gather the RTP parameters (payload types, SSRC values, etc.) that the external endpoint will use to send its media.
- Create a producer (transport.produce()) on top of the plain transport with those RTP parameters.

If you wish to route the media of a producer to an external RTP device or endpoint (such as FFmpeg or GStreamer):

- Build a RtpCapabilities object for the external endpoint, including its supported media codec preferredPayloadType values and RTP header extension preferredId values.
- Create a consumer (transport.consume()) on top of the plain transport with the corresponding producerId and the generated rtpCapabilities of your external endpoint.

Let's assume we have a /home/foo/party.mp4 file with a stereo audio track and a video track that we want to inject into a mediasoup router. We run FFmpeg in the server host so media transmission takes place in the localhost network.
const audioTransport = await router.createPlainTransport(
{
listenIp : '127.0.0.1',
rtcpMux : false,
comedia : true
});
// Read the transport local RTP port.
const audioRtpPort = audioTransport.tuple.localPort;
// => 3301
// Read the transport local RTCP port.
const audioRtcpPort = audioTransport.rtcpTuple.localPort;
// => 4502
const videoTransport = await router.createPlainTransport(
{
listenIp : '127.0.0.1',
rtcpMux : false,
comedia : true
});
// Read the transport local RTP port.
const videoRtpPort = videoTransport.tuple.localPort;
// => 3501
// Read the transport local RTCP port.
const videoRtcpPort = videoTransport.rtcpTuple.localPort;
// => 2989
const audioProducer = await audioTransport.produce(
{
kind : 'audio',
rtpParameters :
{
codecs :
[
{
mimeType : 'audio/opus',
clockRate : 48000,
payloadType : 101,
channels : 2,
rtcpFeedback : [ ],
parameters   : { 'sprop-stereo': 1 }
}
],
encodings : [ { ssrc: 11111111 } ]
}
});
const videoProducer = await videoTransport.produce(
{
kind : 'video',
rtpParameters :
{
codecs :
[
{
mimeType : 'video/vp8',
clockRate : 90000,
payloadType : 102,
rtcpFeedback : [ ], // FFmpeg does not support NACK nor PLI/FIR.
}
],
encodings : [ { ssrc: 22222222 } ]
}
});
ffmpeg \
-re \
-v info \
-stream_loop -1 \
-i /home/foo/party.mp4 \
-map 0:a:0 \
-acodec libopus -ab 128k -ac 2 -ar 48000 \
-map 0:v:0 \
-pix_fmt yuv420p -c:v libvpx -b:v 1000k -deadline realtime -cpu-used 4 \
-f tee \
"[select=a:f=rtp:ssrc=11111111:payload_type=101]rtp://127.0.0.1:3301?rtcpport=4502|[select=v:f=rtp:ssrc=22222222:payload_type=102]rtp://127.0.0.1:3501?rtcpport=2989"
The FFmpeg command line arguments above may not be perfect. This is the mediasoup documentation: check the FFmpeg documentation or the GStreamer documentation to use those tools properly.
In other words: please do not ask questions about FFmpeg or GStreamer in the mediasoup Discourse Group.
Since mediasoup 3.5.0, both pipe and plain transports support SRTP. There are endpoints that do not support WebRTC but do support SRTP. Those endpoints can be connected to mediasoup via a plain transport by enabling SRTP for encrypted RTP transmission.
Let's assume RTCP-mux support in the SRTP Endpoint and also comedia mode (read the PlainTransportOptions API documentation in mediasoup for more information about it).
For the SRTP endpoint sending media to mediasoup:

- Create the server side plain transport via router.createPlainTransport(options) with comedia: true, rtcpMux: true and enableSrtp: true.
- Once the remote SRTP parameters are known (via the application signaling), call plainTransport.connect({ srtpParameters }):

await plainTransport.connect(
{
  srtpParameters :
  {
    cryptoSuite : "AES_CM_128_HMAC_SHA1_80",
    keyBase64   : "PS1uQCVeeCFCanVmcjkpPywjNWhcYD0mXXtxaVBR"
  }
});

- If the endpoint expects an SDP answer, read the local plainTransport.srtpParameters and build the SDP answer with a “a=crypto” attribute containing the cryptoSuite and keyBase64 fields of plainTransport.srtpParameters.

For the SRTP endpoint receiving media from mediasoup, comedia: true is not valid (the SRTP endpoint won't send RTP but just receive it):
- Create the server side plain transport via router.createPlainTransport(options) with rtcpMux: true and enableSrtp: true (and optionally with srtpCryptoSuite: "AES_CM_128_HMAC_SHA1_32" if the SRTP endpoint does not support the “AES_CM_128_HMAC_SHA1_80” crypto suite).
- Generate the SDP offer for the endpoint by reading the local plainTransport.srtpParameters and adding the corresponding “a=crypto” line with them.
- Once the endpoint's transport address and SRTP parameters are known, call plainTransport.connect({ ip, port, srtpParameters }).
Use router.createDirectTransport() to create a DirectTransport, and then use directTransport.consumeData() on it to create a DataConsumer that will receive the data messages sent by WebRTC endpoints.

- The dataConsumer.on('message') event will trigger with those received messages so the Node.js application can handle them.

Ditto: use directTransport.produceData() to create a DataProducer in Node.js and make the WebRTC peers consume it as usual.

- Such a DataProducer provides the Node.js application with a dataProducer.send() method to send text or binary messages to them.

Related:
Probably you want to avoid this and focus on Guidelines for DataChannel termination in Node.js instead.
The node-sctp library can be used to send and receive SCTP messages into/from a mediasoup router and, hence, interact via DataChannel with WebRTC endpoints.

mediasoup (also) supports SCTP over plain UDP, which is also supported by node-sctp. Therefore, in order to create a DataProducer in Node.js:

- Create a SCTP association over a UDP socket (via node-sctp) and open an outgoing SCTP stream with a specific streamId.
- Create a DataProducer on the mediasoup transport via transport.produceData() with the same streamId.
- When sending messages, use the proper PPID value (it must be 51 for WebRTC String and 53 for WebRTC Binary).

Remember to close the UDP socket (udpSocket.close()) and also call sctpSocket.end() once the SCTP socket should be destroyed, to avoid leaks.
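The PPID values mentioned above come from the WebRTC DataChannel protocol (RFC 8831) and are easy to get wrong; a tiny helper (hypothetical, not part of node-sctp or mediasoup) to pick the right one:

```javascript
// WebRTC DataChannel payload protocol identifiers (RFC 8831).
const PPID_WEBRTC_STRING = 51;
const PPID_WEBRTC_BINARY = 53;

// Hypothetical helper: pick the PPID for an outgoing SCTP message.
function ppidFor(message) {
  return typeof message === 'string' ? PPID_WEBRTC_STRING : PPID_WEBRTC_BINARY;
}

ppidFor('hello');           // 51
ppidFor(Buffer.from([1]));  // 53
```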
See a complete usage example with both Node.js DataProducers and DataConsumers in the server/lib/Bot.js file of the mediasoup-demo project.