This page provides the fastest way to check playback of WebRTC and LLHLS using OvenMediaEngine. For installation and detailed settings, please refer to other pages.
Run Docker with the command below. `OME_HOST_IP` must be an IP address accessible by the player.
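A sketch of such a command, under assumptions: the host IP `192.168.0.160` is a placeholder for an address your players can reach, and the published ports follow the default-port table later on this page.

```shell
# Hypothetical example: replace 192.168.0.160 with an IP your players can reach.
OME_HOST_IP=192.168.0.160
docker run -d --name ome \
  -e OME_HOST_IP="$OME_HOST_IP" \
  -p 1935:1935 -p 9999:9999/udp -p 4000:4000/udp \
  -p 3333:3333 -p 3334:3334 -p 3478:3478 -p 9000:9000 \
  -p 10000-10004:10000-10004/udp \
  airensoft/ovenmediaengine:latest
```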
Publish your live stream to OvenMediaEngine using a live encoder like OBS.
The RTMP publishing address is:
Server: `rtmp://Your.Docker.Host.IP:1935/app`
Stream Key: `stream`
The settings below are recommended for ultra-low latency.
Open the installed OvenPlayer Demo page in your browser.
http://Your.Docker.Host.IP:8090/
Add `ws://Your.Docker.Host.IP:3333/app/stream` to the Playback URL and click the ADD SOURCE and LOAD PLAYER buttons to play the live stream with WebRTC.
Add `http://Your.Docker.Host.IP:3333/app/stream/llhls.m3u8` to the Playback URL and click the ADD SOURCE and LOAD PLAYER buttons to play the live stream with LLHLS.
Setting | Value |
---|---|
Keyframe Interval | 1s (DO NOT set it to 0) |
CPU Usage Preset | ultrafast |
Profile | baseline |
Tune | zerolatency |
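The recommended settings above can be expressed as an ffmpeg push. This is a sketch, not the only valid command: `input.mp4`, the host, and the 30 fps assumption (so `-g 30` gives the 1-second keyframe interval) are placeholders.

```shell
# Hypothetical source and host; -g 30 = keyframe every 30 frames = 1 s at 30 fps.
RTMP_URL="rtmp://Your.Docker.Host.IP:1935/app/stream"
ffmpeg -re -i input.mp4 \
  -c:v libx264 -preset ultrafast -tune zerolatency -profile:v baseline -g 30 \
  -c:a aac -b:a 128k \
  -f flv "$RTMP_URL"
```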
Most browsers can't load resources via HTTP and WS (WebSocket) from HTTPS web pages secured with TLS. Therefore, if the player is on an HTTPS page, the player must request streaming through "https" and "wss" URLs secured with TLS. In this case, you must apply the TLS certificate to the OvenMediaEngine.
You can set the port for TLS in `TLSPort`. Currently, LLHLS and WebRTC Signaling support TLS.
Add your certificate files to the configuration as follows:
To configure HTTPS for the HLS and WebRTC Signaling servers, the TLS element must be enabled. `CertPath` must point to a server certificate and `KeyPath` to a private key file. They can be set to absolute paths or paths relative to the executable. If the server certificate was issued using an intermediate certificate, some browsers may complain about the certificate. In this case, set the bundle of chained certificates provided by your Certificate Authority in `ChainCertPath`.
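A minimal sketch of this configuration, assuming a hypothetical domain and certificate paths:

```xml
<Host>
    <Names>
        <Name>ome.example.com</Name>
    </Names>
    <TLS>
        <CertPath>path/to/cert.crt</CertPath>
        <KeyPath>path/to/priv.key</KeyPath>
        <!-- Only needed when browsers reject an intermediate-signed certificate -->
        <ChainCertPath>path/to/chain.crt</ChainCertPath>
    </TLS>
</Host>
```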
If you set up TLS, you can't set an IP address or `*` in `<Name>`. You can only set domains that the certificate contains. If you have a certificate for `*.host.com`, you can set domains such as `aaa.host.com`, `bbb.host.com`, and `*.host.com`.
If the certificate settings are completed correctly, WebRTC streaming can be played via `wss://url` and HLS and DASH streaming via `https://url`.
OvenMediaEngine supports multiple protocols for input from various live sources, without compromising basic usability. This allows you to publish a variety of live sources with sub-second latency. See the sub-page for more information.
OvenMediaEngine has an XML configuration file. If you start OvenMediaEngine with `systemctl start ovenmediaengine`, the config file is loaded from the following path.
If you run it directly from the command line, it loads the configuration file from:
If you run it in Docker container, the path to the configuration file is:
The `Server` is the root element of the configuration file. The `version` attribute indicates the version of the configuration file; OvenMediaEngine uses this version information to check whether the config file is a compatible version.
The IP address is the address OvenMediaEngine will bind to. If you set `*`, all IP addresses of the system are used. If you enter a specific IP, the host uses that IP only.
PrivacyProtection is an option to comply with GDPR, PIPEDA, CCPA, LGPD, etc. by deleting the client's personal information (IP, Port) from all records. When this option is turned on, the client's IP and Port are converted to `xxx.xxx.xxx.xxx:xxx` in all logs and REST APIs.
OvenMediaEngine needs to know its public IP in order to connect to the player through WebRTC. The server must inform the player of the IceCandidates and TURN server addresses when signaling, and this information must be the IP the player can connect to. However, in environments such as Docker or AWS, public IP cannot be obtained through a local interface, so a method of obtaining public IP using stun server is provided (available from version 0.11.1).
If OvenMediaEngine obtains the public IP through communication with the set stun server, you can set the public IP by using * or ${PublicIP} in IceCandidate and TcpRelay.
The `Bind` is the configuration for the server ports that will be used. Bind consists of `Providers` and `Publishers`. The Providers are the servers for stream input, and the Publishers are the servers for streaming.
The meaning of each element is shown in the following table:
VirtualHosts
are a way to run more than one streaming server on a single machine. OvenMediaEngine supports IP-based virtual host and Domain-based virtual host. "IP-based" means that you can separate streaming servers into multiples by setting different IP addresses, and "Domain-based" means that even if the streaming servers use the same IP address, you can split the streaming servers into multiples by setting different domain names.
`VirtualHosts` consist of `Name`, `Host`, `Origins`, `SignedPolicy`, and `Applications`.
The Domain has `Names` and TLS. A Name can be either a domain address or an IP address. Setting `*` allows all domains and IP addresses.
SignedPolicy is a module that limits the user's privileges and time. For example, operators can distribute RTMP URLs that can be accessed for 60 seconds to authorized users, and limit RTMP transmission to 1 hour. The provided URL will be destroyed after 60 seconds, and transmission will automatically stop after 1 hour. Users who are provided with a SignedPolicy URL cannot access resources other than the provided URL, because the SignedPolicy URL is authenticated. See the SignedPolicy chapter for more information.
Origins (also called OriginMap) is a feature to pull streams from external servers. It currently supports OVT and RTSP as pulling protocols. OVT is a protocol defined by OvenMediaEngine for Origin-Edge communication. It allows OvenMediaEngine to relay a stream from other OvenMediaEngines that have the OVT Publisher turned on. Using RTSP, OvenMediaEngine pulls a stream from an RTSP server and creates a stream. RTSP streams from external servers can be streamed by WebRTC, HLS, and MPEG-DASH.
The Origin has `Location` and `Pass` elements. Location is a URI pattern for incoming requests. If an incoming request matches Location, OvenMediaEngine pulls the stream according to the Pass element. In the Pass element, you can set the origin stream's protocol and URLs.
When running as an Edge server, Origin creates the application and stream if they don't exist when a user requests them. To learn more about Origin-Edge, see the Live Source chapter.
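A sketch of an Origins entry, with a hypothetical origin host and stream path:

```xml
<Origins>
    <Origin>
        <!-- URI pattern matched against incoming requests -->
        <Location>/app/stream</Location>
        <!-- Where to pull the stream from when the pattern matches -->
        <Pass>
            <Scheme>ovt</Scheme>
            <Urls>
                <Url>origin.example.com:9000/app/stream</Url>
            </Urls>
        </Pass>
    </Origin>
</Origins>
```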
`<Application>` consists of various elements that define the operation of the stream, including stream input, encoding, and stream output. In other words, you can create as many `<Application>`s as you like and build various streaming environments.
`<Application>` needs `<Name>` and `<Type>` set as follows:
`<Name>` is used to configure the streaming URL. `<Type>` defines the operation of the `<Application>`. Currently, there is only a `live` type.
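As a minimal sketch (the application name `app` mirrors the default configuration; the omitted children are summarized in a comment):

```xml
<Application>
    <Name>app</Name>
    <Type>live</Type>
    <!-- <OutputProfiles>, <Providers>, <Publishers>, ... -->
</Application>
```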
`<OutputProfile>` is a configuration that creates an output stream. The output stream name can be set with `<OutputStreamName>`, and transcoding properties can be set through `<Encodes>`. If you want to stream one input to multiple output streams, you can set multiple `<OutputProfile>`s.
For more information about the OutputProfiles, please see the Transcoding chapter.
`Providers` ingest streams that come from a media source.
If you want more information about `<Providers>`, please refer to the Live Source chapter.
You can configure the output stream operation in `<Publishers>`. `<ThreadCount>` is the number of threads used by each component responsible for the `<Publishers>` protocol.
You need many threads to transmit streams to a large number of users at the same time, so it's better to use a higher-core CPU and set `<ThreadCount>` equal to the number of CPU cores.
OvenMediaEngine currently supports WebRTC, Low-Latency DASH, MPEG-DASH, and HLS. If you don't want to use a protocol, you can delete that protocol's setting; the component for that protocol then isn't initialized. As a result, you can save system resources by deleting the settings of unused protocol components.
If you want to learn more about WebRTC, visit the WebRTC Streaming chapter. And if you want to get more information on Low-Latency DASH, MPEG-DASH, and HLS, refer to the chapter on HLS & MPEG-DASH Streaming.
Finally, `Server.xml` is configured as follows:
`Providers` ingest streams that come from a media source. OvenMediaEngine supports the RTMP protocol. You can set it in the configuration as follows:
When a live source is input to the `<Application>`, a stream is automatically created in the `<Application>`. The created stream is passed to the Encoder and Publisher.
If you set up a live stream using an RTMP-based encoder, you need to set the following in `Server.xml`:
`<BlockDuplicateStreamName>` is a policy for streams that are input with an overlapping name. `<BlockDuplicateStreamName>` works with the following rules:
Allowing duplicate stream names can cause several problems. When a new stream is input, existing players may be disconnected. Most encoders can automatically reconnect when they are disconnected from the server. As a result, two encoders compete and disconnect each other, which can cause serious playback problems.
If you want to publish the source stream, you need to set the following in the Encoder:
Setting | Value |
---|---|
URL | `RTMP://<OvenMediaEngine IP>[:<RTMP Listen Port>]/<App Name>` |
Stream Key | `<Stream Name>` |
If you use the default configuration, the `<RTMP><ListenPort>` is 1935, which is the default port for RTMP, so it can be omitted. Also, since an Application named `app` is created by default in the default configuration, you can enter `app` in `<App Name>`. You can define a Stream Key and use it in the Encoder, and the streaming URL will change according to the Stream Key.
Moreover, some encoders can include a stream key in the URL, and if you use these encoders, you need to set it as follows:
Setting | Value |
---|---|
URL | `RTMP://<OvenMediaEngine IP>[:<RTMP Listen Port>]/<App Name>/<Stream Name>` |
If you are using the default configuration, press the URL button in the top-right corner of OvenStreamEncoder, and enter the URL as shown below:
Also, `<App name>` and `<Stream name>` can be changed as desired in the configuration.
If you use the default configuration, set the OBS as follows:
You can set the Stream Key to any name you like at any time.
Starting from version OME v0.15.1, IPv6 is supported.
To use IPv6, you need to change the settings of the `Server.xml` file as follows:
You can use `/Server/IP` to support IPv6. In versions prior to v0.15.0, only one `/Server/IP` setting could be specified, but from v0.15.1 multiple settings can be specified. That is, if you add an `/Server/IP` element for IPv6 to the existing configuration as follows, you can accept IPv6 requests from clients:
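A sketch of the dual-stack binding (the `version` attribute value is illustrative; use whatever your OME release expects):

```xml
<Server version="8">
    <IP>*</IP>   <!-- IPv4: 0.0.0.0 (INADDR_ANY) -->
    <IP>::</IP>  <!-- IPv6: ::0 (in6addr_any) -->
    <!-- ... -->
</Server>
```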
`*` means `0.0.0.0` (`INADDR_ANY`) in IPv4, and `::` means `::0` (`in6addr_any`) in IPv6. Of course, you can also specify a specific IP address of an interface instead of `::`.
OME listens to the 1935 port for RTMP as follows:
IceCandidates (for WebRTC)
When you specify an IPv6 interface in `/Server/IP`, most Providers/Publishers will work with IPv6, but WebRTC will not. While the WebSocket server used as the WebRTC Signalling server works well with the above setting, more settings are required for the ICE Candidates that actually transmit and receive data.
To use an IPv6 ICE Candidate, you need to add an IPv6 `IceCandidate` to `/Server/Bind/(Providers|Publishers)/WebRTC/IceCandidates`.
To support IPv6 in URL-format settings, use `[::]` instead of `::`.
The `IceCandidate` settings for Providers and Publishers are the same.
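A sketch of such an IceCandidates block (the port range mirrors the defaults used elsewhere on this page):

```xml
<IceCandidates>
    <!-- IPv4 candidates on all interfaces -->
    <IceCandidate>*:10000-10004/udp</IceCandidate>
    <!-- IPv6 candidates on all interfaces; [::] is the URL-format form of :: -->
    <IceCandidate>[::]:10000-10004/udp</IceCandidate>
</IceCandidates>
```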
By setting up as above, OME is ready to use ICE Candidates for IPv6 as well as IPv4. The ICE Candidate generated here can be viewed in the signaling step of the web browser.
<Origin>
Now you can set up the OME edge to point at an origin with an IPv6 address. To do this, set `/Server/VirtualHosts/VirtualHost/Origins/Origin/Pass/Urls/Url` as follows:
This configuration creates a stream that refers to an RTSP source provided on port 1234 of an origin with the IPv6 address `1:2:3:4:5:6:7:8`.
<AdmissionWebhooks>
You can also specify an IPv6 address for the server that `AdmissionWebhooks` uses. To do this, set the value of `/Server/VirtualHosts/VirtualHost/AdmissionWebhooks/ControlServerUrl` as follows:
The above configuration asks whether the client has permission to publish or play back using `http://[1:2:3:4:5:6:7:8]:7000/a/b/c`.
You can set the following environment variables.
OvenMediaEngine can work with a variety of open-source tools and libraries. First, install them on your clean Linux machine as described below. We think that OME can support most Linux distributions, but the tested platforms we use are Ubuntu 18+, Fedora 28+, and CentOS 7+.
You can build the OvenMediaEngine source using the following command:
In addition, we recommend that you permanently set environment variables as follows.
The default configuration uses the following ports, so you need to open them in your firewall settings.
You can open firewall ports as in the following example:
OvenMediaEngine supports the Docker image from the airensoft/ovenmediaengine repository. After installing Docker, you can simply run the following command:
To use TLS, you must set up a certificate. See for more information.

Env | Default Value |
---|---|
OME_HOST_IP | * |
OME_ORIGIN_PORT | 9000 |
OME_RTMP_PROV_PORT | 1935 |
OME_SRT_PROV_PORT | 9999/udp |
OME_MPEGTS_PROV_PORT | 4000/udp |
OME_LLHLS_STREAM_PORT | 3333 |
OME_LLHLS_STREAM_TLS_PORT | 3334 |
OME_WEBRTC_SIGNALLING_PORT | 3333 |
OME_WEBRTC_SIGNALLING_TLS_PORT | 3334 |
OME_WEBRTC_TCP_RELAY_PORT | 3478 |
OME_WEBRTC_CANDIDATE_PORT | 10000-10004/udp |

If the prerequisites.sh script fails, try running `sudo apt-get update` and rerun it. If that's not enough, proceed with the .
If `systemctl start ovenmediaengine` fails in Fedora, SELinux may be the cause. See .

Port | Purpose |
---|---|
1935/TCP | RTMP Input |
9999/UDP | SRT Input |
4000/UDP | MPEG-2 TS Input |
9000/TCP | Origin Server (OVT) |
3333/TCP, 3334/TLS | LLHLS Streaming (streaming over non-TLS is not allowed with modern browsers) |
3333/TCP, 3334/TLS | WebRTC Signaling (both ingest and streaming) |
3478/TCP | WebRTC TCP relay (TURN Server, both ingest and streaming) |
10000-10009/UDP | WebRTC ICE candidates (both ingest and streaming) |

To use TLS, you must set up a certificate. See for more information.

Element | Description |
---|---|
`<Managers><API>` | REST API server port |
RTMP | RTMP port for incoming RTMP streams |
SRT | SRT port for incoming SRT streams |
MPEG-TS | MPEG-TS ports for incoming MPEG-TS/UDP streams |
WebRTC | Port for WebRTC. If you want more information on the WebRTC port, see the WebRTC Ingest and WebRTC Streaming chapters. |
OVT | OVT port for an origin server. OVT is a protocol defined by OvenMediaEngine for Origin-Edge communication. For more information about Origin-Edge, see the Origin-Edge Clustering chapter. |
LLHLS | HTTP(S) port for LLHLS streaming |

Value | Description |
---|---|
true | Rejects a new stream inputted as an overlap and maintains the existing stream. |
false | Accepts a new stream inputted as an overlap and disconnects the existing stream. |
From version 0.10.4, MPEG-2 TS input is supported as a beta feature. The supported codecs are H.264 and AAC (ADTS); supported codecs will continue to be added. The current version only supports basic MPEG-2 TS with a 188-byte packet size. Since the information about the input stream is obtained using PAT and PMT, the client must send these tables as required.
This version supports MPEG-2 TS over UDP. MPEG-2 TS over TCP or MPEG-2 TS over SRT will be supported soon.
To enable MPEG-2 TS, you must first bind the ports and then map the bound ports to streams.
To use multiple streams, it is necessary to bind multiple ports, so we provide a way to bind multiple ports as in the example below. You can use a dash to specify a port range, such as `Start port-End port`, and commas to specify multiple ports.
First, name the stream and map the port bound above. The macro ${Port} is provided to map multiple streams at once. Check out the example below.
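A sketch of the bind and the stream mapping together; the port numbers are illustrative, and `${Port}` expands to each bound port so every port gets its own stream name:

```xml
<!-- Bind the MPEG-TS/UDP ports (range plus a comma-separated port) -->
<Bind>
    <Providers>
        <MPEGTS>
            <Port>4000-4003,4005/udp</Port>
        </MPEGTS>
    </Providers>
</Bind>

<!-- Map each bound port to a stream name using the ${Port} macro -->
<Application>
    <Providers>
        <MPEGTS>
            <StreamMap>
                <Stream>
                    <Name>stream_${Port}</Name>
                    <Port>4000-4003,4005/udp</Port>
                </Stream>
            </StreamMap>
        </MPEGTS>
    </Providers>
</Application>
```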
This is an example of publishing using FFMPEG.
Giving the `-pes_payload_size 0` option to the AAC codec is very important for A/V synchronization and low latency. If this option is not given, FFmpeg bundles several ADTS frames and transmits them at once, which may cause high latency and A/V synchronization errors.
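A sketch of such an FFmpeg push; the host, port, input file, and `pkt_size` value are assumptions, but note the `-pes_payload_size 0` option discussed above:

```shell
# Hypothetical target: an MPEG-TS/UDP provider bound on port 4000.
UDP_URL="udp://192.168.0.160:4000?pkt_size=1316"
ffmpeg -re -i input.mp4 \
  -c:v libx264 -c:a aac -pes_payload_size 0 \
  -f mpegts "$UDP_URL"
```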
From version 0.10.4, RTSP Pull input is supported as a beta feature. The supported codecs are H.264 and AAC (ADTS). Supported codecs will continue to be added.
This function pulls a stream from an external RTSP server and operates as an RTSP client.
RTSP Pull is provided through OriginMap configuration. OriginMap is the rule that the Edge server pulls the stream of the Origin server. Edge server can pull a stream of origin with RTSP and OVT (protocol defined by OvenMediaEngine for Origin-Edge) protocol. See the Clustering section for more information about OVT.
For example, in the above setup, when a player requests "ws://ome.com/app_name/rtsp_stream_name" to stream WebRTC, it pulls the stream from "rtsp://192.168.0.200:554" and publishes it to WebRTC.
If the app name set in Location isn't created, OvenMediaEngine creates the app with default settings. The default generated app doesn't have an OPUS encoding profile, so to use WebRTC streaming, you need to define the app, with an OPUS profile, in your configuration.
The pull-type provider is activated by the publisher's streaming request. And if there is no client playing for 30 seconds, the provider is automatically disabled.
According to the above setting, the RTSP pull provider operates for the following streaming URLs.
Secure Reliable Transport (or SRT in short) is an open source video transport protocol and technology stack that optimizes streaming performance across unpredictable networks with secure streams and easy firewall traversal, bringing the best quality live video over the worst networks. We consider SRT to be one of the great alternatives to RTMP, and OvenMediaEngine can receive video streaming over SRT. For more information on SRT, please visit the SRT Alliance website.
SRT uses the MPEG-TS format when transmitting live streams. This means that unlike RTMP, it can support many codecs. Currently, OvenMediaEngine supports H.264, H.265, and AAC codecs received by SRT.
Set the SRT listen port as follows:
SRT input can be turned on/off for each application. The following setting enables the SRT input function of the application.
There are various encoders that support SRT such as FFMPEG, OBS Studio, and srt-live-transmit. Please check the specifications of each encoder on how to transmit streams through SRT from the encoder. We describe an example using OBS Studio.
OvenMediaEngine classifies each stream using SRT's streamid. This means that, unlike MPEG-TS/UDP, OvenMediaEngine can receive multiple SRT streams through one port. For more information on streamid, see Haivision's official documentation.
Therefore, in order for the SRT encoder to transmit a stream to OvenMediaEngine, the following information must be included in the streamid as percent encoded.
streamid = percent_encoding("srt://{host}[:port]/{app name}/{stream name}[?query=value]")
The streamid contains a URL format, so it must be percent-encoded.
OBS Studio 25.0 or later supports SRT. Please refer to the OBS official documentation for more information. Enter the address of OvenMediaEngine in OBS Studio's Server as follows: When using SRT in OBS, you can leave the Stream Key blank.
srt://ip:port?streamid=srt%3A%2F%2F{domain or IP address}[%3APort]%2F{App name}%2F{Stream name}
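The percent-encoding can be produced with a small shell helper (the host `ome.example.com`, app `app`, and stream `stream` are hypothetical; `python3` is used only for the encoding step):

```shell
# Percent-encode the streamid URL, then build the full SRT address.
STREAMID=$(python3 -c 'from urllib.parse import quote; print(quote("srt://ome.example.com:9999/app/stream", safe=""))')
echo "srt://ome.example.com:9999?streamid=${STREAMID}"
# → srt://ome.example.com:9999?streamid=srt%3A%2F%2Fome.example.com%3A9999%2Fapp%2Fstream
```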
Users can send video/audio from a web browser to OvenMediaEngine via WebRTC without a plug-in. Of course, you can also use any encoder that supports WebRTC transmission, not just a browser.
OvenMediaEngine supports self-defined signaling protocol and WHIP for WebRTC ingest.
You can set the port to use for signaling in `<Bind><Providers><WebRTC><Signaling>`. `<Port>` sets an unsecured HTTP port, and `<TLSPort>` sets a secured HTTP port encrypted with TLS.
For WebRTC ingest, you must set the ICE candidates of the OvenMediaEngine server in `<IceCandidates>`. The candidates set in `<IceCandidate>` are delivered to the WebRTC peer, and the peer requests communication with these candidates. Therefore, you must set an IP that the peer can access. If the IP is specified as `*`, OvenMediaEngine gathers all IPs of the server and delivers them to the peer.
`<TcpRelay>` is OvenMediaEngine's built-in TURN server. When this is enabled, the address of this TURN server is passed to the peer via the self-defined signaling protocol or WHIP, and the peer communicates with this TURN server over TCP. This allows OvenMediaEngine to support WebRTC/TCP itself. For more information on URL settings, check out WebRTC over TCP.
WebRTC input can be turned on/off for each application. The following setting enables the WebRTC input function of the application. The `<CrossDomains>` setting is used in WebRTC signaling.
OvenMediaEngine supports a self-defined signaling protocol and WHIP for WebRTC ingest.
The signaling URL for WebRTC ingest uses the query string `?direction=send` as follows to distinguish it from the URL for WebRTC playback. Since the self-defined WebRTC signaling protocol is based on WebSocket, you must specify ws[s] as the scheme.
ws[s]://<host>[:signaling port]/<app name>/<stream name>?direction=send
For ingest from a WHIP client, put `?direction=whip` in the query string of the signaling URL as in the example below. Since WHIP is based on HTTP, you must specify http[s] as the scheme.
http[s]://<host>[:signaling port]/<app name>/<stream name>?direction=whip
WebRTC transmission is sensitive to packet loss because it affects all players who access the stream. Therefore, it is recommended to provide WebRTC transmission over TCP. OvenMediaEngine has a built-in TURN server for WebRTC/TCP, and receives or transmits streams using the TCP session that the player's TURN client connects to the TURN server as it is. To use WebRTC/TCP, use transport=tcp query string as in WebRTC playback. See WebRTC/tcp playback for more information.
ws[s]://<host>[:port]/<app name>/<stream name>?direction=send&transport=tcp
http[s]://<host>[:port]/<app name>/<stream name>?direction=whip&transport=tcp
To use WebRTC/TCP, `<TcpRelay>` must be turned on in the `<Bind>` settings.
If `<TcpForce>` is set to true, it works over TCP even if you omit the `?transport=tcp` query string from the URL.
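A sketch of these options inside the WebRTC bind (this `<WebRTC>` block sits under `<Bind><Providers>` or `<Bind><Publishers>`; the ports mirror the defaults used elsewhere on this page):

```xml
<WebRTC>
    <IceCandidates>
        <!-- Built-in TURN server for WebRTC/TCP -->
        <TcpRelay>*:3478</TcpRelay>
        <!-- Use TCP even without ?transport=tcp in the URL -->
        <TcpForce>true</TcpForce>
        <IceCandidate>*:10000-10004/udp</IceCandidate>
    </IceCandidates>
</WebRTC>
```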
We provide a demo page so you can easily test your WebRTC input. You can access the demo page at the URL below.
The getUserMedia API used to access local devices only works in a secure context, so the WebRTC Input demo page can only work on the HTTPS site https://demo.ovenplayer.com/demo_input.html. This means that, due to mixed content, you have to install a certificate in OvenMediaEngine and use a wss signaling URL to test this. If you can't install a certificate in OvenMediaEngine, you can temporarily test it by allowing insecure content for demo.ovenplayer.com in your browser.
To create a custom WebRTC Producer, you need to implement OvenMediaEngine's Self-defined Signaling Protocol or WHIP. Self-defined protocol is structured in a simple format and uses the same method as WebRTC Streaming.
When the player connects to `ws[s]://host:port/app/stream?direction=send` through a WebSocket and sends a request offer command, the server responds with the offer SDP. If `transport=tcp` exists in the query string of the URL, iceServers information is included in the offer SDP, which contains the information of OvenMediaEngine's built-in TURN server, so you need to set this in RTCPeerConnection to use WebRTC/TCP. The player then calls setRemoteDescription and addIceCandidate with the offer SDP, generates an answer SDP, and responds to the server.
Protocol | URL |
---|---|
WebRTC | ws://ome.com:3333/app_name/rtsp_stream_name |
HLS | http://ome.com:8080/app_name/rtsp_stream_name/playlist.m3u8 |
DASH | http://ome.com:8080/app_name/rtsp_stream_name/manifest.mpd |
LL DASH | http://ome.com:8080/app_name/rtsp_stream_name/manifest_ll.mpd |
Apple supports Low-Latency HLS (LLHLS), which enables low-latency video streaming while maintaining scalability. LLHLS enables broadcasting with an end-to-end latency of about 2 to 5 seconds. OvenMediaEngine officially supports LLHLS as of v0.14.0.
LLHLS is an extension of HLS, so legacy HLS players can play LLHLS streams. However, the legacy HLS player plays the stream without using the low-latency function.
To use LLHLS, you need to add the `<LLHLS>` element to the `<Publishers>` in the configuration as shown in the following example.
HTTP/2 outperforms HTTP/1.1, especially with LLHLS. Since current browsers only support h2 (HTTP/2 over TLS), HTTP/2 is available only on the TLS port. Therefore, it is highly recommended to use LLHLS on the TLS port.
LLHLS can deliver adaptive bitrate streaming. OME encodes the same source into multiple renditions and delivers them to the players. An LLHLS player, including OvenPlayer, then selects the best-quality rendition according to its network environment. Of course, these players also provide an option for users to manually select a rendition.
See the Adaptive Bitrates Streaming section for how to configure renditions.
Most browsers and players prohibit accessing resources of other domains from the currently running domain. You can control this through Cross-Origin Resource Sharing (CORS) or Cross-Domain (CrossDomain). You can set CORS and Cross-Domain with the `<CrossDomains>` element.
You can set it using the `<Url>` element as shown above, and you can use the following values:
LLHLS is ready when a live source is inputted and a stream is created. Viewers can stream using OvenPlayer or other players.
If your input stream is already H.264/AAC, you can use the input stream as-is as shown below. If not, or if you want to change the encoding quality, you can use Transcoding.
When you create a stream, as shown above, you can play LLHLS with the following URL:
http[s]://domain[:port]/<app name>/<stream name>/llhls.m3u8
If you use the default configuration, you can start streaming with the following URL:
https://domain:3334/app/<stream name>/llhls.m3u8
We have prepared a test player that you can quickly see if OvenMediaEngine is working. Please refer to the Test Player for more information.
You can create as long a playlist as you want by setting `<DVR>` in the LLHLS publisher as shown below. This allows the player to rewind the live stream and play older segments. OvenMediaEngine stores old segments in files under `<DVR><TempStoragePath>` to prevent excessive memory usage. It stores as much as `<DVR><MaxDuration>`, in seconds.
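A sketch of a DVR configuration; the storage path and the one-hour window are illustrative values:

```xml
<LLHLS>
    <DVR>
        <Enable>true</Enable>
        <TempStoragePath>/tmp/ome_dvr</TempStoragePath>
        <MaxDuration>3600</MaxDuration> <!-- keep up to 1 hour of segments -->
    </DVR>
</LLHLS>
```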
ID3 Timed metadata can be sent to the LLHLS stream through the Send Event API.
You can dump an LLHLS stream for VoD. You can enable it by setting the following in `<Application><Publishers><LLHLS>`. The dump function can also be controlled by the Dump API.

Element | Description |
---|---|
TargetStreamName | The name of the stream to dump. You can use * and ? to filter stream names. |
Playlists | The name of the master playlist file to be dumped together. |
OutputPath | The folder to output to. In OutputPath you can use the macros shown in the table below. You must have write permission on the specified folder. |
OvenMediaEngine uses WebRTC to provide sub-second latency streaming. WebRTC uses RTP for media transmission and provides various extensions.
OvenMediaEngine provides the following features:
If you want to use the WebRTC feature, you need to add the `<WebRTC>` element to the `<Publishers>` and `<Ports>` in the `Server.xml` configuration file, as shown in the example below.
WebRTC uses ICE for connections, specifically for NAT traversal. The web browser or player exchanges ICE Candidates with the server in the signalling phase. Therefore, OvenMediaEngine provides ICE for WebRTC connectivity.
If you set IceCandidate to `*:10000-10005/udp`, as in the example above, OvenMediaEngine automatically gets the IP from the server and generates `IceCandidate`s using UDP ports 10000 to 10005. If you want to use a specific IP as IceCandidate, specify that IP. You can also use a single UDP port rather than a range by setting it to `*:10000`.
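A sketch of this binding under `<Bind>`:

```xml
<Bind>
    <Publishers>
        <WebRTC>
            <IceCandidates>
                <!-- All server IPs, UDP ports 10000-10005 -->
                <IceCandidate>*:10000-10005/udp</IceCandidate>
            </IceCandidates>
        </WebRTC>
    </Publishers>
</Bind>
```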
OvenMediaEngine has an embedded WebSocket-based signalling server and provides a self-defined signalling protocol, which OvenPlayer supports. WebRTC requires signalling to exchange the Offer SDP and Answer SDP, but this part isn't standardized, so if you exchange SDP yourself you need to create your own exchange protocol.
If you want to change the signaling port, change the value of `<Ports><WebRTC><Signalling>`.
The Signalling protocol is defined in a simple way:
If you want to use a player other than OvenPlayer, you need to implement the signalling protocol described above to integrate with OvenMediaEngine.
Add the `WebRTC` element to the Publishers to provide streaming through WebRTC.
The WebRTC Publisher's `<JitterBuffer>` outputs A/V evenly (interleaved) and is useful when A/V synchronization is broken in the browser (player), as in the following cases:
If the A/V sync is excessively out of sync, some browsers may not be able to handle this or it may take several seconds to synchronize.
Players that do not support RTCP also cannot synchronize A/V.
WebRTC Streaming starts when a live source is inputted and a stream is created. Viewers can stream using OvenPlayer or players that have developed or applied the OvenMediaEngine Signalling protocol.
Also, the codecs supported by each browser are different, so you need to set the Transcoding profile according to the browser you want to support. For example, Safari for iOS supports H.264 but not VP8. If you want to support all browsers, please set up VP8, H.264, and Opus codecs in all transcoders.
WebRTC doesn't support AAC, so when bypassing transcoding for RTMP input, the audio must be encoded as Opus. See the settings below.
Some browsers offer both H.264 and VP8 in the Answer SDP sent to OvenMediaEngine, but sometimes H.264 can't actually be played. In this situation, if you place VP8 above the H.264 line in the Transcoding profile setting, you increase the priority of VP8.
In this manner, browsers that claim H.264 support but can't play it can stream smoothly using VP8, which solves most such problems.
If you created a stream as shown in the table above, you can play WebRTC on OvenPlayer via the following URL:
If you use the default configuration, you can stream to the following URL:
ws://[OvenMediaEngine IP]:3333/app/stream
wss://[OvenMediaEngine IP]:3334/app/stream
We have prepared a test player to make it easy to check if OvenMediaEngine is working. Please see the Test Player chapter for more information.
OvenMediaEngine provides adaptive bitrate streaming over WebRTC. OvenPlayer can also play and display OvenMediaEngine's WebRTC ABR URL.
You can provide ABR by creating a `playlist` in `<OutputProfile>` as shown below. The URL to play the playlist is `ws[s]://domain[:port]/<app name>/<stream name>/<playlist file name>`.
`<Playlist><Rendition><Video>` and `<Playlist><Rendition><Audio>` can be connected using `<Encodes><Video><Name>` or `<Encodes><Audio><Name>`.
It is not recommended to use a <Bypass>true</Bypass> encode item if you want a seamless transition between renditions because there is a time difference between the transcoded track and bypassed track.
If <Options><WebRtcAutoAbr> is set to true, OvenMediaEngine will measure the bandwidth of the player session and automatically switch to the appropriate rendition.
Here is an example play URL for ABR in the playlist settings below. wss://domain:13334/app/stream/abr
Streaming starts from the top rendition of Playlist, and when Auto ABR is true, the server finds the best rendition and switches to it. Alternatively, the user can switch manually by selecting a rendition in the player.
See the Adaptive Bitrates Streaming section for more details on how to configure renditions.
WebRTC can negotiate codecs through SDP to support more devices. A playlist can contain renditions with different codecs, and OvenMediaEngine includes only the renditions matching the negotiated codec in the playlist it provides to the player. If a rendition includes an unsupported codec, that rendition is not used; for example, if a rendition's audio contains AAC, WebRTC ignores the rendition. The example below consists of renditions with H.264 and Opus and renditions with VP8 and Opus. If the player selects VP8 in the Answer SDP, OvenMediaEngine creates a playlist containing only the renditions with VP8 and Opus and passes it to the player.
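The playlist described above might be sketched as follows. Element names follow common Server.xml conventions; the encode names (video_h264, video_vp8, audio_opus), file name, and bitrates are illustrative:

```xml
<OutputProfile>
    <Name>abr_profile</Name>
    <OutputStreamName>${OriginStreamName}</OutputStreamName>
    <Playlist>
        <Name>WebRTC ABR</Name>
        <!-- Played at ws[s]://domain[:port]/app/stream/abr -->
        <FileName>abr</FileName>
        <Options>
            <WebRtcAutoAbr>true</WebRtcAutoAbr>
        </Options>
        <Rendition>
            <Name>HD-H264</Name>
            <Video>video_h264</Video>
            <Audio>audio_opus</Audio>
        </Rendition>
        <Rendition>
            <Name>HD-VP8</Name>
            <Video>video_vp8</Video>
            <Audio>audio_opus</Audio>
        </Rendition>
    </Playlist>
    <Encodes>
        <Video>
            <Name>video_h264</Name>
            <Codec>h264</Codec>
            <Bitrate>2000000</Bitrate>
            <Framerate>30</Framerate>
            <Width>1280</Width>
            <Height>720</Height>
        </Video>
        <Video>
            <Name>video_vp8</Name>
            <Codec>vp8</Codec>
            <Bitrate>2000000</Bitrate>
            <Framerate>30</Framerate>
            <Width>1280</Width>
            <Height>720</Height>
        </Video>
        <Audio>
            <Name>audio_opus</Name>
            <Codec>opus</Codec>
            <Bitrate>128000</Bitrate>
            <Samplerate>48000</Samplerate>
            <Channel>2</Channel>
        </Audio>
    </Encodes>
</OutputProfile>
```

Each <Rendition> references encodes by their <Name>, which is how the <Playlist><Rendition><Video>/<Audio> to <Encodes> connection described above is made.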
There are environments where the network speed is fast but UDP packet loss is abnormally high. In such an environment, WebRTC may not play normally. WebRTC does not support streaming using TCP, but connections to the TURN (https://tools.ietf.org/html/rfc8656) server support TCP. Based on these characteristics of WebRTC, OvenMediaEngine supports TCP connections from the player to OvenMediaEngine by embedding a TURN server.
You can turn on the TURN server by setting <TcpRelay> in the WebRTC Bind.
Example : <TcpRelay>*:3478</TcpRelay>
In some environments (such as Docker or AWS), OME cannot obtain the server's public IP from its local interfaces, so specify the public IP as the relay IP. If * is used, the public IP obtained from <StunServer> and all IPs obtained from the local interfaces are used. Port is the TCP port on which the TURN server listens.
If * is used as the IP of TcpRelay and IceCandidate, all available candidates are generated and sent to the player, and the player tries each candidate until a connection is established. This can delay initial playback, so specifying the ${PublicIP} macro or an IP directly can improve quality.
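As a sketch of a WebRTC bind using ${PublicIP} rather than *, following the description above (element layout follows common Server.xml conventions; the ports are illustrative):

```xml
<Bind>
    <Publishers>
        <WebRTC>
            <Signalling>
                <Port>3333</Port>
            </Signalling>
            <IceCandidates>
                <!-- Explicit public IP avoids sending every candidate to the player -->
                <IceCandidate>${PublicIP}:10000-10005/udp</IceCandidate>
                <TcpRelay>${PublicIP}:3478</TcpRelay>
            </IceCandidates>
        </WebRTC>
    </Publishers>
</Bind>
```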
WebRTC players can configure the TURN server through the iceServers setting.
You can play the WebRTC stream over TCP by appending the query transport=tcp to the existing WebRTC playback URL as follows.
OvenPlayer automatically sets iceServers by obtaining TURN server information set in <TcpRelay> through signaling with OvenMediaEngine.
If <TcpForce> is set to true, a TCP connection is forced even if ?transport=tcp is not present. To use this, <TcpRelay> must be set.
If you are using a custom player, set iceServers like this:
When sending Request Offer in the signaling phase with OvenMediaEngine, if you include the transport=tcp query string, ice_servers information is delivered as follows. You can use this information to set iceServers.
From OvenMediaEngine v0.14.0, legacy HLS, DASH, and LLDASH are no longer updated and will be deprecated.
LLHLS, released in v0.14.0, is superior to DASH and LLDASH in compatibility, performance, and functionality, and also supports legacy HLS players. Therefore, we decided not to update legacy HLS, DASH, and LLDASH anymore. With the energy that was used to maintain these features, we will focus on more wonderful features in the future.
OvenMediaEngine supports GPU-based hardware decoding and encoding. Currently supported GPU acceleration devices are Intel's QuickSync and NVIDIA's NVDECODE/NVENCODE. This document describes how to install the video driver for OvenMediaEngine to use the GPU and how to set the Config file. Please check what graphics card you have and refer to the NVIDIA or Intel driver installation guide.
If you are using an Intel CPU that supports QuickSync, please refer to the following guide to install the driver. The OSes that support installation using the provided scripts are CentOS 7/8 and Ubuntu 18/20 versions. If you want to install the driver on a different OS, please refer to the Manual Installation Guide document.
When the Intel QuickSync driver installation is complete, the OS must be rebooted for normal operation.
After the driver installation is complete, check whether the driver operates normally with the Matrix Monitor program.
If you are using an NVIDIA graphics card, please refer to the following guide to install the driver. The OSes that support installation with the provided scripts are CentOS 7/8 and Ubuntu 18/20. If you want to install the driver on another OS, please refer to the manual installation guide document.
In a CentOS environment, the nouveau driver must be uninstalled first. After uninstalling it, a first reboot is required; then the new NVIDIA driver must be installed and the system rebooted again. Therefore, two install scripts must be executed.
After the driver installation is complete, check whether the driver is operating normally with the nvidia-smi command.
This section describes how to enable GPU acceleration when running OvenMediaEngine in a Docker runtime environment. To use GPU acceleration in Docker, the NVIDIA driver must be installed on the host OS and the NVIDIA Container Toolkit must be installed. This toolkit includes container runtime libraries and utilities for using NVIDIA GPUs in Docker containers.
The NVIDIA Driver must have been previously installed
To use GPU when running Docker, you need to add the --gpus all option.
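For example, the option is added to the docker run command line. The image name and port mappings below are illustrative; adjust them to your deployment:

```bash
docker run --gpus all -d \
  -p 1935:1935 -p 3333:3333 \
  airensoft/ovenmediaengine:latest
```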
If the provided installation script fails, please refer to the manual installation guide.
Once the driver for the GPU is installed, you need to reinstall the open-source libraries using Prerequisites.sh. This allows the external libraries to use the installed graphics driver.
To use hardware acceleration, set the HardwareAcceleration option to true under OutputProfiles. If this option is enabled, a hardware codec is automatically used when creating a stream, and if it is unavailable due to insufficient hardware resources, it is replaced with a software codec.
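Following the description above, the option sits directly under <OutputProfiles>. This is a sketch; the profile name is illustrative and the rest of the profile is omitted:

```xml
<OutputProfiles>
    <!-- Use hardware codecs when available; fall back to software codecs -->
    <HardwareAcceleration>true</HardwareAcceleration>
    <OutputProfile>
        <Name>stream</Name>
        <OutputStreamName>${OriginStreamName}</OutputStreamName>
    </OutputProfile>
</OutputProfiles>
```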
You can build the OvenMediaEngine source using the following command. Same as the contents of Getting Started.
The codecs available using hardware accelerators in OvenMediaEngine are as shown in the table below. Different GPUs support different codecs. If the hardware codec is not available, you should check if your GPU device supports the codec.
D : Decoding, E : Encoding
After all libraries are installed, the system must be rebooted.
Quick Sync Video format support:

Device | H264 | H265 | VP8 | VP9
---|---|---|---|---
QuickSync | D / E | D / E | - | -
NVIDIA | D / E | D / E | - | -
Docker on NVIDIA Container Toolkit | D / E | D / E | - | -

Quick Sync Video Format :
NVIDIA NVDEC Video Format :
NVIDIA NVENV Video Format :
CUDA Toolkit Installation Guide :
NVIDIA Container Toolkit :

LLHLS specifications:

Title | Descriptions
---|---
Delivery | HTTP/1.1, HTTP/2
Security | TLS (HTTPS)
Container | fMP4
Codecs | H.264, AAC

LLHLS configuration elements:

Element | Description
---|---
Bind | Set the HTTP ports to provide LLHLS.
ChunkDuration | Set the partial segment length in fractional seconds. This value affects low-latency HLS players. We recommend 0.2 seconds for this value.
SegmentDuration | Set the length of the segment in seconds. A shorter value allows the stream to start faster, but a value that is too short makes legacy HLS players unstable. Apple recommends 6 seconds for this value.
SegmentCount | The number of segments listed in the playlist. This value has little effect on LLHLS players, so use 10 as recommended by Apple. 5 is recommended for legacy HLS players. Do not set it below 3 except for experimentation.
CrossDomains | Control the domains in which the player works through <CrossDomains>. For more information, please refer to the CrossDomain section.

CrossDomains URL values:

Url Value | Description
---|---
* | Allows requests from all domains
domain | Allows both HTTP and HTTPS requests from the specified domain
http://domain | Allows HTTP requests from the specified domain
https://domain | Allows HTTPS requests from the specified domain

Path macros:

Macro | Description
---|---
${VHostName} | Virtual host name
${AppName} | Application name
${StreamName} | Stream name
${YYYY} | Year
${MM} | Month
${DD} | Day
${hh} | Hour
${mm} | Minute
${ss} | Second
${S} | Timezone
${z} | UTC offset (ex: +0900)
${ISO8601} | Current time in ISO8601 format

WebRTC specifications:

Title | Functions
---|---
Delivery | RTP / RTCP
Security | DTLS, SRTP
Connectivity | ICE
Error Correction | ULPFEC (VP8, H.264), In-band FEC (Opus)
Codec | VP8, H.264, Opus
Signalling | Self-defined signalling protocol and embedded WebSocket-based server

WebRTC options:

Option | Description | Default
---|---|---
Timeout | ICE (STUN request/response) timeout in milliseconds; if there is no request or response during this time, the session is terminated. | 30000
Rtx | WebRTC retransmission; a useful option in WebRTC/UDP, but ineffective in WebRTC/TCP. | false
Ulpfec | WebRTC forward error correction; a useful option in WebRTC/UDP, but ineffective in WebRTC/TCP. | false
JitterBuffer | Audio and video are interleaved and output evenly; see below for details. | false

WebRTC playback URL formats:

Protocol | URL format
---|---
WebRTC Signalling | ws://<Server IP>[:<Signalling Port>]/<Application name>/<Stream name>
Secure WebRTC Signalling | wss://<Server IP>[:<Signalling Port>]/<Application name>/<Stream name>
AdmissionWebhooks are HTTP callbacks that query the control server to control publishing and playback admission requests.
Users can use AdmissionWebhooks for a variety of purposes, including customer authentication, tracking published streams, hiding app/stream names, logging, and more.
AdmissionWebhooks can be set up on VirtualHost, as shown below.
AdmissionWebhooks sends an HTTP/1.1 request message to the configured control server when an encoder requests publishing or a player requests playback. The request message format is as follows.
The message is sent with the POST method, and the payload is in application/json format. X-OME-Signature is a base64url-encoded value obtained by signing the payload with HMAC-SHA1 so that the control server can validate the message. See the Security section for more information on X-OME-Signature.
Here is a detailed explanation of each element of the JSON payload:
The control server may need to validate incoming HTTP requests for security reasons. To do this, the AdmissionWebhooks module puts an X-OME-Signature value in the HTTP request header. X-OME-Signature is a base64url-encoded value obtained by signing the payload of the HTTP request with the HMAC-SHA1 algorithm using the secret key set in <AdmissionWebhooks><SecretKey> of the configuration.
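A control server can recompute the signature from the raw request body and compare it with the received header. This is a sketch: it assumes the signature is the HMAC-SHA1 digest Base64URL-encoded with padding stripped (padding handling may differ in your OME version, so verify against a real request):

```python
import base64
import hashlib
import hmac

def make_signature(payload: bytes, secret_key: str) -> str:
    # HMAC-SHA1 over the raw request body, Base64URL-encoded without padding
    digest = hmac.new(secret_key.encode(), payload, hashlib.sha1).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

def verify(payload: bytes, secret_key: str, received_signature: str) -> bool:
    # Constant-time comparison to avoid timing attacks
    return hmac.compare_digest(make_signature(payload, secret_key), received_signature)
```

Pass the request body exactly as received; re-serializing the JSON can reorder keys and change the bytes, which would invalidate the signature.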
As shown below, the trigger condition of request is different for each protocol.
A request in the closing state does not require any parameters in the response; simply answer the query with an empty JSON object.
The ControlServer must respond in the following JSON format. In particular, the "allowed" element is required.
new_url redirects the original request to another app/stream. This can be used to hide the actual app/stream name from the user, or to authenticate the user by inserting additional information in place of the app/stream name.
For example, you can issue a WebRTC streaming URL that embeds the user ID as follows: ws://domain.com:3333/user_id
It is even more effective to issue a URL with an encrypted value that contains the user ID, URL expiration time, and other information.
After the Control Server checks whether the user is authorized to play using user_id and responds with ws://domain.com:3333/app/sport-3 in new_url, the user can play app/sport-3. If the user has only one hour of playback rights, the Control Server responds with 3600000 in lifetime.
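Putting the user_id example above together, a control-server response permitting one hour of playback of app/sport-3 might look like this (field values are illustrative):

```json
{
  "allowed": true,
  "new_url": "ws://domain.com:3333/app/sport-3",
  "lifetime": 3600000,
  "reason": "authorized"
}
```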
SignedPolicy is a module that limits a user's privileges and time. For example, operators can distribute RTMP URLs that can be accessed for 60 seconds to authorized users and limit RTMP transmission to 1 hour. The provided URL becomes invalid after 60 seconds, and transmission automatically stops after 1 hour. Users who are given a SignedPolicy URL cannot access resources other than the provided URL, because every SignedPolicy URL is authenticated.
A SignedPolicy URL consists of the streaming URL with Policy and Signature query strings, as shown below. If SignedPolicy is enabled in the OvenMediaEngine configuration, access with no signature or an invalid signature is not allowed. The signature is an HMAC-SHA1 over the entire URL except the signature query string itself.
Policy is in json format and provides the following properties.
url_expire is the time the URL is valid: if you connect before the URL expires, you can continue to use the session, and already-connected sessions are not deleted when the time expires. stream_expire, however, forcibly terminates the session when the time expires, even during playback.
The signature is generated by signing the entire URL, except the signature query string, with HMAC-SHA1. The generated signature is encoded with Base64URL and appended to the existing URL as a query string.
The URL fed into HMAC to generate the signature must include :port.
When creating a signature, you cannot omit default ports such as HTTP port 80, HTTPS port 443, or RTMP port 1935, because OvenMediaEngine includes the port value when it generates the signature for verification.
When using SignedPolicy with SRT providers, only use the streamid portion of the URL, e.g. srt://myserver:9999?streamid=srt://myserver:9999/app/stream?policy=abc123
To enable SignedPolicy, you need to add the following <SignedPolicy> setting in Server.xml under <VirtualHost>.
We provide a script that can easily generate SignedPolicy URL. The script can be found in the path below.
Here's how to use this script:
For example, you can use it like this:
We hope to provide SignedPolicy URL Generator Library in various languages. If you have created the SignedPolicy URL Generator Library in another language, please send a Pull Request to our GITHUB. Thank you for your open source contributions.
In order to include the policy in the URL, it must be encoded with Base64URL.
Policy encoded with Base64URL is added as a query string to the existing streaming URL. (The query string key is set in Server.xml.)
Signature hashes the entire URL including the policy in HMAC (SHA-1) method, encodes it as Base64URL, and includes it in the query string.
Create a hash using the secret key (1kU^b6 in the example) and the URL above using HMAC-SHA1.
If you include it as a signature query string (query string key is set in Server.xml), the following SignedPolicy URL is finally generated.
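The steps above can be sketched in code. This assumes the policy and signature query keys are named "policy" and "signature" (they must match your Server.xml settings) and that Base64URL values are emitted without padding; verify the output against the provided generator script before relying on it:

```python
import base64
import hashlib
import hmac
import json

def base64url(data: bytes) -> str:
    # Base64URL without padding, as used in SignedPolicy URLs
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_signed_url(base_url: str, policy: dict, secret_key: str,
                    policy_key: str = "policy", signature_key: str = "signature") -> str:
    """Generate a SignedPolicy URL following the three steps above.

    base_url must include the port, even default ports like :1935.
    """
    sep = "&" if "?" in base_url else "?"
    # 1. Encode the policy with Base64URL and append it as a query string
    policy_json = json.dumps(policy, separators=(",", ":")).encode()
    url = f"{base_url}{sep}{policy_key}={base64url(policy_json)}"
    # 2. HMAC-SHA1 the entire URL (including the policy) with the secret key
    digest = hmac.new(secret_key.encode(), url.encode(), hashlib.sha1).digest()
    # 3. Append the Base64URL-encoded signature as the final query string
    return f"{url}&{signature_key}={base64url(digest)}"
```

For example, make_signed_url("rtmp://ome.com:1935/app/stream", {"url_expire": 1700000000000}, "1kU^b6") yields the RTMP URL with policy and signature query strings appended.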
Generate SignedPolicy URL with the script.
Split the URL at "app" as shown in the example below, and enter everything after "app" into the Stream Key.
OvenMediaEngine supports clustering and ensures High Availability (HA) and scalability.
OvenMediaEngine supports the Origin-Edge structure for cluster configuration and provides scalability. You can also set Origins as Primary and Secondary in OvenMediaEngine for HA.
The OvenMediaEngine running as edge pulls a stream from an external server when a user requests it. The external server could be another OvenMediaEngine with OVT enabled or another stream server that supports RTSP.
OVT is a protocol defined by OvenMediaEngine to relay streams between Origin and Edge, and it can run over SRT and TCP. For more information on the SRT protocol, please visit the SRT Alliance site.
OvenMediaEngine provides OVT protocol for passing streams from the origin to the edge. To run OvenMediaEngine as Origin, OVT port, and OVT Publisher must be enabled as follows :
The role of the edge is to receive and distribute streams from an origin. You can configure hundreds of edges to distribute traffic to your players. In our tests, a single edge can stream 4-5 Gbps of WebRTC traffic on an AWS c5.2xlarge instance. If you need to stream to thousands of viewers, you can configure and use multiple edges.
The edge supports OVT and RTSP to pull streams from an origin; more protocols will be supported in the near future. A stream pulled through OVT or RTSP is bypassed without being re-encoded.
The ability to put streams pulled via OVT or RTSP into an existing application for re-encoding will be supported in the future.
To run OvenMediaEngine as Edge, you need to add Origins elements to the configuration file as follows:
The <Origin> is a rule about where to pull a stream from for which request.
If the application set in <Location> doesn't exist on the server, <Origin> automatically creates an application with that name; if the application already exists, the stream is created in that application.
An application automatically created by <Origin> enables all providers, but if you create an application yourself, you must enable the provider that matches the setting as follows.
NoInputFailoverTimeout (default 3000)
NoInputFailoverTimeout is the time (in milliseconds) to switch to the next URL if there is no input for the set time.
UnusedStreamDeletionTimeout (default 60000)
UnusedStreamDeletionTimeout deletes a stream created via OriginMap if it has no viewers for the set amount of time (in milliseconds). This saves network traffic and system resources on both Origin and Edge.
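As a sketch of the Origin rule structure described above (element names follow common Server.xml conventions; the locations, hosts, and ports are illustrative):

```xml
<Origins>
    <Origin>
        <Location>/edge_app/stream</Location>
        <Pass>
            <Scheme>ovt</Scheme>
            <Urls>
                <!-- Multiple URLs may be listed for failover -->
                <Url>origin.com:9000/origin_app/stream</Url>
            </Urls>
        </Pass>
    </Origin>
</Origins>
```

With this rule, a request for /edge_app/stream is pulled from ovt://origin.com:9000/origin_app/stream, matching the URL-combination example later in this section.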
For a detailed description of Origin's elements, see:
Location
Origin is already filtered by domain because it belongs to a VirtualHost. Therefore, in Location, set App, Stream, and File to match everything except the domain part. If a request matches multiple Origins, the topmost one is used.
Pass
Pass consists of Scheme and Urls.
<Scheme> is the protocol used to pull from the origin stream. It can currently be set to OVT or RTSP. If the origin server is OvenMediaEngine, you have to set <Scheme> to OVT. You can pull a stream from an RTSP server by setting <Scheme> to RTSP; in this case, the <RTSPPull> provider must be enabled. Applications automatically generated by Origin need no extra configuration because all providers are enabled.
Urls is the address of the origin stream and can consist of multiple URLs.
ForwardQueryParams determines whether to pass the query string part of the playback URL on to the origin server (default: true). Some RTSP servers classify streams by query string, so you may want to set this option to false. For example, if a user requests ws://host:port/app/stream?transport=tcp to play WebRTC, ?transport=tcp may also be forwarded to the RTSP server, and the stream may not be found there. OVT, on the other hand, is unaffected, so the default setting is fine.
The final address to be requested by OvenMediaEngine is generated by combining the configured Url and user's request except for Location. For example, if the following is set
If a user requests http://edge.com/edge_app/stream, OvenMediaEngine builds the address ovt://origin.com:9000/origin_app/stream.
OriginMapStore is designed to make it easier to support autoscaling within a cluster. All Origin servers and Edge servers in the cluster share stream information and origin OVT URLs through Redis. That is, when a stream is created on the Origin server, the Origin server registers the app/stream name and the OVT URL used to access the stream with the Redis server. The Edge gets the OVT URL corresponding to the app/stream from the Redis server when a user's playback request comes in.
This means that existing settings do not need to be updated when extending Origin servers and Edge servers. Therefore, all Origins can be grouped into one domain, and all Edges can be bundled with one domain. OriginMapStore allows you to expand Origins or Edges within a cluster without any additional configuration.
OriginMapStore functionality has been tested with Redis Server 5.0.7. You can enable this feature by adding the following settings to Server.xml of Origin and Edge. Note that must be set in Server.xml of the Origin server. This is used when Origin registers its own OVT url, so you just need to set a domain name or IP address that can be accessed as an OVT publisher.
When you are configuring Load Balancer, you need to use third-party solutions such as L4 Switch, LVS, or GSLB, but we recommend using DNS Round Robin. Also, services such as cloud-based AWS Route53, Azure DNS, or Google Cloud DNS can be a good alternative.
OvenMediaEngine can generate thumbnails from live streams. This allows you to organize a broadcast list on your website or monitor multiple streams at the same time.
Thumbnails are published via HTTP(s). Set the port for thumbnails as follows. Thumbnail publisher can use the same port number as HLS and DASH.
In order to publish thumbnails, an encoding profile for thumbnails must be set. JPG and PNG are supported as codecs, and the framerate and resolution can be adjusted. Framerate is the number of thumbnails extracted per second; we recommend 1. Thumbnail encoding uses a lot of resources, so increasing this value excessively can cause failures due to excessive system resource usage. The resolution can be set as desired; if the aspect ratio differs from the input image, the thumbnail is stretched. We plan to support various ratio modes in the future.
Declare a thumbnail publisher. Cross-domain settings are available as a detailed option.
When the setting is made for the thumbnail and the stream is input, you can view the thumbnail through the following URL.
The REST APIs provided by OME allow you to query or change settings such as VirtualHost and Application/Stream.
The APIs are currently beta version, so there are some limitations/considerations.
Settings of VirtualHost can only be viewed and cannot be changed or deleted.
If you add/change/delete the settings of the App/Output Profile by invoking the API, the app will be restarted. This means that all sessions associated with the app will be disconnected.
The API version is fixed with v1 until the experimental stage is complete, and the detailed specification can be changed at any time.
By default, OvenMediaEngine's APIs are disabled, so the following settings are required to use them:
Set the <Port> to be used by the API server. If you omit <Port>, the API server's default port 8081 is used.
<Host> sets the host name and TLS certificate information to be used by the API server, and <AccessToken> sets the token used for authentication when calling the APIs. You must use this token to invoke OvenMediaEngine's APIs.
If you face a CORS problem when calling the OME API from your browser, you can set <CrossDomains> as follows:
If the protocol is omitted, as in *.airensoft.com, both HTTP and HTTPS are supported.
In this manual, the following format is used when describing the API.
GET
http://<OME_HOST>:<API_PORT>/<VERSION>/<API_PATH>[/...]
Here is the description of the API
Request Example:
- Method: GET
- URL: http://1.2.3.4:8081/v1/vhost
- Header:
authorization: Basic b21ldGVzdA==
This means the IP or domain of the server on which your OME is running.
This means the port number of the API you set up in Server.xml. The default value is 8081.
Indicates the version of the API. Currently, all APIs are v1.
Indicates the API path to be called. The API path is usually in the following form:
resource means an item, such as VirtualHost or Application, and action is used to command an action on a specific resource, such as push or record.
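An API call can be sketched in a few lines. OME uses the Basic HTTP authentication scheme, so the AccessToken is Base64-encoded into the Authorization header; the Basic credential in the example above, b21ldGVzdA==, is the Base64 encoding of "ometest". The host, port, and token below are illustrative:

```python
import base64
import json
import urllib.request

def basic_auth_header(access_token: str) -> str:
    # OME expects the AccessToken Base64-encoded in a Basic auth header
    return "Basic " + base64.b64encode(access_token.encode()).decode()

def call_ome_api(host: str, port: int, path: str, access_token: str):
    """Call an OvenMediaEngine REST API and return the parsed JSON body."""
    req = urllib.request.Request(
        f"http://{host}:{port}/v1/{path}",
        headers={"Authorization": basic_auth_header(access_token)},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example (requires a running server): call_ome_api("1.2.3.4", 8081, "vhosts", "ometest")
```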
VirtualHost settings created by Server.xml cannot be modified through the API. This rule also applies to the Application/OutputStream, etc. within that VirtualHost. So, if you call a POST/PUT/DELETE API for a VirtualHost/Application/OutputProfile declared in Server.xml, it fails with a 403 Forbidden error.
OvenMediaEngine can record live streams. You can start and stop recording the output stream through REST API. When the recording is complete, a recording information file is created together with the recorded file so that the user can perform various post-recording processing.
To enable recording, add the <FILE> publisher to the configuration file as shown below. <FilePath> and <InfoPath> are required and serve as default values. <FilePath> sets the file path and file name, and <InfoPath> sets the path and name of the XML file that contains information about the recorded files. If no file path is given in the parameters when requesting recording through the API, recording uses these defaults. This is useful if, for security reasons, you do not want to specify the file path when calling the API to avoid exposing the server's internal paths. <RootPath> is an optional parameter; it is used when you want to request with a relative path in the API, and it is applied to <FilePath> and <InfoPath> as in the example below.
You must specify .ts or .mp4 at the end of the FilePath string to select a container for the recording file. We recommend .ts unless you have a special case, because the VP8 and Opus codecs cannot be recorded into .mp4 due to container limitations.
Various macro values are supported for file paths and names as shown below.
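A <FILE> publisher configuration combining <RootPath>, <FilePath>, <InfoPath>, and the path macros might be sketched as follows (the paths are illustrative; element names follow the description above):

```xml
<Publishers>
    <FILE>
        <!-- Optional; prepended to FilePath and InfoPath for relative API requests -->
        <RootPath>/mnt/shared_volumes</RootPath>
        <!-- .ts chosen so VP8/Opus tracks can also be recorded -->
        <FilePath>/${VirtualHost}/${Application}/${Stream}/${StartTime:YYYYMMDDhhmmss}_${EndTime:YYYYMMDDhhmmss}.ts</FilePath>
        <InfoPath>/${VirtualHost}/${Application}/${Stream}.xml</InfoPath>
    </FILE>
</Publishers>
```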
For control of recording, use the REST API. Recording can be requested based on the output stream name (specified in the JSON body), and all/some tracks can be selectively recorded. And, it is possible to simultaneously record multiple files for the same stream. When recording is complete, an XML file is created at the path specified in InfoPath. For a sample of the recorded file information XML, refer to Appendix A.
For how to use the API, please refer to the link below.
Split recording supports two methods: interval and schedule. The interval method splits files based on the accumulated recording time. The schedule method splits files according to a scheduling option based on system time; the option uses the same pattern as crontab, but only three fields are used: seconds/minutes/hours.
The interval and schedule methods cannot be used simultaneously.
The following is a sample of an XML file that expresses information on a recorded file.
OvenMediaEngine API uses Basic HTTP Authentication Scheme to authenticate clients.
All response results are provided in the HTTP status code and response body. If a response contains multiple results, the HTTP status code is 207 MultiStatus. The API response data is in the form of an array of responses or a single response, as follows:
Element | Description
---|---
ControlServerUrl | The HTTP server to receive the query. HTTP and HTTPS are available.
SecretKey | The secret key used when signing with HMAC-SHA1. For more information, see Security.
Timeout | Time to wait for a response after the request (in milliseconds)
Enables | Enable Providers and Publishers to use AdmissionWebhooks
Element | Sub-Element | Description
---|---|---
client | | Information about the client who requested the connection.
 | address | Client's IP address
 | port | Client's port number
request | | Information about the client's request
 | direction | incoming: a client requests to publish a stream / outgoing: a client requests to play a stream
 | protocol | webrtc, srt, rtmp, hls, dash, lldash
 | status | opening: a client requests to open a stream / closing: a client closed the stream
 | url | URL requested by the client
 | time | Time requested by the client (ISO8601 format)
Protocol | Condition
---|---
WebRTC | When a client requests Offer SDP
RTMP | When a client sends a publish message
SRT | When a client sends a streamid
LLHLS | When a client requests a playlist (llhls.m3u8)
Key | Value | Description
---|---|---
allowed (required) | true or false | Allows or rejects the client's request.
new_url (optional) | <String> | Redirects the client to a new URL. However, the scheme, port, and file cannot differ from the request; the host can only be changed to another virtual host on the same server.
lifetime (optional) | <Number> Milliseconds | The amount of time that a client can maintain a connection (publishing or playback). 0 means infinity. HTTP-based streaming (HLS, DASH, LLDASH) does not keep a connection, so this value does not apply.
reason (optional) | <String> | If allowed is false, this is output to the log.
Key | Value | Description
---|---|---
url_expire (Required) | <Number> Milliseconds since unix epoch | The time the URL expires. Requests after expiration are rejected.
url_activate (Optional) | <Number> Milliseconds since unix epoch | The time the URL activates. Requests before activation are rejected.
stream_expire (Optional) | <Number> Milliseconds since unix epoch | The time the stream expires. Transmission and playback stop when the time expires.
allow_ip (Optional) | <String> IPv4 CIDR | Allowed IP address range, e.g. 192.168.0.0/24
Element | Description
---|---
PolicyQueryKeyName | The query string key name in the URL pointing to the policy value
SignatureQueryKeyName | The query string key name in the URL pointing to the signature value
SecretKey | The secret key used when signing with HMAC-SHA1
Enables | List of providers and publishers that enable SignedPolicy. Currently, SignedPolicy supports rtmp among providers; among publishers, WebRTC and LLHLS are supported.
Method | URL Pattern
---|---
GET | http(s)://<ome_host>:<port>/<app_name>/<output_stream_name>/thumb.<jpg|png>
Macro | Description
---|---
${TransactionId} | Unique ID for the recording transaction. It is created automatically when recording starts and released when recording stops. In split recording, it identifies files belonging to the same transaction.
${Id} | User-defined identification ID
${StartTime:YYYYMMDDhhmmss} | Recording start time. YYYY - year, MM - month, DD - day, hh - hours (00~23), mm - minutes (00~59), ss - seconds (00~59)
${EndTime:YYYYMMDDhhmmss} | Recording end time. YYYY - year, MM - month, DD - day, hh - hours (00~23), mm - minutes (00~59), ss - seconds (00~59)
${VirtualHost} | Virtual host name
${Application} | Application name
${SourceStream} | Source stream name
${Stream} | Output stream name
${Sequence} | Sequence value that increases each time a file is split within a single transaction
ApplicationType - Application type. Examples: "live", "vod"
Codec - Codecs. Examples: "h264", "h265", "vp8", "opus", "aac"
StreamSourceType - A type of input stream. Examples: "Ovt", "Rtmp", "Rtspc", "RtspPull", "MpegTs"
MediaType - Media type. Examples: "video", "audio"
SessionState - A state of the session. Examples: "Ready", "Started", "Stopping", "Stopped", "Error"
AudioLayout - Audio layout. Examples: "stereo", "mono"
POST
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps
Creates Applications in the VirtualHost
Request Example:
POST http://1.2.3.4:8081/v1/vhosts/default/apps
[ { "name": "app", "type": "live", "outputProfiles": { "outputProfile": [ { "name": "bypass_profile", "outputStreamName": "${OriginStreamName}", "encodes": { "videos": [ { "bypass": true } ], "audios": [ { "bypass": true } ] } } ] } ]
GET
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps
Lists all application names in the VirtualHost
Request Example:
GET http://1.2.3.4:8081/v1/vhosts/default/apps
GET
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}
Gets the configuration of the Application
Request Example:
GET http://1.2.3.4:8081/v1/vhosts/default/apps/app
PUT
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}
Changes the configuration of the Application
Request Example:
PUT http://1.2.3.4:8081/v1/vhosts/default/apps/app
{
"type": "live"
}
DELETE
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}
Deletes the Application
Request Example:
DELETE http://1.2.3.4:8081/v1/vhosts/default/apps/app
POST
http://<OME_HOST>:<API_PORT>/v1/vhosts
Creates VirtualHosts
Request Example:
POST http://1.2.3.4:8081/v1/vhosts
[ { "name": "default" } ]
Return type: Response<VirtualHost>
GET
http://<OME_HOST>:<API_PORT>/v1/vhosts
Lists all virtual host names
Request Example:
GET http://1.2.3.4:8081/v1/vhosts
Return type: Response<List>
GET
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}
Gets the configuration of the VirtualHost
Request Example:
GET http://1.2.3.4:8081/v1/vhosts/default
It allows you to insert events into streams. Right now events only support the ID3v2 format, and only the LLHLS publisher handles them. Events delivered to the LLHLS publisher are inserted as `emsg` boxes within the m4s container.
POST
http[s]://{Host}/v1/vhosts/{vhost name}/apps/{app name}/streams/{stream name}:sendEvent
Body
- `eventFormat`: Currently only `id3v2` is supported.
- `eventType` (Optional, Default: `event`): Select one of `event`, `video`, and `audio`. `event` inserts an event into every track, `video` inserts events only into tracks of video type, and `audio` inserts events only into tracks of audio type.
- `events`: Accepts only JSON array format and can contain multiple events.
  - `frameType`: Currently, only TXXX and T??? (Text Information Frames, e.g. TIT2) are supported.
  - `info`: This field is used only with TXXX and is entered in the Description field of TXXX.
  - `data`: If the frameType is TXXX, it is entered in the Value field; if the frameType is "T???", it is entered in the Information field.
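Putting these fields together, a `:sendEvent` request body might look like the following sketch (the `info` and `data` values are placeholders, not prescribed values):

```python
import json

# Illustrative :sendEvent body. eventFormat must be "id3v2";
# eventType is one of "event", "video", "audio" (default "event").
send_event_body = {
    "eventFormat": "id3v2",
    "eventType": "event",
    "events": [
        {
            "frameType": "TXXX",       # TXXX uses both info and data
            "info": "placeholder-description",   # TXXX Description field
            "data": "placeholder-value"          # TXXX Value field
        }
    ]
}
print(json.dumps(send_event_body, indent=2))
```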
POST
http[s]://{Host}/v1/vhosts/<vhost_name>/apps/<app_name>/streams/<stream_name>:startHlsDump
Body
outputStreamName (required)
 The name of the output stream created with OutputProfile.
id (required)
 ID for this API request.
outputPath (required)
 Directory path to output. The directory must be writable by the OME process. OME will create the directory if it doesn't exist.
playlist (optional)
 Dump the master playlist set in outputPath. It must be entered in Json array format, and multiple playlists can be specified.
infoFile (optional)
 This is the name of the DB file in which the information of the dumped files is updated. If this value is not provided, no file is created. An error occurs if a file with the same name exists. (More details below)
userData (optional)
 If infoFile is specified, this data is written to infoFile. Does not work if infoFile is not specified.
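Combining the fields above, a `:startHlsDump` body could be sketched as follows (the id, paths, and playlist name are placeholder values for illustration):

```python
import json

# Illustrative :startHlsDump body; all values are placeholders.
start_hls_dump_body = {
    "outputStreamName": "stream",           # required: output stream name
    "id": "dump_id_001",                    # required: ID for this API request
    "outputPath": "/tmp/hls_dump/",         # required: writable directory
    "playlist": ["llhls.m3u8"],             # optional: playlists to dump (JSON array)
    "infoFile": "/tmp/hls_dump/info.xml",   # optional: DB file for dump info
    "userData": "my-app-data"               # optional: written into infoFile
}
print(json.dumps(start_hls_dump_body, indent=2))
```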
POST
http[s]://{Host}/v1/vhosts/<vhost_name>/apps/<app_name>/streams/<stream_name>:stopHlsDump
Body
outputStreamName (required)
 The name of the output stream created with OutputProfile.
id (optional)
 This is the id passed when calling the startHlsDump API. If no id is passed, all dumps in progress for the outputStreamName are aborted.
The info file is updated continuously as dump files are written. It is in XML format, and new entries keep being appended as files are dumped.
GET
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}/streams
Lists all stream names in the Application
Request Example:
GET http://1.2.3.4:8081/v1/vhosts/default/apps/app/streams
GET
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}/streams/{stream_name}
Gets the configuration of the Stream
Request Example:
GET http://1.2.3.4:8081/v1/vhosts/default/apps/app/streams/stream
| Type | Description | Examples |
|---|---|---|
| Short | 16-bit integer | 12345 |
| Int | 32-bit integer | 1234941932 |
| Long | 64-bit integer | 3918598189239192323 |
| Float | 64-bit real number | 3.5483 |
| String | A string | "Hello" |
| Bool | true/false | true |
| Timestamp (String) | A timestamp in ISO8601 format | "2021-01-01T11:00:00.000+09:00" |
| TimeInterval (Long) | A time interval (unit: milliseconds) | 349820 |
| IP (String) | IP address | "127.0.0.1" |
| RangedPort (String) | Port numbers with range (can contain multiple ports and protocols): start_port[-end_port][,start_port[-end_port]...][/protocol] | "40000-40005/tcp", "40000-40005", "40000-40005,10000,20000/tcp" |
| Port (String) | A port number: start_port[/protocol] | "1935/tcp", "1935" |
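The RangedPort grammar above can be made concrete with a small parser sketch (illustrative only, not OME's actual parser; it applies the trailing protocol to the whole value):

```python
def parse_ranged_port(value: str):
    """Parse the RangedPort string format described above:
    start_port[-end_port][,start_port[-end_port]...][/protocol]
    Returns (list of (start, end) tuples, protocol or None).
    """
    if "/" in value:
        ports_part, protocol = value.rsplit("/", 1)
    else:
        ports_part, protocol = value, None
    ranges = []
    for part in ports_part.split(","):
        if "-" in part:
            start, end = part.split("-", 1)
            ranges.append((int(start), int(end)))
        else:
            ranges.append((int(part), int(part)))  # single port: start == end
    return ranges, protocol

print(parse_ranged_port("40000-40005,10000,20000/tcp"))
# ([(40000, 40005), (10000, 10000), (20000, 20000)], 'tcp')
```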
| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| authorization | string | A string for authentication in `Basic Base64(AccessToken)` format. For example, `Basic b21lLWFjY2Vzcy10b2tlbg==` if the access token is `ome-access-token`. |
| (json body) | array | A list of Application |

| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| authorization | string | A string for authentication in `Basic Base64(AccessToken)` format. For example, `Basic b21lLWFjY2Vzcy10b2tlbg==` if the access token is `ome-access-token`. |

| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| app_name | string | A name of Application |
| authorization | string | A string for authentication in `Basic Base64(AccessToken)` format. For example, `Basic b21lLWFjY2Vzcy10b2tlbg==` if the access token is `ome-access-token`. |

| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| app_name | string | A name of Application |
| authorization | string | A string for authentication in `Basic Base64(AccessToken)` format. For example, `Basic b21lLWFjY2Vzcy10b2tlbg==` if the access token is `ome-access-token`. |
| (json body) | object | Application |

| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| app_name | string | A name of Application |
| authorization | string | A string for authentication in `Basic Base64(AccessToken)` format. For example, `Basic b21lLWFjY2Vzcy10b2tlbg==` if the access token is `ome-access-token`. |
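The `authorization` value above is just the standard HTTP Basic scheme applied to the access token, so it can be computed like this:

```python
import base64

def basic_auth_header(access_token: str) -> str:
    """Build the Authorization header value for the OME REST API:
    'Basic ' + Base64(AccessToken)."""
    return "Basic " + base64.b64encode(access_token.encode()).decode()

print(basic_auth_header("ome-access-token"))
# Basic b21lLWFjY2Vzcy10b2tlbg==
```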
| Type | Name | Optional | Description |
|---|---|---|---|
| Int | statusCode | N | Status code |
| String | message | N | A message describing the value returned |
| - | response | Y | A response data |

| Type | Name | Optional | Description |
|---|---|---|---|
| String | name | N | A name of Virtual Host |
| Host | host | Y | Host |
| SignedPolicy | signedPolicy | Y | SignedPolicy |
| SignedToken | signedToken | Y | SignedToken |
| List<OriginMap> | origins | Y | A list of Origin map |

| Type | Name | Optional | Description |
|---|---|---|---|
| List<String> | names | N | A list of hosts |
| Tls | tls | Y | TLS |

| Type | Name | Optional | Description |
|---|---|---|---|
| String | certPath | N | A path of cert file |
| String | keyPath | N | A path of private key file |
| String | chainCertPath | Y | A path of chain cert file |

| Type | Name | Optional | Description |
|---|---|---|---|
| String | policyQueryKey | N | |
| String | signatureQueryKey | N | |
| String | secretKey | N | |

| Type | Name | Optional | Description |
|---|---|---|---|
| String | cryptoKey | N | |
| String | queryStringKey | N | |

| Type | Name | Optional | Description |
|---|---|---|---|
| String | location | N | A pattern to map origin |
| Pass | pass | N | What to request from the Origin if the pattern matches |

| Type | Name | Optional | Description |
|---|---|---|---|
| String | scheme | N | Scheme to distinguish the provider |
| List<String> | urls | N | An address list to pull from the provider |
| Type | Name | Optional | Description |
|---|---|---|---|
| String | name | N | App name (You cannot change this value after you create it) |
| Bool | dynamic | N | Whether the app was created dynamically by pulling a stream |
| Enum<ApplicationType> | type | N | App type |
| Providers | providers | Y | A list of providers |
| Publishers | publishers | Y | A list of publishers |
| List<OutputProfile> | outputProfiles | Y | A list of OutputProfile |

| Type | Name | Optional | Description |
|---|---|---|---|
| RtmpProvider | rtmp | Y | |
| RtspPullProvider | rtspPull | Y | |
| RtspProvider | rtsp | Y | |
| OvtProvider | ovt | Y | |
| MpegTsProvider | mpegts | Y | |

RtmpProvider, RtspPullProvider, RtspProvider, and OvtProvider: (Reserved for future use)

| Type | Name | Optional | Description |
|---|---|---|---|
| List<MpegTsStream> | streams | Y | MPEG-TS Stream map |

| Type | Name | Optional | Description |
|---|---|---|---|
| String | name | N | A name to generate when an MPEG-TS stream is received |
| RangedPort | port | Y | MPEG-TS Port |
| Type | Name | Optional | Description |
|---|---|---|---|
| Int | threadCount | N | Number of threads |
| RtmpPushPublisher | rtmpPush | Y | |
| HlsPublisher | hls | Y | |
| DashPublisher | dash | Y | |
| LlDashPublisher | llDash | Y | |
| WebrtcPublisher | webrtc | Y | |
| OvtPublisher | ovt | Y | |
| FilePublisher | file | Y | |
| ThumbnailPublisher | thumbnail | Y | |

RtmpPushPublisher: (Reserved for future use)

| Type | Name | Optional | Description |
|---|---|---|---|
| Int | segmentCount | N | Segment count in the playlist.m3u8 |
| Int | segmentDuration | N | Segment duration (unit: seconds) |
| List<String> | crossDomains | Y | Cross domain URLs |

| Type | Name | Optional | Description |
|---|---|---|---|
| Int | segmentCount | N | Segment count in the manifest.mpd |
| Int | segmentDuration | N | Segment duration (unit: seconds) |
| List<String> | crossDomains | Y | Cross domain URLs |

| Type | Name | Optional | Description |
|---|---|---|---|
| Int | segmentDuration | N | Segment duration (unit: seconds) |
| List<String> | crossDomains | Y | Cross domain URLs |

| Type | Name | Optional | Description |
|---|---|---|---|
| TimeInterval | timeout | Y | ICE timeout (unit: seconds) |

OvtPublisher: (Reserved for future use)

| Type | Name | Optional | Description |
|---|---|---|---|
| String | filePath | Y | A path to store the recorded file. You can use the macros listed above. |
| String | fileInfoPath | Y | A path of recorded file information |

| Type | Name | Optional | Description |
|---|---|---|---|
| List<String> | crossDomains | Y | Cross domain URLs |
| Type | Name | Optional | Description |
|---|---|---|---|
| String | name | N | A name of OutputProfile |
| String | outputStreamName | N | A name of output stream |
| Encodes | encodes | Y | |

| Type | Name | Optional | Description |
|---|---|---|---|
| List<Video> | videos | Y | |
| List<Audio> | audios | Y | |
| List<Image> | images | Y | |

| Type | Name | Optional | Description |
|---|---|---|---|
| Bool | bypass | Y | |
| Enum<Codec> | codec | Conditional | Video codec |
| Int | width | Conditional | |
| Int | height | Conditional | |
| String | bitrate | Conditional | Bitrate (you can use a "K" or "M" suffix, like "3M") |
| Float | framerate | Conditional | |

| Type | Name | Optional | Description |
|---|---|---|---|
| Bool | bypass | Y | |
| Enum<Codec> | codec | Conditional | Audio codec |
| Int | samplerate | Conditional | |
| Int | channel | Conditional | |
| String | bitrate | Conditional | Bitrate (you can use a "K" or "M" suffix, like "128K") |

| Type | Name | Optional | Description |
|---|---|---|---|
| Enum<Codec> | codec | N | |
| Int | width | Conditional | |
| Int | height | Conditional | |
| Float | framerate | N | An interval of image creation |
| Type | Name | Optional | Description |
|---|---|---|---|
| String | name | N | A name of stream |
| InputStream | input | N | Information about the input stream |
| List<OutputStream> | outputs | N | Information about the output streams |

| Type | Name | Optional | Description |
|---|---|---|---|
| String | name | N | A name of stream to create |
| Pull | pull | Y | Pull |
| - | mpegts | Y | Creates a stream from MPEG-TS input |

| Type | Name | Optional | Description |
|---|---|---|---|
| String | url | N | URL to pull |

| Type | Name | Optional | Description |
|---|---|---|---|
| String | agent | Y | A name of broadcast tool |
| String | from | N | URI the stream was created from |
| String | to | Y | URI representing the connection with the input |
| List<Track> | tracks | N | A list of tracks in the input stream |
| Timestamp | createdTime | N | Creation time |

| Type | Name | Optional | Description |
|---|---|---|---|
| String | name | N | A name of the output stream |
| List<Track> | tracks | N | A list of tracks in the output stream |

| Type | Name | Optional | Description |
|---|---|---|---|
| Enum<MediaType> | type | Y | Media type |
| VideoTrack | video | Conditional | A configuration of video encoding |
| AudioTrack | audio | Conditional | A configuration of audio encoding |

| Type | Name | Optional | Description |
|---|---|---|---|
| (Extends Video) | - | - | |
| Timebase | timebase | Y | Timebase |

| Type | Name | Optional | Description |
|---|---|---|---|
| Int | num | N | Numerator |
| Int | den | N | Denominator |

| Type | Name | Optional | Description |
|---|---|---|---|
| (Extends Audio) | - | - | |
| Timebase | timebase | Y | Timebase |
| Type | Name | Optional | Description |
|---|---|---|---|
| String | id | Y | Unique identifier |
| - | streams | N | A combination of the output stream's track name and track ID |
| Enum<SessionState> | state | N | Record state |
| String | filePath | N | A path of recorded files |
| String | fileInfoPath | N | A path of recorded file information |
| String | recordedBytes | N | Recorded bytes |
| Int | recordedTime | N | Recorded time |
| Timestamp | startTime | N | Start time |
| Timestamp | finishTime | N | Finish time |
| Int | bitrate | N | Average bitrate |

| Type | Name | Optional | Description |
|---|---|---|---|
| String | id | N | Unique identifier |
| - | stream | Y | A combination of the output stream's track name and track ID |
| Enum<StreamSourceType> | protocol | Y | Protocol of the input stream |
| String | url | Y | Destination URL |
| String | streamKey | Conditional | Stream key of the destination |
| Enum<SessionState> | state | N | Push state |
| Int | sentBytes | N | Sent bytes |
| Int | sentPackets | N | Sent packets count |
| Int | sentErrorBytes | N | Error bytes |
| Int | sentErrorPackets | N | Error packets count |
| Int | reconnect | N | Reconnect count |
| Timestamp | startTime | N | Start time |
| Timestamp | finishTime | N | Finish time |
| Int | bitrate | N | Average bitrate |
| Type | Name | Optional | Description |
|---|---|---|---|
| Timestamp | createdTime | N | Creation time |
| Timestamp | lastUpdatedTime | N | Modified time |
| Long | totalBytesIn | N | Received bytes |
| Long | totalBytesOut | N | Sent bytes |
| Int | totalConnections | N | Current connections |
| Int | maxTotalConnections | N | Max connections since the stream was created |
| Timestamp | maxTotalConnectionTime | N | When the maximum number of concurrent connections was last updated |
| Timestamp | lastRecvTime | N | Last time data was received |
| Timestamp | lastSentTime | N | Last time data was sent |

| Type | Name | Optional | Description |
|---|---|---|---|
| (Extends CommonMetrics) | - | - | Includes all fields of CommonMetrics |
| TimeInterval | requestTimeToOrigin | Y | Elapsed time to connect to the origin |
| TimeInterval | responseTimeFromOrigin | Y | Elapsed time for the origin to respond |
| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| app_name | string | A name of Application |
| authorization | string | A string for authentication in `Basic Base64(AccessToken)` format |

| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| app_name | string | A name of Application |
| stream_name | string | A name of Stream |
| authorization | string | A string for authentication in `Basic Base64(AccessToken)` format |
POST
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}/outputProfiles
Creates OutputProfiles in the Application
Request Example:
POST http://1.2.3.4:8081/v1/vhosts/default/apps/app/outputProfiles
[
{
"name": "bypass_profile",
"outputStreamName": "${OriginStreamName}",
"encodes": {
"videos": [
{
"bypass": true
}
],
"audios": [
{
"bypass": true
}
]
}
}
]
GET
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}/outputProfiles
Lists all output profile names in the Application
Request Example:
GET http://1.2.3.4:8081/v1/vhosts/default/apps/app/outputProfiles
GET
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}/outputProfiles/{profile_name}
Gets the configuration of the OutputProfile
Request Example:
GET http://1.2.3.4:8081/v1/vhosts/default/apps/app/outputProfiles/bypass_profile
PUT
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}/outputProfiles/{profile_name}
Changes the configuration of the OutputProfile
Request Example:
PUT http://1.2.3.4:8081/v1/vhosts/default/apps/app/outputProfiles/bypass_profile
{
"outputStreamName": "${OriginStreamName}",
"encodes": {
"videos": [
{
"codec": "h264",
"bitrate": "3M",
"width": 1280,
"height": 720,
"framerate": 30
}
],
"audios": [
{
"bypass": true
}
]
}
}
DELETE
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}/outputProfiles/{profile_name}
Deletes the OutputProfile
Request Example:
DELETE http://1.2.3.4:8081/v1/vhosts/default/apps/app/outputProfiles/bypass_profile
POST
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}:startRecord
Description of the Start Recording API
Example - Recording by Output Stream Name
POST http[s]://{host}/v1/vhosts/default/apps/app:startRecord
{
"id": "{unique_record_id}",
"stream": {
"name": "{output_stream_name}",
}
}
Example - Recording by Output Stream Name with Track Ids
POST http[s]://{host}/v1/vhosts/default/apps/app:startRecord
{
"id": "{unique_record_id}",
"stream": {
"name": "{output_stream_name}",
"trackIds": [ 100, 200 ]
}
}
Example - Recording by Output Stream Name with Variant Names
POST http[s]://{host}/v1/vhosts/default/apps/app:startRecord
{
"id": "{unique_record_id}",
"stream": {
"name": "{output_stream_name}",
"variantNames": [ "h264_fhd", "aac" ]
}
}
* variantName means Application.OutputProfiles.OutputProfile.Encodes.[Video|Audio|Data].Name in the Server.xml configuration file.
Example - Split Recording by Interval
POST http[s]://{host}/v1/vhosts/default/apps/app:startRecord
{
"id": "{unique_record_id}",
"stream": {
"name": "{output_stream_name}"
},
"interval": 60000,
"segmentationRule": "discontinuity"
}
Example - Split Recording by Schedule
POST http[s]://{host}/v1/vhosts/default/apps/app:startRecord
{
"id": "{unique_record_id}",
"stream": {
"name": "{output_stream_name}"
},
"schedule": "0 */1 *",
"segmentationRule": "continuity"
}
POST
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}:stopRecord
Description of the Stop Recording API
Request Example
POST http[s]://{host}/v1/vhosts/default/apps/app:stopRecord
{
"id": "{unique_record_id}"
}
POST
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}:records
Description of the Recording Status API
Request Example:
POST http[s]://{host}/v1/vhosts/default/apps/app:records
{
"id" : "{unique_record_id}"
}
GET
http://<OME_HOST>:<API_PORT>/v1/stats/current/vhosts/{vhost_name}
Usage statistics of the VirtualHost
Request Example:
GET http://1.2.3.4:8081/v1/stats/current/vhosts/default
GET
http://<OME_HOST>:<API_PORT>/v1/stats/current/vhosts/{vhost_name}/apps/{app_name}
Usage statistics of the Application
Request Example:
GET http://1.2.3.4:8081/v1/stats/current/vhosts/default/apps/app
GET
http://<OME_HOST>:<API_PORT>/v1/stats/current/vhosts/{vhost_name}/apps/{app_name}/streams/{stream_name}
Usage statistics of the Stream
Request Example:
GET http://1.2.3.4:8081/v1/stats/current/vhosts/default/apps/app/streams/stream
POST
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}:startPush
Example - RTMP push publishing by Output Stream Name
POST http[s]://{host}/v1/vhosts/default/apps/app:startPush
{
"id": "{unique_push_id}",
"stream": {
"name": "{output_stream_name}"
},
"protocol": "rtmp",
"url":"rtmp://{host}[:port]/{app_name}",
"streamKey":"{stream_name}"
}
Example - MPEG TS push publishing by Output Stream Name
POST http[s]://{host}/v1/vhosts/default/apps/app:startPush
{
"id": "{unique_push_id}",
"stream": {
"name": "{output_stream_name}"
},
"protocol": "mpegts",
"url":"udp://{host}[:port]",
"streamKey":""
}
Example - Push publishing by Output Stream Name and Track Ids
POST http[s]://{host}/v1/vhosts/default/apps/app:startPush
{
"id": "{unique_push_id}",
"stream": {
"name": "{output_stream_name}",
"trackIds": [ 101, 102 ]
},
"protocol": "rtmp",
"url":"rtmp://{host}[:port]/{appName}",
"streamKey":"{stream_name}"
}
Example - Push publishing by Output Stream Name and Variant Names
POST http[s]://{host}/v1/vhosts/default/apps/app:startPush
{
"id": "{unique_push_id}",
"stream": {
"name": "{output_stream_name}",
"variantNames": [ "h264_fhd", "aac" ]
},
"protocol": "rtmp",
"url":"rtmp://{host}[:port]/{app_name}",
"streamKey":"{stream_name}"
}
* variantName means Application.OutputProfiles.OutputProfile.Encodes.[Video|Audio|Data].Name in the Server.xml configuration file.
POST
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}:stopPush
Example
POST http[s]://{host}/v1/vhosts/default/apps/app:stopPush
{
"id": "{unique_push_id}"
}
POST
http://<OME_HOST>:<API_PORT>/v1/vhosts/{vhost_name}/apps/{app_name}:pushes
Example
POST http[s]://{host}/v1/vhosts/default/apps/app:pushes
{
"id": "{unique_push_id}"
}
| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| app_name | string | A name of Application |
| authorization | string | A string for authentication in `Basic Base64(AccessToken)` format. For example, `Basic b21lLWFjY2Vzcy10b2tlbg==` if the access token is `ome-access-token`. |
| (json body) | array | List<OutputProfile> |

| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| app_name | string | A name of Application |
| authorization | string | A string for authentication in `Basic Base64(AccessToken)` format. For example, `Basic b21lLWFjY2Vzcy10b2tlbg==` if the access token is `ome-access-token`. |

| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| app_name | string | A name of Application |
| profile_name | string | A name of OutputProfile |
| authorization | string | A string for authentication in `Basic Base64(AccessToken)` format. For example, `Basic b21lLWFjY2Vzcy10b2tlbg==` if the access token is `ome-access-token`. |

| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| app_name | string | A name of Application |
| profile_name | string | A name of OutputProfile |
| authorization | string | A string for authentication in `Basic Base64(AccessToken)` format. For example, `Basic b21lLWFjY2Vzcy10b2tlbg==` if the access token is `ome-access-token`. |
| (json body) | object | OutputProfile |

| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| app_name | string | A name of Application |
| profile_name | string | A name of OutputProfile |
| authorization | string | A string for authentication in `Basic Base64(AccessToken)` format. For example, `Basic b21lLWFjY2Vzcy10b2tlbg==` if the access token is `ome-access-token`. |
| Name | Type | Description |
|---|---|---|
| vhost_name* | string | A name of VirtualHost |
| app_name* | string | A name of Application |
| authorization* | string | A string for authentication in `Basic Base64(AccessToken)` format |
| segmentationRule | string | Defines whether timestamps are continuous across split recorded files: continuity or discontinuity (default) |
| id* | string | A unique identifier for the recording job |
| stream* | object | Output stream |
| name* | string | Output stream name |
| trackIds | array | Used for recording specific track IDs |
| schedule | string | Schedule-based split recording settings, like a crontab entry. Cannot be used together with interval. Format: <second minute hour> |
| interval | number | Interval-based split recording settings. Cannot be used together with schedule. Unit: milliseconds |
| filePath | string | Path of the file to be recorded. Format: see Config Settings |
| infoPath | string | Path of the information file to be recorded. Format: see Config Settings |
| variantNames | array | Used for recording specific variant names |
| Name | Type | Description |
|---|---|---|
| vhost_name* | string | A name of VirtualHost |
| app_name* | string | A name of Application |
| authorization* | string | A string for authentication in `Basic Base64(AccessToken)` format |
| id* | string | A unique identifier for the recording job |

| Name | Type | Description |
|---|---|---|
| vhost_name* | string | A name of VirtualHost |
| app_name* | string | A name of Application |
| authorization* | string | A string for authentication in `Basic Base64(AccessToken)` format |
| id | string | A unique identifier for the recording job. If no value is specified, the status of every recording job is requested. |
| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| access_token | string | A token for authentication |

| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| app_name | string | A name of Application |
| access_token | string | A token for authentication |

| Name | Type | Description |
|---|---|---|
| vhost_name | string | A name of VirtualHost |
| app_name | string | A name of Application |
| stream_name | string | A name of Stream |
| access_token | string | A token for authentication |
| Name | Type | Description |
|---|---|---|
| vhost_name* | string | A name of VirtualHost |
| app_name* | string | A name of Application |
| authorization* | string | A string for authentication in `Basic Base64(AccessToken)` format. For example, `Basic b21lLWFjY2Vzcy10b2tlbg==` if the access token is `ome-access-token`. |
| id* | string | Unique identifier of the push publishing job |
| stream* | object | Output stream for push |
| name* | string | Output stream name |
| trackIds | array | Used for push publishing specific track IDs |
| protocol* | string | Transport protocol: rtmp or mpegts |
| url* | string | Destination URL |
| streamKey* | string | Destination stream key (may be empty for mpegts) |
| variantNames | array | Used for push publishing specific variant names |
| Name | Type | Description |
|---|---|---|
| vhost_name* | string | A name of VirtualHost |
| app_name* | string | A name of Application |
| authorization* | string | A string for authentication in `Basic Base64(AccessToken)` format. For example, `Basic b21lLWFjY2Vzcy10b2tlbg==` if the access token is `ome-access-token`. |
| id* | string | Unique identifier of the push publishing job |

| Name | Type | Description |
|---|---|---|
| vhost_name* | string | A name of VirtualHost |
| app_name* | string | A name of Application |
| authorization* | string | A string for authentication in `Basic Base64(AccessToken)` format. For example, `Basic b21lLWFjY2Vzcy10b2tlbg==` if the access token is `ome-access-token`. |
| id | string | Unique identifier of the push publishing job |
To monitor OvenMediaEngine, you can view the log files it generates in real time. You can configure the log type and level by creating the Logger.xml configuration file in the same location as Server.xml.
You can set up Logger.xml as shown in the following example. OvenMediaEngine prints logs separated by many tag names and levels. Set `<Tag name=".*" level="debug">` to have OvenMediaEngine print all logs while you read them, and then disable the tags you don't need.
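A minimal Logger.xml sketch along those lines (the single catch-all tag is from the text above; check the Logger.xml shipped with your installation for the full schema):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Logger>
    <!-- name is a regular expression over tag names;
         level is one of: debug | info | warn | error | critical -->
    <Tag name=".*" level="debug" />
</Logger>
```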
OvenMediaEngine generates log files. If you start OvenMediaEngine by systemctl start ovenmediaengine
, the log file is generated to the following path.
If you run it directly from the command line, it will be generated to the following location:
If you run it in the Docker container, the log file is in the following path:
Following is the example of real logs.
OvenMediaEngine collects the following metrics for each host, application, and stream.
Bytes in/out by protocol
Connections by protocol
Maximum connections and time
Time is taken to connect to origin
You can get the current statistics using the REST API. See Stat API for the statistics REST API.
Files such as webrtc_stat.log and hls_rtsp_xxxx.log that were previously output are deprecated in the current version. We are developing a formal stats file, which will be open in the future.
The Enterprise Edition of OvenMediaEngine has been released to provide commercial codecs. The Enterprise version is distributed for a fee because it includes commercial codecs, and you must agree to the EULA to use it.
In addition, it provides valuable features to managers in a commercial environment, such as the Web Console, and some additional functions not available in the Community Edition of OvenMediaEngine.
All features are identical between the Open Source and Enterprise versions, with the following exceptions.
Enterprise adds:
Commercial libraries
Features developed for specific customers that they have not allowed to be disclosed as open source.
OvenMediaEngine provides a tester for measuring WebRTC performance called OvenRtcTester. It is developed in Go and uses the pion/webrtc/v3 and gorilla/websocket modules. Many thanks to the pion/webrtc and gorilla/websocket teams for these wonderful projects.
Since OvenRtcTester is developed in Go language, Go must be installed on your system. Install Go from the following URL: https://golang.org/doc/install
OvenRtcTester was tested with the latest version of go 1.17.
You can simply run it like this: `-url` is required. If the `-life` option is not used, it will run indefinitely until the user presses `ctrl+c`.
You can also use go build
or go install
depending on your preference.
OvenRtcTester should be run against OvenMediaEngine 0.12.4 or higher. OvenMediaEngine versions below 0.12.4 have a problem with incorrectly calculating the RTP timestamp, which causes OvenRtcTester to compute the Video Delay value incorrectly.
Linux has various tools to monitor CPU usage per thread. We will use the simplest, the top command. If you run top -H -p [pid], you will see the following screen.
You can use OvenRtcTester to test the capacity of the server as shown below. When testing the maximum performance, OvenRtcTester also uses a lot of system resources, so test it separately from the system where OvenMediaEngine is running. Also, it is recommended to test OvenRtcTester with multiple servers. For example, simulate 500 players with -n 500 on one OvenRtcTester, and simulate 2000 players with four servers.
Building and running OvenMediaEngine in debug mode results in very poor performance. Be sure to test the maximum performance using the binary generated by make release && make install .
If the OvenMediaEngine's capacity is exceeded, you will notice it in OvenRtcTester's Summary report with Avg Video Delay
and Avg Audio Delay
or Packet loss
.
On the right side of the above capture screen, we simulate 400 players with OvenRtcTester. <Summary> of OvenRtcTester shows that Avg Video Delay
and Avg Audio Delay
are very high, and Avg FPS
is low.
And on the left, you can check the CPU usage by thread with the top -H -p
command. This confirms that the StreamWorker threads are being used at 100%, and now you can scale the server by increasing the number of StreamWorker threads. If OvenMediaEngine is not using 100% of all cores of the server, you can improve performance by tuning the number of threads.
This is the result of tuning the number of StreamWorkerCount to 8 in config. This time, we simulated 1000 players with OvenRtcTester, and you can see that it works stably.
The WorkerCount in <Bind>
can set the thread responsible for sending and receiving over the socket. Publisher's AppWorkerCount allows you to set the number of threads used for per-stream processing such as RTP packaging, and StreamWorkerCount allows you to set the number of threads for per-session processing such as SRTP encryption.
With AppWorkerCount
, you can set the number of threads for distributed processing of streams when hundreds of streams are created in one application. When an application is requested to create a stream, the stream is evenly attached to one of created threads. The main role of Stream is to packetize raw media packets into the media format of the protocol to be transmitted. When there are thousands of streams, it is difficult to process them in one thread. Also, if StreamWorkerCount is set to 0, AppWorkerCount is responsible for sending media packets to the session.
It is recommended that this value does not exceed the number of CPU cores.
It may be impossible to send data to thousands of viewers in one thread. StreamWorkerCount allows sessions to be distributed across multiple threads and transmitted simultaneously. This means that resources required for SRTP encryption of WebRTC or TLS encryption of HLS/DASH can be distributed and processed by multiple threads. It is recommended that this value not exceed the number of CPU cores.
If a large number of streams are created and very few viewers connect to each stream, increase AppWorkerCount and lower StreamWorkerCount as follows.
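As a sketch (the exact values are illustrative and should be tuned for your hardware), such a configuration might look like this in Server.xml:

```xml
<Application>
    <Publishers>
        <!-- Many streams: distribute stream packetizing across many threads -->
        <AppWorkerCount>32</AppWorkerCount>
        <!-- Few viewers per stream: let the AppWorker threads also send to sessions -->
        <StreamWorkerCount>0</StreamWorkerCount>
    </Publishers>
</Application>
```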
If a small number of streams are created and a very large number of viewers are connected to each stream, lower AppWorkerCount and increase StreamWorkerCount as follows.
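An illustrative sketch of the opposite case (values are examples, not measured recommendations):

```xml
<Application>
    <Publishers>
        <!-- Few streams: one thread is enough for packetizing -->
        <AppWorkerCount>1</AppWorkerCount>
        <!-- Many viewers: spread session processing (SRTP/TLS, transmission) across cores -->
        <StreamWorkerCount>32</StreamWorkerCount>
    </Publishers>
</Application>
```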
We provide a test player so you can make sure OvenMediaEngine works properly. For security reasons, most browsers prohibit loading resources over unsecured HTTP or WebSocket (WS) from a TLS-secured HTTPS page. We have therefore prepared both an HTTP-based and an HTTPS-based player:
When playing Low-Latency DASH, you can control the delay time in the player as shown below. The delay time is closely related to the buffering size: the smaller the value, the shorter the latency, but if it is too small there is no buffer and playback may not be smooth. In a typical network environment, a delay value of 2 is the most stable.
OvenMediaEngine provides P2P Delivery to distribute Edge traffic to players. This feature is currently a preview version, and to use it you only need to use OvenPlayer. We plan to perform more experiments in various real-world environments and then upgrade it to the full version in OvenMediaEngine.
First, some terminology. The peer that sends traffic in the P2P network is called a Host Peer, and the peer that receives traffic from a Host Peer is called a Client Peer. P2P Delivery in OvenMediaEngine never designates a Client Peer as a Host Peer in turn; in other words, it operates at only 1 depth.
According to our experiments so far, P2P Delivery provides the best performance and stability when using 1 Depth to connect between Players and connecting up to two Players to one Player.
In other words, P2P Delivery offloads two-thirds of the existing traffic: since the Edge serves only one of every three viewers directly, it can expand the capacity of the Edge network threefold and reduce traffic costs by two-thirds.
You can use the P2P function of OvenMediaEngine by adding the <P2P>
element as the following settings:
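A minimal sketch of the setting, placed under `<Server>` in Server.xml (the value 2 follows the recommendation above):

```xml
<Server version="8">
    ...
    <P2P>
        <!-- Up to two Client Peers per Host Peer -->
        <MaxClientPeersPerHostPeer>2</MaxClientPeersPerHostPeer>
    </P2P>
</Server>
```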
Also, if you want to use P2P Delivery when your OvenMediaEngine is running in Origin-Edge cluster mode, you need to apply this setting to all Edges. You can instantly test P2P Delivery with OvenPlayer.
`<MaxClientPeersPerHostPeer>` sets the maximum number of Client Peers that can connect to one Host Peer.
When OvenMediaEngine receives a WebRTC connection request from a new player, it determines the Host Peer or Client Peer according to the following rules:
When a Host Peer is disconnected, OvenMediaEngine detects this immediately and reconnects the Client Peers that were connected to that Host Peer to the Edge to ensure stability.
Also, we are preparing a smarter algorithm based on user location, platform performance, and network statistical information for classifying Host Peers or Client Peers.
We will update this document as we gather troubleshooting examples. (Written in Nov 04, 2021)
prerequisites.sh Script Failed

If you have problems with the prerequisites.sh script we have provided, please install the prerequisites manually as follows.
systemctl start ovenmediaengine Failed

If SELinux is running on your system, SELinux can deny the execution of OvenMediaEngine.
You can choose between two methods: adding a policy to SELinux, or setting SELinux to permissive mode. To add a policy, apply the SELinux policy file for the OvenMediaEngine service to your system as follows:
Setting SELinux to permissive mode is as simple as follows. But we don't recommend this method.
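You can switch to permissive mode immediately with `sudo setenforce 0`; to make it persist across reboots, edit the SELinux configuration file as sketched below:

```
# /etc/selinux/config - persistent permissive mode (survives reboots)
SELINUX=permissive
```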
WebRTC does not support the b-frames of H.264, so if your encoder sends b-frames, the video will stutter in the player. In this case, you can solve the problem by disabling b-frames in your encoder. For OBS, you can set the bframes=0 option as below.
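In OBS, this is a single entry in the custom x264 options field of the advanced output settings:

```
bframes=0
```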
Or by activating the encoding options in OvenMediaEngine.
In this case, you are probably streaming over UDP in an environment with high packet loss caused by network performance, connection problems, and so on, and the interruptions during playback will keep getting worse. This problem can often be solved simply by playing over WebRTC/TCP.
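A sketch of the relevant binding (the port is illustrative): enabling a TCP relay lets OvenMediaEngine's embedded TURN server deliver WebRTC over TCP, and OvenPlayer can request it, e.g. with a `?transport=tcp` query on the playback URL.

```xml
<Bind>
    <Publishers>
        <WebRTC>
            <IceCandidates>
                <!-- Relay WebRTC traffic over TCP on this (illustrative) port -->
                <TcpRelay>*:3478</TcpRelay>
            </IceCandidates>
        </WebRTC>
    </Publishers>
</Bind>
```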
If you want to monitor packet loss in your Chrome browser, you can access it by typing 'chrome://webrtc-internals' in the address bar.
Also, if the network of the device running the player isn't fast enough to accommodate the stream's bitrate (BPS), the stuttering won't resolve and the connection will eventually drop. In this case, there is no other solution than to speed up the network.
If the Origin server uses excessive CPU/Memory/Network, all players may experience stuttering during streaming.
When you see that the Origin is CPU-intensive in your Origin-Edge structure, the transcoding options in OvenMediaEngine may be the primary cause. That is, you may have set the quality of the input stream too high, or configured output streams that significantly exceed the capabilities of your hardware. In this case, it can be solved by enabling the hardware encoder in OvenMediaEngine.
If an Edge server uses CPU/Memory/Network excessively, players connected to that Edge may experience stuttering during streaming. In this case, it can be solved by scaling out the Edge servers.
When you see a specific thread overuses the CPU, the video may not stream smoothly. Please refer to the manual below for more information on this.
The default Linux kernel settings cannot handle 1 Gbps of output, so tune them as follows:
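The exact values depend on your NIC and traffic; the fragment below is an illustrative /etc/sysctl.conf tuning of the socket buffer and backlog limits, not a measured recommendation:

```
# /etc/sysctl.conf - illustrative values; apply with `sysctl -p`
net.core.rmem_max = 16777216
net.core.wmem_max = 16777216
net.core.rmem_default = 16777216
net.core.wmem_default = 16777216
net.core.netdev_max_backlog = 65536
```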
Many mobile users are on wireless networks, which offer high speed but, conversely, can suffer high packet loss.
CUBIC, the congestion control algorithm set by default in Linux, shrinks the TCP window in response to packet loss, so it is not well suited to stable streaming in such an environment.
Our suggestion is therefore to use Google's BBR. This setting is even more important if you mainly provide WebRTC services to mobile users on wireless networks. Change the congestion control from CUBIC to BBR on your Linux server.
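For example, via /etc/sysctl.conf (apply with `sysctl -p`); the `fq` queueing discipline is the setting commonly paired with BBR:

```
# /etc/sysctl.conf
net.core.default_qdisc = fq
net.ipv4.tcp_congestion_control = bbr
```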
If you try to access OvenMediaEngine's WebRTC URL starting with ws:// (Non-TLS) from an HTTPS (HTTP/TLS) site, the connection may be rejected due to a mixed content problem depending on the browser.
In this case, you can solve this by installing a certificate in OvenMediaEngine and trying to connect with the wss:// (WebSocket/TLS) URL.
At some point, when the message "Too many open files" is output in your OvenMediaEngine log, it may not be able to handle any more player connections. In this case, you can solve the problem by setting it as follows:
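One common way (values are illustrative) is to raise the open-file limit, e.g. in /etc/security/limits.conf, or via `LimitNOFILE=` in the systemd unit that runs OvenMediaEngine:

```
# /etc/security/limits.conf - illustrative limits for the user running OME
*    soft    nofile    1048576
*    hard    nofile    1048576
```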
If you use Bypass transcoding in OvenMediaEngine and set a long keyframe interval in the encoder, the WebRTC player cannot start streaming until a keyframe arrives.
In this case, you can solve the problem by setting the keyframe interval in the encoder to 1-2 seconds,
or by enabling the encoding options in OvenMediaEngine.
A/V may not arrive evenly from the encoder. Some encoders have reliability policies that, for example, send audio first and video much later, or video first and audio much later.
OvenMediaEngine outputs the input received from the encoder as-is for sub-second latency streaming. The WebRTC player also streams the received input as-is, so the A/V sync may not match during the initial playback due to the policy of specific encoders.
However, this resolves naturally as the player syncs A/V based on timestamps while streaming. Still, if this behavior looks like an error, you can also solve it by enabling JitterBuffer in OvenMediaEngine.
Also, suppose you are using the transcoder in OvenMediaEngine with an H.264 input that contains b-frames. Audio is encoded quickly, but video is buffered at the decoder because of the b-frames, so each track starts encoding at a different time, which may cause A/V to fall out of sync. Enabling JitterBuffer solves this case as well.
There may be cases where the A/V sync is not corrected even after a certain amount of time has elapsed after playback. This problem is caused by small internal buffers in some browsers such as Firefox, which causes the player to give up calibration if the A/V sync differs too much. But this can also be solved by enabling JitterBuffer.
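A sketch of the setting, under the WebRTC publisher in Server.xml:

```xml
<Publishers>
    <WebRTC>
        <!-- Buffer and reorder incoming packets so A/V stays in sync -->
        <JitterBuffer>true</JitterBuffer>
    </WebRTC>
</Publishers>
```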
Nevertheless, if the A/V sync is not corrected, you should suspect an error in the original video file, which can be checked by playing as HLS.
WebRTC supports Opus, not AAC, as an audio codec. Because RTMP and other protocols mainly carry AAC audio, you may not have set up Opus, but WebRTC cannot output audio without it. This can be solved by setting up Opus in OvenMediaEngine.
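For example, adding an Opus encode alongside the original audio in the output profile (the values are illustrative):

```xml
<Encodes>
    <Audio>
        <!-- Keep the original AAC (e.g. for LLHLS) -->
        <Bypass>true</Bypass>
    </Audio>
    <Audio>
        <!-- Add Opus so WebRTC can output audio -->
        <Codec>opus</Codec>
        <Bitrate>128000</Bitrate>
        <Samplerate>48000</Samplerate>
        <Channel>2</Channel>
    </Audio>
</Encodes>
```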
If you are using video encoding in OME, the video bitrate may be set too low. In this case, the video quality can be improved by increasing the video bitrate.
However, since OvenMediaEngine defaults to the fastest encoding preset for sub-second latency streaming, the video quality may not live up to the configured bitrate. In this case, OvenMediaEngine provides output profile presets that control quality, so you can choose a slower preset to solve it.
If the encoder is transmitting low-quality video to OvenMediaEngine, you can solve it by increasing the input quality in the encoder settings.
| Qualification for Host Peer | Qualification for Client Peer |
|---|---|
If you have a better idea, we hope that you will improve our code and contribute to our project.
Setting up Transcoding options in OvenMediaEngine:
Setting up WebRTC over TCP in OvenMediaEngine:
Setting up GPU Acceleration in OvenMediaEngine:
Tuning OvenMediaEngine Performance:
Setting up TLS Encryption in OvenMediaEngine:
As of October 2021, most browsers enforce stricter security policies, and CORS errors often occur when requesting access to other domains if the page is not served over TLS. In this case, you can solve the problem by installing a certificate on the site that loads the player.
Setting up Transcoding options in OvenMediaEngine:
Setting up WebRTC JitterBuffer in OvenMediaEngine:
Setting up WebRTC JitterBuffer in OvenMediaEngine:
However, if A/V sync is fine when streaming with HLS, this is an OvenMediaEngine bug. If you find any bugs, please feel free to report them.
Setting up Opus Codec in OvenMediaEngine:
Choosing an Encoding Preset in OvenMediaEngine:
| Thread name | Element in the configuration |
|---|---|
| AW-XXX | `<Application><Publishers><AppWorkerCount>` |
| StreamWorker | `<Application><Publishers><StreamWorkerCount>` |
| SPICE-XXX | `<Bind><Providers><WebRTC><IceCandidates><TcpRelayWorkerCount>`, `<Bind><Publishers><WebRTC><IceCandidates><TcpRelayWorkerCount>` |
| SPRtcSignalling | `<Bind><Providers><WebRTC><Signalling><WorkerCount>`, `<Bind><Publishers><WebRTC><Signalling><WorkerCount>` |
| SPSegPub | `<Bind><Publishers><HLS><WorkerCount>`, `<Bind><Publishers><DASH><WorkerCount>` |
| SPRTMP-XXX | `<Bind><Providers><RTMP><WorkerCount>` |
| SPMPEGTS | `<Bind><Providers><MPEGTS><WorkerCount>` |
| SPOvtPub | `<Bind><Publishers><OVT><WorkerCount>` |
| SPSRT | `<Bind><Providers><SRT><WorkerCount>` |
AppWorkerCount

| Type | Value |
|---|---|
| Default | 1 |
| Minimum | 1 |
| Maximum | 72 |

StreamWorkerCount

| Type | Value |
|---|---|
| Default | 8 |
| Minimum | 0 |
| Maximum | 72 |
| Site URL | Description |
|---|---|
| http://demo.ovenplayer.com | A player without TLS that can test all streaming URLs such as `WS://`, `WSS://`, `HTTP://`, `HTTPS://` |
| https://demo.ovenplayer.com | A player with TLS that can only test `WSS://` and `HTTPS://` based streaming URLs |
The Web Console is a web application that allows you to monitor and operate OvenMediaEngine.
The main features of Web Console are:
Read and display the config file of OvenMediaEngine.
Interpret the OvenMediaEngine configuration file and work with the REST API.
Use the REST API to display a list of Virtual Hosts, Applications, Streams, and configuration information.
You can reload Virtual Host and Application.
You can load and unload Virtual Hosts that have been added or removed from the OvenMediaEngine configuration file.
Provide a Test Player that can play on the generated stream.
The Web Console is distributed as part of the OvenMediaEngine Enterprise package. The default installation path and running port for the Web Console are:
The default configuration for the Web Console is as follows:
This section describes the common user interface of Web Console.
You can sign in from this login screen. The username and password of the initial account are both admin.
This logo is a link to the first entry screen. You can check the settings of OvenMediaEngine.
This tab is a link to a page where you can edit your account information.
Sign out and then expire the session.
Tree-structured navigation with access to Virtual Hosts, Applications, and Streams running on OvenMediaEngine. The selected resource is highlighted.
You can load and unload virtual hosts or applications when added or removed from the OvenMediaEngine configuration.
Reload the Virtual Host, Application, and Stream information. If a new stream is added to your application, you can view the new stream.
Shows the path of the selected Virtual Host, Application, and Stream. You can click the parent path to choose that resource.
Displays Statistics, Actions, and Setting information of the selected resource.
This section describes the main features of the Web Console.
The OvenMediaEngine Console displays each Server, Virtual Host, Application, and Stream configuration.
① Display Configuration Files
If you select Server, you will see all the configuration files used by OvenMediaEngine.
If you select Virtual Host, Application, or Stream, you can see traffic/connection statistics and configuration.
① Statistics
You can check the cumulative bytes in/out and the number of connected sessions for each protocol.
② Creation Time
You can check the creation time of the selected resource.
③ Detailed Information
Display detailed information about the selected resource. When Virtual Host or Application is selected, the configuration is displayed, and when Stream is selected, the input stream and output stream information is displayed.
If you change the Virtual Host or Application settings directly in the Server.xml or include the Virtual Host configuration file, you can apply the changes to OvenMediaEngine without restarting OvenMediaEngine. Use the Actions → Reload button on the details page of Virtual Host or Application.
Suppose you modify the OvenMediaEngine Server.xml directly to add a new Virtual Host or remove an existing Virtual Host. In that case, you can reflect the added or removed Virtual Host to the OvenMediaEngine without restarting the OvenMediaEngine. Likewise, if you add or delete the separated Virtual Host configuration files, it will work similarly.
① Reload the OvenMediaEngine Configuration
To reload the modified configuration file, refresh each page or use the Refresh
button in the left navigation.
② Add a new Virtual Host
Suppose a new Virtual Host is added to the configuration file. In that case, the new Virtual Host will be added to the left navigation, and you will see a Load
button that can be immediately reflected in the OvenMediaEngine.
③ Remove the existing Virtual Host
Similarly, if an existing Virtual Host is removed from the configuration file, you will see an Unload
button in the left navigation. You can use the Unload button to remove a current Virtual Host from OvenMediaEngine.
As with Virtual Host, if you modify the OvenMediaEngine configuration file directly to add or remove an Application, you can add or remove the Application to OvenMediaEngine without restarting OvenMediaEngine.
① Reload OvenMediaEngine Configuration
To reread the modified configuration file, refresh each page or use the Refresh
button in the left navigation.
② Add a new Application
When a new Application is added to the config file, the new Application is added to the left navigation with a Load button that can be immediately applied to the OvenMediaEngine. You can use the Load button to apply the new Application to OvenMediaEngine.
③ Remove the existing Application
Similarly, if an existing Application is removed from the configuration file, an Unload
button will appear in the left navigation.
We provide a Test Player that can play that Stream.
① Test Playback Link
If you select a Stream, you can find the section where you can play the stream with the Test Playback
link.
② OvenMediaEngine Host Settings
Set the IP address or domain of OvenMediaEngine (required).
Host information is different depending on the operating environment of OvenMediaEngine, so enter it manually.
③ TCP Playback Settings
Check if you want to play the WebRTC stream with TCP.
④ Output Stream List

It interprets the information set in ② and ③ together with the configuration of OvenMediaEngine, and displays the list of playable output streams.
⑤ Output Stream Playback
If you click the Play button, the stream is played with the ⑥ Test Player (OvenPlayer).
You can change the account information to sign in to the Web Console.
① Account Setting Link
You can enter the account information page through the Account
link.
The default account is set to username: admin, password: admin. If you change your account information, your session will expire and you will need to sign in again.
| Category | Feature | Description |
|---|---|---|
| API | Reload Virtual Host | Reload Virtual Host |
| API | Reload Application | |
| Admin | Web Console | |
| Codec | Beamr H.264 Encoder | |
| Item | Value |
|---|---|
| Installation path | /usr/share/ovenmediaengine/console |
| Configuration file path | /usr/share/ovenmediaengine/console/conf/config.yaml |
| Log file path | /var/log/ovenmediaengine/console/ovenmediaengine-console.log |
| Running port | 5000 |
| Initial account | username: admin, password: admin |
OvenMediaEngine (OME) is a sub-second latency live streaming server for large-scale, high-definition broadcasts. With OME, you can build platforms, services, and systems that deliver high-definition video to hundreds to thousands of viewers with sub-second latency and scale with the number of concurrent viewers.
OvenMediaEngine can receive video/audio sources from encoders and cameras such as OvenLiveKit, OBS, XSplit, and more, over WebRTC, SRT, RTMP, MPEG-2 TS, and RTSP as input. OME then transmits this source using LLHLS (Low-Latency HLS) and WebRTC as output. We also provide OvenPlayer, an open-source, JavaScript-based WebRTC/LLHLS player for OvenMediaEngine.
Our goal is to make it easier for you to build a stable broadcasting/streaming service with sub-second latency.
Ingest
Push: WebRTC, WHIP, SRT, RTMP, MPEG-2 TS
Pull: RTSP
Adaptive Bitrate Streaming (ABR) for LLHLS and WebRTC
Low-Latency Streaming using LLHLS
Sub-Second Latency Streaming using WebRTC
WebRTC over TCP (with embedded TURN server)
Embedded WebRTC Signaling Server (WebSocket based)
Retransmission with NACK
ULPFEC (Uneven Level Protection Forward Error Correction)
VP8, H.264
In-band FEC (Forward Error Correction)
Opus
Embedded Live Transcoder
Video: VP8, H.264, Pass-through
Audio: Opus, AAC, Pass-through
Clustering (Origin-Edge Structure)
Monitoring
Access Control
AdmissionWebhooks
SignedPolicy
File Recording
Push Publishing using RTMP and MPEG2-TS (Re-streaming)
Thumbnail
REST API
Experiment
P2P Traffic Distribution (Only WebRTC)
We have tested OvenMediaEngine on the platforms listed below. However, it may work on other Linux distributions as well:
Ubuntu 18+
CentOS 7+
Fedora 28+
Please read the Getting Started chapter in the tutorials.
Thank you so much for being so interested in OvenMediaEngine.
We need your help to keep and develop our open-source project, and we want to tell you that you can contribute in many ways. Please see our Guidelines, Rules, and Contribute.
We always hope that OvenMediaEngine will give you good inspiration.
Test Player

- Without TLS: http://demo.ovenplayer.com
- With TLS: https://demo.ovenplayer.com
OvenMediaEngine is licensed under the AGPL-3.0-only. However, if you need another license, please feel free to email us at contact@airensoft.com.
OvenMediaEngine has a built-in live transcoder. The live transcoder can decode the incoming live source and re-encode it with the set codec or adjust the quality to encode at multiple bitrates.
OvenMediaEngine currently supports the following codecs:
The <OutputProfile>
setting allows incoming streams to be re-encoded via the <Encodes>
setting to create a new output stream. The name of the new output stream is determined by the rules set in <OutputStreamName>
, and the newly created stream can be used according to the streaming URL format.
According to the above setting, if the incoming stream name is stream
, the output stream becomes stream_bypass
and the stream URL can be used as follows.
WebRTC
ws://192.168.0.1:3333/app/stream_bypass
LLHLS
http://192.168.0.1:8080/app/stream_bypass/llhls.m3u8
You can set the video profile as below:
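For example (the values are illustrative):

```xml
<Encodes>
    <Video>
        <Codec>h264</Codec>
        <Bitrate>2000000</Bitrate>
        <Width>1280</Width>
        <Height>720</Height>
        <Framerate>30.0</Framerate>
        <Preset>fast</Preset>
    </Video>
</Encodes>
```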
The meaning of each property is as follows:
Table of presets
A table in which presets provided for each codec library are mapped to OvenMediaEngine's presets. Slow presets are of good quality and use a lot of resources, whereas Fast presets have lower quality and better performance. It can be set according to your own system environment and service purpose.
References
https://trac.ffmpeg.org/wiki/Encode/VP8
https://docs.nvidia.com/video-technologies/video-codec-sdk/nvenc-preset-migration-guide/
You can set the audio profile as below:
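For example (the values are illustrative):

```xml
<Encodes>
    <Audio>
        <Codec>opus</Codec>
        <Bitrate>128000</Bitrate>
        <Samplerate>48000</Samplerate>
        <Channel>2</Channel>
    </Audio>
</Encodes>
```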
The meaning of each property is as follows:
It is possible to have an audio only output profile by specifying the Audio profile and omitting a Video one.
You can set the Image profile as below:
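For example (the values are illustrative; a low framerate here controls how often a thumbnail is produced):

```xml
<Encodes>
    <Image>
        <Codec>jpeg</Codec>
        <Width>1280</Width>
        <Height>720</Height>
        <Framerate>1</Framerate>
    </Image>
</Encodes>
```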
The meaning of each property is as follows:
The image encoding profile is only used by the thumbnail publisher, and the bypass option is not supported.
You can configure Video and Audio to bypass transcoding as follows:
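A minimal sketch of a pass-through profile:

```xml
<Encodes>
    <Video>
        <Bypass>true</Bypass>
    </Video>
    <Audio>
        <Bypass>true</Bypass>
    </Audio>
</Encodes>
```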
You need to consider codec compatibility with some browsers. For example, Chrome only supports the Opus codec for WebRTC audio playback, so if you bypass incoming AAC audio, it can't play in Chrome.
WebRTC doesn't support AAC, so if the video bypasses transcoding, the audio must still be encoded to Opus.
If the input codec already matches the transcoding options, there is no need to re-transcode and unnecessarily consume system resources.
If the quality of the input track matches all the conditions of BypassIfMatch, it will be bypassed without encoding
eq: equal to
lte: less than or equal to
gte: greater than or equal to
To support WebRTC and LLHLS, AAC and Opus codecs must be supported at the same time. Use the settings below to reduce unnecessary audio encoding.
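A sketch using BypassIfMatch (conditions and values are illustrative): each encode is skipped whenever the input already satisfies its conditions, so an AAC input bypasses the AAC encode and an Opus input bypasses the Opus encode.

```xml
<Encodes>
    <!-- AAC for LLHLS; skip encoding if the input codec is already AAC -->
    <Audio>
        <Codec>aac</Codec>
        <Bitrate>128000</Bitrate>
        <Samplerate>48000</Samplerate>
        <Channel>2</Channel>
        <BypassIfMatch>
            <Codec>eq</Codec>
        </BypassIfMatch>
    </Audio>
    <!-- Opus for WebRTC; skip encoding if the input codec is already Opus -->
    <Audio>
        <Codec>opus</Codec>
        <Bitrate>128000</Bitrate>
        <Samplerate>48000</Samplerate>
        <Channel>2</Channel>
        <BypassIfMatch>
            <Codec>eq</Codec>
        </BypassIfMatch>
    </Audio>
</Encodes>
```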
If a video track with a lower quality than the encoding option is input, unnecessary upscaling can be prevented.
If you want to transcode with the same quality as the original, see the sample below for the parameters OME supports for keeping the original quality. If you omit the Width, Height, Framerate, Samplerate, and Channel parameters, the stream is transcoded with the same options as the original.
To change the video resolution when transcoding, use the Width and Height values in the video encode options. If you don't know the resolution of the original, it is difficult to keep the aspect ratio after transcoding. To solve this, specify only one dimension: for example, if you set only the Width value in the video encoding options, the Height value is calculated automatically to preserve the aspect ratio of the original video.
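For example (illustrative values): only Width is given, so Height follows the original aspect ratio.

```xml
<Encodes>
    <Video>
        <Codec>h264</Codec>
        <Bitrate>2000000</Bitrate>
        <!-- Height is derived automatically to keep the original aspect ratio -->
        <Width>1280</Width>
        <Framerate>30.0</Framerate>
    </Video>
</Encodes>
```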
From version 0.14.0, OvenMediaEngine can encode the same source into multiple bitrate renditions and deliver them to the player.
As shown in the example configuration below, you can provide ABR by adding <Playlists>
to <OutputProfile>
. There can be multiple playlists, and each playlist can be accessed with <FileName>
.
The method to access the playlist set through LLHLS is as follows.
http[s]://<domain>[:port]/<app>/<stream>/
<FileName>
.m3u8
The method to access the Playlist set through WebRTC is as follows.
ws[s]://<domain>[:port]/<app>/<stream>/
<FileName>
Note that <FileName>
must never contain the playlist
and chunklist
keywords. This is a reserved word used inside the system.
To set up <Rendition>
, you need to add <Name>
to the elements of <Encodes>
. Connect the set <Name>
into <Rendition><Video>
or <Rendition><Audio>
.
In the example below, three quality renditions are provided. The URL to play the abr playlist over LLHLS is https://domain:port/app/stream/abr.m3u8, and the WebRTC playback URL is wss://domain:port/app/stream/abr.
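A sketch of such a profile (the encode and rendition names such as `video_720` are illustrative):

```xml
<OutputProfile>
    <Name>abr_profile</Name>
    <OutputStreamName>${OriginStreamName}</OutputStreamName>
    <Playlist>
        <Name>ABR</Name>
        <FileName>abr</FileName>
        <Rendition>
            <Name>Bypass</Name>
            <Video>bypass_video</Video>
            <Audio>audio_opus</Audio>
        </Rendition>
        <Rendition>
            <Name>HD</Name>
            <Video>video_720</Video>
            <Audio>audio_opus</Audio>
        </Rendition>
        <Rendition>
            <Name>SD</Name>
            <Video>video_480</Video>
            <Audio>audio_opus</Audio>
        </Rendition>
    </Playlist>
    <Encodes>
        <Video>
            <Name>bypass_video</Name>
            <Bypass>true</Bypass>
        </Video>
        <Video>
            <Name>video_720</Name>
            <Codec>h264</Codec>
            <Bitrate>2000000</Bitrate>
            <Width>1280</Width>
            <Height>720</Height>
            <Framerate>30</Framerate>
        </Video>
        <Video>
            <Name>video_480</Name>
            <Codec>h264</Codec>
            <Bitrate>1000000</Bitrate>
            <Width>854</Width>
            <Height>480</Height>
            <Framerate>30</Framerate>
        </Video>
        <Audio>
            <Name>audio_opus</Name>
            <Codec>opus</Codec>
            <Bitrate>128000</Bitrate>
            <Samplerate>48000</Samplerate>
            <Channel>2</Channel>
        </Audio>
    </Encodes>
</OutputProfile>
```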
If you set up multiple codecs, OME automatically selects and streams the codec that matches each streaming protocol it supports. However, if you don't set up a codec that matches the streaming protocol you want to use, it won't be streamed.
The following is a list of codecs that match each streaming protocol:
Therefore, set it up as shown in the table: if you want to stream using LLHLS, you need to set up H.264 and AAC, and if you want to stream using WebRTC, you also need to set up Opus.
Also, if you are going to use WebRTC on all platforms, you need to configure both VP8 and H.264. This is because different codecs are supported for each browser, for example, VP8 only, H264 only, or both.
However, don't worry. If you set the codecs correctly, OME automatically sends the stream of codecs requested by the browser.
| Type | Decoder | Encoder |
|---|---|---|
| Video | VP8, H.264, H.265 | VP8, H.264, H.265 (GPU only) |
| Audio | AAC, Opus | AAC, Opus |
| Image | | JPEG, PNG |

Video profile properties:

| Property | Description |
|---|---|
| Codec* | Specifies the codec to use |
| Bitrate* | Bits per second |
| Name | Encode name for renditions |
| Width | Width of resolution |
| Height | Height of resolution |
| Framerate | Frames per second |
| Preset | Preset of encoding quality and performance |
| ThreadCount | Number of threads used for encoding |

Table of presets:

| Presets | openh264 | libvpx | h264/265 NVC | h264/265 QSV |
|---|---|---|---|---|
| slower | Quantizer (10-41) | best | hq | - |
| slow | Quantizer (10-41) | best | llhq | - |
| medium | Quantizer (10-51) | good | bd | - |
| fast | Quantizer (25-51) | realtime | hp | - |
| faster | Quantizer (25-51) | *realtime | *llhp | - |

Audio profile properties:

| Property | Description |
|---|---|
| Codec* | Specifies the codec to use |
| Bitrate* | Bits per second |
| Name | Encode name for renditions |
| Samplerate | Samples per second |
| Channel | Number of audio channels |

Image profile properties:

| Property | Description |
|---|---|
| Codec | Specifies the codec to use |
| Width | Width of resolution |
| Height | Height of resolution |
| Framerate | Frames per second |

| Protocol | Supported Codec |
|---|---|
| WebRTC | VP8, H.264, Opus |
| LLHLS | H.264, AAC |
OvenMediaEngine supports a Push Publishing function that can retransmit live streams to other systems. RTMP and MPEG-TS are supported as retransmission protocols because most services and products support them. One output stream can be transmitted to multiple destinations at the same time, and you can start and stop pushing the output stream through the REST API. Note that the only codecs that can be retransmitted over RTMP and MPEG-TS are H.264 and AAC.
To use RTMP Push Publishing, you need to declare the <RTMPPush>
publisher in the configuration. There are no other detailed options.
To use MPEGTS Push Publishing, you need to declare the <MPEGTSPush>
publisher in the configuration. There are no other detailed options.
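A minimal sketch of both publisher declarations in Server.xml:

```xml
<Publishers>
    <!-- Enable RTMP push publishing -->
    <RTMPPush />
    <!-- Enable MPEG-TS push publishing -->
    <MPEGTSPush />
</Publishers>
```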
Only H264 and AAC are supported codecs.
To control pushing, use the REST API. RTMP and MPEG-TS pushes are requested based on the output stream name (specified in the JSON body), and you can selectively transfer all or only some tracks. In addition, you must specify the URL and stream key of the external server to transmit to. Multiple pushes can run simultaneously for the same stream. If transmission is interrupted by network or other problems, it automatically reconnects.
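The request body is roughly shaped like this (the field values are illustrative; check the REST API reference for the exact schema):

```json
{
  "id": "push_001",
  "stream": {
    "name": "output_stream_name",
    "trackIds": []
  },
  "protocol": "rtmp",
  "url": "rtmp://live.example.com/app",
  "streamKey": "your-stream-key"
}
```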
For how to use the API, please refer to the link below.