

For Q&A, please follow these rules:


Here are some common questions. If you can't find yours, please search the existing issues first. If you are sure it is a bug that has not been reported before, please submit an issue according to the requirements.

Note: This is the FAQ for SRS; for SRS Stack, please see the SRS Stack FAQ.


  • Questions about RTMP/HTTP-FLV/WebRTC live streaming?
    1. SRS only supports streaming protocols, such as live streaming and WebRTC. For details, please refer to the cluster section of the wiki.
  • Questions about HLS/DASH segmented live streaming, or on-demand/recording/VoD/DVR?
    1. SRS can record streams as on-demand files. Please refer to DVR.
    2. SRS can generate HLS or DASH. Please refer to HLS
  • Questions about HLS/DASH/VoD/DVR distribution clusters?
    1. These are all HTTP files; for HTTP file distribution clusters, NGINX is recommended. Please refer to HLS Cluster.
    2. You can use NGINX together with SRS Edge to distribute HTTP-FLV, covering the distribution of all HTTP protocols. Please refer to Nginx For HLS.


  • Pagination: For pagination issues related to console streams and clients, refer to #3451
    1. The default API parameters are start=0, count=10, and the Console does not support pagination. It is planned to be supported in the new Console.
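As a sketch of those default parameters, the URL for one page of the stream-query API can be built like this (assuming the default API port 1985 and the `/api/v1/streams` endpoint; fetching and looping over pages is left to the caller):

```python
# Sketch: build the URL for one page of the SRS stream-query API,
# assuming the default API port 1985 and the /api/v1/streams endpoint.
from urllib.parse import urlencode

def streams_page_url(base, start=0, count=10):
    """URL for one page of streams; start/count are the API's paging parameters."""
    return f"{base}/api/v1/streams?{urlencode({'start': start, 'count': count})}"
```

To list everything, fetch pages with an increasing `start` until a page returns fewer than `count` streams.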


  • CORS: Questions about cross-domain access for HTTP APIs or streams
    1. SRS 3.0 supports cross-domain (CORS) access, so there is no need for additional HTTP proxies, as it is built-in and enabled by default. Please refer to #717 #798 #1002
    2. Of course, an Nginx proxy can also solve cross-domain issues, so there is no need to configure it in SRS. Note that you only need to proxy the API, not the media streams: streams consume too much bandwidth, would overload the proxy, and proxying them is unnecessary.
    3. Use Nginx or Caddy proxy to provide a unified HTTP/HTTPS service. Please refer to #2881
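To confirm the built-in CORS support from a client's point of view, you can inspect the response headers of an API request; `allows_origin` below is a hypothetical helper (header names are compared case-insensitively, as HTTP requires):

```python
def allows_origin(headers, origin):
    """Check a response-header mapping for CORS permission for `origin`."""
    allow = {k.lower(): v for k, v in headers.items()}.get("access-control-allow-origin")
    return allow == "*" or allow == origin
```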

CPU and OS​

  • CPU and OS: About the CPU architectures and operating systems supported by SRS
    1. SRS supports common CPU architectures, such as x86_64 or amd64, as well as armv7/aarch64/AppleM1, MIPS or RISCV, and Loongson loongarch. For other CPU adaptations, please refer to ST#22.
    2. SRS supports commonly used operating systems, such as Linux including CentOS and Ubuntu, macOS, and Windows.
    3. SRS also supports Chinese Xinchuang (domestic IT innovation) systems. If you need support for a new Xinchuang system, you can submit an issue.
  • Windows: Special notes about Windows
    1. Windows is less commonly used as a server, but there are some application scenarios. SRS 5.0 supports Windows, and every release provides a Windows installation package for download.
    2. Since downloading from GitHub can be difficult for some users, we also provide a Gitee mirror. Please see Gitee: Releases for each version's attachments.
    3. There are still some issues on the Windows platform that have not been resolved, and we will continue to improve support. For details, please refer to #2532.


  • Edge HLS/DVR/RTC: About Edge support for HLS/DVR/RTC, etc.
    1. Edge is a live streaming cluster that only supports live streaming protocols such as RTMP and FLV. Only the origin server can support HLS/DVR/RTC. Refer to #1066
    2. Currently, SRS does not restrict the use of HLS/DVR/RTC on Edge, but these will be disabled in the future. Please do not use them this way; they are not designed to work on Edge.
    3. For the HLS cluster, please refer to the documentation HLS Edge Cluster
    4. The development of WebRTC and SRT clustering capabilities is in progress. Refer to #3138


  • FFmpeg: Questions related to FFmpeg
    1. If you see errors like FFmpeg not found or terminate, please restart it, a compilation failure with No FFmpeg found, or FFmpeg lacking support for H.265 or other codecs, you need to compile or download FFmpeg yourself and place it in the specified path so that SRS can detect it. Please refer to #1523
    2. If you have questions about using FFmpeg itself, please do not submit issues to SRS; go to the FFmpeg community instead. FFmpeg issues filed in SRS will be deleted directly. Don't be lazy.
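When providing your own FFmpeg build, the only requirement is that the binary sits at the path your SRS configuration expects; `find_ffmpeg` below is a hypothetical helper that checks explicit candidate paths (e.g. the one from your transcode config) before falling back to $PATH:

```python
import os
import shutil

def find_ffmpeg(extra_candidates=()):
    """Return the first executable ffmpeg: a candidate path, else one on $PATH."""
    for path in extra_candidates:
        if os.path.isfile(path) and os.access(path, os.X_OK):
            return path
    return shutil.which("ffmpeg")  # None if ffmpeg is not installed at all
```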


  • About supported features, outdated features, and plans?
    1. Each version supports different features, which are listed on the Github homepage, such as develop/5.0, release/4.0, release/3.0.
    2. The changes in each version are also different and are listed on the Github homepage, such as develop/5.0, release/4.0, release/3.0.
    3. In addition to adding new features, SRS will also remove unsuitable features, such as RTSP push streaming, srs-librtmp, GB SIP signaling, etc. These features may be useless, inappropriate, or provided in a more suitable way. See #1535 for more information.


  • GB28181: Questions about GB status and roadmap
    1. GB has been moved to a separate repository, srs-gb28181; please refer to #2845.
    2. For GB usage, please refer to #1500. Currently, GB is still in the feature/gb28181 branch. It will be merged into develop and then released after it is stable. It is expected to be released in SRS 5.0.
    3. SRS's support for GB will not be comprehensive; GB is treated only as an ingest protocol. The much-requested voice intercom is planned to be supported.


  • No one answers questions in the WeChat group? The art of asking questions in the community?
    1. Please search in the various documents of the community first, and do not ask questions that already have answers.
    2. Please describe the background of the problem in detail, and show the efforts you have made.
    3. An open source community expects you to be able to solve problems yourself; if you cannot, please consider paid consulting.

HLS Fragments​

  • HLS Fragment Duration: About HLS segment duration
    1. HLS segment duration is determined by three factors: GOP length, whether to wait for a keyframe (hls_wait_keyframe), and segment duration (hls_fragment).
    2. For example, if the GOP is set to 2s, the segment length is hls_fragment:5, and hls_wait_keyframe:on, then the actual duration of each TS segment may be around 5~6 seconds, as it needs to wait for a complete GOP before closing the segment.
    3. For example, if the GOP is set to 10s, the segment length is hls_fragment:5, and hls_wait_keyframe:on, then the actual duration of each TS segment will be over 10 seconds, because the segment can only close on a keyframe and must therefore span at least one full GOP.
    4. For example, if the GOP is set to 10s, the segment length is hls_fragment:5, and hls_wait_keyframe:off, then the actual duration of each TS segment is around 5 seconds. The segment does not start with a keyframe, so some players may experience screen artifacts or slower video playback.
    5. For example, if the GOP is set to 2s, the segment length is hls_fragment:2, and hls_wait_keyframe:on, then the actual duration of each TS segment may be around 2 seconds. This way, the HLS delay is relatively low, and there will be no screen artifacts or decoding issues, but the encoding quality may be slightly compromised due to the smaller GOP.
    6. Although the segment size can be set to less than 1 second, such as hls_fragment:0.5, the #EXT-X-TARGETDURATION is still 1 second because it is an integer. Moreover, having too small segments can lead to an excessive number of segments, which is not conducive to CDN caching or player caching, so it is not recommended to set too small segments.
    7. If you want to reduce latency, do not set the segment duration to less than 1 second; setting it to 1 or 2 seconds is more appropriate. Because even if it is set to 1 second, due to the player's segment fetching strategy and caching policy, the latency will not be the same as RTMP or HTTP-FLV streams. The minimum latency for HLS is generally over 5 seconds.
    8. GOP refers to the number of frames between two keyframes, which needs to be set in the encoder. For example, the FFmpeg parameter -r 25 -g 50 sets the frame rate to 25fps and the GOP to 50 frames, which is equivalent to 2 seconds.
    9. In OBS, there is a Keyframe Interval(0=auto) setting. Its minimum value is 1s. If set to 0, it actually means automatic, not the lowest latency setting. For low latency, it is recommended to set it to 1s or 2s.
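The rules above can be condensed into a rough model: with hls_wait_keyframe on, a segment can only close on a keyframe, so its duration rounds up to a whole number of GOPs; with it off, it closes near hls_fragment. A sketch (an estimate only; real segment boundaries also depend on timestamps and the encoder):

```python
import math

def estimated_ts_duration(gop_seconds, hls_fragment, hls_wait_keyframe=True):
    """Rough estimate of the real duration (seconds) of each TS segment."""
    if not hls_wait_keyframe:
        return hls_fragment  # closes near hls_fragment, possibly mid-GOP
    # Must wait for the next keyframe, i.e. a whole number of GOPs.
    return math.ceil(hls_fragment / gop_seconds) * gop_seconds
```

For instance, a 2s GOP with hls_fragment:5 yields roughly 6-second segments, matching the "around 5~6 seconds" case above.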


  • HTTP RAW API: About RAW API, dynamic recording DVR, etc.

    1. The RAW API had various problems and was prone to misuse, so the feature was removed in version 4.0. For detailed reasons, please see #2653.
    2. Again, do not use the HTTP RAW API to implement business logic; that belongs in your business system, which you can build with Go or Node.js.
  • Secure HTTP API: About API authentication, API security, etc.

    1. Regarding HTTP API authentication and how to prevent everyone from accessing it, it is currently recommended to use Nginx proxy to solve this issue. The support will be enhanced in the future. For details, please see #1657.
    2. You can also use HTTP Callback to implement authentication. When pushing or playing a stream, call your business system's API to implement the hook.
  • Dynamic DVR: About dynamic recording, regular expression matching for streams that need to be recorded, etc.

    1. You can use on_publish to callback the business system and implement complex rules.
    2. For specific recording files, use on_hls to copy the slices to the recording directory or cloud storage.
    3. You can refer to the DVR implementation in srs-stack.
    4. SRS will not support dynamic DVR, but some solutions are provided. You can also refer to #1577.
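As an illustration of the HTTP Callback approach to securing publishes, the sketch below decides an on_publish request. It assumes the token travels in the publish URL's query string, which SRS forwards in the callback body's `param` field, and relies on the hook convention that returning code 0 allows the action:

```python
from urllib.parse import parse_qs

def on_publish_allowed(callback_body, secret):
    """Return 0 to allow a publish, a non-zero code to reject it."""
    # `param` carries the publish query string, e.g. "?token=abc".
    query = callback_body.get("param", "").lstrip("?")
    token = parse_qs(query).get("token", [""])[0]
    return 0 if token == secret else 403
```

A real hook would wrap this in a small HTTP server and reply with the code in the response body.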
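The on_hls-based recording flow above can be sketched as a handler that copies each reported segment into a per-stream directory. The field names (`file`, `app`, `stream`) follow what recent SRS versions put in the on_hls callback body, but verify them against your version:

```python
import os
import shutil

def archive_hls_segment(body, dvr_root):
    """Copy the TS segment from an on_hls callback into dvr_root/app/stream/."""
    dest_dir = os.path.join(dvr_root, body["app"], body["stream"])
    os.makedirs(dest_dir, exist_ok=True)
    dest = os.path.join(dest_dir, os.path.basename(body["file"]))
    shutil.copy(body["file"], dest)  # or upload to cloud storage instead
    return dest
```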


  • HTTPS: Regarding HTTPS services, API, Callback, Streaming, WebRTC, etc.
    1. HTTPS API provides transport layer security for the API. WebRTC push streaming requires HTTPS pages, which can only access HTTPS APIs.
    2. HTTPS Callback calls back to HTTPS services. If your server uses the HTTPS protocol, most business systems use HTTPS for security purposes.
    3. HTTPS Live Streaming provides transport layer security for streaming, mainly because HTTPS pages can only access HTTPS resources.
    4. Automatically request SSL certificates from Let's Encrypt for a single domain, making it easier for small and medium-sized enterprises to deploy SRS and avoiding the high overhead of HTTPS proxies for streaming media. See #2864
    5. Use Nginx or Caddy as reverse proxies for HTTP/HTTPS Proxy to provide unified HTTP/HTTPS services. See #2881
  • HTTP2: Regarding HTTP2-FLV or HTTP2 HLS, etc.
    1. SRS will not implement HTTP2 or HTTP3, but instead recommends using reverse proxies to convert protocols, such as Nginx or Go.
    2. Since HTTP is a very mature protocol, existing tools and reverse proxy capabilities are very comprehensive, and SRS does not need to implement a complete protocol.
    3. SRS has implemented a simple HTTP 1.0 protocol, mainly providing API and Callback capabilities.


  • Latency: Questions about how to reduce latency, how to do low-latency live streaming, and how much latency WebRTC has.
    1. Live streaming latency is generally 1 to 3 seconds and WebRTC latency is around 100ms, so why is the latency of your self-built environment so high?
    2. The most common reason for high latency is using the VLC player, which has a latency of tens of seconds. Please switch to the SRS H5 player.
    3. Latency is related to each link, not just SRS reducing latency. It is also related to the push tool (FFmpeg/OBS) and the player. Please refer to Realtime and follow the steps to set up a low-latency environment. Don't start with your own fancy operations, just follow the documentation.
    4. If you still find high latency after following the steps, how to troubleshoot? Please refer to #2742

Performance and Memory​

  • Performance: About performance optimization, concurrency, stress testing, memory leaks, and wild pointers
    1. Performance is a comprehensive topic, including the quality of the project, the capacity and concurrency it supports, how to optimize performance, and even memory issues, such as memory leaks (leading to reduced performance), out-of-bounds and wild pointer problems.
    2. To understand the concurrency of SRS, you must measure live streaming and WebRTC separately: use srs-bench for live streaming and the feature/rtc branch for WebRTC stress testing, to obtain the concurrency your hardware and software environment supports under your specific bitrate, latency, and business characteristics.
    3. SRS also provides official concurrency data, which can be found in Performance. It also explains how to measure this concurrency, the conditions under which the data is obtained, and specific optimization code.
    4. If you need to investigate performance issues, memory leaks, or wild pointer problems, you must use system-related tools such as perf, valgrind, or gperftools. For more information, please refer to SRS Performance (CPU), Memory Optimization Tool Usage or Perf.
    5. It is important to note that valgrind has been supported since SRS 3.0 (inclusive), and the ST patch has been applied.


  • RTSP: RTSP streaming, RTSP server, RTSP playback, etc.
    1. SRS supports pulling RTSP streams via Ingest, but does not support pushing RTSP to SRS; that is not the intended usage. For detailed reasons, please refer to #2304.
    2. Of course, RTSP server and RTSP playback will not be supported either, please refer to #476.
    3. If you need a large number of camera connections, such as 10,000, using FFmpeg may be more difficult. For such large-scale businesses, the recommended solution is to use ST+SRS code to implement an RTSP forwarding server.

Source Cleanup​

  • Source Cleanup: Regarding memory growth for a large number of streams
    1. The Source object for push streaming is not cleaned up, and memory will increase as the number of push streams increases. For now, you can use Gracefully Quit as a workaround, and this issue will be addressed in the future. See #413
    2. To reiterate, you can use Gracefully Quit as a workaround. Even if this issue is resolved in the future, this solution is the most reliable and optimal one. Restarting is always a good option.

Video Guides​

Here is the video material for the Q&A session, which provides a detailed explanation of a certain topic. If your question is similar, please watch the video directly:

WebRTC Cluster​

  • WebRTC+Cluster: Questions related to WebRTC clustering
    1. WebRTC clustering is not the same as live streaming clustering (Edge + Origin Cluster); it is called WebRTC cascading. Please refer to #2091
    2. In addition to the clustering solution, SRS will also support the Proxy solution, which is simpler than clustering and will have scalability and disaster recovery capabilities. Please refer to #3138

WebRTC Live​

  • WebRTC+Live: Questions related to WebRTC and live streaming
    1. For the conversion between WebRTC and RTMP, such as RTMP2RTC (RTMP push stream RTC playback) or RTC2RTMP (RTC push stream RTMP playback), you must specify the conversion configuration. Audio transcoding is not enabled by default to avoid significant performance loss. Please refer to #2728
    2. If SRS 4.0.174 or earlier works, but it does not work after updating, it is because rtc.conf does not enable RTMP to RTC by default. You need to use rtmp2rtc.conf or rtc2rtmp.conf. Please refer to 71ed6e5dc51df06eaa90637992731a7e75eabcd7
    3. In the future, the conversion between RTC and RTMP will not be enabled automatically, because SRS must also serve independent RTMP and independent RTC scenarios. Conversion is only one of them, and since it causes serious performance overhead, enabling it by default would harm the independent scenarios.


  • WebRTC: Questions about WebRTC push and pull streams or conferences
    1. WebRTC is much more complicated than live streaming. For many WebRTC issues, do not submit issues in SRS, but search for the problem on Google first. If you do not have this ability, do not use WebRTC. There are many pitfalls, and if you do not have the ability to crawl out of them, do not jump into them.
    2. A common issue is that the Candidate setting is incorrect, causing the push and pull streams to fail. For details, see the WebRTC usage instructions: #307
    3. There are also issues with UDP ports being inaccessible, which may be due to firewall settings or network issues. Please use tools to test, refer to #2843
    4. Another common issue is the conversion between RTMP and WebRTC; please see the WebRTC+Live section above.
    5. Then there are WebRTC permission issues, such as being able to push streams locally but not on the public network. This is a Chrome security setting issue. Please refer to #2762
    6. There are also less common issues, such as not being able to play non-HTTPS SRS streams with the official player. This is also a Chrome security policy issue. Please refer to #2787
    7. When mapping ports in docker, if you change the port, you need to modify the configuration file or specify it through eip. Please refer to #2907


  • WebSocket/WS: How to support WS-FLV or WS-TS?
    1. You can use a Go proxy to do the conversion; only a few lines of key code are needed, and it is stable and reliable. Please refer to mse.go


WebRTC Demo Failed​

Question: Failed to join the RTC room or start a conversation

According to the 5.0 documentation for SFU: One to One, I have completed the following configurations:

  1. Configured the CANDIDATE to use the internal IP address
  2. Used Docker to start RTC service, Signaling service, and HTTPS service.
  3. Successfully accessed and was able to open it without any issues.

However, when I click on "Start Conversation" or "Join Room," my computer's camera briefly lights up but there is no response. I have already used a self-signed OpenSSL key and crt certificate, but encountered a TLS certificate handshake error.


  1. First, confirm that you strictly followed the documentation SFU: One to One.
  2. To identify the cause, investigate potential factors such as certificate problems, HTTPS connection issues, and browser permission settings.


See FAQ:
* Chinese:
* English:

Duplicate or pre-existing issues may be removed, as they are already present in the issues or FAQ section:

For discussions or ideas, please ask in [discord](

This issue will be eliminated, see #2716
Please ask this question on Stack Overflow using the [#simple-realtime-server tag](
