Standard streaming pipeline: IP Cameras -> Media Server -> Endpoint

[Image attachment: BI webcasting.png]

The process starts with a video capture device, otherwise known as an IP camera. The camera is dedicated hardware that usually captures 15-30 frames per second. IP cameras are high-performance computers, which is why they are so expensive.

Compression algorithms are responsible for shrinking each frame into manageable chunks that can be transported over the internet. H.264 and H.265 are the standard compression algorithms supported by Blue Iris.

Packaging involves bundling the compressed frames into manageable sizes that can be transported over the network. Think of packaging as grouping 1-3 seconds of video frames into one chunk of data that is sent across the network. HLS, MPEG-DASH, RTMP and RTSP are all common protocols for video streaming.

Network: Responsible for moving the data from the cameras to Blue Iris.

Media server + Video Management System

Blue Iris is also a media server. Media servers are responsible for taking video/audio inputs and streaming the content to various endpoints. They can take streams in one format, for example RTSP from a camera, and convert them to another format, for example RTMP, so that users can share their camera streams elsewhere, such as on YouTube. The final destination is the endpoint playing the video: for example, a browser on a laptop, a mobile device, a TV, or a gaming device.

In terms of the video pipeline as defined in the BI Streaming Overview article:

[Image attachment: Webserver encoding.PNG]

Hardware acceleration: Usually set to No. Many GPUs do a poor job of hardware-accelerated encoding, which leads to streams that cannot be played by media players.

Quality: The marginal difference in quality beyond a certain point is minimal; for example, a setting of 100 does not produce an image that is twice the quality of a 50 setting. In most cases, quality = 50 is plenty good.

Rate control: If checked, you are choosing a Constant Bit Rate (CBR). If unchecked, you are using a Variable Bit Rate (VBR).

Maximum B-frames between P-frames: 0 (only applies to the Main profile or above).

Profiles: H.264 provides features that are encapsulated in profiles. Baseline restricts the encoder to certain basic features only, so Baseline streams can easily be played back on most devices; Baseline was widely used for mobile devices, though more and more mobile devices now support Main or High. Main and High add features on top of Baseline. B-frames (above) are only allowed in the Main profile or above; they can be used to save bandwidth but are harder to decode. The High profile is often used in broadcasting and was also adopted by the Blu-ray disc storage format.

Presets: Presets are collections of options that trade encoding speed against compression ratio. A slower preset provides better compression. This means that if you target a certain file size or constant bit rate, you will achieve better quality with a slower preset; similarly, for constant-quality encoding, you will save bitrate by choosing a slower preset.

What's great about BI is that it lets you assign different streaming profiles for the LAN and WAN. A common convention is to assign Profile 1 for LAN connections and Profile 2 for WAN connections.
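The packaging step described above can be put into rough numbers. The figures below (15 fps, 2-second chunks, a 2 Mbit/s stream) are illustrative assumptions, not Blue Iris defaults:

```python
# Back-of-envelope packaging math for the streaming pipeline:
# grouping 1-3 s of compressed frames into one network chunk.

def frames_per_chunk(fps: int, chunk_seconds: float) -> int:
    """Number of compressed frames grouped into one chunk."""
    return int(fps * chunk_seconds)

def chunk_size_bytes(bitrate_bps: int, chunk_seconds: float) -> int:
    """Approximate size of one chunk at a given stream bitrate."""
    return int(bitrate_bps * chunk_seconds / 8)

if __name__ == "__main__":
    # A 15 fps camera with 2 s chunks at 2 Mbit/s:
    print(frames_per_chunk(15, 2))          # 30 frames per chunk
    print(chunk_size_bytes(2_000_000, 2))   # 500000 bytes (~0.5 MB)
```

This is why longer chunks (HLS-style) buffer more smoothly over a WAN but add latency, while shorter chunks keep latency down at the cost of more network overhead.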
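To make the preset / profile / rate-control knobs concrete, here is a sketch that maps them onto H.264 encoder options, using ffmpeg's libx264 flags as a stand-in. Blue Iris does not expose a command line like this; the flag names (`-preset`, `-profile:v`, `-bf`, `-b:v`, `-crf`) are ffmpeg's, used purely for illustration:

```python
# Sketch: mapping the webserver-encoding settings onto ffmpeg/libx264
# options (illustrative only -- not Blue Iris internals).

def h264_args(preset="medium", profile="main", bframes=0,
              cbr_kbps=None, crf=23):
    """Build an ffmpeg libx264 argument list.

    cbr_kbps set  -> constant bit rate ("rate control" checked)
    cbr_kbps None -> constant quality via CRF (variable bit rate)
    """
    args = ["-c:v", "libx264", "-preset", preset,
            "-profile:v", profile, "-bf", str(bframes)]
    if cbr_kbps is not None:
        # CBR-style: cap the rate and size the buffer to match.
        args += ["-b:v", f"{cbr_kbps}k",
                 "-maxrate", f"{cbr_kbps}k",
                 "-bufsize", f"{2 * cbr_kbps}k"]
    else:
        args += ["-crf", str(crf)]  # constant quality, VBR output
    return args

# The LAN/WAN convention above: a richer Profile 1 for the LAN, and a
# cheaper, bandwidth-capped Profile 2 for the WAN. Baseline gets no
# B-frames (they require Main or above).
lan_args = h264_args(preset="slow", profile="high", bframes=2)
wan_args = h264_args(preset="veryfast", profile="baseline", cbr_kbps=1000)
```

Note how the trade-off from the Presets section shows up here: with the same CRF, a slower preset spends more CPU to hit the same quality at a lower bitrate.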