ORBX was designed from the ground up for low-delay encoding and real-time video streaming over the Internet. The encoder includes specific optimizations to reduce the jitter associated with video transmission over a constrained-bandwidth link, such as that found in most households.
The ORBX2 encoder is specifically tailored to take advantage of modern massively parallel compute architectures, such as those provided by GPUs and compute accelerators. Much of the encoding process can be implemented on these devices, allowing for extremely fast encoding. This contrasts directly with the industry-standard H.264, whose design precludes a GPU implementation without significant losses in quality and performance.
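To illustrate why a block-independent design maps well onto parallel hardware, here is a toy sketch, not the actual ORBX pipeline: a per-block stage (DC removal plus coarse quantization, both illustrative choices) with no inter-block dependencies, so every block can be processed concurrently. The block size, transform, and thread pool are assumptions for demonstration only.

```python
from concurrent.futures import ThreadPoolExecutor

BLOCK = 8  # illustrative block size, not ORBX's actual partitioning


def encode_block(block):
    # Toy per-block stage: remove the DC level, coarsely quantize residuals.
    # Each block depends only on its own samples.
    dc = sum(block) // len(block)
    return dc, [(s - dc) // 4 for s in block]


def encode_frame(samples):
    # Split the frame into fixed-size blocks.
    blocks = [samples[i:i + BLOCK] for i in range(0, len(samples), BLOCK)]
    # No inter-block dependencies: the map is trivially parallel,
    # which is what makes a GPU implementation practical.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(encode_block, blocks))


frame = list(range(64))       # one tiny "frame" of 64 samples
coded = encode_frame(frame)   # 8 independently coded blocks
```

A serial codec stage (e.g. entropy coding where each symbol's context depends on the previous one) cannot be split up this way, which is the crux of the contrast drawn above.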
The ORBX decoder supports multi-threading, allowing it to scale easily across modern multi-core processors. Like the encoder, the decoder can also be implemented on massively parallel compute architectures.
The ORBX decoder has intentionally been kept as low in complexity as possible without sacrificing quality. This allows the codec to decode HD resolutions even on low-end platforms such as cellphones and tablets.
With GPU-based encoding, ORBX achieves real-time 4K and even 8K encoding with low latency and low jitter, something no other codec can match.
ORBX supports channels with up to 12 bits of precision without requiring any extensions to the format or special builds.
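As a rough illustration of what carrying 12-bit samples involves, the sketch below packs pairs of 12-bit values into three bytes and unpacks them losslessly. The byte layout is a generic scheme chosen for the example and is not ORBX's actual bitstream format.

```python
def pack12(samples):
    """Pack pairs of 12-bit samples into 3 bytes (illustrative layout)."""
    out = bytearray()
    for i in range(0, len(samples), 2):
        a, b = samples[i], samples[i + 1]
        # a's high 8 bits | a's low 4 bits + b's high 4 bits | b's low 8 bits
        out += bytes([a >> 4, ((a & 0xF) << 4) | (b >> 8), b & 0xFF])
    return bytes(out)


def unpack12(data):
    """Invert pack12, recovering the original 12-bit samples."""
    out = []
    for i in range(0, len(data), 3):
        x, y, z = data[i], data[i + 1], data[i + 2]
        out += [(x << 4) | (y >> 4), ((y & 0xF) << 8) | z]
    return out


samples = [0, 4095, 1234, 2048]          # 12-bit range is 0..4095
assert unpack12(pack12(samples)) == samples
```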
ORBX was designed for applications beyond simple color video streaming. The codec already supports depth-plus-alpha streams, such as those produced by Kinect, as well as motion point-cloud data. Built-in support for flexible sets of image planes, each with independent attributes, allows ORBX to be leveraged directly in many new and exciting applications.
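A minimal data model can make the "flexible image planes with independent attributes" idea concrete. The sketch below is purely illustrative, assuming nothing about ORBX's real container format: each plane carries its own name, resolution, and bit depth, so a single frame can mix a 12-bit color plane, a Kinect-style depth plane, and an 8-bit alpha plane.

```python
from dataclasses import dataclass, field


@dataclass
class Plane:
    """One image plane with its own independent attributes (illustrative)."""
    name: str        # e.g. "color", "depth", "alpha"
    bit_depth: int   # per-plane precision, e.g. 8, 10, or 12
    width: int
    height: int
    samples: list = field(default_factory=list)


@dataclass
class Frame:
    """A frame is a named set of planes, each coded with its own settings."""
    planes: dict


rgbd = Frame(planes={
    "color": Plane("color", 12, 1920, 1080),
    "depth": Plane("depth", 12, 512, 424),   # e.g. a Kinect depth map
    "alpha": Plane("alpha", 8, 1920, 1080),
})
```

Because each plane is independent, a depth plane can use a different resolution and precision than the color plane it accompanies, which is what the depth + alpha and point-cloud use cases above rely on.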
ORBX video can be streamed directly to a modern browser without Flash or any additional plugin or extension.
ORBX was designed to compress heavy data sets efficiently, making it ideal for streaming light fields to thin clients such as mobile VR devices.