Immersive video offers a 6-DoF free-viewpoint viewing experience and may play a key role in future video technology. Recently, 4D Gaussian Splatting has gained attention as an effective approach for immersive video due to its high rendering efficiency and quality, though maintaining quality with manageable storage remains challenging. To address this, we introduce GIFStream, a novel 4D Gaussian representation using a canonical space and a deformation field enhanced with time-dependent feature streams. These feature streams enable complex motion modeling and allow efficient compression by leveraging their motion-awareness and temporal correspondence. Additionally, we incorporate both temporal and spatial compression networks for end-to-end compression. Experimental results show that GIFStream delivers high-quality immersive video at 30 Mbps, with real-time rendering and fast decoding on an RTX 4090.
(I. Representation) We propose enhancing the deformation-based dynamic Gaussian representation with time-dependent feature streams. We attach a time-dependent feature \(\mathbf{f}_t\) and a time-independent feature \(\mathbf{f}\) to a set of anchor points. These features are aggregated and decoded by MLPs into the deformation motion \(\mathbf{R}_t\), \(\mathbf{T}_t\) and the Gaussian attributes \(\alpha_t\), \(\mathbf{s}_t\), \(\mathbf{r}_t\), \(\mathbf{c}_t\) at a given timestamp \(t\). Finally, we render the target view via splatting.
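The decoding step above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the feature dimensions, MLP widths, random weights, and head layouts (axis-angle rotation, sigmoid opacity) are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(dims):
    """Random weights for a tiny MLP (stand-in for learned parameters)."""
    return [(rng.standard_normal((i, o)) * 0.1, np.zeros(o))
            for i, o in zip(dims[:-1], dims[1:])]

def mlp(x, weights):
    """ReLU hidden layers, linear output."""
    for W, b in weights[:-1]:
        x = np.maximum(x @ W + b, 0.0)
    W, b = weights[-1]
    return x @ W + b

N, Dt, Ds = 4, 8, 16                 # anchors, time-dep / time-indep feature dims
f_t = rng.standard_normal((N, Dt))   # time-dependent feature stream at time t
f   = rng.standard_normal((N, Ds))   # time-independent anchor feature

h = np.concatenate([f_t, f], axis=1)          # aggregate per anchor

# Separate heads decode motion and Gaussian attributes (sizes are assumptions):
motion_head = make_mlp([Dt + Ds, 32, 3 + 3])           # T_t + R_t (axis-angle)
attr_head   = make_mlp([Dt + Ds, 32, 1 + 3 + 4 + 3])   # alpha_t, s_t, r_t, c_t

motion = mlp(h, motion_head)
attrs  = mlp(h, attr_head)
T_t, R_t = motion[:, :3], motion[:, 3:]
alpha_t = 1.0 / (1.0 + np.exp(-attrs[:, :1]))  # opacity squashed to (0, 1)
```

The decoded per-anchor motion and attributes would then drive the Gaussian splatting renderer for the target view.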
(II. Compression) For compression, we reorganize both time-dependent and time-independent parameters into two videos. The feature streams are first pruned and then compressed in an auto-regressive manner, effectively leveraging the temporal correspondence. During training, we jointly optimize the rendering loss \(L_{photo}\) and an entropy constraint \(L_{entropy}\).
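A minimal sketch of the auto-regressive rate estimation, assuming a standard learned-compression surrogate: each frame of the feature stream is predicted from the previous reconstruction, the residual is quantized, and its bitrate is estimated under a factorized Gaussian entropy model. The linear predictor, stream sizes, and Gaussian parameters are assumptions for illustration, not the paper's networks.

```python
import math
import numpy as np

rng = np.random.default_rng(1)
erf = np.vectorize(math.erf)

def gaussian_bits(x, mu, sigma):
    """Bits to code quantized x under N(mu, sigma^2) with unit-width bins:
    -log2( CDF(x + 0.5) - CDF(x - 0.5) )."""
    def cdf(v):
        return 0.5 * (1.0 + erf((v - mu) / (sigma * math.sqrt(2.0))))
    p = np.clip(cdf(x + 0.5) - cdf(x - 0.5), 1e-9, 1.0)
    return -np.log2(p)

T, N, D = 5, 4, 8                       # frames, anchors, feature dim
stream = rng.standard_normal((T, N, D))

# Auto-regressive temporal coding: predict frame t from the previous
# reconstruction and code the quantized residual (predictor is a stand-in).
W = 0.9 * np.eye(D)
total_bits = 0.0
prev = np.zeros((N, D))
for t in range(T):
    pred = prev @ W
    resid = np.round(stream[t] - pred)       # quantize the residual
    total_bits += gaussian_bits(resid, 0.0, 1.0).sum()
    prev = pred + resid                      # decoder-side reconstruction

# Joint objective: L = L_photo + lambda * L_entropy
# (L_photo is a placeholder; in practice it comes from rendered views).
lam, L_photo = 1e-3, 0.0
L = L_photo + lam * total_bits
```

Because the rate term is differentiable in the real model (quantization is relaxed during training), both losses can be minimized end to end.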
Figure: RD curve on the MPEG dataset.
@misc{li2025gifstream4dgaussianbasedimmersive,
  title={GIFStream: 4D Gaussian-based Immersive Video with Feature Stream},
  author={Hao Li and Sicheng Li and Xiang Gao and Abudouaihati Batuer and Lu Yu and Yiyi Liao},
  year={2025},
  eprint={2505.07539},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2505.07539},
}