217 research outputs found
HyperThumbnail: Real-time 6K Image Rescaling with Rate-distortion Optimization
Contemporary image rescaling aims to embed a high-resolution (HR) image
into a low-resolution (LR) thumbnail that carries the information needed
for HR image reconstruction. Unlike traditional image super-resolution, this
enables high-fidelity HR restoration faithful to the original image, given
the embedded information in the LR thumbnail. However, state-of-the-art image
rescaling methods do not optimize the LR image file size for efficient sharing
and fall short of real-time performance for ultra-high-resolution (e.g., 6K)
image reconstruction. To address these two challenges, we propose a novel
framework (HyperThumbnail) for real-time 6K rate-distortion-aware image
rescaling. Our framework first embeds an HR image into a JPEG LR thumbnail by
an encoder with our proposed quantization prediction module, which minimizes
the file size of the embedded LR JPEG thumbnail while maximizing HR
reconstruction quality. Then, an efficient frequency-aware decoder reconstructs
a high-fidelity HR image from the LR one in real time. Extensive experiments
demonstrate that our framework outperforms previous image rescaling baselines
in rate-distortion performance and can perform 6K image reconstruction in real
time.
Comment: Accepted by CVPR 2023; GitHub repository:
https://github.com/AbnerVictor/HyperThumbnai
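The rate-distortion-aware objective described above can be illustrated with a minimal sketch: an encoder produces the LR thumbnail, a decoder reconstructs the HR image, and the training loss trades off a rate term (a stand-in for the JPEG file size) against reconstruction distortion. All module names, layer choices, and the rate proxy here are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RescaleRD(nn.Module):
    """Toy rate-distortion-aware rescaling sketch (hypothetical, not the paper's code)."""

    def __init__(self, lam=0.01):
        super().__init__()
        # Encoder: HR -> LR thumbnail (4x spatial downscale)
        self.encoder = nn.Conv2d(3, 3, kernel_size=4, stride=4)
        # Decoder: LR thumbnail -> HR reconstruction (4x upscale)
        self.decoder = nn.ConvTranspose2d(3, 3, kernel_size=4, stride=4)
        self.lam = lam  # rate-distortion trade-off weight

    def forward(self, hr):
        lr = self.encoder(hr)            # LR thumbnail
        hr_hat = self.decoder(lr)        # HR reconstruction
        # Rate proxy: mean magnitude of the thumbnail stands in for coded bits;
        # a real system would estimate the JPEG bitstream size instead.
        rate = lr.abs().mean()
        distortion = F.mse_loss(hr_hat, hr)
        return rate + self.lam * distortion, lr, hr_hat
```

Training against such a joint loss is what lets the quantization prediction module shrink the thumbnail's file size without sacrificing HR reconstruction quality.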
Joint Hierarchical Priors and Adaptive Spatial Resolution for Efficient Neural Image Compression
Recently, the performance of neural image compression (NIC) has steadily
improved thanks to this line of research, reaching or even outperforming
state-of-the-art conventional codecs. Despite significant progress, current NIC
methods still rely on ConvNet-based entropy coding, which is limited in modeling
long-range dependencies due to its local connectivity, and on a growing
number of architectural biases and priors, resulting in complex, underperforming
models with high decoding latency. Motivated by an efficiency investigation of
the Transformer-based transform coding framework SwinT-ChARM, we propose
to enhance the latter, first with a more straightforward yet effective
Transformer-based channel-wise auto-regressive prior model, resulting in an
absolute image compression transformer (ICT). Through the proposed ICT, we can
capture both global and local contexts from the latent representations and
better parameterize the distribution of the quantized latents. Further, we
leverage a learnable scaling module with a sandwich ConvNeXt-based
pre-/post-processor to accurately extract more compact latent codes while
reconstructing higher-quality images. Extensive experimental results on
benchmark datasets showed that the proposed framework significantly improves
the trade-off between coding efficiency and decoder complexity over the
versatile video coding (VVC) reference encoder (VTM-18.0) and the neural codec
SwinT-ChARM. Moreover, we provide model-scaling studies to verify the
computational efficiency of our approach, and we conduct several objective and
subjective analyses to highlight the performance gap between the
adaptive image compression transformer (AICT) and the neural codec SwinT-ChARM.
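A channel-wise auto-regressive prior of the kind described above can be sketched as follows: the latent is split into channel slices, and the Gaussian parameters (mean, scale) of each slice are predicted from the hyperprior features together with all previously decoded slices. The class, slice count, and 1x1-conv predictors are assumptions for illustration; the paper's actual model uses Transformer blocks.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelARPrior(nn.Module):
    """Sketch of a channel-wise auto-regressive entropy prior (hypothetical)."""

    def __init__(self, latent_ch=8, slices=4, hyper_ch=8):
        super().__init__()
        self.slice_ch = latent_ch // slices
        self.slices = slices
        # One small predictor per slice; its input is the hyperprior
        # concatenated with every previously decoded slice.
        self.predictors = nn.ModuleList(
            nn.Conv2d(hyper_ch + i * self.slice_ch, 2 * self.slice_ch, 1)
            for i in range(slices)
        )

    def forward(self, y, hyper):
        means, scales, decoded = [], [], []
        for i, predictor in enumerate(self.predictors):
            ctx = torch.cat([hyper] + decoded, dim=1)
            mu, sigma = predictor(ctx).chunk(2, dim=1)
            means.append(mu)
            scales.append(F.softplus(sigma))  # keep scales positive
            # Teacher forcing: condition on the true slice during training
            decoded.append(y[:, i * self.slice_ch:(i + 1) * self.slice_ch])
        return torch.cat(means, dim=1), torch.cat(scales, dim=1)
```

Because each slice conditions only on already-decoded channels (not on spatial neighbors), all spatial positions of a slice can be decoded in parallel, which is what keeps decoding latency low compared with spatially auto-regressive context models.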
Combined Industry, Space and Earth Science Data Compression Workshop
The sixth annual Space and Earth Science Data Compression Workshop and the third annual Data Compression Industry Workshop were held as a single combined workshop. The workshop was held April 4, 1996 in Snowbird, Utah in conjunction with the 1996 IEEE Data Compression Conference, which was held at the same location March 31 - April 3, 1996. The Space and Earth Science Data Compression sessions seek to explore opportunities for data compression to enhance the collection, analysis, and retrieval of space and earth science data. Of particular interest is data compression research that is integrated into, or has the potential to be integrated into, a particular space or earth science data information system. Preference is given to data compression research that takes into account the scientist's data requirements, and the constraints imposed by the data collection, transmission, distribution and archival systems.
- …