
    Overfitting for Fun and Profit: Instance-Adaptive Data Compression

    Neural data compression has been shown to outperform classical methods in terms of rate-distortion (RD) performance, with results still improving rapidly. At a high level, neural compression is based on an autoencoder that tries to reconstruct the input instance from a (quantized) latent representation, coupled with a prior that is used to losslessly compress these latents. Due to limitations on model capacity and imperfect optimization and generalization, such models will in general compress test data suboptimally. However, one of the great strengths of learned compression is that if the test-time data distribution is known and relatively low-entropy (e.g. a camera watching a static scene, a dash cam in an autonomous car, etc.), the model can easily be finetuned or adapted to this distribution, leading to improved RD performance. In this paper we take this concept to the extreme, adapting the full model to a single video, and sending model updates (quantized and compressed using a parameter-space prior) along with the latent representation. Unlike previous work, we finetune not only the encoder/latents but the entire model, and - during finetuning - take into account both the effect of model quantization and the additional costs incurred by sending the model updates. We evaluate an image compression model on I-frames (sampled at 2 fps) from videos of the Xiph dataset, and demonstrate that full-model adaptation improves RD performance by ~1 dB with respect to encoder-only finetuning.
    Comment: Accepted at International Conference on Learning Representations 202
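    The trade-off described above can be sketched as an objective in which the bit cost of the transmitted model update is added to the usual rate-distortion loss. The following is a minimal illustrative sketch, not the paper's implementation; the function and parameter names (rd_loss, rate_model_delta, lam) are assumptions.

    ```python
    # Hypothetical sketch: instance-adaptive compression pays for the bits
    # needed to send the quantized model update, on top of the latent rate.

    def rd_loss(distortion, rate_latents, rate_model_delta, lam=0.01):
        """Rate-distortion objective with an extra rate term for the
        (quantized, compressed) model-update delta sent to the receiver."""
        return distortion + lam * (rate_latents + rate_model_delta)

    # Encoder-only finetuning sends no model update (rate_model_delta = 0);
    # full-model adaptation must account for that extra rate during finetuning.
    baseline = rd_loss(distortion=0.5, rate_latents=100.0, rate_model_delta=0.0)
    adapted = rd_loss(distortion=0.3, rate_latents=90.0, rate_model_delta=20.0)
    ```

    Full-model adaptation only pays off when the distortion (and possibly latent-rate) savings outweigh the extra bits spent on the model update, which is why the finetuning objective must include that term explicitly.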

    Hopf Maps, Lowest Landau Level, and Fuzzy Spheres

    This paper is a review of monopoles, the lowest Landau level, fuzzy spheres, and their mutual relations. The Hopf maps of division algebras provide a prototype relation between monopoles and fuzzy spheres. The generalization of complex numbers to Clifford algebra is exactly analogous to the generalization of fuzzy two-spheres to higher dimensional fuzzy spheres. Higher dimensional fuzzy spheres have an interesting hierarchical structure made of "compounds" of lower dimensional spheres. We give a physical interpretation for this particular structure of fuzzy spheres by utilizing Landau models in generic even dimensions. With Grassmann algebra, we also introduce a graded version of the Hopf map, and discuss its relation to the fuzzy supersphere in the context of the supersymmetric Landau model.
    Comment: v2: note and references added; v3: references added
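    For concreteness, the first Hopf map S^3 -> S^2, which underlies the monopole/fuzzy-two-sphere relation mentioned above, can be written in a standard form (not specific to this paper) using the Pauli matrices:

    ```latex
    x_i = \psi^\dagger \sigma_i \psi, \qquad
    \psi = \begin{pmatrix} z_1 \\ z_2 \end{pmatrix}, \quad
    |z_1|^2 + |z_2|^2 = 1,
    ```

    where one checks that x_i x_i = (|z_1|^2 + |z_2|^2)^2 = 1, so the normalized spinor on S^3 indeed maps to a point on S^2; the U(1) phase of \psi is the fiber of the map.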

    The Network Analysis of Urban Streets: A Primal Approach

    The network metaphor in the analysis of urban and territorial cases has a long tradition, especially in transportation/land-use planning and economic geography. More recently, urban design has brought its contribution by means of the "space syntax" methodology. All these approaches, though under different terms like accessibility, proximity, integration, connectivity, cost or effort, focus on the idea that some places (or streets) are more important than others because they are more central. The study of centrality in complex systems, however, originated in other scientific areas, namely in structural sociology, well before its use in urban studies; moreover, as a structural property of the system, centrality has never been extensively investigated metrically in geographic networks as it has been topologically in a wide range of other relational networks like social, biological or technological ones. After two previous works on some structural properties of the dual and primal graph representations of urban street networks (Porta et al. cond-mat/0411241; Crucitti et al. physics/0504163), in this paper we provide an in-depth investigation of centrality in the primal approach as compared to the dual one, with a special focus on potentials for urban design.
    Comment: 19 pages, 4 figures. Paper related to the paper "The Network Analysis of Urban Streets: A Dual Approach" cond-mat/041124
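    The primal approach treats intersections as nodes and street segments as edges, and then asks which nodes are most central. As a minimal sketch (a toy graph and hop-count closeness centrality, not the paper's metric analysis, which weights edges by street length), one centrality index can be computed with a plain breadth-first search:

    ```python
    from collections import deque

    # Toy primal street graph: intersections as nodes, streets as edges.
    streets = {
        "A": ["B", "C"],
        "B": ["A", "C", "D"],
        "C": ["A", "B"],
        "D": ["B"],
    }

    def closeness(graph, source):
        """Closeness centrality: (n - 1) / (sum of shortest-path hop counts
        from source to every other node), computed via BFS."""
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in graph[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return (len(graph) - 1) / sum(dist.values())

    # "B" touches every other intersection directly, so it scores highest.
    scores = {node: closeness(streets, node) for node in streets}
    ```

    A metric (rather than topological) variant, closer to the geographic setting the abstract emphasizes, would replace hop counts with street lengths and BFS with Dijkstra's algorithm.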