
    Framed sheaves on projective space and Quot schemes

    We prove that, given integers $m \geq 3$, $r \geq 1$ and $n \geq 0$, the moduli space of torsion-free sheaves on $\mathbb{P}^m$ with Chern character $(r,0,\ldots,0,-n)$ that are trivial along a hyperplane $D \subset \mathbb{P}^m$ is isomorphic to the Quot scheme $\mathrm{Quot}_{\mathbb{A}^m}(\mathscr{O}^{\oplus r}, n)$ of $0$-dimensional, length-$n$ quotients of the free sheaf $\mathscr{O}^{\oplus r}$ on $\mathbb{A}^m$.
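    Stated compactly, the main result is the isomorphism below; the symbol $\mathcal{M}^{D}_{\mathbb{P}^m}(r,n)$ for the framed moduli space is a placeholder of ours, not notation taken from the paper.

```latex
% Schematic statement of the result; \mathcal{M}^{D}_{\mathbb{P}^m}(r,n)
% is our placeholder for the moduli space of torsion-free sheaves on P^m
% with Chern character (r,0,...,0,-n), trivial along the hyperplane D.
\[
  \mathcal{M}^{D}_{\mathbb{P}^m}(r,n)
  \;\cong\;
  \mathrm{Quot}_{\mathbb{A}^m}\!\bigl(\mathscr{O}^{\oplus r},\, n\bigr),
  \qquad m \geq 3,\quad r \geq 1,\quad n \geq 0.
\]
```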

    What is the probability that a random symmetric tensor is close to rank-one?

    We address the general problem of estimating the probability that a real symmetric tensor is close to the set of rank-one tensors. Using Weyl's tube formula, we turn this question into a differential-geometric one involving the study of metric invariants of the real Veronese variety. More precisely, we give an explicit formula for its reach and curvature coefficients with respect to the Bombieri-Weyl metric. These results are obtained using techniques from random matrix theory and an explicit description of the second fundamental form of the Veronese variety in terms of GOE matrices. Our findings give a complete solution to the original problem and, in the case of rational normal curves, lead to some novel asymptotic results.
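    For orientation, a schematic Euclidean form of the tube formula the argument relies on is shown below; the normalization of the curvature coefficients is an assumption of this sketch, and the paper itself works with the Bombieri-Weyl metric rather than this flat model.

```latex
% Weyl's tube formula (schematic Euclidean form): for a compact
% m-dimensional submanifold M of R^n and tube radii below the reach,
% the tube volume is a polynomial in epsilon whose coefficients K_e(M)
% are the curvature invariants the abstract refers to.
\[
  \operatorname{vol}\bigl(T(M,\varepsilon)\bigr)
  \;=\;
  \sum_{\substack{e = 0 \\ e\ \mathrm{even}}}^{m}
  K_e(M)\,\varepsilon^{\,n-m+e},
  \qquad 0 < \varepsilon < \operatorname{reach}(M).
\]
```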

    The geometry of hidden representations of large transformer models

    Large transformers are powerful architectures used for self-supervised data analysis across various data types, including protein sequences, images, and text. In these models, the semantic structure of the dataset emerges from a sequence of transformations between one representation and the next. We characterize the geometric and statistical properties of these representations and how they change as we move through the layers. By analyzing the intrinsic dimension (ID) and neighbor composition, we find that the representations evolve similarly in transformers trained on protein language tasks and image reconstruction tasks. In the first layers, the data manifold expands, becoming high-dimensional, and then contracts significantly in the intermediate layers. In the last part of the model, the ID remains approximately constant or forms a second, shallower peak. We show that the semantic information of the dataset is best expressed at the end of the first peak, and this phenomenon can be observed across many models trained on diverse datasets. Based on our findings, we point out an explicit strategy to identify, without supervision, the layers that maximize semantic content: representations at intermediate layers corresponding to a relative minimum of the ID profile are more suitable for downstream learning tasks.
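    A minimal sketch of how the proposed layer-selection heuristic could be implemented, assuming the TwoNN intrinsic-dimension estimator of Facco et al. and hypothetical per-layer activation arrays; `twonn_id` and `layer_activations` are illustrative names, not the authors' code.

```python
import numpy as np

def twonn_id(X: np.ndarray) -> float:
    """TwoNN intrinsic-dimension estimate from an (N, D) point cloud."""
    # Squared pairwise distances via the Gram matrix (memory-friendly).
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    np.fill_diagonal(d2, np.inf)
    d2.sort(axis=1)
    r1, r2 = np.sqrt(d2[:, 0]), np.sqrt(d2[:, 1])  # 1st/2nd neighbor distances
    mu = r2 / r1  # under TwoNN, Pareto-distributed with exponent = ID
    return len(mu) / np.sum(np.log(mu))  # maximum-likelihood estimate

# Hypothetical per-layer activations: one (N, D_l) array per layer.
rng = np.random.default_rng(0)
layer_activations = [rng.normal(size=(500, d)) for d in (64, 256, 32, 64)]

ids = np.array([twonn_id(X) for X in layer_activations])
# Simplest proxy for "a relative minimum of the ID profile at an
# intermediate layer", per the unsupervised strategy in the abstract.
best_layer = 1 + int(np.argmin(ids[1:-1]))
print("ID profile:", np.round(ids, 2), "-> candidate layer:", best_layer)
```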

    Emergent representations in networks trained with the Forward-Forward algorithm

    The Backpropagation algorithm, widely used to train neural networks, has often been criticised for its lack of biological realism. In an attempt to find a more biologically plausible alternative that avoids back-propagating gradients in favour of local learning rules, the recently introduced Forward-Forward algorithm replaces the traditional forward and backward passes of Backpropagation with two forward passes. In this work, we show that the internal representations obtained with the Forward-Forward algorithm organize into robust, category-specific ensembles composed of an extremely low number of active units (high sparsity). This is remarkably similar to what is observed in cortical representations during sensory processing. While not found in models trained with standard Backpropagation, sparsity also emerges in networks optimized by Backpropagation on the same training objective as Forward-Forward. These results suggest that the learning procedure proposed by Forward-Forward may be superior to Backpropagation in modelling learning in the cortex, even when a backward pass is used.
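    A minimal sketch of one Forward-Forward training step, following the two-pass scheme described above; the goodness function (sum of squared activations), the threshold `theta`, and the layer sizes are assumptions taken from Hinton's original proposal, not this paper's exact setup.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
dims, theta, lr = [784, 500, 500], 2.0, 0.03
layers = [torch.nn.Linear(dims[i], dims[i + 1]) for i in range(len(dims) - 1)]
opts = [torch.optim.SGD(l.parameters(), lr=lr) for l in layers]

def ff_step(x_pos: torch.Tensor, x_neg: torch.Tensor) -> None:
    """One local update per layer: raise goodness on positive data, lower it on negative."""
    h_pos, h_neg = x_pos, x_neg
    for layer, opt in zip(layers, opts):
        z_pos, z_neg = F.relu(layer(h_pos)), F.relu(layer(h_neg))
        g_pos, g_neg = (z_pos ** 2).sum(1), (z_neg ** 2).sum(1)  # goodness
        # Local objective: goodness above theta for positives, below for negatives.
        loss = F.softplus(-(g_pos - theta)).mean() + F.softplus(g_neg - theta).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
        # Normalize and detach: only the activity pattern is passed on,
        # so no gradient ever propagates backward across layers.
        h_pos = F.normalize(z_pos, dim=1).detach()
        h_neg = F.normalize(z_neg, dim=1).detach()

# Hypothetical positive/negative batches (e.g., correct vs. corrupted inputs).
ff_step(torch.randn(32, 784), torch.randn(32, 784))
```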

    Exploiting secondary raw materials from extractive waste facilities: A case study

    In recent years, resource scarcity has emphasised the need to transition from a linear to a circular flow of resources. Securing supplies of critical and secondary raw materials (CRM/SRM) for the manufacturing industry is at the forefront of industrial challenges, especially in Europe, the USA and Asia. A key step towards achieving resource efficiency is to recover these materials from anthropogenic waste deposits, such as urban landfill sites and extractive waste (EW) facilities. This means breaking away from the traditional linear use of resources in favour of a closed-loop approach that allows maximum recovery of resources from waste. The management of extractive waste deposits and resource recovery is closely linked to the concept of urban mining. In this paper, we present a case study illustrating the feasibility of recovering SRM from EW facilities and discuss the pros and cons of undertaking such activities.