397 research outputs found

    Documentation: beyond Google

    Rural architecture in the western Montsant

    When information yields to spectacle

    A report of 29 new cormophyte taxa and other floristic data of interest for the Montsant massif

    The paper reports 29 taxa of vascular plants new to the Montsant range, identified by the author and others after the publication of the ‘Flora de la Serra de Montsant' in 2007. Among these are several alien species that have recently spread into the massif or had previously gone unnoticed, species of high biogeographic interest, new records for the central Catalanidic territory or even for the south of Catalonia, and taxa whose ecology and distribution are poorly known or that are very rare in Catalonia. The work also provides new data on 7 further taxa that are very rare or of special interest.

    Rural architecture in the western Montsant (Priorat)

    Telling BERT's full story: from Local Attention to Global Aggregation

    We take a deep look into the behavior of self-attention heads in the transformer architecture. In light of recent work discouraging the use of attention distributions for explaining a model's behavior, we show that attention distributions can nevertheless provide insights into the local behavior of attention heads. We thus propose a distinction between local patterns revealed by attention and global patterns that refer back to the input, and analyze BERT from both angles. We use gradient attribution to analyze how the output of an attention head depends on the input tokens, effectively extending the local attention-based analysis to account for the mixing of information throughout the transformer layers. We find a significant discrepancy between attention and attribution distributions, caused by the mixing of context inside the model. We quantify this discrepancy and observe that, interestingly, some patterns persist across all layers despite the mixing.
    Comment: Accepted at EACL 2021
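    The abstract contrasts raw attention distributions with gradient attribution to the input tokens. Below is a minimal sketch (not the authors' released code) of that idea, assuming Hugging Face transformers and PyTorch: the model name, the layer/head indices, and the use of a hidden-state norm as the scalar being attributed are illustrative stand-ins for the per-head outputs analyzed in the paper.

```python
import torch
from transformers import BertModel, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

enc = tokenizer("The cat sat on the mat.", return_tensors="pt")

# Feed word embeddings explicitly so gradients w.r.t. each input token are available.
word_emb = model.embeddings.word_embeddings(enc["input_ids"])
word_emb.retain_grad()

outputs = model(inputs_embeds=word_emb,
                attention_mask=enc["attention_mask"],
                output_hidden_states=True)

layer, head, target_pos = 5, 3, 0            # illustrative choices
hidden = outputs.hidden_states[layer + 1]    # hidden states after the chosen layer

# Scalar proxy for the model's output at target_pos: the norm of its hidden
# vector (the paper attributes the actual per-head output, which would
# require a hook inside the attention module).
hidden[0, target_pos].norm().backward()

# Gradient attribution per input token vs. the raw attention distribution
# of the chosen head at the same target position.
attribution = word_emb.grad[0].norm(dim=-1)
attention = outputs.attentions[layer][0, head, target_pos]

for tok, attr, att in zip(tokenizer.convert_ids_to_tokens(enc["input_ids"][0].tolist()),
                          attribution, attention):
    print(f"{tok:>8}  attribution={attr.item():.3f}  attention={att.item():.3f}")
```

    Comparing the two printed columns illustrates the discrepancy the abstract describes: tokens receiving high attention need not be the ones whose embeddings the output is most sensitive to once context mixing across layers is taken into account.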