
    Development of human limbs.

    This work offers a new view of the developmental history of tetrapods. It proposes an original model of the evolution of the human limbs based on the metameric formation of osteogenic buds in accordance with primary segmentation and biplanar symmetry. Over the course of evolution, osteogenic buds that were initially identical to one another changed in size, realigned, regressed, and fused, while preserving their direction of formation in accordance with the following formula (taking sesamoid bones into account):
2; 1; 2; 3; 2; 3; 5; 5; 8; 8 (in the upper limb together with the upper limb girdle)
3; 2; 3; 2; 1; 2; 8; 8; 5; 5 (in the lower limb together with the pelvic bones)
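
    As a purely illustrative aid (not the authors' notation), the quoted formula can be written out as two plain number lists and inspected in Python. The split of each list into a proximal group of six counts and a distal group of four counts is an assumption of this sketch, made only to show one way in which the lower-limb sequence mirrors the upper-limb one, in keeping with the biplanar symmetry the abstract invokes.

        # Illustrative sketch only: the bud-count formula quoted above as plain lists.
        upper_limb = [2, 1, 2, 3, 2, 3, 5, 5, 8, 8]   # upper limb plus upper limb girdle
        lower_limb = [3, 2, 3, 2, 1, 2, 8, 8, 5, 5]   # lower limb plus pelvic bones

        # Assumed grouping: first six counts (proximal) and last four (distal).
        # Each lower-limb group reads as the reversal of the corresponding
        # upper-limb group, which is one way to picture the biplanar symmetry.
        assert lower_limb[:6] == list(reversed(upper_limb[:6]))
        assert lower_limb[6:] == list(reversed(upper_limb[6:]))

        print(sum(upper_limb), sum(lower_limb))  # both sequences sum to 39 buds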

    From holism to compositionality: memes and the evolution of segmentation, syntax, and signification in music and language

    Steven Mithen argues that language evolved from an antecedent he terms “Hmmmmm”: Holistic, manipulative, multi-modal, musical and mimetic. Owing to certain innate and learned factors, a capacity for segmentation and cross-stream mapping in early Homo sapiens broke the continuous line of Hmmmmm, creating discrete replicated units which, with the initial support of Hmmmmm, eventually became the semantically freighted words of modern language. What remained after this bifurcation of Hmmmmm arguably survived as music: a sound stream segmented into discrete units, but without the explicit and relatively fixed semantic content of language. All three types of utterance – the parent Hmmmmm, language, and music – are amenable to a memetic interpretation, which applies Universal Darwinism to what are understood as language memes and musical memes. On the basis of Peter Carruthers’ distinction between ‘cognitivism’ and ‘communicativism’ in language, and William Calvin’s theories of cortical information encoding, a framework is hypothesized for the semantic and syntactic associations between, on the one hand, the sonic patterns of language memes (‘lexemes’) and of musical memes (‘musemes’) and, on the other hand, ‘mentalese’ conceptual structures in Chomsky’s ‘Logical Form’ (LF).

    Neural End-to-End Learning for Computational Argumentation Mining

    We investigate neural techniques for end-to-end computational argumentation mining (AM). We frame AM both as a token-based dependency parsing problem and as a token-based sequence tagging problem, including a multi-task learning setup. Contrary to models that operate on the argument component level, we find that framing AM as dependency parsing leads to subpar performance. In contrast, less complex (local) tagging models based on BiLSTMs perform robustly across classification scenarios and are able to catch the long-range dependencies inherent to the AM problem. Moreover, we find that jointly learning 'natural' subtasks, in a multi-task learning setup, improves performance. Comment: To be published at ACL 2017.
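
    A minimal sketch of the sequence-tagging framing described above, assuming a BIO-style argument-component label set and a plain BiLSTM token tagger; the vocabulary size, dimensions, and label names are illustrative assumptions, and this is not the authors' exact architecture.

        # Sketch: argumentation mining as token-level BIO sequence tagging with a BiLSTM.
        import torch
        import torch.nn as nn

        # Example component labels (assumed for illustration).
        LABELS = ["O", "B-Claim", "I-Claim", "B-Premise", "I-Premise",
                  "B-MajorClaim", "I-MajorClaim"]

        class BiLSTMTagger(nn.Module):
            def __init__(self, vocab_size=10_000, emb_dim=100, hidden_dim=128,
                         num_labels=len(LABELS)):
                super().__init__()
                self.embed = nn.Embedding(vocab_size, emb_dim)
                self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                                      bidirectional=True)
                self.proj = nn.Linear(2 * hidden_dim, num_labels)

            def forward(self, token_ids):                  # token_ids: (batch, seq_len)
                h, _ = self.bilstm(self.embed(token_ids))  # (batch, seq_len, 2*hidden)
                return self.proj(h)                        # per-token label scores

        # Toy usage: tag a batch of two 6-token sequences of random token ids.
        model = BiLSTMTagger()
        tokens = torch.randint(0, 10_000, (2, 6))
        scores = model(tokens)                             # (2, 6, len(LABELS))
        predicted = scores.argmax(dim=-1)                  # most likely tag per token
        print([[LABELS[i] for i in row] for row in predicted.tolist()])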