
    Questions of modernity and postmodernity in contemporary sculpture: a case study: Alomph Abram by David Moore

    The aim of this research is to identify and delimit a certain spirit of the times in order to observe its manifestations in current art. I first sought to account for the values that mark the era we live in, on the philosophical and aesthetic planes. The field of knowledge has made important leaps since the Renaissance and the Enlightenment. Modernity broadened the means of analysis and synthesis of the phenomena of the world. Recently, however, postmodernity seems to cast doubt on modernist experimentation when it proposes the multiplication of the instruments of knowledge. In a second stage, I sought to analyze the spirit underlying the sculptural work of American post-formalism and then to sketch a historical picture of the situation of sculpture in Québec. Finally, I focused on the installation aLomph aBram by the Montreal artist David Moore in order to show its contemporary affiliations from the historical, intellectual and aesthetic points of view. Montréal Trigonix inc. 201

    Learning statistics with Jamovi: a tutorial for psychology students and other beginners

    French version of Navarro, D. J., & Foxcroft, D. R. (2019). Learning statistics with Jamovi: a tutorial for psychology students and other beginners. (Version 0.70). Retrieved from http://www.learnstatswithJamovi.com. Learning statistics with Jamovi covers the content of an introductory statistics course as it is typically taught to undergraduate psychology students. The book covers how to get started in Jamovi and gives an introduction to data manipulation. On the statistical side, it first deals with descriptive statistics and graphing, then with probability theory, sampling and estimation, and null hypothesis testing. After presenting the theory, the book covers the analysis of contingency tables, correlation, t-tests, regression, ANOVA, and factor analysis. Bayesian statistics are introduced at the end of the book
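
    As a rough illustration of the kind of null hypothesis test the book walks through, the snippet below runs a paired-samples t-test in Python with SciPy on invented data; Jamovi itself is point-and-click software, so this is only a stand-in for the concepts covered, not material from the book.

    import numpy as np
    from scipy import stats

    # Toy example: did scores change between two sessions for the same participants?
    rng = np.random.default_rng(42)
    before = rng.normal(100, 15, size=30)
    after = before + rng.normal(2, 5, size=30)

    t, p = stats.ttest_rel(after, before)   # paired-samples t-test
    print(f"t({len(before) - 1}) = {t:.2f}, p = {p:.3f}")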

    Analysis of an algorithm catching elephants on the Internet

    The paper deals with the problem of catching elephants in Internet traffic. The aim is to investigate an algorithm proposed by Azzana, based on a multistage Bloom filter with a refreshment mechanism (called shift in the present paper), able to treat online a huge amount of flows with high traffic variations. An analysis of a simplified model estimates the number of false positives. Limit theorems for the Markov chain that describes the algorithm for large filters are rigorously obtained. The asymptotic behavior of the stochastic model is here deterministic. The limit has a nice formulation in terms of an M/G/1/C queue, which is analytically tractable and which allows the algorithm to be tuned optimally
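
    A minimal sketch of the kind of data structure the abstract describes: a multistage counting Bloom filter in which a flow is flagged as an elephant once its counters saturate in every stage, plus a shift-style refreshment step that decrements all counters so stale small flows fade away. The stage count, counter width and threshold below are illustrative choices, not the paper's tuned values.

    import hashlib

    class MultistageFilter:
        """Multistage counting Bloom filter for large-flow ('elephant') detection."""

        def __init__(self, stages=4, counters_per_stage=1 << 16, threshold=20):
            self.stages = stages
            self.m = counters_per_stage
            self.C = threshold                          # counters saturate at C
            self.table = [[0] * self.m for _ in range(stages)]

        def _indexes(self, flow_id):
            # One counter index per stage, derived from a stage-salted hash of the flow id.
            for s in range(self.stages):
                h = hashlib.blake2b(f"{s}:{flow_id}".encode(), digest_size=8)
                yield s, int.from_bytes(h.digest(), "big") % self.m

        def add(self, flow_id):
            """Count one packet of flow_id; return True if the flow looks like an elephant."""
            saturated = 0
            for s, i in self._indexes(flow_id):
                if self.table[s][i] < self.C:
                    self.table[s][i] += 1
                if self.table[s][i] >= self.C:
                    saturated += 1
            return saturated == self.stages             # elephant iff every stage is full

        def shift(self):
            """Refreshment step: decrement every non-zero counter."""
            for stage in self.table:
                for i, v in enumerate(stage):
                    if v:
                        stage[i] = v - 1

    In this sketch one would call shift() whenever the fraction of non-zero counters crosses a chosen level, which is the role the refreshment mechanism plays when traffic varies strongly.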

    Genetic influences on cost-efficient organization of human cortical functional networks

    The human cerebral cortex is a complex network of functionally specialized regions interconnected by axonal fibers, but the organizational principles underlying cortical connectivity remain unknown. Here, we report evidence that one such principle for functional cortical networks involves finding a balance between maximizing communication efficiency and minimizing connection cost, referred to as optimization of network cost-efficiency. We measured spontaneous fluctuations of the blood oxygenation level-dependent signal using functional magnetic resonance imaging in healthy monozygotic (16 pairs) and dizygotic (13 pairs) twins and characterized cost-efficient properties of brain network functional connectivity between 1041 distinct cortical regions. At the global network level, 60% of the interindividual variance in cost-efficiency of cortical functional networks was attributable to additive genetic effects. Regionally, significant genetic effects were observed throughout the cortex in a largely bilateral pattern, including bilateral posterior cingulate and medial prefrontal cortices, dorsolateral prefrontal and superior parietal cortices, and lateral temporal and inferomedial occipital regions. Genetic effects were stronger for cost-efficiency than for other metrics considered, and were more clearly significant in functional networks operating in the 0.09–0.18 Hz frequency interval than at higher or lower frequencies. These findings are consistent with the hypothesis that brain networks evolved to satisfy competitive selection criteria of maximizing efficiency and minimizing cost, and that optimization of network cost-efficiency represents an important principle for the brain's functional organization
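
    A sketch of how a cost-efficiency score can be computed from a functional connectivity matrix, assuming the common definition of cost-efficiency as global efficiency minus connection cost (density) of the thresholded graph, maximized over thresholds; the random data and region count below are synthetic stand-ins for the BOLD time series and parcellation used in the study.

    import numpy as np
    import networkx as nx

    def cost_efficiency(corr, threshold):
        """Efficiency minus cost for the graph obtained by thresholding |corr|."""
        n = corr.shape[0]
        adj = (np.abs(corr) > threshold) & ~np.eye(n, dtype=bool)
        g = nx.from_numpy_array(adj.astype(int))
        return nx.global_efficiency(g) - nx.density(g)

    # Synthetic illustration: 150 time points for 90 regions.
    rng = np.random.default_rng(0)
    ts = rng.standard_normal((150, 90))
    corr = np.corrcoef(ts, rowvar=False)
    best = max(cost_efficiency(corr, t) for t in np.arange(0.1, 0.6, 0.05))
    print(f"maximum cost-efficiency over thresholds: {best:.3f}")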

    Developing an acoustic-phonetic characterization of dysarthric speech in French

    ISBN: 2-9517408-6-7. Domains: Phonetic Databases, Phonology, Person Identification. This paper presents the rationale, objectives and advances of an ongoing project (the DesPho-APaDy project, funded by the French National Agency of Research) which aims to provide a systematic and quantified description of French dysarthric speech over a large population of patients and three dysarthria types (related to Parkinson's disease, amyotrophic lateral sclerosis, and a pure cerebellar alteration). The two French corpora of dysarthric patients, from which the speech data have been selected for analysis purposes, are first described. Secondly, the paper discusses and outlines the requirement for a structured, organized computerized platform in order to store, organize and make accessible (for selected and protected usage) dysarthric speech corpora and the associated patients' clinical information (mostly scattered across different locations: labs, hospitals, ...). The design of both a computer database and a multi-field query interface is proposed for the clinical context. Finally, advances of the project related to the selection of the population used for the dysarthria analysis, the preprocessing of the speech files, their orthographic transcription and their automatic alignment are also presented
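
    The database and multi-field query interface are described only at a high level in the abstract; the snippet below is a purely hypothetical sketch of the kind of relational schema and clinical query such a platform might expose (table, column and task names are invented, not taken from the DesPho-APaDy project).

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE patient (
        patient_id  INTEGER PRIMARY KEY,
        dysarthria  TEXT CHECK (dysarthria IN ('parkinson', 'ALS', 'cerebellar')),
        severity    REAL
    );
    CREATE TABLE recording (
        recording_id INTEGER PRIMARY KEY,
        patient_id   INTEGER REFERENCES patient(patient_id),
        task         TEXT,      -- e.g. reading passage, sustained vowel
        wav_path     TEXT,
        transcript   TEXT
    );
    """)

    # A multi-field query of the kind a clinical interface could issue.
    rows = conn.execute(
        "SELECT r.wav_path FROM recording r JOIN patient p USING (patient_id) "
        "WHERE p.dysarthria = ? AND p.severity >= ? AND r.task = ?",
        ("ALS", 2.0, "reading passage"),
    ).fetchall()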

    Learning, Memory, and the Role of Neural Network Architecture

    The performance of information processing systems, from artificial neural networks to natural neuronal ensembles, depends heavily on the underlying system architecture. In this study, we compare the performance of parallel and layered network architectures during sequential tasks that require both acquisition and retention of information, thereby identifying tradeoffs between learning and memory processes. During the task of supervised, sequential function approximation, networks produce and adapt representations of external information. Performance is evaluated by statistically analyzing the error in these representations while varying the initial network state, the structure of the external information, and the time given to learn the information. We link performance to complexity in network architecture by characterizing local error landscape curvature. We find that variations in error landscape structure give rise to tradeoffs in performance; these include the ability of the network to maximize accuracy versus minimize inaccuracy and produce specific versus generalizable representations of information. Parallel networks generate smooth error landscapes with deep, narrow minima, enabling them to find highly specific representations given sufficient time. While accurate, however, these representations are difficult to generalize. In contrast, layered networks generate rough error landscapes with a variety of local minima, allowing them to quickly find coarse representations. Although less accurate, these representations are easily adaptable. The presence of measurable performance tradeoffs in both layered and parallel networks has implications for understanding the behavior of a wide variety of natural and artificial learning systems
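
    As a rough sketch of the sequential learn-then-retain setup (not the authors' protocol), one can compare a wide single-hidden-layer network, standing in for the "parallel" architecture, against a deeper narrow one, standing in for the "layered" architecture, on two functions learned one after the other; the architectures, target functions and error measure here are illustrative assumptions.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)
    x = rng.uniform(-1, 1, size=(400, 1))
    f1, f2 = np.sin(3 * x).ravel(), np.cos(5 * x).ravel()

    def sequential_errors(hidden_layers):
        net = MLPRegressor(hidden_layer_sizes=hidden_layers, max_iter=2000, random_state=0)
        net.fit(x, f1)                                   # learn the first function
        err_f1 = np.mean((net.predict(x) - f1) ** 2)
        for _ in range(200):                             # then keep training on the second
            net.partial_fit(x, f2)
        err_f2 = np.mean((net.predict(x) - f2) ** 2)
        err_f1_after = np.mean((net.predict(x) - f1) ** 2)   # retention of the first
        return err_f1, err_f2, err_f1_after

    print("wide (parallel-like):", sequential_errors((64,)))
    print("deep (layered-like): ", sequential_errors((8, 8, 8)))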

    Efficient Physical Embedding of Topologically Complex Information Processing Networks in Brains and Computer Circuits

    Nervous systems are information processing networks that evolved by natural selection, whereas very large scale integrated (VLSI) computer circuits have evolved by commercially driven technology development. Here we follow historic intuition that all physical information processing systems will share key organizational properties, such as modularity, that generally confer adaptivity of function. It has long been observed that modular VLSI circuits demonstrate an isometric scaling relationship between the number of processing elements and the number of connections, known as Rent's rule, which is related to the dimensionality of the circuit's interconnect topology and its logical capacity. We show that human brain structural networks, and the nervous system of the nematode C. elegans, also obey Rent's rule, and exhibit some degree of hierarchical modularity. We further show that the estimated Rent exponent of human brain networks, derived from MRI data, can explain the allometric scaling relations between gray and white matter volumes across a wide range of mammalian species, again suggesting that these principles of nervous system design are highly conserved. For each of these fractal modular networks, the dimensionality of the interconnect topology was greater than the 2 or 3 Euclidean dimensions of the space in which it was embedded. This relatively high complexity entailed extra cost in physical wiring: although all networks were economically or cost-efficiently wired they did not strictly minimize wiring costs. Artificial and biological information processing systems both may evolve to optimize a trade-off between physical cost and topological complexity, resulting in the emergence of homologous principles of economical, fractal and modular design across many different kinds of nervous and computational networks
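
    Rent's rule relates the number of external connections T of a module of g nodes as T ≈ t * g^p, with p the Rent exponent. Below is a hedged sketch of the standard partition-based estimate of p, run on a synthetic small-world graph rather than on the MRI-derived human or C. elegans networks analyzed in the paper.

    import numpy as np
    import networkx as nx
    from networkx.algorithms.community import kernighan_lin_bisection

    def rent_points(g, min_size=8):
        """Recursively bisect g; record (block size, external links) for every block."""
        points, queue = [], [set(g.nodes)]
        while queue:
            block = queue.pop()
            terminals = sum(1 for u, v in g.edges if (u in block) != (v in block))
            points.append((len(block), max(terminals, 1)))
            if len(block) > min_size:
                a, b = kernighan_lin_bisection(g.subgraph(block))
                queue += [a, b]
        return np.array(points)

    g = nx.watts_strogatz_graph(256, 6, 0.1, seed=0)     # stand-in network
    pts = rent_points(g)
    p, log_t = np.polyfit(np.log(pts[:, 0]), np.log(pts[:, 1]), 1)
    print(f"estimated Rent exponent p ≈ {p:.2f}")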

    Do brain networks evolve by maximizing their information flow capacity?

    We propose a working hypothesis, supported by numerical simulations, that brain networks evolve based on the principle of maximization of their internal information flow capacity. We find that the synchronous behavior and the information flow capacity of the evolved networks reproduce well the behaviors observed in the brain dynamical networks of Caenorhabditis elegans and humans, modeled as networks of Hindmarsh-Rose neurons whose connection graphs are given by these brain networks. We make a strong case for our hypothesis by showing that the neural networks with the closest graph distance to the brain networks of Caenorhabditis elegans and humans are the Hindmarsh-Rose neural networks evolved with coupling strengths that maximize information flow capacity. Surprisingly, we find that global neural synchronization levels decrease during brain evolution, reflecting an underlying globally non-Hebbian-like evolution process, driven by non-Hebbian-like learning behaviors for some of the clusters during evolution and Hebbian-like learning rules for clusters where neurons increase their synchronization
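
    For reference, the Hindmarsh-Rose neuron mentioned in the abstract is a three-variable model. The sketch below integrates electrically coupled Hindmarsh-Rose neurons with a simple Euler scheme, using the textbook parameter values; the random coupling graph and the crude synchrony index are illustrative assumptions and not the information flow capacity measure used in the paper.

    import numpy as np

    # Standard Hindmarsh-Rose parameters (a common bursting regime).
    a, b, c, d, s, x_r, r, I_ext = 1.0, 3.0, 1.0, 5.0, 4.0, -1.6, 0.005, 3.25

    def simulate(adj, g_coupling, t_max=500.0, dt=0.01, seed=0):
        """Euler integration of electrically coupled Hindmarsh-Rose neurons on graph adj."""
        n = adj.shape[0]
        rng = np.random.default_rng(seed)
        x, y, z = rng.uniform(-1, 1, n), np.zeros(n), np.zeros(n)
        traces = []
        for _ in range(int(t_max / dt)):
            coupling = g_coupling * (adj @ x - adj.sum(axis=1) * x)   # sum_j A_ij (x_j - x_i)
            dx = y - a * x**3 + b * x**2 - z + I_ext + coupling
            dy = c - d * x**2 - y
            dz = r * (s * (x - x_r) - z)
            x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
            traces.append(x.copy())
        return np.array(traces)

    rng = np.random.default_rng(1)
    adj = np.triu((rng.random((10, 10)) < 0.3).astype(float), 1)
    adj += adj.T                                          # undirected random coupling graph
    xs = simulate(adj, g_coupling=0.3)
    sync = np.mean(xs.var(axis=1) < 0.05)                 # fraction of time steps with a tight ensemble
    print(f"rough synchrony index: {sync:.2f}")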