
    A general algebraic algorithm for blind extraction of one source in a MIMO convolutive mixture

    The paper deals with the problem of blind source extraction from a MIMO convolutive mixture. We define a new criterion for source extraction that uses higher-order contrast functions based on so-called reference signals; it generalizes existing reference-based contrasts. To optimize the new criterion, we propose a general algebraic algorithm based on best rank-1 tensor approximation. Computer simulations illustrate the good behavior of our algorithm and its interest in comparison with other approaches.
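
    The optimization step named above rests on best rank-1 tensor approximation. Below is a minimal numpy sketch of that building block for a third-order tensor, using alternating power iterations; the shapes and names are assumptions for illustration, not the authors' extraction algorithm.

```python
import numpy as np

def best_rank1_3tensor(T, iters=100, tol=1e-10):
    """Best rank-1 approximation of a third-order tensor T (shape I x J x K)
    via alternating power iterations, so that T ~ sigma * (u outer v outer w)."""
    I, J, K = T.shape
    # Initialize u with the dominant left singular vector of the mode-1 unfolding.
    u = np.linalg.svd(T.reshape(I, J * K), full_matrices=False)[0][:, 0]
    v = np.ones(J) / np.sqrt(J)
    w = np.ones(K) / np.sqrt(K)
    sigma = 0.0
    for _ in range(iters):
        u = np.einsum('ijk,j,k->i', T, v, w)
        u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w)
        v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v)
        sigma_new = np.linalg.norm(w)
        w /= sigma_new
        if abs(sigma_new - sigma) < tol:   # converged
            sigma = sigma_new
            break
        sigma = sigma_new
    return sigma, u, v, w
```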

    Different Methods to Define Utility Functions Yield Similar Results but Engage Different Neural Processes

    Although the concept of utility is fundamental to many economic theories, a generally accepted method for determining a subject's utility function is still not available. We investigated two methods used in economics for describing utility functions, using response-locked event-related potentials to assess their neural underpinnings. To determine the certainty equivalent, we used a lottery game with a probability to win of p = 0.5; to identify the subjects' utility functions directly, a standard bisection task was applied. Although the lottery task's payoffs were only hypothetical, a pronounced negativity was observed that resembled the error-related negativity (ERN) previously described in action-monitoring research, but it occurred only for choices far away from the indifference point between money and lottery. By contrast, the bisection task failed to evoke a notable ERN irrespective of the responses' correctness. Based on these findings, we reason that only decisions made in the lottery task achieved a level of subjective relevance that activates cognitive-emotional monitoring. In terms of economics, our findings support the view that the bisection method is unaffected by probability valuation or other parameters related to risk and can therefore, in combination with the lottery task, be used to differentiate between payoff and probability valuation.
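
    The certainty-equivalent method mentioned above is commonly operationalized by midpoint chaining: under expected utility, the certainty equivalent of a 50/50 lottery over a and b satisfies u(CE) = (u(a) + u(b)) / 2. The sketch below illustrates that chaining; the payoff range and the simulated respondent are invented for illustration and are not taken from the study.

```python
def chained_utility_points(ce, low, high, depth=3):
    """Trace points of a utility function from certainty equivalents of 50/50
    lotteries.  `ce(a, b)` returns the elicited certainty equivalent of a
    lottery paying a or b with p = 0.5.  Normalize u(low) = 0 and u(high) = 1;
    under expected utility, u(ce(a, b)) = (u(a) + u(b)) / 2."""
    points = {low: 0.0, high: 1.0}
    frontier = [(low, high)]
    for _ in range(depth):
        next_frontier = []
        for a, b in frontier:
            m = ce(a, b)
            points[m] = (points[a] + points[b]) / 2.0
            next_frontier += [(a, m), (m, b)]
        frontier = next_frontier
    return dict(sorted(points.items()))

# Hypothetical risk-averse respondent whose certainty equivalents lie below
# the lottery's expected value (numbers are purely illustrative).
curve = chained_utility_points(lambda a, b: 0.45 * a + 0.45 * b + 0.1 * min(a, b),
                               low=0.0, high=100.0, depth=2)
print(curve)
```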

    Methodological considerations of integrating portable digital technologies in the analysis and management of complex superimposed Californian pictographs: From spectroscopy and spectral imaging to 3-D scanning

    How can the use of newly developed, advanced portable technologies give us a greater understanding of the most complex prehistoric rock art? This is the question driving The Gordian Knot project, which analyses the polychrome Californian site known as Pleito. New, small transportable devices allow detailed on-site analyses of rock art. These non-destructive portable technologies can use X-ray and Raman techniques to determine the chemical elements in the pigments that make up the paintings; they can use imaging techniques such as Highlight Reflectance Transformation Imaging and dStretch© to enhance the paintings' visibility; they can use digital imagery to disentangle complex superimposed paintings; and they can use portable laser instruments to analyse the micro-topography of the rock surface and integrate these technologies into a 3-D environment. This paper outlines a robust methodology and preliminary results to show how an integration of different portable technologies can serve rock art research and management.
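
    dStretch© is a proprietary decorrelation-stretch plugin; the sketch below only illustrates the underlying decorrelation-stretch idea (exaggerating subtle colour differences by equalizing variance along the principal colour axes), not the plugin itself or the project's pipeline.

```python
import numpy as np

def decorrelation_stretch(img, target_sigma=50.0):
    """Decorrelation stretch of an RGB image (H x W x 3, 8-bit value range).
    Rotates colours onto their principal axes, equalizes the variance along
    each axis, and rotates back, exaggerating subtle colour differences such
    as faded pigment against the rock surface."""
    x = img.reshape(-1, 3).astype(np.float64)
    mean = x.mean(axis=0)
    cov = np.cov(x - mean, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    # Scale each principal colour axis to a common standard deviation.
    scale = target_sigma / np.sqrt(np.maximum(evals, 1e-12))
    transform = evecs @ np.diag(scale) @ evecs.T
    y = (x - mean) @ transform + mean
    return np.clip(y, 0, 255).reshape(img.shape).astype(np.uint8)
```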

    Independent EEG Sources Are Dipolar

    Independent component analysis (ICA) and blind source separation (BSS) methods are increasingly used to separate individual brain and non-brain source signals mixed by volume conduction in electroencephalographic (EEG) and other electrophysiological recordings. We compared results of decomposing thirteen 71-channel human scalp EEG datasets by 22 ICA and BSS algorithms, assessing the pairwise mutual information (PMI) in scalp channel pairs, the remaining PMI in component pairs, the overall mutual information reduction (MIR) effected by each decomposition, and decomposition ‘dipolarity’ defined as the number of component scalp maps matching the projection of a single equivalent dipole with less than a given residual variance. The least well-performing algorithm was principal component analysis (PCA); best performing were AMICA and other likelihood/mutual information based ICA methods. Though these and other commonly used decomposition methods returned many similar components, across 18 ICA/BSS algorithms mean dipolarity varied linearly both with MIR and with the PMI remaining between the resulting component time courses, a result compatible with an interpretation of many maximally independent EEG components as being volume-conducted projections of partially synchronous local cortical field activity within single compact cortical domains. To encourage further method comparisons, the data and software used to prepare the results have been made available (http://sccn.ucsd.edu/wiki/BSSComparison).
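
    For readers unfamiliar with the mutual information reduction (MIR) measure referred to above: MIR is the total dependence among the raw scalp channels minus the dependence remaining among the component activations. The sketch below uses a Gaussian (correlation-based) proxy for mutual information purely to illustrate that bookkeeping; the paper uses nonparametric entropy estimates, and the proxy captures only linear dependence, so it will report ICA outputs as essentially independent.

```python
import numpy as np
from sklearn.decomposition import FastICA

def gaussian_total_mi(X):
    """Total mutual information among the rows of X (signals x samples) under
    a Gaussian approximation: -0.5 * log det of the correlation matrix."""
    R = np.corrcoef(X)
    return -0.5 * np.linalg.slogdet(R)[1]

def mutual_information_reduction(X):
    """Dependence among raw channels minus dependence remaining among ICA
    component activations (Gaussian proxy only; illustrative, not the
    paper's nonparametric estimator)."""
    ica = FastICA(n_components=X.shape[0], max_iter=1000, random_state=0)
    S = ica.fit_transform(X.T).T        # component activations, one per row
    return gaussian_total_mi(X) - gaussian_total_mi(S)
```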

    A Peer-to-Peer Middleware Framework for Resilient Persistent Programming

    The persistent programming systems of the 1980s offered a programming model that integrated computation and long-term storage. In these systems, reliable applications could be engineered without requiring the programmer to write translation code to manage the transfer of data to and from non-volatile storage. More importantly, it simplified the programmer's conceptual model of an application, and avoided the many coherency problems that result from multiple cached copies of the same information. Although technically innovative, persistent languages were not widely adopted, perhaps due in part to their closed-world model. Each persistent store was located on a single host, and there were no flexible mechanisms for communication or transfer of data between separate stores. Here we re-open the work on persistence and combine it with modern peer-to-peer techniques in order to provide support for orthogonal persistence in resilient and potentially long-running distributed applications. Our vision is of an infrastructure within which an application can be developed and distributed with minimal modification, whereupon the application becomes resilient to certain failure modes. If a node, or the connection to it, fails during execution of the application, the objects are re-instantiated from distributed replicas, without their reference holders being aware of the failure. Furthermore, we believe that this can be achieved within a spectrum of application programmer intervention, ranging from minimal to totally prescriptive, as desired. The same mechanisms encompass an orthogonally persistent programming model. We outline our approach to implementing this vision, and describe current progress. Comment: Submitted to EuroSys 200
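
    The failure transparency described above, re-instantiating an object from a distributed replica without the reference holder noticing, can be pictured as a reference that resolves against an ordered list of replicas. The Python sketch below is an illustrative stand-in under assumed names (`fetch` and `ConnectionError` as the failure signal); it is not the framework's actual API.

```python
class ResilientRef:
    """Illustrative stand-in for a resilient object reference: resolution
    falls back to the next replica when a node or its connection fails, so
    the reference holder never observes the failure directly."""
    def __init__(self, object_id, replica_stores):
        self.object_id = object_id
        self.replica_stores = list(replica_stores)    # ordered replica hosts

    def get(self):
        last_error = None
        for store in self.replica_stores:
            try:
                # `fetch` is a hypothetical replica-store call, not the
                # middleware's actual interface.
                return store.fetch(self.object_id)
            except ConnectionError as err:            # node or link failure
                last_error = err
        raise RuntimeError("all replicas unreachable") from last_error
```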

    Parental unemployment and children's school performance

    This study investigates the effect of parental unemployment on children's school performance. We use individual-level data for all children completing lower secondary school in Sweden in 1990 who moved directly on to three years of upper secondary school. We control for family and individual heterogeneity by means of lower secondary school GPA. The huge variation in Swedish unemployment at the beginning of the 1990s provides an ideal setting for testing the hypothesis that parental unemployment affects children's school performance. Our results indicate that having an unemployed father has a negative effect on children's school performance, while having an unemployed mother has a positive effect. Keywords: school performance; unemployment
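
    The design described above amounts to regressing upper secondary performance on indicators of parental unemployment while conditioning on lower secondary GPA. A minimal statsmodels sketch of such a specification follows; the file name and column names are hypothetical, not the study's actual data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data set, one row per pupil; file and column names are
# illustrative, not the study's actual variables.
#   upper_gpa    : upper secondary school performance (outcome)
#   lower_gpa    : lower secondary school GPA (prior-ability control)
#   father_unemp : indicator for a father's unemployment spell
#   mother_unemp : indicator for a mother's unemployment spell
df = pd.read_csv("pupils.csv")

model = smf.ols("upper_gpa ~ father_unemp + mother_unemp + lower_gpa", data=df).fit()
print(model.summary())
```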