
    Interaction patterns of brain activity across space, time and frequency. Part I: methods

    We consider exploratory methods for the discovery of cortical functional connectivity. Typically, data for the i-th subject (i=1...NS) are represented as an NVxNT matrix Xi, corresponding to brain activity sampled at NT moments in time from NV cortical voxels. A widely used method of analysis first concatenates all subjects along the temporal dimension and then performs an independent component analysis (ICA) to estimate the common cortical patterns of functional connectivity. Many other interesting variations of this technique exist, as reviewed in [Calhoun et al. 2009 Neuroimage 45: S163-172]. We present methods for the more general problem of discovering functional connectivity occurring at all possible time lags. For this purpose, brain activity is viewed as a function of space and time, which allows the use of the relatively new techniques of functional data analysis [Ramsay & Silverman 2005: Functional data analysis. New York: Springer]. In essence, our method first vectorizes the data from each subject, which constitutes the natural discrete representation of a function of several variables, and then concatenates all subjects. The singular value decomposition (SVD), as well as the ICA, of this new matrix of dimension [rows=(NT*NV); columns=NS] will reveal spatio-temporal patterns of connectivity. As a further example, in the case of EEG neuroimaging, Xi of size NVxNW may represent the spectral density of electric neuronal activity at NW discrete frequencies from NV cortical voxels, from the i-th EEG epoch. In this case our functional data analysis approach would reveal coupling of brain regions at possibly different frequencies. Comment: Technical report 2011-March-15, The KEY Institute for Brain-Mind Research Zurich, KMU Osak
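    The vectorize-and-concatenate construction described in the abstract can be sketched in a few lines of numpy; the dimensions and random data below are illustrative stand-ins for real recordings, not the authors' data.

```python
import numpy as np

# Sketch of the vectorize-and-concatenate construction.
# Dimensions and random data are illustrative only.
NV, NT, NS = 20, 50, 8   # voxels, time points, subjects
rng = np.random.default_rng(0)

# One NVxNT data matrix Xi per subject.
subjects = [rng.standard_normal((NV, NT)) for _ in range(NS)]

# Vectorize each Xi and stack subjects as columns:
# rows = NT*NV, columns = NS, matching the matrix in the abstract.
X = np.column_stack([Xi.reshape(-1) for Xi in subjects])

# The SVD of this matrix yields components shared across subjects.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Each left singular vector reshapes back into an NVxNT
# spatio-temporal connectivity pattern.
pattern = U[:, 0].reshape(NV, NT)
```

    The same reshaping applies unchanged when Xi holds spectral densities (NVxNW), in which case the recovered patterns couple regions across frequencies rather than time lags.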

    Towards sophisticated learning from EHRs : increasing prediction specificity and accuracy using clinically meaningful risk criteria

    Computer-based analysis of Electronic Health Records (EHRs) has the potential to provide major novel insights, both for specific individuals in the context of personalized medicine and at the level of population-wide health care and policy. The present paper introduces a novel algorithm that uses machine learning for the discovery of longitudinal patterns in the diagnoses of diseases. Two key technical novelties are introduced: a novel learning paradigm which enables greater learning specificity, and a risk-driven identification of confounding diagnoses. We present a series of experiments which demonstrate the effectiveness of the proposed techniques and which reveal novel insights regarding the most promising future research directions.

    Regime Switching and Technical Trading with Dynamic Bayesian Networks in High-Frequency Stock Markets

    Technical analysis has long been dismissed in academic circles because of the Efficient Market Hypothesis, which had significant empirical support early on. More recently, however, evidence has accumulated that markets are not so efficient, and a new theory of price discovery, the Heterogeneous Market Hypothesis, has been proposed. As such, there is renewed interest in technical analysis, which identifies trends in price and volume based on aggregate, repeatable human behavioural patterns. In this thesis we propose a new approach for modeling and working with technical analysis in high-frequency markets: dynamic Bayesian networks (DBNs). DBNs are a statistical modeling and learning framework that has been applied successfully in other domains such as speech recognition, bio-sequencing, and visual interpretation. They provide a coherent probabilistic framework (in a Bayesian sense) that can be used both for learning technical rules and for inferring the hidden state of the system. We design a DBN to learn price and volume patterns in the TSE60 stock market and find that our model is able to identify runs and reversals out-of-sample in a statistically significant way.
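    Inferring the hidden regime from observed returns can be illustrated with the simplest special case of a DBN, a two-state hidden Markov model filtered with the forward algorithm. All parameters and the return sequence below are made up for illustration, not learned from TSE60 data.

```python
import numpy as np

# Two-regime HMM sketch: the simplest DBN for regime inference.
# Parameters are illustrative, not the thesis's learned values.
A = np.array([[0.95, 0.05],     # sticky transitions between regimes
              [0.10, 0.90]])
pi = np.array([0.5, 0.5])       # initial regime distribution
means = np.array([0.2, -0.2])   # mean return in "up" / "down" regime
stds = np.array([0.3, 0.3])     # return volatility per regime

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def forward_filter(returns):
    """Filtered P(state_t | returns_1..t) via the forward algorithm."""
    alpha = pi * gaussian_pdf(returns[0], means, stds)
    alpha /= alpha.sum()
    filtered = [alpha]
    for r in returns[1:]:
        alpha = (alpha @ A) * gaussian_pdf(r, means, stds)
        alpha /= alpha.sum()
        filtered.append(alpha)
    return np.array(filtered)

# Illustrative sequence: an upward run followed by a reversal.
returns = np.array([0.3, 0.25, 0.2, -0.3, -0.25, -0.3])
probs = forward_filter(returns)
# The filtered probability of the "down" regime rises once returns turn negative.
```

    The thesis's DBNs add volume and richer pattern variables to this skeleton, but the filtering step that identifies runs and reversals is structurally the same.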

    Perceptual-cognitive Training Improves Cross-Cultural Communication

    Authoring adaptive training can present challenges because instructors, unit leaders, and other non-technical users need to understand and control adaptation in order to accept and make use of a training system such as GIFT. Adaptation should therefore be presented in a manner that parallels the way these end users think about instruction (Wray, Folsom-Kovarik, Woods, & Jones, 2015). This work enabled future improvements in authoring for adaptation by adding several constructs inside GIFT. First, patterns added a new construct for defining learner behaviors and analytics that can drive adaptation. Second, misconceptions added information to GIFT concepts in the Learner Module about reasons that individuals might be performing Below Expectation. Third, mid-lesson reports tested a specific type of adaptive intervention that prompts learner reflection during training, with reduced authoring via reusable prompts. A randomized controlled trial was conducted to evaluate the training effectiveness of GIFT when driving adaptive feedback in a newly integrated tool for perceptual and cognitive skills relevant to cross-cultural communication. The combination of GIFT plus the skill training was evaluated with a population of 74 West Point Cadets. A preliminary analysis supported the value of the patterns for identifying different classes of learner experience and, in future, for letting non-technical personnel define the high-level behaviors and groups of observations that would help GIFT respond to them. The analysis also suggested new domain-general misconceptions that might be able to inform adaptation. The evaluation showed an improvement between pre-test and post-test scores across all users.
The discovery of new patterns and misconceptions highlights opportunities for instructors or unit leaders to gather evidence about how training is progressing in GIFT and, with future incorporation into the GIFT authoring suite, to quickly add new adaptive interventions that make training more effective.

    Towards a re-engineering method for web services architectures

    Recent developments in Web technologies – in particular through the Web services framework – have greatly enhanced the flexible and interoperable implementation of service-oriented software architectures. Many older Web-based and other distributed software systems will be re-engineered to a Web services-oriented platform. Using an advanced e-learning system as our case study, we investigate central aspects of a re-engineering approach for the Web services platform. Since our aim is to provide components of the legacy system also as services in the new platform, re-engineering to suit the new development paradigm is as important as re-engineering to suit the new architectural requirements.

    Structuring visual exploratory analysis of skill demand

    The analysis of increasingly large and diverse data for meaningful interpretation and question answering is handicapped by human cognitive limitations. Consequently, semi-automatic abstraction of complex data within structured information spaces becomes increasingly important if its knowledge content is to support intuitive, exploratory discovery. Exploration of skill demand is an area where regularly updated, multi-dimensional data may be exploited to assess capability within the workforce to manage the demands of the modern, technology- and data-driven economy. The knowledge derived may be employed by skilled practitioners in defining career pathways, to identify where, when and how to update their skillsets in line with advancing technology and changing work demands. The same knowledge may also be used to identify the combination of skills essential in recruiting for new roles. To address the challenges inherent in exploring the complex, heterogeneous, dynamic data that feed into such applications, we investigate the use of an ontology to guide structuring of the information space, allowing individuals and institutions to interactively explore and interpret the dynamic skill demand landscape for their specific needs. As a test case we consider the relatively new and highly dynamic field of Data Science, where insightful, exploratory data analysis and knowledge discovery are critical. We employ context-driven and task-centred scenarios to explore our research questions and to guide iterative design, development and formative evaluation of our ontology-driven, visual exploratory discovery and analysis approach, measuring where it adds value to users' analytical activity. Our findings reinforce the potential of our approach and point us to future paths to build on.

    Data mining as a tool for environmental scientists

    Over recent years a huge library of data mining algorithms has been developed to tackle a variety of problems in fields such as medical imaging and network traffic analysis. Many of these techniques are far more flexible than classical modelling approaches and could be usefully applied to data-rich environmental problems. Certain techniques, such as Artificial Neural Networks, Clustering, Case-Based Reasoning and, more recently, Bayesian Decision Networks, have found application in environmental modelling, while other methods, for example classification and association rule extraction, have not yet been taken up on any wide scale. We propose that these and other data mining techniques could be usefully applied to difficult problems in the field. This paper introduces several data mining concepts and briefly discusses their application to environmental modelling, where data may be sparse, incomplete, or heterogeneous.
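    Clustering, one of the techniques the paper names, can be sketched on synthetic environmental readings; the site types, variables, and values below are invented for illustration and are not from the paper.

```python
import numpy as np

# Minimal k-means sketch on made-up environmental readings:
# columns are [temperature (deg C), rainfall (mm)]; values are illustrative.
rng = np.random.default_rng(1)
wet_sites = rng.normal([12.0, 200.0], [2.0, 20.0], size=(30, 2))
dry_sites = rng.normal([25.0, 40.0], [2.0, 10.0], size=(30, 2))
data = np.vstack([wet_sites, dry_sites])

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: assign each point to its nearest centroid, then update."""
    r = np.random.default_rng(seed)
    centroids = X[r.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centroids

labels, centroids = kmeans(data, k=2)
# With well-separated site types, the two clusters recover the wet, cool
# sites and the dry, warm sites.
```

    In practice environmental variables would be standardised first, since raw rainfall values dominate the Euclidean distance here; the sketch keeps the data deliberately well separated so the raw scale does not matter.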