Input variable selection in time-critical knowledge integration applications: A review, analysis, and recommendation paper
The purpose of this research is twofold: first, to undertake a thorough appraisal of existing Input Variable Selection (IVS) methods within the context of time-critical and computation-resource-limited dimensionality reduction problems; second, to demonstrate improvements to, and the application of, a recently proposed time-critical sensitivity analysis method called EventTracker to an environmental science industrial use case, i.e., sub-surface drilling.
Producing time-critical, accurate knowledge about the state of a system (effect) under computational and data acquisition (cause) constraints is a major challenge, especially if the knowledge required is critical to system operation where the safety of operators or the integrity of costly equipment is at stake. Understanding and interpreting a chain of interrelated events, predicted or unpredicted, that may or may not result in a specific state of the system is the core challenge of this research. The main objective is therefore to identify which set of input data signals has a significant impact on the set of system state information (i.e., the output). Through cause-effect analysis, the proposed technique supports the filtering of unsolicited data that can otherwise clog up the communication and computational capabilities of a standard supervisory control and data acquisition (SCADA) system.
The paper analyzes the performance of input variable selection techniques from a series of perspectives. It then expands the categorization and assessment of sensitivity analysis methods in a structured framework that takes into account the relationship between inputs and outputs, the nature of their time series, and the computational effort required. The outcome of this analysis is that established methods have limited suitability for time-critical variable selection applications. By way of a geological drilling monitoring scenario, the suitability of the proposed EventTracker Sensitivity Analysis method for use in high-volume and time-critical input variable selection problems is demonstrated.
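As a concrete (and deliberately simplified) illustration of event-based input variable selection, the Python sketch below scores each input channel by how often its changes coincide with changes in the output signal and keeps only the channels that clear a threshold. This is not the authors' EventTracker implementation; the event definition, the score, the threshold, and all names are assumptions made for illustration.

```python
import numpy as np

def event_sensitivity(inputs, output, threshold=0.5):
    """Score each input by how often its changes coincide with output changes.

    inputs : array of shape (n_samples, n_inputs), raw input signals
    output : array of shape (n_samples,), system state signal
    Returns the indices of inputs whose coincidence score exceeds `threshold`.
    Illustrative only; the event definition and score differ from EventTracker.
    """
    def events(x):
        # An "event" is a sample-to-sample change larger than the channel's typical change.
        dx = np.abs(np.diff(x, axis=0))
        return dx > dx.std(axis=0)

    in_events = events(inputs)                   # shape (n_samples - 1, n_inputs)
    out_events = events(output[:, None])[:, 0]   # shape (n_samples - 1,)

    # Score: fraction of output events that coincide with an event on each input.
    n_out = max(out_events.sum(), 1)
    scores = (in_events & out_events[:, None]).sum(axis=0) / n_out
    return np.flatnonzero(scores > threshold), scores

# Toy example: the output is driven mainly by input channel 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))
y = X[:, 2] + 0.1 * rng.normal(size=1000)
selected, scores = event_sensitivity(X, y)
print(selected, np.round(scores, 2))
```

In a SCADA-style deployment, the channels that fall below the threshold could be filtered out before transmission, which is the data-reduction effect the paper is after.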
Embedded model discrepancy: A case study of Zika modeling
Mathematical models of epidemiological systems enable investigation of and
predictions about potential disease outbreaks. However, commonly used models
are often highly simplified representations of incredibly complex systems.
Because of these simplifications, the model output, such as the number of new
cases of a disease over time or the timing of an epidemic, may be inconsistent with
available data. In this case, we must improve the model, especially if we plan
to make decisions based on it that could affect human health and safety, but
direct improvements are often beyond our reach. In this work, we explore this
problem through a case study of the Zika outbreak in Brazil in 2016. We propose
an embedded discrepancy operator: a modification to the model equations that
requires modest information about the system and is calibrated by all relevant
data. We show that the new enriched model demonstrates greatly increased
consistency with real data. Moreover, the method is general enough to easily
apply to many other mathematical models in epidemiology.
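As a rough sketch of the general idea, and not the authors' operator or their Zika model, the example below embeds a parameterised correction term inside an SIR-type infection equation and calibrates it, together with the model parameters, against (here synthetic) case data. The model form, the discrepancy form d0 + d1*I, and all parameter values are assumptions for illustration only.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def sir_with_discrepancy(t, y, beta, gamma, d0, d1):
    """SIR dynamics with an embedded discrepancy term in the infection flow."""
    S, I, R = y
    force = beta * S * I + d0 + d1 * I     # nominal term plus embedded correction
    force = max(force, 0.0)                # keep the flow non-negative
    return [-force, force - gamma * I, gamma * I]

def simulate(params, t_obs, y0):
    beta, gamma, d0, d1 = params
    sol = solve_ivp(sir_with_discrepancy, (t_obs[0], t_obs[-1]), y0,
                    t_eval=t_obs, args=(beta, gamma, d0, d1))
    return sol.y[1]                        # infected fraction over time

# Synthetic "observed" infection fractions standing in for real case counts.
t_obs = np.linspace(0.0, 30.0, 31)
y0 = [0.99, 0.01, 0.0]
observed = simulate([0.45, 0.12, 0.002, 0.01], t_obs, y0)

# Calibrate the model parameters and the discrepancy parameters jointly.
fit = least_squares(lambda p: simulate(p, t_obs, y0) - observed,
                    x0=[0.3, 0.1, 0.0, 0.0],
                    bounds=([0, 0, -0.1, -0.5], [2, 1, 0.1, 0.5]))
print("calibrated (beta, gamma, d0, d1):", np.round(fit.x, 3))
```

The point of the construction is that the correction lives inside the equations rather than being added to the output afterwards, so it can change the dynamics themselves; the paper's operator for the Zika system is, of course, more carefully designed than this toy.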
Validating Predictions of Unobserved Quantities
The ultimate purpose of most computational models is to make predictions,
commonly in support of some decision-making process (e.g., for design or
operation of some system). The quantities that need to be predicted (the
quantities of interest or QoIs) are generally not experimentally observable
before the prediction, since otherwise no prediction would be needed. Assessing
the validity of such extrapolative predictions, which is critical to informed
decision-making, is challenging. In classical approaches to validation, model
outputs for observed quantities are compared to observations to determine if
they are consistent. By itself, this consistency only ensures that the model
can predict the observed quantities under the conditions of the observations.
This limitation dramatically reduces the utility of the validation effort for
decision making because it implies nothing about predictions of unobserved QoIs
or for scenarios outside of the range of observations. However, there is no
agreement in the scientific community today regarding best practices for
validation of extrapolative predictions made using computational models. The
purpose of this paper is to propose and explore a validation and predictive
assessment process that supports extrapolative predictions for models with
known sources of error. The process includes stochastic modeling, calibration,
validation, and predictive assessment phases where representations of known
sources of uncertainty and error are built, informed, and tested. The proposed
methodology is applied to an illustrative extrapolation problem involving a
misspecified nonlinear oscillator.
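A minimal numerical sketch of the calibrate/validate/assess pattern described above, not the paper's framework or its specific oscillator problem: data are generated by a nonlinear restoring force, a deliberately misspecified linear model is calibrated together with an explicit model-error spread, held-out observations are checked against the resulting prediction band, and only then is an extrapolated quantity of interest reported with its uncertainty. The force law, the error representation, and the 2-sigma band are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def true_force(x):
    # "Reality": a nonlinear spring that the model below deliberately ignores.
    return 1.0 * x + 0.4 * x**3

# Calibration data in a limited regime of displacements.
x_cal = np.linspace(0.0, 1.0, 20)
f_cal = true_force(x_cal) + 0.02 * rng.normal(size=x_cal.size)

# Calibration: least-squares stiffness for a linear model f = k*x, with the
# residual spread used as a crude representation of model-plus-data error.
k_hat = np.sum(x_cal * f_cal) / np.sum(x_cal ** 2)
sigma = np.std(f_cal - k_hat * x_cal)

# Validation: do held-out observations in the same regime fall inside the
# 2-sigma prediction band of the enriched (model + error) description?
x_val = np.linspace(0.05, 0.95, 10)
f_val = true_force(x_val) + 0.02 * rng.normal(size=x_val.size)
coverage = np.mean(np.abs(f_val - k_hat * x_val) < 2 * sigma)
print("validation coverage:", coverage)

# Predictive assessment: extrapolate to a displacement outside the observed
# range and report the prediction with its uncertainty. Whether this can be
# trusted depends on whether the error representation captures the neglected
# nonlinearity, which is what a predictive assessment phase must examine.
x_qoi = 2.0
print("QoI prediction:", k_hat * x_qoi, "+/-", 2 * sigma,
      "| true value:", true_force(x_qoi))
```

The sketch makes the central point visible: passing validation in the observed regime (high coverage above) says little by itself about the extrapolated QoI, whose error here is dominated by the unmodelled cubic term.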
Contracts and Behavioral Patterns for SoS: The EU IP DANSE approach
This paper presents some of the results of the first year of DANSE, one of
the first EU IP projects dedicated to SoS. Concretely, we offer a tool chain
that allows one to specify SoS and SoS requirements at a high level and analyse them
using powerful toolsets from the formal verification area. At the high
level, we use UPDM, the system model provided by the British Army, as well as a
new type of contract based on behavioral patterns. At the low level, we rely on a
powerful simulation toolset combined with recent advances from the area of
statistical model checking. The approach has been applied to a case study
developed at EADS Innovation Works.
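The statistical model checking step mentioned above can be illustrated with a small, self-contained sketch (not the DANSE tool chain or its toolsets): the probability that a stochastic simulation satisfies a bounded-time requirement is estimated from repeated runs, with the number of runs chosen from a Chernoff-Hoeffding bound for a target accuracy epsilon and confidence 1 - delta. The simulated "load" dynamics and the requirement are invented for the example.

```python
import math
import random

def satisfies(seed, horizon=100):
    """One simulation run: does a random-walk 'load' stay below 80 for `horizon` steps?"""
    rng = random.Random(seed)
    load = 50.0
    for _ in range(horizon):
        load += rng.uniform(-3.0, 3.5)
        if load >= 80.0:
            return False        # bounded-time requirement violated in this trace
    return True

def estimate_probability(epsilon=0.02, delta=0.01):
    # Chernoff-Hoeffding: n runs guarantee P(|estimate - truth| > epsilon) < delta.
    n = math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))
    successes = sum(satisfies(seed) for seed in range(n))
    return successes / n, n

p_hat, runs = estimate_probability()
print(f"P(requirement holds) ~ {p_hat:.3f}, estimated from {runs} runs")
```

The appeal of the approach for SoS is that only the simulator is needed, not a full state-space model, at the cost of a statistical rather than exhaustive guarantee.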
Efficient transfer entropy analysis of non-stationary neural time series
Information theory allows us to investigate information processing in neural
systems in terms of information transfer, storage and modification. Especially
the measure of information transfer, transfer entropy, has seen a dramatic
surge of interest in neuroscience. Estimating transfer entropy from two
processes requires the observation of multiple realizations of these processes
to estimate associated probability density functions. To obtain these
observations, available estimators assume stationarity of processes to allow
pooling of observations over time. This assumption, however, is a major obstacle
to the application of these estimators in neuroscience as observed processes
are often non-stationary. As a solution, Gomez-Herrero and colleagues
theoretically showed that the stationarity assumption may be avoided by
estimating transfer entropy from an ensemble of realizations. Such an ensemble
is often readily available in neuroscience experiments in the form of
experimental trials. Thus, in this work we combine the ensemble method with a
recently proposed transfer entropy estimator to make transfer entropy
estimation applicable to non-stationary time series. We present an efficient
implementation of the approach that deals with the increased computational
demand of the ensemble method's practical application. In particular, we use a
massively parallel implementation for a graphics processing unit to handle the
computationally most heavy aspects of the ensemble method. We test the
performance and robustness of our implementation on data from simulated
stochastic processes and demonstrate the method's applicability to
magnetoencephalographic data. While we mainly evaluate the proposed method for
neuroscientific data, we expect it to be applicable in a variety of fields that
are concerned with the analysis of information transfer in complex biological,
social, and artificial systems.
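To make the ensemble idea concrete, the following is a minimal plug-in (binned) transfer entropy estimator in Python, not the nearest-neighbour, GPU-accelerated estimator used in the work: at a chosen time point, the required probabilities are estimated by pooling over experimental trials rather than over time, so no stationarity in time is assumed. The bin count, the single-sample histories, and the toy coupling are illustrative assumptions.

```python
import numpy as np

def transfer_entropy_ensemble(x, y, t, bins=3):
    """Transfer entropy from x to y at time t, estimated across trials.

    x, y : arrays of shape (n_trials, n_samples); t indexes the "past" sample.
    Uses a plug-in estimator with quantile bins and one-sample histories.
    """
    def discretize(v):
        edges = np.quantile(v, np.linspace(0.0, 1.0, bins + 1)[1:-1])
        return np.digitize(v, edges)

    xt = discretize(x[:, t])        # source past, pooled over trials
    yt = discretize(y[:, t])        # target past
    yf = discretize(y[:, t + 1])    # target future

    te = 0.0
    for a in range(bins):
        for b in range(bins):
            for c in range(bins):
                p_abc = np.mean((yf == a) & (yt == b) & (xt == c))
                if p_abc == 0.0:
                    continue
                p_bc = np.mean((yt == b) & (xt == c))
                p_ab = np.mean((yf == a) & (yt == b))
                p_b = np.mean(yt == b)
                te += p_abc * np.log2(p_abc * p_b / (p_bc * p_ab))
    return te

# Toy ensemble of 500 trials: y is partly driven by the previous value of x.
rng = np.random.default_rng(0)
x = rng.normal(size=(500, 50))
y = np.zeros_like(x)
y[:, 1:] = 0.8 * x[:, :-1] + 0.2 * rng.normal(size=(500, 49))
print("TE(x->y) at t=20:", round(transfer_entropy_ensemble(x, y, 20), 3))
print("TE(y->x) at t=20:", round(transfer_entropy_ensemble(y, x, 20), 3))
```

Because every probability is taken over trials at a fixed time point, the estimate remains valid for non-stationary processes; the cost is that many trials are needed, which is exactly the computational burden the paper's GPU implementation addresses.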
Assessment for learning and for self-regulation
Drawing on a research study of formative assessment practices in Irish schools, this
paper traces the design, development and pilot of the Assessment for Learning Audit
instrument (AfLAi) - a research tool for measuring teachers' understanding and
deployment of formative teaching, learning and assessment practices. Underpinning the
paper is an extensive body of international research connecting assessment for learning
pedagogy with student self-regulation, mental health and well-being. Reflecting on the
potential of the AfLAi as a research tool, an activity systems framework is advanced as a
mechanism to engage researchers and teachers in meaningful site-based continuous
professional development that supports teachers' interrogation of aggregated school data
derived from their responses to the AfLAi. It is argued that by de-privatising classroom
practice in this way and challenging teachers to examine self-reports of their
understanding and use of assessment for learning pedagogy, the extent to which students
are afforded opportunities to develop as self-regulating learners is laid bare. In turn, the
teaching, learning and assessment conditions that serve to create and sustain self-regulation
by students emerge. The paper is premised on a commitment to a
biopsychosocial approach to mental health and to an inter-disciplinary, multi-lens,
research agenda that will yield comprehensive, dynamic insights and understandings to
inform future practice.
Developing a distributed electronic health-record store for India
The DIGHT project is addressing the problem of building a scalable and highly available information store for the Electronic Health Records (EHRs) of the more than one billion citizens of India.