An empirical learning-based validation procedure for simulation workflow
Simulation workflow is a top-level model for the design and control of
simulation process. It connects multiple simulation components with time and
interaction restrictions to form a complete simulation system. Before the
construction and evaluation of the component models, the validation of
upper-layer simulation workflow is of utmost importance in a simulation
system. However, methods specifically for validating simulation workflows are
very limited. Many of the existing validation techniques are domain-dependent,
with cumbersome questionnaire design and expert scoring. Therefore, this paper
presents an empirical learning-based validation procedure that implements a
semi-automated evaluation of simulation workflows. First, representative
features of general simulation workflow and their relations with validation
indices are proposed. The calculation process of workflow credibility based on
Analytic Hierarchy Process (AHP) is then introduced. In order to make full use
of the historical data and implement more efficient validation, four learning
algorithms, including back propagation neural network (BPNN), extreme learning
machine (ELM), evolving neo-fuzzy neuron (eNFN), and fast incremental Gaussian
mixture network (FIGMN), are introduced to construct the empirical relation between
the workflow credibility and its features. A case study on a landing-process
simulation workflow is established to test the feasibility of the proposed
procedure. The experimental results also provide a useful overview of the
state-of-the-art learning algorithms for the credibility evaluation of
simulation models.
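The AHP-based credibility calculation mentioned above can be sketched in a few lines. This is an illustrative sketch of the standard eigenvector method, not the paper's implementation; the comparison matrix and the index scores below are hypothetical.

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    """Derive index weights from a pairwise-comparison matrix via the
    principal-eigenvector method of classical AHP."""
    vals, vecs = np.linalg.eig(pairwise)
    principal = vecs[:, np.argmax(vals.real)].real
    return principal / principal.sum()

def consistency_ratio(pairwise: np.ndarray) -> float:
    """Saaty consistency ratio; CR < 0.1 is conventionally acceptable."""
    n = pairwise.shape[0]
    lam_max = np.max(np.linalg.eigvals(pairwise).real)
    ci = (lam_max - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty random indices (partial table)
    return ci / ri

# Hypothetical 3x3 comparison of three validation indices
# (e.g. timing, interaction, coverage).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = ahp_weights(A)
# Workflow credibility as the weighted sum of (hypothetical) index scores.
credibility = float(w @ np.array([0.9, 0.8, 0.7]))
```

The learning algorithms in the abstract would then be trained to map workflow features directly to this credibility score, bypassing the expert-scored comparison matrix.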
SUBIC: A Supervised Bi-Clustering Approach for Precision Medicine
Traditional medicine typically applies one-size-fits-all treatment for the
entire patient population whereas precision medicine develops tailored
treatment schemes for different patient subgroups. The fact that some factors
may be more significant for a specific patient subgroup motivates clinicians
and medical researchers to develop new approaches to subgroup detection and
analysis, which is an effective strategy to personalize treatment. In this
study, we propose a novel patient subgroup detection method, called Supervised
Biclustering (SUBIC) using convex optimization and apply our approach to detect
patient subgroups and prioritize risk factors for hypertension (HTN) in a
vulnerable demographic subgroup (African-American). Our approach not only finds
patient subgroups with guidance of a clinically relevant target variable but
also identifies and prioritizes risk factors by pursuing sparsity of the input
variables and encouraging similarity among the input variables and between the
input and target variable.
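The sparsity pursuit used to prioritize risk factors can be illustrated with a generic proximal-gradient (ISTA) solver for an l1-penalized least-squares problem. This is a minimal sketch of sparsity-inducing regularization in general, not the SUBIC objective; the data below are synthetic.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista_lasso(X, y, lam=0.1, n_iter=500):
    """ISTA for min_b 0.5*||X b - y||^2 + lam*||b||_1: a gradient step on the
    quadratic loss followed by soft-thresholding."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

# Synthetic cohort: only features 0 and 3 truly drive the target variable.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 8))
true_beta = np.array([2.0, 0, 0, -1.5, 0, 0, 0, 0])
y = X @ true_beta + 0.01 * rng.standard_normal(100)
beta = ista_lasso(X, y, lam=5.0)
```

The nonzero entries of `beta` play the role of the prioritized risk factors; the l1 penalty drives irrelevant coefficients to (near) zero.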
Augmented Models of High-Frequency Transformers for SMPS
The modeling of high-frequency transformers via augmented equivalent circuits is addressed. The augmented models are composed of a low-frequency equivalent and a supplemental element modeled via real rational fitting. They offer both high accuracy and a physical meaning that helps the interpretation of simulation results. Parasitic effects between the windings, and between the windings and the carrying board, can also be included. The use of an augmented model for the simulation of a dc-dc converter is demonstrated.
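The real rational fitting of the supplemental element can be illustrated, under simplifying assumptions, by a single linear least-squares step with the real poles held fixed; full vector-fitting schemes additionally relocate the poles iteratively. The pole locations and target response below are hypothetical.

```python
import numpy as np

def fit_fixed_pole_rational(freqs_hz, response, poles):
    """Least-squares fit of H(s) ~ d + sum_k r_k / (s - p_k) with the real
    poles p_k held fixed (one linear step of a vector-fitting-style scheme)."""
    s = 2j * np.pi * np.asarray(freqs_hz)
    # Basis: constant term plus one partial fraction per pole.
    A = np.column_stack([np.ones_like(s)] + [1.0 / (s - p) for p in poles])
    # Stack real and imaginary parts so the fitted coefficients stay real.
    A_ri = np.vstack([A.real, A.imag])
    b_ri = np.concatenate([response.real, response.imag])
    coeffs, *_ = np.linalg.lstsq(A_ri, b_ri, rcond=None)
    d, residues = coeffs[0], coeffs[1:]
    model = A @ coeffs
    return d, residues, model

# Hypothetical target response, exactly representable in the chosen basis.
poles = np.array([-1e4, -1e6])
freqs = np.logspace(2, 7, 200)
s = 2j * np.pi * freqs
target = 0.5 + 3e3 / (s - poles[0]) + 2e5 / (s - poles[1])
d, residues, model = fit_fixed_pole_rational(freqs, target, poles)
```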
Qualitative System Identification from Imperfect Data
Experience in the physical sciences suggests that the only realistic means of
understanding complex systems is through the use of mathematical models.
Typically, this has come to mean the identification of quantitative models
expressed as differential equations. Quantitative modelling works best when the
structure of the model (i.e., the form of the equations) is known; and the
primary concern is one of estimating the values of the parameters in the model.
For complex biological systems, the model-structure is rarely known and the
modeler has to deal with both model-identification and parameter-estimation. In
this paper we are concerned with providing automated assistance to the first of
these problems. Specifically, we examine the identification by machine of the
structural relationships between experimentally observed variables. These
relationships will be expressed in the form of qualitative abstractions of a
quantitative model. Such qualitative models may not only provide clues to the
precise quantitative model, but also assist in understanding the essence of
that model. Our position in this paper is that background knowledge
incorporating system modelling principles can be used to constrain effectively
the set of good qualitative models. Utilising the model-identification
framework provided by Inductive Logic Programming (ILP) we present empirical
support for this position using a series of increasingly complex artificial
datasets. The results are obtained with qualitative and quantitative data
subject to varying amounts of noise and different degrees of sparsity. The
results also point to the presence of a set of qualitative states, which we
term kernel subsets, that may be necessary for a qualitative model-learner to
learn correct models. We demonstrate scalability of the method to biological
system modelling by identification of the glycolysis metabolic pathway from
data.
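The qualitative abstraction of quantitative trajectories that the paper builds on can be sketched as mapping samples to sign-based states, a QSIM-style (magnitude sign, direction of change) pair. The trajectory below is a hypothetical damped oscillation, not data from the paper.

```python
import numpy as np

def qualitative_states(t, x, eps=1e-9):
    """Abstract a sampled trajectory x(t) into qualitative states
    (sign of the value, sign of its derivative)."""
    dx = np.gradient(x, t)  # finite-difference derivative estimate
    def sign(v):
        return '+' if v > eps else '-' if v < -eps else '0'
    return [(sign(xi), sign(di)) for xi, di in zip(x, dx)]

# Hypothetical trajectory: x(t) = exp(-t) * cos(t), sampled coarsely.
t = np.linspace(0.0, 3.0, 7)
x = np.exp(-t) * np.cos(t)
states = qualitative_states(t, x)
```

A qualitative model-learner would then search for state-transition rules consistent with such sequences, rather than for the precise differential equations.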
Investigation of the Hammerstein hypothesis in the modeling of electrically stimulated muscle
To restore functional use of paralyzed muscles by automatically controlled stimulation, an accurate quantitative model of the stimulated muscles is desirable. The most commonly used model for isometric muscle has had a Hammerstein structure, in which a linear dynamic block is preceded by a static nonlinear function. To investigate the accuracy of the Hammerstein model, the responses to a pseudo-random binary sequence (PRBS) excitation of normal human plantarflexors, stimulated with surface electrodes, were used to identify both a Hammerstein model and four local models which describe the responses to small signals at different mean levels of activation. Comparison of the local models with the linearized Hammerstein model showed that the Hammerstein model concealed a fivefold variation in the speed of response. Also, the small-signal gain of the Hammerstein model was in error by factors of up to three. We conclude that, despite the past widespread use of the Hammerstein model, it is not an accurate representation of isometric muscle. On the other hand, local models, which are more accurate predictors, can be identified from the responses to short PRBS sequences. The utility of local models for controller design is discussed.
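The structural property at issue can be seen in a toy simulation: in a Hammerstein model, the operating point changes the small-signal gain through the static nonlinearity, while the speed of response is fixed by the shared linear block. The discrete-time model below is a hypothetical sketch, not the muscle model identified in the paper.

```python
import numpy as np

def hammerstein_response(u, alpha=0.9):
    """Discrete Hammerstein model: static saturating nonlinearity followed by
    a fixed first-order filter y[k] = alpha*y[k-1] + (1-alpha)*v[k]."""
    v = np.tanh(u)  # static nonlinearity (hypothetical recruitment curve)
    y = np.zeros_like(v)
    for k in range(1, len(v)):
        y[k] = alpha * y[k - 1] + (1 - alpha) * v[k]
    return y

def small_signal_gain(u0, du=0.01, n=300, alpha=0.9):
    """Steady-state gain for a small step du applied on top of bias u0."""
    u = np.full(n, u0)
    u[n // 2:] += du
    y = hammerstein_response(u, alpha)
    return (y[-1] - y[n // 2 - 1]) / du

g_low = small_signal_gain(0.0)   # near zero activation
g_high = small_signal_gain(2.0)  # high activation, saturated region
```

The gain varies strongly with the bias `u0`, yet the settling dynamics (the `alpha` pole) are identical at every operating point; local models identified at each mean activation level lift exactly this restriction.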
A Simulation Model for Logical and Operative Clash Detection
The introduction of the Building Information Modeling (BIM) approach has
facilitated the management process of documents produced by different kinds of
professionals involved in the design and/or renovation of a building, through
identification and subsequent management of geometrical interferences (Clash
Detection). The methodology of this research proposes a tool to support Clash
Detection that introduces the logical-operative dimension of interferences that
may arise when a construction site operates within a hospital structure. By
integrating a BIM model within a Game Engine environment, the tool aims to
preserve the continuity of daily hospital activities and to reduce the negative
impacts, time, and costs due to construction activities.
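The logical-operative extension of Clash Detection can be sketched as a check that two activities interfere both geometrically and in time. The activities, coordinates, and day windows below are hypothetical, not taken from the case study.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    bbox: tuple    # axis-aligned box: (xmin, ymin, zmin, xmax, ymax, zmax)
    window: tuple  # execution window in days: (start, end)

def boxes_overlap(a, b):
    """Standard geometric clash test on axis-aligned bounding boxes."""
    return all(a[i] < b[i + 3] and b[i] < a[i + 3] for i in range(3))

def windows_overlap(a, b):
    return a[0] < b[1] and b[0] < a[1]

def logical_operative_clash(a: Activity, b: Activity) -> bool:
    """A clash requires both geometric interference (as in standard BIM clash
    detection) and overlapping execution windows (the operative dimension)."""
    return boxes_overlap(a.bbox, b.bbox) and windows_overlap(a.window, b.window)

ward = Activity("ward corridor", (0, 0, 0, 10, 3, 3), (0, 365))
crane = Activity("crane operation", (8, 1, 0, 14, 5, 9), (30, 60))
night_work = Activity("late demolition", (8, 1, 0, 14, 5, 9), (400, 420))
```

Note that `ward` and `night_work` interfere geometrically but not temporally, so a purely geometric check would flag a clash the scheduling dimension resolves.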
An Objective-Based Perspective on Assessment of Model-Supported Policy Processes
Simulation models, being in use for a long time in natural sciences and engineering domains, are diffusing to a wider context including policy analysis studies. The differences between the nature of the domain of application, as well as the increased variety of usage partially induced by this difference naturally imply new challenges to be overcome. One of these challenges is related to the assessment of the simulation-based outcomes in terms of their reliability and relevance in the policy context being studied. The importance of this assessment is twofold. First of all, it is all about conducting a high quality policy study with effective results. However, the quality of the study does not necessarily imply acceptance of the results by the clients and/or colleagues. This problem of policy analysts increases the importance of such an assessment; an effective assessment may induce the acceptance of the conclusions drawn from the study by the clients and/or colleagues. The main objective of this paper is to introduce an objective-based assessment perspective for simulation model-supported policy studies. As a first step towards such a goal, an objective-based classification of models is introduced. Based on that, we will discuss the importance of different aspects of the assessment for each type. In doing so, we aim to provide a structured discussion that may serve as a sort of methodological guideline to be used by policy analysts, and also by clients.Simulation, Validation, Model Assessment, Policy Analysis, Model Typology
Single-vehicle data of highway traffic - a statistical analysis
In the present paper, single-vehicle data of highway traffic are analyzed in
great detail. Using the single-vehicle data directly, empirical time-headway
distributions and speed-distance relations can be established. Both quantities
yield relevant information about the microscopic states. Several fundamental
diagrams are also presented, which are based on time-averaged quantities and
compared with earlier empirical investigations. In the remaining part
time-series analyses of the averaged as well as the single-vehicle data are
carried out. The results are used to propose objective criteria for the
identification of the different traffic states, e.g. synchronized traffic.
Comment: 12 pages, 19 figures, RevTeX