A unified wavelet-based modelling framework for non-linear system identification: the WANARX model structure
A new unified modelling framework based on the superposition of additive submodels, functional components, and
wavelet decompositions is proposed for non-linear system identification. A non-linear model, which is often represented
using a multivariate non-linear function, is initially decomposed into a number of functional components via the well-known
analysis of variance (ANOVA) expansion, which can be viewed as a special form of the NARX (non-linear
autoregressive with exogenous inputs) model for representing dynamic input–output systems. By expanding each functional
component using wavelet decompositions including the regular lattice frame decomposition, wavelet series and
multiresolution wavelet decompositions, the multivariate non-linear model can then be converted into a linear-in-the-parameters
problem, which can be solved using least-squares type methods. An efficient model structure determination
approach based upon a forward orthogonal least squares (OLS) algorithm, which involves a stepwise orthogonalization
of the regressors and a forward selection of the relevant model terms based on the error reduction ratio (ERR), is
employed to solve the linear-in-the-parameters problem in the present study. The new modelling structure is referred to
as a wavelet-based ANOVA decomposition of the NARX model or simply WANARX model, and can be applied to
represent high-order and high-dimensional non-linear systems.
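The forward OLS/ERR selection step the abstract describes can be sketched as follows. This is a minimal illustration of the classical orthogonal-least-squares idea with error reduction ratios, not the authors' wavelet-based implementation; `forward_ols_err` and its arguments are hypothetical names.

```python
import numpy as np

def forward_ols_err(P, y, n_terms):
    """Greedy forward selection of regressor columns of P by Error
    Reduction Ratio (ERR): at each step, orthogonalize every remaining
    candidate against the already-selected terms (Gram-Schmidt) and
    pick the one explaining the largest fraction of the output energy."""
    yty = y @ y
    selected, basis = [], []          # basis holds orthogonalized selections
    for _ in range(n_terms):
        best_err, best_j, best_w = -1.0, -1, None
        for j in range(P.shape[1]):
            if j in selected:
                continue
            w = P[:, j].astype(float).copy()
            for q in basis:           # stepwise orthogonalization
                w -= (q @ P[:, j]) / (q @ q) * q
            denom = w @ w
            if denom < 1e-12:         # candidate is linearly dependent
                continue
            err = (w @ y) ** 2 / (denom * yty)   # error reduction ratio
            if err > best_err:
                best_err, best_j, best_w = err, j, w
        selected.append(best_j)
        basis.append(best_w)
    return selected
```

In a WANARX-style setting the columns of `P` would be the wavelet/lattice basis terms of each ANOVA functional component; here they are just generic regressors.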
A unified framework for solving a general class of conditional and robust set-membership estimation problems
In this paper we present a unified framework for solving a general class of
problems arising in the context of set-membership estimation/identification
theory. More precisely, the paper aims at providing an original approach for
the computation of optimal conditional and robust projection estimates in a
nonlinear estimation setting where the operator relating the data and the
parameter to be estimated is assumed to be a generic multivariate polynomial
function and the uncertainties affecting the data are assumed to belong to
semialgebraic sets. By noticing that the computation of both the conditional
and the robust projection optimal estimators requires the solution to min-max
optimization problems that share the same structure, we propose a unified
two-stage approach based on semidefinite-relaxation techniques for solving such
estimation problems. The key idea of the proposed procedure is to recognize
that the optimal functional of the inner optimization problems can be
approximated to any desired precision by a multivariate polynomial function by
suitably exploiting recently proposed results in the field of parametric
optimization. Two simulation examples are reported to show the effectiveness of
the proposed approach.
Comment: Accepted for publication in the IEEE Transactions on Automatic Control (2014).
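The min-max structure shared by the conditional and robust projection estimators can be illustrated with a toy brute-force search; the grid search below merely stands in for the paper's semidefinite-relaxation machinery, and the polynomial model and uncertainty set are hypothetical.

```python
import numpy as np

def robust_estimate(candidates, datum, uncertainty_samples, model):
    """Toy robust projection estimate: for each candidate parameter take
    the worst-case residual over a sampled uncertainty set (inner max),
    then pick the candidate minimizing it (outer min)."""
    worst = [max(abs(model(theta, d) - datum) for d in uncertainty_samples)
             for theta in candidates]
    return candidates[int(np.argmin(worst))]

theta_grid = np.linspace(0.0, 3.0, 301)    # candidate parameters
deltas = np.linspace(-0.5, 0.5, 21)        # sampled semialgebraic uncertainty
# model: a simple multivariate-polynomial-style operator theta**2 + delta
theta_star = robust_estimate(theta_grid, 4.0, deltas, lambda t, d: t**2 + d)
```

For this symmetric uncertainty set the worst-case residual is |theta² − 4| + 0.5, so the minimizer sits near theta = 2.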
A unified approach to mortality modelling using state-space framework: characterisation, identification, estimation and forecasting
This paper explores and develops alternative statistical representations and
estimation approaches for dynamic mortality models. The framework we adopt is
to reinterpret popular mortality models such as the Lee-Carter class of models
in a general state-space modelling methodology, which allows modelling,
estimation and forecasting of mortality under a unified framework. Furthermore,
we propose an alternative class of model identification constraints which is
more suited to statistical inference in filtering and parameter estimation
settings based on maximization of the marginalized likelihood or in Bayesian
inference. We then develop a novel class of Bayesian state-space models which
incorporate a priori beliefs about the mortality model characteristics as well
as more flexible and appropriate assumptions for the heteroscedasticity
present in observed mortality data. We show that multiple period and
cohort effects can be cast in a state-space structure. To study long-term
mortality dynamics, we introduce stochastic volatility to the period effect.
The estimation of the resulting stochastic volatility model of mortality is
performed using a recent class of Monte Carlo procedures specifically designed
for state and parameter estimation in Bayesian state-space models, known as the
class of particle Markov chain Monte Carlo methods. We illustrate the framework
we have developed using Danish male mortality data, and show that incorporating
heteroscedasticity and stochastic volatility markedly improves model fit
despite an increase of model complexity. Forecasting properties of the enhanced
models are examined with long term and short term calibration periods on the
reconstruction of life tables.
Comment: 46 pages.
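A minimal sketch of the Lee-Carter-style state-space formulation discussed above: a random-walk-with-drift period effect driving age-specific log mortality. All parameter values here are illustrative, not estimates from the Danish data.

```python
import numpy as np

rng = np.random.default_rng(0)

# State equation:       kappa_t = kappa_{t-1} + drift + w_t, w_t ~ N(0, sigma_w^2)
# Observation equation: log m_{x,t} = a_x + b_x * kappa_t + e_{x,t}
drift, sigma_w = -0.5, 0.3
a = np.array([-6.0, -4.5, -3.0])   # age-specific mortality levels (hypothetical)
b = np.array([0.08, 0.10, 0.12])   # age-specific sensitivities to the period effect

T = 50
kappa = np.cumsum(drift + sigma_w * rng.standard_normal(T))      # latent state path
log_m = (a[:, None] + b[:, None] * kappa[None, :]
         + 0.05 * rng.standard_normal((3, T)))                   # observed log rates
```

The paper's extensions replace the constant observation noise with heteroscedastic errors and put stochastic volatility on the innovations of `kappa`, with inference via particle MCMC rather than this plain simulation.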
Utilizing RxNorm to Support Practical Computing Applications: Capturing Medication History in Live Electronic Health Records
RxNorm was utilized as the basis for direct-capture of medication history
data in a live EHR system deployed in a large, multi-state outpatient
behavioral healthcare provider in the United States serving over 75,000
distinct patients each year across 130 clinical locations. This tool
incorporated auto-complete search functionality for medications and proper
dosage identification assistance. The overarching goal was to understand if and
how standardized terminologies like RxNorm can be used to support practical
computing applications in live EHR systems. We describe the stages of
implementation, approaches used to adapt RxNorm's data structure for the
intended EHR application, and the challenges faced. We evaluate the
implementation using a four-factor framework addressing flexibility, speed,
data integrity, and medication coverage. RxNorm proved to be functional for the
intended application, given appropriate adaptations to address high-speed
input/output (I/O) requirements of a live EHR and the flexibility required for
data entry in multiple potential clinical scenarios. Future research around
search optimization for medication entry, user profiling, and linking RxNorm to
drug classification schemes holds great potential for improving the user
experience and utility of medication data in EHRs.
Comment: Appendix (including SQL/DDL Code) available by author request.
Keywords: RxNorm; Electronic Health Record; Medication History;
Interoperability; Unified Medical Language System; Search Optimization
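The auto-complete search functionality described above can be sketched as a sorted prefix lookup. The drug names below are a tiny hypothetical slice of RxNorm display names; a real deployment would query RxNorm's RXNCONSO table with the I/O adaptations the paper discusses.

```python
from bisect import bisect_left

NAMES = sorted([
    "Lisinopril 10 MG Oral Tablet",
    "Lisinopril 20 MG Oral Tablet",
    "Lithium Carbonate 300 MG Oral Capsule",
    "Loratadine 10 MG Oral Tablet",
])
KEYS = [n.lower() for n in NAMES]   # case-insensitive search keys

def autocomplete(prefix, limit=10):
    """Return up to `limit` names starting with `prefix` (case-insensitive),
    using binary search to locate the first match in the sorted list."""
    prefix = prefix.lower()
    i = bisect_left(KEYS, prefix)
    out = []
    while i < len(NAMES) and KEYS[i].startswith(prefix) and len(out) < limit:
        out.append(NAMES[i])
        i += 1
    return out
```

A production version would also need the dosage-identification assistance and speed optimizations (e.g. indexed queries, caching) that the paper evaluates.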
Activity Identification and Local Linear Convergence of Forward--Backward-type methods
In this paper, we consider a class of Forward--Backward (FB) splitting
methods that includes several variants (e.g. inertial schemes, FISTA) for
minimizing the sum of two proper convex and lower semi-continuous functions,
one of which has a Lipschitz continuous gradient, and the other is partly
smooth relative to a smooth active manifold. We propose a
unified framework under which we show that this class of FB-type algorithms
(i) correctly identifies the active manifolds in a finite number of iterations
(finite activity identification), and (ii) then enters a local linear
convergence regime, which we characterize precisely in terms of the structure
of the underlying active manifolds. For simpler problems involving polyhedral
functions, we show finite termination. We also establish and explain why FISTA
(with convergent sequences) locally oscillates and can be slower than FB. These
results may have numerous applications including in signal/image processing,
sparse recovery and machine learning. Indeed, the obtained results explain the
typical behaviour that has been observed numerically for many problems in these
fields such as the Lasso, the group Lasso, the fused Lasso and the nuclear norm
regularization, to name only a few.
Comment: Full-length version of the previous short one.
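The plain Forward-Backward iteration for the Lasso, the first example named above, can be sketched as follows (ISTA, the non-inertial member of the class studied; the problem data are hypothetical). The finite activity identification result predicts that the iterates' support stabilizes after finitely many iterations.

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1 (the "backward" step for the Lasso)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward_lasso(A, y, lam, n_iter=2000):
    """Forward-Backward splitting for min_x 0.5*||Ax - y||^2 + lam*||x||_1,
    with fixed step 1/L where L is the Lipschitz constant of the gradient."""
    L = np.linalg.norm(A, 2) ** 2        # spectral norm squared
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                    # forward (gradient) step
        x = soft_threshold(x - grad / L, lam / L)   # backward (proximal) step
    return x
```

Once the support (the active manifold for the l1 norm) is identified, the iteration behaves like a linearly convergent method on the smooth restricted problem, which is the local regime the paper characterizes.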
A framework for protein structure classification and identification of novel protein structures
BACKGROUND: Protein structure classification plays a central role in understanding the function of a protein molecule with respect to all known proteins in a structure database. With the rapid increase in the number of new protein structures, the need for automated and accurate methods for protein classification is increasingly important.

RESULTS: In this paper we present a unified framework for protein structure classification and identification of novel protein structures. The framework consists of a set of components for comparing, classifying, and clustering protein structures. These components allow us to accurately classify proteins into known folds, to detect new protein folds, and to provide a way of clustering the new folds. In our evaluation with SCOP 1.69, our method correctly classifies 86.0%, 87.7%, and 90.5% of new domains at family, superfamily, and fold levels. Furthermore, for protein domains that belong to new domain families, our method is able to produce clusters that closely correspond to the new families in SCOP 1.69. As a result, our method can also be used to suggest new classification groups that contain novel folds.

CONCLUSION: We have developed a method called proCC for automatically classifying and clustering domains. The method is effective in classifying new domains and suggesting new domain families, and it is also very efficient. A web site offering access to proCC is freely available.
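The classify-or-flag-as-novel decision at the heart of the framework can be sketched as a nearest-centroid rule with a rejection cutoff. The feature vectors, fold names, and cutoff below are hypothetical; proCC works from structure-comparison scores rather than raw coordinates.

```python
import numpy as np

def classify_or_novel(x, centroids, cutoff):
    """Assign feature vector x to the nearest known fold centroid, unless
    every centroid is farther than `cutoff`, in which case flag it as a
    candidate novel fold (to be clustered with other rejected structures)."""
    dists = {fold: np.linalg.norm(x - c) for fold, c in centroids.items()}
    fold = min(dists, key=dists.get)
    return fold if dists[fold] <= cutoff else "novel"
```

In the full pipeline the "novel" structures would then be clustered among themselves to propose new families, mirroring the SCOP 1.69 evaluation described above.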