Residual Weighted Learning for Estimating Individualized Treatment Rules
Personalized medicine has received increasing attention among statisticians,
computer scientists, and clinical practitioners. A major component of
personalized medicine is the estimation of individualized treatment rules
(ITRs). Recently, Zhao et al. (2012) proposed outcome weighted learning (OWL)
to construct ITRs that directly optimize the clinical outcome. Although OWL opens the door to applying machine learning techniques to optimal treatment regimes, its finite-sample performance can suffer. In this article, we propose
a general framework, called Residual Weighted Learning (RWL), to improve finite
sample performance. Unlike OWL which weights misclassification errors by
clinical outcomes, RWL weights these errors by residuals of the outcome from a
regression fit on clinical covariates excluding treatment assignment. We
utilize the smoothed ramp loss function in RWL, and provide a difference of
convex (d.c.) algorithm to solve the corresponding non-convex optimization
problem. By estimating residuals with linear models or generalized linear
models, RWL can effectively deal with different types of outcomes, such as
continuous, binary and count outcomes. We also propose variable selection
methods for linear and nonlinear rules, respectively, to further improve the
performance. We show that the resulting estimator of the treatment rule is
consistent. We further obtain a rate of convergence for the difference between
the expected outcome using the estimated ITR and that of the optimal treatment
rule. The performance of the proposed RWL methods is illustrated in simulation
studies and in an analysis of cystic fibrosis clinical trial data. Comment: 48 pages, 3 figures.
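To make the weighting scheme concrete, here is a minimal Python sketch of the RWL core idea under simplifying assumptions: a randomized trial with a known propensity, residuals from an ordinary linear regression, and a convex logistic surrogate standing in for the paper's smoothed ramp loss and d.c. algorithm. The function name and defaults are illustrative, not from the paper's software.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

def rwl_sketch(X, A, R, propensity=0.5):
    """Illustrative RWL weighting step (names and defaults are ours).

    X : (n, p) clinical covariates
    A : (n,) treatment assignments coded in {-1, +1}
    R : (n,) observed clinical outcomes
    propensity : assumed known P(A = +1 | X) in a randomized trial
    """
    # Step 1 (the RWL idea): regress the outcome on covariates only,
    # excluding treatment assignment, and take residuals as weights.
    resid = R - LinearRegression().fit(X, R).predict(X)

    # Step 2: a negative residual flips the preferred treatment label,
    # so classify with labels sign(resid) * A and weights |resid| / pi.
    y = np.where(resid >= 0, A, -A)
    w = np.abs(resid) / np.where(A == 1, propensity, 1.0 - propensity)

    # The paper minimizes a smoothed ramp loss with a d.c. algorithm;
    # a weighted convex classifier stands in for it in this sketch.
    clf = LogisticRegression().fit(X, y, sample_weight=w)
    return clf  # sign(clf.decision_function(x)) is the estimated ITR
```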
TVL1 Planarity Regularization for 3D Shape Approximation
The modern emergence of automation in many industries has given impetus to extensive research into mobile robotics. Novel perception technologies now enable cars to drive autonomously, tractors to till a field automatically, and underwater robots to construct pipelines. An essential requirement to facilitate both perception and autonomous navigation is the analysis of the 3D environment using sensors like laser scanners or stereo cameras. 3D sensors generate a very large number of 3D data points when sampling object shapes within an environment, but crucially do not provide any intrinsic information about the environment within which the robots operate.
This work focuses on the fundamental task of 3D shape reconstruction and modelling from 3D point clouds. The novelty lies in the representation of surfaces by algebraic functions having limited support, which enables the extraction of smooth, consistent implicit shapes from noisy samples with a heterogeneous density. The minimization of total variation of second differential degree makes it possible to enforce planar surfaces, which often occur in man-made environments. Applying the new technique means that less accurate, low-cost 3D sensors can be employed without sacrificing the 3D shape reconstruction accuracy.
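As a concrete (and much simplified) illustration of the regularizer, the 1D analogue of second-order total variation penalizes the l1 norm of second differences, whose minimizers are piecewise linear, the 1D counterpart of the planar surfaces targeted here. The sketch below, with illustrative parameter choices of our own, solves the resulting problem with a standard ADMM splitting.

```python
import numpy as np

def second_order_tv_denoise(y, lam=1.0, rho=1.0, n_iter=200):
    """Toy 1D analogue of second-order TV regularization:
    minimize 0.5 * ||u - y||^2 + lam * ||D2 u||_1,
    where D2 is the second-difference operator. Solved with a
    standard ADMM splitting z = D2 u (parameters are illustrative).
    """
    n = len(y)
    # Second-difference matrix D2 of shape (n - 2, n).
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]

    u = y.copy()
    z = D2 @ u
    w = np.zeros(n - 2)                      # scaled dual variable
    M = np.eye(n) + rho * D2.T @ D2          # constant u-update system

    for _ in range(n_iter):
        # u-update: solve (I + rho * D2^T D2) u = y + rho * D2^T (z - w).
        u = np.linalg.solve(M, y + rho * D2.T @ (z - w))
        v = D2 @ u + w
        # z-update: soft-thresholding, the prox of (lam/rho) * ||.||_1.
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # Dual ascent on the splitting constraint.
        w = w + D2 @ u - z
    return u
```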
Sparse Volterra and Polynomial Regression Models: Recoverability and Estimation
Volterra and polynomial regression models play a major role in nonlinear
system identification and inference tasks. Exciting applications ranging from
neuroscience to genome-wide association analysis build on these models with the
additional requirement of parsimony. This requirement has high interpretative
value, but unfortunately cannot be met by least-squares based or kernel
regression methods. To this end, compressed sampling (CS) approaches, already
successful in linear regression settings, can offer a viable alternative. The
viability of CS for sparse Volterra and polynomial models is the core theme of
this work. A common sparse regression task is initially posed for the two
models. Building on (weighted) Lasso-based schemes, an adaptive RLS-type
algorithm is developed for sparse polynomial regressions. The identifiability
of polynomial models is critically challenged by dimensionality. However,
following the CS principle, when these models are sparse, they can be recovered from far fewer measurements. To quantify the number of measurements sufficient for a given level of sparsity, restricted isometry properties (RIP) are investigated in commonly encountered polynomial regression settings,
generalizing known results for their linear counterparts. The merits of the
novel (weighted) adaptive CS algorithms to sparse polynomial modeling are
verified through synthetic as well as real data tests for genotype-phenotype
analysis. Comment: 20 pages, to appear in IEEE Trans. on Signal Processing.
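For intuition, the following sketch reproduces the basic recoverability setup with off-the-shelf tools: a response generated by only two of the 65 candidate degree-2 monomials in 10 inputs is recovered from fewer measurements than features. A plain Lasso stands in for the paper's weighted and adaptive variants; all data and parameter values are synthetic.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Hypothetical sparse ground truth: the response depends on only two
# of the 65 monomials of degree <= 2 in 10 inputs.
n, p = 60, 10
X = rng.standard_normal((n, p))
y = 2.0 * X[:, 0] * X[:, 3] - 1.5 * X[:, 7] ** 2 \
    + 0.05 * rng.standard_normal(n)

# Expand into all monomials up to degree 2 and fit an l1-penalized
# regression; per the CS intuition, fewer measurements (n = 60) than
# candidate features (65) can still recover a sparse polynomial model.
Phi = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
model = Lasso(alpha=0.05).fit(Phi, y)
print("nonzero coefficients:", np.count_nonzero(model.coef_))
```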
TVL1 shape approximation from scattered 3D data
With the emergence of 3D sensors such as laser scanners and of 3D reconstruction from cameras, large 3D point clouds can now be sampled from physical objects within a scene. The raw 3D samples delivered by these sensors, however, contain only a limited degree of information about the environment the objects exist in, which means that further high-level geometrical modelling is essential. In addition, issues like sparse data measurements, noise, missing samples due to occlusion, and the inherently huge datasets involved in such representations make this task extremely challenging. This paper addresses these issues by presenting a new 3D shape modelling framework for samples acquired from 3D sensors. Motivated by the success of nonlinear kernel-based approximation techniques in the statistics domain, existing methods using radial basis functions are applied to 3D object shape approximation. The task is framed as an optimization problem and is extended using non-smooth L1 total variation regularization. Appropriate convex energy functionals are constructed and solved by applying the Alternating Direction Method of Multipliers approach, which is then extended using Gauss-Seidel iterations. This significantly lowers the computational complexity involved in generating 3D shapes from 3D samples, while both numerical and qualitative analyses confirm the superior shape modelling performance of this new framework compared with existing 3D shape reconstruction techniques.
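Here is a minimal sketch of the kernel-approximation backbone such a framework builds on: scattered samples are fit by a linear combination of radial basis functions via regularized least squares. Gaussian kernels and the ridge term are illustrative stand-ins of ours; the paper's L1 total-variation extension and its ADMM / Gauss-Seidel solver are deliberately not reproduced (see the ADMM sketch above for the TV part).

```python
import numpy as np

def gauss_kernel(x, centers, sigma):
    # Kernel matrix K[i, j] = exp(-(x_i - c_j)^2 / (2 sigma^2)).
    return np.exp(-((x[:, None] - centers[None, :]) ** 2)
                  / (2.0 * sigma ** 2))

def fit_rbf(x_samples, f_samples, centers, sigma=0.2, reg=1e-6):
    """Fit f(x) ~ sum_j a_j * k(x, c_j) by regularized least squares.

    Gaussian kernels and the ridge term `reg` are illustrative
    choices for this sketch, not the paper's basis functions.
    """
    K = gauss_kernel(x_samples, centers, sigma)
    # Ridge-regularized normal equations keep the system well posed.
    a = np.linalg.solve(K.T @ K + reg * np.eye(len(centers)),
                        K.T @ f_samples)
    # Return an evaluator for the fitted implicit function.
    return lambda xq: gauss_kernel(np.atleast_1d(xq), centers, sigma) @ a
```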
The ROMES method for statistical modeling of reduced-order-model error
This work presents a technique for statistically modeling errors introduced
by reduced-order models. The method employs Gaussian-process regression to
construct a mapping from a small number of computationally inexpensive "error indicators" to a distribution over the true error. The variance of this
distribution can be interpreted as the (epistemic) uncertainty introduced by
the reduced-order model. To model normed errors, the method employs existing
rigorous error bounds and residual norms as indicators; numerical experiments
show that the method leads to a near-optimal expected effectivity in contrast
to typical error bounds. To model errors in general outputs, the method uses
dual-weighted residuals, which are amenable to uncertainty control, as
indicators. Experiments illustrate that correcting the reduced-order-model
output with this surrogate can improve prediction accuracy by an order of
magnitude; this contrasts with existing "multifidelity correction" approaches,
which often fail for reduced-order models and suffer from the curse of
dimensionality. The proposed error surrogates also lead to a notion of
"probabilistic rigor", i.e., the surrogate bounds the error with a specified probability.
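To illustrate the construction, here is a minimal sketch of a ROMES-style error surrogate using scikit-learn: a Gaussian process maps a cheap scalar indicator (e.g., a residual norm) to the true reduced-order-model error, and the predictive standard deviation plays the role of the epistemic uncertainty. The training values and the kernel choice below are synthetic and illustrative, not taken from the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Illustrative offline data: for a few parameter samples we record a
# cheap error indicator (e.g., a residual norm) and the true ROM error.
indicators = np.array([0.1, 0.4, 0.8, 1.5, 2.3]).reshape(-1, 1)
true_errors = np.array([0.02, 0.09, 0.21, 0.45, 0.80])

# ROMES-style surrogate: GP regression from indicator to error. The
# predictive std is the epistemic uncertainty the method exposes.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                              normalize_y=True)
gp.fit(indicators, true_errors)

# Online: query the surrogate at a new indicator value; mean + 2*std
# gives a bound that holds with roughly 95% probability under the GP.
mean, std = gp.predict(np.array([[1.0]]), return_std=True)
print(f"predicted error {mean[0]:.3f} +/- {2 * std[0]:.3f}")
```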