Automatic, computer aided geometric design of free-knot, regression splines
A new algorithm for computer-aided geometric design of least squares (LS) splines with variable knots, named GeDS, is presented. It is based on interpreting functional spline regression as a parametric B-spline curve and on using the shape-preserving property of its control polygon. The GeDS algorithm comprises two major stages. For the first stage, an automatic adaptive knot-location algorithm is developed. By adding knots one at a time, it sequentially "breaks" a straight line segment into pieces in order to construct a linear LS B-spline fit that captures the "shape" of the data. A stopping rule is applied that avoids both over- and underfitting and selects the number of knots for the second stage of GeDS, in which smoother, higher-order (quadratic, cubic, etc.) fits are generated. The knots for the second stage are determined by a new knot-location method, called the averaging method. It approximately preserves the linear precision property of B-spline curves and allows smooth higher-order LS B-spline fits to be attached to a control polygon, so that the shape of the linear polygon of stage one is followed. The GeDS method simultaneously produces linear, quadratic, cubic (and possibly higher-order) spline fits with the same number of B-spline regression functions. The GeDS algorithm is very fast, since no deterministic or stochastic knot insertion/deletion and relocation search strategies are involved in either stage. Extensive numerical examples are provided, illustrating the performance of GeDS and the quality of the resulting LS spline fits. The GeDS procedure is compared with other existing variable-knot spline methods and smoothing techniques, such as the SARS, HAS, MDL, and AGS methods, and is shown to produce models with fewer parameters but with similar goodness-of-fit characteristics and visual quality.
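To make the stage-one idea concrete, below is a minimal sketch of an adaptive knot search in that spirit, assuming scipy's LSQUnivariateSpline for the least-squares B-spline fits; the residual-based knot placement, the RSS-ratio stopping rule, and all names are illustrative assumptions rather than the published GeDS rules.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

def adaptive_linear_fit(x, y, max_knots=20, tol=0.98):
    """x must be increasing. Insert interior knots one at a time where the
    absolute residual is largest; stop when the RSS ratio new/old exceeds
    `tol`, i.e. the added knot no longer helps (a crude over/underfit guard)."""
    knots = []
    spline = LSQUnivariateSpline(x, y, knots, k=1)   # stage-one linear LS fit
    rss = np.sum((y - spline(x)) ** 2)
    while len(knots) < max_knots:
        resid = np.abs(y - spline(x))
        candidate = float(x[np.argmax(resid)])       # break where fit is worst
        new_knots = sorted(knots + [candidate])
        try:
            trial = LSQUnivariateSpline(x, y, new_knots, k=1)
        except ValueError:    # Schoenberg-Whitney conditions violated: stop
            break
        new_rss = np.sum((y - trial(x)) ** 2)
        if new_rss > tol * rss:                      # negligible improvement
            break
        knots, spline, rss = new_knots, trial, new_rss
    return spline, knots
```

In the full method, the second-stage knots for the higher-order fits would then be derived from the stage-one knots by the averaging method; that step is omitted here.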
A feasible and automatic free tool for T1 and ECV mapping
Purpose: Cardiac magnetic resonance (CMR) is a useful non-invasive tool for characterizing tissues and detecting myocardial fibrosis and edema. Estimation of the extracellular volume fraction (ECV) using T1 sequences is emerging as an accurate biomarker in cardiac diseases associated with diffuse fibrosis. In this study, automatic software for T1 and ECV map generation, consisting of an executable file, was developed and validated using phantom and human data.
Methods: T1 mapping was performed in phantoms and 30 subjects (22 patients and 8 healthy subjects) on a 1.5 T MR scanner using the modified Look-Locker inversion-recovery (MOLLI) sequence prototype before and 15 min after contrast agent administration. T1 maps were generated using a fast nonlinear least squares algorithm. Myocardial ECV maps were generated using both pre- and post-contrast T1 image registration and automatic extraction of blood relaxation rates.
Results: Using our software, pre- and post-contrast T1 maps were obtained in phantoms and healthy subjects, yielding robust and reliable quantification compared with reference software. Coregistration of pre- and post-contrast images improved the quality of the ECV maps. The mean ECV value in healthy subjects was 24.5% ± 2.5%.
Conclusions: This study demonstrated that it is possible to obtain accurate T1 maps and informative ECV maps using our software. Pixel-wise ECV maps obtained with this automatic software made it possible to visualize and evaluate the extent and severity of ECV alterations.
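For illustration, here is a minimal sketch of the two computations the abstract relies on, assuming scipy and magnitude MOLLI data: a three-parameter inversion-recovery fit with the Look-Locker correction, and the standard formula ECV = (1 - Hct) · ΔR1_myo / ΔR1_blood. Function names, the initial guess, and the per-pixel handling are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def molli_signal(ti, a, b, t1_star):
    """Three-parameter model |A - B * exp(-TI / T1*)| for magnitude data."""
    return np.abs(a - b * np.exp(-ti / t1_star))

def fit_t1(ti, signal):
    """Fit one pixel's inversion times (ms) and signals, then apply the
    Look-Locker correction T1 = T1* (B/A - 1)."""
    (a, b, t1_star), _ = curve_fit(
        molli_signal, ti, signal,
        p0=(signal.max(), 2 * signal.max(), 1000.0))  # rough initial guess
    return t1_star * (b / a - 1.0)

def ecv(t1_myo_pre, t1_myo_post, t1_blood_pre, t1_blood_post, hct):
    """ECV = (1 - Hct) * dR1_myo / dR1_blood, with R1 = 1/T1 (T1 in ms).
    Myocardial arguments may be coregistered maps (arrays) for a pixel-wise ECV map."""
    d_r1_myo = 1.0 / t1_myo_post - 1.0 / t1_myo_pre
    d_r1_blood = 1.0 / t1_blood_post - 1.0 / t1_blood_pre
    return (1.0 - hct) * d_r1_myo / d_r1_blood
```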
Recommended from our members
Toward improved calibration of hydrologic models: Combining the strengths of manual and automatic methods
Automatic methods for model calibration seek to take advantage of the speed and power of digital computers while being objective and relatively easy to implement. However, they do not provide parameter estimates and hydrograph simulations that are considered acceptable by the hydrologists responsible for operational forecasting, and they have therefore not entered into widespread use. In contrast, the manual approach, which has been developed and refined over the years to produce excellent model calibrations, is complicated and highly labor-intensive, and the expertise acquired by one individual with a specific model is not easily transferred to another person (or model). In this paper, we propose a hybrid approach that combines the strengths of each. A multicriteria formulation is used to "model" the evaluation techniques and strategies used in manual calibration, and the resulting optimization problem is solved by means of a computerized algorithm. The new approach provides a stronger test of model performance than methods that use a single overall statistic to aggregate model errors over a large range of hydrologic behaviors. The power of the new approach is illustrated by means of a case study using the Sacramento Soil Moisture Accounting model.
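As a rough sketch of the multicriteria idea, the snippet below scores a simulated hydrograph with separate error measures on different flow regimes instead of one aggregate statistic; the median-based flow partition, the particular criteria, and the names are assumptions. The resulting vector would be handed to a multi-objective (Pareto-based) optimizer rather than collapsed into a single number.

```python
import numpy as np

def multicriteria_errors(observed, simulated):
    """Return a vector of criteria to be minimized jointly, mimicking how a
    manual calibrator judges different hydrologic behaviors separately."""
    high = observed > np.median(observed)   # crude high-/low-flow partition
    rmse_high = np.sqrt(np.mean((observed[high] - simulated[high]) ** 2))
    rmse_low = np.sqrt(np.mean((observed[~high] - simulated[~high]) ** 2))
    bias = np.abs(np.mean(simulated - observed))   # overall water balance
    return np.array([rmse_high, rmse_low, bias])
```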
Face analysis using curve edge maps
This paper proposes an automatic, real-time system for face analysis, usable in visual communication applications. In this approach, faces are represented with Curve Edge Maps, which are collections of polynomial segments with a convex region. The segments are extracted from edge pixels using an adaptive incremental linear-time fitting algorithm based on constructive polynomial fitting. The face analysis system covers face tracking, face recognition, and facial feature detection, using Curve Edge Maps driven by histograms of intensities and histograms of relative positions. When applied to different face databases and video sequences, the average face recognition rate is 95.51%, the average facial feature detection rate is 91.92%, and the accuracy in locating the facial features is 2.18% relative to the size of the face, which is comparable with or better than results in the literature. Moreover, our method has the advantages of simplicity, real-time performance, and extensibility to other aspects of face analysis, such as recognition of facial expressions and talking.
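A minimal sketch of incremental segment fitting over an ordered chain of edge pixels, in the spirit of constructive polynomial fitting: each segment is extended while a parametric polynomial fit stays within tolerance, then a new segment begins. Note that the naive refit at every extension is quadratic-time, unlike the paper's linear-time algorithm; the degree, tolerance, and names are assumptions.

```python
import numpy as np

def fit_segments(points, degree=2, tol=1.5):
    """points: (N, 2) array of ordered edge pixels. Returns a list of
    ((cx, cy), (start, end)) parametric polynomial segments partitioning
    the chain, where cx/cy are coefficients over the chain index."""
    segments, start, n = [], 0, len(points)
    while start < n - degree:
        end, best = start + degree + 1, None   # minimal exactly-fitting span
        while end <= n:
            t = np.arange(start, end, dtype=float)
            cx = np.polyfit(t, points[start:end, 0], degree)
            cy = np.polyfit(t, points[start:end, 1], degree)
            err = np.hypot(np.polyval(cx, t) - points[start:end, 0],
                           np.polyval(cy, t) - points[start:end, 1]).max()
            if err > tol:                      # extension broke the tolerance
                break
            best, end = (cx, cy), end + 1      # accept and try one more pixel
        if best is None:                       # even the minimal span failed
            start += 1
            continue
        segments.append((best, (start, end - 1)))
        start = end - 1                        # next segment starts after this one
    return segments
```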
Post-correlation radio frequency interference classification methods
We describe and compare several post-correlation radio frequency interference classification methods. As data sizes of observations grow with new and improved telescopes, the need for completely automated, robust methods for radio frequency interference mitigation is pressing. We investigated several classification methods and find that, for the data sets we used, the most accurate among them is the SumThreshold method. This is a new method formed from a combination of existing techniques, including a new way of thresholding. This iterative method estimates the astronomical signal by carrying out a surface fit in the time-frequency plane. With a theoretical accuracy of 95% recognition and an approximately 0.1% false-positive rate in simple simulated cases, the method is in practice as good as the human eye in finding RFI. In addition, it is fast, robust, does not need a data model before it can be executed, and works in almost all configurations with its default parameters. The method has been compared using simulated data with several other mitigation techniques, including one based upon the singular value decomposition of the time-frequency matrix, and has shown better results than the rest.
Comment: 14 pages, 12 figures (11 in colour). The software that was used in the article can be downloaded from http://www.astro.rug.nl/rfi-software
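From the description above, a minimal one-dimensional sketch of SumThreshold-style flagging: a run of M consecutive samples is flagged when its sum exceeds M times a per-length threshold that shrinks as M grows, so weak but broad interference is still caught. The threshold sequence χ_M = χ_1 / ρ^(log₂ M) with ρ = 1.5 and all other constants are assumptions; the full method also alternates time/frequency directions and iterates with the surface fit, which this sketch omits.

```python
import numpy as np

def sumthreshold_1d(data, flags, chi1, rho=1.5, max_log2_m=6):
    """data: 1-D samples (e.g. one frequency channel over time);
    flags: boolean array of pre-existing flags; chi1: base threshold."""
    flags = flags.copy()
    for p in range(max_log2_m + 1):
        m = 2 ** p
        chi_m = chi1 / rho ** p                # per-sample threshold for runs of m
        values = np.where(flags, chi_m, data)  # flagged samples count as chi_m
        cs = np.concatenate(([0.0], np.cumsum(values)))
        window_sums = cs[m:] - cs[:-m]         # sums of every length-m window
        for i in np.flatnonzero(window_sums > m * chi_m):
            flags[i:i + m] = True              # flag the whole exceeding run
    return flags
```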