A Simple Iterative Algorithm for Parsimonious Binary Kernel Fisher Discrimination
By applying recent results in optimization theory, variously known as optimization transfer or majorize/minimize algorithms, an algorithm for binary kernel Fisher discriminant analysis is introduced that uses a non-smooth penalty on the coefficients to provide a parsimonious solution. The problem is converted into a smooth optimization that can be solved iteratively with no greater overhead than iteratively re-weighted least squares. The result is simple and easily programmed, and is shown to perform, in terms of both accuracy and parsimony, as well as or better than a number of leading machine learning algorithms on two well-studied and substantial benchmarks.
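The reduction described in the abstract, in which the non-smooth penalty is bounded above by a quadratic so that every iteration becomes a re-weighted least-squares solve, can be sketched as follows. The function name, the ridge initialisation, and the use of ±1 labels as regression targets are illustrative assumptions for this sketch, not the paper's exact Fisher-criterion formulation.

```python
import numpy as np

def sparse_kfd_mm(K, y, lam=1.0, n_iter=30, eps=1e-8):
    """Majorize/minimize sketch of L1-penalised kernel discrimination.

    Each |a_j| in the penalty is majorized by a quadratic around the
    current iterate, so every step reduces to a re-weighted ridge solve,
    i.e. no more overhead than iteratively re-weighted least squares.
    K is the (n, n) kernel matrix; y holds +/-1 class labels used as
    regression targets (an illustrative simplification).
    """
    n = K.shape[0]
    G = K.T @ K
    b = K.T @ y
    a = np.linalg.solve(G + lam * np.eye(n), b)   # plain ridge start
    for _ in range(n_iter):
        d = lam / (np.abs(a) + eps)               # majorizer weights
        a = np.linalg.solve(G + np.diag(d), b)    # weighted ridge solve
    return a
```

As a coefficient shrinks towards zero its majorizer weight grows, effectively pruning that kernel term, which is how the non-smooth penalty yields a parsimonious expansion.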
Sparse kernel density estimation technique based on zero-norm constraint
A sparse kernel density estimator is derived based on the zero-norm constraint, in which the zero-norm of the kernel weights is incorporated to enhance model sparsity. The classical Parzen window estimate is adopted as the desired response for density estimation, and an approximate function of the zero-norm is used to achieve mathematical tractability and algorithmic efficiency. Under the mild condition of a positive definite design matrix, the kernel weights of the proposed density estimator based on the zero-norm approximation can be obtained using the multiplicative nonnegative quadratic programming algorithm. Using the D-optimality-based selection algorithm as a preprocessing step to select a small, significant subset of the design matrix, the proposed zero-norm-based approach offers an effective means of constructing very sparse kernel density estimates with excellent generalisation performance.
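A minimal sketch of the multiplicative non-negative weight update is given below. It fits the kernel weights to a Parzen target by plain least squares, with a simple renormalisation for the sum-to-one constraint; this is a simplified stand-in for the paper's MNQP algorithm (the zero-norm penalty term is omitted), and the candidate-subset step merely mimics the preselection idea.

```python
import numpy as np

def mnqp_weights(Phi, target, n_iter=300, eps=1e-12):
    """Multiplicative non-negative update sketch for sparse kernel weights.

    Minimises ||Phi @ beta - target||^2 with beta >= 0: the multiplicative
    form keeps every weight non-negative, and the renormalisation enforces
    the unity (sum-to-one) constraint.  Many weights decay towards zero,
    giving a sparse density estimate.
    """
    beta = np.full(Phi.shape[1], 1.0 / Phi.shape[1])
    num = Phi.T @ target                      # fixed numerator (all non-negative)
    for _ in range(n_iter):
        beta *= num / (Phi.T @ (Phi @ beta) + eps)
        beta /= beta.sum()
    return beta
```

Because every factor in the update is non-negative, no projection step is needed to keep the weights feasible.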
A Survey on Compiler Autotuning using Machine Learning
Since the mid-1990s, researchers have been trying to use machine-learning
based approaches to solve a number of different compiler optimization problems.
These techniques primarily enhance the quality of the obtained results and,
more importantly, make it feasible to tackle two main compiler optimization
problems: optimization selection (choosing which optimizations to apply) and
phase-ordering (choosing the order of applying optimizations). The compiler
optimization space continues to grow due to the advancement of applications,
increasing number of compiler optimizations, and new target architectures.
Generic optimization passes in compilers cannot fully leverage newly introduced
optimizations and, therefore, cannot keep up with the pace of increasing
options. This survey summarizes and classifies the recent advances in using
machine learning for the compiler optimization field, particularly on the two
major problems of (1) selecting the best optimizations and (2) the
phase-ordering of optimizations. The survey highlights the approaches taken so
far, the obtained results, the fine-grained classification among different
approaches, and the influential papers in the field.
Comment: version 5.0 (updated September 2018). Preprint of the survey
accepted at ACM CSUR 2018 (42 pages). History: Received November 2016;
Revised August 2017; Revised February 2018; Accepted March 2018
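The phase-ordering problem described above can be viewed as a search over permutations of compiler passes. The toy sketch below makes that concrete; the pass names and the synthetic cost table are invented for illustration, standing in for what a real autotuner would do (compile and measure the program, or query a learned cost model).

```python
import itertools

# Hypothetical pass set; a real compiler exposes many more.
PASSES = ("inline", "unroll", "vectorize", "dce")

def cost(order):
    """Synthetic stand-in for measured runtime of a given pass order."""
    c = 10.0
    if order.index("inline") < order.index("unroll"):
        c -= 2.0   # hypothetical interaction: inlining exposes unrolling
    if order[-1] == "dce":
        c -= 1.0   # hypothetical: clean up dead code last
    return c

# Exhaustive search is feasible only for tiny pass sets; the factorial
# growth of the permutation space is exactly why the surveyed work turns
# to machine-learning-guided search instead.
best = min(itertools.permutations(PASSES), key=cost)
```

Even this four-pass toy has 24 orderings; with the dozens of passes in a production compiler, enumeration is hopeless, motivating the learned predictors the survey classifies.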
Probability density estimation with tunable kernels using orthogonal forward regression
A generalized or tunable-kernel model is proposed for probability density function estimation based on an orthogonal forward regression procedure. Each stage of the density estimation process determines a tunable kernel, namely its center vector and diagonal covariance matrix, by minimizing a leave-one-out test criterion. The kernel mixing weights of the constructed sparse density estimate are finally updated using the multiplicative nonnegative quadratic programming algorithm to ensure the nonnegativity and unity constraints, and this weight-updating process additionally has the desired ability to further reduce the model size. The proposed tunable-kernel model has advantages, in terms of model generalization capability and model sparsity, over the standard fixed-kernel model that restricts kernel centers to the training data points and employs a single common kernel variance for every kernel. On the other hand, it does not optimize all the model parameters together and thus avoids the problems of high-dimensional, ill-conditioned nonlinear optimization associated with the conventional finite mixture model. Several examples are included to demonstrate the ability of the proposed tunable-kernel model to construct very compact yet accurate density estimates.
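The stage-by-stage construction can be sketched as a greedy forward regression over candidate kernel columns. In this sketch, plain squared error serves as the stage criterion in place of the paper's leave-one-out test score, and the candidate kernels are fixed rather than tuned per stage; both simplifications are assumptions of the illustration.

```python
import numpy as np

def forward_regression(Phi, target, n_terms=3):
    """Greedy forward-regression sketch: each stage adds the candidate
    column of Phi that best explains the current residual, then refits
    the weights of all chosen columns by least squares."""
    chosen, resid, w = [], target.copy(), None
    norms = np.linalg.norm(Phi, axis=0) + 1e-12
    for _ in range(n_terms):
        scores = np.abs(Phi.T @ resid) / norms    # correlation with residual
        scores[chosen] = -np.inf                  # skip already-chosen columns
        chosen.append(int(np.argmax(scores)))
        w, *_ = np.linalg.lstsq(Phi[:, chosen], target, rcond=None)
        resid = target - Phi[:, chosen] @ w
    return chosen, w
```

Stopping when the stage criterion no longer improves (rather than at a fixed `n_terms`) is what keeps the final model compact.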
Application of Fractal Dimension for Quantifying Noise Texture in Computed Tomography Images
Purpose
Evaluation of noise texture information in CT images is important for assessing image quality. Noise texture is often quantified by the noise power spectrum (NPS), which requires numerous image realizations to estimate. This study evaluated fractal dimension for quantifying noise texture as a scalar metric that can potentially be estimated using one image realization.
Methods
The American College of Radiology CT accreditation phantom (ACR) was scanned on a clinical scanner (Discovery CT750, GE Healthcare) at 120 kV and 25 and 90 mAs. Images were reconstructed using filtered back projection (FBP/ASIR 0%) with varying reconstruction kernels: Soft, Standard, Detail, Chest, Lung, Bone, and Edge. For each kernel, images were also reconstructed using ASIR 50% and ASIR 100% iterative reconstruction (IR) methods. Fractal dimension was estimated using the differential box-counting algorithm applied to images of the uniform section of the ACR phantom. The two-dimensional noise power spectrum (NPS) and one-dimensional, radially averaged NPS were estimated using established techniques. By changing the radiation dose, the effect of noise magnitude on fractal dimension was evaluated. The Spearman correlation between the fractal dimension and the frequency of the NPS peak was calculated. The number of images required to reliably estimate fractal dimension was determined and compared to the number of images required to estimate the NPS-peak frequency. The effect of region of interest (ROI) size on fractal dimension estimation was evaluated. Feasibility of estimating fractal dimension in an anthropomorphic phantom and a clinical image was also investigated, with the resulting fractal dimension compared to that estimated within the uniform section of the ACR phantom.
Results
Fractal dimension was strongly correlated with the frequency of the peak of the radially averaged NPS curve, having a Spearman rank-order coefficient of 0.98 (P-value < 0.01) for ASIR 0%. The mean fractal dimension at ASIR 0% was 2.49 (Soft), 2.51 (Standard), 2.52 (Detail), 2.57 (Chest), 2.61 (Lung), 2.66 (Bone), and 2.7 (Edge). A reduction in fractal dimension was observed with increasing ASIR levels for all investigated reconstruction kernels. Fractal dimension was found to be independent of noise magnitude. Fractal dimension was successfully estimated from four ROIs of size 64 × 64 pixels or one ROI of 128 × 128 pixels. Fractal dimension was found to be sensitive to non-noise structures in the image, such as ring artifacts and anatomical structure. Fractal dimension estimated within a uniform region of an anthropomorphic phantom and a clinical head image matched that estimated within the ACR phantom for filtered back projection reconstruction.
Conclusions
Fractal dimension correlated with the NPS-peak frequency and was independent of noise magnitude, suggesting that the scalar metric of fractal dimension can be used to quantify the change in noise texture across reconstruction approaches. Results demonstrated that fractal dimension can be estimated from four 64 × 64-pixel ROIs or one 128 × 128-pixel ROI within a head CT image, which may make it amenable for quantifying noise texture within clinical images.
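The differential box-counting estimator used above can be sketched as follows. The choice of box sizes, the 8-bit grey-level scaling, and the log-log fit via `polyfit` are implementation assumptions of this sketch, not necessarily the study's exact parameters.

```python
import numpy as np

def differential_box_counting(img, sizes=(2, 4, 8, 16)):
    """Differential box-counting sketch for a grey-level image.

    img: square 2-D array with values in [0, 255].  For box size s the
    image is tiled into s x s patches; each patch contributes
    ceil(max/h) - ceil(min/h) + 1 boxes, with box height h = s * 256 / M.
    The fractal dimension is the slope of log N(s) versus log(M/s).
    """
    M = img.shape[0]
    logN, logInv = [], []
    for s in sizes:
        h = s * 256.0 / M                      # box height in grey levels
        n_boxes = 0
        for i in range(0, M - M % s, s):
            for j in range(0, M - M % s, s):
                patch = img[i:i + s, j:j + s]
                n_boxes += int(np.ceil(patch.max() / h)
                               - np.ceil(patch.min() / h) + 1)
        logN.append(np.log(n_boxes))
        logInv.append(np.log(M / s))
    slope, _ = np.polyfit(logInv, logN, 1)     # least-squares log-log fit
    return slope
```

A perfectly flat ROI yields dimension 2, while increasingly rough (higher-frequency) noise pushes the estimate towards 3, which is consistent with the sharper kernels producing larger fractal dimensions in the results above.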
- …