
    Hyperspectral Unmixing Overview: Geometrical, Statistical, and Sparse Regression-Based Approaches

    Imaging spectrometers measure electromagnetic energy scattered in their instantaneous field of view in hundreds or thousands of spectral channels with higher spectral resolution than multispectral cameras. Imaging spectrometers are therefore often referred to as hyperspectral cameras (HSCs). Higher spectral resolution enables material identification via spectroscopic analysis, which facilitates countless applications that require identifying materials in scenarios unsuitable for classical spectroscopic analysis. Due to the low spatial resolution of HSCs, microscopic material mixing, and multiple scattering, the spectra measured by HSCs are mixtures of the spectra of the materials in a scene. Accurate estimation therefore requires unmixing. Pixels are assumed to be mixtures of a few materials, called endmembers. Unmixing involves estimating all or some of the following: the number of endmembers, their spectral signatures, and their abundances at each pixel. Unmixing is a challenging, ill-posed inverse problem because of model inaccuracies, observation noise, environmental conditions, endmember variability, and data set size. Researchers have devised and investigated many models in search of robust, stable, tractable, and accurate unmixing algorithms. This paper presents an overview of unmixing methods from the time of Keshava and Mustard's unmixing tutorial [1] to the present. Mixing models are discussed first, followed by signal-subspace, geometrical, statistical, sparsity-based, and spatial-contextual unmixing algorithms. Mathematical problems and potential solutions are described, and algorithm characteristics are illustrated experimentally. Comment: This work has been accepted for publication in the IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing.
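
    The linear mixing model underlying most unmixing algorithms can be illustrated in a few lines. Below is a minimal sketch, assuming just two endmembers with made-up 4-band spectra (not real material signatures); with two endmembers and the sum-to-one constraint, the abundance has a closed form.

```python
# Minimal linear-mixing sketch: a pixel spectrum is a convex combination
# of endmember spectra. With two endmembers and the sum-to-one constraint,
# the least-squares abundance has a closed form.

def unmix_two_endmembers(pixel, e1, e2):
    """Abundance of e1 when pixel ~= a*e1 + (1-a)*e2."""
    d = [x - y for x, y in zip(e1, e2)]                       # e1 - e2
    num = sum((p - y) * dx for p, y, dx in zip(pixel, e2, d))
    den = sum(dx * dx for dx in d)
    a = num / den
    return max(0.0, min(1.0, a))                              # project onto [0, 1]

# Hypothetical 4-band endmember spectra (illustrative values, not real data)
soil  = [0.30, 0.35, 0.40, 0.45]
grass = [0.05, 0.40, 0.10, 0.60]

# Synthesize a mixed pixel that is 70% soil and 30% grass, then unmix it
mixed = [0.7 * s + 0.3 * g for s, g in zip(soil, grass)]
a_soil = unmix_two_endmembers(mixed, soil, grass)
```

    With more than two endmembers one would instead solve a constrained least-squares problem (e.g. fully constrained least squares) or one of the geometrical, statistical, or sparse-regression methods the paper surveys.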

    An Adaptive Neuro-Fuzzy Inference System Based Approach to Real Estate Property Assessment

    This paper describes a first effort to design and implement an adaptive neuro-fuzzy inference system (ANFIS) based approach to estimating prices for residential properties. The data set consists of historic sales of homes in a market in the Midwest USA and contains parameters describing typical residential property features together with the actual sale price. The study explores the use of fuzzy inference systems to assess real estate property values, and the use of neural networks to create and fine-tune the fuzzy rules used in the fuzzy inference system. The results are compared with those obtained using a traditional multiple regression model. The paper also describes possible future research in this area.
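
    A fuzzy inference system of the kind ANFIS tunes can be sketched as a zeroth-order Sugeno model. The membership functions, rules, and consequent prices below are invented for illustration; in an ANFIS, a neural network would learn these parameters from the historic sales data.

```python
# Zeroth-order Sugeno fuzzy inference sketch for price estimation.
# Membership functions, rules, and consequent prices are hypothetical;
# an ANFIS would tune them against historic sales with a neural network.

def tri(x, a, b, c):
    """Triangular membership: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def estimate_price(sqft):
    # Firing strengths of two rules: "home is small" / "home is large"
    w_small = tri(sqft, 0, 1000, 2500)
    w_large = tri(sqft, 1000, 2500, 4000)
    # Constant consequents per rule (zeroth-order Sugeno), invented values
    out = w_small * 120_000.0 + w_large * 300_000.0
    total = w_small + w_large
    return out / total if total else None
```

    A mid-sized home fires both rules partially, so the estimate is a weighted average of the two rule outputs.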

    Latent class analysis for segmenting preferences of investment bonds

    Market segmentation is a key component of conjoint analysis which addresses consumer preference heterogeneity. Members of a segment are assumed to be homogeneous in their views and preferences when valuing an item but distinctly heterogeneous from members of other segments. Latent class methodology is one of several conjoint segmentation procedures that overcome the limitations of aggregate analysis and a priori segmentation. The main benefit of latent class models is that market segment membership and the regression parameters of each derived segment are estimated simultaneously. The latent class model presented in this paper uses mixtures of multivariate conditional normal distributions to analyze rating data, where the likelihood is maximized using the EM algorithm. The application focuses on customer preferences for investment bonds described by four attributes: currency, coupon rate, redemption term, and price. A number of demographic variables are used to generate segments that are accessible and actionable.
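
    The EM procedure for a normal mixture can be sketched in one dimension. The paper's model is multivariate and conditional on attributes; this illustrative version, on synthetic data, only shows the alternating E- and M-steps that assign points to latent segments and re-estimate segment parameters.

```python
import math
import random

# One-dimensional EM sketch for a two-component normal mixture, the
# building block behind latent class segmentation.

def em_two_normals(data, iters=200):
    xs = sorted(data)
    # crude initialization: component means at the quartiles
    mu = [xs[len(xs) // 4], xs[3 * len(xs) // 4]]
    var = [1.0, 1.0]
    weight = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each segment for each point
        resp = []
        for x in data:
            p = [weight[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, and variances per segment
        for k in range(2):
            nk = sum(r[k] for r in resp)
            weight[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return weight, mu, var

# Synthetic ratings drawn from two well-separated latent segments
random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(200)]
        + [random.gauss(10.0, 1.0) for _ in range(200)])
weights, means, variances = em_two_normals(data)
```

    Segment membership and the per-segment parameters come out of the same loop, which is the "simultaneous estimation" advantage the abstract highlights.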

    THE EFFECTIVENESS OF REMEDIAL COURSES IN ITALY: A FUZZY REGRESSION DISCONTINUITY DESIGN

    We evaluate the effects on student achievement of a number of remedial courses provided by an Italian university. To identify the causal effect of remediation we use a fuzzy regression discontinuity design, relying on the fact that students whose performance on a placement test fell below a certain cutoff were assigned to the treatment. We deal with partial compliance by using the assignment rule as an instrumental variable for effective attendance of the remedial courses. Our analysis shows that students just below the cutoff who attend the remedial courses acquire more credits than students just above the cutoff. We also find that remedial courses reduce the probability of dropping out of an academic career. On the other hand, we do not find any statistically significant effect on the average grade obtained in passed exams.
    Keywords: Remedial Courses, Tertiary Education, Public Policy, Fuzzy Regression Discontinuity Design, Instrumental Variables
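
    The fuzzy-RDD logic can be illustrated with the simple Wald estimator: within a bandwidth around the cutoff, the jump in the outcome is divided by the jump in treatment take-up. The data and variable names below are synthetic and hypothetical; as in the paper, students below the cutoff are the ones assigned to remediation.

```python
# Wald estimator for a fuzzy RDD: near the cutoff, divide the jump in the
# outcome by the jump in treatment take-up. All data below are synthetic;
# variable names (score, treated, credits) are hypothetical.

def wald_fuzzy_rdd(score, treated, outcome, cutoff, bandwidth):
    below = [i for i, s in enumerate(score) if cutoff - bandwidth <= s < cutoff]
    above = [i for i, s in enumerate(score) if cutoff <= s <= cutoff + bandwidth]
    mean = lambda idx, xs: sum(xs[i] for i in idx) / len(idx)
    jump_y = mean(below, outcome) - mean(above, outcome)   # outcome jump
    jump_d = mean(below, treated) - mean(above, treated)   # take-up jump
    return jump_y / jump_d                                 # LATE for compliers

# Synthetic data: being below the cutoff (assigned to remediation) raises
# take-up from 10% to 80%; attending adds 5 credits to a baseline of 20.
score   = [16] * 10 + [20] * 10
treated = [1] * 8 + [0] * 2 + [1] * 1 + [0] * 9
credits = [20 + 5 * t for t in treated]
late = wald_fuzzy_rdd(score, treated, credits, cutoff=18, bandwidth=3)
```

    Dividing by the take-up jump rescales the intention-to-treat difference into the effect for compliers, which is what the instrumental-variable step accomplishes in the paper.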

    Asset Allocation with Aversion to Parameter Uncertainty: A Minimax Regression Approach

    This paper takes a minimax regression approach to incorporating aversion to parameter uncertainty into the mean-variance model. The uncertainty-averse minimax mean-variance portfolio is obtained by minimizing, with respect to the unknown weights, the upper bound of the usual quadratic risk function over a fuzzy ellipsoidal set. Beyond the existing approaches, our methodology offers three main advantages. First, the resulting optimal portfolio can be interpreted as a Bayesian mean-variance portfolio with the least favorable prior density, and this result allows for a comprehensive comparison with traditional uncertainty-neutral Bayesian mean-variance portfolios. Second, the minimax mean-variance portfolio has a shrinkage expression, but its performance does not necessarily lie between those of the two reference portfolios. Third, we provide closed-form expressions for the standard errors of the minimax mean-variance portfolio weights, so that tests of the statistical significance of the optimal portfolio weights can easily be conducted. Empirical applications show that incorporating aversion to parameter uncertainty leads to more stable optimal portfolios that outperform traditional uncertainty-neutral Bayesian mean-variance portfolios.
    Keywords: Asset allocation, estimation error, aversion to uncertainty, minimax regression, Bayesian mean-variance portfolios, least favorable prior
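
    As background, the uncertainty-neutral building blocks can be sketched for two assets: a global minimum-variance weight plus a simple shrinkage toward equal weights. This illustrates the "shrinkage expression" idea only; it is not the paper's minimax estimator, and the variances are invented.

```python
# Two-asset building blocks: the global minimum-variance weight and a
# simple shrinkage toward equal weights. Illustrative only; this is the
# uncertainty-neutral baseline, not the paper's minimax estimator.

def min_variance_weight(var1, var2, cov12):
    """Weight on asset 1 in the two-asset global minimum-variance portfolio."""
    return (var2 - cov12) / (var1 + var2 - 2 * cov12)

def shrink(w, delta, target=0.5):
    """Pull a sample-based weight toward a fixed target by factor delta."""
    return delta * target + (1 - delta) * w

# Hypothetical annualized variances (vols of 20% and 30%, uncorrelated)
w1 = min_variance_weight(0.04, 0.09, 0.0)   # more weight on the calmer asset
w1_shrunk = shrink(w1, 0.5)                 # halfway toward equal weighting
```

    Shrinking toward a fixed target stabilizes weights estimated from noisy sample moments, which is the intuition behind shrinkage-type portfolios generally.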

    Natural experiments: An overview of methods, approaches, and contributions to public health intervention research

    Population health interventions are essential to reduce health inequalities and tackle other public health priorities, but they are not always amenable to experimental manipulation. Natural experiment (NE) approaches are attracting growing interest as a way of providing evidence in such circumstances. One key challenge in evaluating NEs is selective exposure to the intervention. Studies should be based on a clear theoretical understanding of the processes that determine exposure. Even if the observed effects are large and rapidly follow implementation, confidence in attributing these effects to the intervention can be improved by carefully considering alternative explanations. Causal inference can be strengthened by including additional design features alongside the principal method of effect estimation. NE studies often rely on existing (including routinely collected) data. Investment in such data sources and in the infrastructure for linking exposure and outcome data is essential if the potential for such studies to inform decision making is to be realized.