
    The Modified Bottleneck Assignment Problem in Vector Case ―An Idea to Apply a Clustering Method―

    In this study, we deal with the bottleneck assignment problem in the vector case, which is NP-complete. Our idea is to use a clustering method to divide the original problem into subproblems: each set of vertices is partitioned into subsets by a non-hierarchical clustering method, the optimal combination of the subsets is determined, and the vertices within each pair of combined subsets are then matched. We demonstrate the effectiveness of this idea through numerical experiments.
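The divide-and-conquer idea in this abstract can be sketched as follows. The snippet below is an illustrative stand-in, not the authors' algorithm: a coordinate sort replaces their non-hierarchical clustering, Euclidean distance between the vectors replaces their vector-valued cost, and brute force solves each small subproblem.

```python
import itertools

import numpy as np

def bottleneck_assignment(cost):
    """Exact bottleneck assignment by brute force: find the permutation p
    minimising max_i cost[i, p[i]].  Only viable for tiny n, which is why
    the original problem is divided into small subproblems."""
    n = len(cost)
    best_p, best_val = None, float("inf")
    for p in itertools.permutations(range(n)):
        val = max(cost[i][p[i]] for i in range(n))
        if val < best_val:
            best_val, best_p = val, list(p)
    return best_p, best_val

def decomposed_bottleneck(A, B, blocks=2):
    """Heuristic in the spirit of the abstract: partition both vertex sets
    into equal-size groups (sorting on the first coordinate stands in for
    the paper's non-hierarchical clustering), pair the groups, solve each
    small bottleneck subproblem exactly, and stitch the results together."""
    n = len(A)
    order_a, order_b = np.argsort(A[:, 0]), np.argsort(B[:, 0])
    size = n // blocks
    perm, worst = [None] * n, 0.0
    for blk in range(blocks):
        ia = order_a[blk * size:(blk + 1) * size]
        ib = order_b[blk * size:(blk + 1) * size]
        cost = np.linalg.norm(A[ia][:, None] - B[ib][None], axis=-1)
        p, val = bottleneck_assignment(cost)
        worst = max(worst, val)
        for row, col in enumerate(p):
            perm[ia[row]] = ib[col]
    return perm, worst
```

The heuristic's worst edge can never beat the exact optimum on the full problem, but each subproblem is exponentially cheaper, which is the trade-off the experiments in the paper evaluate.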

    "Rotterdam econometrics": publications of the econometric institute 1956-2005

    This paper contains a list of all publications over the period 1956-2005, as reported in the Rotterdam Econometric Institute Reprint series during 1957-2005.

    Optimal Design of Validation Experiments for Calibration and Validation of Complex Numerical Models

    As prediction of the performance and behavior of complex engineering systems shifts from a primarily empirical-based approach to the use of complex physics-based numerical models, the role of experimentation is evolving to calibrate, validate, and quantify uncertainty of the numerical models. Oftentimes, these experiments are expensive, placing importance on selecting experimental settings to efficiently calibrate the numerical model with a limited number of experiments. The aim of this thesis is to reduce the experimental resources required to reach predictive maturity in complex numerical models by (i) aiding experimenters in determining the optimal settings for experiments, and (ii) aiding model developers in assessing the predictive maturity of numerical models through a new, more refined coverage metric. Numerical model predictions entail uncertainties, caused primarily by imprecisely known input parameter values, and biases, caused primarily by simplifications and idealizations in the model. Hence, calibration of numerical models involves not only updating parameter values but also inferring the discrepancy bias, an empirically trained error model. Training of this error model throughout the domain of applicability becomes possible when experiments conducted at varying settings are available. Of course, for the trained discrepancy bias to be meaningful and a numerical model to be predictively mature, the validation experiments must sufficiently cover the operational domain. Otherwise, poor training of the discrepancy bias and overconfidence in model predictions may result. Thus, coverage metrics are used to quantify the ability of a set of validation experiments to represent an entire operational domain. This thesis is composed of two peer-reviewed journal articles. The first article focuses on the optimal design of validation experiments. 
The ability to improve the predictive maturity of a plasticity material model is assessed for several index-based and distance-based batch sequential design selection criteria through a detailed analysis of discrepancy bias and coverage. Furthermore, the effect of experimental uncertainty, complexity of the discrepancy bias, and initial experimental settings on the performance of each criterion is evaluated. Lastly, a technique that integrates index-based and distance-based selection criteria to both exploit the available knowledge regarding the discrepancy bias and explore the operational domain is evaluated. This article was published in Structural and Multidisciplinary Optimization in 2013. The second article focuses on developing a coverage metric. Four characteristics of an exemplar coverage metric are identified, and the ability of coverage metrics from the literature to satisfy the four criteria is evaluated. No existing coverage metric is found to satisfy all four criteria. As a solution, a new coverage metric is proposed which exhibits satisfactory performance in all four criteria. The performance of the proposed coverage metric is compared to that of the existing coverage metrics using an application to the plasticity material model as well as a high-dimensional Rosenbrock function. This article was published in Mechanical Systems and Signal Processing in 2014.
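As a rough illustration of the two ingredients discussed above, a generic coverage measure and a distance-based (space-filling) selection criterion can be sketched as below. Neither is the thesis's proposed metric or criterion; the Monte-Carlo fill distance and the maximin selection rule are standard stand-ins assumed here for illustration.

```python
import numpy as np

def fill_distance(experiments, bounds, n_samples=20000, seed=0):
    """Monte-Carlo estimate of the fill distance: the largest distance
    from any point of the operational domain to its nearest validation
    experiment.  Smaller values indicate better coverage."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    samples = rng.uniform(lo, hi, size=(n_samples, len(lo)))
    gaps = np.linalg.norm(samples[:, None] - experiments[None], axis=-1)
    return gaps.min(axis=1).max()

def next_experiment(experiments, candidates):
    """Distance-based (space-filling) batch selection: pick the candidate
    setting farthest from every existing experiment, i.e. pure exploration
    of the operational domain."""
    d = np.linalg.norm(candidates[:, None] - experiments[None], axis=-1)
    return candidates[np.argmax(d.min(axis=1))]
```

For example, with experiments at the four corners of a unit square, the maximin rule selects the center as the next setting, and adding it shrinks the fill distance, i.e. improves coverage.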

    Hyperspectral Data Acquisition and Its Application for Face Recognition

    Current face recognition systems face serious challenges in uncontrolled conditions, e.g., unconstrained lighting, pose variations, and accessories. Hyperspectral imaging (HI) is typically employed to counter many of those challenges by incorporating the spectral information within different bands. Although numerous methods based on hyperspectral imaging have been developed for face recognition with promising results, three fundamental challenges remain: 1) low signal-to-noise ratios and low intensity values in the bands of the hyperspectral image, specifically near the blue bands; 2) high dimensionality of hyperspectral data; and 3) inter-band misalignment (IBM) correlated with subject motion during data acquisition. This dissertation concentrates mainly on addressing the aforementioned challenges in HI. First, to address the low quality of the bands of the hyperspectral image, we utilize a custom light source that has more radiant power at shorter wavelengths and adjust camera exposure times to compensate for the lower transmittance of the filter and the lower radiant power of our light source. Second, the high dimensionality of spectral data imposes limitations on numerical analysis. As such, there is an emerging demand for robust data compression techniques with loss of less-relevant information to manage real spectral data. To cope with these challenging problems, we describe a reduced-order data modeling technique based on local proper orthogonal decomposition in order to compute low-dimensional models by projecting high-dimensional clusters onto subspaces spanned by local reduced-order bases. Third, we investigate 11 leading alignment approaches to address IBM correlated with subject motion during data acquisition. To overcome the limitations of the considered alignment approaches, we propose an accurate alignment approach (A3) by incorporating the strengths of point correspondence and a low-rank model. 
In addition, we develop two qualitative prediction models to assess the alignment quality of hyperspectral images and to identify the best alignment among the considered approaches. Finally, we show that the proposed alignment approach leads to a promising improvement in the face recognition performance of a probabilistic linear discriminant analysis approach.
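The reduced-order modeling step above can be illustrated with a plain global proper orthogonal decomposition via the SVD. The dissertation applies POD *locally*, per cluster of pixels; the sketch below shows only the projection idea with one global basis, and all shapes and names are chosen for illustration.

```python
import numpy as np

def pod_compress(cube, rank):
    """Reduced-order model of a hyperspectral cube (H x W x bands) via
    proper orthogonal decomposition: flatten pixels into spectra, build a
    rank-r spectral basis from the SVD of the mean-centred data, and
    project onto it.  A single global basis is used here for brevity;
    local POD would build one such basis per pixel cluster."""
    h, w, nbands = cube.shape
    X = cube.reshape(-1, nbands).astype(float)
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    basis = Vt[:rank]                  # dominant spectral modes
    coeffs = (X - mean) @ basis.T      # low-dimensional coordinates
    recon = coeffs @ basis + mean      # reconstruction from r modes
    return coeffs.reshape(h, w, rank), recon.reshape(h, w, nbands)
```

If the spectra truly lie near a low-dimensional subspace, a small rank reproduces the cube almost exactly while storing only `rank` coefficients per pixel instead of one value per band.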
