
    Regional development assessment using parametric and non-parametric ranking methods: A comparative analysis of Slovenia and Croatia

    In this paper we describe several regional development assessment methods and apply them in a comparative analysis of the development level of Slovenian and Croatian municipalities. The aim is to compare the performance and suitability of several parametric and non-parametric ranking methods and to develop a multivariate methodological framework for distinguishing the development levels of individual territorial units. The usefulness and appropriateness of the various multivariate techniques for regional development assessment is generally questionable, and there is no clear consensus on how to carry out such an analysis. The two main methodological approaches are parametric and non-parametric. In the former, an explicit econometric model containing theory-implied causal and possibly simultaneous relationships is estimated using likelihood-based methods and formally assessed in terms of goodness of fit and other test statistics; this subsequently allows the development level to be estimated on a metric scale. In the latter, territorial units or regions are classified into clusters or groups that differ in development level, but no formal inferential methods are applied to confirm the validity of the model or to establish the difference in development level on a metric scale. The advantages of the first approach are the existence of formal testing and evaluation procedures and the production of interval ranks of the analysed units; its disadvantages are a lack of robustness, often unrealistic distributional assumptions, and the possible invalidity of the theoretically implied causal relationships. In this paper we consider a parametric, inferential approach based on maximum likelihood estimation of a linear structural equation model with latent variables for metric-scale development ranking, and a non-parametric approach based on cluster analysis for development grouping.
Our analysis is based on ten regional development variables, such as income per capita, population density and age index, which are collected in a similar way and generally comparable for both countries. Within the parametric approach, a simultaneous equation econometric model is estimated and latent scores are computed for each underlying latent development variable, with three latent constructs postulated to correspond to economic, structural and demographic development dimensions. In the non-parametric approach, a combination of Ward's hierarchical method and the K-means clustering procedure is applied to classify the territorial units. We apply both methodological frameworks to Slovenian and Croatian municipality data and assess the regional development level. We then compare the performance of the two methods and show to what degree their results are compatible. Finally, we propose a unified framework based on both parametric and non-parametric methods, in which clustering techniques are performed both on the original development indicators and on the latent scores computed from the structural equation model, and compare these results with those from each of the two methods applied separately. We show that the combined parametric/non-parametric approach is superior to either approach applied individually, and propose a methodological framework capable of estimating the development level of territorial units or regions on a metric scale while at the same time preserving the robustness of the non-parametric techniques.
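The two-step non-parametric grouping described in the abstract (Ward's hierarchical method to fix an initial partition, then K-means to refine it) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: it assumes SciPy is available and uses synthetic, well-separated indicator data in place of the real ten municipality variables.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic stand-in for the municipality data: three development levels,
# ten standardized indicators each (sizes and values are illustrative).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(20, 10)) for m in (0.0, 2.0, 4.0)])

# Step 1: Ward's hierarchical method gives an initial three-group partition.
Z = linkage(X, method="ward")
init = fcluster(Z, t=3, criterion="maxclust")          # labels in {1, 2, 3}

# Step 2: K-means refinement seeded with the Ward cluster centroids.
centroids = np.vstack([X[init == k].mean(axis=0) for k in (1, 2, 3)])
for _ in range(100):
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)                      # labels in {0, 1, 2}
    new = np.vstack([X[labels == k].mean(axis=0) for k in range(3)])
    if np.allclose(new, centroids):                    # converged
        break
    centroids = new
```

Seeding K-means with the Ward centroids, rather than random starts, is what makes the combined procedure deterministic and is a common rationale for pairing the two methods.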

    Future probabilistic hot summer years for overheating risk assessments

    As the 2003 Paris heatwave showed, elevated temperatures in buildings can cause thousands of deaths, which makes the assessment of overheating risk a critical exercise. Unfortunately, current methods of creating example weather time series for the assessment of overheating are based on a single weather variable, and hence on only one driver of discomfort or mortality. In this study, two alternative approaches for the development of current and future weather files are presented: one (pHSY-1) is based on Weighted Cooling Degree Hours (WCDH), the other (pHSY-2) on Physiologically Equivalent Temperature (PET). pHSY-1 and pHSY-2 files were produced for fourteen locations and compared with the existing probabilistic future Design Summer Year (pDSY) and the probabilistic future Test Reference Year. Both pHSY-1 and pHSY-2 were found to be more robust than the pDSY. It is suggested that pHSY-1 could be used for assessing the severity and occurrence of overheating, while pHSY-2 could be used for evaluating thermal discomfort or heat stress. The results also highlight an important limitation in using different metrics to compare overheating years. If the weather year is created by ranking a single environmental variable, then to ensure consistent results the building should be assessed with a similar single metric (e.g. hours >28 °C or WCDH); if, however, the weather year is based upon several environmental variables, then a composite metric (e.g. PET or Fanger's PMV) should be used. This has important implications for the suitability of weather files for thermal comfort analysis. This research was supported by the Engineering and Physical Sciences Research Council (EPSRC) via grants EP/M021890/1 and EP/M022099/1. All data created during this research are available from the University of Bath data archive at http://doi.org/10.15125/BATH-00190
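WCDH, the metric behind pHSY-1, is commonly defined as the sum of squared exceedances of the operative temperature over a comfort temperature, counted only for hours that are too warm. A minimal sketch under that assumed definition (the function name and example values are illustrative, not taken from the paper):

```python
import numpy as np

def wcdh(operative_temp, comfort_temp):
    """Weighted Cooling Degree Hours: sum of squared exceedances of the
    operative temperature over the comfort temperature, counting only
    hours that are too warm (an illustrative form of the metric)."""
    exceed = np.maximum(np.asarray(operative_temp, float)
                        - np.asarray(comfort_temp, float), 0.0)
    return float(np.sum(exceed ** 2))

# Three hours at 27, 29 and 31 degC against a 28 degC comfort temperature:
# exceedances of 0, 1 and 3 K give WCDH = 0 + 1 + 9 = 10.
score = wcdh([27.0, 29.0, 31.0], 28.0)
```

The squaring is what makes the metric "weighted": a single hour 3 K too warm counts far more than three hours 1 K too warm, which is why WCDH ranks severe years differently from a simple hours-above-threshold count.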

    Quantifying image distortion based on Gabor filter bank and multiple regression analysis

    Image quality assessment is indispensable for image-based applications. Approaches to image quality assessment fall into two main categories: subjective and objective methods. Subjective assessment has been widely used; however, careful subjective assessments are experimentally difficult and lengthy, and the results obtained may vary depending on the test conditions. Objective image quality assessment, on the other hand, not only alleviates these difficulties but also helps to expand the field of application. Several methods have therefore been developed for quantifying the distortion present in an image, achieving goodness of fit between subjective and objective scores of up to 92%. Nevertheless, current methodologies are designed on the assumption that the nature of the distortion is known. This is generally a limiting assumption for practical applications, since in the majority of cases the distortions in an image are unknown. We therefore believe that current methods of image quality assessment should be adapted so as to identify and quantify the distortion of images at the same time; such a combination can improve processes such as enhancement, restoration, compression and transmission, among others. We present an approach based on the power of experimental design and the joint localization of Gabor filters for studying the influence of spatial frequencies on image quality assessment, allowing a correct identification and quantification of the distortion affecting images. This method provides accurate scores and differentiability between distortions.
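A Gabor filter bank of the kind the abstract relies on can be sketched in a few lines. This is a generic illustration, assuming SciPy for the convolution; the frequencies, orientations and the choice of mean response energy as the per-channel feature are assumptions, not the paper's exact design:

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(freq, theta, sigma=3.0, size=15):
    """A real-valued Gabor kernel: Gaussian envelope times a cosine carrier
    of spatial frequency `freq` (cycles/pixel) at orientation `theta`."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_r = x * np.cos(theta) + y * np.sin(theta)
    y_r = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(x_r**2 + y_r**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * x_r)

def bank_features(img, freqs=(0.10, 0.25), thetas=(0, np.pi/4, np.pi/2, 3*np.pi/4)):
    """Mean filter-response energy per (frequency, orientation) channel --
    one possible feature vector to feed a multiple-regression quality model."""
    feats = [np.mean(fftconvolve(img, gabor_kernel(f, t), mode="same") ** 2)
             for f in freqs for t in thetas]
    return np.array(feats)

rng = np.random.default_rng(1)
features = bank_features(rng.random((32, 32)))  # 2 freqs x 4 orientations = 8 features
```

Distortions such as blur or noise redistribute energy across these frequency/orientation channels in characteristic ways, which is what lets a regression on the feature vector both identify and quantify the distortion.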

    No-reference image quality assessment through the von Mises distribution

    An innovative way of calculating the von Mises distribution (VMD) of image entropy is introduced in this paper. The VMD's concentration parameter, and a fitness parameter defined later in the paper, are analyzed in the experimental part to determine their suitability as image quality assessment measures under particular distortions such as Gaussian blur or additive Gaussian noise. To obtain these measures, the local Rényi entropy is calculated in four equally spaced orientations and used to determine the parameters of the von Mises distribution of the image entropy. For contextual images, experimental results from this model show that the best-in-focus, noise-free images are associated with the highest values of the von Mises concentration parameter and with the closest approximation of the image data to the von Mises distribution model. The von Mises fitness parameter we define also turns out, experimentally, to be a suitable no-reference image quality assessment indicator for non-contextual images.
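The concentration parameter kappa of a von Mises distribution can be estimated from directional data through the mean resultant length. A minimal sketch using Fisher's standard approximation, with made-up weights standing in for the local Rényi entropies measured along the four orientations (the paper's actual estimator may differ):

```python
import numpy as np

def vonmises_kappa(angles, weights):
    """Estimate the von Mises concentration parameter kappa from weighted
    directional data via the mean resultant length R, using Fisher's
    approximation kappa ~= R * (2 - R^2) / (1 - R^2)."""
    w = np.asarray(weights, float)
    w = w / w.sum()
    C = float(np.sum(w * np.cos(angles)))
    S = float(np.sum(w * np.sin(angles)))
    R = np.hypot(C, S)
    return R * (2.0 - R**2) / (1.0 - R**2)

# Four equally spaced orientations; the weights play the role of the local
# entropy measured along each orientation (values are illustrative).
angles = np.array([0.0, np.pi / 2, np.pi, 3 * np.pi / 2])
flat = vonmises_kappa(angles, [0.25, 0.25, 0.25, 0.25])    # isotropic -> kappa ~ 0
peaked = vonmises_kappa(angles, [0.70, 0.10, 0.10, 0.10])  # concentrated -> larger kappa
```

A near-zero kappa means the entropy is spread evenly over orientations (as in a heavily blurred or noisy image), while a large kappa means it concentrates around one direction, matching the abstract's claim that the sharpest, noise-free images yield the highest concentration values.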

    A review on the complementarity of renewable energy sources: concept, metrics, application and future research directions

    It is expected, and in some regions already observed, that energy demand will soon be covered by a widespread deployment of renewable energy sources. However, weather- and climate-driven energy sources are characterized by significant spatial and temporal variability. One commonly proposed solution to the mismatch between demand and renewable supply is the hybridization of two or more energy sources in a single power station (such as wind-solar, solar-hydro or solar-wind-hydro), whose operation relies on the complementary nature of the renewable sources. Considering the growing importance of such systems and the increasing number of research activities in this area, this paper presents a comprehensive review of studies that have investigated, analyzed, quantified and utilized the effect of temporal, spatial and spatio-temporal complementarity between renewable energy sources. The review starts with a brief overview of the available research papers, formulates detailed definitions of the major concepts, summarizes current research directions and ends with prospective future research activities. It also provides chronological and spatial information about the studies of the complementarity concept.
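The simplest and most widely used temporal-complementarity metric in this literature is the Pearson correlation between two generation time series, with values near -1 indicating strongly complementary sources. A minimal sketch on stylized daily profiles (the profiles and function name are illustrative, not drawn from the review):

```python
import numpy as np

def temporal_complementarity(a, b):
    """Pearson correlation between two generation time series: values near
    -1 indicate strongly complementary sources, near +1 coincident ones."""
    return float(np.corrcoef(np.asarray(a, float), np.asarray(b, float))[0, 1])

# Stylized hourly daily profiles: a solar-like daytime curve, and a wind
# resource that (for illustration) peaks exactly when solar output is low.
hours = np.arange(24)
solar = np.maximum(np.sin((hours - 6) * np.pi / 12), 0.0)  # daylight 06:00-18:00
wind = 1.0 - solar                                         # perfectly complementary
rho = temporal_complementarity(solar, wind)                # close to -1
```

Real wind-solar pairs rarely reach -1, but a consistently negative correlation at a site is exactly the signal that motivates hybridizing the two sources in one power station.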

    An Efficient Dual Approach to Distance Metric Learning

    Distance metric learning is of fundamental interest in machine learning because the distance metric employed can significantly affect the performance of many learning methods. Quadratic Mahalanobis metric learning is a popular approach to the problem, but typically requires solving a semidefinite programming (SDP) problem, which is computationally expensive. Standard interior-point SDP solvers typically have a complexity of O(D^{6.5}) (with D the dimension of the input data), and can thus only practically solve problems with fewer than a few thousand variables. Since the number of variables is D(D+1)/2, this implies a limit of around a few hundred dimensions on the size of problem that can practically be solved. The complexity of the popular quadratic Mahalanobis metric learning approach thus limits the size of problem to which metric learning can be applied. Here we propose a significantly more efficient approach to the metric learning problem based on the Lagrange dual formulation of the problem. The proposed formulation is much simpler to implement, and therefore allows much larger Mahalanobis metric learning problems to be solved. The time complexity of the proposed method is O(D^3), which is significantly lower than that of the SDP approach. Experiments on a variety of datasets demonstrate that the proposed method achieves accuracy comparable to the state of the art, but is applicable to significantly larger problems. We also show that the proposed method can be applied to approximately solve more general Frobenius-norm regularized SDP problems.
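The O(D^3) cost quoted in the abstract comes from operating on the D x D metric matrix directly, the dominant such operation in eigendecomposition-based solvers being projection onto the positive semidefinite cone. A generic sketch of that step (this is the standard PSD projection, not necessarily the paper's exact dual update):

```python
import numpy as np

def project_psd(M):
    """Project a symmetric matrix onto the positive semidefinite cone by
    zeroing its negative eigenvalues -- an O(D^3) eigendecomposition step
    common to first-order and dual Mahalanobis metric-learning solvers."""
    sym = (M + M.T) / 2.0                        # symmetrize defensively
    vals, vecs = np.linalg.eigh(sym)             # ascending eigenvalues
    return (vecs * np.clip(vals, 0.0, None)) @ vecs.T

# An indefinite candidate metric and its nearest PSD matrix in Frobenius norm:
M = np.array([[1.0, 0.0],
              [0.0, -2.0]])
P = project_psd(M)
```

Because the result of clipping the spectrum is the Frobenius-nearest PSD matrix, an iteration built around this projection stays feasible (a valid Mahalanobis metric) at O(D^3) per step instead of the O(D^{6.5}) of a full interior-point SDP solve.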