Fitting aggregation operators to data
Theoretical advances in modelling the aggregation of information have produced a wide range of aggregation operators, applicable to almost every practical problem. The most important classes of aggregation operators include triangular norms, uninorms, generalised means and OWA operators. With such a variety, an important practical problem has emerged: how can the parameters/weights of these families of aggregation operators be fitted to observed data? How can one estimate quantitatively whether a given class of operators is suitable as a model in a given practical setting? Aggregation operators are rather special classes of functions, and thus they require specialised regression techniques that enforce important theoretical properties, such as commutativity or associativity. My presentation will address this issue in detail, and will discuss various regression methods applicable specifically to t-norms, uninorms and generalised means. I will also demonstrate software implementing these regression techniques, which allows practitioners to paste their data and obtain optimal parameters of the chosen family of operators.
Development and evaluation of multiple criteria decision-making approaches to watershed management
Decision-making in environmental management is complex due to the multiplicity and diversity of management objectives and technological choices. This suggests that modelers and experts could utilize (1) multiple-criteria decision-making (MCDM) approaches to assist stakeholder groups in integrating and synthesizing relevant data and information to address ecological and socio-economic concerns, and (2) uncertainty approaches to quantify the risks related to the impact of decision alternatives. Since decision-making under uncertainty and MCDM methods have been studied almost independently, most MCDM approaches do not address the uncertainties of real-world decision situations. This dissertation presents the use of an MCDM methodology and its related decision-making tool, RESTORE. RESTORE is an integrative geographical information system-based decision-making tool that was developed to help watershed councils prioritize and evaluate restoration activities at the watershed level. RESTORE's deterministic performance evaluation module is developed from experts' knowledge and experience. However, to fully address the complexity of the various landscape processes and human subjectivity, RESTORE should account for the uncertainties inherent in experts' knowledge. No single method is able to model all types of uncertainty; therefore, the examination of various uncertainty theories is critical before selecting the one best suited to a specific decision context. This work explores three uncertainty theories: the certainty factor model, Dempster-Shafer theory, and fuzzy set theory. To evaluate these methods in an MCDM watershed restoration context, we (1) identified criteria to assess the suitability of a method for a specific MCDM context, (2) characterized each theory in terms of the identified criteria using RESTORE, and (3) applied each theory using RESTORE. Special emphasis was given to the development of a comprehensive fuzzy MCDM methodology.
Uncertainty-based MCDM approaches provide a valuable tool for analyzing complex watershed management issues. When used properly, the proposed MCDM methodology allows decision-makers (DMs) to explore a broader range of drivers and consequences. The inclusion of uncertainty analysis provides DMs with meaningful information on the quality of the evidence supporting the impact of a decision alternative, allowing them to make more informed decisions.
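To illustrate the flavour of fuzzy-set-based MCDM scoring, the sketch below aggregates triangular fuzzy ratings of alternatives under crisp criterion weights and defuzzifies by centroid. The criteria, ratings, weights and function names are invented for illustration; they are not taken from RESTORE or the dissertation's methodology.

```python
# Triangular fuzzy number (TFN): a triple (a, b, c) with a <= b <= c.

def tfn_add(x, y):
    # Component-wise sum of two TFNs
    return tuple(xi + yi for xi, yi in zip(x, y))

def tfn_scale(x, k):
    # Scale a TFN by a non-negative crisp weight
    return tuple(k * xi for xi in x)

def centroid(x):
    # Defuzzify a TFN by its centroid (a + b + c) / 3
    return sum(x) / 3.0

def fuzzy_score(ratings, weights):
    # Weighted sum of fuzzy ratings; weights are crisp and sum to 1
    total = (0.0, 0.0, 0.0)
    for r, w in zip(ratings, weights):
        total = tfn_add(total, tfn_scale(r, w))
    return centroid(total)

# Two hypothetical restoration alternatives rated on three criteria
weights = [0.5, 0.3, 0.2]
alt_a = [(0.6, 0.8, 1.0), (0.4, 0.6, 0.8), (0.2, 0.4, 0.6)]
alt_b = [(0.2, 0.4, 0.6), (0.6, 0.8, 1.0), (0.6, 0.8, 1.0)]
print(fuzzy_score(alt_a, weights), fuzzy_score(alt_b, weights))
```

The fuzzy ratings carry the experts' imprecision through the aggregation, so DMs see not just a ranking but how wide the supporting evidence is before defuzzification collapses it to a single score.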
Optimization of scientific algorithms in heterogeneous systems and accelerators for high performance computing
Today, general-purpose GPU computing is one of the basic pillars of high-performance computing. Although hundreds of applications have been accelerated on GPUs, some scientific algorithms remain little studied. The motivation of this thesis has therefore been to investigate the possibility of significantly accelerating a set of algorithms from this group on the GPU.
First, an optimized implementation was obtained of the CAVLC (Context-Adaptive Variable-Length Coding) video and image compression algorithm, the most widely used entropy-coding method in the H.264 video coding standard. The speedup over the best previous implementation is between 2.5x and 5.4x. This solution can be used as the entropy component of software H.264 encoders, and in video and image compression systems for formats other than H.264, such as medical imaging.
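Full CAVLC is too involved to sketch here, but the core idea of H.264-style variable-length entropy coding can be shown with the order-0 exponential-Golomb code, which H.264 uses for many of its syntax elements. This is an illustrative aside, not the thesis's GPU implementation.

```python
def exp_golomb_encode(x: int) -> str:
    """Order-0 exponential-Golomb code for an unsigned integer:
    write len(bin(x+1)) - 1 leading zeros, then x+1 in binary.
    Smaller (more frequent) values get shorter codewords, which is
    the principle behind variable-length entropy coding."""
    b = bin(x + 1)[2:]              # binary representation of x + 1
    return "0" * (len(b) - 1) + b   # prefix of zeros, one fewer than len(b)

for v in range(5):
    print(v, exp_golomb_encode(v))
# 0 -> 1, 1 -> 010, 2 -> 011, 3 -> 00100, 4 -> 00101
```

Because every codeword is self-delimiting, a decoder can count leading zeros to know how many bits to read next, with no side table needed.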
Second, GUD-Canny was developed, an unsupervised, distributed Canny edge detector. The system overcomes the main limitations of existing implementations of the Canny algorithm: the bottleneck caused by the hysteresis process and the use of fixed hysteresis thresholds. A given image is divided into a set of sub-images and, for each of them, a pair of hysteresis thresholds is computed in an unsupervised way using the Medina-Carnicer method. The detector satisfies the real-time requirement, with an average time of 0.35 ms to detect the edges of a 512x512 image.
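The hysteresis step referred to above can be sketched as a serial flood fill over the gradient-magnitude map; this serial dependency is precisely the bottleneck the GUD-Canny design attacks by splitting the image into sub-images. The sketch below uses fixed thresholds on a toy grid, unlike the unsupervised per-tile thresholds (Medina-Carnicer method) of the thesis.

```python
from collections import deque

def hysteresis(mag, low, high):
    """Canny-style hysteresis linking: pixels with magnitude >= high
    seed edges; pixels >= low are kept only if 8-connected (directly
    or transitively) to a seed. Serial illustrative sketch only."""
    h, w = len(mag), len(mag[0])
    edges = [[False] * w for _ in range(h)]
    q = deque((i, j) for i in range(h) for j in range(w) if mag[i][j] >= high)
    for i, j in q:
        edges[i][j] = True                       # strong pixels are edges
    while q:
        i, j = q.popleft()
        for di in (-1, 0, 1):                    # visit the 8 neighbours
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w and not edges[ni][nj] \
                        and mag[ni][nj] >= low:
                    edges[ni][nj] = True         # weak pixel linked to an edge
                    q.append((ni, nj))
    return edges

mag = [[0, 1, 0, 0],
       [0, 5, 2, 0],
       [0, 0, 3, 9],
       [0, 0, 0, 0]]
out = hysteresis(mag, low=2, high=5)   # keeps the connected chain, drops the isolated weak pixel
```

Because the flood fill can cross the whole image, tiles cannot be processed fully independently without some strategy at their borders, which is why a distributed formulation of this step is non-trivial.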
Third, an optimized implementation was produced of the VLE (Variable-Length Encoding) data compression method, which is on average 2.6x faster than the best previous implementation. In addition, this solution includes a new inter-block scan method, which can be used to accelerate the scan operation itself and other algorithms, such as stream compaction. For the scan operation, a 1.62x speedup is achieved when the proposed method is used instead of the one in the best previous VLE implementation.
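The role scan plays in a parallel VLE encoder can be illustrated with an exclusive prefix sum over per-symbol codeword lengths: it gives each symbol the bit offset at which its codeword must land, so all codewords can be written concurrently. The serial sketch below shows only the idea; the thesis's contribution is a parallel inter-block GPU formulation of this primitive.

```python
def exclusive_scan(lengths):
    """Exclusive prefix sum: out[i] = sum of lengths[:i].
    In a parallel VLE encoder, lengths[i] is the codeword length
    (in bits) of symbol i, so out[i] is the output bit offset where
    that codeword is written. Serial sketch of the idea only."""
    out, total = [], 0
    for n in lengths:
        out.append(total)   # offset of the current symbol
        total += n          # running total of bits emitted so far
    return out, total

# Codeword lengths (bits) of four symbols
offsets, nbits = exclusive_scan([3, 1, 4, 2])
print(offsets, nbits)   # [0, 3, 4, 8] 10
```

The same primitive drives stream compaction, mentioned above as another beneficiary of the inter-block scan: scanning a 0/1 keep-flag array yields each surviving element's destination index.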
This doctoral thesis concludes with a chapter on future lines of research that can be pursued from its contributions.