
    Genetic Land - Modeling land use change using evolutionary algorithms

    Future land use configurations provide valuable knowledge for policy makers and economic agents, especially under expected environmental changes such as decreasing rainfall or increasing temperatures, or scenarios of policy guidance such as carbon sequestration enforcement. In this paper, modelling land use change is formulated as an optimization problem in which landscapes (land uses) are generated through the use of genetic algorithms (GAs), according to an objective function (e.g. minimization of soil erosion, or maximization of carbon sequestration) and a set of local restrictions (e.g. soil depth, water availability, or landscape structure). GAs are search and optimization procedures based on the mechanics of natural selection and genetics. A GA starts with a population of random individuals, each corresponding to a particular candidate solution to the problem. The best solutions are propagated: they are mated with each other, originating “offspring solutions” that randomly combine the characteristics of each “parent”. The repeated application of these operations leads to a dynamic system that emulates the evolutionary mechanisms that occur in nature: the fittest individuals survive and propagate their traits to future generations, while unfit individuals tend to die out (Goldberg, 1989). Applications of GAs to land use planning have been explored (Brookes, 2001; Ducheyne et al., 2001); however, long-term planning with a time-span component has not yet been addressed. GeneticLand, the GA for land use generation, works on a region represented by a two-dimensional array of cells. For each cell, there is a number of possible land uses (U1, U2, ..., Un). The task of the GA is to search for an optimal assignment of these land uses to the cells, evolving the landscape patterns that best satisfy the objective function over a certain time period (e.g. 50 years into the future). GeneticLand uses a multi-objective function: (i) minimization of soil erosion – each solution is evaluated by applying the USLE (Universal Soil Loss Equation), with the best solution being the one that minimizes the landscape soil erosion value; (ii) maximization of carbon sequestration – each solution is evaluated by applying atmospheric CO2 uptake estimates, with the best solution being the one that maximizes the landscape carbon uptake; and (iii) maximization of the landscape economic value – each solution is evaluated by applying an economic value (derived from expert judgment), with the best solution being the one that maximizes the landscape economic value. As an optimization problem, not all possible land use assignments are feasible. GeneticLand considers two sets of restrictions that must be met: (i) physical constraints (soil type suitability, slope, rainfall-evapotranspiration ratio, and a soil wetness index) and (ii) landscape ecology restrictions at several levels (minimum patch area, land use adjacency index and landscape contagion index). The former ensures physical feasibility and the latter the spatial coherence of the landscape. The physical and landscape restrictions were derived from the analysis of past events based on a time series of Landsat images (1985-2003), in order to identify the drivers of land use change and structure. Since the problem has multiple objectives, the GA integrates multi-objective extensions allowing it to evolve a set of non-dominated solutions.
A (1+1) evolution strategy is used, due to the need to accommodate the very large solution space: typical GA applications involve about 1,000 decision variables, while the problem analysed by GeneticLand has almost 111,000, corresponding to a landscape of 333 × 333 discrete pixels. GeneticLand is developed and validated for a Mediterranean-type landscape located in southern Portugal. Future climate triggers, such as an increase in intense rainfall episodes, are accommodated to simulate climate change. This paper presents: (1) the formulation of land use modelling as an optimization problem; (2) the formulation of the GA for the explicit spatial domain; (3) the land use constraints derived for a Mediterranean landscape; (4) results illustrating the conflicting objectives; and (5) the limitations encountered.
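    Since the abstract describes the algorithm only in prose, a minimal sketch of the core idea follows: a (1+1)-type evolution strategy mutating a grid of land-use codes against a multi-objective fitness. All functions and values below are illustrative placeholders (random per-class coefficients instead of the USLE, carbon-uptake and economic models, and a trivial feasibility check), not GeneticLand's actual implementation.

```python
import numpy as np

N_USES = 5          # candidate land-use classes U1..U5 (illustrative)
GRID = (333, 333)   # landscape size mentioned in the abstract
rng = np.random.default_rng(0)

# Placeholder per-class coefficients standing in for USLE soil loss,
# carbon uptake and expert-derived economic value.
erosion_by_use = rng.uniform(0.1, 1.0, N_USES)
carbon_by_use = rng.uniform(0.1, 1.0, N_USES)
value_by_use = rng.uniform(0.1, 1.0, N_USES)

def objectives(landscape):
    """Return (erosion, -carbon, -value); all three are to be minimized."""
    return (erosion_by_use[landscape].sum(),
            -carbon_by_use[landscape].sum(),
            -value_by_use[landscape].sum())

def feasible(landscape):
    """Placeholder for the physical and landscape-ecology restrictions."""
    return True

def mutate(landscape, n_cells=500):
    """Create an offspring by reassigning the land use of a few random cells."""
    child = landscape.copy()
    idx = rng.integers(0, child.size, n_cells)
    child.flat[idx] = rng.integers(0, N_USES, n_cells)
    return child

def dominates(a, b):
    """Pareto dominance between two objective tuples."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# (1+1) evolution strategy: keep the offspring only if it is feasible
# and Pareto-dominates the current landscape.
parent = rng.integers(0, N_USES, GRID)
parent_f = objectives(parent)
for _ in range(1000):
    child = mutate(parent)
    if feasible(child):
        child_f = objectives(child)
        if dominates(child_f, parent_f):
            parent, parent_f = child, child_f
```

    A full multi-objective run would additionally archive the non-dominated solutions encountered along the way, rather than keeping a single incumbent landscape.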

    G-Tric: enhancing triclustering evaluation using three-way synthetic datasets with ground truth

    Master's thesis, Data Science, Universidade de Lisboa, Faculdade de CiĂȘncias, 2020. Three-dimensional datasets, or three-way data, started to gain popularity due to their increasing capacity to describe inherently multivariate and temporal events, such as biological responses, social interactions along time, urban dynamics, or complex geophysical phenomena. Triclustering, the subspace clustering of three-way data, enables the discovery of patterns corresponding to data subspaces (triclusters) with values correlated across the three dimensions (observations × features × contexts). With an increasing number of algorithms being proposed, effectively comparing them with state-of-the-art algorithms is paramount. These comparisons are usually performed using real data without a known ground truth, thus limiting the assessments. In this context, we propose a synthetic data generator, G-Tric, allowing the creation of synthetic datasets with configurable properties and the possibility to plant triclusters. The generator is prepared to create datasets resembling real three-way data from biomedical and social data domains, with the additional advantage of providing the ground truth (the triclustering solution) as output. G-Tric can replicate real-world datasets and create new ones that match researchers' needs across several properties, including data type (numeric or symbolic), dimension, and background distribution. Users can tune the patterns and structure that characterize the planted triclusters (subspaces) and how they interact (overlapping). Data quality can also be controlled by defining the number of missing values, noise, and errors. Furthermore, a benchmark of datasets resembling real data is made available, together with the corresponding triclustering solutions (planted triclusters) and generating parameters. Triclustering evaluation using G-Tric makes it possible to combine intrinsic and extrinsic metrics to compare solutions, producing more reliable analyses. A set of predefined datasets, mimicking widely used three-way data and exploring crucial properties, was generated and made available, highlighting G-Tric's potential to advance the triclustering state of the art by easing the process of evaluating the quality of new triclustering approaches. Besides reviewing the current state of the art regarding triclustering approaches, comparison studies, and evaluation metrics, this work also analyzes how the lack of frameworks to generate synthetic data influences existing evaluation methodologies, limiting the scope of performance insights that can be extracted from each algorithm, and exemplifies how the set of decisions made in these evaluations can impact the quality and validity of the results. Alternatively, a different methodology that takes advantage of synthetic data with ground truth is presented. This approach, combined with the proposal of an extension to an existing clustering extrinsic measure, enables assessing solutions' quality under new perspectives.
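    To make the planting idea concrete, here is a minimal, hypothetical sketch of generating a numeric three-way dataset with one planted constant tricluster; the shapes, pattern and noise level are arbitrary, and the code does not use G-Tric's actual API.

```python
import numpy as np

rng = np.random.default_rng(42)
n_obs, n_feat, n_ctx = 100, 50, 10

# Background distribution (observations x features x contexts).
data = rng.normal(0.0, 1.0, size=(n_obs, n_feat, n_ctx))

# Subspace indices defining the planted tricluster on each dimension.
obs_idx = rng.choice(n_obs, size=10, replace=False)
feat_idx = rng.choice(n_feat, size=5, replace=False)
ctx_idx = rng.choice(n_ctx, size=3, replace=False)

# Plant a constant pattern plus a small amount of noise inside the subspace.
data[np.ix_(obs_idx, feat_idx, ctx_idx)] = 5.0 + rng.normal(0.0, 0.1, (10, 5, 3))

# The ground truth is simply the set of planted indices, which an extrinsic
# metric can later compare against the subspaces recovered by a triclustering
# algorithm.
ground_truth = {"observations": obs_idx, "features": feat_idx, "contexts": ctx_idx}
```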

    Developing persuasive interfaces

    MSCC Dissertation in Computer Engineering. Nowadays, computers are indispensable tools for most people. Since computers' role is to make everyone's life easier, systems can be built to be of even more assistance. While today people have to learn how to interact with computers, in the future computer systems will instead be blended into everyday objects. It is expected that technology will reach a point where interacting with it feels as natural as interacting with any other object. Computers have been gaining their space in our lives, being considered machines that can do almost anything. This master thesis studies how computers can be used as persuasive technology. To build persuasive systems it is important to recognize that humans are very different from each other. Given this, it is extremely important that the system is able to determine as many aspects of the surrounding environment as it can, process that information, and interact with people accordingly. To achieve an effective solution for this concept, a study of persuasive systems is presented in this document. An architecture was developed to provide an infrastructure for the development of persuasive applications. An authoring tool was also implemented to allow the creation of context-aware persuasive applications by users without programming skills. Furthermore, a persuasive application prototype was developed as a proof of concept for this study. Usability tests were performed to analyze whether users could successfully create the application prototype, named Smart Bins, using the proposed framework. Further user tests were conducted with children, the target users of the application prototype, to evaluate its usability and persuasiveness. A description of the Smart Bins application as well as the results of the user tests are presented in this thesis. The results provide further understanding and new perspectives regarding the use of persuasive technologies as interactive computer products designed to change people's habits, thereby helping to improve people's attitudes and behaviours towards important matters such as environmental preservation.

    Digital ocular fundus imaging: a review

    Ocular fundus imaging plays a key role in monitoring the health status of the human eye. Currently, a large number of imaging modalities allow the assessment and/or quantification of ocular changes from a healthy status. This review focuses on the main digital fundus imaging modality, color fundus photography, with a brief overview of complementary techniques, such as fluorescein angiography. While focusing on two-dimensional color fundus photography, the authors address the evolution from nondigital to digital imaging and its impact on diagnosis. They also compare several studies performed along the transitional path of this technology. Retinal image processing and analysis, automated disease detection and identification of the stage of diabetic retinopathy (DR) are addressed as well. The authors emphasize the problems of image segmentation, focusing on the major landmark structures of the ocular fundus: the vascular network, optic disk and the fovea. Several proposed approaches for the automatic detection of signs of disease onset and progression, such as microaneurysms, are surveyed. A thorough comparison is conducted among different studies with regard to the number of eyes/subjects, imaging modality, fundus camera used, field of view and image resolution to identify the large variation in characteristics from one study to another. Similarly, the main features of the proposed classifications and algorithms for the automatic detection of DR are compared, thereby addressing computer-aided diagnosis and computer-aided detection for use in screening programs. Fundação para a CiĂȘncia e Tecnologia; FEDER; Programa COMPETE.
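    As an illustration of the rule-based segmentation family surveyed in such reviews (not a method taken from this review itself), the sketch below enhances retinal vessels with a classic pipeline: green channel, CLAHE contrast enhancement, morphological black-hat filtering and Otsu thresholding. The kernel size and file name are arbitrary assumptions.

```python
import cv2

def segment_vessels(fundus_bgr):
    """Crude vessel mask from a color fundus photograph (illustrative only)."""
    green = fundus_bgr[:, :, 1]                       # vessels contrast best in the green channel
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(green)                     # local contrast enhancement
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
    vessels = cv2.morphologyEx(enhanced, cv2.MORPH_BLACKHAT, kernel)  # dark elongated structures
    _, mask = cv2.threshold(vessels, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask

# Usage (path is illustrative):
# mask = segment_vessels(cv2.imread("fundus.jpg"))
```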

    A practical and general methodology for efficiency calibration of coaxial Ge detectors

    The correct determination of artificial and natural radionuclides such as Âčâ”ÂČEu, Âč³⁷,Âč³⁮Cs, ⁶⁰,â”â·Co, etc., and ÂČÂłâŽ,ÂČÂČâžTh, ÂČÂČâž,ÂČÂČ⁶Ra, ÂČÂčâ°Pb and ⁎⁰K, respectively, is essential for many environmental science fields. For this, a general function was obtained for the full-energy peak efficiency (FEPE) in gamma-ray spectrometry using coaxial Ge detectors. First, the experimental FEPE values, obtained at fixed gamma energy EÎł, were fitted as a function of the thickness h of cylindrical standards. The parameters resulting from these fits were then fitted, in turn, versus EÎł, yielding a general efficiency function Δc(EÎł, h). Δc(EÎł, h) was validated, obtaining very good z-scores, except for energies affected by true coincidence summing (TCS) effects. Consequently, a practical and general method was developed, recalibrating the detector by varying the sample-detector distance d. Δc(EÎł, h, d) was obtained, achieving very good z-scores. Furthermore, this practical method was also employed to correct for high self-absorption and high dead times. This research has been partially funded by the projects of the Regional Government of Andalusia called “Basic processes regulating the fractionations and enrichments of natural radionuclides under acid mine drainage conditions” (Ref.: UHU-1255876) and “Treatment of acid leachates from phosphogypsum piles located at Huelva, and transport modelling of the released radionuclides” (Ref.: P20_00096), the project funded by the Spanish Ministry of Science, Innovation and Universities’ Research Agency “Development and optimization of a process for removing natural radionuclides in phosphogypsum leachates” (Ref.: PID2020-116461RB-C21), and the Project for Novel Principal Investigators “Quantitative study of the variables involved in the radon exhalation rate for granular solids; application to rafts of granular solid phosphogypsum” (Ref.: UHUPJ-00005-632). The authors acknowledge the funding for open access charge provided by Universidad de Huelva/CBUA.
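    The two-step fitting idea can be sketched as follows. The functional forms, energies and synthetic "measurements" below are illustrative assumptions, not the expressions or data used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Step 1 model: at each gamma energy, efficiency decreases with sample
# thickness h (self-absorption-like behaviour); the exact form is assumed.
def fepe_vs_h(h, a, b):
    return a * np.exp(-b * h)

energies = np.array([59.5, 121.8, 344.3, 661.7, 1173.2, 1332.5])  # keV, illustrative
thicknesses = np.array([1.0, 2.0, 3.0, 4.0])                      # cm, illustrative

# Synthetic "measurements" standing in for the CRM-based FEPE values.
rng = np.random.default_rng(1)
true_a = 0.4 * energies ** -0.5
eff = fepe_vs_h(thicknesses[None, :], true_a[:, None], 0.15)
eff *= rng.normal(1.0, 0.02, eff.shape)

# Step 1: fit FEPE vs thickness at each fixed energy.
a_fit, b_fit = [], []
for i in range(len(energies)):
    (a, b), _ = curve_fit(fepe_vs_h, thicknesses, eff[i], p0=(0.05, 0.1))
    a_fit.append(a)
    b_fit.append(b)

# Step 2: describe the fitted parameters as smooth functions of energy
# (here a quadratic in log-log space), giving a general eps_c(E, h).
pa = np.polyfit(np.log(energies), np.log(a_fit), 2)
pb = np.polyfit(np.log(energies), np.log(b_fit), 2)

def eps_general(E, h):
    a = np.exp(np.polyval(pa, np.log(E)))
    b = np.exp(np.polyval(pb, np.log(E)))
    return fepe_vs_h(h, a, b)

print(eps_general(661.7, 2.5))  # efficiency at an arbitrary energy/thickness
```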

    A new efficiency calibration methodology for different atmospheric filter geometries by using coaxial Ge detectors

    The study of the different pollutants present in atmospheric aerosols, such as trace elements and radionuclides, is essential to assess air quality. To analyze the particulate matter (PM), atmospheric filters with different dimensions and geometries (rectangular, circular, slotted, and square filters) are usually employed. Among the pollutants present in atmospheric aerosols, radionuclides are commonly analyzed due to their multiple applications, such as environmental radiological control or use as tracers of atmospheric processes. Therefore, this study aims to develop a new and general methodology to calibrate the efficiency of coaxial Ge detectors so that radionuclides present in the PM can be properly determined by gamma-ray spectrometry for several filter types. For this, granular certified reference materials (CRMs) containing only natural radionuclides (ÂČ³⁞U-series, ÂČÂłÂČTh-series, and ⁎⁰K) were selected. Several granular solid CRMs were chosen, allowing us to reproduce the same PM deposition geometry and to ensure the homogeneity of the added CRMs; these are the main advantages in relation to the typical methods that use liquid CRMs. Furthermore, filters with relatively large surfaces were cut into several pieces and placed one on top of the other, achieving the same geometry as the PM deposited onto the filter. Then, the experimental full-energy peak efficiencies (FEPEs) were obtained for each energy of interest (EÎł) and fitted versus EÎł, finding a general FEPE function for each filter type. Finally, this methodology was validated for both natural and artificial radionuclides (from 46 to 1332 keV) by using different filter types employed in proficiency test exercises, obtaining |z-score| < 2 in all cases. Funding for open access publishing: Universidad de Huelva / CBUA. This research has been partially funded by the projects of the Regional Government of Andalusia called “Treatment of acid leachates from phosphogypsum piles located at Huelva, and transport modelling of the released radionuclides” (Ref.: P20_00096) and “Valorization of inorganic wastes enriched in natural radioactivity for sustainable building materials” (Ref.: FEDER-UHU-202020); the project funded by the Spanish Nuclear Safety Council (CSN) “Radon exhalation from building materials; radiological impact and corrective measures” (Ref.: SUBV-4/2021); the project funded by the Spanish Ministry of Science, Innovation and Universities’ Research Agency “Development and optimization of a process for removing natural radionuclides in phosphogypsum leachates” (Ref.: PID2020-116461RB-C21); and the Project for Novel Principal Investigators “Quantitative study of the variables involved in the radon exhalation rate for granular solids; application to rafts of granular solid phosphogypsum” (Ref.: UHUPJ-00005-632).
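    A minimal sketch of the per-filter calibration and validation steps follows. The efficiency values are invented for illustration, the log-log quadratic is just one plausible form for the FEPE curve, and the score shown normalizes by the combined uncertainty (strictly a zeta-type score); proficiency tests may instead define the z-score with a prescribed standard deviation.

```python
import numpy as np

# CRM-based efficiencies for one filter geometry (illustrative values only).
energies = np.array([46.5, 186.2, 351.9, 609.3, 911.2, 1460.8])  # keV
fepe = np.array([0.062, 0.048, 0.031, 0.021, 0.015, 0.010])

# General FEPE(E) for this filter type: quadratic fit in log-log space.
coeffs = np.polyfit(np.log(energies), np.log(fepe), 2)

def efficiency(E_keV):
    """Interpolated full-energy peak efficiency for this filter geometry."""
    return np.exp(np.polyval(coeffs, np.log(E_keV)))

def score(measured, reference, u_measured, u_reference):
    """Deviation normalized by combined uncertainty; |score| < 2 taken as satisfactory."""
    return (measured - reference) / np.hypot(u_measured, u_reference)

# Example: efficiency at the 137Cs line and validation of a measured activity
# against a reference value from a proficiency test (all numbers illustrative).
print(efficiency(661.7))
print(score(measured=102.0, reference=100.0, u_measured=3.0, u_reference=2.0))
```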

    On the accretion of phantom energy onto wormholes

    By using a properly generalized accretion formalism, it is argued that the accretion of phantom energy onto a wormhole does not make the size of the wormhole throat scale comovingly with the scale factor of the universe, but instead induces an increase of that size so large that the wormhole can engulf the universe itself before it reaches the big rip singularity, at least relative to an asymptotic observer. Comment: 4 pages, LaTeX, to appear in Phys. Lett.
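    As a reading aid only, the following schematic (not the paper's own derivation, with all constants suppressed) shows the sign argument usually invoked in this line of work for dark-energy accretion onto a wormhole.

```latex
% Schematic sign argument (illustrative; the exact coefficients are in the paper).
\begin{align*}
  \dot m &\propto -\,m^{2}\,(\rho + p), \\
  p < -\rho \ \text{(phantom energy)} &\;\Longrightarrow\; \rho + p < 0
    \;\Longrightarrow\; \dot m > 0,
\end{align*}
% so the exotic mass supporting the throat grows rather than merely scaling with
% the cosmological expansion, and the throat size can diverge before the big rip.
```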

    Location of R&D activities by vertical multinationals over asymmetric countries

    This paper deals with the location of R&D by vertical multinational firms. Taking the co-location of laboratories and productive plants as a benchmark, we show that the spatial separation of the two emerges under two conditions: high intensity of R&D spillovers and strong size asymmetry between countries. The latter condition matters because it is related to rising international wage inequality. If the spatial separation of R&D and manufacturing takes place, headquarters services (namely R&D units) will likely be located in the smaller country. The converse pattern, where laboratories are placed in the larger country, may arise if production is high-tech and the localized externalities of research activity are strong. Hence, this article confirms the main results of the literature on this topic, but in the context of a different framework that allows us to tackle two usually disregarded topics: the transfer cost of technology and the direct engagement of industrial workers in R&D spillovers. These aspects are dealt with by presupposing that, in addition to a “technological” externality among researchers, there is an “educational” externality exerted by researchers upon neighbouring industrial workers. When a country loses its laboratories, its inhabitants become intellectually “impoverished” and their labour becomes less efficient.

    Cost-sensitive learning and threshold-moving approach to improve industrial lots release process on imbalanced datasets

    With Industry 4.0, companies must manage massive and generally imbalanced datasets. In an automotive company, the lots release decision process must cope with this problem by combining data from different sources to determine whether a selected group of products can be released to the customers. This work focuses on this process and aims to classify the occurrence of customer complaints through the conception, tuning, and evaluation of five ML algorithms, namely XGBoost (XGB), LightGBM (LGBM), CatBoost (CatB), Random Forest (RF) and a Decision Tree (DT), based on an imbalanced dataset of automatic production tests. We used a non-sampling approach to deal with the problem of imbalanced datasets by analyzing two different methods, cost-sensitive learning and threshold-moving. Regarding the obtained results, both methods showed an effective impact on the boosting algorithms, whereas RF only showed improvements with threshold-moving. Considering both approaches, the best overall results were achieved by the threshold-moving method, where RF obtained the best outcome with an F1-score of 76.2%. FCT - Fundação para a CiĂȘncia e a Tecnologia (UIDB/00319/2020).
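    The two non-sampling strategies compared here can be sketched with scikit-learn on synthetic imbalanced data; the random forest, the synthetic dataset and the threshold grid below are illustrative stand-ins for the paper's tuned models and production-test data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Synthetic imbalanced binary problem (~3% positives, e.g. "complaint" lots).
X, y = make_classification(n_samples=20_000, n_features=20,
                           weights=[0.97, 0.03], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# 1) Cost-sensitive learning: weight errors on the rare class more heavily.
cs_model = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                                  random_state=0).fit(X_tr, y_tr)
print("cost-sensitive F1:", f1_score(y_te, cs_model.predict(X_te)))

# 2) Threshold-moving: train as usual, then pick the probability cut-off
#    that maximizes F1 instead of using the default 0.5.
#    (In practice the threshold should be chosen on a validation split,
#    not on the test set as done here for brevity.)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
probs = model.predict_proba(X_te)[:, 1]
thresholds = np.linspace(0.05, 0.95, 19)
scores = [f1_score(y_te, (probs >= t).astype(int)) for t in thresholds]
best_t = thresholds[int(np.argmax(scores))]
print(f"threshold-moving F1: {max(scores):.3f} at threshold {best_t:.2f}")
```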
    • 

    corecore