
    Multi-Source EO for Dynamic Wetland Mapping and Monitoring in the Great Lakes Basin

    Wetland managers, citizens and government leaders are observing rapid changes in coastal wetlands and associated habitats around the Great Lakes Basin due to human activity and climate variability. SAR and optical satellite sensors offer cost-effective management tools that can be used to monitor wetlands over time, covering large areas like the Great Lakes and providing information to those making management and policy decisions. In this paper, we describe ongoing efforts to monitor dynamic changes in wetland vegetation, surface water extent, and water level. Included are assessments of simulated RADARSAT Constellation Mission data to determine the feasibility of continued monitoring into the future. Results show that integration of data from multiple sensors is most effective for monitoring coastal wetlands in the Great Lakes region. While products developed using the methods described in this article provide valuable management tools, more effort is needed to reach the goal of establishing a dynamic, near-real-time, remote sensing-based monitoring program for the basin.

    Nurse Rostering Problem Considering Direct and Indirect Costs: Differential Evolution Algorithm

    Employee scheduling seeks to find an optimal schedule for employees according to the amount of demand (workload), employee availability, labor law, employment contracts, etc. The importance of this problem, for improving the quality of service, the health and satisfaction of employees, and reducing costs in settings such as hospitals, military facilities and service centers, has encouraged researchers to study it. In this regard, the nurse rostering problem is a scheduling problem that determines the number of nurses required with different skills and the times of their services over the planning horizon. In this research, by adding nurses' shift preferences and constraints on the number of consecutive working days, an attempt has been made to make the problem more realistic. The objective function is to minimize the total cost of allocating work shifts to nurses, the cost of the number of reserve nurses required, the cost of overtime in a particular shift, the cost of underemployment in a particular shift, the cost of overtime over the planning horizon, the cost of underemployment over the planning horizon, and the cost of deviating from nurses' preferred working and non-working days. To solve the problem, after modeling it as a mixed-integer program, and due to the complexity of the problem, a differential evolution algorithm with an innovative crossover operator is used. To validate the proposed algorithm, its output was compared with a genetic algorithm. The results show that the differential evolution algorithm performs well on this problem.

    Keywords: Nurse Rostering Problem, Differential Evolution Algorithm
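The differential evolution approach can be illustrated with a minimal sketch. The rostering model itself is far richer; here a toy staffing-deviation cost stands in for the paper's objective, and the classic DE/rand/1/bin mutation and binomial crossover are used rather than the authors' modified crossover operator (all parameter values below are illustrative):

```python
import random

def differential_evolution(cost, dim, bounds, pop_size=20, F=0.8, CR=0.9,
                           iters=200, seed=1):
    """Minimise `cost` with the classic DE/rand/1/bin scheme."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [cost(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            # mutation: donor vector from three distinct other members
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            donor = [pop[a][k] + F * (pop[b][k] - pop[c][k]) for k in range(dim)]
            # binomial crossover: mix donor and target gene by gene
            jrand = rng.randrange(dim)
            trial = [donor[k] if (rng.random() < CR or k == jrand) else pop[i][k]
                     for k in range(dim)]
            trial = [min(max(v, lo), hi) for v in trial]  # keep within bounds
            f = cost(trial)
            if f <= fit[i]:  # greedy one-to-one selection
                pop[i], fit[i] = trial, f
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

# Toy stand-in for a rostering cost: squared deviation from required staffing.
demand = [3, 2, 4, 3, 2]
cost = lambda x: sum((xi - d) ** 2 for xi, d in zip(x, demand))
sol, val = differential_evolution(cost, dim=5, bounds=(0, 6))
```

With a convex toy objective, the population converges to the demand vector quickly; the real rostering objective is a discrete, heavily constrained cost, which is where the authors' crossover innovation matters.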

    Semi-automated surface water detection with synthetic aperture radar data: A wetland case study

    In this study, a new method is proposed for semi-automated surface water detection using synthetic aperture radar data via a combination of radiometric thresholding and image segmentation based on the simple linear iterative clustering (SLIC) superpixel algorithm. Consistent intensity thresholds are selected by assessing the statistical distribution of backscatter values applied to the mean of each superpixel. Higher-order texture measures, such as variance, are used to improve accuracy by removing false positives via an additional thresholding process used to identify the boundaries of water bodies. Results applied to quad-polarized RADARSAT-2 data show that the threshold value for the variance texture measure can be approximated using a constant value for different scenes, and thus it can be used in a fully automated cleanup procedure. Compared to similar approaches, errors of omission and commission are improved with the proposed method. For example, we observed that a threshold-only approach consistently underestimates the extent of water bodies compared to combined thresholding and segmentation, mainly due to the poor performance of the former at the edges of water bodies. The proposed method can be used for monitoring changes in surface water extent within wetlands or other areas, and while presented for use with radar data, it can also be used to detect surface water in optical images.
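A minimal sketch of the first stage, thresholding segment-mean backscatter, can be written with plain numpy. Regular grid blocks stand in for SLIC superpixels, and the synthetic scene, threshold and block size are invented for illustration (the variance-texture cleanup stage is omitted here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic backscatter scene (dB): a dark water patch in brighter land clutter.
scene = rng.normal(-8.0, 1.5, (64, 64))
scene[16:48, 16:48] = rng.normal(-20.0, 1.0, (32, 32))  # calm water is dark in SAR

def block_segments(img, size=8):
    """Regular grid blocks as a crude stand-in for SLIC superpixels."""
    h, w = img.shape
    labels = np.arange((h // size) * (w // size)).reshape(h // size, w // size)
    return np.kron(labels, np.ones((size, size), dtype=int))

labels = block_segments(scene)
water = np.zeros_like(scene, dtype=bool)
for lab in np.unique(labels):
    mask = labels == lab
    # Intensity threshold applied to the segment mean, per the first stage.
    if scene[mask].mean() < -14.0:
        water[mask] = True

fraction = water.mean()  # fraction of the scene classified as water
```

Applying the threshold to segment means rather than raw pixels is what suppresses speckle-driven false positives; the trade-off, visible at the patch edges, is why the paper adds the texture-based boundary cleanup.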

    Design, fabrication and characterization of the first AC-coupled silicon microstrip sensors in India

    This paper reports the design, fabrication and characterization of single-sided silicon microstrip sensors with integrated biasing resistors and coupling capacitors, produced for the first time in India. We first developed a prototype sensor on a four-inch wafer. After establishing suitable test procedures for characterizing these AC-coupled sensors, we fine-tuned various process parameters in order to produce sensors with the desired specifications.

    Comment: 10 pages, 11 figures, 1 table, to appear in JINST

    Location-Allocation and Scheduling of Inbound and Outbound Trucks in Multiple Cross-Dockings Considering Breakdown Trucks

    This paper studies multiple cross-docks, where loads are transferred from origins (suppliers) to destinations (customers) through cross-docking facilities. Products are not stored in intermediate depots; incoming shipments are consolidated based on customer demands and immediately delivered to their destinations. Each cross-dock has a covering radius, so that every customer can be served by at least one cross-dock. In addition, this paper considers truck breakdowns. We present a two-stage model for locating cross-docking centers and scheduling inbound and outbound trucks at multiple cross-docks. We aim to minimize the transportation cost in a network by loading trucks at the supplier locations and routing them to the customers via cross-docking facilities. In the first stage, the objective is to minimize the transportation cost of delivering products from suppliers to open cross-docks and from cross-docks to the customers; in the second stage, the objective is to minimize the makespans of the open cross-docks and the total weighted sum of completion times. Due to the difficulty of obtaining optimum solutions to medium- and large-scale problems, we propose four metaheuristic algorithms: genetic, simulated annealing, differential evolution, and a hybrid algorithm. The results show that simulated annealing performs best among the four algorithms.
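To illustrate the second-stage objective, here is a minimal simulated annealing sketch that sequences trucks at a single dock to minimise the weighted sum of completion times. This is a drastic simplification of the paper's multi-dock model; the processing times, weights and cooling parameters are invented for illustration. For this single-machine case, sorting by the processing-time/weight ratio (the SWPT rule) is provably optimal, which gives a handy check:

```python
import math
import random

def weighted_completion(seq, p, w):
    """Total weighted completion time of trucks served in order `seq`."""
    t = cost = 0
    for j in seq:
        t += p[j]
        cost += w[j] * t
    return cost

def anneal(p, w, iters=5000, T0=50.0, alpha=0.999, seed=3):
    """Simulated annealing over truck sequences with pairwise swap moves."""
    rng = random.Random(seed)
    n = len(p)
    seq = list(range(n))
    rng.shuffle(seq)
    best = cur = weighted_completion(seq, p, w)
    best_seq = seq[:]
    T = T0
    for _ in range(iters):
        i, j = rng.sample(range(n), 2)
        seq[i], seq[j] = seq[j], seq[i]          # swap two truck positions
        cand = weighted_completion(seq, p, w)
        if cand <= cur or rng.random() < math.exp((cur - cand) / T):
            cur = cand
            if cur < best:
                best, best_seq = cur, seq[:]
        else:
            seq[i], seq[j] = seq[j], seq[i]      # undo the rejected move
        T *= alpha                               # geometric cooling
    return best_seq, best

p = [4, 2, 7, 3, 5, 1]   # unloading times (illustrative)
w = [1, 3, 2, 4, 2, 5]   # shipment priorities (illustrative)
seq, cost = anneal(p, w)
optimal = weighted_completion(sorted(range(6), key=lambda j: p[j] / w[j]), p, w)
```

The occasional uphill acceptance (`exp((cur - cand) / T)`) is what lets annealing escape local optima that a pure swap descent would get stuck in.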

    Temporal population structure, a genetic dating method for ancient Eurasian genomes from the past 10,000 years

    Radiocarbon dating is the gold standard in archeology for estimating the age of skeletons, a key to studying their origins. Many published ancient genomes lack reliable and direct dates, which results in obscure and contradictory reports. We developed temporal population structure (TPS), a DNA-based dating method for genomes ranging from the Late Mesolithic to today, and applied it to 3,591 ancient and 1,307 modern Eurasians. TPS predictions aligned with the known dates and correctly accounted for kin relationships. TPS dating of poorly dated Eurasian samples resolved conflicting reports in the literature, as illustrated by one test case. We also demonstrated how TPS improves the ability to study phenotypic traits over time. TPS can be used when radiocarbon dating is unfeasible or uncertain, or to develop alternative hypotheses for samples younger than 10,000 years, a limitation that may be resolved over time as ancient data accumulate.

    A Systematic Approach for Variable Selection With Random Forests: Achieving Stable Variable Importance Values

    Random Forests variable importance measures are often used to rank variables by their relevance to a classification problem and subsequently reduce the number of model inputs in high-dimensional data sets, thus increasing computational efficiency. However, because the training data and predictor variables used to construct each tree and split each node are randomly selected, it is also well known that if too few trees are generated, variable importance rankings tend to differ between model runs. In this letter, we characterize the effect of the number of trees (ntree) and class separability on the stability of variable importance rankings and develop a systematic approach to define the number of model runs and/or trees required to achieve stability in variable importance measures. Results demonstrate that either a large ntree for a single model run or averaging values across multiple model runs with fewer trees is sufficient for achieving stable mean importance values. While the latter is far more computationally efficient, both approaches tend to lead to the same ranking of variables. Moreover, the optimal number of model runs differs depending on the separability of classes. Recommendations are made to users regarding how to determine the number of model runs and/or trees required to achieve stable variable importance rankings.
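The core statistical point, that averaging importance values over several runs stabilises the ranking, can be seen even in a toy numpy simulation that replaces Random Forests importances with noisy observations of a fixed "true" importance vector (all numbers here are invented for illustration, and the noise model is a deliberate simplification of tree-induced variance):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "true" importances; the first two variables are nearly tied,
# which is exactly the situation where single-run rankings flip.
true_imp = np.array([0.30, 0.28, 0.20, 0.12, 0.10])

def estimated_ranking(n_runs, noise=0.08):
    """Average noisy per-run importance estimates, then rank variables."""
    runs = true_imp + rng.normal(0.0, noise, size=(n_runs, true_imp.size))
    return np.argsort(-runs.mean(axis=0))

def rank_instability(n_runs, trials=300):
    """Fraction of trials whose ranking differs from the true ordering."""
    true_order = np.argsort(-true_imp)
    return np.mean([not np.array_equal(estimated_ranking(n_runs), true_order)
                    for _ in range(trials)])

single = rank_instability(1)     # one model run: rankings flip often
averaged = rank_instability(30)  # averaging 30 runs stabilises the ranking
```

Averaging shrinks the noise on each mean importance by roughly the square root of the number of runs, which is why averaged rankings disagree with the true ordering far less often than single-run rankings.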

    An agent-based genetic algorithm for hybrid flowshops with sequence dependent setup times to minimise makespan

    This paper deals with a variant of flowshop scheduling, namely, the hybrid or flexible flowshop with sequence-dependent setup times. This type of flowshop is frequently used in the batch production industry and helps reduce the gap between research and operational use. This scheduling problem is NP-hard, and solutions for large problems are based on non-exact methods. An improved genetic algorithm (GA) based on software agent design to minimise the makespan is presented. The paper proposes using an inherent characteristic of software agents to create a new perspective in GA design. To verify the developed metaheuristic, computational experiments are conducted on a well-known benchmark problem dataset. The experimental results show that the proposed metaheuristic outperforms some of the well-known methods and state-of-the-art algorithms on the same benchmark dataset.

    The translation of this paper was funded by Universidad Politecnica de Valencia, Spain.

    Gómez Gasquet, P.; Andrés Romano, C.; Lario Esteban, F.C. (2012). An agent-based genetic algorithm for hybrid flowshops with sequence dependent setup times to minimise makespan. Expert Systems with Applications, 39(9), 8095-8107. https://doi.org/10.1016/j.eswa.2012.01.158
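For readers unfamiliar with the objective, a short sketch of evaluating the makespan of a job permutation in a flowshop with sequence-dependent setup times may help. It is simplified to one machine per stage (not the hybrid, parallel-machine case the paper addresses), setups are assumed anticipatory (they may start before the job arrives), and all data below are illustrative:

```python
def sdst_flowshop_makespan(seq, proc, setup):
    """Makespan of permutation `seq` in a flowshop with sequence-dependent setups.

    proc[m][j]      processing time of job j on machine m
    setup[m][i][j]  setup on machine m when job j follows job i (i == j: initial setup)
    """
    n_machines = len(proc)
    finish = [0] * n_machines          # completion time of the last job on each machine
    prev = None
    for j in seq:
        for m in range(n_machines):
            prev_job = j if prev is None else prev
            # Anticipatory setup: it starts as soon as the machine is free.
            ready = finish[m] + setup[m][prev_job][j]
            # Job j cannot start on machine m before leaving machine m-1.
            arrive = finish[m - 1] if m > 0 else 0
            finish[m] = max(ready, arrive) + proc[m][j]
        prev = j
    return finish[-1]

proc = [[3, 2, 4], [2, 3, 1]]                              # 2 machines x 3 jobs
setup = [[[1] * 3 for _ in range(3)] for _ in range(2)]    # unit setups for the demo
makespan = sdst_flowshop_makespan([0, 1, 2], proc, setup)
```

A GA for this problem evolves permutations (or stage-wise assignments, in the hybrid case) and uses a function like this as its fitness evaluation.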

    Satisfying flexible due dates in fuzzy job shop by means of hybrid evolutionary algorithms

    This paper tackles the job shop scheduling problem with fuzzy sets modelling uncertain durations and flexible due dates. The objective is to achieve a high service level by maximising due-date satisfaction, considering two different overall satisfaction measures as objective functions. We show how these functions model different attitudes in the framework of fuzzy multicriteria decision making, and we define a measure of solution robustness based on an existing a-posteriori semantics of fuzzy schedules to further assess the quality of the obtained solutions. As a solving method, we improve a memetic algorithm from the literature by incorporating a new heuristic mechanism to guide the search through plateaus of the fitness landscape. We assess the performance of the resulting algorithm with an extensive experimental study, including a parametric analysis and a study of the algorithm's components and the synergy between them. We provide results on a set of existing and new benchmark instances for the fuzzy job shop with flexible due dates that show the competitiveness of our method.

    This research has been supported by the Spanish Government under research grant TIN2016-79190-R.

    Trapping in irradiated p-on-n silicon sensors at fluences anticipated at the HL-LHC outer tracker

    The degradation of signal in silicon sensors is studied under conditions expected at the CERN High-Luminosity LHC. 200 μm thick n-type silicon sensors are irradiated with protons of different energies to fluences of up to 3 · 10^15 n_eq/cm^2. Pulsed red laser light with a wavelength of 672 nm is used to generate electron-hole pairs in the sensors. The induced signals are used to determine the charge collection efficiencies separately for electrons and holes drifting through the sensor. The effective trapping rates are extracted by comparing the results to simulation. The electric field is simulated using Synopsys device simulation assuming two effective defects. The generation and drift of charge carriers are simulated in an independent simulation based on PixelAV. The effective trapping rates are determined from the measured charge collection efficiencies, and the simulated and measured time-resolved current pulses are compared. The effective trapping rates determined for both electrons and holes are about 50% smaller than those obtained using standard extrapolations of studies at low fluences, which suggests an improved tracker performance over initial expectations.
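For context, effective trapping rates are conventionally described by an exponential charge-loss model; a schematic form (the standard parametrisation found in the radiation-damage literature, not quoted from this paper) is:

```latex
% Charge carried by a packet drifting for time $t$ is reduced by trapping:
\[
  Q(t) = Q_0 \, e^{-t/\tau_\mathrm{eff}},
\]
% and the effective trapping rate is commonly taken to scale linearly with
% the 1~MeV neutron-equivalent fluence $\Phi_\mathrm{eq}$:
\[
  \frac{1}{\tau_\mathrm{eff}} = \beta \, \Phi_\mathrm{eq},
\]
% with separate proportionality constants $\beta_{e}$ and $\beta_{h}$ for
% electrons and holes.
```

The "about 50% smaller" result above corresponds to extracted trapping rates well below what a linear extrapolation of low-fluence β values would predict at these fluences.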