
    Edge Potential Functions (EPF) and Genetic Algorithms (GA) for Edge-Based Matching of Visual Objects

    Edges are known to be a semantically rich representation of the contents of a digital image. Nevertheless, their use in practical applications is sometimes limited by computational and complexity constraints. In this paper, a new approach is presented that addresses the problem of matching visual objects in digital images by combining the concept of Edge Potential Functions (EPF) with a powerful matching tool based on Genetic Algorithms (GA). EPFs can be easily calculated starting from an edge map and provide a kind of attractive pattern for a matching contour, which is conveniently exploited by GAs. Several tests were performed in the framework of different image matching applications. The results achieved clearly outline the potential of the proposed method as compared to state-of-the-art methodologies. © 2007 IEEE.
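The EPF/GA pairing described above can be illustrated with a minimal sketch. Here the potential is assumed to decay exponentially with the distance to the nearest edge pixel, and the GA fitness is the mean potential sampled along a candidate contour; the paper's exact EPF definition and GA operators may differ.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def edge_potential(edge_map: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Attractive potential field from a binary edge map.

    The exponential decay with distance-to-nearest-edge is an assumed
    form for illustration, not necessarily the paper's exact EPF.
    """
    # distance_transform_edt measures distance to the nearest zero,
    # so invert the map: edge pixels become 0.
    dist = distance_transform_edt(1 - edge_map)
    return np.exp(-dist / sigma)

def contour_fitness(potential: np.ndarray, contour: np.ndarray) -> float:
    """GA fitness: mean potential sampled along a candidate contour."""
    rows, cols = contour[:, 0], contour[:, 1]
    return float(potential[rows, cols].mean())

# Toy example: a vertical edge at column 4 of an 8x8 image.
edges = np.zeros((8, 8), dtype=int)
edges[:, 4] = 1
epf = edge_potential(edges)

on_edge = np.array([[r, 4] for r in range(8)])   # contour lying on the edge
off_edge = np.array([[r, 0] for r in range(8)])  # contour far from the edge
assert contour_fitness(epf, on_edge) > contour_fitness(epf, off_edge)
```

A GA would evolve a population of candidate contour poses (translation, rotation, scale) and rank them by this fitness, so that contours are "attracted" toward the edge map.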

    AdaTT: Adaptive Task-to-Task Fusion Network for Multitask Learning in Recommendations

    Multi-task learning (MTL) aims at enhancing the performance and efficiency of machine learning models by training them on multiple tasks simultaneously. However, MTL research faces two challenges: 1) modeling the relationships between tasks to effectively share knowledge between them, and 2) jointly learning task-specific and shared knowledge. In this paper, we present a novel model, the Adaptive Task-to-Task Fusion Network (AdaTT), to address both challenges. AdaTT is a deep fusion network built with task-specific and optional shared fusion units at multiple levels. By leveraging a residual mechanism and a gating mechanism for task-to-task fusion, these units adaptively learn shared knowledge and task-specific knowledge. To evaluate the performance of AdaTT, we conduct experiments on a public benchmark and an industrial recommendation dataset using various task groups. Results demonstrate that AdaTT can significantly outperform existing state-of-the-art baselines.
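The residual-plus-gating idea behind the fusion units can be sketched as follows. This is an illustrative reduction, not AdaTT's published architecture: each task mixes all task outputs through a softmax gate and adds a residual connection to its own input.

```python
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def fusion_unit(task_inputs: np.ndarray, gate_logits: np.ndarray) -> np.ndarray:
    """One gated task-to-task fusion step (illustrative, not AdaTT's exact math).

    task_inputs: (T, D) array, one row per task module output.
    gate_logits: (T, T) array; row t gates how task t mixes all task outputs.
    Returns (T, D): gated mixture plus a residual connection to each
    task's own input.
    """
    gates = softmax(gate_logits)   # (T, T), rows sum to 1
    mixed = gates @ task_inputs    # each task's gated combination of all tasks
    return task_inputs + mixed     # residual mechanism

T, D = 3, 4
rng = np.random.default_rng(0)
x = rng.normal(size=(T, D))
out = fusion_unit(x, np.zeros((T, T)))  # zero logits -> uniform gates
assert out.shape == (T, D)
```

In a trained network the gate logits would be learned, letting each task emphasize the task outputs most useful to it; stacking such units at multiple levels gives the deep fusion structure the abstract describes.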

    Efficient scheduling of batch processes in continuous processing lines

    This thesis focuses mainly on the development of efficient formulations for scheduling in industrial environments. Decisions related to advanced process control or production planning are also incorporated into the scheduling; in this way, the schedule obtained is more efficient than it would be if these additional restrictions were not considered. The formulations emphasize online implementation, as they are intended for use in real plants. The most common scheduling problems handled in industrial environments are: the assignment of tasks to units, the distribution of production among parallel units, and the distribution of shared resources among concurrent processes. Most advances in this work are the result of collaborative work. Departamento de Ingeniería de Sistemas y Automática. Doctorado en Ingeniería Industrial.

    The development of a computer-based assessment tool for industrial water reuse

    A principal impediment to internal industrial water reuse is the lack of an orderly and timely process for the systematic evaluation of possible water reuse schemes and their comparative economic benefit. For this reason, an interactive computer-based evaluation tool was developed. The POWR (Potential Opportunities for Water Reuse) computer program performs a systematic evaluation of possible water reuse schemes for one or two water-utilizing processes within a facility with a minimum amount of user-supplied information. Microsoft Visual Basic was used because of its ability to perform the mass balance calculations necessary to determine technical and economic feasibility in a user-friendly environment. An industrial application is presented to demonstrate the use and utility of the developed POWR program. To facilitate use of the program, a guidance manual that provides instructions and additional water reuse information and references was also developed. This product can be used as a design tool to quickly evaluate potential internal water reuse opportunities within industrial manufacturing facilities.
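The kind of mass balance POWR performs can be illustrated with a single-contaminant direct-reuse check between two water-using processes. The function name and the one-contaminant blending model are assumptions for illustration, not POWR's actual interface.

```python
def reuse_feasible(source_flow, source_conc, sink_flow, sink_max_conc,
                   freshwater_conc=0.0):
    """Direct-reuse feasibility between two water-using processes.

    Blends the source effluent with fresh water so the sink's maximum
    inlet concentration is met (simple single-contaminant mass balance;
    illustrative only, not POWR's actual calculation).
    Returns (reuse_flow, fresh_flow) or None if infeasible.
    """
    if source_conc <= sink_max_conc:
        # Effluent is clean enough to reuse directly.
        reuse = min(source_flow, sink_flow)
        return reuse, sink_flow - reuse
    # Blend: reuse*Cs + fresh*Cf = sink_flow*Cmax, reuse + fresh = sink_flow
    reuse = sink_flow * (sink_max_conc - freshwater_conc) / (source_conc - freshwater_conc)
    if reuse <= 0:
        return None
    reuse = min(reuse, source_flow)
    return reuse, sink_flow - reuse

# 10 m3/h of effluent at 100 mg/L feeding an 8 m3/h sink capped at 50 mg/L:
result = reuse_feasible(10.0, 100.0, 8.0, 50.0)
```

The economic comparison would then weigh the freshwater and discharge savings of the reuse flow against piping and treatment costs.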

    Maintenance Management of Wind Turbines

    “Maintenance Management of Wind Turbines” considers the main concepts and the state of the art, as well as advances and case studies on this topic. Maintenance is a critical variable for industry in order to remain competitive; together with operations, it is the most important variable in the wind energy industry. Therefore, the correct management of corrective, predictive and preventive policies in any wind turbine is required. The content also considers original research works that are complementary to other sub-disciplines, such as economics, finance, marketing, decision and risk analysis, engineering, etc., in the maintenance management of wind turbines. This book focuses on real case studies, concerning topics such as failure detection and diagnosis, fault trees and related techniques (e.g., FMECA, FMEA, etc.). Most of them link these topics with financial, scheduling, resource and downtime considerations in order to increase productivity, profitability, maintainability, reliability, safety and availability, and to reduce costs and downtime in a wind turbine. Advances in mathematics, models, computational techniques and dynamic analysis are employed in maintenance management analytics throughout the book. Finally, the book considers computational techniques, dynamic analysis, probabilistic methods and mathematical optimization techniques that are blended to support the analysis of multi-criteria decision-making problems with defined constraints and requirements.

    Online Build-Order Optimization for Real-Time Strategy Agents Using Multi-Objective Evolutionary Algorithms

    The investigation introduces a novel approach for online build-order optimization in real-time strategy (RTS) games. The goal of our research is to develop an artificial intelligence (AI) RTS planning agent for military critical decision-making education with the ability to perform at an expert human level, as well as to assess a player's critical decision-making ability or skill level. Build-order optimization is modeled as a multi-objective problem (MOP), and solutions are generated utilizing a multi-objective evolutionary algorithm (MOEA) that provides a set of good build-orders to an RTS planning agent. We define three research objectives: (1) design, implement and validate a capability to determine the skill level of an RTS player; (2) design, implement and validate a strategic planning tool that produces near expert-level build-orders, which are an ordered sequence of actions a player can issue to achieve a goal; and (3) integrate the strategic planning tool into our existing RTS agent framework and an RTS game engine. The skill-level metric we selected provides an original and needed method of evaluating an RTS player's skill level during game play. This metric is a high-level description of how quickly a player executes a strategy versus known players executing the same strategy. Our strategic planning tool combines a game simulator and an MOEA to produce a set of diverse and good build-orders for an RTS agent. Through the integration of case-based reasoning (CBR), planning goals are derived and expert build-orders are injected into an MOEA population. The MOEA then produces a diverse and approximate Pareto front that is integrated into our AI RTS agent framework. Thus, the planning tool provides an innovative online approach for strategic planning in RTS games.
    Experimentation via the Spring Engine Balanced Annihilation game reveals that the strategic planner is able to discover build-orders that are better than those of an expert scripted agent and thus achieve faster strategy execution times.
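The core of the MOEA step, keeping only non-dominated build-orders, can be sketched as follows; the objective values (completion time, resources spent) are hypothetical simulator outputs, and a full MOEA (e.g., NSGA-II) would add selection, crossover and mutation around this filter.

```python
def dominates(a, b):
    """True if point a is at least as good as b in both (minimized)
    objectives and differs from b."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def pareto_front(points):
    """Return the non-dominated subset of (time, resources) evaluations."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical evaluations of four candidate build-orders:
# (strategy completion time in seconds, resources spent)
evals = [(5, 3), (4, 4), (6, 2), (7, 5)]
front = pareto_front(evals)
```

The resulting front gives the agent a menu of trade-offs: it can pick the fastest build-order, the cheapest, or a compromise, depending on the current game state.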

    The effects of input data degradation on hydrological model performance for a snowmelt-dominated watershed

    Spring 2006. Includes bibliographical references. The quality and quantity of hydrometeorological data used as input to a hydrologic model is varied and the output compared to observed historical flows. Temperature and precipitation data were used to feed the National Weather Service River Forecast System (NWSRFS); this hydrologic model outputs streamflow and is used daily throughout the country to forecast streamflows. NWSRFS is a lumped empirical model developed in the 1970s for the NWS and is calibrated in this study to model a portion of the snowmelt-dominated Yampa River watershed in northwest Colorado. An analysis scheme is followed to capture the model's dependence on representative meteorological stations located in and around the modeled basin. Many regions in the United States experience meteorological and hydrological data scarcity. Operationally this becomes important when the available data are insufficient to produce reliable model outputs. Similar to Tsintikidis et al. (2002), who concluded that installing additional rain gauges in a modeled basin would decrease the error of precipitation measurements in the model, we sought to find whether increasing the data input to a model, in both the quantity and the quality given by site representativity, would increase the accuracy of our model runs. The study basin was chosen for its snowmelt dominance. Mean areal precipitation and temperature values for the modeled zones are developed individually in each analysis scheme by the arrangement of stations used in each sensitivity analysis. A statistical analysis of the relative difference between model runs and archived observed values is performed to illustrate the effect of different input data arrangements on model simulations. This study aimed at testing the assertion that removing hydrometeorological data from a model's dataset would decrease the accuracy of the streamflows forecast by that model.
    Stream flows and snow water equivalent are analyzed to test the model's sensitivity to the amount of data used. Since the NWSRFS uses predetermined weights to determine MAPs (mean areal precipitation values), the number of stations used does not significantly affect model output; the use of predetermined weights maintains a consistent year-to-year MAP. Varying the MAT (mean areal temperature) station configuration showed a more sizeable effect than the MAP scheme illustrated. Though this procedure could and should be replicated for other hydroclimates and for basins of different sizes, the specific results are not transferable to other basins. The basin modeled is very heavily snowmelt dominated; this quality, as well as its size, climate, topography, and available hydrometeorological stations, all influence model results, and altering any of these would change the model performance.
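The predetermined-weight MAP computation discussed above can be sketched as a fixed-weight average over stations; the weights and observations below are illustrative, and the renormalization over available stations is an assumed handling of missing data, not necessarily NWSRFS's exact procedure.

```python
def mean_areal_precip(obs, weights):
    """Mean areal precipitation from station observations using
    predetermined station weights. Missing stations (None) are skipped
    and the remaining weights renormalized (assumed behavior for this
    sketch)."""
    pairs = [(p, w) for p, w in zip(obs, weights) if p is not None]
    total_w = sum(w for _, w in pairs)
    return sum(p * w for p, w in pairs) / total_w

# Three hypothetical stations (mm of precipitation) with fixed weights:
map_full = mean_areal_precip([10.0, 12.0, 8.0], [0.5, 0.3, 0.2])
# Degraded input: the middle station is removed from the dataset.
map_degraded = mean_areal_precip([10.0, None, 8.0], [0.5, 0.2, 0.2][0:1] + [0.3, 0.2][0:1] + [0.2])
```

Because the weights are fixed in advance, dropping one station shifts the areal value only modestly, which is consistent with the study's finding that MAP output is relatively insensitive to station count.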

    Techno-economic analysis of solar stills using integrated fuzzy analytical hierarchy process and data envelopment analysis

    Desalination using solar stills is an ancient and economical method of water desalination. Over the years, research and development on solar stills has increased distillate yield by integrating phase change materials (PCM), photovoltaic thermal (PVT) collectors, etc., with the still. Nano-PCM is an emerging technology that modifies the thermal performance of PCM. The aim of this research is to analyze the efficiency of 20 solar stills, including nano-PCM based solar stills, considering various input and output criteria using an integrated fuzzy analytical hierarchy process (AHP) and data envelopment analysis (DEA). The efficiency derived here is relative to the parameters and stills considered in this study. The results indicate that, even though the productivity of the stepped solar still with a sun tracking system was high, it is not among the top solar stills when techno-economic aspects are considered. The analysis indicated that the pyramid-type solar still, the single-slope solar still with PVT, the solar still with NPCM (paraffin + copper oxide), the solar still with NPCM (paraffin + titanium dioxide) and the solar still with PCM (paraffin) occupy the top five positions, with relative efficiencies of 100, 100, 88.47, 88.46 and 76.93%, respectively.
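The DEA step of such an analysis can be sketched with the standard input-oriented CCR multiplier model solved as a linear program; the fuzzy-AHP weighting of criteria is omitted, and the two toy stills below are hypothetical, not the paper's data.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(inputs, outputs, dmu):
    """Input-oriented CCR efficiency (multiplier form) of one DMU.

    inputs:  (n_dmus, n_in) array (e.g., cost criteria per still).
    outputs: (n_dmus, n_out) array (e.g., distillate yield per still).
    Maximizes the weighted outputs of the chosen DMU subject to its
    weighted inputs equalling 1 and no DMU exceeding ratio 1.
    """
    X, Y = np.asarray(inputs, float), np.asarray(outputs, float)
    n, m = X.shape
    _, s = Y.shape
    # Decision vector: [u_1..u_s (output weights), v_1..v_m (input weights)]
    c = np.concatenate([-Y[dmu], np.zeros(m)])          # maximize u . y0
    A_eq = np.concatenate([np.zeros(s), X[dmu]])[None]  # v . x0 = 1
    b_eq = [1.0]
    A_ub = np.hstack([Y, -X])                           # u . yj - v . xj <= 0
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    return -res.fun

# Two toy stills: identical cost, but the second yields twice the distillate,
# so the first is only 50% efficient relative to it.
eff = [dea_ccr_efficiency([[1.0], [1.0]], [[1.0], [2.0]], d) for d in (0, 1)]
```

In the paper's pipeline, the fuzzy AHP would first derive criterion weights or restrict the admissible multiplier ranges before the DEA ranking is computed.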

    A new method for earthquake-induced damage identification in historic masonry towers combining OMA and IDA

    This paper presents a novel method for rapidly addressing the earthquake-induced damage identification task in historic masonry towers. The proposed method, termed DORI, combines operational modal analysis (OMA), FE modeling, rapid surrogate modeling (SM) and non-linear incremental dynamic analysis (IDA). While OMA-based structural health monitoring methods using statistical pattern recognition are known to allow the detection of small structural damage due to earthquakes, even far-field ones of moderate intensity, the combination of SM- and IDA-based methods for damage localization and quantification is proposed here. The monumental bell tower of the Basilica of San Pietro in Perugia, Italy, is considered for the validation of the method. While being continuously monitored since 2014, the bell tower experienced the main shocks of the 2016 Central Italy seismic sequence, and the on-site vibration-based monitoring system detected changes in global dynamic behavior after the earthquakes. In the paper, experimental vibration data (continuous and seismic records), FE models and surrogate models of the structure are used for post-earthquake damage localization and quantification, exploiting an ideal subdivision of the structure into meaningful macroelements. Results of linear and non-linear numerical modeling (SM and IDA, respectively) are successfully combined to this aim, and the continuous exchange of information between the physical reality (monitoring data) and the virtual models (FE models and surrogate models) effectively enforces the Digital Twin paradigm. The earthquake-induced damage identified by both data-driven and model-based strategies is finally confirmed by in-situ visual inspections.
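The OMA-based detection idea, flagging damage from drops in identified natural frequencies after an earthquake, can be sketched as follows; the mode values and 2% threshold are illustrative, and a real monitoring system must first remove environmental effects (e.g., temperature) from the frequency estimates.

```python
def frequency_shifts(f_before, f_after):
    """Relative drops of identified natural frequencies between two
    monitoring periods (positive = frequency decreased)."""
    return [(fb - fa) / fb for fb, fa in zip(f_before, f_after)]

def damage_flag(f_before, f_after, threshold=0.02):
    """Flag possible structural damage when any identified mode drops
    by more than `threshold` (the 2% default is an illustrative value,
    not the paper's criterion)."""
    return any(s > threshold for s in frequency_shifts(f_before, f_after))

# Hypothetical first three modes of a masonry tower (Hz), before and
# after a seismic event:
before = [1.40, 1.45, 4.10]
after = [1.35, 1.44, 4.05]
```

In the DORI scheme, such a detection trigger would then hand off to the surrogate- and IDA-based models to localize the damage among macroelements and quantify its severity.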