260 research outputs found

    Methods for solving combinatorial pricing problems

    Full text link
    The combinatorial pricing problem (CPP), or Stackelberg pricing game, is a class of bilevel optimization problems involving two decision makers acting in sequential order. The first decision maker, the leader, maximizes their revenue by controlling the prices of a set of resources. The second decision maker, the follower, reacts to the prices and selects a subset of resources according to a combinatorial optimization problem. Depending on the follower's problem, the CPP can be very challenging to solve. This thesis presents three articles covering several exact solution methods for the CPP. The first article addresses modeling and preprocessing for a specialization of the CPP: the network pricing problem (NPP), in which the follower's problem is a shortest path problem. The formulations of the NPP are organized in a general framework that establishes the links between them. The second article focuses on the multi-commodity version of the NPP. Using results from convex analysis, we derive a novel formulation of the NPP and, with it, prove that the NPP scales polynomially with respect to the number of commodities, given that the number of tolled arcs is fixed. The third article returns to the general CPP, in which the follower's problems are NP-hard. Using two different dynamic programming models, the follower's problems are converted into linear programs, to which strong duality can be applied. Due to the NP-hard nature of these problems, dynamic constraint generation schemes are proposed. The solution methods described in each article are supported by experimental results showing that they are effective in practice.
This thesis deepens our understanding of the structure of the CPP and introduces innovative methodologies for addressing it, thereby contributing new perspectives for tackling pricing and bilevel problems in general.
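As a toy illustration of the leader-follower structure described above (not the thesis's actual formulations), consider a minimal NPP instance with one tolled path and one toll-free alternative; the network, costs, and toll grid here are all invented for illustration:

```python
def follower_cost(toll, tolled_base=4.0, free_cost=10.0):
    """Follower's shortest path: tolled path (base cost + toll) vs. toll-free path."""
    return min(tolled_base + toll, free_cost)

def leader_revenue(toll, tolled_base=4.0, free_cost=10.0):
    """The leader earns the toll only if the follower uses the tolled path
    (ties broken in the leader's favour, as is standard in Stackelberg games)."""
    return toll if tolled_base + toll <= free_cost else 0.0

# Enumerate candidate tolls; the optimum makes the follower indifferent
# between the two paths (toll = 10 - 4 = 6).
best_toll = max((t / 10 for t in range(0, 201)), key=leader_revenue)
print(best_toll, leader_revenue(best_toll))  # → 6.0 6.0
```

The bilevel character shows up in `leader_revenue` calling the follower's optimization: the leader cannot price above the follower's best alternative without losing the sale entirely.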

    A consensus based network intrusion detection system

    Full text link
    Network intrusion detection is the process of identifying malicious behaviors that target a network and its resources. Current systems implementing intrusion detection observe traffic at several data collecting points in the network, but analysis is often centralized or partly centralized. These systems do not scale and suffer from a single point of failure: attackers need only target the central node to compromise the whole system. This paper proposes an anomaly-based, fully distributed network intrusion detection system in which analysis is run at each data collecting point using a naive Bayes classifier. Probability values computed by each classifier are shared among nodes using an iterative average consensus protocol. The final analysis is performed redundantly and in parallel at each data collecting point, thus avoiding the single-point-of-failure issue. We run simulations focusing on DDoS attacks with several network configurations, comparing the accuracy of our fully distributed system with that of a hierarchical one. We also analyze communication costs and convergence speed during consensus phases. Comment: Presented at the 5th International Conference on IT Convergence and Security 2015 in Kuala Lumpur, Malaysia.
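The average consensus step at the heart of such a protocol can be sketched as follows; the ring topology, step size, and local attack probabilities are illustrative, not taken from the paper:

```python
def consensus_step(values, neighbors, eps=0.25):
    """One synchronous iteration of average consensus: each node moves
    toward its neighbors' values; eps must be below 1/max_degree."""
    return [x + eps * sum(values[j] - x for j in neighbors[i])
            for i, x in enumerate(values)]

# A ring of 4 detection nodes, each holding a local attack probability.
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
values = [0.9, 0.1, 0.2, 0.4]
for _ in range(100):
    values = consensus_step(values, neighbors)

# Every node converges to the global mean, with no central aggregator.
print([round(v, 6) for v in values])  # → [0.4, 0.4, 0.4, 0.4]
```

Because each node ends up holding the same global average, the final detection decision can be made redundantly at every data collecting point, which is exactly what removes the single point of failure.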

    Computational methods in biodiversity conservation

    Get PDF
    Not specified by the author

    THE CORRELATION BETWEEN HIGH SCHOOL STUDENTS’ PHONOLOGICAL AWARENESS AND THEIR PRONUNCIATION OF ENGLISH INFLECTIONAL MORPHEMES –ED AND -S: A CASE IN THE MEKONG DELTA OF VIETNAM

    Get PDF
    This study examined the relationship between high school students' phonological awareness and their performance in pronouncing allomorphs of the English inflectional morphemes -ed and -s. The study involved 31 high school students in Can Tho City in the Mekong Delta of Vietnam. Data were collected through a pronunciation written test (PWT) and a pronunciation oral test (POT). The findings showed that all the students had phonological knowledge of the two morphemes; however, the majority of the participants made errors in pronouncing them, indicating weak pronunciation performance. In addition, no correlation between the students' phonological awareness and their pronunciation performance was detected in the study. Based on the results, pedagogical implications are suggested.

    Ultrafast Approximation for Phylogenetic Bootstrap

    Get PDF
    The nonparametric bootstrap has been a widely used tool in phylogenetic analysis to assess the clade support of phylogenetic trees. However, with the rapidly growing amount of data, this task remains a computational bottleneck. Recently, approximation methods such as the RAxML rapid bootstrap (RBS) and the Shimodaira-Hasegawa-like approximate likelihood ratio test have been introduced to speed up the bootstrap. Here, we suggest an ultrafast bootstrap approximation approach (UFBoot) to compute the support of phylogenetic groups in maximum likelihood (ML) based trees. To achieve this, we combine the resampling estimated log-likelihood method with a simple but effective collection scheme for candidate trees. We also propose a stopping rule that assesses the convergence of branch support values to determine automatically when to stop collecting candidate trees. UFBoot achieves a median speedup of 3.1 (range: 0.66-33.3) to 10.2 (range: 1.32-41.4) compared with RAxML RBS for real DNA and amino acid alignments, respectively. Moreover, our extensive simulations show that UFBoot is robust against moderate model violations, and the support values obtained appear relatively unbiased compared with the conservative standard bootstrap. This provides a more direct interpretation of the bootstrap support. We offer an efficient and easy-to-use software package (available at http://www.cibiv.at/software/iqtree) to perform the UFBoot analysis with ML tree inference.
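The resampling estimated log-likelihood (RELL) idea that UFBoot builds on can be sketched in a few lines: instead of re-optimizing trees on each bootstrap alignment, per-site log-likelihoods are resampled and summed. The two-tree example and its values are invented for illustration:

```python
import random

def rell_support(site_loglik, n_boot=1000, seed=1):
    """RELL bootstrap: resample site log-likelihoods with replacement and
    count how often each candidate tree has the highest total.
    site_loglik: {tree_name: [per-site log-likelihood, ...]}."""
    rng = random.Random(seed)
    trees = list(site_loglik)
    n_sites = len(next(iter(site_loglik.values())))
    counts = {t: 0 for t in trees}
    for _ in range(n_boot):
        idx = [rng.randrange(n_sites) for _ in range(n_sites)]
        best = max(trees, key=lambda t: sum(site_loglik[t][i] for i in idx))
        counts[best] += 1
    return {t: counts[t] / n_boot for t in trees}

# Tree "T1" fits every site slightly better, so it wins every replicate.
liks = {"T1": [-1.0] * 50, "T2": [-1.2] * 50}
print(rell_support(liks))  # → {'T1': 1.0, 'T2': 0.0}
```

The speedup over the standard bootstrap comes from skipping tree search on each replicate: only sums over precomputed per-site likelihoods are needed.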

    Effect of humate and controlled released NPK fertilizers (NPK-CRF) on rice yield and soil fertility of intensive alluvial soils

    Get PDF
    The study aims to assess the effect of mixed fertilizers, including controlled slow-release NPK (NPK-CRF) and urea and potassium humate fertilizers, on soil fertility and rice yield. The on-farm trial experiment was carried out on alluvial soil with two models corresponding to two farming techniques: (i) traditional fertilization, applying conventional fertilizers with the formula 92.2 N–82.8 P2O5–22.8 K2O kg/ha; (ii) new-generation fertilizers (NPK-CRF, urea humate, and potassium humate) with the formula 50.1 N–39.9 P2O5–30.0 K2O. Each pattern was repeated three times, corresponding to 3 farmers. Each household's area was 1,000 m², cultivated continuously through three seasons, Winter-Spring (WS), Summer-Autumn (SA), and Autumn-Winter (AW), in Chau Thanh A district, Hau Giang province. The results showed that the new-generation fertilizer application significantly improved rice yield and yield components in the Winter-Spring (6.92 tons/ha), Summer-Autumn (5.94 tons/ha), and Autumn-Winter (6.15 tons/ha) cropping seasons, in contrast to the farmers' fields. Furthermore, the combined application of NPK-CRF, urea-humate, and K-humate fertilizers for rice in the SA and AW crops significantly reduced total acid content and exchangeable Al3+ in the soil, and improved soil pH, available N and P, and organic matter (%C). However, there was no difference in the soil's physical properties over the three farming seasons. Finally, adding humic acid to controlled-release fertilizer can improve soil fertility, increase yield and yield components, and enhance nitrogen uptake and nitrogen use efficiency, all of which have positive consequences for yield and soil.

    A DOUBLE-SHRINK AUTOENCODER FOR NETWORK ANOMALY DETECTION

    Get PDF
    The rapid development of the Internet and the wide spread of its applications have affected many aspects of our life. However, this development also makes cyberspace more vulnerable to various attacks. Thus, detecting and preventing these attacks is crucial for the further development of the Internet and its services. Recently, machine learning methods have been widely adopted for detecting network attacks. Among these, AutoEncoders (AEs) are known as state-of-the-art techniques for network anomaly detection. Although AEs have been successfully applied to detect many types of attacks, they are often unable to detect some difficult attacks that attempt to mimic normal network traffic. To handle this issue, we propose a new AutoEncoder-based model called the Double-Shrink AutoEncoder (DSAE). DSAE puts more shrinkage on the normal data in the middle hidden layer. This helps to pull out some anomalies that are very similar to normal data. DSAE is evaluated on six well-known network attack datasets. The experimental results show that our model performs competitively with the state-of-the-art model and often outperforms it on the attack groups that are difficult for previous methods.
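The reconstruction-error principle underlying AE-based anomaly detection can be illustrated without a trained network. Here a hand-set linear encoder/decoder stands in for the AE (this is a toy stand-in, not the DSAE architecture): normal traffic lies near a low-dimensional structure and reconstructs well, while anomalies do not.

```python
def encode(x):
    """Toy linear encoder: project a 2-D point onto the diagonal (1 latent unit)."""
    return (x[0] + x[1]) / 2.0

def decode(h):
    """Toy decoder: map the latent value back onto the diagonal."""
    return (h, h)

def score(x):
    """Anomaly score = squared reconstruction error."""
    r = decode(encode(x))
    return (x[0] - r[0]) ** 2 + (x[1] - r[1]) ** 2

# "Normal" points lie near the diagonal; set the threshold from them.
normal = [(1.0, 1.1), (2.0, 1.9), (3.0, 3.0)]
threshold = max(score(x) for x in normal)

# An off-diagonal point reconstructs poorly and is flagged as anomalous.
print(score((1.0, 3.0)) > threshold)  # → True
```

DSAE's contribution, per the abstract, is to shrink normal data even more tightly in the latent layer, widening the error gap for attacks that mimic normal traffic; that training procedure is not reproduced here.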

    Optimal control and real-time simulation of hybrid marine power plants

    Get PDF
    With significantly increasing concerns about greenhouse effects and a sustainable economy, the marine industry presents great potential for reducing its environmental impact. Recent developments in power electronics and hybridisation technologies create new opportunities for innovative marine power plants that utilize both traditional diesel generators and energy storage, such as batteries and/or supercapacitors, as power sources. However, managing the power of such complex systems to achieve the best efficiency becomes one of the major challenges. Acknowledging this importance, this research aims to develop an optimal control strategy (OCS) for hybrid marine power plants. First, the architecture of the researched marine power plant is briefly discussed and a simple plant model is presented. The generator can be used to charge the batteries when the ship operates with low power demands. Conversely, the battery energy can be used as an additional power source to drive the propulsion or assist the generators when necessary. In addition, energy losses through braking can be recuperated and stored in the battery for later use. Second, the OCS is developed based on an equivalent fuel consumption minimisation (EFCM) approach to efficiently manage the power flow between the power sources. This helps the generators work at optimal operating conditions, conserving fuel and lowering emissions. In principle, the EFCM is based on the simple concept that discharging the battery at present is equivalent to burning fuel in the future, and vice versa, and is suitable for real-time implementation. However, instantaneously regulating the power sources' demands could affect the system stability as well as the lifetime of the components.
To overcome this drawback and to achieve smooth energy management, the OCS is designed with a number of penalty factors that carefully consider the system states, such as the generators' fuel consumption and dynamics (stop/start and cranking behaviour), the battery state of charge, and the power demands. Moreover, adaptive energy conversion factors are designed using artificial intelligence and integrated into the OCS design to improve the management performance. The system is therefore capable of operating in the highest fuel economy zone without sacrificing overall performance. Furthermore, a real-time simulation platform has been developed for future investigation of the control logic. The effectiveness of the proposed OCS is then verified through numerical simulations with a number of test cases.
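A single EFCM decision step can be sketched as follows: at each instant, choose the battery power that minimizes actual fuel use plus an equivalent fuel cost for battery energy. The fuel map, equivalence factor `lam`, and power limits below are all illustrative assumptions, not values from the thesis:

```python
def fuel_rate(p_gen):
    """Hypothetical generator fuel map (kg/h), convex in output power (kW)."""
    return 0.02 * p_gen + 0.0001 * p_gen ** 2

def efcm_split(p_demand, lam=0.035, p_batt_limits=(-50.0, 50.0)):
    """One EFCM decision: pick battery power p_batt (kW, positive = discharge)
    minimizing fuel(p_demand - p_batt) + lam * p_batt, where lam converts
    battery energy into an equivalent future fuel cost."""
    lo, hi = p_batt_limits
    candidates = [lo + i * (hi - lo) / 1000 for i in range(1001)]
    def cost(p_batt):
        p_gen = max(p_demand - p_batt, 0.0)
        return fuel_rate(p_gen) + lam * p_batt
    return min(candidates, key=cost)

# At 100 kW demand, the battery discharges 25 kW so the generator runs
# where its marginal fuel cost equals the equivalence factor.
print(efcm_split(100.0))  # → 25.0
```

Raising `lam` (battery energy "more expensive") shifts the split toward the generator; the penalty factors and adaptive conversion factors described above would, in the full OCS, modulate `lam` and constrain how fast `p_batt` may change.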

    New Methods to Calculate Concordance Factors for Phylogenomic Datasets

    Get PDF
    We implement two measures for quantifying genealogical concordance in phylogenomic data sets: the gene concordance factor (gCF) and the novel site concordance factor (sCF). For every branch of a reference tree, gCF is defined as the percentage of “decisive” gene trees containing that branch. This measure is already in wide usage, but here we introduce a package that calculates it while accounting for variable taxon coverage among gene trees. sCF is a new measure defined as the percentage of decisive sites supporting a branch in the reference tree. gCF and sCF complement classical measures of branch support in phylogenetics by providing a full description of the underlying disagreement among loci and sites. An easy-to-use implementation and tutorial are freely available in the IQ-TREE software package (http://www.iqtree.org/doc/Concordance-Factor, last accessed May 13, 2020). This work was supported by the National Science Foundation (Grant No. DEB-1936187 to M.W.H.), an Australian National University Futures Grant (to R.L.), and the Australian Research Council (Grant No. DP200103151 to R.L., B.Q.M., and M.W.H.).
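The idea of a site concordance factor for a single internal branch can be sketched with a parsimony-style quartet test. This simplification (one quartet around the branch, pattern matching instead of likelihood) is an illustration, not IQ-TREE's exact algorithm, and the alignment is invented:

```python
def site_concordance(aln, quartet):
    """Toy sCF for the internal branch AB|CD: among 'decisive' sites
    (those matching exactly one quartet topology), the percentage
    supporting AB|CD. aln: {taxon: sequence string}."""
    a, b, c, d = quartet
    decisive = concordant = 0
    for i in range(len(aln[a])):
        x = {t: aln[t][i] for t in quartet}
        topo = {
            "AB|CD": x[a] == x[b] and x[c] == x[d] and x[a] != x[c],
            "AC|BD": x[a] == x[c] and x[b] == x[d] and x[a] != x[b],
            "AD|BC": x[a] == x[d] and x[b] == x[c] and x[a] != x[b],
        }
        if sum(topo.values()) == 1:  # site is decisive for one topology
            decisive += 1
            concordant += topo["AB|CD"]
    return 100.0 * concordant / decisive if decisive else float("nan")

# Sites 1-2 support AB|CD, site 3 is constant (not decisive),
# site 4 supports the conflicting topology AC|BD.
aln = {"A": "AAGG", "B": "AAGC", "C": "GGGG", "D": "GGGC"}
print(site_concordance(aln, ("A", "B", "C", "D")))  # → 66.66666666666667
```

Averaging this quantity over sampled quartets around each branch gives a per-branch summary of site-level disagreement, complementing bootstrap-style support values.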