    The Univariate Marginal Distribution Algorithm Copes Well With Deception and Epistasis

    In their recent work, Lehre and Nguyen (FOGA 2019) show that the univariate marginal distribution algorithm (UMDA) needs time exponential in the parent population size to optimize the DeceptiveLeadingBlocks (DLB) problem. They conclude from this result that univariate EDAs have difficulties with deception and epistasis. In this work, we show that this negative finding is caused by an unfortunate choice of the parameters of the UMDA. When the population sizes are chosen large enough to prevent genetic drift, then the UMDA optimizes the DLB problem with high probability with at most $\lambda(\frac{n}{2} + 2e\ln n)$ fitness evaluations. Since an offspring population size $\lambda$ of order $n \log n$ can prevent genetic drift, the UMDA can solve the DLB problem with $O(n^2 \log n)$ fitness evaluations. In contrast, for classic evolutionary algorithms no better runtime guarantee than $O(n^3)$ is known (which we prove to be tight for the $(1+1)$ EA), so our result rather suggests that the UMDA can cope well with deception and epistasis. From a broader perspective, our result shows that the UMDA can cope better with local optima than evolutionary algorithms; such a result was previously known only for the compact genetic algorithm. Together with the lower bound of Lehre and Nguyen, our result for the first time rigorously proves that running EDAs in the regime with genetic drift can lead to drastic performance losses.
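
    The UMDA discussed above follows a simple sample-select-update loop over a vector of marginal frequencies. The Python sketch below illustrates that loop only for orientation: the fitness function (OneMax as a stand-in for DeceptiveLeadingBlocks), the population sizes, and the evaluation budget are illustrative assumptions, not the parameter choices from the paper.

```python
import random

def umda(fitness, n, lam, mu, max_evals):
    """Minimal sketch of the univariate marginal distribution algorithm (maximisation).

    fitness : maps a 0/1 list of length n to a number.
    lam, mu : offspring and parent population sizes.
    Marginal frequencies are clamped to [1/n, 1 - 1/n], the usual UMDA borders.
    """
    p = [0.5] * n                                   # one marginal frequency per bit
    best, best_fit, evals = None, float("-inf"), 0
    while evals < max_evals:
        # Sample lam offspring from the product distribution given by p.
        pop = [[1 if random.random() < p[i] else 0 for i in range(n)]
               for _ in range(lam)]
        scored = sorted(((fitness(x), x) for x in pop), reverse=True)
        evals += lam
        if scored[0][0] > best_fit:
            best_fit, best = scored[0]
        # New frequencies: marginals of the mu best offspring, then clamp to the borders.
        for i in range(n):
            freq = sum(x[i] for _, x in scored[:mu]) / mu
            p[i] = min(1 - 1 / n, max(1 / n, freq))
    return best, best_fit

if __name__ == "__main__":
    onemax = lambda x: sum(x)                       # stand-in fitness; the paper studies DLB
    print(umda(onemax, n=50, lam=200, mu=100, max_evals=20000))
```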

    Evolutionary design of a full-envelope full-authority flight control system for an unstable high-performance aircraft

    An evolutionary algorithm is used in the framework of H∞ control theory as a means for synthesizing controller gains that simultaneously minimize a weighted combination of the infinity norm of the sensitivity function (for disturbance attenuation requirements) and of the complementary sensitivity function (for robust stability requirements). The case study deals with a complete full-authority longitudinal control system for an unstable high-performance jet aircraft featuring (i) a stability and control augmentation system and (ii) autopilot functions (speed and altitude hold). Constraints on the closed-loop response, representing typical requirements on airplane handling qualities, are enforced, which makes the control law synthesis process more demanding. Gain scheduling is required in order to obtain satisfactory performance over the whole flight envelope, so the synthesis is performed at different reference trim conditions, for several values of the dynamic pressure, which is used as the scheduling parameter. Nonetheless, the dynamic behaviour of the aircraft may exhibit significant variations when flying at different altitudes, even for the same value of the dynamic pressure, so that a trade-off is required between the feasible controllers synthesized at different altitudes for a given equivalent airspeed. A multiobjective search is thus considered for determining the solution best suited for introduction in the scheduling of the control law. The obtained results are then tested on a longitudinal non-linear model of the aircraft.
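
    As a rough illustration of the kind of search described above, the sketch below runs a simple evolutionary strategy over a gain vector and minimizes a weighted sum of two norm terms. The function mixed_sensitivity_cost is a hypothetical stand-in: in the actual study the two terms would be the infinity norms of the sensitivity and complementary sensitivity functions of the closed-loop aircraft model, which are not reproduced here.

```python
import random

# Hypothetical stand-in for the weighted cost described above:
#   w1 * ||S||_inf + w2 * ||T||_inf  for the closed loop obtained with gain vector k.
# A real implementation would build the closed-loop aircraft model and evaluate the
# infinity norms of the sensitivity and complementary sensitivity functions.
def mixed_sensitivity_cost(k, w1=1.0, w2=1.0):
    s_term = sum((ki - 0.5) ** 2 for ki in k)   # placeholder for ||S||_inf
    t_term = sum(abs(ki) for ki in k)           # placeholder for ||T||_inf
    return w1 * s_term + w2 * t_term

def evolve_gains(cost, dim, pop_size=40, generations=200, sigma=0.1):
    """Plain (mu + lambda)-style evolutionary search over a real-valued gain vector."""
    pop = [[random.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # Mutate every parent with Gaussian noise and keep the best pop_size overall.
        children = [[g + random.gauss(0.0, sigma) for g in parent] for parent in pop]
        pop = sorted(pop + children, key=cost)[:pop_size]
    return pop[0], cost(pop[0])

if __name__ == "__main__":
    best_gains, best_cost = evolve_gains(mixed_sensitivity_cost, dim=6)
    print(best_cost, best_gains)
```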

    Development of an automated control system for a preliminary formation water discharge unit

    The object of the study is a preliminary formation water discharge unit. The purpose of the work is to modernize the automated system of the preliminary water discharge unit using a PLC, on the basis of a selected SCADA system. In this project, a system for monitoring and controlling the technological process was developed on the basis of Siemens S7-1500 industrial PLCs, with the use of a SCADA system. The developed system can be applied in monitoring, control, and data acquisition systems at various industrial enterprises. It will increase productivity, improve measurement accuracy and reliability, and reduce the number of accidents.

    Distributed Synthesis in Continuous Time

    We introduce a formalism modelling communication of distributed agents strictly in continuous time. Within this framework, we study the problem of synthesising local strategies for individual agents such that a specified set of goal states is reached, or reached with at least a given probability. The flow of time is modelled explicitly based on continuous-time randomness, with two natural implications: First, the non-determinism stemming from interleaving disappears. Second, when we restrict to a subclass of non-urgent models, the quantitative value problem for two players can be solved in EXPTIME. Indeed, the explicit continuous time enables players to communicate their states by delaying synchronisation (which is unrestricted for non-urgent models). In general, the problems are undecidable already for two players in the quantitative case and three players in the qualitative case. The qualitative undecidability is shown by a reduction to decentralized POMDPs, for which we provide the strongest (and rather surprising) undecidability result so far.

    Post-Newtonian approximation for isolated systems calculated by matched asymptotic expansions

    Two long-standing problems with the post-Newtonian approximation for isolated slowly-moving systems in general relativity are: (i) the appearance at high post-Newtonian orders of divergent Poisson integrals, casting doubt on the soundness of the post-Newtonian series; (ii) the domain of validity of the approximation, which is limited to the near zone of the source and prevents one, a priori, from incorporating the condition of no incoming radiation, to be imposed at past null infinity. In this article, we resolve problem (i) by iterating the post-Newtonian hierarchy of equations by means of a new (Poisson-type) integral operator that is free of divergences, and problem (ii) by matching the post-Newtonian near-zone field to the exterior field of the source, known from previous work as a multipolar-post-Minkowskian expansion satisfying the relevant boundary conditions at infinity. As a result, we obtain an algorithm for iterating the post-Newtonian series up to any order, and we determine the terms, present in the post-Newtonian field, that are associated with the gravitational-radiation reaction onto an isolated slowly-moving matter system.
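
    For orientation, the divergence issue (i) concerns the Poisson integral used to iterate the field equations; the remedy described above replaces it with a Poisson-type operator free of divergences. The display below recalls the ordinary Poisson integral and sketches, as an assumed schematic form only (with r_0 an arbitrary length scale), the kind of finite-part regularization meant; it is not the article's exact definition.

```latex
% Ordinary Poisson integral solving \Delta u = \tau at low post-Newtonian orders:
\[
  u(\mathbf{x},t) \;=\; \Delta^{-1}\tau \;=\;
  -\frac{1}{4\pi}\int \frac{\tau(\mathbf{x}',t)}{|\mathbf{x}-\mathbf{x}'|}\, d^{3}x' .
\]
% At high orders the PN-expanded source does not fall off at large |x'| and the
% integral diverges.  A finite-part regularized Poisson-type operator, written here
% only schematically, with r_0 an arbitrary length scale:
\[
  \widetilde{\Delta}^{-1}\tau(\mathbf{x},t) \;=\;
  \operatorname*{FP}_{B\to 0}\,
  \biggl(-\frac{1}{4\pi}\biggr)\!\int
  \biggl(\frac{|\mathbf{x}'|}{r_0}\biggr)^{\!B}
  \frac{\tau(\mathbf{x}',t)}{|\mathbf{x}-\mathbf{x}'|}\, d^{3}x' .
\]
```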

    On the choice of the update strength in estimation-of-distribution algorithms and ant colony optimization

    Probabilistic model-building Genetic Algorithms (PMBGAs) are a class of metaheuristics that evolve probability distributions favoring optimal solutions in the underlying search space by repeatedly sampling from the distribution and updating it according to promising samples. We provide a rigorous runtime analysis concerning the update strength, a vital parameter in PMBGAs such as the step size $1/K$ in the so-called compact Genetic Algorithm (cGA) and the evaporation factor $\rho$ in ant colony optimizers (ACO). While a large update strength is desirable for exploitation, there is a general trade-off: too strong updates can lead to unstable behavior and possibly poor performance. We demonstrate this trade-off for the cGA and a simple ACO algorithm on the well-known OneMax function. More precisely, we obtain lower bounds on the expected runtime of $\Omega(K\sqrt{n} + n \log n)$ and $\Omega(\sqrt{n}/\rho + n \log n)$, respectively, suggesting that the update strength should be limited to $1/K, \rho = O(1/(\sqrt{n} \log n))$. In fact, choosing $1/K, \rho \sim 1/(\sqrt{n} \log n)$, both algorithms efficiently optimize OneMax in expected time $\Theta(n \log n)$. Our analyses provide new insights into the stochastic behavior of PMBGAs and propose new guidelines for setting the update strength in global optimization.
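
    To make the role of the update strength concrete, the following Python sketch implements the cGA with step size 1/K on OneMax; the frequency borders 1/n and 1 - 1/n, the iteration budget, and the particular choice of K are illustrative assumptions rather than the exact setting analyzed in the paper.

```python
import math
import random

def cga_onemax(n, K, max_iters=500000):
    """Minimal sketch of the compact GA with update strength 1/K on OneMax (maximisation)."""
    p = [0.5] * n                                   # one marginal frequency per bit
    onemax = lambda x: sum(x)
    for it in range(max_iters):
        # Sample two individuals from the current frequency vector.
        x = [1 if random.random() < p[i] else 0 for i in range(n)]
        y = [1 if random.random() < p[i] else 0 for i in range(n)]
        if onemax(x) < onemax(y):
            x, y = y, x                             # x is now the winner
        # Shift each frequency by 1/K toward the winner's bit where the samples differ.
        for i in range(n):
            if x[i] != y[i]:
                p[i] += 1.0 / K if x[i] == 1 else -1.0 / K
                p[i] = min(1 - 1 / n, max(1 / n, p[i]))   # keep inside the borders
        if onemax(x) == n:                          # the optimum has been sampled
            return it + 1                           # iterations until the optimum
    return max_iters

if __name__ == "__main__":
    n = 100
    # The analysis suggests 1/K = O(1/(sqrt(n) log n)); K ~ sqrt(n) log n is one such choice.
    K = int(math.sqrt(n) * math.log(n))
    print(cga_onemax(n, K))
```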

    Deep-Learning based segmentation and quantification in experimental kidney histopathology

    BACKGROUND: Nephropathologic analyses provide important outcomes-related data in experiments with animal models that are essential for understanding kidney disease pathophysiology. Precision medicine increases the demand for quantitative, unbiased, reproducible, and efficient histopathologic analyses, which will require novel high-throughput tools. A deep learning technique, the convolutional neural network, is increasingly applied in pathology because of its high performance in tasks like histology segmentation. METHODS: We investigated the use of a convolutional neural network architecture for accurate segmentation of periodic acid-Schiff-stained kidney tissue from healthy mice and five murine disease models and from other species used in preclinical research. We trained the convolutional neural network to segment six major renal structures: glomerular tuft, glomerulus including Bowman's capsule, tubules, arteries, arterial lumina, and veins. To achieve high accuracy, we performed a large number of expert-based annotations, 72,722 in total. RESULTS: Multiclass segmentation performance was very high in all disease models. The convolutional neural network allowed high-throughput and large-scale, quantitative and comparative analyses of various models. In disease models, computational feature extraction revealed interstitial expansion, tubular dilation and atrophy, and glomerular size variability. Validation showed a high correlation of findings with current standard morphometric analysis. The convolutional neural network also showed high performance in other species used in research (including rats, pigs, bears, and marmosets) as well as in humans, providing a translational bridge between preclinical and clinical studies. CONCLUSIONS: We developed a deep learning algorithm for accurate multiclass segmentation of digital whole-slide images of periodic acid-Schiff-stained kidneys from various species and renal disease models. This enables reproducible quantitative histopathologic analyses in preclinical models that also might be applicable to clinical studies.
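
    The kind of multiclass segmentation training described above can be illustrated with a compressed PyTorch sketch: a deliberately small fully convolutional network trained with a per-pixel cross-entropy loss over seven labels (background plus the six renal structures). The TinySegNet architecture, image size, and training details here are illustrative assumptions only; the study used its own CNN architecture and a much larger annotated dataset.

```python
import torch
import torch.nn as nn

NUM_CLASSES = 7  # background + 6 renal structures (tuft, glomerulus, tubules, arteries, lumina, veins)

class TinySegNet(nn.Module):
    """A deliberately small fully convolutional network for multiclass segmentation.
    It is a stand-in for the (unspecified here) architecture used in the study."""
    def __init__(self, num_classes=NUM_CLASSES):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, num_classes, 1),          # per-pixel class logits
        )

    def forward(self, x):
        return self.body(x)

def train_step(model, optimizer, images, masks):
    """One training step: images (N,3,H,W) float, masks (N,H,W) long with class indices."""
    criterion = nn.CrossEntropyLoss()
    optimizer.zero_grad()
    logits = model(images)
    loss = criterion(logits, masks)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = TinySegNet()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Dummy batch standing in for PAS-stained tiles and their expert annotations.
    images = torch.rand(2, 3, 128, 128)
    masks = torch.randint(0, NUM_CLASSES, (2, 128, 128))
    print(train_step(model, optimizer, images, masks))
```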

    Bottom-up Solution Synthesis of Graphene Nanoribbons with Precisely Engineered Nanopores

    The incorporation of nanopores into graphene nanostructures has been demonstrated as an efficient tool for tuning their band gaps and electronic structures. However, precisely embedding uniform nanopores into graphene nanoribbons (GNRs) at the atomic level remains underdeveloped, especially for in-solution synthesis, due to the lack of efficient synthetic strategies. Herein we report the first case of a solution-synthesized porous GNR (pGNR) with a fully conjugated backbone, obtained via the efficient Scholl reaction of a tailor-made polyphenylene precursor (P1) bearing pre-installed hexagonal nanopores. The resultant pGNR features periodic subnanometer pores with a uniform diameter of 0.6 nm and an adjacent-pore distance of 1.7 nm. To solidify our design strategy, two porous model compounds (1a, 1b) containing the same pore size, serving as shortcuts of pGNR, are successfully synthesized. The chemical structure and photophysical properties of pGNR are investigated by various spectroscopic analyses. Notably, the embedded periodic nanopores largely reduce the degree of π-conjugation and alleviate the inter-ribbon π–π interactions compared to nonporous GNRs with similar widths, affording pGNR with a notably enlarged band gap and enhanced liquid-phase processability.