
    Protecting a Graph with Mobile Guards

    Mobile guards stationed on the vertices of a graph are used to defend it against attacks on either its vertices or its edges. Various models of this problem have been proposed. In this survey we describe a number of these models, with particular attention to the case where the attack sequence is infinitely long and the guards must induce some particular configuration before each attack, such as a dominating set or a vertex cover. Results from the literature concerning the number of guards needed to successfully defend a graph in each of these problems are surveyed. Comment: 29 pages, two figures, survey
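    The two configurations named in the abstract are easy to check mechanically. As a minimal illustration (not code from the survey; the example graph, helper names, and adjacency-list encoding are all assumptions), the following verifies whether a set of guard positions forms a dominating set or a vertex cover:

```python
# Sketch: verify the two guard configurations mentioned in the survey
# on a small example graph. The graph is an adjacency-list dict; both
# helper names are illustrative, not from the paper.

def is_dominating_set(adj, guards):
    """Every vertex either holds a guard or is adjacent to one."""
    guards = set(guards)
    return all(v in guards or guards & set(adj[v]) for v in adj)

def is_vertex_cover(adj, guards):
    """Every edge has at least one endpoint occupied by a guard."""
    guards = set(guards)
    return all(u in guards or v in guards
               for u in adj for v in adj[u])

# 4-cycle: 0-1-2-3-0
C4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(is_dominating_set(C4, {0, 2}))  # True: every vertex sees a guard
print(is_vertex_cover(C4, {0, 2}))    # True: every edge meets 0 or 2
print(is_dominating_set(C4, {0}))     # False: vertex 2 is undominated
```

    In the eternal versions surveyed, the guards must re-establish such a configuration after every attack, which is why the eternal guard numbers can exceed the ordinary domination or cover numbers.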

    First Order Logic and Twin-Width in Tournaments

    We characterise the classes of tournaments with tractable first-order model checking. For every hereditary class of tournaments T, first-order model checking is either fixed-parameter tractable or AW[*]-hard. This dichotomy coincides with the fact that T has either bounded or unbounded twin-width, and that the growth of T is either at most exponential or at least factorial. From the model-theoretic point of view, we show that NIP classes of tournaments coincide with those of bounded twin-width. Twin-width is also characterised by three infinite families of obstructions: T has bounded twin-width if and only if it excludes at least one tournament from each family. This generalises results of Bonnet et al. on ordered graphs. The key to these results is a polynomial-time algorithm which takes as input a tournament T and computes a linear order < on V(T) such that the twin-width of the birelation (T, <) is at most some function of the twin-width of T. Since approximating twin-width can be done in FPT time for an ordered structure (T, <), this provides an FPT approximation of twin-width for tournaments.

    A comparative study between the cubic spline and B-spline interpolation methods in free energy calculations

    Numerical methods are essential in computational science, as analytic calculations for large datasets are impractical. Using numerical methods, one can approximate a problem so that it can be solved with basic arithmetic operations. Interpolation is a commonly used method for, inter alia, constructing the value of new data points within an interval of known data points. Furthermore, polynomial interpolation of sufficiently high degree can make the data set differentiable. One consequence of using high-degree polynomials is oscillatory behaviour towards the endpoints, also known as Runge's phenomenon. Spline interpolation overcomes this obstacle by connecting the data points in a piecewise fashion. However, its complex formulation requires nested iterations in higher dimensions, which is time-consuming. In addition, the calculations have to be repeated to compute each partial derivative at a data point, leading to further slowdown. B-spline interpolation is an alternative representation of the cubic spline method, in which the interpolant at a point is expressed as a linear combination of piecewise basis functions. It has been proposed that this formulation can accelerate many scientific computing operations involving interpolation. Nevertheless, there is a lack of detailed comparison to back up this hypothesis, especially when it comes to computing the partial derivatives. Among many scientific research fields, free energy calculations particularly stand out for their use of interpolation methods. Numerical interpolation has been employed in free energy methods for many purposes, from calculating intermediate energy states to deriving forces from free energy surfaces. The results of these calculations can provide insight into reaction mechanisms and their thermodynamic properties.
    Free energy methods include biased flat-histogram methods, which are especially promising for their ability to accurately construct free energy profiles in the rarely visited regions of reaction spaces. Free Energies from Adaptive Reaction Coordinates (FEARCF), developed by Professor Kevin J. Naidoo, has many advantages over the other flat-histogram methods. Because of its treatment of the atoms in reactions, FEARCF makes it easier to apply interpolation methods. It implements cubic spline interpolation to derive biasing forces from the free energy surface, driving the reaction towards regions of higher energy. A major drawback of the method is the slowdown experienced in higher dimensions due to the complicated nature of the cubic spline routine. If that routine is replaced by a more straightforward B-spline interpolation, sampling and generating free energy surfaces can be accelerated. This dissertation aims to perform a comparative study between the cubic spline and B-spline interpolation methods. First, data sets from analytic functions were used instead of numerical data to compare the accuracy and compute the percentage errors of both methods, taking the functions themselves as reference. These functions were used to evaluate the performance of the two methods at the endpoints, inflection points, and regions with a steep gradient. Both interpolation methods generated identical approximated values with a percentage error below the threshold of 1%, although both performed poorly at the endpoints and the points of inflection. Increasing the number of interpolation knots reduced the errors; however, it caused overfitting in the other regions. Although no significant speed-up was observed for univariate interpolation, cubic spline suffered a drastic slowdown in higher dimensions, up to 10³-fold in 3D and 10⁵-fold in 4D interpolations.
    The same results applied to classical molecular dynamics simulations with FEARCF, with a speed-up of up to 10³ when B-spline interpolation was implemented. In conclusion, the B-spline interpolation method can enhance the efficiency of free energy calculations where cubic spline interpolation has been the method used to date.
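    The equivalence of the two representations compared in the dissertation can be illustrated with SciPy (used here purely for illustration; this is not the FEARCF code): a cubic interpolating spline and its B-spline form, built from the same knots and boundary conditions, trace the same curve and the same derivatives:

```python
# Sketch: the piecewise-polynomial cubic spline and its B-spline
# representation interpolate identically when constructed from the
# same knots and boundary conditions (both default to not-a-knot).
import numpy as np
from scipy.interpolate import CubicSpline, make_interp_spline

x = np.linspace(0.0, 2.0 * np.pi, 10)   # interpolation knots
y = np.sin(x)                            # sampled analytic function

cs = CubicSpline(x, y)                   # piecewise-polynomial form
bs = make_interp_spline(x, y, k=3)       # equivalent B-spline form

xf = np.linspace(0.0, 2.0 * np.pi, 1000)
print(np.max(np.abs(cs(xf) - bs(xf))))        # ~machine epsilon
# First derivatives agree too, which is what matters when spline
# gradients supply the biasing forces:
print(np.max(np.abs(cs(xf, 1) - bs(xf, 1))))
```

    The performance difference the dissertation measures comes from how the two forms are evaluated in higher dimensions, not from any difference in the interpolated values.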

    LIPIcs, Volume 274, ESA 2023, Complete Volume

    LIPIcs, Volume 274, ESA 2023, Complete Volume

    Development and implementation of a combined discrete and finite element multibody dynamics simulation environment

    Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2001. Includes bibliographical references (p. [195]-198) and index. Some engineering applications and physical phenomena involve multiple bodies that undergo large displacements involving collisions between the bodies. Considering the difficulties and cost associated with conducting physical experiments on such systems, there is a demand for numerical simulation capabilities. Discrete element methods (DEM) are numerical techniques that have been specifically developed to facilitate simulations of distinct bodies that interact with each other through contact forces. In DEM the simulated bodies are typically assumed to be infinitely rigid. However, there are multibody systems for which it is useful to take into account the deformability of the simulated bodies. The objective of this research is to incorporate deformability into DEM, enabling the evaluation of the stress and strain distributions within simulated bodies during simulation. To achieve this goal, an Updated Lagrangian (UL) Finite Element (FE) formulation and an explicit time integration scheme have been employed, together with some simplifying assumptions, to linearize this highly nonlinear contact problem and obtain solutions with realistic computational cost. An object-oriented, extendable computational tool has been built specifically to allow the simulation of multiple distinct bodies that interact through contact forces, with selected bodies allowed to be deformable. Database technology has also been utilized to efficiently handle the large volumes of computed results. by Petros Komodromos. Ph.D.
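    As a rough sketch of the kind of explicit time stepping DEM relies on (all parameter values below are illustrative assumptions, not taken from the thesis), two rigid disks interacting through a penalty contact force can be advanced with a symplectic Euler update:

```python
# Minimal DEM-style sketch: two rigid disks on a line collide through
# a penalty (spring) contact force, advanced by explicit symplectic
# Euler integration. All constants are illustrative, not from the thesis.
k = 1.0e4      # contact penalty stiffness
m = 1.0        # particle mass
r = 0.5        # particle radius
dt = 1.0e-4    # time step; must resolve sqrt(m/k) for stability

x = [0.0, 2.0]     # positions
v = [1.0, -1.0]    # velocities: disks approach head-on

for _ in range(20000):
    gap = (x[1] - x[0]) - 2.0 * r          # surface separation
    f = k * (-gap) if gap < 0.0 else 0.0   # repulsive force on overlap
    a = [-f / m, f / m]                    # equal and opposite
    # explicit update: velocities first, then positions
    v = [v[i] + dt * a[i] for i in range(2)]
    x = [x[i] + dt * v[i] for i in range(2)]

print(v)  # approximately [-1.0, 1.0]: an elastic bounce
```

    Replacing the rigid-body assumption with an FE discretization, as the thesis does, turns each disk into a deforming mesh but keeps the same explicit contact-force-then-integrate loop.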

    Analysis and design of multirate-multivariable sampled data systems

    Imperial Users only.

    Computational Algorithms for Predicting Membrane Protein Assembly From Angstrom to Micron Scale

    Biological barriers in the human body are among the most crucial interfaces perfected through evolution for diverse and unique functions. Of this wide range of barriers, the paracellular protein interfaces of epithelial and endothelial cells, called tight junctions, with their high molecular specificities, are vital for homeostasis and the maintenance of proper health. While the breakdown of these barriers is associated with serious pathological consequences, their intact presence also poses a challenge to the effective delivery of therapeutic drugs. Complementing a rigorous combination of in vitro and in vivo approaches to establishing the fundamental biological construct, and in addition to elucidating pathological implications and pharmaceutical interests, a systematic in silico approach is undertaken in this work in order to complete the molecular puzzle of the tight junctions. This work presents a bottom-up approach: protein interactions with Angstrom-level detail are integrated systematically, based on the principles of statistical thermodynamics and probability and designed using well-structured computational algorithms, up to the micron-level molecular architecture of tight junctions, forming a robust prediction with molecular details spanning up to four orders of magnitude in length scale. This work is intended to bridge the gap between computational nano-scale studies and experimental micron-scale observations, and to provide a molecular explanation for cellular behaviours in the maintenance of these barriers and the adverse consequences of their breakdown. Furthermore, a comprehensive understanding of tight junctions should enable the development of safe strategies for enhanced delivery of therapeutics.

    Evolutionary Computation

    This book presents several recent advances in Evolutionary Computation, especially evolution-based optimization methods and hybrid algorithms for several applications, from optimization and learning to pattern recognition and bioinformatics. The book also presents new algorithms based on several analogies and metaphors, one of which is drawn from philosophy, specifically the philosophy of praxis and dialectics. Also presented are interesting applications in bioinformatics, especially the use of particle swarms to discover gene expression patterns in DNA microarrays. The book therefore features representative work in the field of evolutionary computation and the applied sciences. The intended audience is graduate and undergraduate students, researchers, and anyone who wishes to become familiar with the latest research in this field.
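    As a small illustration of the particle-swarm techniques the book surveys (the objective function, constants, and swarm size below are arbitrary choices, not taken from the book), a minimal particle swarm optimizer on a toy function might look like this:

```python
# Minimal particle-swarm-optimisation sketch in the spirit of the
# methods surveyed; the objective and all constants are illustrative.
import random

def sphere(x):                         # toy objective: minimum 0 at origin
    return sum(xi * xi for xi in x)

random.seed(1)
dim, n, steps = 2, 20, 200
w, c1, c2 = 0.7, 1.5, 1.5              # inertia and attraction weights

pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
vel = [[0.0] * dim for _ in range(n)]
pbest = [p[:] for p in pos]            # each particle's personal best
gbest = min(pbest, key=sphere)[:]      # swarm's global best

for _ in range(steps):
    for i in range(n):
        for d in range(dim):
            # pull each particle towards its own best and the swarm's best
            vel[i][d] = (w * vel[i][d]
                         + c1 * random.random() * (pbest[i][d] - pos[i][d])
                         + c2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if sphere(pos[i]) < sphere(pbest[i]):
            pbest[i] = pos[i][:]
            if sphere(pbest[i]) < sphere(gbest):
                gbest = pbest[i][:]

print(sphere(gbest))  # close to 0 after convergence
```

    In the bioinformatics application mentioned above, the same update rule would be applied, with the toy objective replaced by a fitness score over candidate gene expression patterns.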