
    Contractions, Removals and How to Certify 3-Connectivity in Linear Time

    It is well known, as an existence result, that every 3-connected graph G=(V,E) on more than 4 vertices admits a sequence of contractions and a sequence of removal operations down to K_4 such that every intermediate graph is 3-connected. We show that both sequences can be computed in optimal time, improving the previously best known running time of O(|V|^2) to O(|V|+|E|). This also settles the open question of finding a linear-time 3-connectivity test that is certifying, and it extends to a certifying 3-edge-connectivity test within the same time bound. The certificates used are easy to verify in time O(|E|). (Comment: preliminary version.)
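The paper's contribution is the O(|V|+|E|) algorithm; the property being certified can, however, be checked naively straight from the definition. The sketch below (plain Python, written for this summary and not taken from the paper) tests 3-connectivity by deleting every vertex pair, which is far from linear time but shows what a certificate must rule out.

```python
from itertools import combinations

def is_connected(vertices, edges):
    """BFS/DFS connectivity check on an undirected graph."""
    vertices = set(vertices)
    if not vertices:
        return True
    adj = {v: set() for v in vertices}
    for u, v in edges:
        if u in vertices and v in vertices:
            adj[u].add(v)
            adj[v].add(u)
    start = next(iter(vertices))
    seen, stack = {start}, [start]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return seen == vertices

def is_three_connected(vertices, edges):
    """Definitional O(|V|^2 (|V|+|E|)) test: a graph on more than 3
    vertices is 3-connected iff deleting any pair of vertices leaves
    a connected graph."""
    if len(vertices) <= 3:
        return False
    for pair in combinations(vertices, 2):
        if not is_connected(set(vertices) - set(pair), edges):
            return False
    return True

# K_4, the endpoint of the contraction/removal sequences in the paper,
# is the smallest 3-connected graph.
k4_vertices = [0, 1, 2, 3]
k4_edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
```

A 5-cycle, by contrast, is disconnected by removing two non-adjacent vertices, so the test rejects it.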

    Compound Logics for Modification Problems

    We introduce a novel model-theoretic framework inspired by graph modification and based on the interplay between model theory and algorithmic graph minors. The core of our framework is a new compound logic operating with two types of sentences that together express a graph modification: the modulator sentence, defining some property of the modified part of the graph, and the target sentence, defining some property of the resulting graph. In our framework, modulator sentences are in counting monadic second-order logic (CMSOL) and have models of bounded treewidth, while target sentences express first-order logic (FOL) properties along with minor-exclusion. Our logic captures problems that are not definable in first-order logic and, moreover, may have instances of unbounded treewidth. It also permits the modeling of wide families of problems involving vertex/edge removals, alternative modulator measures (such as elimination distance or 𝒱-treewidth), multistage modifications, and various cut problems. Our main result is that model-checking for this compound logic can be done in quadratic time. All derived algorithms are constructive, which, as a byproduct, extends the constructibility horizon of the algorithmic applications of the Graph Minors theorem of Robertson and Seymour. The proposed logic can be seen as a general framework to capitalize on the potential of the irrelevant-vertex technique: it gives a way to deal with problem instances of unbounded treewidth, for which Courcelle's theorem does not apply. The proof of our meta-theorem combines novel combinatorial results related to the Flat Wall theorem with elements of the proofs of Courcelle's theorem and Gaifman's theorem. We finally prove extensions where the target property is expressible in FOL+DP, i.e., the enhancement of FOL with disjoint-paths predicates.

    On efficiency and reliability in computer science

    Efficiency of algorithms, and robustness against mistakes in their implementation or uncertainties in their input, have always been of central interest in computer science. This thesis presents results for a number of problems related to this topic. Certifying algorithms enable reliable implementations by providing a certificate along with their answer. A simple program can check the answer using the certificate; if the checker accepts, the answer of the complex program is correct, so the user only has to trust the simple checker. We present a novel certifying algorithm for 3-edge-connectivity as well as a simplified certifying algorithm for 3-vertex-connectivity. Occasionally storing the state of a computation, so-called checkpointing, also helps with reliability, since we can recover from errors without having to restart the computation. In this thesis we show how to do checkpointing with bounded memory and present several strategies to minimize the worst-case recomputation. In theory, the input for a problem is accurate and well-defined; in practice, it often contains uncertainties, necessitating robust solutions. We consider a robust variant of the well-known k-median problem in which the clients are grouped into sets and we want to minimize the connection cost of the most expensive group. The resulting solution is robust with respect to which group actually needs to be served. We show that this problem is hard to approximate, even on the line, and evaluate heuristic solutions.
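To illustrate checkpointing with bounded memory, the sketch below keeps at most `capacity` stored states and, when full, greedily evicts the checkpoint whose removal minimizes the largest gap between consecutive checkpoints, since a failure forces recomputation from the last checkpoint before it. This greedy eviction rule is an assumption made for illustration, not one of the strategies analyzed in the thesis.

```python
def worst_gap(points):
    """Largest distance between consecutive checkpoints: a failure just
    before a checkpoint forces recomputing the whole preceding gap."""
    return max(b - a for a, b in zip(points, points[1:]))

def place_checkpoints(total_steps, capacity):
    """Online checkpoint placement with at most `capacity` stored states
    (capacity >= 2: the start and the newest step are always kept).
    Returns the sorted list of steps still checkpointed at the end."""
    stored = [0]
    for step in range(1, total_steps):
        stored.append(step)
        if len(stored) > capacity:
            # Greedy eviction: drop the interior checkpoint whose removal
            # minimizes the resulting worst-case gap.
            stored = min(
                (stored[:i] + stored[i + 1:] for i in range(1, len(stored) - 1)),
                key=worst_gap,
            )
    return stored
```

For example, `place_checkpoints(8, 3)` keeps the start, the most recent step, and one checkpoint in between, spreading out the worst-case recomputation.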

    Effondrements et homologie persistante (Collapses and Persistent Homology)

    In this thesis, we introduce two new approaches to compute the persistent homology (PH) of a sequence of simplicial complexes. The basic idea is to simplify the complexes of the input sequence by using special types of collapses (strong collapse and edge collapse) and to compute the PH of an induced sequence of smaller size that has the same PH as the initial one. Our first approach uses the strong collapse introduced by J. Barmak and E. Minian [DCG (2012)]. A strong collapse removes special vertices, called dominated vertices, from a simplicial complex. Our approach based on strong collapse has several salient features that distinguish it from previous work. It is not limited to filtrations (i.e. sequences of nested simplicial subcomplexes) but works for other types of sequences, such as towers and zigzags. To strong collapse a simplicial complex, we only need to store the maximal simplices of the complex, not the full set of all its simplices, which saves a lot of space and time. Moreover, the complexes in the sequence can be strong collapsed independently and in parallel. In the case of flag complexes, strong collapse can be performed on the 1-skeleton of the complex, and the resulting complex is also a flag complex. We show that if we restrict the class of simplicial complexes to flag complexes, we can achieve decisive improvements in time and space complexity with respect to previous work. When we strong collapse the complexes in a flag tower, we obtain a reduced sequence that is also a flag tower, which we call the core flag tower. We then convert the core flag tower to an equivalent filtration to compute its PH; here again, we only use the 1-skeletons of the complexes. The resulting method is simple and extremely efficient.
We extend the notion of dominated vertex to simplices of any dimension. Domination of edges turns out to be very powerful, and we study it in more detail in the case of flag complexes. We show that edge collapse (the removal of dominated edges) in a flag complex can likewise be performed using only the 1-skeleton of the complex, and that the residual complex is again a flag complex. Next we show that, as in the case of strong collapses, we can use edge collapses to reduce a flag filtration F to a smaller flag filtration F^c with the same persistence, again using only the 1-skeletons of the complexes. As demonstrated by numerous experiments on publicly available data sets, our approaches are extremely fast and memory-efficient in practice; in particular, the method using edge collapse performs best among all known methods, including the strong collapse approach. Finally, we can trade precision for computation time by choosing the number of simplicial complexes of the sequence that we strong collapse.
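For flag complexes, domination can indeed be read off the 1-skeleton: a vertex v is dominated by another vertex v' exactly when the closed neighborhood N[v] is contained in N[v']. The sketch below (plain Python, written for this summary, not the thesis's implementation) finds all dominated vertices of a flag complex from its graph; these are precisely the vertices a strong collapse may remove.

```python
def closed_neighborhoods(n, edges):
    """Closed neighborhood N[v] = {v} union neighbors(v) for each vertex
    0..n-1 of the 1-skeleton; for a flag complex these sets determine
    all vertex-domination relations."""
    nbr = {v: {v} for v in range(n)}
    for u, v in edges:
        nbr[u].add(v)
        nbr[v].add(u)
    return nbr

def dominated_vertices(n, edges):
    """A vertex v of a flag complex is dominated by some v' != v exactly
    when N[v] is a subset of N[v']; note any such v' must lie in N[v]."""
    nbr = closed_neighborhoods(n, edges)
    dominated = set()
    for v in range(n):
        for w in nbr[v]:
            if w != v and nbr[v] <= nbr[w]:
                dominated.add(v)
                break
    return dominated
```

On a triangle (a 2-simplex as a flag complex) every vertex is dominated, while on a 4-cycle, whose flag complex is the cycle itself, no vertex is dominated, matching the fact that the cycle carries nontrivial homology.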

    Restoring and valuing global kelp forest ecosystems

    Kelp forests cover ~30% of the world’s coastline and are the largest biogenic marine habitat on Earth. Across their distribution, kelp forests are essential for the healthy functioning of marine ecosystems and consequently underpin many of the benefits coastal societies receive from the ocean. Concurrently, rising sea temperatures, overgrazing by marine herbivores, sedimentation, and water pollution have caused kelp forest populations to decline in most regions across the world. Effectively managing the response to these declines will be pivotal to maintaining healthy marine ecosystems and ensuring the benefits they provide are equitably distributed to coastal societies. In Chapter 1, I review how the marine management paradigm has shifted from protection to restoration, as well as the consequences of this shift. Chapter 2 introduces the field of kelp forest restoration and provides a quantitative and qualitative review of 300 years of kelp forest restoration, exploring the genesis of restoration efforts, the lessons we have learned about restoration, and how we can develop the field for the future. Chapter 3 is a direct answer to a question faced while completing Chapter 2: it details the need for a standardized marine restoration reporting framework, the benefits such a framework would provide, the challenges presented by creating one, and the solutions to these problems. Similarly, Chapter 4 is a response to the gaps discovered in Chapter 2, exploring how we can use naturally occurring positive species interactions and synergies with human activities not only to increase the benefits from ecosystem restoration but also to increase the probability that restoration is successful. The decision whether to restore an ecosystem is informed by the values and priorities of the society living in or managing that ecosystem.
    Chapter 5 quantifies the fisheries production, nutrient cycling, and carbon sequestration potential of five key genera of globally distributed kelp forests. I conclude the thesis by reviewing the lessons learned and the steps required to advance the field of kelp forest restoration and conservation.

    LIPIcs, Volume 261, ICALP 2023, Complete Volume

    LIPIcs, Volume 261, ICALP 2023, Complete Volume

    Diversity and risk patterns of freshwater megafauna: A global perspective

    Freshwaters are amongst the most diverse and dynamic ecosystems globally and provide vital ecosystem services for human well-being. At the same time, they are subject to intense and increasing threats due to the rapid growth of the human population and the subsequent rise in demand for energy and food. However, freshwaters remain underrepresented in both biodiversity research and conservation actions. Consequently, populations of vertebrates in freshwaters declined by 83% from 1970 to 2014, a rate of decline much higher than that recorded in either terrestrial or marine ecosystems. In addition, one third of all classified freshwater species are threatened with extinction according to the International Union for Conservation of Nature Red List of Threatened Species (IUCN Red List). Freshwater megafauna, i.e. freshwater animals ≄ 30 kg, are particularly susceptible to extinction owing to intrinsic characteristics such as large habitat requirements, long lifespan, and late maturity. Despite the fact that many freshwater megafauna species such as sturgeons, river dolphins, crocodilians and giant turtles are teetering on the edge of extinction, a synthesis of global freshwater megafauna is lacking. In particular, changes in population abundance and distribution ranges of freshwater megafauna at large scales (e.g. continental and global scales) remain unclear. In addition, the relationships between the extinction risks of freshwater megafauna species and their life-history traits, and how human threats affect such relationships, are as yet insufficiently explored. This thesis aims to gain a comprehensive picture of global freshwater megafauna, with emphasis on their distribution, conservation status, main threats, population trends, and extinction risks. The body-mass threshold of 30 kg was chosen to include most of the large freshwater animals with the potential of acting as flagship or umbrella species.
Based on this definition, I compiled a list of 207 extant freshwater megafauna species (i.e. 130 fishes, 44 reptiles, 31 mammals and 2 amphibians) and established a freshwater-megafauna database containing information on their distribution, life-history traits, population change, conservation status and the intensity of human threats within their distribution ranges. I found that freshwater megafauna are threatened globally, with 54% of all classified species considered threatened on the IUCN Red List. There are intense and growing anthropogenic threats in many diversity hotspots of freshwater megafauna such as the Amazon, Congo, Mekong and Ganges-Brahmaputra river basins. The main threats to freshwater megafauna include overexploitation, dam construction, habitat degradation, pollution and biological invasions. These threats can cause reduced fitness, disrupted reproduction and increased mortality of freshwater megafauna, leading to population decline and range contraction. Indeed, global populations of freshwater megafauna declined by 88% from 1970 to 2012; decline rates of populations in the Indomalayan (-99%) and Palearctic (-97%) realms, and in mega-fish (-94%), were even higher. In addition, the distribution ranges of 42% of all freshwater megafauna species in Europe contracted by more than 40% of their historical areas. I found that the extinction risk of freshwater megafauna is jointly determined by external threats and by traits associated with species’ recovery potential (i.e. lifespan, age at maturity, and fecundity). This underscores the importance of maintaining species’ recovery potential, particularly for those freshwater megafauna species with the smallest population sizes. On the basis of such relationships, 16 out of 49 unclassified freshwater megafauna species were predicted to be threatened. This thesis emphasizes the critical plight of freshwater megafauna globally.
The loss of freshwater megafauna will have, and most likely already has had, profound impacts on other species and ecological processes in freshwaters and surrounding ecosystems. The thesis also highlights large gaps in life-history data, monitoring and conservation actions for the world’s largest freshwater animals, reflecting a currently poorly recognized global need: the conservation of freshwater biodiversity. This calls for more research to gain a comprehensive understanding of these large animals, and for more science communication and outreach to inform the public and policymakers of the crisis in freshwater biodiversity and to engage them in freshwater conservation. Based on the results of this thesis, freshwater megafauna can indicate the ecological integrity of the ecosystems they inhabit and hold the potential to act as flagship and umbrella species. A megafauna-based approach could be a promising strategy to promote freshwater biodiversity conservation benefiting a broad range of co-occurring species. This should be considered when developing conservation strategies and establishing protected areas to halt biodiversity loss in freshwaters.