
    Attacking Shortest Paths by Cutting Edges

    Identifying shortest paths between nodes in a network is a common graph analysis problem that is important for many applications involving routing of resources. An adversary that can manipulate the graph structure could alter traffic patterns to gain some benefit (e.g., make more money by directing traffic to a toll road). This paper presents the Force Path Cut problem, in which an adversary removes edges from a graph to make a particular path the shortest between its terminal nodes. We prove that this problem is APX-hard, but introduce PATHATTACK, a polynomial-time approximation algorithm that guarantees a solution within a logarithmic factor of the optimal value. In addition, we introduce the Force Edge Cut and Force Node Cut problems, in which the adversary targets a particular edge or node, respectively, rather than an entire path. We derive a nonconvex optimization formulation for these problems and develop a heuristic algorithm that uses PATHATTACK as a subroutine. We demonstrate all of these algorithms on a diverse set of real and synthetic networks, illustrating the network types that benefit most from the proposed algorithms.
    Comment: 37 pages, 11 figures; extended version of arXiv:2104.0376
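    The Force Path Cut setup lends itself to a simple baseline: repeatedly cut an edge from whichever competing path is currently at least as short as the target path. The sketch below is a minimal greedy illustration of that idea using networkx, not the paper's PATHATTACK algorithm (which carries the logarithmic approximation guarantee); the function name and the rule of cutting the cheapest competing edge are assumptions for illustration.

```python
# Minimal greedy sketch of the Force Path Cut setting (illustrative only;
# this is NOT the paper's PATHATTACK algorithm and carries no approximation
# guarantee). Assumes a weighted nx.DiGraph and ignores ties in path length.
import networkx as nx

def greedy_force_path_cut(G, target_path, weight="weight"):
    """Remove edges from G until target_path is returned as the shortest
    path between its endpoints; returns the set of removed edges."""
    target_edges = set(zip(target_path, target_path[1:]))
    s, t = target_path[0], target_path[-1]
    removed = set()
    while True:
        competitor = nx.shortest_path(G, s, t, weight=weight)
        if competitor == target_path:
            return removed
        # Cut the cheapest edge of the competing path that the
        # target path does not itself use; the target path is never
        # touched, so s and t stay connected.
        candidates = set(zip(competitor, competitor[1:])) - target_edges
        u, v = min(candidates, key=lambda e: G.edges[e].get(weight, 1))
        G.remove_edge(u, v)
        removed.add((u, v))
```

    In the toll-road example above, one would pass the path through the toll road as target_path and inspect the returned edge set as the attack budget actually spent.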

    Concentration of Measure Inequalities in Information Theory, Communications and Coding (Second Edition)

    During the last two decades, concentration inequalities have been the subject of exciting developments in various areas, including convex geometry, functional analysis, statistical physics, high-dimensional statistics, pure and applied probability theory, information theory, theoretical computer science, and learning theory. This monograph focuses on some of the key modern mathematical tools that are used for the derivation of concentration inequalities, on their links to information theory, and on their various applications to communications and coding. In addition to being a survey, this monograph includes several recent results derived by the authors. The first part of the monograph introduces classical concentration inequalities for martingales, as well as some recent refinements and extensions. The power and versatility of the martingale approach is exemplified in the context of codes defined on graphs and iterative decoding algorithms, as well as codes for wireless communication. The second part of the monograph introduces the entropy method, an information-theoretic technique for deriving concentration inequalities. The basic ingredients of the entropy method are discussed first in the context of logarithmic Sobolev inequalities, which underlie the so-called functional approach to concentration of measure, and then from a complementary information-theoretic viewpoint based on transportation-cost inequalities and probability in metric spaces. Some representative results on concentration for dependent random variables are briefly summarized, with emphasis on their connections to the entropy method. Finally, we discuss several applications of the entropy method to problems in communications and coding, including strong converses, empirical distributions of good channel codes, and an information-theoretic converse for concentration of measure.
    Comment: Foundations and Trends in Communications and Information Theory, vol. 10, no. 1-2, pp. 1-248, 2013. Second edition was published in October 2014. ISBN of the printed book: 978-1-60198-906-
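    As a concrete reference point for the martingale approach surveyed in the first part, the classical Azuma-Hoeffding inequality bounds the deviation of a martingale with bounded differences; the standard textbook form is stated below (the notation is generic, not tied to the monograph's).

```latex
% Azuma-Hoeffding: a representative martingale concentration inequality.
% If $\{X_k\}_{k=0}^{n}$ is a martingale with $|X_k - X_{k-1}| \le d_k$
% almost surely for every $k$, then for all $r > 0$,
\[
  \Pr\bigl( |X_n - X_0| \ge r \bigr)
    \le 2 \exp\!\left( -\frac{r^2}{2 \sum_{k=1}^{n} d_k^2} \right).
\]
```

    McDiarmid's bounded-differences inequality follows from this bound applied to the Doob martingale of a function of independent variables, and refinements of exactly this type are a typical entry point to the extensions the monograph discusses.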

    Structural Agnostic Modeling: Adversarial Learning of Causal Graphs

    A new causal discovery method, Structural Agnostic Modeling (SAM), is presented in this paper. Leveraging both conditional independencies and distributional asymmetries in the data, SAM aims at recovering full causal models from continuous observational data in a multivariate non-parametric setting. The approach is based on a game between d players, each estimating the distribution of one variable conditionally on the others with a neural network, and an adversary aimed at discriminating between the overall joint conditional distribution and that of the original data. An original learning criterion combining distribution estimation, sparsity, and acyclicity constraints is used to enforce end-to-end optimization of the graph structure and parameters through stochastic gradient descent. Besides a theoretical analysis of the approach in the large-sample limit, SAM is extensively validated experimentally on synthetic and real data.
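    The game described above can be sketched compactly: d conditional generators share a soft adjacency matrix, and a discriminator tries to tell regenerated samples from real ones. The PyTorch sketch below is a simplified illustration under assumed names and hyperparameters; in particular, the NOTEARS-style acyclicity penalty, the loss weights, and the omission of the generators' noise inputs are stand-ins, not the paper's exact criterion.

```python
# Illustrative simplification of an adversarial causal-graph game
# (not SAM's exact architecture or losses; noise inputs omitted).
import torch
import torch.nn as nn

d = 5  # number of observed variables

class Generators(nn.Module):
    """One conditional generator per variable, gated by a soft adjacency."""
    def __init__(self, d, hidden=32):
        super().__init__()
        self.adj_logits = nn.Parameter(torch.zeros(d, d))  # candidate edges
        self.nets = nn.ModuleList(
            nn.Sequential(nn.Linear(d, hidden), nn.ReLU(), nn.Linear(hidden, 1))
            for _ in range(d)
        )

    def forward(self, x):
        n = self.adj_logits.shape[0]
        mask = torch.sigmoid(self.adj_logits) * (1 - torch.eye(n))  # no self-loops
        # Each variable j is regenerated from the masked other variables.
        cols = [net(x * mask[:, j]) for j, net in enumerate(self.nets)]
        return torch.cat(cols, dim=1), mask

gen = Generators(d)
disc = nn.Sequential(nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, 1))
opt_g = torch.optim.Adam(gen.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

x = torch.randn(256, d)  # stand-in for real observational data
ones, zeros = torch.ones(256, 1), torch.zeros(256, 1)
for step in range(500):
    # Discriminator step: real samples vs regenerated (fake) samples.
    fake, _ = gen(x)
    d_loss = bce(disc(x), ones) + bce(disc(fake.detach()), zeros)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator step: fool the discriminator + sparsity + acyclicity.
    fake, mask = gen(x)
    acyc = torch.trace(torch.matrix_exp(mask * mask)) - d  # NOTEARS-style
    g_loss = bce(disc(fake), ones) + 0.1 * mask.abs().sum() + 10.0 * acyc
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

    After training, thresholding the learned mask yields a candidate causal graph; the sparsity term prunes spurious edges while the acyclicity penalty pushes the adjacency toward a DAG.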

    Assessment of the Residual Load-Bearing Capacity of Structures Using Fuzzy Logic and Decision Theory

    Whereas the design of new structures is almost completely regulated by codes, there are no objective ways to evaluate existing facilities. Experts are often not familiar with the new tasks in system identification and try to retrieve at least some information from available documents. They therefore make compromises which, for many stakeholders, are not satisfactory. Consequently, this publication presents a more objective and more realistic method for condition assessment. The necessary basics for this task are fracture mechanics combined with computational analysis, methods and techniques for geometry recording and material investigation, ductility and energy dissipation, risk analysis, and the treatment of uncertainty. Current evaluation tools investigate how to conceptualize a structure analytically directly from given loads and the measured response. Since defects are not necessarily visible or directly detectable, several damage indices are combined and integrated into a model of the real system. Fuzzy sets are ideally suited to represent parametric (data) uncertainty as well as system and model uncertainty. Trapezoidal membership functions can represent the condition state of structural components as a function of damage extent or performance very well. The residual load-bearing capacity can be determined by successively performing analyses in three steps. The "screening assessment" is meant to eliminate the large majority of structures from detailed consideration and advise on immediate precautions to protect lives and high economic values; here, the defects have to be explicitly defined and located. If this is impossible, an "approximate evaluation" should follow, describing system geometry, material properties and failure modes in detail. Here, a fault tree helps investigate defects in a systematic way, avoiding random search or the neglect of important features or damage indices. The fault tree is deemed essential for conveying information about the structural system, not only because of its conceptual clarity but also because of its simplicity of application; it therefore represents an important prerequisite in condition assessment, though special circumstances might require "further investigations" to consider the actual material parameters and unaccounted reserves due to spatial or other secondary contributions. Here, uncertainties with respect to geometry, material, loading or modeling should in no case be neglected, but explicitly quantified. Postulating a limited set of expected failure modes is not always sufficient, since detectable signature changes are seldom directly attributable, and every defect might, together with other unforeseen situations, become decisive. A determination of all possible scenarios would therefore be required to consider every imaginable influence. Risk is produced by a combination of various, ill-defined failure modes. Due to the interaction of many variables there is no simple and reliable way to predict which failure mode is dominant. Risk evaluation therefore comprises the estimation of the prognostic factor with respect to undesirable events, the component importance, and the expected damage extent.

    Whereas the design of structures is generally regulated by codes, there are still no objective guidelines for the condition assessment of existing buildings. Many experts are not yet familiar with this new problem (system identification from loading and the resulting structural response) and therefore settle for compromise solutions. For many owners this is unsatisfactory, which is why a more objective and more realistic condition assessment is presented here. Important foundations for this are the theory of damage analysis, methods and techniques for geometry and material investigation, ductility and energy absorption, risk analysis, and the description of uncertainties. Since not all damage is obvious, current practice combines several condition indicators, processes the recorded data in a targeted way, and integrates them into a validated model before a final assessment. If deterministic verification methods are combined with probabilistic ones, only random errors can be minimized without difficulty; systematic errors due to imprecise modeling or vague knowledge remain. It is therefore unavoidable that decision-makers judge subjectively on the basis of uncertain, often even contradictory information. This work shows how structural members can be classified into quality classes by means of a three-stage assessment procedure. Their risk of failure follows from their mean damage extent, their structural importance I (which in turn depends on their significance and on the consequences of their damage), and their prognostic factor L. The risk of failure of the overall structure is determined from its topology. If the mean damage extent cannot be fixed unambiguously, or if the material, geometry or load data are vague, a mathematical procedure based on fuzzy logic is proposed within the scope of "further investigations". Even for complex cause-effect relationships, it filters out the dominant damage cause and prevents parameters afflicted with uncertainty from being mistaken for reliable absolute values. To compute the mean damage index, and from it the risk, the individual damage indices (depending on the failure mode) are assigned weighting factors according to their importance and are additionally divided by a factor gamma according to the type, importance and reliability of the information obtained. This constitutes a new procedure for the analysis of complex failure mechanisms that yields traceable conclusions.
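    The trapezoidal membership functions mentioned above map a measured damage extent to a graded condition state. The sketch below is a minimal illustration; the breakpoints and the example class "moderate damage" are assumed values, not figures from the thesis.

```python
# Minimal sketch of a trapezoidal fuzzy membership function as used for
# condition states; breakpoints a, b, c, d are illustrative assumptions.
def trapezoid(x, a, b, c, d):
    """Membership degree in [0, 1]: rises on [a, b], is 1 on [b, c], falls on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Degree to which a 35 % damage extent belongs to a hypothetical
# "moderate damage" class:
print(trapezoid(0.35, a=0.2, b=0.3, c=0.5, d=0.6))  # -> 1.0
```

    Overlapping trapezoids of this kind let a single damage measurement belong to neighboring condition classes to different degrees, which is exactly what the three-stage assessment exploits when the data are vague.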