56 research outputs found

    Knowledge base ontological debugging guided by linguistic evidence

    As they grow in size, knowledge bases (KBs) tend to include sets of axioms which are intuitively absurd but nonetheless logically consistent. This is particularly true of data expressed in OWL, as part of the Semantic Web framework, which favors the aggregation of sets of statements from multiple sources of knowledge, with overlapping signatures. Identifying nonsense is essential if one wants to avoid undesired inferences, but the sparse usage of negation within these datasets generally prevents the detection of such cases on a strictly logical basis. And even if the KB is inconsistent, identifying the axioms responsible for the nonsense remains a non-trivial task. This thesis investigates the use of automatically gathered linguistic evidence to detect and repair violations of common sense within such datasets. The main intuition consists in exploiting distributional similarity between named individuals of an input KB in order to identify consequences which are unlikely to hold if the rest of the KB does. The repair phase then consists in selecting axioms to be preferably discarded (or at least amended) in order to get rid of the nonsense. A second strategy is also presented, which consists in strengthening the input KB with a foundational ontology in order to obtain an inconsistency, before performing a form of knowledge base debugging/revision which incorporates this linguistic input. This last step may also be applied directly to an inconsistent input KB. These propositions are evaluated on different sets of statements issued from the Linked Open Data cloud, as well as on datasets of higher quality which were automatically degraded for the evaluation. The results suggest that distributional evidence may indeed constitute a relevant common ground for deciding between conflicting axioms.
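    The core intuition lends itself to a small illustration. The following is a minimal sketch, not the thesis's actual algorithm: it assumes hypothetical pre-computed distributional vectors for named individuals and flags, among individuals asserted to share a class, those whose average similarity to the rest is low, marking the corresponding membership axioms as candidates for repair. The vectors, names, and threshold are all illustrative assumptions.

        import numpy as np

        def cosine(u, v):
            # Cosine similarity between two distributional vectors.
            return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

        def flag_suspicious_members(members, vectors, threshold=0.2):
            # For individuals asserted to share a class, flag those whose
            # mean similarity to the other members is low: their membership
            # axioms are the ones to preferably discard or amend.
            suspicious = []
            for x in members:
                others = [m for m in members if m != x]
                mean_sim = sum(cosine(vectors[x], vectors[m]) for m in others) / len(others)
                if mean_sim < threshold:
                    suspicious.append((x, mean_sim))
            return suspicious

        # Toy usage with made-up 3-dimensional vectors for members of a class :City.
        vecs = {
            "Paris":     np.array([0.9, 0.1, 0.0]),
            "Lyon":      np.array([0.8, 0.2, 0.1]),
            "Aristotle": np.array([0.0, 0.1, 0.9]),  # intuitively absurd member
        }
        print(flag_suspicious_members(["Paris", "Lyon", "Aristotle"], vecs))
        # -> [('Aristotle', ...)]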

    System- and Data-Driven Methods and Algorithms

    The increasing complexity of models used to predict real-world systems leads to a need for algorithms that replace complex models with far simpler ones while preserving the accuracy of the predictions. This three-volume handbook covers methods as well as applications. This first volume focuses on real-time control theory, data assimilation, real-time visualization, high-dimensional state spaces and the interaction of different reduction techniques.
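    As a concrete instance of the kind of algorithm the handbook covers, the following minimal sketch shows proper orthogonal decomposition (POD) via truncated SVD, one representative reduction technique; the snapshot matrix, its size, and the basis dimension are illustrative assumptions, not specifics from the handbook.

        import numpy as np

        # Snapshot matrix: each column is one full-order state of the system
        # (here synthetic low-rank data standing in for simulation output).
        n_dof, n_snap = 1000, 50
        rng = np.random.default_rng(0)
        X = rng.standard_normal((n_dof, 5)) @ rng.standard_normal((5, n_snap))

        # POD: the leading left singular vectors of X form an orthonormal
        # reduced basis capturing most of the snapshot energy.
        U, s, _ = np.linalg.svd(X, full_matrices=False)
        r = 5                      # basis size, chosen from the decay of s
        V = U[:, :r]               # reduced basis, shape (n_dof, r)

        # A full-order state x is replaced by r reduced coordinates V.T @ x
        # and reconstructed as V @ (V.T @ x).
        x = X[:, 0]
        print("relative error:",
              np.linalg.norm(x - V @ (V.T @ x)) / np.linalg.norm(x))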

    A New Rigid Block Model for Masonry Structures

    This dissertation presents a new rigid block model for the analysis of masonry elements and structures. In this work a masonry structure is modelled as a continuum composed of Normal Rigid No-Tension (NRNT) material. The NRNT material extends Heyman's model material to 2d/3d continua. The material is rigid in compression, but extensional deformations are allowed at zero energy price and can be either regular or singular; extensional deformation can therefore appear as either diffuse (smeared cracks) or concentrated (macroscopic cracks), and on energy grounds there is no reason to prefer one over the other. The fact that rigid block deformation seems to be the preferred failure mode of real masonry structures stems from mechanical characteristics, such as toughness, interlocking, finite friction and cohesion, that are not inherent to the simplified NRNT continuum model. It is therefore interesting to see whether rigid block mechanisms can arise naturally in solving the equilibrium problem, and whether there is any legitimate way to force rigid block mechanisms over diffuse cracking. The boundary value problem for a continuum composed of NRNT material is formulated as an energy minimum search, and two numerical methods for approximating the solution are proposed. With the PR method the energy is minimized over the set of piecewise-rigid (PR) displacements. With the C0 method the search for the minimum is restricted to continuous (C0) displacement fields by adopting a classical Finite Element (FE) approximation. The C0 solution, though more cumbersome from the numerical point of view, appears more adaptable than the rigid block approximation in capturing fracture lines that are slanted, that is, far from the skeleton of the mesh. This sluggishness of the rigid block approximation in reproducing "slanted" cracks (cracks not developing along the boundaries of the rigid elements) is one of the main critical issues inherent to the piecewise-rigid (PR) approximation. A further reason to adopt the C0 approximation is to see whether it can be used, in combination with the PR approximation, as a preliminary analysis suggesting the optimal rigid block partition. After the two methods are introduced, some benchmark problems are analysed to illustrate the numerical performance of the two approaches and the "pro et contra" of these two opposed strategies. The effectiveness of the rigid block approach combined with energy minimization is also tested on benchmark problems such as the arch subject to given settlements, Couplet's problem, and the determination of the collapse load multiplier of horizontal forces. A number of case studies concerning real masonry structures are also presented to illustrate the effectiveness of the proposed method.
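    In schematic form, and with notation assumed here rather than taken from the dissertation, the energy minimum formulation for a body Ω of NRNT material reads:

        \[
          u^{\ast} \in \arg\min_{u \in \mathcal{K}} \; \mathcal{E}(u),
          \qquad
          \mathcal{E}(u) = -\int_{\Omega} b \cdot u \, dV - \int_{\partial_N \Omega} s \cdot u \, dA ,
        \]
        \[
          \mathcal{K} = \Bigl\{\, u \;:\; E(u) = \tfrac{1}{2}\bigl(\nabla u + \nabla u^{T}\bigr) \succeq 0 , \quad u = \bar{u} \ \text{on } \partial_D \Omega \,\Bigr\} .
        \]

    Since the material is rigid in compression and extensional deformations cost zero energy, no internal energy term appears: one minimizes the potential of the loads over displacements whose (possibly singular) strain is positive semidefinite. The PR method restricts this minimization to piecewise-rigid fields, the C0 method to continuous FE fields.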

    Ontological Investigations in the Quantum Domain: A deflationary approach on ontology of physics

    The aim of this thesis is to propose a deflationary approach to the ontological analysis of physical theories. Such an approach holds that the development of ontologies for physical theories must be neutral with respect to the debate between realists and anti-realists in the philosophy of physics. Our attention is mainly directed towards what we call the "quantum domain", which includes non-relativistic Quantum Mechanics and variants of Quantum Field Theory. This meta-ontological approach is an attempt to provide a methodology for the development of specific ontologies for physical theories. To this end we suggest a separation between physical phenomena and physical theories on the one hand, and between physical theories and ontologies on the other. This separation is intended to endorse the idea that physical theories can provide all the informative content required for the development of the ontologies associated with them. The result is a defense of the idea that physical theories can contribute positively to the constitution of ontologies in physics. Additionally, we present a critical analysis of four different ontological approaches to QM and QFT, each dealing with a different ontological category (tropes, events and processes). At the end of the thesis we analyze the concepts of interaction and commutation in QFT in order to exemplify the main lines of the deflationary approach suggested here.

    Geometric complexity theory, tensor rank, and Littlewood-Richardson coefficients

    We provide a thorough introduction to Geometric Complexity Theory, an approach towards computational complexity lower bounds via methods from algebraic geometry and representation theory. Then we focus on the relevant representation-theoretic multiplicities, namely plethysm coefficients, Kronecker coefficients, and Littlewood-Richardson coefficients. These multiplicities can be described as dimensions of highest weight vector spaces, for which explicit bases are known only in the Littlewood-Richardson case. By explicit construction of highest weight vectors we show that the border rank of m x m matrix multiplication is at least 3m^2 - 2 and that the border rank of 2 x 2 matrix multiplication is exactly seven. The latter gives a new proof of a result by Landsberg (J. Amer. Math. Soc., 19:447-459, 2005). Moreover, we obtain new nonvanishing results for rectangular Kronecker coefficients, and we prove a conjecture by Weintraub (J. Algebra, 129(1):103-114, 1990) about the nonvanishing of plethysm coefficients of even partitions. Our in-depth study of Littlewood-Richardson coefficients c_
    Date of defense: 18.10.2012. Paderborn, Univ., Diss., 2012.
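    The two border rank statements can be written out explicitly; the definition of the matrix multiplication tensor below is the standard one and is assumed here rather than quoted from the thesis.

        \[
          \langle m,m,m \rangle \;=\; \sum_{i,j,k=1}^{m} e_{ij} \otimes e_{jk} \otimes e_{ki}
          \;\in\; \mathbb{C}^{m^{2}} \otimes \mathbb{C}^{m^{2}} \otimes \mathbb{C}^{m^{2}} ,
        \]
        \[
          \underline{R}\bigl(\langle m,m,m \rangle\bigr) \;\ge\; 3m^{2} - 2 ,
          \qquad
          \underline{R}\bigl(\langle 2,2,2 \rangle\bigr) \;=\; 7 ,
        \]

    where the border rank \underline{R}(T) is the smallest r such that T is a limit of tensors of rank at most r.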

    Applications


    Model Order Reduction

    The increasing complexity of models used to predict real-world systems leads to a need for algorithms that replace complex models with far simpler ones while preserving the accuracy of the predictions. This three-volume handbook covers methods as well as applications. This third volume focuses on applications in engineering, biomedical engineering, computational physics and computer science.