
    Homology-based Distributed Coverage Hole Detection in Wireless Sensor Networks

    Homology theory provides new and powerful solutions to the coverage problems of wireless sensor networks (WSNs). These solutions are based on algebraic objects such as the Čech complex and the Rips complex. The Čech complex gives accurate information about coverage quality but requires precise knowledge of the relative locations of the nodes, an assumption that is rather strong and hard to satisfy in practical deployments. The Rips complex approximates the Čech complex; it is easier to build and requires no knowledge of node locations. This simplicity comes at the expense of accuracy: the Rips complex cannot always detect all coverage holes, so its accuracy must be evaluated. This work proposes the proportion of the area of undiscovered coverage holes as the performance criterion. Analysis shows that this proportion depends on the ratio between the communication and sensing radii of a sensor. Closed-form expressions for lower and upper bounds on the accuracy are also derived. For those coverage holes that the Rips complex can discover, a homology-based distributed detection algorithm is proposed. Simulation results are consistent with the analytical lower bound, with a maximum difference of 0.5%; the upper-bound performance depends on the ratio of the communication and sensing radii. Simulations also show that the algorithm localizes about 99% of coverage holes in about 99% of cases.
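
    The following sketch illustrates the Rips-complex construction the abstract refers to: a simplex is included whenever all pairwise node distances are within the communication radius. It is a minimal illustration under assumed names and inputs, not the paper's distributed algorithm.

    ```python
    # Minimal sketch: Rips complex (up to triangles) of a WSN built from
    # node positions and a communication radius r_c. Illustrative only.
    from itertools import combinations
    import math

    def rips_complex(nodes, r_c):
        """nodes: list of (x, y) positions; returns (vertices, edges, triangles)."""
        n = len(nodes)
        edges = [(i, j) for i, j in combinations(range(n), 2)
                 if math.dist(nodes[i], nodes[j]) <= r_c]
        edge_set = set(edges)
        # A triangle is included exactly when all three pairwise edges exist
        triangles = [(i, j, k) for i, j, k in combinations(range(n), 3)
                     if {(i, j), (i, k), (j, k)} <= edge_set]
        return list(range(n)), edges, triangles

    # A coverage hole shows up as a cycle of edges not filled in by triangles;
    # detecting such non-bounding cycles is what homology-based methods do.
    ```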

    Formally Verified Tableau-Based Reasoners for a Description Logic

    Description Logics are a family of logics used to represent and reason about conceptual and terminological knowledge. One of the most basic description logics is ALC, which serves as a basis from which others are obtained. Description logics are particularly important in providing a logical basis for the web ontology languages (such as OWL) used in the Semantic Web. To increase the reliability of the Semantic Web, formal methods can be applied; in particular, its reasoning services can be formally verified. In this paper, we present the formal verification of a tableau-based satisfiability algorithm for the logic ALC. The verification has been completed in several stages. First, we develop an abstract formalization of satisfiability checking of ALC-concepts. Second, we define and formally verify a tableau-based algorithm in which the order of rule application and branch selection can be flexibly specified, using a methodology of refinements to transfer the main properties from the abstract ALC formalization. Finally, we obtain verified and executable reasoners from the algorithm via a process of instantiation. (Ministerio de Ciencia e Innovación TIN2009-09492; Junta de Andalucía TIC-0606)
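
    As a concrete illustration of the kind of tableau procedure the paper verifies, here is a minimal, unverified satisfiability test for ALC concepts in negation normal form with no TBox. The tuple encoding of concepts and all names are assumptions made for this sketch.

    ```python
    # Sketch of a tableau satisfiability test for ALC concepts in negation
    # normal form (no TBox). Concepts are tuples: ('atom', A), ('neg', A),
    # ('and', C, D), ('or', C, D), ('some', R, C), ('all', R, C).

    def sat(s):
        s = frozenset(s)
        # clash: an atom and its negation on the same node
        if any(c[0] == 'atom' and ('neg', c[1]) in s for c in s):
            return False
        for c in s:
            if c[0] == 'and' and not {c[1], c[2]} <= s:
                return sat(s | {c[1], c[2]})                 # conjunction rule
            if c[0] == 'or' and c[1] not in s and c[2] not in s:
                return sat(s | {c[1]}) or sat(s | {c[2]})    # branching rule
        # existential rule: each 'some' spawns a fresh successor labelled
        # with its filler plus the fillers of all matching 'all' restrictions
        return all(
            sat({c[2]} | {d[2] for d in s if d[0] == 'all' and d[1] == c[1]})
            for c in s if c[0] == 'some')

    # Example: (some R. A) and (all R. not A) is unsatisfiable
    print(sat({('and', ('some', 'R', ('atom', 'A')),
                       ('all', 'R', ('neg', 'A')))}))  # -> False
    ```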

    KP solitons in shallow water

    The main purpose of this paper is to survey our recent studies on soliton solutions of the Kadomtsev-Petviashvili (KP) equation. The classification is based on the far-field patterns of the solutions, which consist of a finite number of line-solitons. Each soliton solution is then defined by a point of the totally non-negative Grassmann variety, which can be parametrized by a unique derangement of the symmetric group of permutations. Our study also includes certain numerical stability problems for these soliton solutions. Numerical simulations of initial value problems indicate that a certain class of initial waves asymptotically approaches these exact solutions of the KP equation. We then discuss an application of our theory to the Mach reflection problem in shallow water. This problem describes the resonant interaction of solitary waves appearing in the reflection of an obliquely incident wave onto a vertical wall, and it predicts an extraordinary four-fold amplification of the wave at the wall. Several numerical studies confirm the prediction, but all indicate disagreements with the KP theory. Contrary to those previous numerical studies, we find that the KP theory actually provides an excellent model of the Mach reflection phenomenon once higher-order corrections are included in the quasi-two-dimensional approximation. We also present laboratory experiments on the Mach reflection recently carried out by Yeh and his colleagues, and show how precisely the KP theory predicts this wave behavior. (Comment: 50 pages, 25 figures)
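
    For reference, the KP equation in one common normalization, together with the tau-function form of its line-soliton solutions; sign and scaling conventions vary across the literature, so this is a representative form rather than necessarily the one used in the paper.

    ```latex
    % KP equation and tau-function representation of its solutions:
    \[
      \partial_x\!\left(-4u_t + 6uu_x + u_{xxx}\right) + 3u_{yy} = 0,
      \qquad u = 2\,\partial_x^2 \log\tau .
    \]
    % A single line-soliton arises from a two-term tau-function,
    \[
      \tau = e^{\theta_1} + e^{\theta_2},
      \qquad \theta_i = k_i x + k_i^2 y + k_i^3 t,
    \]
    % which gives the sech^2 profile
    \[
      u = \tfrac{1}{2}(k_1 - k_2)^2
          \operatorname{sech}^2\!\tfrac{1}{2}(\theta_1 - \theta_2).
    \]
    ```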

    AN INTERDISCIPLINARY APPROACH FOR THE SEISMIC VULNERABILITY ASSESSMENT OF HISTORICAL CENTRES IN MASONRY BUILDING AGGREGATES: APPLICATION TO THE CITY OF SCARPERIA, ITALY

    Abstract. The seismic vulnerability of masonry building aggregates is very difficult to determine, since it is affected by many uncertainties, the greatest of which concern the historical periodization of the structural aggregates. Moreover, studies made at the urban scale can hardly be exhaustive, and the knowledge achieved on the single units is usually not fully satisfactory, so the structural designer has to deal with incomplete architectural surveys and partial data; one of the most important problems is the lack of knowledge about the boundary conditions between adjacent structures. In order to perform mechanical analyses, extensive knowledge of the materials and techniques adopted is required. In this paper, an integrated methodology for the seismic assessment of building aggregates is presented. It is a multidisciplinary, knowledge-based approach calibrated for historical centres and urban aggregates; the procedure joins different aspects, such as the use of modern technologies for integrated knowledge, plan reconstruction through archival documents, laser-scanner digital surveys of urban fronts, and non-destructive investigation of the materials. GIS and BIM platforms have been used to implement and collect data in order to perform detailed analyses. This information made it possible to assess the seismic vulnerability of the building aggregates and the expected damage scenarios through empirical methodologies. The city of Scarperia, founded a few kilometres from Florence in the Middle Ages and characterized by medium seismicity, has been chosen as the case study for the presented procedure.
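
    The abstract does not specify which empirical methodology is used for the damage scenarios; a common choice at this scale is the macroseismic model of Lagomarsino and Giovinazzi, which maps a vulnerability index and a macroseismic intensity to a mean damage grade. The sketch below assumes a model of that form and uses illustrative parameter values.

    ```python
    # Sketch of an empirical urban-scale damage estimate (macroseismic
    # model of Lagomarsino & Giovinazzi): vulnerability index V and
    # macroseismic intensity I give a mean EMS-98 damage grade in [0, 5].
    # Whether this paper uses exactly this model is an assumption;
    # q is a ductility index, commonly taken as 2.3.
    import math

    def mean_damage_grade(intensity, v_index, q=2.3):
        """Mean EMS-98 damage grade mu_D in [0, 5]."""
        return 2.5 * (1.0 + math.tanh((intensity + 6.25 * v_index - 13.1) / q))

    # Example: a vulnerable masonry aggregate (V ~ 0.9) under intensity VIII
    print(round(mean_damage_grade(8, 0.9), 2))  # roughly damage grade 3
    ```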

    SUPERSYMMETRIC OBJECTS IN GAUGED SUPERGRAVITIES

    The formulation of a unified description of the fundamental interactions has always been the most relevant and intriguing challenge in physics. To this end, a microscopic description of gravity, and its consequent unification with the electromagnetic and nuclear forces, is needed. A theory realizing such a framework would be called a theory of quantum gravity and, so far, the only consistent setup that seeks to realize this purpose is string theory. In this context, the divergences appearing in the classical theory of gravity are resolved at high energies in terms of contributions coming from the physics of extra dimensions. From this point of view, in order to ensure the stability of higher-dimensional objects such as strings and branes, the formulation of quantum theories of gravity requires supersymmetry. These fundamental extended objects give rise, at low energies, to supersymmetric systems that are strongly coupled to gravity, such as black holes. The theories describing the low-energy regime of objects living in higher dimensions are called supergravity theories and constitute the general setting of this thesis. They are classical interacting theories of gravity characterized by local supersymmetry. In this context, supersymmetric objects are described by classical solutions of the equations of motion, and the link with high energies is controlled by the AdS/CFT correspondence and by string compactifications. Starting from this general framework, this thesis presents the results published during the three years of doctoral studies. Three particular supergravity realizations, in four, five, and seven dimensions, are considered and discussed in relation to the physics of the supersymmetric objects arising from them. The analysis of the results in terms of the underlying microscopic theories is also developed, with particular emphasis on the string compactifications giving rise to the three supergravities considered and on the holographic interpretation, which explains the origin of the supersymmetric solutions presented in terms of non-perturbative states of string theory.

    A mathematical framework for critical transitions: normal forms, variance and applications

    Critical transitions occur in a wide variety of applications, including mathematical biology, climate change, human physiology, and economics; it is therefore highly desirable to find early-warning signs. We show that critical transitions can be classified using bifurcation theory and normal forms in the singular limit. Based on this elementary classification, we analyze stochastic fluctuations and calculate scaling laws for the variance of stochastic sample paths near critical transitions, for fast-subsystem bifurcations up to codimension two. The theory is applied to several models: the Stommel-Cessi box model for the thermohaline circulation from geoscience, an epidemic-spreading model on an adaptive network, an activator-inhibitor switch from systems biology, a predator-prey system from ecology, and the Euler buckling problem from classical mechanics. For the Stommel-Cessi model we compare different detrending techniques for calculating early-warning signs. In the epidemic model we show that link densities can be better predictive variables than population densities. The activator-inhibitor switch demonstrates effects in three-time-scale systems and shows that excitable cells and molecular units carry information for subthreshold prediction. In the predator-prey model, explosive population growth near a codimension-two bifurcation is investigated, and we show that early-warning signs from normal forms can be misleading in this context. In the biomechanical model we demonstrate that early-warning signs for buckling depend crucially on the control strategy near the instability, which illustrates the effect of multiplicative noise. (Comment: minor corrections to previous version)
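
    The variance-based early-warning idea can be reproduced in a few lines: simulate a slowly driven fold (saddle-node) normal form with additive noise and watch the sample variance grow as the transition nears. This is a generic textbook illustration of the mechanism the abstract describes, not one of the paper's models, and all parameter values are assumptions.

    ```python
    # Euler-Maruyama simulation of dx = (-p(t) - x^2) dt + sigma dW with a
    # slowly drifting parameter p(t); the fold bifurcation sits at p = 0.
    # Variance of the path rises as p approaches 0 (an early-warning sign).
    import random, statistics

    def simulate(p0=-1.0, eps=0.001, sigma=0.03, dt=0.01, steps=95_000):
        x, p = (-p0) ** 0.5, p0          # start on the stable branch x = sqrt(-p)
        path = []
        for _ in range(steps):
            p += eps * dt                # slow drift toward the fold
            if p >= 0:
                break
            x += (-p - x * x) * dt + sigma * random.gauss(0.0, dt ** 0.5)
            path.append(x)
        return path

    path = simulate()
    w = 2000                             # window length; no detrending here,
                                         # which the paper's methods refine
    print(f"variance early: {statistics.variance(path[:w]):.2e}, "
          f"near transition: {statistics.variance(path[-w:]):.2e}")
    ```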

    Remote sensing in Michigan for land resource management: Highway impact assessment

    An existing section of the M-14 freeway constructed in 1964 and a potential extension from Ann Arbor to Plymouth, Michigan provided an opportunity to investigate how remote sensing techniques can supply the projective information needed to assess the impact of highway construction. The remote sensing data included multispectral scanner imagery and aerial photography. Only minor effects on vegetation, soils, and land use were found to have occurred in the existing corridor. Adverse changes expected to take place in the corridor proposed for the freeway extension can be minimized by proper design of drainage ditches and attention to good construction practices. Remote sensing can be used to collect and present many types of data useful for assessing highway impacts on land use, vegetation categories and species, soil properties, and hydrologic characteristics.

    Faster and better: a machine learning approach to corner detection

    The repeatability and efficiency of a corner detector determine how likely it is to be useful in a real-world application. Repeatability is important because the same scene viewed from different positions should yield features that correspond to the same real-world 3D locations [Schmid et al 2000]. Efficiency is important because it determines whether the detector, combined with further processing, can operate at frame rate. Three advances are described in this paper. First, we present a new heuristic for feature detection and, using machine learning, derive from it a feature detector that can fully process live PAL video using less than 5% of the available processing time. By comparison, most other detectors cannot even operate at frame rate (Harris detector 115%, SIFT 195%). Second, we generalize the detector, allowing it to be optimized for repeatability with little loss of efficiency. Third, we carry out a rigorous comparison of corner detectors based on the above repeatability criterion applied to 3D scenes. We show that, despite being principally constructed for speed, our heuristic detector significantly outperforms existing feature detectors on these stringent tests. Finally, the comparison demonstrates that using machine learning produces significant improvements in repeatability, yielding a detector that is both very fast and of very high quality. (Comment: 35 pages, 11 figures)
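
    The heuristic underlying this family of detectors is the segment test: a pixel is a corner if enough contiguous pixels on a surrounding circle are all brighter or all darker than the centre by a threshold. The brute-force sketch below illustrates that test; the paper's contribution is learning a decision tree that evaluates it far faster, so this is an illustration of the criterion, not the learned detector.

    ```python
    # Segment-test corner check on a 16-pixel Bresenham circle of radius 3:
    # pixel (x, y) is a corner if n contiguous ring pixels are all brighter
    # than center + t or all darker than center - t.
    CIRCLE = [(0,-3),(1,-3),(2,-2),(3,-1),(3,0),(3,1),(2,2),(1,3),
              (0,3),(-1,3),(-2,2),(-3,1),(-3,0),(-3,-1),(-2,-2),(-1,-3)]

    def is_corner(img, x, y, t=20, n=9):
        """img: 2D list of grayscale values; (x, y) at least 3 px from borders."""
        center = img[y][x]
        ring = [img[y + dy][x + dx] for dx, dy in CIRCLE]
        for sign in (1, -1):                 # brighter pass, then darker pass
            flags = [sign * (v - center) > t for v in ring]
            run = 0
            for f in flags + flags[:n - 1]:  # extend to handle wrap-around
                run = run + 1 if f else 0
                if run >= n:
                    return True
        return False
    ```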