
    The COST IRACON Geometry-based Stochastic Channel Model for Vehicle-to-Vehicle Communication in Intersections

    Vehicle-to-vehicle (V2V) wireless communications can improve traffic safety at road intersections and enable congestion avoidance. However, detailed knowledge about the wireless propagation channel is needed for the development and realistic assessment of V2V communication systems. We present a novel geometry-based stochastic MIMO channel model with support for frequencies in the 5.2-6.2 GHz band. The model is based on extensive high-resolution measurements at different road intersections in the city of Berlin, Germany. We extend existing models by including the effects of various obstructions and higher-order interactions, and by introducing an angular gain function for the scatterers. Scatterer locations have been identified and mapped to measured multi-path trajectories using a measurement-based ray tracing method and a subsequent RANSAC algorithm. The developed model is parameterized, and model parameters are estimated using the measured propagation paths that have been mapped to scatterer locations. The time-variant power fading of individual multi-path components is found to be best modeled by a Gamma process with an exponential autocorrelation. The path coherence distance is estimated to be in the range of 0-2 m. The model is also validated against measurement data, showing that it accurately captures the behavior of the measured channel gain, Doppler spread, and delay spread. This is also the case for intersections that were not used when estimating model parameters. Comment: Submitted to IEEE Transactions on Vehicular Technology
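    A Gamma process with exponential autocorrelation, as used above for the power fading of individual multi-path components, can be simulated with a standard Gaussian-copula construction: an AR(1) latent Gaussian process is mapped through its CDF to Gamma marginals. The sketch below is illustrative only, not the authors' parameterization; `shape`, `scale`, the sample `spacing`, and the coherence distance `d_coh` are assumed inputs:

```python
import numpy as np
from scipy import stats

def gamma_fading(n, shape, scale, d_coh, spacing, seed=0):
    """Simulate time-variant power fading of one multi-path component:
    Gamma-distributed marginals with (approximately) exponential
    autocorrelation, via a Gaussian copula over an AR(1) process."""
    rng = np.random.default_rng(seed)
    # AR(1) coefficient chosen so the latent Gaussian autocorrelation
    # decays as exp(-distance / d_coh)
    rho = np.exp(-spacing / d_coh)
    g = np.empty(n)
    g[0] = rng.standard_normal()
    for i in range(1, n):
        g[i] = rho * g[i - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()
    u = stats.norm.cdf(g)                            # uniform marginals
    return stats.gamma.ppf(u, a=shape, scale=scale)  # Gamma marginals
```

The copula transform preserves the Gamma marginal exactly, while the autocorrelation of the transformed process closely tracks that of the latent Gaussian.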

    A Distance-Based Test of Association Between Paired Heterogeneous Genomic Data

    Due to rapid technological advances, a wide range of different measurements can be obtained from a given biological sample, including single nucleotide polymorphisms, copy number variation, gene expression levels, DNA methylation and proteomic profiles. Each of these distinct measurements provides the means to characterize a certain aspect of biological diversity, and a fundamental problem of broad interest concerns the discovery of shared patterns of variation across different data types. Such data types are heterogeneous in the sense that they represent measurements taken at very different scales or described by very different data structures. We propose a distance-based statistical test, the generalized RV (GRV) test, to assess whether there is a common and non-random pattern of variability between paired biological measurements obtained from the same random sample. The measurements enter the test through distance measures which can be chosen to capture particular aspects of the data. An approximate null distribution is proposed to compute p-values in closed form, without the need to perform costly Monte Carlo permutation procedures. Compared to the classical Mantel test for association between distance matrices, the GRV test has been found to be more powerful in a number of simulation settings. We also report on an application of the GRV test to detect biological pathways in which genetic variability is associated with variation in gene expression levels in ovarian cancer samples, and present results obtained from two independent cohorts.
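    For reference, the classical Mantel test that the GRV test is compared against can be sketched in a few lines: correlate the upper-triangle entries of the two distance matrices, then build a null distribution by jointly permuting the rows and columns of one matrix. This is the baseline permutation procedure, not the GRV test itself (whose approximate null yields closed-form p-values):

```python
import numpy as np

def mantel_test(D1, D2, n_perm=999, seed=0):
    """Mantel permutation test for association between two n x n
    distance matrices computed on the same n samples."""
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(D1, k=1)       # off-diagonal upper triangle
    a = D1[iu]
    r_obs = np.corrcoef(a, D2[iu])[0, 1]
    n = D1.shape[0]
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(n)               # relabel samples in D2
        r = np.corrcoef(a, D2[np.ix_(p, p)][iu])[0, 1]
        if r >= r_obs:
            count += 1
    # one-sided p-value with the observed statistic included in the null
    return r_obs, (count + 1) / (n_perm + 1)
```

Note the cost that motivates the GRV construction: each of the `n_perm` permutations requires touching all O(n^2) distances.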

    Evaluating environmental issues - Valuation as co-ordination in a pluralistic world.

    Classical methods for valuing environmental resources and environmental impacts implicitly assume that determining the right values is the key question for public decision-making. The decision process is then understood as the application of an unambiguous, objectively determined ranking of a set of possible actions. However complex real-world decision processes may be, valuation concepts and methods remain untouched. The picture changes when valuation practices are considered as one component of a process of public coordination involving conflicting actors with varied stakes. The hypothesis explored in this article is that acknowledging the complex and conflictual character of decision-making calls for a new understanding of valuation itself. The article proposes to regard valuation as supporting the search for a legitimate agreement among several types of actors. In doing so, valuation must meet the general requirements of justification in the public arena. From this point of view, economic valuation offers an important framework, though only one framework among others. The article points to a key direction: the search for compromises of justification and the adoption of methodological conventions in line with the justification benchmarks used by the actors concerned. Keywords: orders of justification; environment; valuation; public decision-making

    Space Complexity of Fault-Tolerant Register Emulations

    Driven by the rising popularity of cloud storage, the costs associated with implementing reliable storage services from a collection of fault-prone servers have recently become an actively studied question. The well-known ABD result shows that an f-tolerant register can be emulated using a collection of 2f + 1 fault-prone servers, each storing a single read-modify-write object type, which is known to be optimal. In this paper we generalize this bound: we investigate the inherent space complexity of emulating reliable multi-writer registers as a function of the type of the base objects exposed by the underlying servers, the number of writers to the emulated register, the number of available servers, and the failure threshold. We establish a sharp separation, in terms of the resources (i.e., the number of base objects of the respective types) required to support the emulation, between registers on the one hand and both max-registers (the base object types assumed by ABD) and CAS on the other; we show that no such separation exists between max-registers and CAS. Our main technical contribution is lower and upper bounds on the resources required in the case where the underlying base objects are fault-prone read/write registers. We show that the number of required registers is directly proportional to the number of writers and inversely proportional to the number of servers. Comment: Conference version appears in Proceedings of PODC '1
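    The ABD-style emulation referenced above can be illustrated with a toy, sequential sketch: 2f + 1 fault-prone servers, with each read and write waiting for replies from a majority (f + 1), so that any two quorums intersect in at least one server. Real ABD is a concurrent message-passing protocol with linearizability proofs; this simplified model only shows the quorum and timestamp logic:

```python
class ABDRegister:
    """Toy, sequential sketch of a majority-quorum register emulation
    over n = 2f + 1 fault-prone servers, tolerating up to f crashes."""

    def __init__(self, f):
        self.quorum = f + 1
        self.servers = [{"ts": 0, "val": None} for _ in range(2 * f + 1)]

    def _responders(self, crashed):
        # the first majority of non-crashed servers "replies"
        up = [s for i, s in enumerate(self.servers) if i not in crashed]
        assert len(up) >= self.quorum, "too many failures to make progress"
        return up[: self.quorum]

    def write(self, value, crashed=()):
        reps = self._responders(crashed)
        ts = max(s["ts"] for s in reps) + 1      # phase 1: pick a fresh timestamp
        for s in reps:                           # phase 2: store at a majority
            if ts > s["ts"]:
                s["ts"], s["val"] = ts, value

    def read(self, crashed=()):
        reps = self._responders(crashed)
        s_max = max(reps, key=lambda s: s["ts"])  # phase 1: highest timestamp seen
        ts, val = s_max["ts"], s_max["val"]
        for s in reps:                            # phase 2: write-back for atomicity
            if ts > s["ts"]:
                s["ts"], s["val"] = ts, val
        return val
```

Because any write quorum and any read quorum share a server, a read always sees the latest completed write, even when a different set of f servers has crashed.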

    Learning to Localize and Align Fine-Grained Actions to Sparse Instructions

    Automatic generation of textual video descriptions that are time-aligned with video content is a long-standing goal in computer vision. The task is challenging due to the difficulty of bridging the semantic gap between the visual and natural language domains. This paper addresses the task of automatically generating an alignment between a set of instructions and a first-person video demonstrating an activity. The sparse descriptions and ambiguity of written instructions create significant alignment challenges. The key to our approach is the use of egocentric cues to generate a concise set of action proposals, which are then matched to recipe steps using object recognition and computational linguistic techniques. We obtain promising results on both the Extended GTEA Gaze+ dataset and the Bristol Egocentric Object Interactions Dataset.
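    The matching of temporally ordered action proposals to sparse instruction steps can be illustrated with a simple monotone-alignment dynamic program. This is an illustrative sketch, not the paper's method: it assumes a precomputed step/proposal similarity matrix `sim` (which in the paper would come from object recognition and linguistic cues) and assigns each step one proposal such that assignments never move backwards in time:

```python
import numpy as np

def align_steps(sim):
    """Monotonically align instruction steps (rows) to temporally
    ordered action proposals (columns) in O(S * P) time.
    sim[i, j] is an assumed precomputed similarity score."""
    S, P = sim.shape
    assert S <= P, "need at least one proposal per step"
    NEG = -np.inf
    # dp[i, j]: best total score with step i assigned to proposal j
    dp = np.full((S, P), NEG)
    back = np.zeros((S, P), dtype=int)
    dp[0] = sim[0]
    for i in range(1, S):
        best, arg = NEG, 0            # best dp[i-1, j'] over j' < j
        for j in range(P):
            if j > 0 and dp[i - 1, j - 1] > best:
                best, arg = dp[i - 1, j - 1], j - 1
            if best > NEG:
                dp[i, j] = best + sim[i, j]
                back[i, j] = arg
    j = int(np.argmax(dp[-1]))        # backtrack from the best final cell
    path = [j]
    for i in range(S - 1, 0, -1):
        j = int(back[i, j])
        path.append(j)
    return path[::-1]
```

The monotonicity constraint is what makes sparse instructions tractable: proposals that match no step are simply skipped over.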

    Progress in AI Planning Research and Applications

    Planning has made significant progress since its inception in the 1970s, both in terms of the efficiency and sophistication of its algorithms and representations and in terms of its potential for application to real problems. In this paper we sketch the foundations of planning as a sub-field of Artificial Intelligence and the history of its development over the past three decades. We then discuss some of the recent achievements within the field and provide experimental data demonstrating the progress that has been made in the application of general planners to realistic and complex problems. The paper concludes by identifying some of the open issues that remain as important challenges for future research in planning.

    Study of fault-tolerant software technology

    Presented is an overview of the current state of the art of fault-tolerant software and an analysis of quantitative techniques and models developed to assess its impact. It examines research efforts as well as experience gained from commercial application of these techniques. The paper also addresses the computer architecture and design implications for hardware, operating systems and programming languages (including Ada) of using fault-tolerant software in real-time aerospace applications. It concludes that fault-tolerant software has progressed beyond the pure research state. The paper also finds that, although not perfectly matched, newer architectural and language capabilities provide many of the notations and functions needed to effectively and efficiently implement software fault-tolerance.