28 research outputs found

    Chaotic Free-Space Laser Communication over Turbulent Channel

    The dynamics of errors caused by atmospheric turbulence in a self-synchronizing, chaos-based communication system that stably transmits information over a ~5 km free-space laser link is studied experimentally. Binary information is transmitted using a chaotic sequence of short-term pulses as the carrier: the information signal slightly shifts the chaotic time position of each pulse depending on the information bit. We report the results of an experimental analysis of the atmospheric turbulence in the channel and of its impact on the bit-error-rate (BER) performance of this chaos-based communication system.
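    The pulse-shifting scheme the abstract describes can be illustrated with a minimal sketch of chaotic pulse-position modulation. This is a toy model, not the authors' system: the logistic map stands in for the chaotic pulse generator, and all parameter names (`base`, `spread`, `shift`) are illustrative. A receiver that regenerates the same chaotic sequence knows where each pulse *should* arrive and reads the bit from the residual shift.

```python
# Sketch of chaotic pulse-position modulation (CPPM), assuming a simple
# logistic-map chaos generator; parameters are illustrative only.

def logistic(x, r=3.99):
    return r * x * (1.0 - x)

def modulate(bits, x0=0.4, base=100.0, spread=50.0, shift=2.0):
    """Encode bits as chaotic pulse times; bit 1 delays the pulse by `shift`."""
    times, t, x = [], 0.0, x0
    for b in bits:
        x = logistic(x)
        t += base + spread * x          # chaotic inter-pulse interval
        times.append(t + (shift if b else 0.0))
    return times

def demodulate(times, x0=0.4, base=100.0, spread=50.0, shift=2.0):
    """A synchronized receiver regenerates the chaos and reads the shift."""
    bits, t, x = [], 0.0, x0
    for rx in times:
        x = logistic(x)
        t += base + spread * x          # expected (unshifted) pulse time
        bits.append(1 if rx - t > shift / 2 else 0)
    return bits

msg = [1, 0, 1, 1, 0, 0, 1]
assert demodulate(modulate(msg)) == msg
```

    In the real channel, turbulence perturbs pulse arrival times and amplitudes, which is exactly what erodes the `rx - t > shift / 2` decision margin and drives the BER the paper measures.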

    Brain death and postmortem organ donation: Report of a questionnaire from the CENTER-TBI study

    Background: We aimed to investigate the extent of agreement on practices around brain death and postmortem organ donation. Methods: Investigators from 67 Collaborative European NeuroTrauma Effectiveness Research in Traumatic Brain Injury (CENTER-TBI) study centers completed several questionnaires (response rate: 99%). Results: Regarding practices around brain death, we found agreement on the clinical evaluation (prerequisites and neurological assessment) for brain death determination (BDD) in 100% of the centers. However, ancillary tests were required for BDD in 64% of the centers. BDD for nondonor patients was deemed mandatory in 18% of the centers before withdrawal of life-sustaining measures (LSM). Practices around postmortem organ donation also varied: organ donation after circulatory arrest was forbidden in 45% of the centers, and when withdrawal of LSM was contemplated, 67% of the centers removed a ventricular drain in situ some or all of the time. Conclusions: This study showed both agreement and some regional differences in practices around brain death and postmortem organ donation. We hope our results help quantify and explain these potential differences and provide impetus for ongoing dialog toward further harmonization of practices around brain death and postmortem organ donation.

    Approximation of confidence sets for output error systems using interval analysis

    Standard identification techniques usually produce a single point estimate of the system parameters. This is justified when the number of observations is large compared to the number of parameters. With small sample counts, however, it is more reasonable to identify a set of possible parameters that contains the nominal parameters with a given probability. These confidence sets cannot be calculated directly. The paper proposes interval analytic techniques to approximate confidence sets of model parameters to arbitrary precision. The origins of interval analysis lie in the field of reliable computing, which gives certified results for every computation. It has been used to solve global optimization problems numerically while providing theoretical certificates of the optimality of the results. This global optimization method is modified in a suitable way to generate the needed confidence sets. An introduction to interval analytic techniques is given, and the methodology of global optimization via these techniques is presented. The modifications of this algorithm needed to construct the confidence sets are discussed, and the method is illustrated on a simple example. The presented algorithm focuses on the output error model structure, but the methodology can be extended to more general cases as well.
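    The branch-and-bound flavor of the interval approach can be sketched in a few lines. This is a generic illustration under strong simplifying assumptions, not the paper's algorithm: a scalar parameter, a toy cost J(a) = (a - 2)^2, and a known threshold `chi2` defining the confidence set {a : J(a) <= chi2}. Boxes whose interval-evaluated cost certainly exceeds the threshold are discarded; undecided boxes are bisected until they are smaller than a tolerance.

```python
# Interval bisection sketch: enclose the set {a : J(a) <= chi2} by boxes.
# J and all names are illustrative, not from the paper.

def J_interval(lo, hi):
    """Interval extension of J(a) = (a - 2)**2 over [lo, hi]."""
    e_lo, e_hi = lo - 2.0, hi - 2.0
    if e_lo <= 0.0 <= e_hi:                      # interval straddles the minimum
        return 0.0, max(e_lo**2, e_hi**2)
    return min(e_lo**2, e_hi**2), max(e_lo**2, e_hi**2)

def confidence_boxes(lo, hi, chi2, tol=1e-3):
    """Return subintervals that may intersect the confidence set."""
    boxes, work = [], [(lo, hi)]
    while work:
        a, b = work.pop()
        j_lo, j_hi = J_interval(a, b)
        if j_lo > chi2:
            continue                             # certainly outside: discard
        if j_hi <= chi2 or b - a < tol:
            boxes.append((a, b))                 # certainly inside, or small enough
        else:
            m = 0.5 * (a + b)
            work += [(a, m), (m, b)]             # undecided: bisect
    return boxes

# For chi2 = 1, the true confidence set is [1, 3]; the boxes enclose it.
boxes = confidence_boxes(0.0, 5.0, chi2=1.0)
```

    The same discard/keep/bisect loop underlies certified interval global optimization; the modification the paper describes is, in essence, keeping the sub-level set rather than hunting for the minimizer.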

    Recalibrating fine-grained locking in parallel bucket hash tables

    Mutual exclusion protects data structures in parallel environments in order to preserve data integrity. A held lock blocks the execution of all other threads that want to access the same shared resource until the lock is released. This blocking behavior reduces the level of parallelism and causes performance loss. Fine-grained locking reduces lock contention and thus improves throughput; however, the right granularity, i.e. how many locks to use, is not straightforward. In large bucket hash tables, the best approach is to divide the table into blocks, each containing one or more buckets, and to lock these blocks independently. The block size that gives optimal performance depends on the time spent within the critical sections, which in turn depends on the table's internal properties, and on the arrival intensity of the queries. A queuing model capturing this behavior is presented, together with an adaptive algorithm that fine-tunes the granularity of locking (the block size) to the execution environment.
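    The block-locking scheme described above is commonly called lock striping, and a minimal sketch makes the tunable explicit. This is an illustrative implementation, not the authors' code: `n_stripes` is the knob their adaptive algorithm would adjust (one stripe per block of buckets), and all names are invented for the example.

```python
import threading

# Lock striping over a bucket hash table: consecutive buckets share one
# lock, so the stripe count controls the locking granularity (block size).
# Illustrative sketch only, not the paper's implementation.

class StripedHashTable:
    def __init__(self, n_buckets=1024, n_stripes=16):
        assert n_buckets % n_stripes == 0
        self.buckets = [[] for _ in range(n_buckets)]
        self.locks = [threading.Lock() for _ in range(n_stripes)]

    def _locate(self, key):
        bucket = hash(key) % len(self.buckets)
        stripe = bucket * len(self.locks) // len(self.buckets)
        return stripe, bucket

    def put(self, key, value):
        stripe, bucket = self._locate(key)
        with self.locks[stripe]:        # blocks only threads hitting this block
            chain = self.buckets[bucket]
            for i, (k, _) in enumerate(chain):
                if k == key:
                    chain[i] = (key, value)
                    return
            chain.append((key, value))

    def get(self, key, default=None):
        stripe, bucket = self._locate(key)
        with self.locks[stripe]:
            for k, v in self.buckets[bucket]:
                if k == key:
                    return v
            return default
```

    With `n_stripes == 1` this degenerates to one global lock (maximal contention, minimal lock overhead); with `n_stripes == n_buckets` every bucket has its own lock. The paper's queuing model is about choosing a point between those extremes as a function of critical-section time and query arrival intensity.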

    Camera placement optimization in object localization systems

    This paper focuses on the placement of cameras to achieve the highest possible localization accuracy with a multi-camera system. The cameras have redundant fields of view and have to be placed according to certain natural constraints, though user-defined constraints are allowed as well. A camera model is described, and the components causing the localization errors are identified. Localization accuracy measures are defined for any number of cameras, and the multi-camera placement is formulated analytically using these measures extended to multiple cameras. An example of placing two cameras is shown, and generalizations to higher-dimensional parameter spaces are examined. There are publications in which camera placement algorithms are formulated or compared; here we attempt to examine the analytical solution of this problem for different objective functions.
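    Why placement drives localization accuracy can be seen with a toy two-camera measure. This is a generic triangulation illustration, not the paper's objective function: it assumes bearing-only cameras in 2-D with small angular noise `sigma`, and uses the standard result that triangulation error grows like 1/sin of the ray intersection angle.

```python
import math

# Toy accuracy measure for two bearing-only cameras localizing a 2-D point.
# Assumes small angular noise `sigma` on each bearing; illustrative only.

def localization_error(cam1, cam2, target, sigma=0.01):
    def bearing_range(cam):
        dx, dy = target[0] - cam[0], target[1] - cam[1]
        return math.atan2(dy, dx), math.hypot(dx, dy)

    b1, r1 = bearing_range(cam1)
    b2, r2 = bearing_range(cam2)
    s = abs(math.sin(b1 - b2))          # rays nearly parallel -> s -> 0
    if s < 1e-12:
        return math.inf                 # collinear rays cannot localize the point
    # each ray constrains the point to roughly sigma * range cross-range accuracy
    return sigma * math.hypot(r1, r2) / s

# Near-orthogonal viewing directions beat nearly collinear ones.
assert localization_error((0, 0), (10, 0), (5, 5)) < \
       localization_error((0, 0), (1, 0), (5, 5))
```

    A placement optimizer would minimize such a measure (or a field-of-view-weighted version of it) over the admissible camera positions, subject to the natural and user-defined constraints the abstract mentions.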

    Fluid level dependent Markov fluid models with continuous zero transition

    Markov fluid models with fluid-level-dependent behavior are considered in this paper. One of the main difficulties in analyzing these models is handling the case when, in a given state, the fluid rate changes sign from positive to negative at a given fluid level. We refer to this case as a zero transition. When the sign change is due to a discontinuity of the fluid rate function, a probability mass develops at the given fluid level. We show that when the sign change is instead due to a continuous finite polynomial fluid rate function, the behavior is qualitatively different: no probability mass develops and different stationary equations apply. We consider this latter case of sign change, present its stationary description, and propose a numerical procedure for its evaluation.
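    For orientation, the stationary analysis of level-dependent Markov fluid models is conventionally built on a differential equation of the following generic form (a sketch of the standard setup, not necessarily the paper's exact notation): with stationary fluid density row vector $\pi(x)$, level-dependent generator $Q(x)$, and diagonal rate matrix $R(x)$,

```latex
\frac{d}{dx}\,\bigl(\pi(x)\,R(x)\bigr) = \pi(x)\,Q(x)
```

    The zero-transition question concerns a state whose diagonal entry of $R(x)$ crosses zero at some level: if the rate jumps across zero, fluid accumulates there and a probability mass appears, whereas the abstract's result is that a continuous polynomial crossing leaves the distribution absolutely continuous, with modified stationary equations at that level.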

    Simple log-domain chaotic oscillator
