
    Nationwide population-based cohort study of uterine rupture in Belgium: results from the Belgian Obstetric Surveillance System

    Objectives: We aimed to assess the prevalence of uterine rupture in Belgium and to evaluate risk factors, management and outcomes for mother and child. Design: Nationwide population-based prospective cohort study. Setting: Emergency obstetric care. Participation of 97% of maternity units covering 98.6% of the deliveries in Belgium. Participants: All women with uterine rupture in Belgium between January 2012 and December 2013. Eight women were excluded because data collection forms were not returned. Results: Data on 90 cases of confirmed uterine rupture were obtained, of which 73 had a previous Caesarean section (CS), representing an estimated prevalence of 3.6 (95% CI 2.9 to 4.4) per 10000 deliveries overall and of 27 (95% CI 21 to 33) and 0.7 (95% CI 0.4 to 1.2) per 10000 deliveries in women with and without previous CS, respectively. Rupture occurred during trial of labour after caesarean section (TOLAC) in 57 women (81.4%, 95% CI 68% to 88%), with a high rate of augmented (38.5%) and induced (29.8%) labour. All patients who underwent induction of labour had an unfavourable cervix at the start of induction (Bishop score <7 in 100%). Other uterine surgery was reported in the history of 22 cases (24%, 95% CI 17% to 34%), including 1 case of myomectomy, 3 cases of salpingectomy and 2 cases of hysteroscopic resection of a uterine septum. 14 cases ruptured in the absence of labour (15.6%, 95% CI 9.5% to 24.7%). No mothers died; 8 required hysterectomy (8.9%, 95% CI 4.6% to 16.6%). There were 10 perinatal deaths (perinatal mortality rate 117/1000 births, 95% CI 60 to 203) and perinatal asphyxia was observed in 29 infants (34.5%, 95% CI 25.2% to 45.1%). Conclusions: The prevalence of uterine rupture in Belgium is similar to that in other Western countries. There is scope for improvement through the implementation of nationally adopted guidelines on TOLAC, to prevent the use of unsafe procedures and thereby reduce avoidable morbidity and mortality.
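    The rates above are simple prevalence figures with binomial confidence intervals. As a minimal sketch of how such figures can be reproduced, the snippet below computes a rate per 10,000 deliveries with a Wilson 95% CI; the 250,000-delivery denominator is a hypothetical round number, not a figure taken from the study.

```python
import math

def prevalence_per_10000(cases: int, deliveries: int) -> tuple[float, float, float]:
    """Point prevalence per 10,000 deliveries with a Wilson 95% CI."""
    z = 1.96                                   # normal quantile for 95% coverage
    p = cases / deliveries                     # raw proportion
    denom = 1 + z**2 / deliveries
    centre = (p + z**2 / (2 * deliveries)) / denom
    half = z * math.sqrt(p * (1 - p) / deliveries + z**2 / (4 * deliveries**2)) / denom
    return 10000 * p, 10000 * (centre - half), 10000 * (centre + half)

# Hypothetical denominator, for illustration only; the national delivery count
# is not stated in this abstract.
print(prevalence_per_10000(cases=90, deliveries=250_000))
```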

    Bioavailability of Vortioxetine After a Roux-en-Y Gastric Bypass.


    Optimal quantum detectors for unambiguous detection of mixed states

    We consider the problem of designing an optimal quantum detector that distinguishes unambiguously between a collection of mixed quantum states. Using arguments of duality in vector space optimization, we derive necessary and sufficient conditions for an optimal measurement that maximizes the probability of correct detection. We show that the previous optimal measurements derived for certain special cases satisfy these optimality conditions. We then consider state sets with strong symmetry properties, and show that the optimal measurement operators for distinguishing between these states share the same symmetries and can be computed very efficiently by solving a reduced-size semidefinite program.
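    The optimization described above is a semidefinite program (SDP). As a minimal sketch of that formulation, the snippet below solves unambiguous discrimination for the simplest case of two pure states with equal priors using cvxpy; the states, priors and solver are illustrative assumptions, and the reduced symmetric program from the paper is not reproduced here.

```python
import numpy as np
import cvxpy as cp

# Two nonorthogonal pure states with equal priors (illustrative choice).
theta = np.pi / 8
psi1 = np.array([1.0, 0.0])
psi2 = np.array([np.cos(theta), np.sin(theta)])
rho1, rho2 = np.outer(psi1, psi1), np.outer(psi2, psi2)
p1 = p2 = 0.5

d = 2
Pi1 = cp.Variable((d, d), PSD=True)        # outcome "state 1"
Pi2 = cp.Variable((d, d), PSD=True)        # outcome "state 2"
constraints = [
    np.eye(d) - Pi1 - Pi2 >> 0,            # the inconclusive outcome must be PSD
    cp.trace(Pi1 @ rho2) == 0,             # never misidentify state 2 as state 1
    cp.trace(Pi2 @ rho1) == 0,             # never misidentify state 1 as state 2
]
objective = cp.Maximize(p1 * cp.trace(Pi1 @ rho1) + p2 * cp.trace(Pi2 @ rho2))
prob = cp.Problem(objective, constraints)
prob.solve()
print("max success probability:", prob.value)        # ~ 1 - |<psi1|psi2>|
print("analytic benchmark     :", 1 - abs(psi1 @ psi2))
```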

    Grain-size characterization of reworked fine-grained aeolian deposits

    After a previous review of the grain-size characteristics of in situ (primary) fine-grained aeolian deposits, reworked (secondary) aeolian deposits, as modified in lacustrine environments and by alluvial and pedogenic processes, are discussed in this paper. As a reference, the grain-size characteristics of primary loess deposits are briefly described. Commonly, pedogenesis and weathering of primary loess may lead to clay neoformation and thus to an enrichment in grain diameters of 4-8 µm, a size comparable to the fine background loess. Remarkably, the modal grain-size values of primary loess are preserved after re-deposition in lakes and flood plains. However, secondary lacustrine settings show a very characteristic admixture of a clayey population of 1-2.5 µm diameter due to the process of settling in standing water. Similarly, alluvial settings often show an addition of coarse-grained sediment supplied by previously eroded material. Floodplain settings, however, also often contain pools and other depressions which behave similarly to lacustrine environments. As a result, alluvial secondary loess sediments are characterized by the poorest grain-size sorting compared with the other secondary loess and with primary loess. Despite the characteristic texture of each of these deposits, the grain-size characteristics of the individual sediment categories described are not always fully diagnostic, and grain-size analysis should therefore be complemented by other information, such as sedimentary structures and fauna or flora, to reliably reconstruct the sedimentary processes and environments.
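    The sorting comparison above can be quantified with standard graphic grain-size statistics. The sketch below computes the Folk and Ward (1957) graphic sorting coefficient for a synthetic silt distribution before and after admixture of a fine clayey population; the size classes and weights are invented for illustration and are not data from the paper.

```python
import numpy as np

def folk_ward_sorting(diam_um: np.ndarray, weights: np.ndarray) -> float:
    """Graphic sorting (standard deviation) of Folk & Ward (1957), in phi units.

    diam_um : grain diameters of the measured classes, in micrometres
    weights : relative abundance of each class (need not sum to 1)
    """
    phi = -np.log2(diam_um / 1000.0)            # micrometres -> phi units
    order = np.argsort(phi)
    phi, w = phi[order], weights[order]
    cum = np.cumsum(w) / np.sum(w)              # cumulative distribution over phi
    p5, p16, p84, p95 = np.interp([0.05, 0.16, 0.84, 0.95], cum, phi)
    return (p84 - p16) / 4.0 + (p95 - p5) / 6.6

# Illustrative bimodal mixture: a loess-like silt mode near 30 µm plus a clayey
# admixture near 1-2.5 µm, as in the secondary lacustrine case described above.
diam = np.array([1, 2, 4, 8, 16, 32, 63], dtype=float)
silt_only = np.array([0, 0, 1, 4, 10, 12, 3], dtype=float)
with_clay = silt_only + np.array([3, 5, 2, 0, 0, 0, 0], dtype=float)
print(folk_ward_sorting(diam, silt_only))   # better sorted (smaller value)
print(folk_ward_sorting(diam, with_clay))   # poorer sorted after clay admixture
```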

    Quantum Detection with Unknown States

    We address the problem of distinguishing among a finite collection of quantum states when the states are not entirely known. For completely specified states, necessary and sufficient conditions on a quantum measurement minimizing the probability of a detection error have been derived. In this work, we assume that each of the states in our collection is a mixture of a known state and an unknown state. We investigate two criteria for optimality. The first is minimization of the worst-case probability of a detection error. For the second we assume a probability distribution on the unknown states and minimize the expected probability of a detection error. We find that under both criteria, the optimal detectors are equivalent to the optimal detectors of an "effective ensemble". In the worst case, the effective ensemble comprises the known states with altered prior probabilities, and in the average case it is made up of altered states with the original prior probabilities.
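    Once an effective ensemble is fixed, finding its optimal detectors reduces to the standard minimum-error discrimination SDP. The sketch below solves that SDP with cvxpy for a hypothetical effective ensemble in which each altered state mixes a known state with a maximally mixed component; that construction, the mixing weight eps and the chosen states are stand-in assumptions, not the paper's derivation.

```python
import numpy as np
import cvxpy as cp

def min_error_detectors(rhos, priors):
    """Minimum-error measurement for a known ensemble via the standard SDP:
    maximize sum_i p_i Tr(Pi_i rho_i) over POVMs {Pi_i}."""
    d = rhos[0].shape[0]
    Pis = [cp.Variable((d, d), PSD=True) for _ in rhos]
    constraints = [sum(Pis) == np.eye(d)]            # POVM completeness
    objective = cp.Maximize(sum(p * cp.trace(P @ r)
                                for p, P, r in zip(priors, Pis, rhos)))
    prob = cp.Problem(objective, constraints)
    prob.solve()
    return [P.value for P in Pis], prob.value

# Hypothetical "effective ensemble": each altered state mixes a known pure state
# with a maximally mixed component standing in for the averaged unknown part.
eps = 0.2
known = [np.diag([1.0, 0.0]), np.array([[0.5, 0.5], [0.5, 0.5]])]
effective = [(1 - eps) * k + eps * np.eye(2) / 2 for k in known]
povm, p_success = min_error_detectors(effective, priors=[0.5, 0.5])
print("success probability of the optimal detectors:", p_success)
```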

    Vibrio kanaloae sp. nov., Vibrio pomeroyi sp. nov. and Vibrio chagasii sp. nov., from sea water and marine animals

    The taxonomic positions of the fluorescent amplified fragment length polymorphism fingerprinting groups A46 (five isolates), A51 (six isolates), A52 (five isolates) and A53 (seven isolates) obtained in a previous study were further analysed through a polyphasic approach. The 23 isolates were phylogenetically related to Vibrio splendidus, but DNA-DNA hybridization experiments proved that they belong to three novel species. Chemotaxonomic and phenotypic analyses further disclosed several features that differentiate the 23 isolates from known Vibrio species. The names Vibrio kanaloae sp. nov. (type strain LMG 20539T = CAIM 485T; EMBL accession no. AJ316193; G+C content 44.7 mol%), Vibrio pomeroyi sp. nov. (type strain LMG 20537T = CAIM 578T; EMBL accession no. AJ491290; G+C content 44.1 mol%) and Vibrio chagasii sp. nov. (type strain LMG 21353T = CAIM 431T; EMBL accession no. AJ316199; G+C content 44.6 mol%) are proposed to encompass, respectively, the five isolates of A46, the six isolates of A51 and the 12 isolates of A52/A53. The three novel species can be distinguished from known Vibrio species by several phenotypic features, including utilization and fermentation of various carbon sources, β-galactosidase activity and fatty acid content (particularly of 12:0, 14:0, 14:0 iso and 16:0 iso).

    Generating Non-Linear Interpolants by Semidefinite Programming

    Interpolation-based techniques have been widely and successfully applied in the verification of hardware and software, e.g., in bounded-model checking, CEGAR, SMT, etc., where the hardest part is synthesizing interpolants. Various approaches for discovering interpolants for propositional logic, quantifier-free fragments of first-order theories and their combinations have been proposed. However, little work in the literature focuses on discovering polynomial interpolants. In this paper, we provide an approach for constructing non-linear interpolants based on semidefinite programming, and show, by means of examples, how to apply such results to the verification of programs.
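    To make the semidefinite-programming connection concrete, the sketch below synthesizes a linear interpolant separating two disjoint discs via Putinar-style sum-of-squares certificates encoded in cvxpy. The sets, polynomial degrees and encoding form a toy instance chosen for illustration, not the general procedure of the paper.

```python
import cvxpy as cp

# Toy instance: A = {x^2 + y^2 <= 1}, B = {(x-3)^2 + y^2 <= 1} are disjoint.
# Search for a linear interpolant p(x, y) = a + b*x + c*y with
#   p >= 1 on A  and  p <= -1 on B,
# certified by sum-of-squares (SOS) conditions:
#   p - 1 = s0 + lam * (1 - x^2 - y^2),        s0 SOS, lam >= 0
#  -p - 1 = t0 + mu  * (1 - (x-3)^2 - y^2),    t0 SOS, mu  >= 0
# With monomial basis z = (1, x, y), an SOS quadratic is z^T Q z with Q PSD,
# so coefficient matching turns the search into a semidefinite program.

a, b, c = cp.Variable(), cp.Variable(), cp.Variable()
lam = cp.Variable(nonneg=True)
mu = cp.Variable(nonneg=True)
Q = cp.Variable((3, 3), PSD=True)   # Gram matrix of s0
R = cp.Variable((3, 3), PSD=True)   # Gram matrix of t0

constraints = [
    # p - 1 - lam*(1 - x^2 - y^2) == z^T Q z, matched coefficient by coefficient
    Q[0, 0] == a - 1 - lam,          # constant term
    2 * Q[0, 1] == b,                # x
    2 * Q[0, 2] == c,                # y
    Q[1, 1] == lam,                  # x^2
    Q[2, 2] == lam,                  # y^2
    2 * Q[1, 2] == 0,                # xy
    # -p - 1 - mu*(1 - (x-3)^2 - y^2) == z^T R z  (note g_B = -8 + 6x - x^2 - y^2)
    R[0, 0] == -a - 1 + 8 * mu,      # constant term
    2 * R[0, 1] == -b - 6 * mu,      # x
    2 * R[0, 2] == -c,               # y
    R[1, 1] == mu,                   # x^2
    R[2, 2] == mu,                   # y^2
    2 * R[1, 2] == 0,                # xy
]
cp.Problem(cp.Minimize(0), constraints).solve()      # feasibility problem
print(f"interpolant p(x, y) = {a.value:.2f} + {b.value:.2f}*x + {c.value:.2f}*y")
```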

    Vertex importance extension of betweenness centrality algorithm

    A variety of real-life structures can be simplified as a graph. Such a simplification emphasizes the structure represented by vertices connected via edges. A common method for analysing the importance of vertices in a network is betweenness centrality. The centrality is computed using information about the shortest paths that exist in the graph, so this approach puts the importance on the edges that connect the vertices. However, not all vertices are equal: some may be more important than others or have a more significant influence on the behavior of the network. We therefore introduce a modification of the betweenness centrality algorithm that takes vertex importance into account. This allows the betweenness centrality score to be further refined to better match the needs of the network. We demonstrate the idea on a real traffic network. We test the performance of the algorithm on traffic network data from the city of Bratislava, Slovakia, to show that the modification does not significantly slow down the original algorithm. We also provide a visualization of the traffic network of the city of Ostrava, the Czech Republic, to show the effect of the vertex importance adjustment. The algorithm was parallelized with MPI (http://www.mpi-forum.org/) and tested on the Salomon supercomputer (https://docs.it4i.cz/) at IT4Innovations National Supercomputing Center, the Czech Republic.
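    The abstract does not specify exactly how vertex importance enters the score, so the sketch below shows one natural variant: each shortest path's contribution to its interior vertices is weighted by the importance of its endpoints. The weighting scheme, the toy graph and the use of networkx are illustrative assumptions, not the paper's implementation.

```python
import itertools
from collections import Counter

import networkx as nx

def importance_weighted_betweenness(G, importance):
    """Betweenness-like score in which every shortest s-t path contributes
    importance[s] * importance[t] / sigma_st to each of its interior vertices.
    (Endpoint weighting is an illustrative choice, not the paper's definition.)"""
    score = Counter({v: 0.0 for v in G})
    for s, t in itertools.permutations(G.nodes, 2):
        try:
            paths = list(nx.all_shortest_paths(G, s, t))
        except nx.NetworkXNoPath:
            continue
        w = importance[s] * importance[t] / len(paths)
        for path in paths:
            for v in path[1:-1]:                 # interior vertices only
                score[v] += w
    return dict(score)

# Small road-network-like example: vertex 'c' lies on most shortest paths, and
# boosting the importance of 'a' and 'e' boosts the routes passing between them.
G = nx.Graph([("a", "c"), ("b", "c"), ("c", "d"), ("d", "e"), ("c", "e")])
uniform = {v: 1.0 for v in G}
boosted = dict(uniform, a=5.0, e=5.0)
print(importance_weighted_betweenness(G, uniform))
print(importance_weighted_betweenness(G, boosted))
```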

    Exploiting Chordality in Optimization Algorithms for Model Predictive Control

    In this chapter we show that chordal structure can be used to devise efficient optimization methods for many common model predictive control problems. The chordal structure is used both for computing search directions efficiently and for distributing all the other computations in an interior-point method for solving the problem. The chordal structure can stem both from the sequential nature of the problem and from distributed formulations related to scenario trees or other problem formulations. The framework enables efficient parallel computations.
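    To illustrate the kind of structure being exploited, the sketch below assembles the KKT system of a tiny equality-constrained MPC problem with stage-wise variable ordering; the resulting matrix is block-banded (its sparsity graph is chordal), so a sparse factorization scales linearly with the horizon. The model, weights and SciPy-based solve are illustrative assumptions and do not reproduce the chapter's interior-point or distributed algorithms.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Tiny equality-constrained MPC: minimize sum_k x_k'Q x_k + u_k'R u_k subject to
# x_{k+1} = A x_k + B u_k and x_0 fixed. With variables ordered stage by stage
# (x_0, u_0, x_1, u_1, ..., x_N), the KKT matrix is block-banded; banded sparsity
# graphs are chordal, which is what makes an O(N) factorization possible.

n, m, N = 2, 1, 20                               # state dim, input dim, horizon
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.005], [0.1]])
Q, R = np.eye(n), 0.1 * np.eye(m)
x0 = np.array([1.0, 0.0])

nz = (N + 1) * n + N * m                         # x_0..x_N and u_0..u_{N-1}
xcol = lambda k: k * (n + m)                     # column offset of x_k
ucol = lambda k: k * (n + m) + n                 # column offset of u_k

H = sp.block_diag([Q, R] * N + [Q], format="csc")    # block-diagonal Hessian

# Equality constraints: x_0 = x0 and x_{k+1} - A x_k - B u_k = 0 for each stage.
G = np.zeros(((N + 1) * n, nz))
G[0:n, xcol(0):xcol(0) + n] = np.eye(n)
for k in range(N):
    r = (k + 1) * n
    G[r:r + n, xcol(k):xcol(k) + n] = -A
    G[r:r + n, ucol(k):ucol(k) + m] = -B
    G[r:r + n, xcol(k + 1):xcol(k + 1) + n] = np.eye(n)
Gs = sp.csr_matrix(G)

KKT = sp.bmat([[H, Gs.T], [Gs, None]], format="csc")
rhs = np.concatenate([np.zeros(nz), x0, np.zeros(N * n)])
sol = spla.spsolve(KKT, rhs)
print("first optimal input u_0:", sol[ucol(0):ucol(0) + m])
print("KKT size:", KKT.shape, "nonzeros:", KKT.nnz)   # nonzeros grow linearly in N
```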