The Conditional Common Information in Classical and Quantum Secret Key Distillation
© 2018 IEEE. In this paper, we consider two extensions of the Gács-Körner common information to three variables, the conditional common information (cCI) and the coarse-grained conditional common information (ccCI). Both quantities are shown to be useful technical tools in the study of classical and quantum resource transformations. In particular, the ccCI is shown to have an operational interpretation as the optimal rate of secret key extraction from an eavesdropped classical source p_XYZ when Alice (X) and Bob (Y) are unable to communicate but share common randomness with the eavesdropper Eve (Z). Moving to the quantum setting, we consider two different ways of generating a tripartite quantum state from classical correlations p_XYZ: (1) coherent encodings ∑_{xyz} √p_{xyz} |xyz⟩ and (2) incoherent encodings ∑_{xyz} p_{xyz} |xyz⟩⟨xyz|. We study how well Alice and Bob can extract secret key from these quantum sources using quantum operations, compared with the extraction of key from the underlying classical source p_XYZ using classical operations. While the power of quantum mechanics increases Alice and Bob's ability to generate shared randomness, it also equips Eve with a greater arsenal of eavesdropping attacks. Therefore, it is not obvious who gains the greater advantage for distilling secret key when a classical source is replaced with a quantum one. We first demonstrate that the classical key rate of p_XYZ is equivalent to the quantum key rate for an incoherent quantum encoding of the distribution. For coherent encodings, we then show that the classical and quantum rates are generally incomparable; in fact, their difference can be arbitrarily large in either direction. Finally, we introduce a "zoo" of entangled tripartite states, all characterized by the conditional common information of their encoded probability distributions. Remarkably, for these states almost all entanglement measures, such as Alice and Bob's entanglement cost, squashed entanglement, and relative entropy of entanglement, can be sharply bounded or even exactly expressed in terms of the conditional common information. In the latter case, we thus present a rare instance in which the various entropic entanglement measures of a quantum state can be explicitly calculated.
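As background for the quantities referenced above: the standard (unconditional) Gács-Körner common information, which the paper's cCI and ccCI extend to three variables, and the two encodings described in the abstract (with Alice, Bob and Eve holding the X, Y and Z systems, respectively) can be written as
\[
  C_{\mathrm{GK}}(X;Y) \;=\; \max_{f,\,g\,:\,f(X)=g(Y)\ \text{a.s.}} H\bigl(f(X)\bigr),
\]
\[
  |\psi\rangle_{ABE} = \sum_{x,y,z} \sqrt{p_{XYZ}(x,y,z)}\,|xyz\rangle \quad\text{(coherent)},
  \qquad
  \rho_{ABE} = \sum_{x,y,z} p_{XYZ}(x,y,z)\,|xyz\rangle\langle xyz| \quad\text{(incoherent)}.
\]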
Classical Analog to Entanglement Reversibility
© 2015 American Physical Society. In this Letter we study the problem of secrecy reversibility. This asks when two honest parties can distill secret bits from some tripartite distribution p_XYZ and transform secret bits back into p_XYZ at equal rates using local operations and public communication. This is the classical analog to the well-studied problem of reversibly concentrating and diluting entanglement in a quantum state. We identify the structure of distributions possessing reversible secrecy when one of the honest parties holds a binary distribution, and it is possible that all reversible distributions have this form. These distributions are more general than what is obtained by simply constructing a classical analog to the family of quantum states known to have reversible entanglement. An indispensable tool used in our analysis is a conditional form of the Gács-Körner common information.
The Control System for the Cryogenics in the LHC Tunnel [First Experience and Improvements]
The Large Hadron Collider (LHC) was commissioned at CERN and started operation with beams in 2008. Several months of operation in nominal cryogenic conditions have triggered an optimisation of the process functional analysis. This led to a few revisions of the control logic, which were realised on the fly. During the 2008-09 shutdown, and in order to enhance the safety, availability and operability of the LHC cryogenics, a major rebuild of the logic and several hardware modifications were implemented. The databases containing instrument and control information are being rationalised, and the automatic tool that extracts data for the control software is being simplified. This paper describes the main improvements and suggests perspectives for further developments.
Orbital dynamics of "smart dust" devices with solar radiation pressure and drag
This paper investigates how perturbations due to asymmetric solar radiation pressure, in the presence of Earth shadow, and atmospheric drag can be balanced to obtain long-lived Earth-centred orbits for swarms of micro-scale 'smart dust' devices, without the use of active control. The secular variation of the Keplerian elements is expressed analytically through an averaging technique. Families of solutions are then identified where Sun-synchronous apse-line precession is achieved passively to maintain asymmetric solar radiation pressure. The long-term orbit evolution is characterized by librational motion, progressively decaying due to the non-conservative effect of atmospheric drag. Long-lived orbits can then be designed through the interaction of energy gain from asymmetric solar radiation pressure and energy dissipation due to drag. In this way, the usually short drag lifetime of such high area-to-mass spacecraft can be greatly extended (and indeed selected). In addition, the effect of atmospheric drag can be exploited to ensure the rapid end-of-life decay of such devices, thus preventing long-lived orbital debris.
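The averaging technique mentioned above can be sketched generically: the secular rate of any Keplerian element α is the instantaneous variational rate averaged over one orbit. The expression below is only a generic sketch; the paper's analytical forms additionally account for the Earth-shadow arc and the drag model:
\[
  \left\langle \frac{d\alpha}{dt} \right\rangle
  = \frac{1}{T}\int_{0}^{T} \frac{d\alpha}{dt}\,dt
  = \frac{1}{2\pi}\int_{0}^{2\pi} \frac{d\alpha}{dt}\,dM ,
\]
where T is the orbital period and M the mean anomaly (the last equality uses dt = dM/n with mean motion n = 2π/T).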
First Experience with the LHC Cryogenic Instrumentation
The LHC, under commissioning at CERN, will be the world's largest superconducting accelerator and therefore makes extensive use of cryogenic instruments. These instruments are installed in the tunnel and thus have to withstand the LHC environment, which imposes radiation-tolerant design and construction. Most of the instruments require individual calibration, and some of them exist in several variants as regards measuring span; all relevant data are therefore stored in an Oracle® database. Those data are used for the various quality assurance procedures defined for installation and commissioning, as well as for generating the tables used by the control system to automatically configure the input/output channels. This paper describes the commissioning of the sensors and of the corresponding electronics, and the first measurement results during the cool-down of one machine sector; it discusses the different problems encountered and their corresponding solutions.
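The abstract does not detail the extraction tool itself, but the workflow it describes (per-instrument calibration data held in a database, then turned into configuration tables for the control system's input/output channels) might look roughly like the following sketch. The table and column names are purely hypothetical, and sqlite3 is used here only as a stand-in for the Oracle database mentioned above.

```python
import sqlite3  # stand-in for the Oracle database mentioned in the abstract
import csv

# Hypothetical schema: one row per installed cryogenic instrument, with its
# individual calibration coefficients and measuring span.
QUERY = """
SELECT channel_id, sensor_type, span_low, span_high, cal_a, cal_b
FROM instruments
WHERE sector = ?
"""

def export_io_config(db_path: str, sector: str, out_csv: str) -> int:
    """Dump one sector's instrument data into a configuration table
    that a control system could load to set up its input channels."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(QUERY, (sector,)).fetchall()
    conn.close()

    with open(out_csv, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["channel_id", "sensor_type",
                         "span_low", "span_high", "cal_a", "cal_b"])
        writer.writerows(rows)
    return len(rows)

if __name__ == "__main__":
    n = export_io_config("instruments.db", "S34", "sector_S34_io.csv")
    print(f"exported {n} channels")
```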
Prevalence and clinical implications of respiratory viruses in stable chronic obstructive pulmonary disease (COPD) and exacerbations: a systematic review and meta-analysis protocol.
INTRODUCTION: Both stable chronic obstructive pulmonary disease (COPD) and acute exacerbations represent leading causes of death, disability and healthcare expenditure. They are complex, heterogeneous and their mechanisms are poorly understood. The role of respiratory viruses has been studied extensively but is still not adequately addressed clinically. Through a rigorous evidence update, we aim to define the prevalence and clinical burden of the different respiratory viruses in stable COPD and exacerbations, and to investigate whether viral load of usual respiratory viruses could be used for diagnosis of exacerbations triggered by viruses, which are currently not diagnosed or treated aetiologically. METHODS AND ANALYSIS: Based on a prospectively registered protocol, we will systematically review the literature using standard methods recommended by the Cochrane Collaboration and the Grading of Recommendations Assessment, Development and Evaluation working group. We will search Medline/PubMed, Excerpta Medica dataBASE (EMBASE), the Cochrane Library, the WHO's Clinical Trials Registry and the proceedings of relevant international conferences on 2 March 2020. We will evaluate: (A) the prevalence of respiratory viruses in stable COPD and exacerbations, (B) differences in the viral loads of respiratory viruses in stable COPD vs exacerbations, to explore whether the viral load of prevalent respiratory viruses could be used as a diagnostic biomarker for exacerbations triggered by viruses and (C) the association between the presence of respiratory viruses and clinical outcomes in stable COPD and in exacerbations. ETHICS AND DISSEMINATION: Ethics approval is not required since no primary data will be collected. Our findings will be presented in national and international scientific conferences and will be published in peer reviewed journals. Respiratory viruses currently represent a lost opportunity to improve the outcomes of both stable COPD and exacerbations. Our work aspires to 'demystify' the prevalence and clinical burden of viruses in stable COPD and exacerbations and to promote clinical and translational research. PROSPERO REGISTRATION NUMBER: CRD42019147658
Individual variability in cardiac biomarker release after 30 min of high-intensity rowing in elite and amateur athletes
This study had two objectives: (i) to examine individual variation in the pattern of cardiac troponin I (cTnI) and N-terminal pro-brain natriuretic peptide (NT-proBNP) release in response to high-intensity rowing exercise, and (ii) to establish whether individual heterogeneity in biomarker appearance was influenced by athletic status (elite vs. amateur). We examined cTnI and NT-proBNP in 18 elite and 14 amateur rowers before and 5 min, 1, 3, 6, 12, and 24 h after a 30-min maximal rowing test. Compared with pre-exercise levels, peak postexercise cTnI (pre: 0.014 ± 0.030 μg·L⁻¹; peak post: 0.058 ± 0.091 μg·L⁻¹; p = 0.000) and NT-proBNP (pre: 15 ± 11 ng·L⁻¹; peak post: 31 ± 19 ng·L⁻¹; p = 0.000) were elevated. Substantial individual heterogeneity in peak and time-course data was noted for cTnI. Peak cTnI exceeded the upper reference limit (URL) in 9 elite and 3 amateur rowers. No rower exceeded the URL for NT-proBNP. Elite rowers had higher baseline (0.019 ± 0.038 vs. 0.008 ± 0.015 μg·L⁻¹; p = 0.003) and peak postexercise cTnI (0.080 ± 0.115 vs. 0.030 ± 0.029 μg·L⁻¹; p = 0.022) than amateur rowers, but the change with exercise was similar between groups. There were no significant differences in baseline and peak postexercise NT-proBNP between groups. In summary, marked individuality in the cTnI response to a short but high-intensity rowing bout was observed. Athletic status did not seem to affect the change in cardiac biomarkers in response to high-intensity exercise.
Quality standards for managing children and adolescents with bronchiectasis: an international consensus
The global burden of bronchiectasis in children and adolescents is increasingly being recognised. However, marked inequity exists between, and within, settings and countries in the resources and standards of care afforded to children and adolescents with bronchiectasis compared with those with other chronic lung diseases. The European Respiratory Society (ERS) clinical practice guideline for the management of bronchiectasis in children and adolescents was published recently. Here we present an international consensus of quality standards of care for children and adolescents with bronchiectasis based upon this guideline. The panel used a standardised approach that included a Delphi process with 201 respondents from the parents and patients' survey, and 299 physicians (across 54 countries) who care for children and adolescents with bronchiectasis. The seven quality standards of care statements developed by the panel address the current absence of quality standards for clinical care related to paediatric bronchiectasis. These internationally derived, clinician-, parent- and patient-informed, consensus-based quality standards statements can be used by parents and patients to access and advocate for quality care for their children and themselves, respectively. They can also be used by healthcare professionals to advocate for their patients, and by health services as a monitoring tool, to help optimise health outcomes.
A comparative analysis of predictive models of morbidity in intensive care unit after cardiac surgery – Part II: an illustrative example
Background: Popular predictive models for estimating morbidity probability after heart surgery are compared critically in a unitary framework. The study is divided into two parts. In the first part, modelling techniques and the intrinsic strengths and weaknesses of the different approaches were discussed from a theoretical point of view. In this second part, the performances of the same models are evaluated in an illustrative example. Methods: Eight models were developed: Bayes linear and quadratic models, k-nearest neighbour model, logistic regression model, Higgins and direct scoring systems, and two feed-forward artificial neural networks with one and two layers. Cardiovascular, respiratory, neurological, renal, infectious and hemorrhagic complications were defined as morbidity. Training and testing sets of 545 cases each were used. The optimal set of predictors was chosen from a collection of 78 preoperative, intraoperative and postoperative variables by a stepwise procedure. Discrimination and calibration were evaluated by the area under the receiver operating characteristic curve and the Hosmer-Lemeshow goodness-of-fit test, respectively. Results: Scoring systems and the logistic regression model required the largest set of predictors, while the Bayesian and k-nearest neighbour models were much more parsimonious. On the testing data, all models showed acceptable discrimination capacity; however, the Bayes quadratic model, using only three predictors, provided the best performance. All models showed satisfactory generalization ability: again the Bayes quadratic model exhibited the best generalization, while the artificial neural networks and scoring systems gave the worst results. Finally, poor calibration was obtained with the scoring systems, the k-nearest neighbour model and the artificial neural networks, while the Bayes (after recalibration) and logistic regression models gave adequate results. Conclusion: Although all the predictive models showed acceptable discrimination performance in the example considered, the Bayes and logistic regression models seemed better than the others because they also had good generalization and calibration. The Bayes quadratic model appeared to be a convincing alternative to the much more common Bayes linear and logistic regression models, showing its capacity to identify a minimum core of predictors generally recognized as essential for pragmatically evaluating the risk of developing morbidity after heart surgery.
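As an illustration of the two evaluation criteria named above (discrimination via the area under the ROC curve, calibration via the Hosmer-Lemeshow test), the following is a minimal, self-contained sketch. It uses a generic logistic regression fitted to synthetic data, not the paper's models or dataset.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def hosmer_lemeshow(y_true, y_prob, n_groups=10):
    """Hosmer-Lemeshow goodness-of-fit statistic: group cases into deciles of
    predicted risk and compare observed vs expected event counts."""
    order = np.argsort(y_prob)
    groups = np.array_split(order, n_groups)
    stat = 0.0
    for g in groups:
        obs = y_true[g].sum()          # observed events in the group
        exp = y_prob[g].sum()          # expected events in the group
        n = len(g)
        p_bar = exp / n                # mean predicted risk in the group
        denom = n * p_bar * (1.0 - p_bar)
        if denom > 0:                  # guard against degenerate groups
            stat += (obs - exp) ** 2 / denom
    p_value = chi2.sf(stat, n_groups - 2)
    return stat, p_value

# Synthetic stand-in for a morbidity dataset (binary outcome, ~20% event rate).
X, y = make_classification(n_samples=1090, n_features=10,
                           weights=[0.8], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5,
                                                    random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
prob = model.predict_proba(X_test)[:, 1]

print("AUC (discrimination):", roc_auc_score(y_test, prob))
print("Hosmer-Lemeshow statistic and p-value (calibration):",
      hosmer_lemeshow(y_test, prob))
```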