3,047 research outputs found

    White cell count in the normal range and short-term and long-term mortality: international comparisons of electronic health record cohorts in England and New Zealand

    OBJECTIVES: Electronic health records offer the opportunity to discover new clinical implications for established blood tests, but international comparisons have been lacking. We tested the association of total white cell count (WBC) with all-cause mortality in England and New Zealand. SETTING: Primary care practices in England (ClinicAl research using LInked Bespoke studies and Electronic health Records (CALIBER)) and New Zealand (PREDICT). DESIGN: Analysis of linked electronic health record data sets: CALIBER (primary care, hospitalisation, mortality and acute coronary syndrome registry) and PREDICT (cardiovascular risk assessments in primary care, hospitalisations, mortality, dispensed medication and laboratory results). PARTICIPANTS: People aged 30-75 years with no prior cardiovascular disease (CALIBER: N=686 475, 92.0% white; PREDICT: N=194 513, 53.5% European, 14.7% Pacific, 13.4% Maori), followed until death, transfer out of practice (in CALIBER) or study end. PRIMARY OUTCOME MEASURE: HRs for mortality were estimated using Cox models adjusted for age, sex, smoking, diabetes, systolic blood pressure, ethnicity and total:high-density lipoprotein (HDL) cholesterol ratio. RESULTS: We found 'J'-shaped associations between WBC and mortality; the second quintile was associated with the lowest risk in both cohorts. High WBC within the reference range (8.65-10.05×10⁹/L) was associated with significantly increased mortality compared with the middle quintile (6.25-7.25×10⁹/L); adjusted HR 1.51 (95% CI 1.43 to 1.59) in CALIBER and 1.33 (95% CI 1.06 to 1.65) in PREDICT. WBC outside the reference range was associated with even greater mortality. The association was stronger over the first 6 months of follow-up, but similar across ethnic groups. CONCLUSIONS: Clinically recorded WBC within the range considered 'normal' is associated with mortality in ethnically different populations from two countries, particularly within the first 6 months.
Large-scale international comparisons of electronic health record cohorts might yield new insights from widely performed clinical tests. TRIAL REGISTRATION NUMBER: NCT02014610
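
The published analysis used Cox models adjusted for multiple covariates; as a minimal, hypothetical sketch of the underlying quintile comparison only (synthetic 0/1 death indicators, crude unadjusted proportions, all function names invented, not the study's method or data):

```python
# Illustrative sketch: bin a blood-test value (e.g. WBC) into quintiles and
# compare crude death proportions against the middle (reference) quintile.
# This is NOT a Cox model and uses no covariate adjustment.

def quintile_cutpoints(values):
    """Return the four cut points that split the sorted values into quintiles."""
    s = sorted(values)
    n = len(s)
    return [s[(n * k) // 5] for k in range(1, 5)]

def assign_quintile(x, cuts):
    """Map a single value to quintile 1..5 using the four cut points."""
    for i, c in enumerate(cuts):
        if x < c:
            return i + 1
    return 5

def crude_risk_ratio(outcomes_by_quintile, reference=3):
    """Crude death proportion of each quintile divided by that of the
    reference quintile; outcomes_by_quintile maps quintile -> list of 0/1."""
    ref = outcomes_by_quintile[reference]
    ref_risk = sum(ref) / len(ref)
    return {q: (sum(v) / len(v)) / ref_risk
            for q, v in outcomes_by_quintile.items()}
```

In the actual study the analogous contrast (top vs. middle quintile of the reference range) was estimated as an adjusted hazard ratio rather than this crude ratio.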

    Approximate Consensus in Highly Dynamic Networks: The Role of Averaging Algorithms

    In this paper, we investigate the approximate consensus problem in highly dynamic networks in which topology may change continually and unpredictably. We prove that in both synchronous and partially synchronous systems, approximate consensus is solvable if and only if the communication graph in each round has a rooted spanning tree, i.e., there is a coordinator at each time. The striking point in this result is that the coordinator is not required to be unique and can change arbitrarily from round to round. Interestingly, the class of averaging algorithms, which are memoryless and require no process identifiers, entirely captures the solvability issue of approximate consensus in that the problem is solvable if and only if it can be solved using any averaging algorithm. Concerning the time complexity of averaging algorithms, we show that approximate consensus can be achieved with precision ε in a coordinated network model in O(n^{n+1} log(1/ε)) synchronous rounds, and in O(Δ n^{nΔ+1} log(1/ε)) rounds when the maximum round delay for a message to be delivered is Δ. While in general, an upper bound on the time complexity of averaging algorithms has to be exponential, we investigate various network models in which this exponential bound in the number of nodes reduces to a polynomial bound. We apply our results to networked systems with a fixed topology and classical benign fault models, and deduce both known and new results for approximate consensus in these systems. In particular, we show that for solving approximate consensus, a complete network can tolerate up to 2n-3 arbitrarily located link faults at every round, in contrast with the impossibility result established by Santoro and Widmayer (STACS '89) showing that exact consensus is not solvable with n-1 link faults per round originating from the same node.
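
An averaging algorithm of the kind discussed above can be sketched in a few lines: in each round every node replaces its value with the mean of the values it receives (its in-neighbours plus itself). The topology, values and function names below are illustrative, not taken from the paper:

```python
# Sketch of an averaging algorithm for approximate consensus. Each node is
# memoryless and identifier-free: it only averages the values it hears.

def averaging_round(values, in_neighbours):
    """One synchronous round: node i averages over in_neighbours[i],
    which must include i itself (a node always hears its own value)."""
    return [
        sum(values[j] for j in in_neighbours[i]) / len(in_neighbours[i])
        for i in range(len(values))
    ]

def approximate_consensus(values, topologies, eps):
    """Run averaging rounds, cycling through a (possibly changing) sequence
    of communication graphs, until all values lie within eps of each other.
    Termination is only guaranteed when each round's graph has a rooted
    spanning tree, matching the solvability condition stated above."""
    rounds = 0
    while max(values) - min(values) >= eps:
        graph = topologies[rounds % len(topologies)]
        values = averaging_round(values, graph)
        rounds += 1
    return values, rounds
```

For example, a star with node 0 as the root (every round's graph has a rooted spanning tree, with node 0 as coordinator) drives the spread of values geometrically to zero while each value stays inside the initial range, as validity requires.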

    Do logarithmic proximity measures outperform plain ones in graph clustering?

    We consider a number of graph kernels and proximity measures including the commute time kernel, regularized Laplacian kernel, heat kernel, exponential diffusion kernel (also called "communicability"), etc., and the corresponding distances as applied to clustering nodes in random graphs and several well-known datasets. The model for generating random graphs involves edge probabilities for the pairs of nodes that belong to the same class or different predefined classes of nodes. It turns out that in most cases, logarithmic measures (i.e., measures obtained by taking the logarithm of the proximities) perform better at distinguishing the underlying classes than the "plain" measures. A comparison in terms of reject curves of inter-class and intra-class distances confirms this conclusion. A similar conclusion can be made for several well-known datasets. A possible origin of this effect is that most kernels have a multiplicative nature, while the nature of distances used in clustering algorithms is an additive one (cf. the triangle inequality). The logarithmic transformation is a tool to transform the first nature into the second one. Moreover, some distances corresponding to the logarithmic measures possess a meaningful cutpoint additivity property. In our experiments, the leader is usually the logarithmic Communicability measure. However, we indicate some more complicated cases in which other measures, typically Communicability and plain Walk, can be the winners. Comment: 11 pages, 5 tables, 9 figures. Accepted for publication in the Proceedings of the 6th International Conference on Network Analysis, May 26-28, 2016, Nizhny Novgorod, Russia.
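
The multiplicative-to-additive idea can be made concrete with the communicability kernel, which is the matrix exponential of the adjacency matrix; taking the elementwise logarithm gives the "logarithmic" measure discussed above. This is a self-contained sketch (truncated-series exponential, small hand-built graph), not the authors' experimental code:

```python
import math

# Communicability kernel expm(A) computed by a truncated power series;
# adequate for small graphs (entries of A^k grow like (largest eigenvalue)^k,
# so the k! denominator makes the series converge quickly).

def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(a, terms=30):
    """expm(A) ≈ sum_{k=0}^{terms-1} A^k / k!"""
    n = len(a)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # I
    power = [[float(i == j) for j in range(n)] for i in range(n)]   # A^0
    for k in range(1, terms):
        power = mat_mul(power, a)
        for i in range(n):
            for j in range(n):
                result[i][j] += power[i][j] / math.factorial(k)
    return result

def log_measure(kernel):
    """Elementwise logarithm: turns the multiplicative kernel into an
    additive proximity measure (the transformation discussed above)."""
    return [[math.log(x) for x in row] for row in kernel]
```

On the path graph 1-2-3, for instance, adjacent nodes get a larger communicability than the two endpoints, and the ordering is preserved (with compressed magnitudes) after the logarithmic transformation.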

    Prolonged dual anti-platelet therapy in stable coronary disease: a comparative observational study of benefits and harms in unselected versus trial populations

    Objective: To estimate the potential magnitude in unselected patients of the benefits and harms of prolonged dual antiplatelet therapy after acute myocardial infarction seen in selected patients with high risk characteristics in trials. Design: Observational population based cohort study. Setting: PEGASUS-TIMI-54 trial population and CALIBER (ClinicAl research using LInked Bespoke studies and Electronic health Records). Participants: 7238 patients who survived a year or more after acute myocardial infarction. Interventions: Prolonged dual antiplatelet therapy after acute myocardial infarction. Main outcome measures: Recurrent acute myocardial infarction, stroke, or fatal cardiovascular disease. Fatal, severe, or intracranial bleeding. Results: 1676/7238 (23.1%) patients met trial inclusion and exclusion criteria (“target” population). Compared with the placebo arm in the trial population, in the target population the median age was 12 years higher, there were more women (48.6% v 24.3%), and there was a substantially higher cumulative three year risk of both the primary (benefit) trial endpoint of recurrent acute myocardial infarction, stroke, or fatal cardiovascular disease (18.8% (95% confidence interval 16.3% to 21.8%) v 9.04%) and the primary (harm) endpoint of fatal, severe, or intracranial bleeding (3.0% (2.0% to 4.4%) v 1.26% (TIMI major bleeding)). Application of intention to treat relative risks from the trial (ticagrelor 60 mg daily arm) to CALIBER’s target population showed an estimated 101 (95% confidence interval 87 to 117) ischaemic events prevented per 10 000 treated per year and an estimated 75 (50 to 110) excess fatal, severe, or intracranial bleeds caused per 10 000 patients treated per year. 
Generalisation from CALIBER’s target subgroup to all 7238 real-world patients who were stable at least one year after acute myocardial infarction showed similar three-year risks of ischaemic events (17.2%, 16.0% to 18.5%), with an estimated 92 (86 to 99) events prevented per 10 000 patients treated per year, and similar three-year risks of bleeding events (2.3%, 1.8% to 2.9%), with an estimated 58 (45 to 73) events caused per 10 000 patients treated per year. Conclusions: This novel use of primary-secondary care linked electronic health records allows characterisation of “healthy trial participant” effects and confirms the potential absolute benefits and harms of dual antiplatelet therapy in representative patients a year or more after acute myocardial infarction.
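
The arithmetic behind "events prevented per 10 000 treated per year" can be sketched as follows. The function and the crude even-annualisation are illustrative, and the relative risk used in the example is an assumed round number, not a figure reported in the abstract:

```python
def events_prevented_per_10000_per_year(three_year_risk, relative_risk):
    """Convert a cumulative three-year absolute risk and a treatment
    relative risk into events prevented per 10 000 patients per year.
    Crude annualisation: the three-year risk is divided evenly over
    three years (an illustrative simplification)."""
    yearly_risk = three_year_risk / 3
    absolute_reduction = yearly_risk * (1 - relative_risk)
    return absolute_reduction * 10000
```

With the target population's 18.8% three-year ischaemic risk and an assumed relative risk of about 0.84, this crude calculation lands near 100 events prevented per 10 000 per year, the same order as the 101 reported above; the study itself applied the trial's intention-to-treat relative risks with proper confidence intervals.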

    Optimized T1- and T2-weighted volumetric brain imaging as a diagnostic tool in very preterm neonates.

    BACKGROUND: T1- and T2-W MR sequences used for obtaining diagnostic information and morphometric measurements in the neonatal brain are frequently acquired using different imaging protocols. Optimizing one protocol for obtaining both kinds of information is valuable. OBJECTIVE: To determine whether high-resolution T1- and T2-W volumetric sequences optimized for preterm brain imaging could provide both diagnostic and morphometric value. MATERIALS AND METHODS: Thirty preterm neonates born between 24 and 32 weeks' gestational age were scanned during the first 2 weeks after birth. T1- and T2-W high-resolution sequences were optimized in terms of signal-to-noise ratio, contrast-to-noise ratio and scan time and compared to conventional spin-echo-based sequences. RESULTS: No differences were found between conventional and high-resolution T1-W sequences for diagnostic confidence, image quality and motion artifacts. A preference for conventional over high-resolution T2-W sequences for image quality was observed. High-resolution T1 images provided better delineation of thalamic myelination and the superior temporal sulcus. No differences were found for detection of myelination and sulcation using conventional and high-resolution T2-W images. CONCLUSION: High-resolution T1- and T2-W volumetric sequences can be used in clinical MRI in the very preterm brain to provide both diagnostic and morphometric information
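
The signal-to-noise and contrast-to-noise ratios used as optimisation targets above are commonly defined as a mean tissue intensity (or intensity difference) over the standard deviation of a background noise region. This is a generic sketch of those textbook definitions, not the authors' exact measurement pipeline:

```python
import statistics

def snr(signal_roi, noise_roi):
    """Signal-to-noise ratio: mean intensity in a tissue region of interest
    divided by the standard deviation of a background/noise region."""
    return statistics.mean(signal_roi) / statistics.stdev(noise_roi)

def cnr(tissue_a, tissue_b, noise_roi):
    """Contrast-to-noise ratio: absolute difference of mean intensities of
    two tissue regions over the noise standard deviation."""
    contrast = abs(statistics.mean(tissue_a) - statistics.mean(tissue_b))
    return contrast / statistics.stdev(noise_roi)
```

CNR is symmetric in the two tissues, which is why it captures delineation (e.g. of thalamic myelination against surrounding tissue) regardless of which tissue is brighter.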

    Asthma in childhood: a complex, heterogeneous disease

    Asthma in childhood is a heterogeneous disease with different phenotypes and variable clinical manifestations, which depend on the age, gender, genetic background, and environmental influences of the patients. Several longitudinal studies have been conducted to classify the phenotypes of childhood asthma, on the basis of the symptoms, triggers of wheezing illness, or pathophysiological features of the disease. These studies have provided us with important information about the different wheezing phenotypes in young children and about potential mechanisms and risk factors for the development of chronic asthma. The goal of these studies was to provide a better insight into the causes and natural course of childhood asthma. It is well-known that complicated interactions between genes and environmental factors contribute to the development of asthma. Because childhood is a period of rapid growth in both the lungs and the immune system, developmental factors should be considered in the pathogenesis of childhood asthma. The pulmonary system continues to grow and develop until linear growth is completed. Longitudinal studies have reported significant age-related immune development during postnatal early life. These observations suggest that the phenotypes of childhood asthma vary among children and also in an individual child over time. Improved classification of heterogeneous conditions of the disease will help determine novel strategies for primary and secondary prevention and for the development of individualized treatment for childhood asthma

    Nonattendance in pediatric pulmonary clinics: an ambulatory survey

    Background: Nonattendance for scheduled appointments disturbs the effective management of pediatric pulmonary clinics. We hypothesized that the reasons for nonattendance and the necessary solutions might differ in pediatric pulmonary medicine from those in other pediatric fields. We therefore investigated the factors associated with nonattendance in this field in order to devise a corrective strategy. Methods: The effects of age, gender, ethnic origin, waiting time for an appointment and the timing of appointments during the day on the nonattendance proportion were assessed. Chi-square tests were used to analyze statistically significant differences in categorical variables. Logistic regression models were used for multivariate analysis. Results: A total of 1190 pediatric pulmonology clinic visits over a 21-month period were included in the study. The overall proportion of nonattendance was 30.6%. Nonattendance was 23.8% when there was a short waiting time for an appointment (1–7 days) and 36.3% when there was a long waiting time (8 days and above) (p < 0.001). Nonattendance was 28.7% between 8 a.m. and 3 p.m. and 37.5% after 3 p.m. (p = 0.007). Jewish rural patients had 15.4% nonattendance, Jewish urban patients had 31.2% nonattendance and Bedouin patients had 32.9% nonattendance (p < 0.004). Age and gender were not significantly associated with nonattendance proportions. A multivariate logistic regression model demonstrated that the waiting time for an appointment, the time of day and the patients' origin were significantly associated with nonattendance. Conclusion: The factors associated with nonattendance in pediatric pulmonary clinics include the length of waiting time for an appointment, the hour of the appointment within the day and the origin of the patient.
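
The chi-square comparisons above contrast nonattendance proportions between two groups, which reduces to a Pearson chi-square statistic on a 2×2 table. A minimal sketch (the example counts in the usage note are hypothetical, since the abstract reports only percentages):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
        [[a, b],
         [c, d]]
    e.g. rows = short/long waiting time, columns = attended/missed."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))
```

A perfectly proportional table (e.g. 10/10 vs. 20/20) gives a statistic of zero, while a strongly skewed one (e.g. 30/10 vs. 10/30) exceeds the 3.84 critical value for p < 0.05 at one degree of freedom.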

    Baryon Washout, Electroweak Phase Transition, and Perturbation Theory

    We analyze the conventional perturbative treatment of sphaleron-induced baryon number washout relevant for electroweak baryogenesis and show that it is not gauge-independent due to the failure to consistently implement the Nielsen identities order-by-order in perturbation theory. We provide a gauge-independent criterion for baryon number preservation in place of the conventional (gauge-dependent) criterion needed for successful electroweak baryogenesis. We also review the arguments leading to the preservation criterion and analyze several sources of theoretical uncertainty in obtaining a numerical bound. In various beyond-the-standard-model scenarios, a realistic perturbative treatment will likely require knowledge of the complete two-loop finite temperature effective potential and the one-loop sphaleron rate. Comment: 25 pages, 9 figures; v2 minor typos corrected.
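
For context, the conventional criterion the abstract refers to, for a strongly first-order electroweak phase transition that avoids washing out the baryon asymmetry, is commonly quoted as:

```latex
% Conventional baryon number preservation criterion: the ratio of the
% broken-phase Higgs vacuum expectation value to the critical temperature
% should exceed roughly unity. This quantity is the gauge-dependent one
% that the abstract critiques.
\frac{v(T_c)}{T_c} \gtrsim 1
```

Because v(T_c) is extracted from the gauge-fixed effective potential, this ratio inherits gauge dependence unless the Nielsen identities are implemented consistently order by order, which motivates the gauge-independent criterion proposed in the paper.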