
    Relative validity of a web-based food frequency questionnaire for patients with type 1 and type 2 diabetes in Denmark

    BACKGROUND: Diet plays an important role in the management of diabetes. However, little is known about dietary intake among Danish diabetes patients. A food frequency questionnaire (FFQ) focusing on the nutrients most relevant in diabetes, including carbohydrates, dietary fibres and simple sugars, was developed and validated. OBJECTIVES: To examine the relative validity of nutrient intakes calculated by a web-based food frequency questionnaire for patients with diabetes. DESIGN: The FFQ was validated against a 4-day pre-coded food diary (FD). Intakes of nutrients were calculated, means of intake were compared, and individuals were cross-classified according to intake. To assess agreement between the two methods, Pearson and Spearman correlation coefficients and weighted kappa coefficients were calculated. SUBJECTS: Ninety patients (64 with type 1 diabetes and 26 with type 2 diabetes) agreed to participate in the study. Twenty-six were excluded from the final study population. SETTING: 64 volunteer diabetes patients at the Steno Diabetes Center. RESULTS: Intakes of carbohydrates, simple sugars, dietary fibres and total energy were higher according to the FFQ than the FD. However, on average 82% of subjects were classified into the same or adjacent quartiles of intake for the selected nutrients when comparing the two methods. In general, moderate agreement between the two methods was found. CONCLUSION: The FFQ was validated for the assessment of a range of nutrients. For the selected nutrients (carbohydrates, dietary fibres and simple sugars), patients were classified correctly according to low and high intakes. The FFQ is a reliable dietary assessment tool for use in research and in the evaluation of patient education for patients with diabetes.
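
    The agreement statistics this abstract relies on, cross-classification into quartiles and a weighted kappa coefficient, can be sketched in plain Python. This is a generic illustration of the statistics, not the authors' analysis code, and all names are assumptions:

```python
def quartile(values):
    """Assign each value to a quartile (0-3) by rank within the sample."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    n = len(values)
    q = [0] * n
    for rank, i in enumerate(order):
        q[i] = min(rank * 4 // n, 3)
    return q

def agreement_same_or_adjacent(qa, qb):
    """Fraction of subjects classified in the same or an adjacent quartile
    by the two methods (the abstract reports 82% on average)."""
    return sum(abs(a - b) <= 1 for a, b in zip(qa, qb)) / len(qa)

def weighted_kappa(qa, qb, k=4):
    """Linearly weighted kappa between two categorical ratings in 0..k-1:
    1 minus the ratio of observed to chance-expected weighted disagreement."""
    n = len(qa)
    w = lambda i, j: abs(i - j) / (k - 1)          # linear disagreement weight
    obs = sum(w(a, b) for a, b in zip(qa, qb)) / n
    pa = [sum(1 for a in qa if a == i) / n for i in range(k)]
    pb = [sum(1 for b in qb if b == i) / n for i in range(k)]
    exp = sum(w(i, j) * pa[i] * pb[j] for i in range(k) for j in range(k))
    return 1 - obs / exp
```

    Perfect agreement gives kappa = 1; values around 0.4-0.6 are conventionally read as the "moderate agreement" the abstract reports.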

    Observation of Coulomb-Assisted Dipole-Forbidden Intraexciton Transitions in Semiconductors

    We use terahertz pulses to induce resonant transitions between the eigenstates of optically generated exciton populations in a high-quality semiconductor quantum-well sample. Monitoring the excitonic photoluminescence, we observe transient quenching of the 1s exciton emission, which we attribute to the terahertz-induced 1s-to-2p excitation. Simultaneously, a pronounced enhancement of the 2s exciton emission is observed, despite the 1s-to-2s transition being dipole forbidden. A microscopic many-body theory explains the experimental observations as a Coulomb-scattering mixing of the 2s and 2p states, yielding an effective terahertz transition between the 1s and 2s populations.

    Biophysical suitability, economic pressure and land-cover change: a global probabilistic approach and insights for REDD+

    There has been a concerted effort by the international scientific community to understand the multiple causes and patterns of land-cover change to support sustainable land management. Here, we examined biophysical suitability and a novel integrated index of "Economic Pressure on Land" (EPL) to explain land cover in the year 2000, and estimated the likelihood of future land-cover change through 2050, including protected-area effectiveness. Biophysical suitability and EPL explained almost half of the global pattern of land cover (R^2 = 0.45), increasing to almost two-thirds in areas where a long-term equilibrium is likely to have been reached (e.g. R^2 = 0.64 in Europe). We identify a high likelihood of future land-cover change in vast areas with relatively lower current and past deforestation (e.g. the Congo Basin). Further, we simulated emissions arising from a "business as usual" scenario and two reducing emissions from deforestation and forest degradation (REDD) scenarios by incorporating data on biomass carbon. As our model incorporates all biome types, it highlights a crucial aspect of the ongoing REDD+ debate: if restricted to forests, "cross-biome leakage" would severely reduce REDD+ effectiveness for climate change mitigation. If forests were protected from deforestation yet without measures to tackle the drivers of land-cover change, REDD+ would reduce only 30% of total emissions from land-cover change. Fifty-five percent of the emissions reductions from forests would be compensated by increased emissions in other biomes. These results suggest that, although REDD+ remains a very promising mitigation tool, implementation of complementary measures to reduce land demand is necessary to prevent this leakage.
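
    The leakage arithmetic behind these figures is simple to make explicit. A minimal sketch, using the 55% leakage fraction quoted in the abstract but otherwise illustrative numbers and assumed names:

```python
def net_reduction_after_leakage(forest_avoided, leakage_fraction):
    """Net emissions reduction once cross-biome leakage offsets part of
    the avoided forest emissions."""
    return forest_avoided * (1.0 - leakage_fraction)

# Illustrative only: with 55% cross-biome leakage, avoiding 100 units of
# forest emissions yields a net reduction of just 45 units.
net = net_reduction_after_leakage(100.0, 0.55)
```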

    Measurement of the Tau Branching Fractions into Leptons

    Using data collected with the L3 detector near the Z resonance, corresponding to an integrated luminosity of 150 pb^{-1}, the branching fractions of the tau lepton into electron and muon are measured to be B(tau->e nu nu) = (17.806 +- 0.104 (stat.) +- 0.076 (syst.))%, B(tau->mu nu nu) = (17.342 +- 0.110 (stat.) +- 0.067 (syst.))%. From these results, the ratio of the charged-current coupling constants of the muon and the electron is determined to be g_mu/g_e = 1.0007 +- 0.0051. Assuming electron-muon universality, the Fermi constant is measured in tau lepton decays as G_F = (1.1616 +- 0.0058) x 10^{-5} GeV^{-2}. Furthermore, the coupling constant of the strong interaction at the tau mass scale is obtained as alpha_s(m_tau^2) = 0.322 +- 0.009 (exp.) +- 0.015 (theory).
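
    The universality test behind the quoted g_mu/g_e value follows the standard phase-space correction for the charged-lepton mass; the following is a schematic reconstruction from textbook tau physics, not taken from the paper itself:

```latex
\left(\frac{g_\mu}{g_e}\right)^2
  = \frac{B(\tau\to\mu\nu\bar{\nu})}{B(\tau\to e\nu\bar{\nu})}\,
    \frac{f(m_e^2/m_\tau^2)}{f(m_\mu^2/m_\tau^2)},
\qquad
f(x) = 1 - 8x + 8x^3 - x^4 - 12x^2\ln x .
```

    With the measured branching fractions, B(tau->mu)/B(tau->e) = 17.342/17.806 ≈ 0.9739, and f(m_mu^2/m_tau^2) ≈ 0.9726 while f(m_e^2/m_tau^2) ≈ 1, which reproduces g_mu/g_e ≈ 1.0007.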

    Search for Heavy Neutral and Charged Leptons in e+ e- Annihilation at LEP

    A search for exotic unstable neutral and charged heavy leptons, as well as for stable charged heavy leptons, is performed with the L3 detector at LEP. Sequential, vector and mirror natures of heavy leptons are considered. No evidence for their existence is found and lower limits on their masses are set.

    Robust automated detection of microstructural white matter degeneration in Alzheimer’s disease using machine learning classification of multicenter DTI data

    Diffusion tensor imaging (DTI) based assessment of white matter fiber tract integrity can support the diagnosis of Alzheimer's disease (AD). The use of DTI as a biomarker, however, depends on its applicability in a multicenter setting accounting for effects of different MRI scanners. We applied multivariate machine learning (ML) to a large multicenter sample from the recently created framework of the European DTI Study on Dementia (EDSD). We hypothesized that ML approaches may amend effects of multicenter acquisition. We included a sample of 137 patients with clinically probable AD (MMSE 20.6±5.3) and 143 healthy elderly controls, scanned on nine different scanners. For diagnostic classification we used the DTI indices fractional anisotropy (FA) and mean diffusivity (MD) and, for comparison, gray matter and white matter density maps from anatomical MRI. Data were classified using a Support Vector Machine (SVM) and a Naïve Bayes (NB) classifier. We used two cross-validation approaches: (i) test and training samples randomly drawn from the entire data set (pooled cross-validation) and (ii) data from each scanner as test set, with the data from the remaining scanners as training set (scanner-specific cross-validation). In the pooled cross-validation, SVM achieved an accuracy of 80% for FA and 83% for MD. Accuracies for NB were significantly lower, ranging between 68% and 75%. Removing variance components arising from scanners using principal component analysis did not significantly change the classification results for either classifier. For the scanner-specific cross-validation, the classification accuracy was reduced for both SVM and NB. After mean correction, classification accuracy reached a level comparable to the results obtained from the pooled cross-validation. Our findings support the notion that machine learning classification allows robust classification of DTI data sets arising from multiple scanners, even if a new data set comes from a scanner that was not part of the training sample.
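
    The scanner-specific cross-validation and the mean correction described above can be sketched in a few lines of plain Python. This is a minimal illustration with assumed names; the study itself ran SVM and Naïve Bayes classifiers on top of such splits:

```python
def leave_one_scanner_out(scanners):
    """Yield (train, test) index lists, holding out one scanner per fold:
    the 'scanner-specific cross-validation' scheme."""
    for s in sorted(set(scanners)):
        test = [i for i, sc in enumerate(scanners) if sc == s]
        train = [i for i, sc in enumerate(scanners) if sc != s]
        yield train, test

def mean_correct(X, scanners):
    """Subtract each scanner's per-feature mean from its samples, a simple
    stand-in for the 'mean correction' mentioned in the abstract."""
    corrected = [row[:] for row in X]
    for s in set(scanners):
        idx = [i for i, sc in enumerate(scanners) if sc == s]
        for j in range(len(X[0])):
            m = sum(X[i][j] for i in idx) / len(idx)
            for i in idx:
                corrected[i][j] = X[i][j] - m
    return corrected
```

    Centering each scanner's features removes additive scanner offsets before training, which is one plausible reason accuracy recovered to pooled-cross-validation levels after mean correction.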

    Higgs Candidates in e+e- Interactions at root(s) = 206.6 GeV

    In a search for the Standard Model Higgs boson, carried out on 212.5 pb^{-1} of data collected by the L3 detector at the highest LEP centre-of-mass energies, including 116.5 pb^{-1} above root(s) = 206 GeV, an excess of candidates for the process e+e- -> Z* -> HZ is found for Higgs masses near 114.5 GeV. We present an analysis of our data and the characteristics of our strongest candidates.

    Inclusive D* Production in Two-Photon Collisions at LEP

    Inclusive D^{*+-} production in two-photon collisions is studied with the L3 detector at LEP, using 683 pb^{-1} of data collected at centre-of-mass energies from 183 to 208 GeV. Differential cross sections are determined as functions of the transverse momentum and pseudorapidity of the D^{*+-} mesons in the kinematic region 1 GeV < P_T < 12 GeV and |eta| < 1.4. The cross section sigma(e^+e^- -> e^+e^-D^{*+-}X) in this kinematic region is measured and the sigma(e^+e^- -> e^+e^- cc{bar}X) cross section is derived. The measurements are compared with next-to-leading order perturbative QCD calculations.

    Standard Model Higgs Boson with the L3 Experiment at LEP

    Final results of the search for the Standard Model Higgs boson are presented for the data collected by the L3 detector at LEP at centre-of-mass energies up to about 209 GeV. These data are compared with the expectations of Standard Model processes for Higgs boson masses up to 120 GeV. A lower limit on the mass of the Standard Model Higgs boson of 112.0 GeV is set at the 95% confidence level. The most significant high-mass candidate is a Hnunu event. It has a reconstructed Higgs mass of 115 GeV and was recorded at root(s) = 206.4 GeV.