Are the school prevention programmes - aimed at de-normalizing smoking among youths - beneficial in the long term? An example from the Smoke Free Class Competition in Italy
Tobacco smoking by young people is of great concern because it usually leads to regular smoking, nicotine addiction and difficulty quitting. Young people "hooked" by tobacco maintain the profits of the tobacco industry by replacing smokers who quit or die. If new generations could be tobacco-free, as envisaged by tobacco endgame strategies, the tobacco epidemic could end within decades. Schools offer smoking prevention programmes for teens with the aim of preventing or delaying smoking onset. Among these, the Smoke Free Class Competition (SFC) was widely implemented in Europe. Evaluations of its effectiveness have yielded conflicting results, but only at short/medium-term follow-up (6-18 months). The aim of this study is to evaluate its effectiveness after a longer follow-up (3 to 5 years), allowing enough time for the students to mature and to internalize the experience and its contents. Fifteen classes were randomly sampled from two Italian high schools in the province of Bologna that regularly offered the SFC to first-year students; 382 students (174 SFC participants and 208 controls) were retrospectively followed up and provided their "smoking histories". At the end of their last year of school (5 years after the SFC), the percentage of students who stated that they were regular smokers was lower among SFC students than among controls: 13.5% vs 32.9% (p=0.03). From the students' "smoking histories", statistically significant protective ORs were observed for SFC students at the end of the 1st and 5th years: 0.42 (95% CI 0.19-0.93) and 0.32 (95% CI 0.11-0.91), respectively. The absence of smokers in the family was also strongly associated with being a non-smoker. These results suggest that the SFC may have a positive long-term (5-year) impact on lowering smoking prevalence.
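As context for how such protective odds ratios are obtained, here is a minimal Python sketch of a Wald-type OR and 95% CI computed from a 2x2 exposure-outcome table; the counts below are hypothetical and are not the study's data:

    import math

    # Hypothetical 2x2 table (illustrative counts, NOT the study's data):
    #                smokers   non-smokers
    # SFC class        a=10       b=164
    # controls         c=25       d=183
    a, b, c, d = 10, 164, 25, 183

    odds_ratio = (a * d) / (b * c)                   # cross-product ratio
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)    # Wald SE of log(OR)
    lo = math.exp(math.log(odds_ratio) - 1.96 * se)  # lower 95% bound
    hi = math.exp(math.log(odds_ratio) + 1.96 * se)  # upper 95% bound
    print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}-{hi:.2f})")

An OR below 1 with a confidence interval excluding 1, as reported in the abstract, indicates a statistically significant protective association.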
Education Policy Analysis Archives 15/18
Fissures in standards formulation: the role of neoconservative and neoliberal discourses in justifying standards development in Wisconsin and Minnesota / Samantha Caughlan and Richard Beach
Interpolating gauge fixing for Chern-Simons theory
Chern-Simons theory is analyzed with a gauge fixing which allows one to discuss the Landau gauge and the light-cone gauge at the same time. Comment: 11 pages, Report TUW-93-2
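Purely for illustration (the paper's actual gauge-fixing term may differ), interpolating gauges are often written schematically with a parameter that sweeps between the two conditions:

    % Schematic interpolating gauge condition (illustrative only, not
    % taken from the paper): \zeta = 1 gives the Landau gauge
    % \partial^\mu A_\mu = 0, while \zeta = 0 with a light-like vector
    % n^\mu (n^2 = 0) gives the light-cone gauge n^\mu A_\mu = 0.
    \begin{equation}
      \zeta \, \partial^{\mu} A_{\mu} + (1 - \zeta) \, n^{\mu} A_{\mu} = 0
    \end{equation}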
Age-Specific 18F-FDG Image Processing Pipelines and Analysis Are Essential for Individual Mapping of Seizure Foci in Paediatric Patients with Intractable Epilepsy
Fluorine-18-fluorodeoxyglucose positron emission tomography (FDG-PET) is an important tool for the pre-surgical assessment of children with drug-resistant epilepsy. Standard assessment is carried out visually, which is often subjective and highly user-dependent. Voxel-wise statistics can remove user-dependent biases by automatically identifying areas of significant hypo-/hyper-metabolism associated with the epileptogenic area. In clinical settings, this analysis is carried out using commercially available software. These software packages suffer from two main limitations when applied to paediatric PET data: 1) paediatric scans are spatially normalised to an adult standard template, and 2) statistical comparisons use an adult control dataset. The aim of this work is to provide a reliable, observer-independent pipeline for the analysis of paediatric FDG-PET scans as part of pre-surgical planning in epilepsy. METHODS: A pseudo-control dataset (n = 19 for 6-9 y, n = 93 for 10-20 y) was used to create two age-specific paediatric FDG-PET templates in standard paediatric space. The FDG-PET scans of 46 epilepsy patients (n = 16 for 6-9 y, n = 30 for 10-17 y) were retrospectively collated and analysed using voxel-wise statistics, implemented both with the standard pipeline available in the commercial software Scenium and with an in-house Statistical Parametric Mapping v.8 (SPM8) pipeline (including the age-specific paediatric templates and normal database). A kappa test was used to assess the level of agreement between the findings of the voxel-wise analyses and the clinical diagnosis of each patient. The SPM8 pipeline was further validated using post-surgical seizure-free patients. RESULTS: Improved agreement with the clinical diagnosis was reported using SPM8 in terms of focus localisation, especially for the younger patient group: κ_Scenium = 0.489 versus κ_SPM = 0.805. The proposed pipeline also showed a sensitivity of ~70% in both age ranges for the localisation of hypo-metabolic areas on paediatric FDG-PET scans in post-surgical seizure-free patients. CONCLUSION: We show that, by creating age-specific templates and using paediatric control databases, our pipeline provides an accurate and sensitive semi-quantitative method for assessing FDG-PET scans of patients under 18 y.
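To illustrate the voxel-wise approach described above, here is a minimal Python sketch that compares one patient scan against a control database, voxel by voxel. The array shapes, synthetic data, and z-threshold are assumptions for the example; real scans would first be spatially normalised to the age-specific template and intensity-normalised:

    import numpy as np

    # Synthetic stand-ins for normalised scans (shapes and noise level
    # are illustrative, not those of real FDG-PET data).
    rng = np.random.default_rng(0)
    controls = rng.normal(1.0, 0.1, size=(19, 64, 64, 32))  # 19 control scans
    patient = rng.normal(1.0, 0.1, size=(64, 64, 32))       # one patient scan

    mu = controls.mean(axis=0)          # voxel-wise control mean
    sd = controls.std(axis=0, ddof=1)   # voxel-wise control SD
    z = (patient - mu) / sd             # patient z-score map vs controls

    hypo = z < -2.0                     # candidate hypo-metabolic voxels
    print("suspect voxels:", int(hypo.sum()))

The point of the age-specific template and control database is that mu and sd above come from children of a comparable age, so the z-map reflects pathology rather than normal developmental differences.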
Generalized Bayesian Record Linkage and Regression with Exact Error Propagation
Record linkage (de-duplication or entity resolution) is the process of merging noisy databases to remove duplicate entities. While record linkage removes duplicate entities from such databases, the downstream task is any inferential, predictive, or post-linkage analysis performed on the linked data. One goal of the downstream task is obtaining a larger reference dataset, allowing one to perform more accurate statistical analyses. In addition, there is inherent record linkage uncertainty that is passed on to the downstream task. Motivated by the above, we propose a generalized Bayesian record linkage method and consider multiple regression analysis as the downstream task. Records are linked via a random partition model, which allows a wide class of partition models to be considered. In addition, we jointly model the record linkage and the downstream task, which allows one to account for the record linkage uncertainty exactly. Moreover, one is able to generate a feedback propagation mechanism that carries information from the proposed Bayesian record linkage model into the downstream task. This feedback effect is essential to eliminate potential biases that can jeopardize the resulting downstream analysis. We apply our methodology to multiple linear regression and illustrate empirically that the "feedback effect" is able to improve the performance of record linkage. Comment: 18 pages, 5 figures
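To give a flavour of what propagating linkage uncertainty into a downstream regression means in practice, here is a deliberately simplified Python sketch (not the paper's joint model): instead of conditioning on one "best" de-duplication, it averages the regression over plausible linkage states. The records, the merge rule, and the posterior weights are all made up for the example:

    import numpy as np

    # Toy data: 6 records (columns: covariate x, response y);
    # rows 3 and 4 may be duplicates of the same entity.
    records = np.array([
        [0.0, 0.1], [1.0, 2.1], [2.0, 3.9],
        [3.0, 6.2], [3.1, 5.8], [4.0, 8.1],
    ])

    def fit_slope(data: np.ndarray) -> float:
        """OLS slope of y on x, with an intercept."""
        X = np.column_stack([np.ones(len(data)), data[:, 0]])
        beta, *_ = np.linalg.lstsq(X, data[:, 1], rcond=None)
        return float(beta[1])

    # Linkage state A: all six records are distinct entities.
    state_a = records
    # Linkage state B: rows 3 and 4 are one entity, merged by averaging.
    state_b = np.vstack([records[[0, 1, 2, 5]],
                         (records[3] + records[4]) / 2])

    posterior = [(0.4, state_a), (0.6, state_b)]  # hypothetical weights
    slope = sum(w * fit_slope(d) for w, d in posterior)
    print(f"linkage-averaged slope: {slope:.3f}")

In the paper's joint model the partition and the regression are sampled together, so the regression also feeds information back into the linkage; this sketch only shows the forward direction, averaging the downstream estimate over linkage uncertainty.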
The origin of ultra high energy cosmic rays
We briefly discuss some open problems and recent developments in the investigation of the origin and propagation of ultra high energy cosmic rays (UHECRs). Comment: Invited Review Talk at TAUP 2005 (Zaragoza, September 10-14, 2005). 7 pages
Numerical propagation of high energy cosmic rays in the Galaxy I: technical issues
We present the results of a numerical simulation of the propagation of cosmic rays with energy above 10^15 eV in a complex magnetic field, made in
general of a large-scale component and a turbulent component. Several configurations are investigated that may represent specific aspects of a realistic magnetic field of the Galaxy, though the main purpose of this investigation is not to achieve a realistic description of propagation in the Galaxy, but rather to assess the role of several effects that define the complex problem of propagation. Our simulations of cosmic rays in the Galaxy will be presented in Paper II. We identify several effects that are difficult to interpret in a purely diffusive approach and that play a crucial role in the propagation of cosmic rays in the complex magnetic field of the Galaxy. We discuss at length the problem of extrapolating our results to much lower energies, where data are available on the confinement time of cosmic rays in the Galaxy. The confinement time and its dependence on particle rigidity are crucial ingredients for 1) relating the source spectrum to the observed cosmic ray spectrum; 2) quantifying the production of light elements by spallation; and 3) predicting the anisotropy as a function of energy. Comment: 29 pages, 12 figures, submitted to JCAP
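The core computation in such simulations can be sketched briefly: integrate the trajectory of a charged particle through a static magnetic field. The following minimal Python example uses the standard Boris rotation; the toy field (one uniform component plus a single sinusoidal "turbulent" mode), the units, and the step sizes are assumptions for illustration and are not the paper's setup:

    import numpy as np

    def boris_push(v, B, half):
        """Rotate velocity v around magnetic field B; half = q*dt/(2*m*gamma).
        A pure magnetic rotation preserves |v|, as it should."""
        t = B * half
        s = 2 * t / (1 + np.dot(t, t))
        v_prime = v + np.cross(v, t)
        return v + np.cross(v_prime, s)

    def field(x):
        # Toy field: uniform large-scale component plus one transverse
        # sinusoidal mode (a real run sums many random turbulent modes).
        B0 = np.array([0.0, 0.0, 1.0])
        dB = 0.3 * np.array([np.sin(2 * np.pi * x[2]), 0.0, 0.0])
        return B0 + dB

    x = np.zeros(3)
    v = np.array([0.5, 0.0, 0.5])   # velocity in units of c (illustrative)
    dt = 0.01                       # time step; q/m/gamma folded into units
    for _ in range(1000):
        v = boris_push(v, field(x), 0.5 * dt)
        x = x + v * dt
    print("final position:", x, "|v| =", np.linalg.norm(v))

Quantities such as the confinement time are then estimated by following many such trajectories until they leave the simulated Galactic volume.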