Are the school prevention programmes - aimed at de-normalizing smoking among youths - beneficial in the long term? An example from the Smoke Free Class Competition in Italy
Tobacco smoking by young people is of great concern because it usually leads to regular smoking, nicotine addiction and difficulty quitting. Young people "hooked" by tobacco sustain the profits of the tobacco industry by replacing smokers who quit or die. If new generations could be tobacco-free, as envisaged by tobacco endgame strategies, the tobacco epidemic could end within decades. Smoking prevention programmes for teens are offered by schools with the aim of preventing or delaying smoking onset. Among these, the Smoke Free Class Competition (SFC) was widely implemented in Europe. Evaluations of its effectiveness yielded conflicting results, but it was only assessed in the short to medium term (6-18 months). The aim of this study is to evaluate its effectiveness after a longer follow-up (3 to 5 years), allowing enough time for the students to mature and to internalize the experience and its contents. Fifteen classes were randomly sampled from two Italian high schools in Bologna province that regularly offered the SFC to first-year students; 382 students (174 participating in the SFC and 208 controls) were retrospectively followed up and provided their "smoking histories". At the end of their last year of school (5 years after the SFC), the percentage of students who stated that they were regular smokers was lower among the SFC students than among controls: 13.5% vs 32.9% (p=0.03). From the students' "smoking histories", statistically significant protective ORs were observed for SFC students at the end of the 1st and 5th years: 0.42 (95% CI 0.19-0.93) and 0.32 (95% CI 0.11-0.91), respectively. The absence of smokers in the family was also a strongly statistically significant factor associated with being a non-smoking student. These results suggest that the SFC may have a positive impact on lowering the prevalence of smoking in the long term (5 years).
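The reported fifth-year odds ratio can be reproduced, to rounding, from the quoted smoking prevalences alone; a quick check (using the percentages in the abstract, not the underlying counts, which are not given here):

```python
# Odds ratio of regular smoking, SFC participants vs controls,
# computed from the prevalences quoted in the abstract.
p_sfc, p_ctrl = 0.135, 0.329  # regular smokers at end of the 5th year

odds_sfc = p_sfc / (1 - p_sfc)
odds_ctrl = p_ctrl / (1 - p_ctrl)
odds_ratio = odds_sfc / odds_ctrl

print(f"OR = {odds_ratio:.2f}")  # matches the reported 0.32
```

The confidence interval cannot be checked this way, since it requires the per-group smoker counts rather than the rounded percentages.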
The origin of ultra high energy cosmic rays
We briefly discuss some open problems and recent developments in the
investigation of the origin and propagation of ultra high energy cosmic rays
(UHECRs).
Comment: Invited Review Talk at TAUP 2005 (Zaragoza, September 10-14, 2005); 7 pages
Numerical propagation of high energy cosmic rays in the Galaxy I: technical issues
We present the results of a numerical simulation of propagation of cosmic
rays with energy above eV in a complex magnetic field, consisting in
general of a large-scale component and a turbulent component. Several
configurations are investigated that may represent specific aspects of a
realistic magnetic field of the Galaxy, though the main purpose of this
investigation is not to achieve a realistic description of the propagation in
the Galaxy, but rather to assess the role of several effects that define the
complex problem of propagation. Our simulations of cosmic rays in the Galaxy
will be presented in Paper II. We identified several effects that are difficult
to interpret in a purely diffusive approach and that play a crucial role in the
propagation of cosmic rays in the complex magnetic field of the Galaxy. We
discuss at length the problem of the extrapolation of our results to much lower
energies where data are available on the confinement time of cosmic rays in the
Galaxy. The confinement time and its dependence on particles' rigidity are
crucial ingredients for 1) relating the source spectrum to the observed cosmic
ray spectrum; 2) quantifying the production of light elements by spallation; 3)
predicting the anisotropy as a function of energy.
Comment: 29 pages, 12 figures, submitted to JCA
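The abstract gives no implementation details, so the following is only a minimal sketch of the kind of integration such a simulation performs: following the direction of an ultra-relativistic charged particle through a regular-plus-turbulent magnetic field. The units, field strengths, and the frozen single random "turbulent" component are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Gaussian units: dp/dt = (q/c) v x B, and p ~ (E/c) n for v ~ c, so the
# direction n obeys dn/dt = (q c / E) n x B. All values are illustrative.
C = 3.0e10            # speed of light, cm/s
Q = 4.8e-10           # proton charge, esu

def b_field(turb_component):
    """Regular 3 microgauss field along z plus a random turbulent part."""
    b_reg = np.array([0.0, 0.0, 3.0e-6])
    return b_reg + 1.0e-6 * turb_component

def propagate(energy_erg, n_steps=2000, dt=1.0e6):
    rng = np.random.default_rng(0)
    b = b_field(rng.standard_normal(3))   # frozen field for this realization
    k = Q * C / energy_erg                # gyration coefficient
    n = np.array([1.0, 0.0, 0.0])         # initial direction of motion
    x = np.zeros(3)
    for _ in range(n_steps):
        n = n + dt * k * np.cross(n, b)   # rotate the direction
        n /= np.linalg.norm(n)            # renormalize: speed stays ~c
        x = x + C * dt * n
    return x

# ~1 PeV proton (1e15 eV ~ 1.6e3 erg); displacement after 2e9 s
displacement = propagate(energy_erg=1.6e3)
print(np.linalg.norm(displacement) / 3.086e18, "pc")
```

A production code would use a realistic turbulence spectrum and a symplectic or Boris-type integrator; the simple rotate-and-renormalize step above only preserves the speed, which is enough to show the gyration-plus-drift behaviour.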
Generalized Bayesian Record Linkage and Regression with Exact Error Propagation
Record linkage (de-duplication or entity resolution) is the process of
merging noisy databases to remove duplicate entities. While record linkage
removes duplicate entities from such databases, the downstream task is any
inferential, predictive, or post-linkage task on the linked data. One goal of
the downstream task is obtaining a larger reference data set, allowing one to
perform more accurate statistical analyses. In addition, there is inherent
record linkage uncertainty passed to the downstream task. Motivated by the
above, we propose a generalized Bayesian record linkage method and consider
multiple regression analysis as the downstream task. Records are linked via a
random partition model, which allows a wide class of partition models to be considered. In
addition, we jointly model the record linkage and downstream task, which allows
one to account for the record linkage uncertainty exactly. Moreover, one is
able to generate a feedback propagation mechanism of the information from the
proposed Bayesian record linkage model into the downstream task. This feedback
effect is essential to eliminate potential biases that can jeopardize the resulting
downstream task. We apply our methodology to multiple linear regression, and
illustrate empirically that the "feedback effect" is able to improve the
performance of record linkage.
Comment: 18 pages, 5 figures
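The joint model itself is not reproduced in the abstract, but the core idea of propagating linkage uncertainty into a downstream regression can be sketched with a simple multiple-linkage scheme: sample several plausible linkages, fit the regression under each, and pool the estimates. This is a minimal illustration, not the paper's method; the data and duplicate probability below are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: y = 2*x + noise; records 3 and 4 may refer to the same entity.
x = np.array([0.0, 1.0, 2.0, 3.0, 3.1])
y = 2.0 * x + rng.normal(0, 0.1, size=5)
p_dup = 0.7                      # assumed posterior prob. that 3,4 co-refer

def fit_slope(xv, yv):
    """Least-squares slope of a simple linear regression."""
    xc, yc = xv - xv.mean(), yv - yv.mean()
    return float(xc @ yc / (xc @ xc))

# Sample plausible linkages, fit under each, pool the results.
slopes = []
for _ in range(200):
    if rng.random() < p_dup:     # merge the candidate duplicates
        xv = np.append(x[:3], x[3:5].mean())
        yv = np.append(y[:3], y[3:5].mean())
    else:                        # keep all records distinct
        xv, yv = x, y
    slopes.append(fit_slope(xv, yv))

slopes = np.array(slopes)
print(f"pooled slope = {slopes.mean():.2f} +/- {slopes.std():.3f}")
```

The spread of the sampled slopes is exactly the linkage uncertainty that a naive point-estimate linkage would silently discard; the paper's joint model goes further by letting the regression fit feed back into the linkage posterior.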
Observables in Topological Yang-Mills Theories
Using topological Yang-Mills theory as example, we discuss the definition and
determination of observables in topological field theories (of Witten-type)
within the superspace formulation proposed by Horne. This approach to the
equivariant cohomology leads to a set of bi-descent equations involving the
BRST and supersymmetry operators as well as the exterior derivative. This
allows us to determine superspace expressions for all observables, and thereby
to recover the Donaldson-Witten polynomials when choosing a Wess-Zumino-type
gauge.
Comment: 39 pages, LaTeX
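The bi-descent equations themselves are not given in the abstract; for orientation, the ordinary Witten-type descent equations that such observables satisfy read (schematically, with convention-dependent signs):

```latex
% Witten-type descent equations for topological Yang-Mills theory
% (schematic; s is the BRST operator, d the exterior derivative,
% W_k a k-form on spacetime).
\begin{align}
  s\,W_0 &= 0, \\
  s\,W_k + d\,W_{k-1} &= 0, \qquad k = 1,\dots,4,
\end{align}
% so that O_{\gamma_k} = \int_{\gamma_k} W_k is BRST-invariant for any
% closed k-cycle \gamma_k; with W_0 built from Tr(phi^2) these integrals
% yield the Donaldson-Witten polynomials.
```

The bi-descent system of the paper refines this by involving the supersymmetry operator alongside s and d in the superspace formulation.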
Total Gastrectomy for Locally Advanced Cancer: The Total Laparoscopic Approach
Total gastrectomy is the treatment of choice for adenocarcinoma of the upper and middle third of the stomach resected with curative intent. The laparoscopic approach allows satisfactory exploration of the peritoneal cavity and optimizes staging in borderline T3 or T4 tumours in patients affected by locally advanced tumours or intraperitoneal carcinomatosis. Laparoscopy can eliminate unnecessary laparotomies in 10% of patients affected by these conditions with formal contraindications for resection [1]. Complete resection of the stomach associated with D2 lymph node dissection is also performed using a currently well-established technique [2, 3]. The specificity of laparoscopic gastric resection for cancer is that the stomach and the great omentum are withdrawn separately. Reconstruction of the digestive tract is more complex, and requires a variety of techniques (supra-umbilical mini-laparotomy, Orvil® technique, enlarging a port site for passage of a circular stapler, mechanical side-to-side anastomosis, etc.), but none of these has become the gold standard [4-7]. This explains the difficulties encountered in promoting the widespread use of minimally invasive resection in Western countries. Scientific societies insist on the need for prospective studies to establish the place of laparoscopy for gastric cancer (prophylactic gastrectomy for CDH1-related gastric cancer, <T3 tumours, palliative gastrectomy) [4]. Here, we present our technique for total resection of the stomach and D2 lymph node dissection, which allows the manual creation of a feasible, safe, tension-free and effective esojejunal anastomosis. It can be performed by any surgeon familiar with laparoscopic surgery and the principles of oncologic resection. The cost is also relatively low because neither a circular stapler nor other special equipment is required. Finally, the incision for extraction of the specimen can be placed in any area of the abdomen (usually through a supra-pubic incision in our practice).
Keywords: Gastric cancer, laparoscopy, total gastrectomy, lymphadenectomy, intracorporeal anastomosis
Stochastic conversions of TeV photons into axion-like particles in extragalactic magnetic fields
Very-high energy photons emitted by distant cosmic sources are absorbed on
the extragalactic background light (EBL) during their propagation. This effect
can be characterized in terms of a photon transfer function at Earth. The
presence of extragalactic magnetic fields could also induce conversions between
very high-energy photons and hypothetical axion-like particles (ALPs). The
turbulent structure of the extragalactic magnetic fields would produce a
stochastic behaviour in these conversions, leading to a statistical
distribution of the photon transfer functions for the different realizations of
the random magnetic fields. To characterize this effect, we derive new
equations to calculate the mean and the variance of this distribution. We find
that, in the presence of ALP conversions, the photon transfer functions on
different lines of sight could deviate significantly from the mean value,
producing either an enhancement or a suppression of the observable photon flux
with respect to the expectations with absorption only. As a
consequence, the most striking signature of the mixing with ALPs would be a
reconstructed EBL density from TeV photon observations which appears to vary
over different directions of the sky: consistent with standard expectations in
some regions, but inconsistent in others.
Comment: v2: 22 pages, 5 eps figures. Minor changes. A reference added. Matches the version published on JCA
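The paper derives new equations for the mean and variance of the transfer-function distribution; as rough intuition for why a distribution arises at all, here is a toy cell-model Monte Carlo in which a classical photon/ALP occupation is shuffled in each magnetic-field coherence cell while only the photon is absorbed. All numbers and the mixing model are illustrative assumptions, not the paper's formalism.

```python
import numpy as np

rng = np.random.default_rng(42)

N_DOMAINS = 100   # field coherence cells along the line of sight
TAU_TOTAL = 4.0   # total EBL optical depth (acts on photons only)
P0 = 0.02         # max photon<->ALP swap probability per cell (toy value)

def transfer_function(rng):
    """Photon flux / absorption-only flux, for one field realization."""
    tau_cell = TAU_TOTAL / N_DOMAINS
    w_gamma, w_alp = 1.0, 0.0            # classical occupation probabilities
    for _ in range(N_DOMAINS):
        p = P0 * rng.random()            # random orientation -> random mixing
        w_gamma, w_alp = ((1 - p) * w_gamma + p * w_alp,
                          p * w_gamma + (1 - p) * w_alp)
        w_gamma *= np.exp(-tau_cell)     # only the photon is absorbed
    return w_gamma / np.exp(-TAU_TOTAL)  # normalize to pure absorption

samples = np.array([transfer_function(rng) for _ in range(2000)])
print(f"mean = {samples.mean():.2f}, std = {samples.std():.2f}")
```

Even this crude model shows the qualitative effect in the abstract: the ALP channel shields part of the flux from absorption, so transfer functions scatter around a mean above the absorption-only expectation, varying from one realization (line of sight) to another.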
Experimental determination of the J(pi) components of the spin-dipole resonance in B-12
The inclusive ¹²C(d⃗,²He) and exclusive ¹²C(d⃗,²He+n) reactions have been studied at a beam energy of 171 MeV and at scattering angles for the (d⃗,²He) reaction of θ=0° and 3°. The studies focused on the separation of the isovector spin-dipole resonance (IVSGDR) into its components by measuring tensor-analysing powers and observing the direct neutron decay to the low-lying proton-hole states in ¹¹B. Merging the information obtained from both measurements resulted in the first-time verification of model-independent predictions of tensor-analysing powers at extreme forward angles and the experimental decomposition of the IVSGDR into its Jπ components. The experimental results are in reasonable agreement with theoretical estimates based on shell-model calculations.
Fast Liquid Chromatography Coupled with Tandem Mass Spectrometry for the Analysis of Vanillic and Syringic Acids in Ice Cores
The development of new analytical systems and the improvement of existing ones to obtain high-resolution measurements of chemical markers in ice-core samples is one of the main challenges facing the paleoclimate scientific community. Different chemical species can be used as markers for tracking emission sources or specific environmental processes. Although some markers, such as methanesulfonic acid (a proxy of marine productivity), are commonly used, data on other organic tracers in ice cores are lacking, and their continuous analysis remains analytically challenging. Here, we present an innovative combination of fast liquid chromatography coupled with tandem mass spectrometry (FLC-MS/MS) to continuously determine organic markers in ice cores. After specific optimization, this approach was applied to the quantification of vanillic and syringic acids, two specific markers of biomass burning. Using the validated method, detection limits of 3.6 and 4.6 pg mL⁻¹ for vanillic and syringic acids, respectively, were achieved. Thanks to the coupling of FLC-MS/MS with the continuous-flow analytical system, we obtained one measurement every 30 s, which corresponds to a sampling resolution of one sample every 1.5 cm at a melting rate of 3.0 cm min⁻¹. To check the robustness of the method, we analyzed two parallel sticks of an Alpine ice core over more than 5 h. Vanillic acid was found at concentrations in the range of picograms per milliliter, suggesting the combustion of coniferous trees, which are found throughout the Italian Alps.
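The quoted depth resolution follows directly from the measurement rate and the melting speed; a quick arithmetic check, with both values taken from the abstract:

```python
# Depth resolution of the continuous FLC-MS/MS measurement:
# one measurement every 30 s while the ice melts at 3.0 cm/min.
measurement_interval_s = 30.0
melt_rate_cm_per_min = 3.0

depth_per_sample_cm = melt_rate_cm_per_min * measurement_interval_s / 60.0
print(depth_per_sample_cm)  # -> 1.5
```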
High Energy Neutrinos From Superheavy Dark Matter Annihilation
Superheavy ( GeV) particles produced during inflation may be the
dark matter, independent of their interaction strength. Strongly interacting
superheavy particles will be captured by the sun, and their annihilation in the
center of the sun will produce a flux of energetic neutrinos that should be
detectable by neutrino telescopes. Depending on the particle mass, event rates
in a cubic-kilometer detector range from several per hour to several per year.
The signature of the process is a predominance of tau neutrinos, with a
relatively flat energy spectrum of events ranging from 50 GeV to many TeV, and
with the mean energy of detected tau neutrinos about 3 TeV.
Comment: 24 pages, 7 figures