A dimensionally continued Poisson summation formula
We generalize the standard Poisson summation formula for lattices so that it
operates on the level of theta series, allowing us to introduce noninteger
dimension parameters (using the dimensionally continued Fourier transform).
When combined with one of the proofs of the Jacobi imaginary transformation of
theta functions that does not use the Poisson summation formula, our proof of
this generalized Poisson summation formula also provides a new proof of the
standard Poisson summation formula for dimensions greater than 2 (with
appropriate hypotheses on the function being summed). In general, our methods
work to establish the (Voronoi) summation formulae associated with functions
satisfying (modular) transformations of the Jacobi imaginary type by means of a
density argument (as opposed to the usual Mellin transform approach). In
particular, we construct a family of generalized theta series from Jacobi theta
functions from which these summation formulae can be obtained. This family
contains several families of modular forms, but is significantly more general
than any of them. Our result also relaxes several of the hypotheses in the
standard statements of these summation formulae. The density result we prove
for Gaussians in the Schwartz space may be of independent interest.
Comment: 12 pages, version accepted by JFAA, with various additions and improvements
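For orientation, the classical lattice Poisson summation formula that the paper generalizes, and the Jacobi imaginary transformation it relies on, can be stated as follows (standard forms; the dimensionally continued version replaces the lattice sums by identities between theta series):

```latex
% Poisson summation for a full-rank lattice \Lambda \subset \mathbb{R}^d
% with dual lattice \Lambda^* and a Schwartz function f:
\sum_{x \in \Lambda} f(x)
  = \frac{1}{\operatorname{covol}(\Lambda)} \sum_{y \in \Lambda^*} \hat{f}(y),
\qquad
\hat{f}(y) = \int_{\mathbb{R}^d} f(x)\, e^{-2\pi i \langle x, y \rangle}\, dx .

% Jacobi imaginary transformation for the basic theta function
% \theta(\tau) = \sum_{n \in \mathbb{Z}} e^{\pi i n^2 \tau}, \ \operatorname{Im}\tau > 0:
\theta(-1/\tau) = \sqrt{-i\tau}\,\theta(\tau).
```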
Research and innovation as a catalyst for food system transformation
Background: Food systems are associated with severe and persistent problems worldwide. Governance approaches aiming to foster sustainable transformation of food systems face several challenges due to the complex nature of food systems.
Scope and approach: In this commentary we argue that addressing these governance challenges requires the development and adoption of novel research and innovation (R&I) approaches that will provide evidence to inform food system transformation and will serve as catalysts for change. We first elaborate on the complexity of food systems (transformation) and stress the need to move beyond traditional linear R&I approaches to be able to respond to persistent problems that affect food systems. Though integrated transdisciplinary approaches are promising, current R&I systems do not sufficiently support such endeavors. As such, we argue, we need strategies that trigger a double transformation - of food systems and of their R&I systems.
Key Findings and Conclusions: Seizing the opportunities to transform R&I systems has implications for how research is done - pointing to the need for competence development among researchers, policy makers and society in general - and requires specific governance interventions that stimulate a systemic approach. Such interventions should foster transdisciplinary and transformative research agendas that stimulate portfolios of projects that will reinforce one another, and stimulate innovative experiments to shape conditions for systemic change. In short, a thorough rethinking of the role of R&I as well as how it is funded is a crucial step towards the development of the integrative policies that are necessary to engender systemic change - in the food system and beyond.
South African food allergy consensus document 2014
The prevalence of food allergy is increasing worldwide and is an important cause of anaphylaxis. There are no local South African food allergy guidelines. This document was devised by the Allergy Society of South Africa (ALLSA), the South African Gastroenterology Society (SAGES) and the Association for Dietetics in South Africa (ADSA). Subjects may have reactions to more than one food, and different types and severity of reactions to different foods may coexist in one individual. A detailed history directed at identifying the type and severity of possible reactions is essential for every food allergen under consideration. Skin-prick tests and specific immunoglobulin E (IgE) (ImmunoCAP) tests prove IgE sensitisation rather than clinical reactivity. The magnitude of sensitisation combined with the history may be sufficient to ascribe causality, but where this is not possible an incremental oral food challenge may be required to assess tolerance or clinical allergy. For milder non-IgE-mediated conditions a diagnostic elimination diet may be followed with food re-introduction at home to assess causality. The primary therapy for food allergy is strict avoidance of the offending food/s, taking into account nutritional status and provision of alternative sources of nutrients. Acute management of severe reactions requires prompt intramuscular administration of adrenaline 0.01 mg/kg and basic resuscitation. Adjunctive therapy includes antihistamines, bronchodilators and corticosteroids. Subjects with food allergy require risk assessment, and those at increased risk for future severe reactions require the implementation of risk-reduction strategies, including education of the patient, families and all caregivers (including teachers), the provision of a written emergency action plan, a MedicAlert necklace or bracelet and injectable adrenaline (preferably via auto-injector) where necessary.
http://www.samj.org.za
Results of the BiPo-1 prototype for radiopurity measurements for the SuperNEMO double beta decay source foils
The development of BiPo detectors is dedicated to the measurement of extremely high radiopurity in 208Tl and 214Bi for the SuperNEMO double beta decay source foils. A modular prototype, called BiPo-1, with 0.8 m² of sensitive surface area, has been running in the Modane Underground Laboratory since February 2008. The goal of BiPo-1 is to measure the different components of the background and in particular the surface radiopurity of the plastic scintillators that make up the detector. The first phase of data collection has been dedicated to the measurement of the radiopurity in 208Tl. After more than one year of background measurement, a surface activity of the scintillators of A(208Tl) ~ 1.5 μBq/m² is reported here. Given this level of background, a larger BiPo detector having 12 m² of active surface area is able to qualify the radiopurity of the SuperNEMO selenium double beta decay foils with the required sensitivity of A(208Tl) < 2 μBq/kg (90% C.L.) with a six-month measurement.
Comment: 24 pages, submitted to N.I.M.
Spectral modeling of scintillator for the NEMO-3 and SuperNEMO detectors
We have constructed a GEANT4-based detailed software model of photon
transport in plastic scintillator blocks and have used it to study the NEMO-3
and SuperNEMO calorimeters employed in experiments designed to search for
neutrinoless double beta decay. We compare our simulations to measurements using conversion electrons from a 207Bi calibration source and show that the agreement is improved if wavelength-dependent properties of the calorimeter are taken into account. In this article, we briefly describe our modeling approach and the results of our studies.
Comment: 16 pages, 10 figures
Machine Learning in Automated Text Categorization
The automated categorization (or classification) of texts into predefined
categories has witnessed a booming interest in the last ten years, due to the
increased availability of documents in digital form and the ensuing need to
organize them. In the research community the dominant approach to this problem
is based on machine learning techniques: a general inductive process
automatically builds a classifier by learning, from a set of preclassified
documents, the characteristics of the categories. The advantages of this
approach over the knowledge engineering approach (consisting in the manual
definition of a classifier by domain experts) are a very good effectiveness,
considerable savings in terms of expert manpower, and straightforward
portability to different domains. This survey discusses the main approaches to
text categorization that fall within the machine learning paradigm. We will
discuss in detail issues pertaining to three different problems, namely
document representation, classifier construction, and classifier evaluation.
Comment: Accepted for publication in ACM Computing Surveys
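As a concrete illustration of the inductive approach this survey describes (a general learning process that builds a classifier from a set of preclassified documents), here is a minimal sketch of a multinomial Naive Bayes text classifier in pure Python; the tokenizer, toy corpus and category names are invented for the example and stand in for a real document representation:

```python
import math
from collections import Counter, defaultdict

def tokenize(text):
    # Deliberately minimal document representation: lowercase, split on whitespace.
    return text.lower().split()

class NaiveBayesTextClassifier:
    """Learns the characteristics of each category from preclassified
    documents (multinomial Naive Bayes with Laplace smoothing)."""

    def fit(self, documents, labels):
        self.word_counts = defaultdict(Counter)  # per-category word frequencies
        self.doc_counts = Counter(labels)        # per-category document counts
        self.vocab = set()
        for text, label in zip(documents, labels):
            tokens = tokenize(text)
            self.word_counts[label].update(tokens)
            self.vocab.update(tokens)
        self.total_docs = len(labels)
        return self

    def predict(self, text):
        tokens = tokenize(text)
        best_label, best_score = None, float("-inf")
        for label in self.doc_counts:
            # log prior plus the sum of smoothed log likelihoods
            score = math.log(self.doc_counts[label] / self.total_docs)
            total = sum(self.word_counts[label].values())
            for tok in tokens:
                score += math.log(
                    (self.word_counts[label][tok] + 1) / (total + len(self.vocab))
                )
            if score > best_score:
                best_label, best_score = label, score
        return best_label

# Train on a toy preclassified corpus, then classify an unseen document.
train_docs = ["the striker scored a goal", "the team won the match",
              "the election results were announced", "the senate passed the bill"]
train_labels = ["sports", "sports", "politics", "politics"]
clf = NaiveBayesTextClassifier().fit(train_docs, train_labels)
print(clf.predict("the team scored in the match"))  # -> sports
```

The same train-then-predict shape carries over to the more sophisticated learners the survey covers; only the document representation and the scoring function change.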
Menus for Feeding Black Holes
Black holes are the ultimate prisons of the Universe: regions of spacetime where gravity is so strong that neither matter nor light can escape to infinity. Yet matter falling toward black holes may shine spectacularly, generating some of the strongest sources of radiation. These sources provide us with astrophysical laboratories of extreme physical conditions that cannot be realized on Earth. This chapter offers a review of the basic menus for feeding matter onto black holes and discusses their observational implications.
Comment: 27 pages. Accepted for publication in Space Science Reviews. Also to appear in hard cover in the Space Sciences Series of ISSI "The Physics of Accretion onto Black Holes" (Springer Publisher)
Partial Wave Analysis of
BES data on are presented. The contribution peaks strongly near threshold. It is fitted with a broad resonance with mass MeV, width MeV. A broad resonance peaking at 2020 MeV is also required, with width MeV. There is further evidence for a component peaking at 2.55 GeV. The non- contribution is close to phase space; it peaks at 2.6 GeV and is very different from .
Comment: 15 pages, 6 figures, 1 table, Submitted to PL
Search for direct production of charginos and neutralinos in events with three leptons and missing transverse momentum in √s = 7 TeV pp collisions with the ATLAS detector
A search for the direct production of charginos and neutralinos in final states with three electrons or muons and missing transverse momentum is presented. The analysis is based on 4.7 fb−1 of proton–proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with Standard Model expectations in three signal regions that are either depleted or enriched in Z-boson decays. Upper limits at 95% confidence level are set in R-parity conserving phenomenological minimal supersymmetric models and in simplified models, significantly extending previous results.
Jet size dependence of single jet suppression in lead-lead collisions at sqrt(s(NN)) = 2.76 TeV with the ATLAS detector at the LHC
Measurements of inclusive jet suppression in heavy ion collisions at the LHC
provide direct sensitivity to the physics of jet quenching. In a sample of
lead-lead collisions at sqrt(s(NN)) = 2.76 TeV corresponding to an integrated
luminosity of approximately 7 inverse microbarns, ATLAS has measured jets with
a calorimeter over the pseudorapidity interval |eta| < 2.1 and over the
transverse momentum range 38 < pT < 210 GeV. Jets were reconstructed using the
anti-kt algorithm with values for the distance parameter that determines the
nominal jet radius of R = 0.2, 0.3, 0.4 and 0.5. The centrality dependence of
the jet yield is characterized by the jet "central-to-peripheral ratio," Rcp.
Jet production is found to be suppressed by approximately a factor of two in
the 10% most central collisions relative to peripheral collisions. Rcp varies
smoothly with centrality as characterized by the number of participating
nucleons. The observed suppression is only weakly dependent on jet radius and
transverse momentum. These results provide the first direct measurement of
inclusive jet suppression in heavy ion collisions and complement previous
measurements of dijet transverse energy imbalance at the LHC.
Comment: 15 pages plus author list (30 pages total), 8 figures, 2 tables,
submitted to Physics Letters B. All figures including auxiliary figures are
available at
http://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PAPERS/HION-2011-02
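For reference, the central-to-peripheral ratio Rcp used above is conventionally built from per-event jet yields normalized by the mean number of binary nucleon–nucleon collisions ⟨N_coll⟩ in each centrality bin (a standard definition; the exact centrality binning is specified in the paper):

```latex
R_{CP} =
\frac{\dfrac{1}{\langle N_{\mathrm{coll}}^{\mathrm{cent}} \rangle}\,
      \dfrac{1}{N_{\mathrm{evt}}^{\mathrm{cent}}}\,
      \dfrac{dN_{\mathrm{jet}}^{\mathrm{cent}}}{dp_T}}
     {\dfrac{1}{\langle N_{\mathrm{coll}}^{\mathrm{periph}} \rangle}\,
      \dfrac{1}{N_{\mathrm{evt}}^{\mathrm{periph}}}\,
      \dfrac{dN_{\mathrm{jet}}^{\mathrm{periph}}}{dp_T}}
```

With this normalization, Rcp = 1 corresponds to no modification relative to peripheral collisions, so the factor-of-two suppression quoted above corresponds to Rcp ≈ 0.5 in the most central bin.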