Tools and Procedures for the CTA Array Calibration
The Cherenkov Telescope Array (CTA) is an international initiative to build
the next generation ground-based very-high-energy gamma-ray observatory. Full
sky coverage will be assured by two arrays, one located on each of the northern
and southern hemispheres. Three different sizes of telescopes will cover a wide
energy range from tens of GeV up to hundreds of TeV. These telescopes, of which
prototypes are currently under construction or nearing completion, will have
different mirror sizes and fields of view designed to access different energy
regimes. Additionally, there will be groups of telescopes with different
optical systems, camera designs, and electronics. Given this diversity of
instruments, an overall
coherent calibration of the full array is a challenging task. Moreover, the CTA
requirements on calibration accuracy are much more stringent than those
achieved with current Imaging Atmospheric Cherenkov Telescopes; for instance,
the systematic errors in the energy scale must not exceed 10%. In this
contribution we present both the methods that, applied directly to the acquired
observational CTA data, will ensure that the calibration is correctly performed
to the stringent required precision, and the calibration equipment that,
external to the telescopes, is currently under development and testing.
Moreover, we describe the operational procedures to be followed with both the
methods and the instruments. The methods applied to the observational CTA data
include the analysis of muon ring images, of carefully selected cosmic-ray
air-shower images, of the reconstructed electron spectrum, and of known
gamma-ray sources, as well as the possible use of hardware-independent stereo
techniques. These methods will be complemented with the use of
calibrated light sources located on ground or on board unmanned aerial
vehicles. Comment: All CTA contributions at arXiv:1709.0348
Three-dimensional flow structure and bed morphology in large elongate meander loops with different outer bank roughness characteristics
© 2016. American Geophysical Union. All Rights Reserved. Few studies have examined the three-dimensional flow structure and bed morphology within elongate loops of large meandering channels. The present study focuses on the spatial patterns of three-dimensional flow structure and bed morphology within two elongate meander loops and examines how differences in outer bank roughness influence near-bank flow characteristics. Three-dimensional velocities were measured during two different events: a near-bankfull flow and an overbank event. Detailed data on channel bathymetry and bed form geometry were obtained during a near-bankfull event. Flow structure within the loops is characterized by strong topographic steering by the point bar, by the development of helical motion associated with flow curvature, and by acceleration of flow where bedrock is exposed along the outer bank. Near-bank velocities during the overbank event are less than those for the near-bankfull flow, highlighting the strong influence of the point bar on redistribution of mass and momentum of the flow at subbankfull stages. Multiple outer bank pools are evident within the elongate meander loop with low outer bank roughness, but are not present in the loop with high outer bank roughness, which may reflect the influence of abundant large woody debris on near-bank velocity characteristics. The positions of pools within both loops can be linked to spatial variations in planform curvature. The findings indicate that flow structure and bed morphology in these large elongate loops are similar to those in small elongate loops, but differ somewhat from the flow structure and bed morphology reported for experimental elongate loops.
A Scalable Correlator Architecture Based on Modular FPGA Hardware, Reusable Gateware, and Data Packetization
A new generation of radio telescopes is achieving unprecedented levels of
sensitivity and resolution, as well as increased agility and field-of-view, by
employing high-performance digital signal processing hardware to phase and
correlate large numbers of antennas. The computational demands of these imaging
systems scale in proportion to BMN^2, where B is the signal bandwidth, M is the
number of independent beams, and N is the number of antennas. The
specifications of many new arrays lead to demands in excess of tens of PetaOps
per second.
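The BMN^2 scaling above can be sketched numerically. This is a minimal illustration only; the per-sample operation count and the example array parameters below are hypothetical and not taken from the paper.

```python
def correlator_ops_per_second(bandwidth_hz, n_beams, n_antennas, ops_per_sample=8):
    """Estimate compute demand scaling as B * M * N^2.

    ops_per_sample is a hypothetical per-baseline constant (roughly the
    cost of a complex multiply-accumulate); real designs differ.
    """
    return ops_per_sample * bandwidth_hz * n_beams * n_antennas ** 2

# Hypothetical 256-antenna, 1-GHz-bandwidth, single-beam array:
demand = correlator_ops_per_second(1e9, 1, 256)
```

Doubling the number of antennas quadruples the demand, which is why packetized, switch-based architectures that scale out across many FPGA boards are attractive.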
To meet this challenge, we have developed a general purpose correlator
architecture using standard 10-Gbit Ethernet switches to pass data between
flexible hardware modules containing Field Programmable Gate Array (FPGA)
chips. These chips are programmed using open-source signal processing libraries
we have developed to be flexible, scalable, and chip-independent. This work
reduces the time and cost of implementing a wide range of signal processing
systems, with correlators foremost among them, and facilitates upgrading to new
generations of processing technology. We present several correlator
deployments, including a 16-antenna, 200-MHz bandwidth, 4-bit, full Stokes
parameter application deployed on the Precision Array for Probing the Epoch of
Reionization. Comment: Accepted to Publications of the Astronomical Society of
the Pacific. 31 pages. v2: corrected typo, v3: corrected Fig. 1
What Next-Generation 21 cm Power Spectrum Measurements Can Teach Us About the Epoch of Reionization
A number of experiments are currently working towards a measurement of the 21
cm signal from the Epoch of Reionization. Whether or not these experiments
deliver a detection of cosmological emission, their limited sensitivity will
prevent them from providing detailed information about the astrophysics of
reionization. In this work, we consider what types of measurements will be
enabled by a next generation of larger 21 cm EoR telescopes. To calculate the
type of constraints that will be possible with such arrays, we use simple
models for the instrument, foreground emission, and the reionization history.
We focus primarily on an instrument modeled after the
collecting area Hydrogen Epoch of Reionization Array (HERA) concept design, and
parameterize the uncertainties with regard to foreground emission by
considering different limits to the recently described "wedge" footprint in
k-space. Uncertainties in the reionization history are accounted for using a
series of simulations which vary the ionizing efficiency and minimum virial
temperature of the galaxies responsible for reionization, as well as the mean
free path of ionizing photons through the IGM. Given various combinations of
models, we consider the significance of the possible power spectrum detections,
the ability to trace the power spectrum evolution versus redshift, the
detectability of salient power spectrum features, and the achievable level of
quantitative constraints on astrophysical parameters. Ultimately, we find that
of collecting area is enough to ensure a very high significance
() detection of the reionization power spectrum in even the
most pessimistic scenarios. This sensitivity should allow for meaningful
constraints on the reionization history and astrophysical parameters,
especially if foreground subtraction techniques can be improved and
successfully implemented. Comment: 27 pages, 18 figures, updated SKA numbers in appendix
Time-resolved Neutron-gamma-ray Data Acquisition for in Situ Subsurface Planetary Geochemistry
The current gamma-ray/neutron instrumentation development effort at NASA Goddard Space Flight Center aims to extend the use of active pulsed neutron interrogation techniques to probe the subsurface elemental composition of planetary bodies in situ. Previous NASA planetary science missions that used neutron and/or gamma-ray spectroscopy instruments relied on neutrons produced by galactic cosmic rays. One of the distinguishing features of this effort is the inclusion of a high-intensity 14.1 MeV pulsed neutron generator synchronized with a custom data acquisition system to time each event relative to the pulse. With usually only one opportunity to collect data, it is difficult to set a priori time-gating windows to obtain the best possible results. Acquiring time-tagged, event-by-event data from nuclear induced reactions provides raw data sets containing channel/energy and event time for each gamma ray or neutron detected. The resulting data set can be plotted as a function of time or energy using optimized analysis windows after the data are acquired. Time windows can now be chosen to produce energy spectra that yield the most statistically significant and accurate elemental composition results that can be derived from the complete data set. The advantages of post-processing gamma-ray time-tagged event-by-event data in experimental tests using our prototype instrument will be demonstrated.
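The post-acquisition time gating described above can be sketched as a simple filter over time-tagged events. The event tuples, window bounds, and energies below are illustrative placeholders, not values from the instrument.

```python
def gate_events(events, t_start, t_stop):
    """Return energies of events whose time since the neutron pulse
    falls in the half-open window [t_start, t_stop)."""
    return [energy for t, energy in events if t_start <= t < t_stop]

# Hypothetical time-tagged events: (time since pulse in microseconds, energy in keV)
events = [(2.0, 511.0), (15.0, 1460.0), (40.0, 2223.0), (120.0, 6129.0)]

# A window chosen after acquisition, e.g. to isolate capture gammas:
capture_window = gate_events(events, 10.0, 100.0)
```

Because the full event list is retained, the window bounds can be re-optimized repeatedly on the same data set, which is the advantage over fixed a priori gating.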
Bottom and Suspended Sediment Backscatter Measurements in a Flume—Towards Quantitative Bed and Water Column Properties
For health and impact studies of water systems, monitoring underwater environments is essential, for which multi-frequency single- and multibeam echosounders are commonly used state-of-the-art technologies. However, the current scarcity of sediment reference datasets of both bottom backscatter angular response and water column scattering hampers empirical data interpretation. Comprehensive reference data derived from measurements in a controlled environment should optimize the use of empirical backscatter data. To prepare for such innovative experiments, we conducted a feasibility experiment in the Delta Flume (Deltares, The Netherlands). Sonar data were recorded in several configurations of the flume floor and suspended sediment plumes. The results revealed that flume reverberation was sufficiently low and that the differential settling of fine-sand plumes in the water column was clearly detected. Following this successful feasibility test, future comprehensive experiments will feature multi-frequency multi-angle measurements on a variety of sediment types, additional scatterers and sediment plumes, resulting in reference datasets for an improved interpretation of underwater backscatter measurements for scientific observation and sustainable management.
String Theory on Warped AdS_3 and Virasoro Resonances
We investigate aspects of holographic duals to time-like warped AdS_3
space-times, which include Gödel's universe, in string theory. Using
worldsheet techniques similar to those that have been applied to AdS_3
backgrounds, we are able to identify space-time symmetry algebras that act on
the dual boundary theory. In particular, we always find at least one Virasoro
algebra with computable central charge. Interestingly, there exists a dense set
of points in the moduli space of these models in which there is actually a
second commuting Virasoro algebra, typically with different central charge than
the first. We analyze the supersymmetry of the backgrounds, finding related
enhancements, and comment on possible interpretations of these results. We also
perform an asymptotic symmetry analysis at the level of supergravity, providing
additional support for the worldsheet analysis. Comment: 24 pages + appendices
Governance, Coordination and Evaluation: the case for an epistemological focus and a return to C.E. Lindblom
While much political science research focuses on conceptualizing and analyzing various forms of governance, there remains a need to develop frameworks and criteria for governance evaluation (Torfing et al 2012). The post-positivist turn, influential in recent governance theory, emphasizes the complexity, uncertainty and the contested normative dimensions of policy analysis. Yet a central evaluative question still arises concerning the capacity of governance networks to facilitate ‘coordination’. The classic contributions of Charles Lindblom, although pre-dating the contemporary governance literature, can enable further elaboration of and engagement with this question. Lindblom’s conceptualisation of coordination challenges in the face of complexity shares with post-positivism a recognition of the inevitably contested nature of policy goals. Yet Lindblom suggests a closer focus on the complex, dynamically evolving, broadly ‘economic’ choices and trade-offs involved in defining and delivering policy to enable these goals to be achieved, and the significant epistemological challenges that they raise for policy-makers. This focus can complement and enrich both post-positivist scholarship and the process- and incentives-orientated approaches which predominate in contemporary political science research on coordination in governance. This is briefly illustrated through a short case study evaluating governance for steering markets towards delivering low and zero carbon homes in England.
Working with wood in rivers in the Western United States
Recognition of the important physical and ecological roles played by large wood in channels and on floodplains has grown substantially during recent decades. Although large wood continues to be routinely removed from many river corridors worldwide, the practice of wood reintroduction has spread across the United States, the United Kingdom and western Europe, Australia, and New Zealand. The state-of-science regarding working with wood in rivers was discussed during a workshop held in Colorado, USA, in September 2022 with 40 participants who are scientists and practitioners from across the USA, UK, Europe, and Japan. The objectives of this paper are to present the findings from the workshop; summarize two case studies of wood in river restoration in the western United States; and provide suggestions for advancing the practice of wood in river management. We summarize the workshop results based on participant judgements and recommendations with respect to: (i) limitations and key barriers to using wood, which reflect perceptions and practicalities; (ii) gaps in the use of large wood in river management; (iii) scenarios in which wood is generally used effectively; and (iv) scenarios in which wood is generally not used effectively. The case studies illustrate the importance of the local geomorphic context, the configuration complexity of the wood, and the potential for modification of river corridor morphology to enhance desired benefits. Moving forward, we stress the importance of collaboration across disciplines and across communities of research scientists, practitioners, regulators, and potential stakeholders; accounting for stakeholder perceptions of the use of large wood; and increasing non-scientist access to the latest state-of-science knowledge.
PhosTryp: a phosphorylation site predictor specific for parasitic protozoa of the family trypanosomatidae
Background: Protein phosphorylation modulates protein function in organisms at all levels of complexity. Parasites of the Leishmania genus undergo various developmental transitions in their life cycle triggered by changes in the environment. The molecular mechanisms that these organisms use to process and integrate these external cues are largely unknown. However, Leishmania lacks transcription factors; therefore most regulatory processes may occur at a post-translational level, and phosphorylation has recently been demonstrated to be an important player in this process. Experimental identification of phosphorylation sites is a time-consuming task. Moreover, some sites could be missed due to the highly dynamic nature of this process or to difficulties in phospho-peptide enrichment. Results: Here we present PhosTryp, a phosphorylation site predictor specific for trypanosomatids. This method uses an SVM-based approach and has been trained with recent Leishmania phosphoproteomics data. PhosTryp achieved a 17% improvement in prediction performance compared with Netphos, a non-organism-specific predictor. The analysis of the peptides correctly predicted by our method but missed by Netphos demonstrates that PhosTryp captures Leishmania-specific phosphorylation features. More specifically, our results show that Leishmania kinases have sequence specificities which are different from their counterparts in higher eukaryotes. Consequently we were able to propose two possible Leishmania-specific phosphorylation motifs. We further demonstrate that this improvement in performance extends to the related trypanosomatids Trypanosoma brucei and Trypanosoma cruzi. Finally, in order to maximize the usefulness of PhosTryp, we trained a predictor combining all the peptides from L. infantum, T. brucei and T. cruzi. Conclusions: Our work demonstrates that training on organism-specific data results in an improvement that extends to related species. PhosTryp is freely available at http://phostryp.bio.uniroma2.it
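SVM-based site predictors of this kind typically encode a fixed-width sequence window around each candidate serine, threonine, or tyrosine into a numeric feature vector. The sketch below shows a generic one-hot window encoding; the window size, encoding, and example sequence are illustrative assumptions, not the published PhosTryp feature set.

```python
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def window_features(sequence, center, half_width=4):
    """One-hot encode the residues around a candidate phosphosite.

    Positions falling past the sequence ends are encoded as all-zero
    columns. Window size and encoding are illustrative; the published
    predictor's exact features may differ.
    """
    features = []
    for pos in range(center - half_width, center + half_width + 1):
        column = [0] * len(AMINO_ACIDS)
        if 0 <= pos < len(sequence):
            idx = AMINO_ACIDS.find(sequence[pos])
            if idx >= 0:
                column[idx] = 1
        features.extend(column)
    return features

# Hypothetical peptide with a candidate serine at index 5:
vec = window_features("MKRSLSDNEP", 5)
# vec can then be fed to an SVM classifier trained on labeled phosphoproteomics data
```

Training on organism-specific labeled windows is what lets such a classifier pick up kinase sequence preferences that a generic predictor misses.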