
    Central Acceptance Testing for Camera Technologies for CTA

    The Cherenkov Telescope Array (CTA) is an international initiative to build the next-generation ground-based very-high-energy gamma-ray observatory. It will consist of telescopes of three different sizes, employing several different technologies for the cameras that detect the Cherenkov light from the observed air showers. To ensure that each camera technology complies with CTA requirements, CTA will perform central acceptance testing of each camera technology. To support this, the Camera Test Facilities (CTF) work package is developing a detailed test program covering the most important performance, stability, and durability requirements, including setting up the necessary equipment. Performance testing will cover a wide range of tests, such as signal amplitude, time resolution, dead-time determination, trigger efficiency, performance under temperature and humidity variations, and several others. These tests can be performed on fully integrated cameras using a portable setup at the camera construction sites. In addition, two different setups for performance tests on camera sub-units are being built, which can provide early feedback for camera development. Stability and durability tests will include the long-term functionality of movable parts, water tightness of the camera housing, temperature and humidity cycling, resistance to vibrations during transport or due to possible earthquakes, UV resistance of materials, and several others. Some durability tests will need to be contracted out because they require dedicated equipment not currently available within CTA. The planned test procedures and the current status of the test facilities are presented.
    Comment: 8 pages, 3 figures. In Proceedings of the 34th International Cosmic Ray Conference (ICRC2015), The Hague, The Netherlands. All CTA contributions at arXiv:1508.0589
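    One of the performance tests named above is dead-time determination. As a rough, hypothetical illustration of the idea (not the CTF's actual procedure), the dead time of a camera can be estimated by pulsing it at a known rate and comparing with the recorded trigger rate under a simple non-paralyzable detector model:

    ```python
    # Hypothetical sketch of a dead-time measurement using the non-paralyzable
    # model, where the measured rate m relates to the true rate n via
    # m = n / (1 + n * tau). The pulser and trigger rates below are made-up
    # numbers for illustration, not CTA data.

    def dead_time_nonparalyzable(true_rate_hz: float, measured_rate_hz: float) -> float:
        """Solve m = n / (1 + n*tau) for tau, in seconds."""
        return 1.0 / measured_rate_hz - 1.0 / true_rate_hz

    # Example: a 10 kHz test pulser yields 9.6 kHz of recorded triggers.
    tau = dead_time_nonparalyzable(10_000.0, 9_600.0)
    print(f"estimated dead time: {tau * 1e6:.2f} us per event")  # ~4.17 us
    ```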

    Political market and regulatory uncertainty: Insights and implications for integrated strategy

    Managers can craft effective integrated strategy by properly assessing regulatory uncertainty. Leveraging the existing political markets literature, we predict regulatory uncertainty from the novel interaction of demand- and supply-side rivalries across a range of political markets. We argue for two primary drivers of regulatory uncertainty: ideologically motivated interests opposed to the firm and a lack of competition for power among the political actors supplying public policy. We align three previously disparate dimensions of nonmarket strategy - profile level, coalition breadth, and pivotal target - with levels of regulatory uncertainty. Through this framework, we demonstrate how and when firms employ different nonmarket strategies. To illustrate variation in nonmarket strategy across levels of regulatory uncertainty, we analyze several market entry decisions of foreign firms operating in the global telecommunications sector.

    Detection and count of Salmonella enterica in pork meat products

    A direct plating technique for the enumeration of S. enterica in 90 pig meat samples was evaluated against a three-tube MPN procedure. For the detection of S. enterica, the ISO 6597:2002 method was employed. Pork samples were collected at retail level in northern Italy. A total of 15 (16.7%) Salmonella-positive samples were detected. With the MPN method, S. enterica was countable in 12 (80.0%) samples, while direct plating gave positive results in only two (13.3%) samples. The ISO 6597:2002 method identified 12 (80.0%) contaminated samples out of 15. The enumeration levels of S. enterica ranged from 0.03 MPN/g to > 110 MPN/g by the MPN method, and from 10 CFU/g to 180 CFU/g by direct plating. Seven Salmonella serovars were detected: S. Typhimurium, S. Derby, S. Give, S. Rissen, S. Livingstone, S. Brandenburg and S. London, with S. Typhimurium and S. Derby as the predominant ones.
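    For readers unfamiliar with the MPN figures above: a three-tube MPN is the maximum-likelihood estimate of concentration under a Poisson model of how many tubes turn positive at each dilution, which is how standard MPN tables are derived. A minimal sketch with made-up tube counts (generic MPN arithmetic, not the study's code):

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def mpn_estimate(positives, tubes, grams):
        """positives[i] of tubes[i] tubes are positive at inoculum grams[i] (g)."""
        positives = np.asarray(positives)
        tubes = np.asarray(tubes)
        grams = np.asarray(grams)

        def neg_log_likelihood(log_mpn):
            lam = np.exp(log_mpn)                 # organisms per gram
            p = 1.0 - np.exp(-lam * grams)        # P(tube positive), Poisson model
            p = np.clip(p, 1e-12, 1 - 1e-12)
            return -np.sum(positives * np.log(p) + (tubes - positives) * np.log(1 - p))

        res = minimize_scalar(neg_log_likelihood, bounds=(-10, 10), method="bounded")
        return np.exp(res.x)

    # Illustrative example: 3/3 positive at 1 g, 1/3 at 0.1 g, 0/3 at 0.01 g
    print(f"MPN ~ {mpn_estimate([3, 1, 0], [3, 3, 3], [1.0, 0.1, 0.01]):.2f} per g")
    ```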

    Beyond places, beyond identities: for a cultural redefinition of the relationship between humankind and nature

    The present moment in human history is unprecedented, unique, and probably unrepeatable. Over past centuries and millennia, the words that told of the flowing of the world, of its incessant transformation, went largely or entirely unheard. Since ancient times, Heraclitus, Lao-Tze and others have explained this truth: everything flows and is transformed. Western thought has, over its course, fixed this flow in place, seeking to create and apply replicable, reusable models. The most efficient and effective such model has been that of nature, or rather of the human-nature relationship. The climate crisis the Earth is experiencing demands a profound rethinking of this concept, as of many others; and it is precisely from this concept, as a structural basis, that the urgency of redesigning human dwelling becomes evident (Heidegger 2015). All of this must start from a philosophical and geographical refounding of the relationship between humanity and the world, centred on new terms, concepts and words capable of grasping the complexity of the world's situation; an essential basis for such a reflection will be the concept of the "hyperobject" developed by Timothy Morton (2013).

    Habitat availability for amphibians and extinction threat: A global analysis

    Aim: Habitat loss and degradation are the factors threatening the largest number of amphibian species. However, quantitative measures of habitat availability exist only for a small subset of them. We evaluated the relationships between habitat availability, extinction risk and drivers of threat for the world's amphibians. We developed deductive habitat suitability models to estimate the extent of suitable habitat and the proportion of suitable habitat (PSH) inside the geographic range of each species, covering species and areas for which little or no high-resolution distribution data are available. Location: Global. Methods: We used information on habitat preferences to develop habitat suitability models at 300-m resolution, integrating range maps with land cover and elevation. Model performance was assessed by comparing model output with point localities where species were recorded. We then used habitat availability as a surrogate for area of occupancy. Using the IUCN criteria, we identified species with a narrow area of occupancy, for which extinction risk is likely underestimated. Results: We developed models for 5363 amphibians. Validation success was high (94%) and was better for forest specialists and generalists than for open-habitat specialists. Generalists had proportionally more habitat than forest or open-habitat specialists. The PSH was lower for species with small geographical ranges, for species currently listed as threatened, and for species for which habitat loss is recognized as a threat. Differences in habitat availability among biogeographical realms were strong. We identified 61 forest species for which the extinction risk may be higher than currently assessed in the Red List, owing to the limited extent of suitable habitat. Main conclusions: Habitat models can accurately predict amphibian distribution at fine scale and allow biogeographical patterns of habitat availability to be described. The strong relationship between the amount of suitable habitat and extinction threat may aid conservation assessment of species for which limited information is currently available.
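    A deductive habitat suitability model of the kind described reduces to a masking operation: within a species' range, a cell counts as suitable if its land-cover class matches the species' habitat preferences and its elevation falls within known limits. A minimal sketch with invented placeholder grids and class codes (the paper's actual layers are 300-m land-cover and elevation data):

    ```python
    import numpy as np

    def proportion_suitable_habitat(land_cover, elevation, range_mask,
                                    suitable_classes, elev_min, elev_max):
        """Return (extent of suitable habitat in cells, PSH within the range)."""
        suitable = (
            np.isin(land_cover, list(suitable_classes))
            & (elevation >= elev_min) & (elevation <= elev_max)
            & range_mask
        )
        n_range = range_mask.sum()
        return suitable.sum(), suitable.sum() / n_range if n_range else np.nan

    # Toy 4x4 landscape: class 1 = forest, 2 = cropland; a forest specialist
    land_cover = np.array([[1, 1, 2, 2]] * 4)
    elevation = np.full((4, 4), 500.0)
    range_mask = np.ones((4, 4), dtype=bool)
    extent, psh = proportion_suitable_habitat(land_cover, elevation, range_mask,
                                              {1}, 0.0, 2000.0)
    print(extent, psh)  # 8 cells suitable, PSH = 0.5
    ```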

    In silico identification of new putative pathogenic variants in the NEU1 sialidase gene affecting enzyme function and subcellular localization

    The NEU1 gene is the first identified member of the human sialidases, glycohydrolytic enzymes that remove the terminal sialic acid from oligosaccharide chains. Mutations in the NEU1 gene cause sialidosis (MIM 256550), a severe lysosomal storage disorder with an autosomal recessive mode of inheritance. Sialidosis has been classified into two subtypes: sialidosis type I, a normomorphic, late-onset form, and sialidosis type II, a more severe neonatal or early-onset form. A total of 50 causative mutations are reported in the HGMD database, most of which are missense variants. To further characterize the NEU1 gene and identify new functionally relevant protein isoforms, we studied its genetic variability in the human population using data generated by two large sequencing projects: the 1000 Genomes Project (1000G) and the NHLBI GO Exome Sequencing Project (ESP). Together these two datasets comprise a cohort of 7595 sequenced individuals, making it possible to identify rare variants and dissect population-specific ones. By integrating this approach with biochemical and cellular studies, we were able to identify new rare missense and frameshift alleles in the NEU1 gene. Among the 9 candidate variants tested, only two, c.650T>C and c.700G>A, resulted in significantly lower levels of sialidase activity. These two mutations give rise to the amino acid substitutions p.V217A and p.D234N, respectively. NEU1 variants carrying either of these two amino acid changes have 44% and 25% residual sialidase activity compared to the wild-type enzyme, reduced protein levels, and altered subcellular localization. Thus, they may represent new, putative pathological mutations resulting in sialidosis type I. The in silico approach used in this study has enabled the identification of previously unknown NEU1 functional alleles that are widespread in the population and could be tested in future functional studies.
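    The correspondence between the two cDNA changes and the quoted amino acid substitutions can be verified with simple codon arithmetic; the sketch below is generic HGVS bookkeeping, not the authors' code:

    ```python
    # For a coding position c.N, the affected codon is ceil(N / 3) and the
    # position within the codon is ((N - 1) % 3) + 1.

    def codon_of(cdna_pos: int) -> tuple[int, int]:
        """Return (codon number, base position 1-3 within codon)."""
        codon, offset = divmod(cdna_pos - 1, 3)
        return codon + 1, offset + 1

    for pos, expected in [(650, "p.V217A"), (700, "p.D234N")]:
        codon, within = codon_of(pos)
        print(f"c.{pos}: codon {codon}, base {within} -> {expected}")
    # c.650T>C hits base 2 of codon 217 (GTx -> GCx, Val -> Ala);
    # c.700G>A hits base 1 of codon 234 (GAy -> AAy, Asp -> Asn).
    ```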

    Realtime processing of LOFAR data for the detection of nano-second pulses from the Moon

    The low flux of ultra-high-energy cosmic rays (UHECR) at the highest energies makes it challenging to answer the long-standing question of their origin and nature. Even lower fluxes of neutrinos with energies above $10^{22}$ eV are predicted in certain Grand Unified Theories (GUTs) and, e.g., models of super-heavy dark matter (SHDM). The significant increase in detector volume required to detect these particles can be achieved by searching with current and future radio telescopes for the nano-second radio pulses that are emitted when a particle interacts in the Moon. In this contribution we present the design of an online analysis and trigger pipeline for the detection of nano-second pulses with the LOFAR radio telescope. The most important steps of the processing pipeline are digital focusing of the antennas towards the Moon, correction of the signal for ionospheric dispersion, and synthesis of the time-domain signal from the polyphase-filtered signal in the frequency domain. The implementation of the pipeline on a GPU/CPU cluster is discussed together with the computing performance of the prototype.
    Comment: Proceedings of the 22nd International Conference on Computing in High Energy and Nuclear Physics (CHEP2016), US
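    Of the pipeline steps listed, the ionospheric dispersion correction is the easiest to sketch: the ionosphere imposes a plasma phase proportional to TEC/f, which can be undone in the frequency domain. The snippet below is an illustrative sketch under assumed sign conventions and an assumed slant TEC value, not the LOFAR pipeline code:

    ```python
    import numpy as np

    # phi(f) = (e^2 / (4*pi*eps0*m_e*c)) * TEC / f gives about
    # 8.45e9 rad*Hz per TECU (1 TECU = 1e16 electrons/m^2).
    K_IONO = 8.45e9

    def dedisperse(trace: np.ndarray, sample_rate_hz: float, stec_tecu: float) -> np.ndarray:
        """Remove ionospheric dispersion from a real-valued time series."""
        spectrum = np.fft.rfft(trace)
        freqs = np.fft.rfftfreq(trace.size, d=1.0 / sample_rate_hz)
        freqs[0] = freqs[1]  # avoid division by zero at DC
        spectrum *= np.exp(1j * K_IONO * stec_tecu / freqs)  # undo plasma phase
        return np.fft.irfft(spectrum, n=trace.size)

    # Example: correct a 200 MHz-sampled trace for an assumed STEC of 10 TECU
    corrected = dedisperse(np.random.randn(4096), 200e6, stec_tecu=10.0)
    ```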

    Cosmic Ray Physics with the LOFAR Radio Telescope

    The LOFAR radio telescope is able to measure the radio emission from cosmic-ray-induced air showers with hundreds of individual antennas. This allows for precision testing of the emission mechanisms for the radio signal, as well as determination of the depth of shower maximum $X_{\max}$, the shower observable most sensitive to the mass of the primary cosmic ray, to better than 20 g/cm$^2$. With a densely instrumented circular area of roughly 320 m$^2$, LOFAR targets cosmic-ray astrophysics in the energy range $10^{16}$ - $10^{18}$ eV. In this contribution we give an overview of the status, recent results, and future plans of cosmic ray detection with the LOFAR radio telescope.
    Comment: Proceedings of the 26th Extended European Cosmic Ray Symposium (ECRS), Barnaul/Belokurikha, 201
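    As a back-of-the-envelope reminder of why a 20 g/cm$^2$ resolution on $X_{\max}$ is mass-sensitive: in the superposition model a nucleus of mass A and energy E behaves like A protons of energy E/A, shifting the mean $X_{\max}$ down by roughly the elongation rate times $\log_{10} A$. The numbers below are illustrative textbook values, not LOFAR results:

    ```python
    import numpy as np

    # Assumed typical elongation rate, g/cm^2 per decade of energy.
    D = 60.0

    def xmax_shift(a_mass: int) -> float:
        """Mean X_max shift of a nucleus of mass A relative to a proton."""
        return -D * np.log10(a_mass)

    print(f"iron vs proton: {xmax_shift(56):.0f} g/cm^2")  # ~ -105 g/cm^2
    # A 20 g/cm^2 resolution therefore cleanly separates light and heavy primaries.
    ```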