
    Floquet topological transitions in extended Kane-Mele models with disorder

    In this work we use Floquet theory to theoretically study the influence of circularly polarized light on disordered two-dimensional models exhibiting topological transitions. We find that circularly polarized light can induce a topological transition in extended Kane-Mele models that include additional hopping terms and on-site disorder. The topological transitions are understood from the Floquet-Bloch band structure of the clean system at high symmetry points in the first Brillouin zone. The light modifies the equilibrium band structure of the clean system in such a way that the smallest gap in the Brillouin zone can be shifted from the M points to the K (K′) points, the Γ point, or even other lower symmetry points. The movement of the minimal gap point through the Brillouin zone as a function of laser parameters is explained in the high frequency regime through the Magnus expansion. In the disordered model, we compute the Bott index to reveal topological phases and transitions. The disorder can induce transitions from topologically non-trivial states to trivial states or vice versa, both examples of Floquet topological Anderson transitions. As a result of the movement of the minimal gap point through the Brillouin zone as a function of laser parameters, the nature of the topological phases and transitions is laser-parameter dependent, a contrasting behavior to the Kane-Mele model. Comment: 10 pages, 7 figures
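
    For background, a minimal sketch of the two quantities this abstract leans on (standard textbook forms, not taken from the paper; sign and ordering conventions vary). For a time-periodic Hamiltonian H(t) = \sum_n H_n e^{i n \omega t}, the high-frequency (Magnus/van Vleck) expansion of the effective Floquet Hamiltonian reads

        H_{\mathrm{eff}} \simeq H_0 + \frac{1}{\hbar\omega} \sum_{n \ge 1} \frac{[H_{+n},\, H_{-n}]}{n} + \mathcal{O}(\omega^{-2}),

    and the Bott index used to diagnose topology in disordered, non-periodic samples is commonly written as

        B = \frac{1}{2\pi}\, \mathrm{Im}\, \mathrm{Tr}\, \log\!\left( \tilde{V} \tilde{U} \tilde{V}^{\dagger} \tilde{U}^{\dagger} \right), \qquad \tilde{U} = P\, e^{2\pi i X / L_x}\, P, \quad \tilde{V} = P\, e^{2\pi i Y / L_y}\, P,

    with P the projector onto the occupied Floquet bands.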

    The VITROVAC Cavity for the TERA/PIMMS Medical Synchrotron

    A proton and light-ion medical synchrotron is characterised by a large frequency swing for the RF between the injection and the top energy. For this purpose, a VITROVAC®-loaded RF cavity has been developed for the Proton-Ion Medical Machine Study (PIMMS) at CERN, and for TERA, the Italian project of a proton and light-ion synchrotron for cancer therapy, based on the PIMMS study. The main features are a large frequency swing, particularly extended to the low frequency range, a very large relative permeability and a low Q factor. The total power needed is less than 100 kW, while a very small bias power is required for the frequency tuning. The main mechanical characteristics are compactness (less than 1.5 m) and simplicity of construction. As a result, the requirements of the medical synchrotron are comfortably satisfied, namely: 0.4 to 3 MHz swing, 3 kV peak voltage at a repetition rate of less than 1 s
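
    As a rough illustration of why the frequency swing is set by the change in particle velocity between injection and top energy, the sketch below evaluates f_RF = h * beta * c / C at two kinetic energies. The circumference, harmonic number, and energies are assumed placeholder values, not parameters taken from the PIMMS design.

        # Sketch: RF frequency swing of a synchrotron, f_rf = h * beta * c / C.
        # All machine parameters below are illustrative assumptions.
        C_LIGHT = 299_792_458.0  # speed of light [m/s]

        def beta_from_kinetic_energy(t_mev: float, rest_mass_mev: float) -> float:
            """Relativistic beta from kinetic energy and rest mass."""
            gamma = 1.0 + t_mev / rest_mass_mev
            return (1.0 - 1.0 / gamma**2) ** 0.5

        def rf_frequency(t_mev: float, rest_mass_mev: float, circumference_m: float, harmonic: int) -> float:
            """RF frequency in Hz for the given beam energy."""
            return harmonic * beta_from_kinetic_energy(t_mev, rest_mass_mev) * C_LIGHT / circumference_m

        proton_mass = 938.272   # MeV
        circumference = 75.0    # m, assumed
        f_inj = rf_frequency(20.0, proton_mass, circumference, harmonic=1)   # assumed injection energy
        f_top = rf_frequency(250.0, proton_mass, circumference, harmonic=1)  # assumed top energy
        print(f"RF swing: {f_inj / 1e6:.2f} MHz to {f_top / 1e6:.2f} MHz")

    With these placeholder numbers the swing lands inside the 0.4 to 3 MHz range quoted above, which is what drives the need for a broadband, low-Q, high-permeability loaded cavity.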

    Progress of the klystron and cavity test stand for the FAIR proton linac

    In a collaboration between the FAIR project, GSI, and CNRS, the IPNO lab provided the high-power RF components for a cavity and klystron test stand. For initial operation of the 3 MW Thales TH2181 klystron at 325.224 MHz, we received a high-voltage modulator from CERN Linac 4 on loan. Here we report how we integrated the combination of klystron, high-voltage modulator, and auxiliaries to accumulate operating experience. RF operation of the klystron started on a water-cooled load; soon the circulator will be included, and then the prototype CH cavity in the radiation-shielded area will be powered. The 45 kW amplifiers for the 3 buncher structures of the FAIR proton linac were checked at the test stand, and the results are presented here

    Modernisation of the 108 MHz RF systems of the UNILAC post stripper section


    A Tamarisk Habitat Suitability Map for the Continental US

    This paper presents a national-scale map of habitat suitability for a high-priority invasive species, Tamarisk (Tamarix spp., salt cedar). We successfully integrate satellite data and tens of thousands of field sampling points through logistic regression modeling to create a habitat suitability map that is 90% accurate. This interagency effort uses field data collected and coordinated through the US Geological Survey and nation-wide environmental data layers derived from NASA's MODerate Resolution Imaging Spectroradiometer (MODIS). We demonstrate the utility of the map by ranking the lower 48 US states (and the District of Columbia) based upon their absolute, as well as proportional, areas of highly likely and moderately likely habitat for Tamarisk. The interagency effort and modeling approach presented here could be applied to map other harmful species in the US and globally
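
    A minimal sketch of the kind of logistic-regression habitat model described above. The predictors, sample sizes, and labels are synthetic placeholders; the actual USGS field database and MODIS-derived layers are not reproduced here.

        # Sketch: fit presence/absence field points against environmental predictors,
        # then use the predicted probability as a habitat-suitability score.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import accuracy_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        # Columns stand in for MODIS-derived layers (e.g. NDVI, land surface temperature).
        X = rng.normal(size=(5000, 4))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000) > 0).astype(int)

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
        model = LogisticRegression().fit(X_train, y_train)

        print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
        suitability = model.predict_proba(X_test)[:, 1]  # probability of presence per location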

    A New Application to Facilitate Post-Fire Recovery and Rehabilitation in Savanna Ecosystems

    The U.S. government spends an estimated $3 billion per year to fight forest fires in the United States. Post-fire rehabilitation activities represent a small but essential portion of that total. The Rehabilitation Capability Convergence for Ecosystem Recovery (RECOVER) system is currently under development for savanna ecosystems in the western U.S. The prototype of this system has been built and will undergo real-world testing during the summer 2013 fire season. When fully deployed, the RECOVER system will provide emergency rehabilitation teams with critical and timely information for management decisions regarding stabilization and rehabilitation strategies

    MERRA Analytic Services: Meeting the Big Data Challenges of Climate Science Through Cloud-enabled Climate Analytics-as-a-service

    Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRAAS) is an example of cloud-enabled CAaaS built on this principle. MERRAAS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global, temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRAAS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data-proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRAAS has been demonstrated in several applications. In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data
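
    As a purely illustrative sketch of the data-proximal MapReduce pattern mentioned in item (1), the snippet below averages a gridded variable over a set of data chunks with a map step (partial sums per chunk) and a reduce step (combining the partials). The arrays are random placeholders; this is not the MERRAAS API.

        # Sketch of a map/reduce-style mean over chunks of a gridded variable.
        from functools import reduce
        import numpy as np

        def map_step(chunk: np.ndarray) -> tuple[float, int]:
            """Emit (partial sum, count) for one chunk, e.g. one month of data."""
            return float(chunk.sum()), chunk.size

        def reduce_step(a: tuple[float, int], b: tuple[float, int]) -> tuple[float, int]:
            """Combine partial sums and counts from two map outputs."""
            return a[0] + b[0], a[1] + b[1]

        # Placeholder "files": random grids standing in for monthly reanalysis fields.
        chunks = [np.random.rand(361, 576) for _ in range(12)]

        total, count = reduce(reduce_step, map(map_step, chunks))
        print("mean over all chunks:", total / count)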

    759–5 Use of an Interactive Electronic Whiteboard to Teach Clinical Cardiology Decision Analysis to Medical Students

    We used innovative state-of-the-art computer and collaboration technologies to teach first-year medical students an analytic methodology to solve difficult clinical cardiology problems and make informed medical decisions. Clinical examples included the decision to administer thrombolytic therapy considering the risk of hemorrhagic stroke, and activity recommendations for athletes at risk for sudden death. Students received instruction on the decision-analytic approach, which integrates pathophysiology, treatment efficacy, diagnostic test interpretation, health outcomes, patient preferences, and cost-effectiveness into a decision-analytic model. The traditional environment of a small group and blackboard was significantly enhanced by using an electronic whiteboard, the Xerox LiveBoardℱ. The LiveBoard features an 80486-based personal computer, a large (3’×4’) display, and wireless pens for input. It allowed the integration of decision-analytic software, statistical software, digital slides, and additional media. We developed TIDAL (Team Interactive Decision Analysis in the Large-screen environment), a software package to interactively construct decision trees, calculate expected utilities, and perform one- and two-way sensitivity analyses using pen and gesture inputs. The LiveBoard also allowed the novel incorporation of Gambler, a utility assessment program obtained from the New England Medical Center. Gambler was used to obtain utilities for outcomes such as non-disabling hemorrhagic stroke. The interactive nature of the LiveBoard allowed real-time decision model development by the class, followed by instantaneous calculation of expected utilities and sensitivity analyses. The multimedia aspect and interactivity were conducive to extensive class participation. Ten out of eleven students wanted decision-analytic software available for use during their clinical years, and all students would recommend the course to next year's students. We plan to experiment with the electronic collaboration features of this technology and allow groups separated by time or space to collaborate on decisions and explore the models created
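
    A minimal sketch of the expected-utility and one-way sensitivity calculations that a package like the TIDAL tool described above automates. All probabilities and utilities below are invented placeholders, not clinical values.

        # Toy decision tree: treat vs. do not treat, each with a chance of a bad outcome.
        def expected_utility(p_bad: float, u_bad: float = 0.2, u_good: float = 1.0) -> float:
            """Expected utility of a branch with probability p_bad of the bad outcome."""
            return p_bad * u_bad + (1.0 - p_bad) * u_good

        def preferred(p_bad_treat: float, p_bad_no_treat: float) -> str:
            """Pick the branch with the higher expected utility."""
            return "treat" if expected_utility(p_bad_treat) > expected_utility(p_bad_no_treat) else "do not treat"

        # One-way sensitivity analysis: sweep the risk of the bad outcome under treatment
        # and see where the preferred strategy flips.
        for p in [i / 100 for i in range(0, 31, 5)]:
            print(f"p(bad | treat) = {p:.2f} -> {preferred(p, p_bad_no_treat=0.15)}")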

    Spin‐Flipping Polarized Deuterons At COSY

    We recently stored a 1.85 GeV/c vertically polarized deuteron beam in the COSY Ring in Jülich; we then spin-flipped it by ramping a new air-core rf dipole's frequency through an rf-induced spin resonance to manipulate the polarization direction of the deuteron beam. We first experimentally determined the resonance's frequency and set the dipole's rf voltage to its maximum; then we varied its frequency ramp time and frequency range. We used the EDDA detector to measure the vector and tensor polarization asymmetries. We have not yet extracted the deuteron's tensor polarization spin-flip parameters from the measured data, since our short run did not provide adequate tensor analyzing-power data at 1.85 GeV/c. However, with a 100 Hz frequency ramp and our longest ramp time of 400 s, the deuterons' vector polarization spin-flip efficiency was 48±1%. © 2004 American Institute of Physics. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/87592/2/763_1.pd
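
    For context, the efficiency of such a frequency-ramp spin flip is conventionally analyzed with the Froissart-Stora relation, quoted here as standard background rather than as a result reported in this abstract:

        \frac{P_f}{P_i} = 2\,\exp\!\left[-\,\frac{(\pi\,\varepsilon\, f_c)^{2}}{\Delta f/\Delta t}\right] - 1,

    where \varepsilon is the rf-induced resonance strength, f_c the beam's circulation frequency, and \Delta f/\Delta t the frequency range of the ramp divided by the ramp time; an efficiency below unity, such as the 48% quoted above, corresponds to an incompletely adiabatic crossing of the resonance.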

    Big Data Challenges in Climate Science: Improving the Next-Generation Cyberinfrastructure

    The knowledge we gain from research in climate science depends on the generation, dissemination, and analysis of high-quality data. This work comprises technical practice as well as social practice, both of which are distinguished by their massive scale and global reach. As a result, the amount of data involved in climate research is growing at an unprecedented rate. Climate model intercomparison (CMIP) experiments, the integration of observational data and climate reanalysis data with climate model outputs, as seen in the Obs4MIPs, Ana4MIPs, and CREATE-IP activities, and the collaborative work of the Intergovernmental Panel on Climate Change (IPCC) provide examples of the types of activities that increasingly require an improved cyberinfrastructure for dealing with large amounts of critical scientific data. This paper provides an overview of some of climate science's big data problems and the technical solutions being developed to advance data publication, climate analytics as a service, and interoperability within the Earth System Grid Federation (ESGF), the primary cyberinfrastructure currently supporting global climate research activities