
    The SPARC Toroidal Field Model Coil Program

    The SPARC Toroidal Field Model Coil (TFMC) Program was a three-year effort between 2018 and 2021 that developed novel Rare Earth Yttrium Barium Copper Oxide (REBCO) superconductor technologies and then successfully utilized these technologies to design, build, and test a first-in-class, high-field (~20 T), representative-scale (~3 m) superconducting toroidal field coil. With the principal objective of demonstrating mature, large-scale REBCO magnets, the project was executed jointly by the MIT Plasma Science and Fusion Center (PSFC) and Commonwealth Fusion Systems (CFS). The TFMC achieved its programmatic goal of experimentally demonstrating a large-scale, high-field REBCO magnet, reaching 20.1 T peak field-on-conductor with 40.5 kA of terminal current, 815 kN/m of Lorentz loading on the REBCO stacks, and almost 1 GPa of mechanical stress accommodated by the structural case. Fifteen internal demountable pancake-to-pancake joints operated in the 0.5 to 2.0 nOhm range at 20 K and in magnetic fields up to 12 T. The DC and AC electromagnetic performance of the magnet, predicted by new advances in high-fidelity computational models, was confirmed in two test campaigns, while the massively parallel, single-pass, pressure-vessel-style coolant scheme capable of large heat removal was validated. The REBCO current lead and feeder system was experimentally qualified up to 50 kA, and the cryocooler-based cryogenic system provided 600 W of cooling power at 20 K with mass flow rates up to 70 g/s at a maximum design pressure of 20 bar-a for the test campaigns. Finally, the feasibility of using passive self-protection against a quench in a fusion-scale, no-insulation (NI) TF coil was experimentally assessed with an intentional open-circuit quench at 31.5 kA terminal current. Comment: 17 pages, 9 figures; overview paper and the first of a six-part series of papers covering the TFMC Program
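
    As a rough, illustrative back-of-envelope (not an analysis from the paper), the quoted joint resistances and terminal current imply only a modest Joule heat load at 20 K relative to the 600 W of available cooling power. The sketch below uses only figures stated in the abstract and plain I^2*R dissipation, ignoring any field or temperature dependence of the joints.

```python
# Back-of-envelope Joule dissipation in the TFMC pancake-to-pancake joints,
# using only values quoted in the abstract (illustrative, not from the paper).

I_TERMINAL = 40.5e3                 # terminal current, A
R_JOINT_RANGE = (0.5e-9, 2.0e-9)    # measured joint resistance range, ohm
N_JOINTS = 15                       # internal demountable pancake-to-pancake joints
CRYO_POWER_20K = 600.0              # quoted cryogenic cooling power at 20 K, W

for r in R_JOINT_RANGE:
    p_joint = I_TERMINAL**2 * r     # P = I^2 * R per joint
    p_total = N_JOINTS * p_joint    # all fifteen joints combined
    print(f"R = {r * 1e9:.1f} nOhm: {p_joint:.2f} W per joint, "
          f"{p_total:.1f} W total "
          f"({100 * p_total / CRYO_POWER_20K:.0f}% of 600 W at 20 K)")
```

    Even at the upper end of the measured resistance range, the fifteen joints together dissipate on the order of 50 W, well within the quoted 600 W cooling capacity at 20 K.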


    Observed controls on resilience of groundwater to climate variability in sub-Saharan Africa

    Groundwater in sub-Saharan Africa supports livelihoods and poverty alleviation [1,2], maintains vital ecosystems, and strongly influences terrestrial water and energy budgets. Yet the hydrological processes that govern groundwater recharge and sustainability—and their sensitivity to climatic variability—are poorly constrained [4]. Given the absence of firm observational constraints, it remains to be seen whether model-based projections of decreased water resources in dry parts of the region [4] are justified. Here we show, through analysis of multidecadal groundwater hydrographs across sub-Saharan Africa, that levels of aridity dictate the predominant recharge processes, whereas local hydrogeology influences the type and sensitivity of precipitation–recharge relationships. Recharge in some humid locations varies by as little as five per cent (by coefficient of variation) across a wide range of annual precipitation values. Other regions, by contrast, show roughly linear precipitation–recharge relationships, with precipitation thresholds (of roughly ten millimetres or less per day) governing the initiation of recharge. These thresholds tend to rise as aridity increases, and recharge in drylands is more episodic and increasingly dominated by focused recharge through losses from ephemeral overland flows. Extreme annual recharge is commonly associated with intense rainfall and flooding events, themselves often driven by large-scale climate controls. Intense precipitation, even during years of lower overall precipitation, produces some of the largest years of recharge in some dry subtropical locations. Our results therefore challenge the ‘high certainty’ consensus regarding decreasing water resources in such regions of sub-Saharan Africa. The potential resilience of groundwater to climate variability in many areas that is revealed by these precipitation–recharge relationships is essential for informing reliable predictions of climate-change impacts and adaptation strategies
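
    To make the two statistics used above concrete, here is a minimal sketch on entirely synthetic data: the coefficient of variation (CV) of an annual recharge series, and a thresholded, roughly linear precipitation-recharge relationship of the kind described for drylands. The series, the threshold placement, and the recharge efficiency are hypothetical illustrations, not data or parameters from the study.

```python
# Illustrative only: synthetic data for the two statistics discussed above.
import numpy as np

rng = np.random.default_rng(0)

# (1) Coefficient of variation of annual recharge at a hypothetical humid site.
recharge_humid = rng.normal(loc=200.0, scale=10.0, size=30)   # mm/yr, synthetic
cv = recharge_humid.std(ddof=1) / recharge_humid.mean()
print(f"CV of annual recharge: {100 * cv:.1f}%")              # around 5%

# (2) Thresholded precipitation-recharge relationship at a hypothetical dryland
#     site: recharge is generated only by daily rainfall above a threshold.
daily_rain = rng.gamma(shape=0.3, scale=8.0, size=365)        # mm/day, synthetic
threshold_mm = 10.0   # assumed daily threshold ("roughly ten millimetres or less per day")
efficiency = 0.2      # assumed fraction of above-threshold rain that becomes recharge
daily_recharge = efficiency * np.clip(daily_rain - threshold_mm, 0.0, None)
print(f"Annual precipitation: {daily_rain.sum():.0f} mm, "
      f"annual recharge: {daily_recharge.sum():.0f} mm")
```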

    A View from the Past Into our Collective Future: The Oncofertility Consortium Vision Statement

    Today, male and female adult and pediatric cancer patients, individuals transitioning between gender identities, and other individuals facing health-extending but fertility-limiting treatments can look forward to a fertile future. This is, in part, due to the work of members associated with the Oncofertility Consortium. The Oncofertility Consortium is an international, interdisciplinary initiative originally designed to explore the urgent unmet need associated with the reproductive future of cancer survivors. As the strategies for fertility management were invented, developed, or applied, the group of individuals for whom the program offered hope similarly expanded. As a community of practice, Consortium participants share information in an open and rapid manner to address the complex health care and quality-of-life issues of cancer, transgender, and other patients. To ensure that the organization remains contemporary to the needs of the community, the field designed a fully inclusive mechanism for strategic planning, and the findings of this process are presented here. This interprofessional network of medical specialists, scientists, and scholars in law, medical ethics, religious studies, and other disciplines associated with human interventions explores the relationships between health, disease, survivorship, treatment, gender, and reproductive longevity. The goals are to continually integrate the best science in the service of the needs of patients and to build a community of care that is ready for the challenges of the field in the future

    Environmental complexity and biodiversity: the multi-layered evolutionary history of a log-dwelling velvet worm in montane temperate Australia

    Phylogeographic studies provide a framework for understanding the importance of intrinsic versus extrinsic factors in shaping patterns of biodiversity through identifying past and present microevolutionary processes that contributed to lineage divergence. Here we investigate population structure and diversity of the Onychophoran (velvet worm) Euperipatoides rowelli in southeastern Australian montane forests that were not subject to Pleistocene glaciations, and thus likely retained more forest cover than systems under glaciation. Over a ~100 km transect of structurally-connected forest, we found marked nuclear and mitochondrial (mt) DNA genetic structuring, with spatially-localised groups. Patterns from mtDNA and nuclear data broadly corresponded with previously defined geographic regions, consistent with repeated isolation in refuges during Pleistocene climatic cycling. Nevertheless, some E. rowelli genetic contact zones were displaced relative to hypothesized influential landscape structures, implying more recent processes overlying impacts of past environmental history. Major impacts at different timescales were seen in the phylogenetic relationships among mtDNA sequences, which matched geographic relationships and nuclear data only at recent timescales, indicating historical gene flow and/or incomplete lineage sorting. Five major E. rowelli phylogeographic groups were identified, showing substantial but incomplete reproductive isolation despite continuous habitat. Regional distinctiveness, in the face of lineages abutting within forest habitat, could indicate pre- and/or postzygotic gene flow limitation. A potentially functional phenotypic character, colour pattern variation, reflected the geographic patterns in the molecular data. Spatial-genetic patterns broadly match those in previously-studied, co-occurring low-mobility organisms, despite a variety of life histories. We suggest that for E. rowelli, the complex topography and history of the region has led to interplay among limited dispersal ability, historical responses to environmental change, local adaptation, and some resistance to free admixture at geographic secondary contact, leading to strong genetic structuring at fine spatial scale

    Inspection of the foundations of the historic «Дом игумена» (Hegumen's House) building in Chernihiv

    BACKGROUND: Occupational exposure is an important consideration during emergency department thoracotomy (EDT). While human immunodeficiency virus/hepatitis prevalence in trauma patients (0-16.8%) and occupational exposure rates during operative trauma procedures (1.9-18.0%) have been reported, exposure risk during EDT is unknown. We hypothesized that occupational exposure risk during EDT would be greater than that of other operative trauma procedures. METHODS: A prospective, observational study at 16 US trauma centers was performed (2015-2016). All bedside EDT resuscitation providers were surveyed with a standardized data collection tool and risk factors analyzed with respect to the primary end point, EDT occupational exposure (percutaneous injury, mucous membrane, open wound, or eye splash). Provider and patient variables and outcomes were evaluated with single and multivariable logistic regression analyses. RESULTS: One thousand three hundred sixty participants (23% attending, 59% trainee, 11% nurse, 7% other) were surveyed after 305 EDTs (gunshot wound, 68%; prehospital cardiopulmonary resuscitation, 57%; emergency department signs of life, 37%), of which 15 patients (13 neurologically intact) survived their hospitalization. Overall, 22 occupational exposures were documented, resulting in an exposure rate of 7.2% (95% confidence interval [CI], 4.7-10.5%) per EDT and 1.6% (95% CI, 1.0-2.4%) per participant. No differences in trauma center level, number of participants, or hours worked were identified. Providers with exposures were primarily trainees (68%) with percutaneous injuries (86%) during the thoracotomy (73%). Full precautions were utilized in only 46% of exposed providers, while multivariable logistic regression determined that each personal protective equipment item utilized during EDT correlated with a 34% decreased risk of occupational exposure (odds ratio, 0.66; 95% CI, 0.48-0.91; p = 0.010). CONCLUSIONS: Our results suggest that the risk of occupational exposure should not deter providers from performing EDT. Despite the small risk of viral transmission, our data revealed practices that may place health care providers at unnecessary risk of occupational exposure. Regardless of the lifesaving nature of the procedure, improved universal precaution compliance with personal protective equipment is paramount and would further minimize occupational exposure risks during EDT. LEVEL OF EVIDENCE: Therapeutic/care management study, level III
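
    To make the headline figures concrete, the sketch below recomputes the exposure rates from the counts given in the abstract (22 documented exposures among 305 EDTs and 1,360 surveyed providers) and the per-item risk reduction implied by the reported odds ratio. A Wilson score interval is used here for illustration, so the bounds may differ slightly from the study's reported confidence intervals.

```python
# Recompute the reported exposure rates and approximate 95% CIs from the
# counts in the abstract; the study's exact CI method may differ.
from math import sqrt

def wilson_ci(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% confidence interval for a binomial proportion k/n."""
    p = k / n
    centre = p + z * z / (2 * n)
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    denom = 1 + z * z / n
    return (centre - half) / denom, (centre + half) / denom

EXPOSURES = 22
for label, n in [("per EDT", 305), ("per participant", 1360)]:
    low, high = wilson_ci(EXPOSURES, n)
    print(f"{label}: {100 * EXPOSURES / n:.1f}% "
          f"(95% CI {100 * low:.1f}-{100 * high:.1f}%)")

# The reported odds ratio of 0.66 per personal protective equipment item
# corresponds to the quoted 34% decrease in exposure risk.
print(f"Risk reduction per PPE item: {100 * (1 - 0.66):.0f}%")
```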