
    Mass Measurements and the Bound-Electron g Factor

    The accurate determination of atomic masses and the high-precision measurement of the bound-electron g factor are prerequisites for the determination of the electron mass, which is one of the fundamental constants of nature. In the 2002 CODATA adjustment [P. J. Mohr and B. N. Taylor, Rev. Mod. Phys. 77, 1 (2005)], the values of the electron mass and the electron-proton mass ratio are based mainly on g factor measurements in combination with atomic mass measurements. In this paper, we briefly discuss the prospects for obtaining other fundamental information from bound-electron g factor measurements, present some details of a recent investigation of two-loop binding corrections to the g factor, and investigate the radiative corrections in the limit of highly excited Rydberg S states with long lifetimes, where the g factor might be explored using a double-resonance experiment. Comment: 13 pages, LaTeX; dedicated to Prof. H.-J. Kluge on the occasion of his 65th birthday, to appear in Int. J. Mass Spectrometry
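    In such determinations the electron mass follows from the measured ratio of the ion's Larmor (spin-precession) and cyclotron frequencies in the same magnetic field B; writing q and M_ion for the ion's charge and mass, the standard relation is:

```latex
% Larmor and cyclotron frequencies of a hydrogenlike ion in a field B:
%   \omega_L = \frac{g}{2}\,\frac{eB}{m_e}, \qquad
%   \omega_c = \frac{qB}{M_{\mathrm{ion}}},
% so that B cancels in the ratio:
\frac{\omega_L}{\omega_c}
  = \frac{g}{2}\,\frac{e}{q}\,\frac{M_{\mathrm{ion}}}{m_e}
\quad\Longrightarrow\quad
m_e = \frac{g}{2}\,\frac{e}{q}\,\frac{\omega_c}{\omega_L}\,M_{\mathrm{ion}}.
```

    Hence an accurately measured ion mass, a theoretical value of the bound-electron g factor, and the measured frequency ratio together yield the electron mass.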

    Model-based identification and testing of appropriate strategies to minimize N2O emissions from biofilm deammonification

    Based on a one-year operation of a two-step biofilm nitritation-anammox pilot plant, N2O mitigation strategies were identified by applying a newly developed biofilm modeling approach. Through adapted plant operation, the N2O emission could be diminished by 75% (from 8.8% to 2.3% of the NH4-N oxidized by AOB). The results (measurement and simulation) confirm the great importance of denitrification as an N2O source or N2O sink, depending on the boundary conditions. A significant reduction of N2O emissions could only be achieved with a one-step deammonification system, which is related to low nitrite and HNO2 concentrations. Increased oxygen concentrations in the bulk phase are not related to decreased emissions: N2O formation by ammonium-oxidizing bacteria (AOB) just shifts deeper into the biofilm, and zones with low oxygen concentrations are not avoidable in biofilm systems. Low oxygen concentrations in the bulk phase, however, result in a reduction of the total net N2O formation due to increased activity of heterotrophic bacteria directly at the source of N2O formation (the outer biofilm layer). For the model-based identification of mitigation strategies, the standard modeling approaches for biofilms were expanded by including the factor-based N2O formation and emission approach. The new model 'Biofilm/N2OISAH' was successfully validated using data from pilot-scale measurement campaigns. Altogether, the investigation confirms that the employed digital model can strongly support the development of N2O mitigation strategies without the need for specialized measurements inside the biofilm.

    Cast Out: Vagrancy and Homelessness in Global and Historical Perspective

    Throughout history, those arrested for vagrancy have generally been poor men and women, often young, able-bodied, unemployed, and homeless. Most histories of vagrancy have focused on the European and American experiences. Cast Out: Vagrancy and Homelessness in Global and Historical Perspective is the first book to consider the shared global heritage of vagrancy laws, homelessness, and the historical processes they accompanied. In this ambitious collection, vagrancy and homelessness are used to examine a vast array of phenomena, from the migration of labor to social and governmental responses to poverty through charity, welfare, and prosecution. The essays in Cast Out represent the best scholarship on these subjects and include discussions of the lives of the underclass, strategies for surviving and escaping poverty, the criminalization of poverty by the state, the rise of welfare and development programs, the relationship between imperial powers and colonized peoples, and the struggle to achieve independence after colonial rule. By juxtaposing these histories, the authors explore vagrancy as a common response to poverty, labor dislocation, and changing social norms, as well as how this strategy changed over time and adapted to regional peculiarities. Part of a growing literature on world history, Cast Out offers fresh perspectives and new research in fields that have yet to fully investigate vagrancy and homelessness. This book by leading scholars in the field is for policy makers, as well as for courses on poverty, homelessness, and world history. Contributors: Richard B. Allen, David Arnold, A. L. Beier, Andrew Burton, Vincent DiGirolamo, Andrew A. Gentes, Robert Gordon, Frank Tobias Higbie, Thomas H. Holloway, Abby Margolis, Paul Ocobock, Aminda M. Smith, Linda Woodbridge.

    Smoothed Complexity Theory

    Smoothed analysis is a new way of analyzing algorithms introduced by Spielman and Teng (J. ACM, 2004). Classical methods like worst-case or average-case analysis have accompanying complexity classes, like P and AvgP, respectively. While worst-case or average-case analysis gives us a means to talk about the running time of a particular algorithm, complexity classes allow us to talk about the inherent difficulty of problems. Smoothed analysis is a hybrid of worst-case and average-case analysis and compensates for some of their drawbacks. Despite its success for the analysis of single algorithms and problems, there is no embedding of smoothed analysis into computational complexity theory, which is necessary to classify problems according to their intrinsic difficulty. We propose a framework for smoothed complexity theory, define the relevant classes, and prove some first hardness results (for bounded halting and tiling) and tractability results (for binary optimization problems, graph coloring, and satisfiability). Furthermore, we discuss extensions and shortcomings of our model and relate it to semi-random models. Comment: to be presented at MFCS 201
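    In Spielman and Teng's Gaussian-perturbation setting, the smoothed running time interpolates between the two classical measures: an adversary picks the input, which is then randomly perturbed before the expectation is taken. With one common normalization of the input domain, it reads:

```latex
% Smoothed running time of algorithm A at input size n and
% perturbation magnitude \sigma (Gaussian perturbation model):
T_A^{\mathrm{smooth}}(n,\sigma)
  = \max_{x \,:\, \|x\| \le 1}\;
    \mathbb{E}_{g \sim \mathcal{N}(0,\sigma^2 I)}
    \bigl[\, T_A(x + g) \,\bigr].
% As \sigma \to 0 this recovers worst-case analysis; for large \sigma
% it approaches average-case analysis over a Gaussian input.
```

    A smoothed complexity class then collects the problems for which some algorithm keeps this quantity polynomially bounded in n and 1/\sigma.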

    Transient wall shear stress estimation in coronary bifurcations using convolutional neural networks

    Background and Objective: Haemodynamic metrics, such as blood-flow-induced shear stresses at the inner vessel lumen, are associated with the development and progression of coronary artery disease. Understanding these metrics may therefore improve the assessment of an individual's coronary disease risk. However, the calculation of such luminal Wall Shear Stress (WSS) using traditional Computational Fluid Dynamics (CFD) methods is relatively slow and computationally expensive. As a result, CFD-based haemodynamic computation is not suitable for integrated and large-scale use in clinical settings. Methods: In this work, deep learning techniques are proposed as an alternative to CFD, whereby the luminal WSS magnitude can be predicted in coronary bifurcations throughout the cardiac cycle based on the steady-state solution (which takes <120 seconds to calculate, including preprocessing), the vessel geometry, and additional global features. The deep learning model is trained on a dataset of 101 patient-specific and 2626 synthetic left main bifurcation models, with 26 separate patient-specific cases used as the test set. Results: The model showed high-fidelity predictions, with <5% deviation (normalised against the mean WSS magnitude) from CFD-derived values as the gold-standard method, while being orders of magnitude faster: on average <2 minutes versus 3 hours of computation for transient CFD. Conclusions: This method therefore offers a new approach to substantially reduce the computational cost involved in, for example, large-scale population studies of coronary haemodynamic metrics, and may open the pathway for future clinical integration.
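    A minimal sketch of the reported error metric, assuming "normalised against mean WSS magnitude" means the mean absolute deviation divided by the mean CFD WSS magnitude; the exact definition is not given in the abstract, and the function and variable names are illustrative:

```python
import numpy as np

def normalised_wss_deviation(wss_pred, wss_cfd):
    """Mean absolute deviation of predicted WSS from the CFD reference,
    normalised against the mean CFD WSS magnitude (hypothetical
    reconstruction of the error metric described in the abstract)."""
    wss_pred = np.asarray(wss_pred, dtype=float)
    wss_cfd = np.asarray(wss_cfd, dtype=float)
    return np.mean(np.abs(wss_pred - wss_cfd)) / np.mean(np.abs(wss_cfd))

# Example: a prediction uniformly 2% above the CFD reference
cfd = np.array([1.0, 2.0, 3.0, 4.0])
pred = cfd * 1.02
deviation = normalised_wss_deviation(pred, cfd)  # ~0.02, i.e. ~2%
```

    Under this reading, the model's predictions stay below 0.05 on this scale for the 26 held-out patient-specific cases.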

    Is What We Want What We Need, and Can We Get It in Writing? The Third-Wave of Feminism Hits the Beach of Modern Parentage Presumptions

    Modern statutes on parentage regarding artificial insemination, and the cases that have interpreted them, reflect the explosion of family gender roles by second-wave feminism. Although a natural father is now generally expected to share the rights and obligations of parentage with a natural mother, this is not so if he is a mere contributor of biological material. Are the modern presumptions underlying such statutes, what we used to want, what we have come to need? Or is current law too much a reflection of the essentialism for which the second wave is sometimes justly criticized? Third-wave individualism and resistance to inflexible doctrine supply interesting lenses for an examination of the developing law in this area. In particular, a recent Kansas case, In the Interest of K.M.H., is the first to evaluate a statute designed to give power to individual choice by making a parentage presumption secondary to an agreement between a woman and a sperm donor that the donor will be treated as a parent. This paper explores the context and outcome of this case and asks whether we have exhausted the limits of legal reform that can be achieved through the creation of presumptions, even progressive ones, about parentage. Given the changes wrought under the influence of the second wave, do such presumptions retain vitality and usefulness? Or do they produce only a different, but not necessarily better, set of obstacles to the formation and preservation of individual families and their informed choices?

    Model assisted identification of N2O mitigation strategies for full-scale reject water treatment plants

    In a 3-year research project, a new approach to forecasting biological N2O formation and emission in high-strength reject water treatment has been developed (ASM3/1_N2OISAH). It was calibrated by extensive batch tests and finally evaluated by long-term measurement campaigns at three wastewater treatment plants (WWTPs) with different process configurations for nitrogen removal from reject water. To enable model application with common full-scale data, the nitritation-connected supplementary processes that are responsible for N2O formation are not depicted in the model. Instead, within the new model approach the N2O formation is linked to the NH4-N oxidation rate by defining specific formation factors [N2O-N formed / NH4-N oxidized], depending on the concentrations of NO2 and O2 as well as the NH4 load. A comparison between the measured and the modeled N2O concentrations in the liquid and gas phases at the full-scale treatment plants proves the ability of the proposed modeling approach to represent the observed trends of N2O formation, emission and reduction using the standard parameter set of kinetics and formation factors, thus enabling a reliable estimation of the N2O emissions for different operational conditions. The measurements indicate that formation of N2O by AOB cannot be completely avoided. However, a considerable reduction of the formed N2O was observed in an anoxic environment. Applying the model, operational settings and mitigation strategies can now be identified without extensive measurement campaigns. For further enhancement of the model, first results for the kinetics of N2O reduction by denitrification processes were determined in laboratory-scale batch tests.
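    The factor-based coupling described above can be sketched as follows. Only the structure is taken from the abstract (a formation factor depending on NO2 and O2 multiplies the NH4-N oxidation rate); the functional forms and parameter values below are assumptions for illustration, not the calibrated standard set of the source model:

```python
def n2o_formation_rate(r_nh4_ox, c_no2, c_o2,
                       f_max=0.05, k_no2=1.0, k_o2=0.5):
    """Factor-based N2O formation term (illustrative sketch).

    r_nh4_ox : NH4-N oxidation rate [g N/(m^3 d)]
    c_no2    : nitrite concentration [g N/m^3]
    c_o2     : dissolved-oxygen concentration [g O2/m^3]
    f_max, k_no2, k_o2 : hypothetical parameters, NOT the calibrated
    formation-factor set of ASM3/1_N2OISAH.

    The formation factor [N2O-N formed per NH4-N oxidized] is assumed
    to increase with nitrite and decrease with oxygen; multiplying it
    by the NH4-N oxidation rate gives the N2O formation rate.
    """
    factor = f_max * (c_no2 / (k_no2 + c_no2)) * (k_o2 / (k_o2 + c_o2))
    return factor * r_nh4_ox

# Higher nitrite at the same oxidation rate -> more N2O formed:
low_no2 = n2o_formation_rate(100.0, c_no2=1.0, c_o2=0.3)
high_no2 = n2o_formation_rate(100.0, c_no2=10.0, c_o2=0.3)
```

    The design point is that only bulk-phase quantities routinely measured at full scale (NH4-N oxidation rate, NO2, O2) enter the formation term, so no intra-biofilm measurements are needed.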

    Recoil correction to the bound-electron g factor in H-like atoms to all orders in αZ

    The nuclear recoil correction to the bound-electron g factor in H-like atoms is calculated to first order in m/M and to all orders in αZ. The calculation is performed in the range Z = 1-100. A large contribution of terms of order (αZ)^5 and higher is found. Even for hydrogen, the higher-order correction exceeds the (αZ)^4 term, while for uranium it is above the leading (αZ)^2 correction. Comment: 6 pages, 3 tables, 1 figure
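    The expansion structure described in the abstract can be written schematically as follows; the coefficients b_2 and b_4 are placeholders for the known low-order terms (their values are not given in the abstract), and the last bracketed term collects everything of order (αZ)^5 and higher, which the paper evaluates to all orders:

```latex
% Nuclear recoil correction to the bound-electron g factor,
% to first order in the mass ratio m/M (schematic structure only):
\Delta g_{\mathrm{rec}}
  = \frac{m}{M}\left[\, b_2\,(\alpha Z)^2 + b_4\,(\alpha Z)^4
      + \Delta g^{(5+)}(\alpha Z) \,\right]
    + O\!\left(\frac{m^2}{M^2}\right),
% where \Delta g^{(5+)} denotes the all-order remainder starting
% at (\alpha Z)^5; for hydrogen it already exceeds the b_4 term.
```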

    Recoil correction to the ground state energy of hydrogenlike atoms

    The recoil correction to the ground state energy of hydrogenlike atoms is calculated to all orders in αZ in the range Z = 1-110. The nuclear size corrections to the recoil effect are partially taken into account. In the case of hydrogen, the relativistic recoil correction beyond the Salpeter contribution, together with the nonrelativistic nuclear size correction to the recoil effect, amounts to -7.2(2) kHz. The total recoil correction to the ground state energy in hydrogenlike uranium (^{238}U^{91+}) constitutes 0.46 eV. Comment: 16 pages, 1 figure (eps), Latex, submitted to Phys. Rev.