
    Validation and Benchmarking of a Practical Free Magnetic Energy and Relative Magnetic Helicity Budget Calculation in Solar Magnetic Structures

    In earlier works we introduced and tested a nonlinear force-free (NLFF) method designed to self-consistently calculate the free magnetic energy and the relative magnetic helicity budgets of the corona of observed solar magnetic structures. The method requires, in principle, only a single photospheric or low-chromospheric vector magnetogram of a quiet-Sun patch or an active region and performs the calculation in the absence of three-dimensional magnetic and velocity-field information. In this work we strictly validate this method using three-dimensional coronal magnetic fields. Benchmarking employs both synthetic, three-dimensional magnetohydrodynamic simulations and nonlinear force-free field extrapolations of the active-region solar corona. We find that our time-efficient NLFF method provides budgets that differ from those of more demanding semi-analytical methods by a factor of ~3, at most. This difference is expected from the physical concept and the construction of the method. Temporal correlations show larger discrepancies, which, however, improve markedly for more complex, massive active regions, reaching correlation coefficients of the order of, or exceeding, 0.9. In conclusion, we argue that our NLFF method can be reliably used for a routine and fast calculation of free magnetic energy and relative magnetic helicity budgets in targeted parts of the solar magnetized corona. As explained here and in previous works, this is an asset that can lead to valuable insight into the physics and the triggering of solar eruptions. Comment: 32 pages, 14 figures; accepted by Solar Physics.
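
    To make the benchmarking criteria above concrete, here is a minimal sketch (not the authors' code) of how a fast method's budget time series might be compared against a reference series: an amplitude ratio and a Pearson correlation coefficient. The array names and numbers are illustrative placeholders, not data from the paper.

        # Hypothetical comparison of single-magnetogram NLFF budgets against reference
        # values from 3D extrapolations; numbers below are placeholders, not real data.
        import numpy as np

        # Free-energy time series (erg) from the fast NLFF method and a reference method
        e_nlff = np.array([1.2e32, 1.8e32, 2.5e32, 3.1e32, 2.9e32])
        e_ref  = np.array([0.5e32, 0.7e32, 1.1e32, 1.3e32, 1.2e32])

        # Amplitude agreement: ratio of budgets (the paper reports a factor of ~3 at most)
        ratio = e_nlff / e_ref
        print("budget ratios:", ratio)

        # Temporal agreement: Pearson correlation coefficient between the two series
        r = np.corrcoef(e_nlff, e_ref)[0, 1]
        print(f"correlation coefficient: {r:.2f}")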

    Hypernuclear No-Core Shell Model

    We extend the No-Core Shell Model (NCSM) methodology to incorporate strangeness degrees of freedom and apply it to single-Λ hypernuclei. After discussing the transformation of the hyperon-nucleon (YN) interaction into the Harmonic-Oscillator (HO) basis and the Similarity Renormalization Group transformation applied to it to improve model-space convergence, we present two complementary formulations of the NCSM: one that uses relative Jacobi coordinates and symmetry-adapted basis states to fully exploit the symmetries of the hypernuclear Hamiltonian, and one working in a Slater determinant basis of HO states, where antisymmetrization and the computation of matrix elements are simple and to which an importance-truncation scheme can be applied. For the Jacobi-coordinate formulation, we give an iterative procedure for the construction of the antisymmetric basis for arbitrary particle number and present the formulae used to embed two- and three-baryon interactions into the many-body space. For the Slater-determinant formulation, we discuss the conversion of the YN interaction matrix elements from relative to single-particle coordinates, the importance-truncation scheme that tailors the model space to the description of the low-lying spectrum, and the role of the redundant center-of-mass degrees of freedom. We conclude with a validation of both formulations in the four-body system, giving converged ground-state energies for a chiral Hamiltonian, and present a short survey of the A ≤ 7 hyper-helium isotopes. Comment: 17 pages, 8 figures; accepted version.
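
    As a concrete illustration of the Similarity Renormalization Group step mentioned above, the sketch below evolves a toy two-level Hamiltonian under the standard flow equation dH/ds = [[T, H(s)], H(s)], which suppresses off-diagonal coupling while preserving the spectrum. The matrices, units, and flow range are illustrative assumptions, not the paper's YN interaction or generator choice.

        # Toy SRG flow for a 2x2 Hamiltonian (illustrative values, not the paper's input)
        import numpy as np
        from scipy.integrate import solve_ivp

        T = np.diag([0.0, 10.0])                 # generator part (toy "kinetic" energies)
        H0 = T + np.array([[1.0, 3.0],
                           [3.0, -1.0]])         # toy Hamiltonian with off-diagonal coupling

        def flow(s, h_flat):
            H = h_flat.reshape(2, 2)
            eta = T @ H - H @ T                  # eta(s) = [T, H(s)]
            dH = eta @ H - H @ eta               # dH/ds = [eta(s), H(s)]
            return dH.ravel()

        sol = solve_ivp(flow, (0.0, 1.0), H0.ravel(), rtol=1e-8, atol=1e-10)
        H_evolved = sol.y[:, -1].reshape(2, 2)

        # The unitary flow preserves the eigenvalues while driving the off-diagonal
        # matrix element toward zero, which is what improves model-space convergence.
        print(np.linalg.eigvalsh(H0))
        print(np.linalg.eigvalsh(H_evolved))
        print(H_evolved)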

    Agent-based modeling: a systematic assessment of use cases and requirements for enhancing pharmaceutical research and development productivity.

    A crisis continues to brew within the pharmaceutical research and development (R&D) enterprise: productivity continues to decline as costs rise, despite ongoing, often dramatic scientific and technical advances. To reverse this trend, we offer suggestions for both the expansion and the broader adoption of modeling and simulation (M&S) methods. We suggest strategies and scenarios intended to enable new M&S use cases that directly engage R&D knowledge generation and build actionable mechanistic insight, thereby opening the door to enhanced productivity. What M&S requirements must be satisfied to open that door and begin reversing the productivity decline? Can current methods and tools fulfill those requirements, or are new methods necessary? We draw on the relevant, recent literature to provide and explore answers. In so doing, we identify essential roles for agent-based and other methods. We assemble a list of requirements necessary for M&S to meet the diverse needs distilled from a collection of research, review, and opinion articles. We argue that, to realize its full potential, M&S should be actualized within a larger information technology framework, a dynamic knowledge repository, wherein models of various types execute, evolve, and increase in accuracy over time. We offer some details of the issues that must be addressed for such a repository to accrue the capabilities needed to reverse the productivity decline.
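
    As a rough illustration of the agent-based approach the article advocates, the sketch below steps a population of toy "compound" agents through a few sequential screening rounds. The agent rule, class name, and parameters are hypothetical placeholders, not models proposed in the article.

        # Minimal agent-based simulation loop (illustrative toy model only)
        import random

        class CompoundAgent:
            def __init__(self, potency):
                self.potency = potency           # toy attribute in [0, 1]
                self.active = True

            def step(self):
                # Toy rule: an active compound survives a screening round
                # with probability roughly equal to its potency.
                if self.active and random.random() > self.potency:
                    self.active = False

        random.seed(0)
        agents = [CompoundAgent(random.random()) for _ in range(1000)]

        for round_ in range(5):                  # five sequential screening rounds
            for agent in agents:
                agent.step()
            survivors = sum(a.active for a in agents)
            print(f"round {round_ + 1}: {survivors} compounds still active")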

    Validating Animal Models

    This paper responds to a recent challenge to the validity of extrapolating neurobiological knowledge from laboratory animals to humans. According to this challenge, experimental neurobiology, and thus neuroscience, is in a state of crisis because the knowledge produced in different laboratories hardly generalizes from one laboratory to another. Presumably, this is so because neurobiological laboratories use simplified animal models of human conditions that differ across laboratories. By contrast, I argue that maintaining a multiplicity of experimental protocols and simple models is well justified: it fosters rather than precludes the valid extrapolation of neurobiological knowledge. The discipline is thriving.