
    Microscopically-based energy density functionals for nuclei using the density matrix expansion: Implementation and pre-optimization

    In a recent series of papers, Gebremariam, Bogner, and Duguet derived a microscopically based nuclear energy density functional by applying the Density Matrix Expansion (DME) to the Hartree-Fock energy obtained from chiral effective field theory (EFT) two- and three-nucleon interactions. Due to the structure of the chiral interactions, each coupling in the DME functional is given as the sum of a coupling constant arising from zero-range contact interactions and a coupling function of the density arising from the finite-range pion exchanges. Since the contact contributions have essentially the same structure as those entering empirical Skyrme functionals, a microscopically guided Skyrme phenomenology has been suggested in which the contact terms in the DME functional are released for optimization to finite-density observables to capture short-range correlation energy contributions from beyond Hartree-Fock. The present paper is the first attempt to assess the ability of the newly suggested DME functional, which has a much richer set of density dependencies than traditional Skyrme functionals, to generate sensible and stable results for nuclear applications. The results of the first proof-of-principle calculations are given, and numerous practical issues related to the implementation of the new functional in existing Skyrme codes are discussed. Using a restricted singular value decomposition (SVD) optimization procedure, it is found that the new DME functional gives numerically stable results and exhibits a small but systematic reduction of our test χ² function compared to standard Skyrme functionals, thus justifying its suitability for future global optimizations and large-scale calculations. Comment: 17 pages, 6 figures
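    To illustrate the restricted-SVD idea mentioned above, here is a minimal sketch of one linearized χ² step in which only the best-determined parameter directions (largest singular values of the sensitivity matrix) are retained. The matrix sizes, the random placeholder data, and the function name restricted_svd_update are illustrative assumptions, not the actual DME/Skyrme optimization code.

```python
import numpy as np

def restricted_svd_update(jacobian, residuals, n_keep):
    """One linearized chi^2 (Gauss-Newton) step restricted to the n_keep
    best-determined parameter directions, i.e. the largest singular values
    of the sensitivity matrix d(residual)/d(coupling)."""
    U, s, Vt = np.linalg.svd(jacobian, full_matrices=False)
    # Keep only the leading singular directions; drop the poorly determined ones.
    s_inv = np.where(np.arange(s.size) < n_keep, 1.0 / s, 0.0)
    # Step: dp = -V S^+ U^T r, with S^+ truncated to n_keep directions.
    return -(Vt.T * s_inv) @ (U.T @ residuals)

# Hypothetical setup: 5 couplings, 12 weighted residuals (data - model)/sigma.
rng = np.random.default_rng(0)
jacobian  = rng.normal(size=(12, 5))
residuals = rng.normal(size=12)
dp = restricted_svd_update(jacobian, residuals, n_keep=3)
print("update along the 3 best-determined directions:", dp)
```

    Truncating the small singular values keeps the poorly constrained combinations of couplings near their starting values instead of letting them run away during the fit, which is one way to obtain numerically stable results with a functional richer than standard Skyrme forms.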

    Investigation of double beta decay with the NEMO-3 detector

    The double beta decay experiment NEMO-3 has been taking data since February 2003. The aim of this experiment is to search for neutrinoless (0νββ) decay and investigate two neutrino double beta decay in seven different isotopically enriched samples (100Mo, 82Se, 48Ca, 96Zr, 116Cd, 130Te and 150Nd). After analysis of the data corresponding to 3.75 y, no evidence for 0νββ decay in the 100Mo and 82Se samples was found. The half-life limits at the 90% C.L. are 1.1·10^24 y and 3.6·10^23 y, respectively. Additionally, for 0νββ decay the following limits at the 90% C.L. were obtained: > 1.3·10^22 y for 48Ca, > 9.2·10^21 y for 96Zr and > 1.8·10^22 y for 150Nd. The 2νββ decay half-life values were precisely measured for all investigated isotopes. Comment: 12 pages, 4 figures, 5 tables; talk at conference on "Fundamental Interactions Physics" (ITEP, Moscow, November 23-27, 2009)
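    For orientation, half-life limits of this kind follow from the standard counting relation T_1/2 > ln(2) · N_atoms · ε · t / N_excl, where N_excl is the upper limit on the number of signal counts at the chosen confidence level. The sketch below evaluates this relation; the source mass, molar mass, efficiency, and excluded-count values are placeholders, not the NEMO-3 analysis inputs.

```python
import numpy as np

N_A = 6.022e23  # Avogadro's number [atoms/mol]

def halflife_limit(source_mass_kg, molar_mass_g, live_time_yr, efficiency, n_excluded):
    """Lower limit on T_1/2 for zero observed signal:
    T_1/2 > ln(2) * N_atoms * efficiency * live_time / N_excluded,
    where n_excluded is the upper limit on signal counts at the chosen C.L."""
    n_atoms = source_mass_kg * 1e3 / molar_mass_g * N_A
    return np.log(2) * n_atoms * efficiency * live_time_yr / n_excluded

# Placeholder inputs for illustration only (not the actual NEMO-3 analysis values):
limit = halflife_limit(source_mass_kg=7.0, molar_mass_g=100.0,
                       live_time_yr=3.75, efficiency=0.1, n_excluded=10.0)
print(f"T_1/2 > {limit:.1e} y")
```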

    Overconstrained estimates of neutrinoless double beta decay within the QRPA

    Estimates of nuclear matrix elements for neutrinoless double beta decay (0nu2beta) based on the quasiparticle random phase approximation (QRPA) are affected by theoretical uncertainties, which can be substantially reduced by fixing the unknown strength parameter g_pp of the residual particle-particle interaction through one experimental constraint - most notably through the two-neutrino double beta decay (2nu2beta) lifetime. However, it has been noted that the g_pp adjustment via 2nu2beta data may bring QRPA models into disagreement with independent data on electron capture (EC) and single beta decay (beta^-) lifetimes. Actually, in two nuclei of interest for 0nu2beta decay (Mo-100 and Cd-116), for which all such data are available, we show that the disagreement vanishes, provided that the axial vector coupling g_A is treated as a free parameter, with allowance for g_A<1 ("strong quenching"). Three independent lifetime data (2nu2beta, EC, beta^-) are then accurately reproduced by means of two free parameters (g_pp, g_A), resulting in an overconstrained parameter space. In addition, the sign of the 2nu2beta matrix element M^2nu is unambiguously selected (M^2nu>0) by the combination of all data. We discuss quantitatively, in each of the two nuclei, these phenomenological constraints and their consequences for QRPA estimates of the 0nu2beta matrix elements and of their uncertainties. Comment: Revised version (27 pages, including 10 figures), focused on Mo-100 and Cd-116. To appear in J. Phys. G: Nucl. Phys. (2008)
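    The overconstrained fit described above can be pictured as a two-parameter least-squares problem with three lifetime data. The sketch below sets up such a fit; the qrpa_halflives function is a purely illustrative stand-in for a full QRPA calculation, and the "measured" lifetimes, errors, and parameter bounds are made up for the example.

```python
import numpy as np
from scipy.optimize import least_squares

def qrpa_halflives(g_pp, g_A):
    """Purely illustrative stand-in for a QRPA calculation of the three
    half-lives (2nu2beta, EC, beta^-) as functions of (g_pp, g_A)."""
    return np.array([7.1e18 * (1.0 + 2.0 * (g_pp - 1.0))**2 / g_A**4,
                     2.3e20 * (1.0 - 1.5 * (g_pp - 1.0))**2 / g_A**2,
                     5.6e19 * (1.0 - 0.8 * (g_pp - 1.0))**2 / g_A**2])

# Hypothetical "measured" half-lives and 10% relative errors for one nucleus.
t_exp = np.array([7.1e18, 2.6e20, 5.0e19])
sigma = 0.1 * t_exp

def residuals(params):
    g_pp, g_A = params
    return (qrpa_halflives(g_pp, g_A) - t_exp) / sigma

# Two parameters, three data points: the parameter space is overconstrained,
# and the leftover chi^2 tests the internal consistency of the description.
fit = least_squares(residuals, x0=[1.0, 0.9], bounds=([0.5, 0.3], [1.5, 1.27]))
print("best-fit (g_pp, g_A):", fit.x, " chi^2:", np.sum(fit.fun**2))
```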

    Nuclear energy density optimization: Shell structure

    Nuclear density functional theory is the only microscopic theory that can be applied throughout the entire nuclear landscape. Its key ingredient is the energy density functional. In this work, we propose a new parameterization UNEDF2 of the Skyrme energy density functional. The functional optimization is carried out using the POUNDerS optimization algorithm within the framework of the Skyrme Hartree-Fock-Bogoliubov theory. Compared to the previous parameterization UNEDF1, restrictions on the tensor term of the energy density have been lifted, yielding a very general form of the energy density functional up to second order in derivatives of the one-body density matrix. In order to impose constraints on all the parameters of the functional, selected data on single-particle splittings in spherical doubly magic nuclei have been included in the experimental dataset. The agreement with both bulk and spectroscopic nuclear properties achieved by the resulting UNEDF2 parameterization is comparable to that of UNEDF1. While there is a small improvement in single-particle spectra and binding energies of closed-shell nuclei, the reproduction of fission barriers and fission isomer excitation energies has degraded. As compared to previous UNEDF parameterizations, the parameter confidence interval for UNEDF2 is narrower. In particular, our results overlap well with those obtained in previous systematic studies of the spin-orbit and tensor terms. UNEDF2 can be viewed as an all-around Skyrme EDF that performs reasonably well for both global nuclear properties and shell structure. However, after adding new data aiming to better constrain the nuclear functional, its quality has improved only marginally. These results suggest that the standard Skyrme energy density has reached its limits and that significant changes to the form of the functional are needed. Comment: 18 pages, 13 figures, 12 tables; resubmitted for publication to Phys. Rev. C after second review by referee
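    Schematically, an optimization of this kind minimizes a composite χ² in which each data class (masses, radii, single-particle splittings, ...) enters with its own weight. The sketch below assembles such an objective and minimizes it; the hfb_observables function, the data values, and the weights are hypothetical stand-ins, and Nelder-Mead is used only as a simple derivative-free substitute for POUNDerS.

```python
import numpy as np
from scipy.optimize import minimize

def hfb_observables(x):
    """Purely illustrative stand-in for a Skyrme-HFB calculation returning
    several classes of observables for a parameter vector x."""
    return {"masses":        x[0] + 0.5 * x[1] * np.arange(3),
            "radii":         2.0 + 0.1 * x[1] * np.arange(2),
            "sp_splittings": 1.0 + x[2] * np.arange(2)}

# Hypothetical data and per-class weights (1/sigma for each data type),
# mimicking how bulk and spectroscopic data enter a composite chi^2.
data    = {"masses": np.array([1.0, 1.6, 2.2]),
           "radii": np.array([2.1, 2.3]),
           "sp_splittings": np.array([1.2, 2.1])}
weights = {"masses": 1.0, "radii": 5.0, "sp_splittings": 0.8}

def chi2(x):
    theory = hfb_observables(x)
    return sum(np.sum(((theory[k] - data[k]) * weights[k])**2) for k in data)

# Nelder-Mead used here only as a simple derivative-free substitute for POUNDerS.
result = minimize(chi2, x0=np.array([1.0, 1.0, 1.0]), method="Nelder-Mead")
print("optimal parameters:", result.x, " chi^2:", result.fun)
```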

    Computing Heavy Elements

    Reliable calculations of the structure of heavy elements are crucial for addressing fundamental science questions such as the origin of the elements in the universe. Applications relevant for energy production, medicine, or national security also rely on theoretical predictions of basic properties of atomic nuclei. Heavy elements are best described within nuclear density functional theory (DFT) and its various extensions. While relatively mature, DFT has never been implemented in its full power, as it relies on a very large number (~10^9-10^12) of expensive calculations, each taking on the order of a day. The advent of leadership-class computers, as well as dedicated large-scale collaborative efforts such as the SciDAC 2 UNEDF project, has dramatically changed the field. This article gives an overview of the various computational challenges related to nuclear DFT, as well as some of the recent achievements. Comment: Proceedings of the invited talk given at the SciDAC 2011 conference, Jul. 10-15, 2011, Denver, CO
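    As a rough order-of-magnitude check of the numbers quoted above, the snippet below converts ~10^9 calculations at roughly one day each into aggregate core-hours; the per-calculation core count is an assumption for illustration and is not taken from the article.

```python
# Order-of-magnitude arithmetic only; the per-calculation core count is an
# assumption for illustration, not a figure from the article.
n_calcs = 1e9                               # lower end of the quoted 10^9-10^12 range
cores_per_calc = 100                        # assumed
core_hours_per_calc = 24 * cores_per_calc   # "~ a day" per calculation
total_core_hours = n_calcs * core_hours_per_calc
print(f"aggregate cost ~ {total_core_hours:.1e} core-hours")
```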

    DigiBuzz-VTT – Towards digital twin’s concrete commercial exploitation

    The DigiBuzz-VTT project, a part of the DigiBuzz common effort, focused on the applications of digital twins in manufacturing industry ecosystems. The DigiBuzz-VTT project had two main focuses: 1) functional digital twins, or simulation-based digital twins, of machines and machine systems and their applications, and 2) the life cycle management of digital twins (the digital part of the twin), emphasising data modelling and data management. These themes were studied from both the technical and the business points of view. The detailed research topics were:
    • Business opportunities and added value of digital twins for manufacturing industry
    • Data-based digital twins, use of machine learning for feature recognition
    • The status of standardisation for the lifecycle data management of digital twins, means for preserving model data
    • Hybrid modelling with digital twins, combination of experimental and simulation data
    • The optimisation of measurement point locations, method development
    • The use of Kalman filters in estimating the correlation of simulation data with measured data
    • The status of the Industrial Internet of Things (IIoT) for digital twins
    This report summarises the implementation of the DigiBuzz-VTT project and lists the main deliverables of the project. The project produced several scientific articles and research reports, which report the research results in detail.

    Nuclear matrix elements of neutrinoless double beta decay with improved short-range correlations

    Full text link
    Nuclear matrix elements of the neutrinoless double beta decays of 96Zr, 100Mo, 116Cd, 128Te, 130Te and 136Xe are calculated for the light-neutrino exchange mechanism by using the proton-neutron quasiparticle random-phase approximation (pnQRPA) with a realistic nucleon-nucleon force. The g_pp parameter of the pnQRPA is fixed by the data on two-neutrino double beta decays and single beta decays. The finite size of the nucleon, the higher-order terms of the nucleonic weak currents, and the nucleon-nucleon short-range correlations (s.r.c.) are taken into account. The s.r.c. are computed by the traditional Jastrow method and by the more advanced unitary correlation operator method (UCOM). A comparison of the results obtained by the two methods is carried out. The UCOM-computed matrix elements turn out to be considerably larger than the Jastrow-computed ones. This result is important for the assessment of the neutrino-mass sensitivity of present and future double beta decay experiments. Comment: Two figures, to be published in Physical Review C (2007) as a regular article
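    As an illustration of why the treatment of short-range correlations matters, the sketch below folds a Jastrow-type correlation function into a toy radial integrand and reports the resulting suppression. The Miller-Spencer-style parameter values (a = 1.1 fm^-2, b = 0.68 fm^-2) are the ones commonly quoted for the traditional Jastrow prescription, but together with the toy integrand they should be read as assumptions, not the paper's actual inputs.

```python
import numpy as np

def jastrow(r, a=1.1, b=0.68):
    """Jastrow-type short-range correlation function,
    f(r) = 1 - exp(-a r^2) (1 - b r^2), with r in fm; the Miller-Spencer-style
    values a = 1.1 fm^-2, b = 0.68 fm^-2 are assumptions for this sketch."""
    return 1.0 - np.exp(-a * r**2) * (1.0 - b * r**2)

# Toy radial integrand standing in for the relative-coordinate distribution
# of a 0nubb matrix element (illustrative only, not the pnQRPA result).
r  = np.linspace(1e-3, 10.0, 2000)     # relative nucleon distance [fm]
dr = r[1] - r[0]
C_bare = r**2 * np.exp(-r / 1.5)       # hypothetical uncorrelated integrand

bare      = np.sum(C_bare) * dr
corrected = np.sum(jastrow(r)**2 * C_bare) * dr   # one factor of f(r) per wave function
print(f"short-range suppression factor: {corrected / bare:.3f}")
```

    A softer correlation function, such as the UCOM-based one, suppresses the small-r region less and therefore leaves the matrix element larger, which is the trend reported in the abstract.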