
    Reproducibility of density functional approximations: how new functionals should be reported

    Density functional theory is the workhorse of chemistry and materials science, and novel density functional approximations (DFAs) are published every year. To become available in program packages, novel DFAs need to be (re)implemented. However, in our experience as developers of Libxc [Lehtola et al., SoftwareX 7, 1 (2018)], a constant problem in this task is verification, owing to the lack of reliable reference data. As we discuss in this work, this lack has led to several non-equivalent implementations of functionals such as BP86, PW91, PBE, and B3LYP across various program packages, yielding different total energies. Through careful verification, we have also found many issues with incorrect functional forms in recent DFAs. The goal of this work is to ensure the reproducibility of DFAs: DFAs must be verifiable in order to prevent reappearances of the abovementioned errors and incompatibilities. A common framework for verification and testing is therefore needed. We suggest several ways in which reference energies can be produced with free and open source software, either with non-self-consistent calculations on tabulated atomic densities or via self-consistent calculations with various program packages. The numerical parameters employed -- especially the quadrature grid -- need to be converged to guarantee ≲0.1 μEh precision in fully numerical calculations, which routinely afford such precision in the total energy. Such sub-μEh agreement can only be achieved when fully equivalent implementations of the DFA are used. Therefore, the source code of the reference implementation should also be made available in any publication describing a new DFA. Comment: 15 pages, 1 figure
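    The verification strategy described above can be illustrated with a toy example (my own sketch, not Libxc's actual test suite): evaluate a functional non-self-consistently on a known density and compare the quadrature result against a closed-form reference value. Here the "tabulated" density is the analytic hydrogen 1s density and the functional is LDA exchange.

```python
import numpy as np

# Non-self-consistent reference energy, toy version: evaluate LDA exchange on
# an analytic hydrogen 1s density and compare against the closed-form value.

CX = 0.75 * (3.0 / np.pi) ** (1.0 / 3.0)  # LDA exchange constant

def radial_grid(n, R=1.0):
    """Gauss-Legendre nodes mapped from (-1, 1) onto r in (0, inf)."""
    x, w = np.polynomial.legendre.leggauss(n)
    r = R * (1.0 + x) / (1.0 - x)
    jac = 2.0 * R / (1.0 - x) ** 2  # dr/dx of the rational mapping
    return r, w * jac

def lda_exchange(density, r, w):
    """E_x^LDA = -C_x * integral of n^(4/3) over R^3 (spherical symmetry)."""
    return -CX * np.sum(w * 4.0 * np.pi * r**2 * density(r) ** (4.0 / 3.0))

# Hydrogen 1s density, n(r) = exp(-2 r) / pi, normalized to one electron.
density = lambda r: np.exp(-2.0 * r) / np.pi

r, w = radial_grid(200)
e_x = lda_exchange(density, r, w)

# Closed form for this density: E_x = -C_x * (27/64) * pi^(-1/3)
e_x_exact = -CX * (27.0 / 64.0) * np.pi ** (-1.0 / 3.0)
```

A real verification would replace the analytic density by a tabulated atomic Hartree-Fock density and the hand-derived reference by an independently converged calculation.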

    Many recent density functionals are numerically ill-behaved

    Most computational studies in chemistry and materials science are based on the use of density functional theory. Although the exact density functional is unknown, several density functional approximations (DFAs) offer a good balance of affordable computational cost and semi-quantitative accuracy for applications. The development of DFAs still continues on many fronts, and several new DFAs aiming for improved accuracy are published every year. However, the numerical behavior of these DFAs is an often overlooked problem. In this work, we look at all 592 DFAs for three-dimensional systems available in Libxc 5.2.2 and examine the convergence of the density functional total energy based on tabulated atomic Hartree-Fock wave functions. We show that several recent DFAs, including the celebrated SCAN family of functionals, exhibit impractically slow convergence with typically used numerical quadrature schemes, making these functionals unsuitable for both routine applications and high-precision studies, as thousands of radial quadrature points may be required to achieve sub-μEh accurate total energies for these functionals, while standard quadrature grids like the SG-3 grid only contain O(100) radial quadrature points. These results are both a warning to users to always check the sufficiency of the quadrature grid when adopting novel functionals, and a guideline to the theory community to develop better behaved density functionals. Comment: 16 pages, 6 figures
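    The grid-sufficiency check recommended to users amounts to increasing the number of radial points until the result stops changing at the target precision. A minimal sketch of that scan (toy integrand, not an actual DFA):

```python
import numpy as np

# Grid-convergence scan: evaluate the same radial integral with more and more
# quadrature points and watch the error fall below the target threshold.

def radial_quadrature(n, R=1.0):
    """Gauss-Legendre nodes mapped from (-1, 1) onto r in (0, inf)."""
    x, w = np.polynomial.legendre.leggauss(n)
    return R * (1.0 + x) / (1.0 - x), w * 2.0 * R / (1.0 - x) ** 2

def integral(n):
    # integral of 4 pi r^2 * exp(-2 r)/pi over (0, inf); exact value is 1
    r, w = radial_quadrature(n)
    return float(np.sum(w * 4.0 * r**2 * np.exp(-2.0 * r)))

errors = {n: abs(integral(n) - 1.0) for n in (5, 10, 20, 50, 100)}
converged_n = min(n for n, e in errors.items() if e < 1e-6)
```

For a smooth integrand like this one the error drops rapidly with the grid size; the point of the paper is that many recent DFAs produce integrands for which this scan converges dramatically more slowly.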

    Meta-Local Density Functionals: A New Rung on Jacob's Ladder

    The homogeneous electron gas (HEG) is a key ingredient in the construction of most exchange-correlation functionals of density-functional theory. Often, the energy of the HEG is parameterized as a function of its spin density n_σ, leading to the local density approximation (LDA) for inhomogeneous systems. However, the connection between the electron density and the kinetic energy density of the HEG can be used to generalize the LDA by evaluating it on a geometric average n_σ^avg(r) = n_σ^(1-x)(r) n̄_σ^x(r) of the local spin density n_σ(r) and the spin density n̄_σ(r) of a HEG that has the local kinetic energy density τ_σ(r) of the inhomogeneous system. This leads to a new family of functionals that we term meta-local density approximations (meta-LDAs), which are still exact for the HEG, which are derived only from properties of the HEG, and which form a new rung of Jacob's ladder of density functionals [AIP Conf. Proc. 2001, 577, 1]. The first functional of this ladder, the local tau approximation (LTA) of Ernzerhof and Scuseria [J. Chem. Phys. 1999, 111, 911] that corresponds to x = 1, is unfortunately not stable enough to be used in self-consistent field calculations, because it leads to divergent potentials, as we show in this work. However, a geometric averaging of the LDA and LTA densities with smaller values of x not only leads to numerical stability of the resulting functional but also yields more accurate exchange energies in atomic calculations than the LDA, the LTA, or the tLDA functional (x = 1/4) of Eich and Hellgren [J. Chem. Phys. 2014, 141, 224107]. We choose x = 0.50, as it gives the best total energy in self-consistent exchange-only calculations for the argon atom. Atomization energy benchmarks confirm that the choice x = 0.50 also yields improved energetics in combination with correlation functionals in molecules, almost eliminating the well-known overbinding of the LDA and reducing its error by two thirds. Peer reviewed
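    The meta-LDA construction above can be sketched in a few lines (function names are mine, not the paper's reference implementation; x = 0.5 is the recommended choice). The HEG density n̄_σ is recovered by inverting the spin-resolved Thomas-Fermi relation τ_σ = C_F n_σ^(5/3), and LDA exchange is then evaluated on the geometric average.

```python
import math

# Sketch of the meta-LDA exchange energy density for one spin channel.

CF = 0.3 * (6.0 * math.pi**2) ** (2.0 / 3.0)  # spin-resolved Thomas-Fermi constant
CX = 0.75 * (3.0 / math.pi) ** (1.0 / 3.0)    # LDA exchange constant

def heg_density_from_tau(tau):
    """Density of the HEG whose kinetic energy density equals tau."""
    return (tau / CF) ** 0.6

def meta_lda_exchange_density(n_sigma, tau_sigma, x=0.5):
    """Per-spin exchange energy density: LDA evaluated on the geometric average
    n_avg = n^(1-x) * n_bar^x; x=0 recovers LDA, x=1 recovers the LTA."""
    n_bar = heg_density_from_tau(tau_sigma)
    n_avg = n_sigma ** (1.0 - x) * n_bar ** x
    return -CX * 2.0 ** (1.0 / 3.0) * n_avg ** (4.0 / 3.0)
```

For a HEG input, where τ_σ takes its Thomas-Fermi value, n̄_σ = n_σ and the functional reduces to the LDA for every x, which is the exactness-for-the-HEG property claimed in the abstract.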

    Sulfur Molecules in Space by X-rays: A Computational Study

    X-ray astronomy lacks high-resolution spectra of interstellar dust analogues and molecules, severely hampering interstellar medium studies based on upcoming X-ray missions. Various theoretical approaches may be used to address this problem, but they must first be shown to reproduce experimental spectra reliably. In this work, we calculate the sulfur K-edge X-ray absorption spectra of H2S, SO2, and OCS, whose spectra are already known from X-ray experiments, and predict the X-ray spectrum of CS, which as far as we are aware has not been measured, a gap that hampers its detection by X-ray telescopes. We chose these four molecules because the astrochemistry of sulfur is an unsolved problem and because the four molecules are already known to exist in space. We consider three types of methods for modeling the X-ray spectra: more accurate calculations with the algebraic-diagrammatic construction (ADC) and the CC2, CCSD, and CC3 coupled cluster (CC) approaches, as well as more affordable ones with transition potential density functional theory (TP-DFT). A comparison of our computational results to previously reported experimental spectra shows that the core-valence separation (CVS) approaches CVS-ADC(2)-x and CVS-CC3 generally yield a good qualitative level of agreement with the experiment, suggesting that they can be used for interpreting measured spectra, while the TP-DFT method is not reliable for these molecules. However, quantitative agreement with the experiment is still outside the reach of the computational methods studied in this work. Peer reviewed

    Ordovician Vertebrates from Ontario

    23-30, http://deepblue.lib.umich.edu/bitstream/2027.42/48471/2/ID321.pd

    Measurements of muon flux in the Pyhäsalmi underground laboratory

    The cosmic-ray induced muon flux was measured at several depths in the Pyhäsalmi mine (Finland) using a plastic scintillator telescope mounted on a trailer. The flux was determined with the trailer at four different depths underground: at 400 m (980 m.w.e.), at 660 m (1900 m.w.e.), at 990 m (2810 m.w.e.), and at 1390 m (3960 m.w.e.), and also at the ground surface. In addition, previously measured fluxes from depths of 90 m (210 m.w.e.) and 210 m (420 m.w.e.) are shown. A relation was obtained for the underground muon flux as a function of depth. The measured flux follows the general behaviour well and is consistent with results determined in other underground laboratories. Comment: 8 pages, 2 figures. Submitted to Nuclear Instrum. Methods
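    The quoted depth and metres-of-water-equivalent pairs encode the average overburden density along each path, since m.w.e. = depth × ρ_avg/ρ_water. A quick check on the numbers in the abstract (my own sketch, not from the paper):

```python
# Average overburden density (g/cm^3) implied by each quoted (depth, m.w.e.)
# pair; water is 1 g/cm^3, so rho_avg = mwe / depth.
pairs = {90: 210, 210: 420, 400: 980, 660: 1900, 990: 2810, 1390: 3960}
rho = {depth: mwe / depth for depth, mwe in pairs.items()}
```

The implied densities vary between the shallow and deep stations, which is consistent with the overburden not being uniform rock rather than any inconsistency in the data.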

    Business process modelling and visualisation to support e-government decision making: Business/IS alignment

    © 2017 Springer-Verlag. The final publication is available at Springer via https://doi.org/10.1007/978-3-319-57487-5_4. Alignment between business and information systems plays a vital role in the formation of dependent relationships between different departments in a government organization, and the process of alignment can be improved by developing an information system (IS) according to the stakeholders' expectations. However, establishing strong alignment in the context of the eGovernment environment can be difficult. It is widely accepted that business processes in the government environment play a pivotal role in capturing the details of IS requirements. This paper presents a method of business process modelling through UML which can help to visualise and capture the IS requirements for system development. A series of UML models have been developed and discussed. A case study on patient visits to a healthcare clinic in the context of eGovernment has been used to validate the models.

    Ontology Domain Modeling Support for Multilingual Services in e-commerce: MKBEEM

    One of the main objectives of a truly user-friendly Information Society is to focus on advanced human language technologies enabling cost-effective interchange across languages and cultures and more natural interfaces to digital services. The recently launched IST-1999-10589 project MKBEEM (Multilingual Knowledge Based European Electronic Marketplace, 1st Feb. 2000 - 1st Aug. 2002) is a step in exactly that direction: the work addresses written language technologies and their use in the key sector of global business and electronic commerce. In particular, MKBEEM will focus on adding multilinguality to all stages of the information cycle, including multilingual content generation and maintenance, automated translation and interpretation, and enhancing the natural interactivity and usability of the service with unconstrained language input. On the knowledge engineering side, the MKBEEM ontologies will provide a consensual representation of the electronic commerce field in three typical domains (tourism, mail order, retailers), allowing exchanges independent of the language of the end user, the service, or the content provider. Ontologies will be used for classifying and indexing catalogues, for filtering users' queries, for facilitating multilingual man-machine dialogues between the user and the software agent, and for inferring information that is relevant to the user's request. This paper concentrates on ontology issues, while the human language processing approaches used will be presented in detail in later papers.

    Dynamics of forced biopolymer translocation

    We present results from our simulations of biopolymer translocation in a solvent which explain the main experimental findings. The forced translocation can be described by simple force balance arguments for the relevant range of pore potentials in experiments and biological systems. Scaling of translocation time with polymer length varies with pore force and friction. Hydrodynamics affects this scaling and significantly reduces translocation times.Comment: Published in: http://www.iop.org/EJ/article/0295-5075/85/5/58006/epl_85_5_58006.htm

    Preregistration Classification of Mobile LIDAR Data Using Spatial Correlations

    We explore a novel paradigm for light detection and ranging (LIDAR) point classification in mobile laser scanning (MLS). In contrast to the traditional scheme of performing classification on a 3-D point cloud after registration, our algorithm operates on the raw data stream, classifying the points on the fly before registration. Hence, we call it preregistration classification (PRC). Specifically, the technique is based on spatial correlations, i.e., local range measurements supporting each other. The proposed method is general, since exact scanner pose information is not required, nor is any radiometric calibration needed. We also show that the method can be applied in different environments by adjusting two control parameters, without the results being overly sensitive to this adjustment. As results, we present the classification of points from an urban environment, where noise, ground, buildings, and vegetation are distinguished from each other, and from a forest, where tree stems and ground are separated from the other points. As the computations are efficient and require only a minimal cache, the proposed methods enable new on-chip deployable algorithmic solutions. Broader benefits from the spatial correlations and the computational efficiency of the PRC scheme are likely to be gained in several online and offline applications, ranging from single robotic platform operations, including simultaneous localization and mapping (SLAM) algorithms, to wall-clock time savings in the geoinformation industry. Finally, PRC is especially attractive for continuous-beam and solid-state LIDARs, which are prone to output noisy data.
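    The core idea of classification from spatial correlations in a raw stream can be illustrated with a deliberately minimal, hypothetical sketch (not the authors' PRC algorithm): a range measurement is kept as "surface" if enough of its recent neighbours support it, i.e. lie within a range tolerance. The two knobs, window size and tolerance, mirror the two control parameters mentioned above.

```python
from collections import deque

# Toy stream classifier from spatial correlations: each range measurement is
# compared against a small ring buffer of recent measurements. Supported
# points are labelled "surface", unsupported ones "noise". The start of the
# stream is labelled "noise" until the buffer fills.

def classify_stream(ranges, window=5, tol=0.2):
    recent = deque(maxlen=window)   # minimal cache of recent range readings
    labels = []
    for r in ranges:
        support = sum(1 for q in recent if abs(q - r) <= tol)
        labels.append("surface" if support >= window // 2 else "noise")
        recent.append(r)
    return labels

# A smooth wall at ~10 m with one outlier return at 3.2 m
stream = [10.0, 10.1, 10.05, 3.2, 10.1, 10.0]
labels = classify_stream(stream)
```

Because the buffer is tiny and no registration or pose information is used, a scheme of this shape can run on the fly, which is the property the on-chip deployment argument relies on.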