22 research outputs found

    Holomorphic Hartree-Fock Theory: The Nature of Two-Electron Problems.

    Get PDF
    We explore the existence and behavior of holomorphic restricted Hartree-Fock (h-RHF) solutions for two-electron problems. Through algebraic geometry, the exact number of solutions with n basis functions is rigorously identified as (3^n − 1)/2, proving that these states must exist for all molecular geometries. A detailed study of the h-RHF states of HZ (STO-3G) then demonstrates both the conservation of holomorphic solutions as the geometry or atomic charges are varied and the emergence of complex h-RHF solutions at coalescence points. Using catastrophe theory, the nature of these coalescence points is described, highlighting the influence of molecular symmetry. The h-RHF states of HHeH2+ and HHeH (STO-3G) are then compared, illustrating the isomorphism between systems with two electrons and two electron holes. Finally, we explore the h-RHF states of ethene (STO-3G) by considering the π electrons as a two-electron problem, and employ nonorthogonal configuration interaction (NOCI) to identify a crossing of the lowest-energy singlet and triplet states at the perpendicular geometry.
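    The abstract's closed-form count of h-RHF solutions, (3^n − 1)/2 for n basis functions, can be checked for small n with a minimal sketch (the function name below is illustrative, not from the paper):

    ```python
    # Count of holomorphic RHF solutions for a two-electron system,
    # per the closed form (3^n - 1)/2 quoted in the abstract.

    def num_hrhf_solutions(n_basis: int) -> int:
        """Exact number of h-RHF solutions for two electrons in n basis functions."""
        return (3 ** n_basis - 1) // 2

    for n in range(1, 5):
        print(n, num_hrhf_solutions(n))  # 1 -> 1, 2 -> 4, 3 -> 13, 4 -> 40
    ```

    Note that the count grows exponentially in the basis size, which is why the paper restricts detailed study to minimal (STO-3G) bases.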

    Many Labs 5: Testing Pre-Data-Collection Peer Review as an Intervention to Increase Replicability

    Get PDF
    Replication studies in psychological science sometimes fail to reproduce prior findings. If these studies use methods that are unfaithful to the original study or ineffective in eliciting the phenomenon of interest, then a failure to replicate may be a failure of the protocol rather than a challenge to the original finding. Formal pre-data-collection peer review by experts may address shortcomings and increase replicability rates. We selected 10 replication studies from the Reproducibility Project: Psychology (RP:P; Open Science Collaboration, 2015) for which the original authors had expressed concerns about the replication designs before data collection; only one of these studies had yielded a statistically significant effect (p < .05). Commenters suggested that lack of adherence to expert review and low-powered tests were the reasons that most of these RP:P studies failed to replicate the original effects. We revised the replication protocols and received formal peer review prior to conducting new replication studies. We administered the RP:P and revised protocols in multiple laboratories (median number of laboratories per original study = 6.5, range = 3–9; median total sample = 1,279.5, range = 276–3,512) for high-powered tests of each original finding with both protocols. Overall, following the preregistered analysis plan, we found that the revised protocols produced effect sizes similar to those of the RP:P protocols (Δr = .002 or .014, depending on analytic approach). The median effect size for the revised protocols (r = .05) was similar to that of the RP:P protocols (r = .04) and the original RP:P replications (r = .11), and smaller than that of the original studies (r = .37). 
Analysis of the cumulative evidence across the original studies and the corresponding three replication attempts provided very precise estimates of the 10 tested effects and indicated that their effect sizes (median r = .07, range = .00–.15) were 78% smaller, on average, than the original effect sizes (median r = .37, range = .19–.50). Additional co-authors: Ivan Ropovik, Balazs Aczel, Lena F. Aeschbach, Luca Andrighetto, Jack D. Arnal, Holly Arrow, Peter Babincak, Bence E. Bakos, Gabriel Baník, Ernest Baskin, Radomir Belopavlovic, Michael H. Bernstein, Michał Białek, Nicholas G. Bloxsom, Bojana Bodroža, Diane B. V. Bonfiglio, Leanne Boucher, Florian Brühlmann, Claudia C. Brumbaugh, Erica Casini, Yiling Chen, Carlo Chiorri, William J. Chopik, Oliver Christ, Antonia M. Ciunci, Heather M. Claypool, Sean Coary, Marija V. Čolić, W. Matthew Collins, Paul G. Curran, Chris R. Day, Anna Dreber, John E. Edlund, Filipe Falcão, Anna Fedor, Lily Feinberg, Ian R. Ferguson, Máire Ford, Michael C. Frank, Emily Fryberger, Alexander Garinther, Katarzyna Gawryluk, Kayla Ashbaugh, Mauro Giacomantonio, Steffen R. Giessner, Jon E. Grahe, Rosanna E. Guadagno, Ewa Hałasa, Rias A. Hilliard, Joachim Hüffmeier, Sean Hughes, Katarzyna Idzikowska, Michael Inzlicht, Alan Jern, William Jiménez-Leal, Magnus Johannesson, Jennifer A. Joy-Gaba, Mathias Kauff, Danielle J. Kellier, Grecia Kessinger, Mallory C. Kidwell, Amanda M. Kimbrough, Josiah P. J. King, Vanessa S. Kolb, Sabina Kołodziej, Marton Kovacs, Karolina Krasuska, Sue Kraus, Lacy E. Krueger, Katarzyna Kuchno, Caio Ambrosio Lage, Eleanor V. Langford, Carmel A. Levitan, Tiago Jessé Souza de Lima, Hause Lin, Samuel Lins, Jia E. Loy, Dylan Manfredi, Łukasz Markiewicz, Madhavi Menon, Brett Mercier, Mitchell Metzger, Venus Meyet, Jeremy K. Miller, Andres Montealegre, Don A. Moore, Rafał Muda, Gideon Nave, Austin Lee Nichols, Sarah A. Novak, Christian Nunnally, Ana Orlic, Anna Palinkas, Angelo Panno, Kimberly P. Parks, Ivana Pedovic, Emilian Pekala, Matthew R. Penner, Sebastiaan Pessers, Boban Petrovic, Thomas Pfeiffer, Damian Pienkosz, Emanuele Preti, Danka Puric, Tiago Ramos, Jonathan Ravid, Timothy S. Razza, Katrin Rentzsch, Juliette Richetin, Sean C. Rife, Anna Dalla Rosa, Kaylis Hase Rudy, Janos Salamon, Blair Saunders, Przemysław Sawicki, Kathleen Schmidt, Kurt Schuepfer, Thomas Schultze, Stefan Schulz-Hardt, Astrid Schütz, Ani N. Shabazian, Rachel L. Shubella, Adam Siegel, Rúben Silva, Barbara Sioma, Lauren Skorb, Luana Elayne Cunha de Souza, Sara Steegen, L. A. R. Stein, R. Weylin Sternglanz, Darko Stojilovic, Daniel Storage, Gavin Brent Sullivan, Barnabas Szaszi, Peter Szecsi, Orsolya Szöke, Attila Szuts, Manuela Thomae, Natasha D. Tidwell, Carly Tocco, Ann-Kathrin Torka, Francis Tuerlinckx, Wolf Vanpaemel, Leigh Ann Vaughn, Michelangelo Vianello, Domenico Viganola, Maria Vlachou, Ryan J. Walker, Sophia C. Weissgerber, Aaron L. Wichman, Bradford J. Wiggins, Daniel Wolf, Michael J. Wood, David Zealley, Iris Žeželj, Mark Zrubka, and Brian A. Nosek
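The "78% smaller, on average" comparison is a per-study shrinkage averaged across the 10 effects, so it cannot be reproduced exactly from the medians quoted here; a hedged sketch of the computation, using only the quoted median correlations as stand-in inputs, looks like this:

```python
# Sketch of effect-size shrinkage: how much smaller a replication
# correlation is relative to the original. Inputs below are the
# abstract's *median* values, not the per-study data, so the result
# (~81%) will differ slightly from the paper's 78% average.

def pct_reduction(r_original: float, r_replication: float) -> float:
    """Percent shrinkage of a replication effect size vs. the original."""
    return 100.0 * (r_original - r_replication) / r_original

print(round(pct_reduction(0.37, 0.07), 1))  # ~81% using the medians
```

The paper's figure averages this quantity over the 10 individual effects, which is why the median-based value here comes out a few points higher.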

    Many Labs 5: Testing pre-data-collection peer review as an intervention to increase replicability

    Get PDF
    Replication studies in psychological science sometimes fail to reproduce prior findings. If these studies use methods that are unfaithful to the original study or ineffective in eliciting the phenomenon of interest, then a failure to replicate may be a failure of the protocol rather than a challenge to the original finding. Formal pre-data-collection peer review by experts may address shortcomings and increase replicability rates. We selected 10 replication studies from the Reproducibility Project: Psychology (RP:P; Open Science Collaboration, 2015) for which the original authors had expressed concerns about the replication designs before data collection; only one of these studies had yielded a statistically significant effect (p < .05). Commenters suggested that lack of adherence to expert review and low-powered tests were the reasons that most of these RP:P studies failed to replicate the original effects. We revised the replication protocols and received formal peer review prior to conducting new replication studies. We administered the RP:P and revised protocols in multiple laboratories (median number of laboratories per original study = 6.5, range = 3–9; median total sample = 1,279.5, range = 276–3,512) for high-powered tests of each original finding with both protocols. Overall, following the preregistered analysis plan, we found that the revised protocols produced effect sizes similar to those of the RP:P protocols (Δr = .002 or .014, depending on analytic approach). The median effect size for the revised protocols (r = .05) was similar to that of the RP:P protocols (r = .04) and the original RP:P replications (r = .11), and smaller than that of the original studies (r = .37). 
Analysis of the cumulative evidence across the original studies and the corresponding three replication attempts provided very precise estimates of the 10 tested effects and indicated that their effect sizes (median r = .07, range = .00–.15) were 78% smaller, on average, than the original effect sizes (median r = .37, range = .19–.50)

    Childhood and character; an introduction to the study of the religious life of children

    No full text
    Bibliography: p. 268–275. Mode of access: Internet

    Worship in the Sunday school: a study in the theory and practice of worship

    No full text
    Bibliography: p. 204–210. Mode of access: Internet

    Studies in the nature of character

    No full text
    Includes bibliographical references and indexes. v. 1. Hartshorne, H., and May, M. A. Studies in deceit. -- v. 2. Hartshorne, H., May, M. A., and Maller, J. B. Studies in service and self-control. -- v. 3. Hartshorne, H., May, M. A., and Shuttleworth, F. K. Studies in the organization of character. Mode of access: Internet

    RP:P Pages for Replications

    No full text
    This page contains copies of the RP:P project pages for all replications in Many Labs 5. The projects are linked on the right side of this page.