
    Meta-critical thinking, paradox, and probabilities

    There is as much lack of clarity about what “critical thinking” involves, even among those charged with teaching it, as there is consensus that we need more emphasis on it in both academia and society. There is thus an apparent need to think critically about critical thinking itself, an exercise that might be called meta-critical thinking. It involves emphasizing a practice through which “critical thinking” is helpfully carried out and clarifying one or more of the concepts in terms of which it is usually defined. The practice is distinction making, and the concept is that of evidence. Science advances by constructing models that explain real-world processes. Once multiple candidate models have been distinguished, there remains the task of identifying which of them match the real-world process better than others. Since statistical inference is largely concerned with showing how data provide support, i.e., furnish evidence, for the claim that a model or hypothesis is more or less likely while still uncertain, we turn to it to make the concept of evidence more precise and thereby useful. In fact, two of the leading methodological paradigms—Bayesian and likelihoodist—can be taken to provide answers to the questions of the extent to which, as well as how, data provide evidence for conclusions. Examining these answers in some detail is a highly promising way to make progress. We do so by analysing three well-known statistical paradoxes—the Lottery, the Old Evidence, and Humphreys’ paradoxes—and identifying the distinctions on which their plausible resolutions depend. These distinctions, among them those between belief and evidence and between different concepts of probability, in turn have more general applications. They are applied here to two highly contested public policy issues—the efficacy of COVID vaccinations and the fossil-fuel cause of climate change. Our aim is to provide some tools, which might be called “healthy habits of mind,” with which to assess statistical arguments, in particular with respect to the nature and extent of the evidence they furnish, and to illustrate their use in well-defined ways.
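
    The belief/evidence distinction the abstract draws can be made concrete with a small worked example. The sketch below (our illustration, not from the paper; the binomial setup and all numbers are assumptions) contrasts the likelihoodist measure of evidence, the likelihood ratio, with the Bayesian posterior odds, which additionally fold in prior belief: the same data can favour a hypothesis as evidence while belief still favours its rival.

        from math import comb

        # Illustrative setup: H1 says a coin lands heads with p = 0.5,
        # H2 says p = 0.7. We observe 14 heads in 20 tosses.
        n, k = 20, 14

        def binom_lik(p, n, k):
            """Likelihood of k heads in n tosses given heads-probability p."""
            return comb(n, k) * p**k * (1 - p)**(n - k)

        # Likelihoodist measure of evidence: the likelihood ratio.
        lr = binom_lik(0.7, n, k) / binom_lik(0.5, n, k)
        print(f"LR(H2:H1) = {lr:.2f}")          # ~5.2: the data favour H2

        # Bayesian measure of belief: posterior odds = prior odds * LR.
        prior_odds = 1 / 10                     # H2 initially judged 10x less plausible
        print(f"Posterior odds = {prior_odds * lr:.2f}")  # ~0.52: belief still favours H1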

    Peering into the Dark: Investigating dark matter and neutrinos with cosmology and astrophysics

    The ΛCDM model of modern cosmology provides a highly accurate description of our universe. However, it relies on two mysterious components, dark matter and dark energy. The cold dark matter paradigm does not provide a satisfying description of its particle nature, nor any link to the Standard Model of particle physics. I investigate the consequences for cosmological structure formation in models with a coupling between dark matter and Standard Model neutrinos, as well as probes of primordial black holes as dark matter. I examine the impact that such an interaction would have through both linear perturbation theory and nonlinear N-body simulations. I present limits on the possible interaction strength from cosmic microwave background, large-scale structure, and galaxy population data, as well as forecasts of future sensitivity. I provide an analysis of what is necessary to distinguish the cosmological impact of interacting dark matter from similar effects. Intensity mapping of the 21 cm line of neutral hydrogen at high redshift using next-generation observatories, such as the SKA, would provide the strongest constraints yet on such interactions, and may be able to distinguish between different scenarios causing suppressed small-scale structure. I also present a novel probe of structure formation, using the cosmological gravitational wave signal of high-redshift compact binary mergers to provide information about structure formation, and thus the behaviour of dark matter. Such observations would also provide competitive constraints. Finally, I investigate primordial black holes as an alternative dark matter candidate, presenting an analysis and framework for the evolution of extended mass populations over cosmological time and computing the present-day gamma ray signal, as well as the allowed local evaporation rate. This is used to set constraints on the allowed population of low-mass primordial black holes, and the likelihood of witnessing an evaporation.
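
    The evaporation constraint mentioned at the end hinges on the steep mass dependence of Hawking evaporation. A back-of-the-envelope sketch (standard textbook formulas, not the thesis's population framework; photon-only emission is assumed) shows why only primordial black holes above a critical initial mass survive to the present day:

        import math

        G    = 6.674e-11        # gravitational constant [m^3 kg^-1 s^-2]
        c    = 2.998e8          # speed of light [m/s]
        hbar = 1.055e-34        # reduced Planck constant [J s]
        T_UNIVERSE = 13.8e9 * 3.156e7   # age of the universe [s]

        def evaporation_time(mass_kg):
            """Hawking lifetime t = 5120*pi*G^2*M^3/(hbar*c^4), photons only."""
            return 5120 * math.pi * G**2 * mass_kg**3 / (hbar * c**4)

        # Critical initial mass whose lifetime equals the age of the universe.
        m_crit = (T_UNIVERSE * hbar * c**4 / (5120 * math.pi * G**2)) ** (1 / 3)
        print(f"M_crit ~ {m_crit:.1e} kg")   # ~2e11 kg (~2e14 g); emission of more
        # particle species shifts this toward the often-quoted ~5e14 g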

    Neutrinos from horizon to sub-galactic scales

    A first determination of the mass scale set by the lightest neutrino remains a crucial outstanding challenge for cosmology and particle physics, with profound implications for the history of the Universe and physics beyond the Standard Model. In this thesis, we present the results from three methodological papers and two applications that contribute to our understanding of the cosmic neutrino background. First, we introduce a new method for the noise-suppressed evaluation of neutrino phase-space statistics. Its primary application is in cosmological N-body simulations, where it reduces the computational cost of simulating neutrinos by orders of magnitude without neglecting their nonlinear evolution. Second, using a recursive formulation of Lagrangian perturbation theory, we derive higher-order neutrino corrections and show that these can be used for the accurate and consistent initialisation of cosmological neutrino simulations. Third, we present a new code for the initialisation of neutrino particles, accounting both for relativistic effects and the full Boltzmann hierarchy. Taken together, these papers demonstrate that with the combination of the methods described therein, we can accurately simulate the evolution of the neutrino background over 13.8 Gyr from the linear and ultra-relativistic regime at z = 10^9 down to the non-relativistic yet nonlinear regime at z = 0. Moreover, they show that the accuracy of large-scale structure predictions can be controlled at the sub-percent level needed for a neutrino mass determination. In a first application of these methods, we present a forecast for direct detection of the neutrino background, taking into account the gravitational enhancement (or indeed suppression) of the local density due to the Milky Way and the observed large-scale structure within 200 Mpc/h. We determine that the large-scale structure is more important than the Milky Way for neutrino masses below 0.1 eV, predict the orientation of the neutrino dipole, and study small-scale anisotropies. We predict that the angular distribution of neutrinos is anti-correlated with the projected matter density, due to the capture or deflection of neutrinos by massive objects along the line of sight. Finally, we present the first results from a new suite of hydrodynamical simulations, which includes the largest ever simulation with neutrinos and galaxies. We study the extent to which variations in neutrino mass can be treated independently of astrophysical processes, such as feedback from supernovae and black holes. Our findings show that baryonic feedback is weakly dependent on neutrino mass, with feedback being stronger for models with larger neutrino masses. By studying individual dark matter halos, we attribute this effect to the increased baryon density relative to cold dark matter and a reduction in the binding energies of halos. We show that percent-level accurate modelling of the matter power spectrum in a cosmologically interesting parameter range is only possible if the cosmology-dependence of feedback is taken into account.
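
    The span from ultra-relativistic to non-relativistic evolution quoted above can be located with two standard relations: the relic neutrino temperature today, T_nu = (4/11)^(1/3) T_CMB, and the redshift at which the mean momentum ~3.15 T_nu(z) drops below the neutrino mass. A minimal sketch (textbook relations, not the thesis's code; the masses are illustrative):

        T_CMB_K = 2.725                  # CMB temperature today [K]
        K_TO_EV = 8.617e-5               # Boltzmann constant [eV/K]

        T_nu0_eV = (4 / 11) ** (1 / 3) * T_CMB_K * K_TO_EV   # ~1.68e-4 eV

        def z_nonrelativistic(m_nu_eV):
            """Redshift where the mean momentum ~3.15*T_nu(z) falls below m_nu."""
            return m_nu_eV / (3.15 * T_nu0_eV) - 1

        for m in (0.05, 0.1, 0.3):       # neutrino masses in eV (illustrative)
            print(f"m_nu = {m} eV -> z_nr ~ {z_nonrelativistic(m):.0f}")
        # For m_nu = 0.1 eV the transition is at z ~ 190, so most of the
        # 13.8 Gyr history is spent non-relativistic, yet simulations must
        # still capture the relativistic regime at early times.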

    Direct searches for general dark matter-electron interactions with graphene detectors: Part I. Electronic structure calculations

    We develop a formalism to describe electron ejections from graphene-like targets by dark matter (DM) scattering for general forms of scalar and spin-1/2 DM-electron interactions, and compare their applicability and accuracy within the density functional theory (DFT) and tight-binding (TB) approaches. This formalism allows for accurate prediction of the daily modulation signal expected from DM in upcoming direct detection experiments employing graphene sheets as the target material. A key result is that the physics of the graphene sheet and that of the DM and the ejected electron factorise, allowing the rate of ejections from all forms of DM to be obtained with a single graphene response function. We perform a comparison between the TB and DFT approaches to modelling the initial-state electronic wavefunction within this framework, with DFT emerging as the more self-consistent and reliable choice due to the challenges of embedding an appropriate atomic contribution into the TB approach.

    Comment: 29 pages, 12 figures, 5 appendices. The TB and DFT codes can be found at https://github.com/temken/Darphene and https://github.com/urdshals/QEdark-EFT respectively.
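
    The factorisation result can be pictured schematically: the ejection rate is an integral over momentum transfer of a DM-model-dependent kernel multiplied by a single, model-independent graphene response function. The sketch below is purely illustrative; the function names, the toy functional forms, and the 1D quadrature are our assumptions, not the API of the released codes.

        import numpy as np

        def dm_kernel(q, m_med=1.0e8):
            """Hypothetical DM-side factor, e.g. a heavy-mediator form factor (toy)."""
            return 1.0 / m_med**2

        def graphene_response(q):
            """Hypothetical stand-in for the single graphene response function W(q)."""
            return q**2 * np.exp(-q / 3.0e3)

        q = np.linspace(1.0, 3.0e4, 2000)   # momentum-transfer grid [eV]
        rate = np.trapz(dm_kernel(q) * graphene_response(q), q)
        print(f"Schematic rate (arbitrary units): {rate:.3e}")
        # Swapping dm_kernel() changes the DM model; graphene_response() is
        # computed once (via DFT or TB) and reused for every interaction type.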

    Spinors and horospheres

    We give an explicit bijective correspondence between nonzero pairs of complex numbers, which we regard as spinors or spin vectors, and horospheres in 3-dimensional hyperbolic space decorated with certain spinorial directions. This correspondence builds upon work of Penrose–Rindler and Penner. We show that the natural bilinear form on spin vectors describes a certain complex-valued distance between spin-decorated horospheres, generalising Penner's lambda lengths to 3 dimensions. From this, we derive several applications. We show that the complex lambda lengths in a hyperbolic ideal tetrahedron satisfy a Ptolemy equation. We also obtain correspondences between certain spaces of hyperbolic ideal polygons and certain Grassmannian spaces, under which lambda lengths correspond to Plücker coordinates, illuminating the connection between Grassmannians, hyperbolic polygons, and type A cluster algebras.

    Comment: 24 pages, 5 figures.
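
    The Ptolemy equation admits a quick numerical illustration. Reading the natural bilinear form on spin vectors as the determinant lam(i,j) = xi_i*eta_j - xi_j*eta_i (our reading of the abstract; the check below is our illustration, not code from the paper), the Ptolemy relation is exactly the Plücker identity for 2x2 minors of a 2x4 complex matrix:

        import numpy as np

        rng = np.random.default_rng(0)

        # Four random nonzero spin vectors kappa_i = (xi_i, eta_i) in C^2.
        spinors = rng.normal(size=(4, 2)) + 1j * rng.normal(size=(4, 2))

        def lam(i, j):
            """Bilinear form lam_ij = xi_i*eta_j - xi_j*eta_i (a 2x2 determinant)."""
            (x1, e1), (x2, e2) = spinors[i], spinors[j]
            return x1 * e2 - x2 * e1

        # Ptolemy relation for an ideal tetrahedron with decorated vertices 0..3:
        # lam(0,2)*lam(1,3) = lam(0,1)*lam(2,3) + lam(0,3)*lam(1,2).
        lhs = lam(0, 2) * lam(1, 3)
        rhs = lam(0, 1) * lam(2, 3) + lam(0, 3) * lam(1, 2)
        print(np.isclose(lhs, rhs))   # True for any choice of the four spinors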

    Elements of Ion Linear Accelerators, Calm in the Resonances, and Other Tales

    The main part of this book, Elements of Linear Accelerators, outlines in Part 1 a framework for non-relativistic linear accelerator focusing and accelerating channel design, simulation, optimization, and analysis where space charge is an important factor. Part 1 is the most important part of the book; grasping the framework is essential to fully understand and appreciate the elements within it, and the myriad application details of the following Parts. The treatment concentrates on all linacs, large or small, intended for high-intensity, very-low-beam-loss, factory-type application. The Radio-Frequency Quadrupole (RFQ) is especially developed as a representative and the most complicated linac form (from dc to bunched and accelerated beam), extending to practical design of long, high-energy linacs, including space charge resonances and beam halo formation, and some challenges for future work. A practical method is also presented for designing Alternating-Phase-Focused (APF) linacs with long sequences and high energy gain; full open-source software is available. The following part, Calm in the Resonances and Other Tales, contains eyewitness accounts of nearly 60 years of participation in accelerator technology. (September 2023) The LINACS codes are released at no cost and, as always, with fully open-source coding. (p.2 & Ch 19.10)

    Comment: 652 pages; some hundreds of figures, all images (there is no data in the figures).
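
    For orientation, the basic relation underlying accelerating-channel design of this kind is compact enough to state in a few lines. The sketch below uses the standard per-cell energy-gain formula dW = q*E0*T*L*cos(phi_s) (textbook linac physics, not code from the book's LINACS package), with all numbers chosen purely for illustration:

        import math

        # Standard energy gain per accelerating cell: dW = q*E0*T*L*cos(phi_s).
        # All values below are illustrative assumptions.
        q     = 1.0                      # charge state (e.g. protons)
        E0    = 3.0e6                    # average axial field [V/m]
        T     = 0.85                     # transit-time factor
        L     = 0.05                     # cell length [m]
        phi_s = math.radians(-30.0)      # synchronous phase (negative: phase focusing)

        dW_eV = q * E0 * T * L * math.cos(phi_s)
        print(f"Energy gain per cell ~ {dW_eV / 1e3:.0f} keV")   # ~110 keV

        # Running phi_s closer to zero raises the gain but weakens longitudinal
        # focusing; APF designs modulate phi_s cell-to-cell to trade between the two.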

    The universe without us: a history of the science and ethics of human extinction

    This dissertation consists of two parts. Part I is an intellectual history of thinking about human extinction (mostly) within the Western tradition. When did our forebears first imagine humanity ceasing to exist? Have people always believed that human extinction is a real possibility, or were some convinced that this could never happen? How has our thinking about extinction evolved over time? Why do so many notable figures today believe that the probability of extinction this century is higher than ever before in our 300,000-year history on Earth? Exploring these questions takes readers from the ancient Greeks, Persians, and Egyptians, through the 18th-century Enlightenment, past scientific breakthroughs of the 19th century like thermodynamics and evolutionary theory, up to the Atomic Age, the rise of modern environmentalism in the 1970s, and contemporary fears about climate change, global pandemics, and artificial general intelligence (AGI). Part II is a history of Western thinking about the ethical and evaluative implications of human extinction. Would causing or allowing our extinction be morally right or wrong? Would our extinction be good or bad, better or worse compared to continuing to exist? For what reasons? Under which conditions? Do we have a moral obligation to create future people? Would past “progress” be rendered meaningless if humanity were to die out? Does the fact that we might be unique in the universe—the only “rational” and “moral” creatures—give us extra reason to ensure our survival? I place these questions under the umbrella of Existential Ethics, tracing the development of this field from the early 1700s through Mary Shelley’s 1826 novel The Last Man, the gloomy German pessimists of the latter 19th century, and post-World War II reflections on nuclear “omnicide,” up to current-day thinkers associated with “longtermism” and “antinatalism.” In the dissertation, I call the first history “History #1” and the second “History #2.” A main thesis of Part I is that Western thinking about human extinction can be segmented into five distinct periods, each of which corresponds to a unique “existential mood.” An existential mood arises from a particular set of answers to fundamental questions about the possibility, probability, etiology, and so on, of human extinction. I claim that the idea of human extinction first appeared among the ancient Greeks, but was eclipsed for roughly 1,500 years with the rise of Christianity. A central contention of Part II is that philosophers have thus far conflated six distinct types of “human extinction,” each of which has its own unique ethical and evaluative implications. I further contend that it is crucial to distinguish between the process or event of Going Extinct and the state or condition of Being Extinct, which one should see as orthogonal to the six types of extinction that I delineate. My aim in the second part is not only to trace the history of Western thinking about the ethics of annihilation, but also to lay the theoretical groundwork for future research on the topic. I then outline my own views within Existential Ethics, which combine ideas and positions to yield a novel account of the conditions under which our extinction would be bad, and why there is a sense in which Being Extinct might be better than Being Extant, or continuing to exist.

    Mathematical Foundations of Complex Tonality

    Equal temperament, in which semitones are tuned in the irrational ratio 2^(1/12) : 1, is best seen as a serviceable compromise, sacrificing purity for flexibility. Just intonation, in which intervals are given by products of powers of 2, 3, and 5, is more natural, but of limited flexibility. We propose a new scheme in which ratios of Gaussian integers form the basis of an abstract tonal system. The tritone, so problematic in just temperament, given ambiguously by the ratios 45/32, 64/45, 36/25, 25/18, none satisfactory, is in our scheme represented by the complex ratio 1 + i : 1. The major and minor whole tones, given by intervals of 9/8 and 10/9, can each be factorized into products of complex semitones, giving us a major complex semitone (3/4)(1 + i) and a minor complex semitone (1/3)(3 + i). The perfect third, given by the interval 5/4, factorizes into the product of a complex whole tone (1/2)(1 + 2i) and its complex conjugate. Augmented with these supplementary tones, the resulting scheme of complex intervals based on products of powers of Gaussian primes leads very naturally to the construction of a complete system of major and minor scales in all keys.

    Comment: 35 pages, revised.
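
    The factorizations stated above are easy to verify numerically. The snippet below (our illustration, using Python's built-in complex type) checks that each whole tone is the product of a complex semitone and its conjugate, that the perfect third factors through the complex whole tone, and that two tritones span an octave:

        # Complex intervals from the abstract, as Python complex numbers.
        s_major = 0.75 * (1 + 1j)      # major complex semitone (3/4)(1 + i)
        s_minor = (3 + 1j) / 3         # minor complex semitone (1/3)(3 + i)
        w_cplx  = 0.5 * (1 + 2j)       # complex whole tone (1/2)(1 + 2i)
        tritone = 1 + 1j               # tritone ratio (1 + i) : 1

        print(s_major * s_major.conjugate())   # (1.125+0j)   = 9/8, major whole tone
        print(s_minor * s_minor.conjugate())   # (1.111..+0j) = 10/9, minor whole tone
        print(w_cplx * w_cplx.conjugate())     # (1.25+0j)    = 5/4, perfect third
        print(abs(tritone) ** 2)               # ~2.0 -> two tritones make an octave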