
    Groundwater Flow and the Resulting Heat Transfer from the Sea Floor, Immediately after the Genesis Flood

    This abstract provides a multi-faceted solution to the “Heat Problem after the Genesis Flood,” which is defined as follows: most models of Catastrophic Plate Tectonics (CPT) require that large amounts of hot crustal material be spread across the ocean floor during the Flood, especially in the Atlantic Ocean. This would release so much heat as to possibly boil the oceans. Critics conclude from this problem that the Genesis Flood did not happen, and thus that the Bible is wrong and evolution is right. It is argued that long periods of time are required to transfer any significant portion of this heat, and it is proposed that this heat remained trapped under the seafloor for hundreds of years after the Flood ended. Most of the extra heat was eventually dissipated via the very slow process of radiation from the upper atmosphere. Since crustal material is not in thermal contact with the upper atmosphere, an analysis of the methods of heat transfer is needed. Under the circumstances resulting from CPT, it is demonstrated that hydrothermal groundwater convection beneath the seafloor is the limiting heat transfer method. The other two possible mechanisms for transferring heat across the seafloor (thermal conduction and magma convection) are also discussed, calculated, evaluated, and ultimately dismissed. Also dismissed is the idea that miracles were involved in removing this extra heat; it is explained why the use of miracles in this situation is unnecessary and also theologically problematic. Although hydrothermal convection via groundwater flow is the main focus of the abstract, it simply brings thermal energy across the seafloor; the heat must then be transferred to the upper atmosphere by Earth's weather. The climate impacts of this new heat source are not discussed here.
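    The abstract's claim that conduction alone is far too slow can be sanity-checked with the standard diffusion-timescale estimate t ≈ L²/α. The diffusivity and crustal thickness below are generic assumed values for illustration, not figures taken from the paper.

```python
# Back-of-the-envelope check (illustrative assumptions, not the paper's data):
# characteristic time for heat to conduct through kilometres of oceanic crust.
KAPPA = 1.0e-6   # thermal diffusivity of rock, m^2/s (typical assumed value)
L = 6.0e3        # assumed crustal thickness, m

t_seconds = L**2 / KAPPA                     # diffusion timescale t ~ L^2 / alpha
t_years = t_seconds / (365.25 * 24 * 3600)   # convert seconds to years
print(f"Conductive diffusion time through {L/1000:.0f} km of rock: "
      f"{t_years:.2e} years")
```

    On these assumed numbers the conductive timescale comes out on the order of a million years, consistent with the abstract's dismissal of pure conduction as the dominant transfer mechanism.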

    Effects of Hot Post-Flood Groundwater Flow from the Sea Floor

    This abstract deals with the effects of large amounts (~700 × 10²⁴ joules) of geothermal heat being slowly transferred across the seafloor for several hundred years. This is enough energy to heat the oceans by 125 °C if it were deposited instantaneously. The mechanism by which this geothermal heat is supplied to the seafloor is a separate topic that is not discussed here. What makes this model different from other “warm ocean” models is that they use a one-time ocean heating event during the Genesis Flood. My model uses continuous heating for centuries, while the oceans simultaneously cool by transferring thermal energy to the atmosphere. I evaluate both models by doing an energy balance for the entire planet. For the one-time heating model, calculated ocean cooling rates are 21.6 to 31.4 °C per century, so the ice age could only have lasted about 80 to 120 years. For the multi-century geothermal heating model, ocean temperature vs. heat loss is calculated for zero to 1,500 years after the Flood. The model starts out with no (zero °C) post-Flood ocean temperature increase. After about 200 years, the deep-water ocean temperature increase peaks at about +15 °C. By 1,000 years post-Flood, deep-water ocean temperatures are similar to today's. I demonstrate that the geographical distribution of this seafloor heating makes little difference in the resulting climate. Also discussed is a “maximum geothermal heat budget” that the climate can safely handle, with which any proposed CPT model must comply. If significant amounts of geothermal heat were discharged into the deep ocean, warmed water would rise from the seafloor, disrupting the thermohaline circulation. An estimate of the flow rates and the post-flood ocean circulation pattern resulting from this scenario is provided. Additionally, this moving water can erode fine particulate matter from the ocean floor, so sediment calculations are also provided.
    Other post-flood impacts are discussed, such as the effect on the chemistry of ocean water. Hot water travelling through rocks can change (and be changed by) the physical and chemical makeup of those rocks (metamorphism). Hydrothermal vents are discussed, as is the formation of manganese nodules (secular science does not have a good answer for how these formed).
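    The abstract's 125 °C figure follows from ΔT = E / (m·c) and can be reproduced with standard ocean parameters. The energy input is the abstract's own number; the ocean mass and specific heat below are rounded textbook values assumed for the check.

```python
# Consistency check of the abstract's instantaneous-heating figure.
E = 700e24          # geothermal heat released, J (the abstract's number)
M_OCEAN = 1.4e21    # approximate mass of the world's oceans, kg (assumption)
C_WATER = 4000.0    # approximate specific heat of seawater, J/(kg*K) (assumption)

delta_T = E / (M_OCEAN * C_WATER)  # temperature rise if deposited all at once
print(f"Instantaneous ocean temperature rise: {delta_T:.0f} °C")  # → 125 °C
```

    The arithmetic matches: 7 × 10²⁶ J spread over ~5.6 × 10²⁴ J/K of ocean heat capacity gives the stated 125 °C.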

    Constructing quantum games from non-factorizable joint probabilities

    A probabilistic framework is developed that gives a unifying perspective on both classical and quantum games. We suggest exploiting the peculiar probabilities involved in Einstein-Podolsky-Rosen (EPR) experiments to construct quantum games. In our framework a game attains a classical interpretation when the joint probabilities are factorizable, and a quantum game arises when these probabilities cannot be factorized. We analyze how non-factorizability changes Nash equilibria in two-player games, considering the games of Prisoner's Dilemma, Stag Hunt, and Chicken. In this framework we find that for the game of Prisoner's Dilemma even non-factorizable EPR joint probabilities cannot help to escape the classical outcome of the game. For a particular version of the Chicken game, however, we find that the two non-factorizable sets of joint probabilities that maximally violate the Clauser-Horne-Shimony-Holt (CHSH) sum of correlations do indeed result in new Nash equilibria.
    Comment: Revised in light of the referee's comments; submitted to Physical Review.
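    The maximal CHSH violation the abstract invokes can be illustrated with a toy computation (not the paper's construction): singlet-state correlators E(x, y) = −cos(x − y) at the standard CHSH-optimal measurement angles exceed the classical bound of 2 that any factorizable joint distribution must obey.

```python
import math

def correlator(angle_a, angle_b):
    # Quantum expectation value for spin measurements on a singlet state.
    return -math.cos(angle_a - angle_b)

a, a2 = 0.0, math.pi / 2               # Alice's two measurement settings
b, b2 = math.pi / 4, 3 * math.pi / 4   # Bob's two measurement settings

# CHSH combination: |E(a,b) - E(a,b') + E(a',b) + E(a',b')|
chsh = abs(correlator(a, b) - correlator(a, b2)
           + correlator(a2, b) + correlator(a2, b2))
print(f"CHSH sum: {chsh:.4f}  (classical/factorizable bound: 2)")
```

    The result is 2√2 ≈ 2.8284, Tsirelson's bound, the maximal quantum violation referred to in the abstract.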

    Recoil Studies in the Reaction of 12-C Ions with the Enriched Isotope 118-Sn

    The recoil properties of the product nuclei from the interaction of 2.2 GeV/nucleon 12-C ions, delivered by the Nuclotron of the Laboratory of High Energies (LHE), Joint Institute for Nuclear Research (JINR) at Dubna, with a 118-Sn target have been studied using catcher foils. The experimental data were analyzed using the mathematical formalism of the standard two-step vector model. The results for 12-C ions are compared with those for deuterons and protons. Three different Los Alamos versions of the Quark-Gluon String Model (LAQGSM) were used for comparison with our experimental data.
    Comment: 10 pages, 6 figures; submitted to Nucl. Phys.

    Parents of psychiatrically hospitalized children: A decade of changing perceptions

    Psychiatric hospitals for children have changed dramatically during the last decade. The lengths of hospitalizations have been shortened; the psychopathology of children qualifying for admission is more severe, often with neurological or biochemical components. In some hospitals, there has been an increasing emphasis on research. All of these changes have affected the staff's perceptions of the children's parents, and they appear to have resulted in a more supportive, less critical attitude toward these parents. This may be significant in increasing parents' confidence in coping with their child's illness and their family's stress. The need for empirical, longitudinal research is emphasized.
    Peer reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/44246/1/10560_2004_Article_BF00757586.pd

    Robot life: simulation and participation in the study of evolution and social behavior.

    This paper explores the case of using robots to simulate evolution, in particular the case of Hamilton's Law. The use of robots raises several questions that this paper seeks to address. The first concerns the role of the robots in biological research: do they simulate something (life, evolution, sociality), or do they participate in something? The second question concerns the physicality of the robots: what difference does embodiment make to the role of the robot in these experiments? Thirdly, how do life, embodiment, and social behavior relate in contemporary biology, and why is it possible for robots to illuminate this relation? These questions are provoked by a strange similarity that has not been noted before: between the problem of simulation in the philosophy of science and Deleuze's reading of Plato on the relationship of ideas, copies, and simulacra.

    An Alternative Interpretation of Statistical Mechanics

    In this paper I propose an interpretation of classical statistical mechanics that centers on taking seriously the idea that probability measures represent complete states of statistical mechanical systems. I show how this leads naturally to the idea that the stochasticity of statistical mechanics is associated directly with the observables of the theory rather than with the microstates (as traditional accounts would have it). The usual assumption that microstates are representationally significant in the theory is therefore dispensable, a consequence which suggests interesting possibilities for developing non-equilibrium statistical mechanics and investigating inter-theoretic answers to the foundational questions of statistical mechanics.

    On malfunctioning software

    Artefacts do not always do what they are supposed to, for a variety of reasons, including manufacturing problems, poor maintenance, and normal wear and tear. Since software is an artefact, it should be subject to malfunctioning in the same sense in which other artefacts can malfunction. Yet whether software is on a par with other artefacts when it comes to malfunctioning crucially depends on the abstraction used in the analysis. We distinguish between “negative” and “positive” notions of malfunction. A negative malfunction, or dysfunction, occurs when an artefact token either does not (sometimes) or cannot (ever) do what it is supposed to. A positive malfunction, or misfunction, occurs when an artefact token may do what it is supposed to but, at least occasionally, also yields some unintended and undesirable effects. We argue that software, understood as type, may misfunction in some limited sense, but cannot dysfunction. Accordingly, one should distinguish software from other technical artefacts, in view of a design that makes dysfunction impossible for the former while leaving it possible for the latter.