    The one-dimensional Stefan problem with non-Fourier heat conduction

    We investigate the one-dimensional growth of a solid into a liquid bath, starting from a small crystal, using the Guyer-Krumhansl and Maxwell-Cattaneo models of heat conduction. By breaking the solidification process into the relevant time regimes we are able to reduce the problem to a system of two coupled ordinary differential equations describing the evolution of the solid-liquid interface and the heat flux. The reduced formulation is in good agreement with numerical simulations. In the case of silicon, differences between classical and non-classical solidification kinetics are relatively small, but larger deviations can be observed in the time evolution of the heat flux through the growing solid. From this study we conclude that the heat flux provides more information about the presence of non-classical modes of heat transport during phase-change processes. Comment: 29 pages, 6 figures, 2 tables + Supplementary Material
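
    The abstract does not reproduce the reduced system, but its structure can be illustrated with a minimal sketch: a Stefan condition rho*L*ds/dt = q(t) at the interface, coupled to a Maxwell-Cattaneo relaxation of the flux toward the quasi-steady Fourier value k*dT/s. All names and parameter values below are illustrative assumptions, not the paper's silicon model.

        import numpy as np
        from scipy.integrate import solve_ivp

        rho_L = 1.0       # latent heat per unit volume (rho * L), illustrative
        k, dT = 1.0, 1.0  # conductivity and imposed temperature difference
        tau = 0.1         # Maxwell-Cattaneo flux relaxation time

        def rhs(t, y):
            s, q = y                        # interface position, heat flux
            ds = q / rho_L                  # Stefan condition: rho*L*ds/dt = q
            dq = (k * dT / s - q) / tau     # flux relaxes toward Fourier value k*dT/s
            return [ds, dq]

        sol = solve_ivp(rhs, (0.0, 5.0), y0=[0.05, 0.0], dense_output=True)
        s_final, q_final = sol.y[:, -1]     # interface position and flux at t = 5

    In the limit tau -> 0 the flux relaxes instantaneously and classical Fourier behaviour is recovered, so the flux trace q(t) is where a non-classical signature would show up, consistent with the abstract's conclusion.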

    Derivation of Delay Equation Climate Models Using the Mori-Zwanzig Formalism

    Models incorporating delay have been frequently used to understand climate variability phenomena, but often the delay is introduced through ad hoc physical reasoning, such as the propagation time of waves. In this paper, the Mori-Zwanzig formalism is introduced as a way to systematically derive delay models from systems of partial differential equations and hence provides a better justification for using these delay-type models. The Mori-Zwanzig technique gives a formal rewriting of the system using a projection onto a set of resolved variables, where the rewritten system contains a memory term. The computation of this memory term requires solving the orthogonal dynamics equation, which represents the unresolved dynamics. For nonlinear systems, it is often not possible to obtain an analytical solution to the orthogonal dynamics and an approximate solution needs to be found. Here, we demonstrate the Mori-Zwanzig technique for a two-strip model of the El Niño-Southern Oscillation (ENSO) and explore methods to solve the orthogonal dynamics. The resulting nonlinear delay model contains an additional term compared to previously proposed ad hoc conceptual models. This new term leads to a larger ENSO period, which is closer to that seen in observations. Comment: Submitted to Proceedings of the Royal Society A, 25 pages, 10 figures
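
    The derived delay model itself is not given in the abstract; for orientation, the ad hoc conceptual models it extends are of the Suarez-Schopf delayed-oscillator type, dT/dt = T - T^3 - alpha*T(t - delta), which can be integrated with a simple history buffer. The sketch below is a generic illustration with made-up parameters, not the paper's two-strip model.

        import numpy as np

        def delayed_oscillator(alpha=0.75, delta=2.0, dt=1e-3, t_end=50.0):
            """Euler integration of dT/dt = T - T**3 - alpha*T(t - delta)."""
            n_delay = int(round(delta / dt))
            n = int(round(t_end / dt))
            T = np.empty(n + n_delay)
            T[:n_delay + 1] = 0.1                  # constant history up to t = 0
            for i in range(n_delay, n + n_delay - 1):
                T_lag = T[i - n_delay]             # delayed state T(t - delta)
                T[i + 1] = T[i] + dt * (T[i] - T[i]**3 - alpha * T_lag)
            return T[n_delay:]

        trajectory = delayed_oscillator()  # sustained oscillations for these values

    The delay delta sets the oscillation period; the abstract's point is that the extra Mori-Zwanzig memory term lengthens that period toward the observed ENSO timescale.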

    Computational statistics using the Bayesian Inference Engine

    This paper introduces the Bayesian Inference Engine (BIE), a general parallel, optimised software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organise and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasises hybrid tempered MCMC schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialisation system that stores the byte-level image of the running inference and previously characterised posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible, object-oriented framework that implements every aspect of the Bayesian inference workflow. Because the BIE provides a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical details and download details are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU GPL. Comment: Resubmitted version.
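
    The BIE's tempering schemes are more elaborate than this, but the core idea of tempered MCMC can be shown in a short generic sketch: chains run at several inverse temperatures beta and occasionally swap states, letting hot chains carry the cold (beta = 1) chain across the barrier between posterior modes. Everything below is an illustrative assumption, not BIE code or its API.

        import numpy as np

        def tempered_mcmc(log_post, x0, betas=(1.0, 0.5, 0.25),
                          n_steps=5000, step=0.8, seed=0):
            """Metropolis chains on a temperature ladder with pairwise state swaps."""
            rng = np.random.default_rng(seed)
            x = np.array([np.array(x0, float) for _ in betas])
            lp = np.array([log_post(xi) for xi in x])
            samples = []
            for _ in range(n_steps):
                for j, beta in enumerate(betas):       # within-chain Metropolis moves
                    prop = x[j] + step * rng.normal(size=x[j].shape)
                    lp_prop = log_post(prop)
                    if np.log(rng.random()) < beta * (lp_prop - lp[j]):
                        x[j], lp[j] = prop, lp_prop
                j = rng.integers(len(betas) - 1)       # swap adjacent temperatures
                if np.log(rng.random()) < (betas[j] - betas[j + 1]) * (lp[j + 1] - lp[j]):
                    x[[j, j + 1]], lp[[j, j + 1]] = x[[j + 1, j]], lp[[j + 1, j]]
                samples.append(x[0].copy())            # keep only the beta = 1 chain
            return np.array(samples)

        # bimodal toy posterior: equal mixture of Gaussians at -3 and +3
        log_post = lambda t: np.logaddexp(-0.5 * np.sum((t - 3.0) ** 2),
                                          -0.5 * np.sum((t + 3.0) ** 2))
        draws = tempered_mcmc(log_post, x0=[0.0])

    A single untempered chain with this step size would rarely cross between the modes at +/-3; the ladder is what makes the multimodal posterior tractable, which is the behaviour the BIE relies on at much larger scale.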

    Aversion to ambiguity and model misspecification in dynamic stochastic environments

    Preferences that accommodate aversion to subjective uncertainty and its potential misspecification in dynamic settings are a valuable tool of analysis in many disciplines. By generalizing previous analyses, we propose a tractable approach to incorporating broadly conceived responses to uncertainty. We illustrate our approach on some stylized stochastic environments. By design, these discrete time environments have revealing continuous time limits. Drawing on these illustrations, we construct recursive representations of intertemporal preferences that allow for penalized and smooth ambiguity aversion to subjective uncertainty. These recursive representations imply continuous time limiting Hamilton–Jacobi–Bellman equations for solving control problems in the presence of uncertainty. Published version.
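
    For a familiar special case of the penalized ("multiplier") formulation, the limiting Hamilton-Jacobi-Bellman equation for a one-dimensional diffusion state x with drift \mu(x,c), volatility \sigma(x), discount rate \delta, and penalty parameter \theta takes the form (generic notation, a sketch rather than the paper's representation):

        \delta V(x) = \max_{c}\min_{h}\Big[\,U(c) + \big(\mu(x,c) + \sigma(x)h\big)V'(x) + \tfrac{1}{2}\sigma(x)^{2}V''(x) + \tfrac{\theta}{2}h^{2}\,\Big]

    The inner minimization over the drift distortion h yields h^{*} = -\sigma(x)V'(x)/\theta; as \theta \to \infty the penalty eliminates the concern for misspecification and the standard HJB equation is recovered.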

    Unnatural Selection: A new formal approach to punctuated equilibrium in economic systems

    Generalized Darwinian evolutionary theory has emerged as central to the description of economic process (e.g., Aldrich et al., 2008). Here we demonstrate that, just as Darwinian principles provide necessary, but not sufficient, conditions for understanding the dynamics of social entities, so the asymptotic limit theorems of information theory provide another set of necessary conditions that constrain the evolution of socioeconomic process. These latter constraints can, however, easily be formulated as a statistics-like analytic toolbox for the study of empirical data that is consistent with a generalized Darwinism, and this is no small thing.
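
    The abstract does not name the specific limit theorems; the prototype is the Shannon-McMillan-Breiman theorem, which for a stationary ergodic information source states that

        -\frac{1}{n}\log p(X_{1},\dots,X_{n}) \to H \quad \text{almost surely as } n \to \infty

    where H is the entropy rate. Observed paths must concentrate near this rate, and it is path constraints of this kind that supply the additional necessary conditions invoked here.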

    Experimental and Theoretical Challenges in the Search for the Quark Gluon Plasma: The STAR Collaboration's Critical Assessment of the Evidence from RHIC Collisions

    We review the most important experimental results from the first three years of nucleus-nucleus collision studies at RHIC, with emphasis on results from the STAR experiment, and we assess their interpretation and comparison to theory. The theory-experiment comparison suggests that central Au+Au collisions at RHIC produce dense, rapidly thermalizing matter characterized by: (1) initial energy densities above the critical values predicted by lattice QCD for establishment of a Quark-Gluon Plasma (QGP); (2) nearly ideal fluid flow, marked by constituent interactions of very short mean free path, established most probably at a stage preceding hadron formation; and (3) opacity to jets. Many of the observations are consistent with models incorporating QGP formation in the early collision stages, and have not found ready explanation in a hadronic framework. However, the measurements themselves do not yet establish unequivocal evidence for a transition to this new form of matter. The theoretical treatment of the collision evolution, despite impressive successes, invokes a suite of distinct models, degrees of freedom and assumptions of as yet unknown quantitative consequence. We pose a set of important open questions, and suggest additional measurements, at least some of which should be addressed in order to establish a compelling basis to conclude definitively that thermalized, deconfined quark-gluon matter has been produced at RHIC. Comment: 101 pages, 37 figures; revised version to Nucl. Phys.
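
    The initial energy densities referred to in (1) are conventionally obtained from the Bjorken estimate (a standard formula, not specific to this assessment), which converts the measured transverse energy per unit rapidity into an energy density at a formation time \tau_{0}:

        \epsilon_{\mathrm{Bj}} = \frac{1}{\tau_{0}\,\pi R^{2}}\frac{dE_{T}}{dy}

    Here \pi R^{2} is the transverse overlap area of the colliding nuclei. With the customary \tau_{0} \simeq 1 fm/c, central Au+Au collisions at RHIC give several GeV/fm^{3}, above the lattice-QCD deconfinement scale of order 1 GeV/fm^{3}.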

    A High-Low Model of Daily Stock Price Ranges

    We observe that daily highs and lows of stock prices do not diverge over time and, hence, adopt the cointegration concept and the related vector error correction model (VECM) to model the daily high, the daily low, and the associated daily range data. The in-sample results attest to the importance of incorporating high-low interactions in modeling the range variable. In evaluating the out-of-sample forecast performance using both mean-squared forecast error and direction-of-change criteria, it is found that the VECM-based low and high forecasts offer some advantages over some alternative forecasts. The VECM-based range forecasts, on the other hand, do not always dominate; the forecast rankings depend on the choice of evaluation criterion and the variables being forecasted. Keywords: daily high, daily low, VECM model, forecast performance, implied volatility
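
    As a minimal sketch of the modeling step, the statsmodels VECM implementation can be fit to synthetic high/low series that share one common stochastic trend; the data construction and every setting below are illustrative, not the paper's specification.

        import numpy as np
        from statsmodels.tsa.vector_ar.vecm import VECM

        rng = np.random.default_rng(0)
        n = 500
        mid = np.cumsum(rng.normal(size=n))            # common stochastic trend
        high = mid + np.abs(rng.normal(0.5, 0.1, n))   # synthetic daily highs
        low = mid - np.abs(rng.normal(0.5, 0.1, n))    # synthetic daily lows
        data = np.column_stack([high, low])            # cointegrated pair

        model = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="co")
        res = model.fit()
        fc = res.predict(steps=5)                      # joint high/low forecasts
        range_fc = fc[:, 0] - fc[:, 1]                 # implied range forecasts

    The error-correction term pulls the high and low back toward their long-run spread, which is exactly the high-low interaction the in-sample results find important.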

    Cooling Rates for Relativistic Electrons Undergoing Compton Scattering in Strong Magnetic Fields

    For inner magnetospheric models of hard X-ray and gamma-ray emission in high-field pulsars and magnetars, resonant Compton upscattering is anticipated to be the most efficient process for generating continuum radiation. This is due in part to the proximity of a hot soft photon bath from the stellar surface to putative radiation dissipation regions in the inner magnetosphere. Moreover, because the scattering process becomes resonant at the cyclotron frequency, the effective cross section exceeds the classical Thomson value by over two orders of magnitude, thereby enhancing the efficiency of continuum production and the cooling of relativistic electrons. This paper presents computations of the electron cooling rates for this process, which are needed for resonant Compton models of non-thermal radiation from such highly-magnetized pulsars. The computed rates extend previous calculations of magnetic Thomson cooling to the domain of relativistic quantum effects, sampled near and above the quantum critical magnetic field of 44.13 TeraGauss. This is the first exposition of fully relativistic, quantum magnetic Compton cooling rates for electrons, and it employs both the traditional Johnson and Lippmann cross section, and a newer Sokolov and Ternov (ST) formulation of Compton scattering in strong magnetic fields. The ST formalism is formally correct for treating spin-dependent effects that are important in the cyclotron resonance, and has not been addressed before in the context of cooling by Compton scattering. The QED effects are observed to profoundly lower the rates below extrapolations of the familiar magnetic Thomson results, as expected, when recoil and Klein-Nishina reductions become important. Comment: 33 pages, 11 figures, accepted for publication in The Astrophysical Journal
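
    For reference, the quantum critical field quoted here follows from standard definitions (not specific to this paper):

        B_{\rm cr} = \frac{m_{e}^{2}c^{3}}{e\hbar} \simeq 4.413\times10^{13}\,{\rm G} = 44.13\ {\rm TeraGauss}

    This is the field at which the electron cyclotron energy \hbar\omega_{B} = \hbar eB/(m_{e}c) reaches m_{e}c^{2}, and the scattering resonance arises when the incident photon energy matches \hbar\omega_{B} in the electron rest frame.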

    Matter in extremis: ultrarelativistic nuclear collisions at RHIC

    We review the physics of nuclear matter at high energy density and the experimental search for the Quark-Gluon Plasma at the Relativistic Heavy Ion Collider (RHIC). The data obtained in the first three years of the RHIC physics program provide several lines of evidence that a novel state of matter has been created in the most violent, head-on collisions of Au+Au nuclei at \sqrt{s}=200 GeV. Jet quenching and global measurements show that the initial energy density of the strongly interacting medium generated in the collision is about two orders of magnitude larger than that of cold nuclear matter, well above the critical density for the deconfinement phase transition predicted by lattice QCD. The observed collective flow patterns imply that the system thermalizes early in its evolution, with the dynamics of its expansion consistent with ideal hydrodynamic flow based on a Quark-Gluon Plasma equation of state. Comment: 93 pages, 46 figures; final version for journal incorporating minor changes and corrections
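
    The ideal-hydrodynamic description invoked here rests on the standard relativistic ideal-fluid equations,

        \partial_{\mu}T^{\mu\nu} = 0, \qquad T^{\mu\nu} = (\epsilon + p)\,u^{\mu}u^{\nu} - p\,g^{\mu\nu}

    closed by an equation of state p = p(\epsilon). The agreement of the measured flow patterns with solutions that use a Quark-Gluon Plasma equation of state and no viscous corrections is the sense in which the expansion is consistent with ideal hydrodynamic flow.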