
    Objective Reconstructions of the Late Wisconsinan Laurentide Ice Sheet and the Significance of Deformable Beds

    A three-dimensional steady-state plastic ice model; the present surface topography (on a 50 km grid); a recent consensus of the Late Wisconsinan maximum margin (PREST, 1984); and a simple map of ice yield stress are used to model the Laurentide Ice Sheet. A multi-domed, asymmetric reconstruction is computed without prior assumptions about flow lines. The effects of possible deforming beds are modelled by using the very low yield stress values suggested by MATHEWS (1974). Because of the low yield stress (deforming beds), the model generates thin ice on the Prairies, in the Great Lakes area and, in one case, over Hudson Bay. Introduction of low yield stress (deformable) regions also produces low surface slopes and abrupt changes in ice flow direction. In certain circumstances, large ice streams are generated along the boundaries between normal yield stress ice (non-deformable beds) and low yield stress ice (deformable beds). The computer models are discussed in reference to the geologically based reconstructions of SHILTS (1980) and DYKE et al. (1982).
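    As a point of reference for how yield stress controls the modelled thickness, here is a minimal sketch of the standard perfectly plastic profile (not the paper's 3-D model); the yield-stress values are illustrative placeholders, with the low value only loosely in the range argued for by MATHEWS (1974):

```python
# Perfectly plastic ice-sheet profile over a flat bed:
#   h(x) = sqrt(2 * tau0 * x / (rho * g)),  x = distance inland from the margin.
# Lower yield stress tau0 (a proxy for a deforming bed) gives thinner ice and
# gentler surface slopes. Parameter values are illustrative, not the paper's.
import math

RHO_ICE = 910.0   # ice density, kg m^-3
G = 9.81          # gravity, m s^-2

def plastic_thickness(tau0_pa: float, distance_m: float) -> float:
    """Ice thickness a given distance inland of the margin for yield stress tau0."""
    return math.sqrt(2.0 * tau0_pa * distance_m / (RHO_ICE * G))

for tau0 in (100e3, 10e3):                 # "normal" vs very low yield stress (Pa)
    h = plastic_thickness(tau0, 500e3)     # 500 km inland
    print(f"tau0 = {tau0 / 1e3:5.0f} kPa -> thickness ~ {h:5.0f} m")
```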

    Climate Noise Influences Ice Sheet Mean State

    Evidence from proxy records indicates that millennial‐scale abrupt climate shifts, called Dansgaard‐Oeschger events, happened during past glacial cycles. Various studies have been conducted to uncover the physical mechanism behind them, based on the assumption that the climate mean state determines the variability. However, our study shows that the Dansgaard‐Oeschger events can regulate the mean state of the Northern Hemisphere ice sheets. Sensitivity experiments show that the simulated mean state is influenced by the amplitude of the climatic noise. The most likely cause of this phenomenon is the nonlinear response of the surface mass balance to temperature. This nonlinearity could also cause retreat processes to be faster than buildup processes within a glacial cycle. We propose that climate variability hindered ice sheet development and prevented the Earth system from entering a full glacial state from Marine Isotope Stage 4 to Marine Isotope Stage 3, about 60,000 years ago.
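    A minimal sketch of the stated mechanism (assumed positive-degree-day-style melt function and noise amplitudes, not the study's model): because melt responds nonlinearly to temperature, symmetric noise around an unchanged mean temperature raises the time-averaged melt, so the noise amplitude alone can shift the mean surface mass balance.

```python
# Nonlinear (thresholded) melt response: symmetric temperature noise around a
# fixed mean increases mean melt, biasing the surface mass balance.
# Melt model and parameters are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def melt(temp_c: np.ndarray, ddf: float = 8.0) -> np.ndarray:
    """Positive-degree-day-style melt: proportional to temperature above 0 degC."""
    return ddf * np.maximum(temp_c, 0.0)

mean_temp = -2.0                       # degC, below the melt threshold
for noise_std in (0.0, 2.0, 5.0):      # increasing climate-noise amplitude (K)
    temps = mean_temp + noise_std * rng.standard_normal(100_000)
    print(f"noise std = {noise_std:3.1f} K -> mean melt = {melt(temps).mean():6.2f} (arb. units)")
```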

    Developed turbulence: From full simulations to full mode reductions

    Developed Navier-Stokes turbulence is simulated with varying wavevector mode reductions. The flatness and the skewness of the velocity derivative depend on the degree of mode reduction. They show a crossover towards the value of the full numerical simulation when the viscous subrange starts to be resolved. The intermittency corrections of the scaling exponents of the p-th order velocity structure functions seem to depend mainly on the proper resolution of the inertial subrange. Universal scaling properties (i.e., independent of the degree of mode reduction) are found for the relative scaling exponents rho, which were recently defined by Benzi et al.
    Comment: 4 pages, 5 eps-figures, replaces version from August 5th, 199
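    For readers unfamiliar with the quantities involved, a generic sketch of p-th order structure functions and ESS-style relative exponents in the sense of Benzi et al. (the synthetic 1-D signal below is only a stand-in for simulation output; this is not the paper's analysis code):

```python
# S_p(r) = <|v(x + r) - v(x)|^p>; the "relative" exponent of order p is the
# slope of log S_p against log S_3 (extended self-similarity, Benzi et al.).
# The synthetic signal is a stand-in for real turbulence data.
import numpy as np

rng = np.random.default_rng(1)
v = np.cumsum(rng.standard_normal(2**16))        # toy 1-D "velocity" signal

def structure_function(v: np.ndarray, r: int, p: int) -> float:
    dv = v[r:] - v[:-r]                          # increments at separation r
    return float(np.mean(np.abs(dv) ** p))

separations = [2**k for k in range(1, 10)]
S3 = np.array([structure_function(v, r, 3) for r in separations])
for p in (2, 4, 6):
    Sp = np.array([structure_function(v, r, p) for r in separations])
    rel_exp = np.polyfit(np.log(S3), np.log(Sp), 1)[0]
    print(f"p = {p}: relative scaling exponent ~ {rel_exp:.2f}")
```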

    A geometrical origin for the covariant entropy bound

    Causal diamond-shaped subsets of space-time are naturally associated with operator algebras in quantum field theory, and they are also related to the Bousso covariant entropy bound. In this work we argue that the net of causal sets to which the local operator algebras of quantum theories are assigned should be taken to be non-orthomodular if there is some lowest scale for the description of space-time as a manifold. This geometry can be related to a holographic-type reduction in the degrees of freedom under certain natural conditions for the local algebras. A non-orthomodular net of causal sets that implements the cutoff in a covariant manner is constructed. It explains, in a simple example, the non-positive expansion condition for light-sheet selection in the covariant entropy bound. It also suggests a different covariant formulation of the entropy bound.
    Comment: 20 pages, 8 figures, final version
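    For reference, the standard statement of the bound and the light-sheet condition the abstract refers to (textbook form, not a result of this paper):

```latex
% Covariant (Bousso) entropy bound for a codimension-2 surface B of area A(B):
S\bigl[L(B)\bigr] \;\le\; \frac{A(B)}{4 G \hbar},
% where the light-sheet L(B) is generated by null geodesics leaving B with
% non-positive expansion, i.e. non-increasing cross-sectional area \mathcal{A}:
\theta \;=\; \frac{1}{\mathcal{A}}\frac{d\mathcal{A}}{d\lambda} \;\le\; 0 .
```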

    Hyperentangled States

    We investigate a new class of entangled states, which we call 'hyperentangled', that have EPR correlations identical to those in the vacuum state of a relativistic quantum field. We show that whenever hyperentangled states exist in any quantum theory, they are dense in its state space. We also give prescriptions for constructing hyperentangled states that involve an arbitrarily large collection of systems.
    Comment: 23 pages, LaTeX, Submitted to Physical Review

    Massive Vector Mesons and Gauge Theory

    We show that the requirements of renormalizability and physical consistency imposed on perturbative interactions of massive vector mesons fix the theory essentially uniquely. In particular, physical consistency demands the presence of at least one additional physical degree of freedom which was not part of the originally required physical particle content. In its simplest realization (probably the only one) these are scalar fields as envisaged by Higgs, but in the present formulation without the "symmetry-breaking Higgs condensate". The final result agrees precisely with the usual quantization of a classical gauge theory by means of the Higgs mechanism. Our method proves an old conjecture of Cornwall, Levin and Tiktopoulos stating that the renormalization and consistency requirements of spin-1 particles lead to the gauge theory structure (i.e. a kind of inverse of 't Hooft's famous renormalizability proof in quantized gauge theories), which was based on the on-shell unitarity of the S-matrix. We also speculate on a possible future ghost-free formulation which avoids "field coordinates" altogether and is expected to reconcile the on-shell S-matrix point of view with the off-shell field theory structure.
    Comment: 53 pages, version to appear in J. Phys.
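    Standard background for why massive spin-1 interactions are so constrained (textbook material, not taken from the paper): the free massive vector (Proca) Lagrangian and its propagator,

```latex
% Free massive vector (Proca) field:
\mathcal{L} \;=\; -\tfrac{1}{4} F_{\mu\nu} F^{\mu\nu} + \tfrac{1}{2} m^{2} A_{\mu} A^{\mu},
\qquad F_{\mu\nu} = \partial_{\mu} A_{\nu} - \partial_{\nu} A_{\mu},
% whose propagator
\frac{-i}{k^{2} - m^{2}} \left( g_{\mu\nu} - \frac{k_{\mu} k_{\nu}}{m^{2}} \right)
% does not fall off at large k because of the k_mu k_nu / m^2 term; this is the
% power-counting obstruction that the additional scalar degree of freedom cures.
```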

    Boundary conditions in the Unruh problem

    We have analyzed the Unruh problem in the frame of quantum field theory and have shown that the Unruh quantization scheme is valid in the double Rindler wedge rather than in Minkowski spacetime. The double Rindler wedge is composed of two disjoint regions (the R- and L-wedges of Minkowski spacetime) which are causally separated from each other. Moreover, the Unruh construction implies the existence of a boundary condition at the common edge of the R- and L-wedges in Minkowski spacetime. Such a boundary condition may be interpreted as a topological obstacle which gives rise to a superselection rule prohibiting any correlations between r- and l-Unruh particles. Thus the part of the field from the L-wedge can in no way influence a Rindler observer living in the R-wedge, and therefore elimination of the invisible "left" degrees of freedom will have no effect for him. Hence averaging over states of the field in one wedge cannot lead to thermalization of the state in the other. This result is proved in both the standard and algebraic formulations of quantum field theory, and we conclude that the principles of quantum field theory do not give any grounds for the existence of the "Unruh effect".
    Comment: 31 pages, 1 figure
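    For orientation, the standard definitions behind the wedge terminology (not specific to this paper; the temperature formula is the usual claim the authors dispute):

```latex
% Right and left Rindler wedges of Minkowski spacetime (1+1 notation):
R = \{\, (t, x) : x > |t| \,\}, \qquad L = \{\, (t, x) : x < -|t| \,\},
% world line of a uniformly accelerated (Rindler) observer confined to R:
t(\tau) = a^{-1} \sinh(a\tau), \qquad x(\tau) = a^{-1} \cosh(a\tau),
% usually quoted Unruh temperature (units \hbar = c = k_{B} = 1):
T_{U} = \frac{a}{2\pi}.
```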

    Deterministic and stochastic descriptions of gene expression dynamics

    A key goal of systems biology is the predictive mathematical description of gene regulatory circuits. Different approaches are used, such as deterministic and stochastic models, or models that describe cell growth and division explicitly or implicitly. Here we consider simple systems of unregulated (constitutive) gene expression and compare different mathematical descriptions systematically to obtain insight into the errors that are introduced by various common approximations, such as describing cell growth and division by an effective protein degradation term. In particular, we show that the population average of the protein content of a cell exhibits a subtle dependence on the dynamics of growth and division, the specific model for volume growth, and the age structure of the population. Nevertheless, the error made by models with implicit cell growth and division is quite small. Furthermore, we compare various models that are partially stochastic to investigate the impact of different sources of (intrinsic) noise. This comparison indicates that different sources of noise (protein synthesis, partitioning in cell division) contribute comparable amounts of noise if protein synthesis is not or only weakly bursty. If protein synthesis is very bursty, the burstiness is the dominant noise source, independent of other details of the model. Finally, we discuss two sources of extrinsic noise: cell-to-cell variations in protein content due to cells being at different stages in the division cycle, which we show to be small (for the protein concentration and, surprisingly, also for the protein copy number per cell), and fluctuations in the growth rate, which can have a significant impact.
    Comment: 23 pages, 5 figures; Journal of Statistical Physics (2012)
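    A minimal sketch of the kind of comparison described (a toy constitutive-expression model with assumed parameter names k and T_div, not the authors' code): explicit growth and division versus an implicit model in which dilution appears as an effective degradation rate ln(2)/T_div; for a stable protein the two give similar, but not identical, average protein content.

```python
# Constitutive expression: explicit division (content halved every T_div)
# versus implicit dilution with effective degradation gamma = ln(2) / T_div.
# Toy parameters, illustrative only.
import math
import numpy as np

k = 10.0       # synthesis rate (molecules / min)
T_div = 30.0   # division time (min)
dt = 0.01      # time step (min)

steps_per_div = int(round(T_div / dt))
p, trace = 0.0, []
for step in range(20 * steps_per_div):     # 20 generations
    p += k * dt                            # constitutive production
    if (step + 1) % steps_per_div == 0:    # division: protein content halved
        p *= 0.5
    trace.append(p)
mean_explicit = np.mean(trace[len(trace) // 2:])   # average over later generations

gamma = math.log(2.0) / T_div              # dilution as effective degradation
mean_implicit = k / gamma                  # steady state of dp/dt = k - gamma * p

print(f"explicit growth/division: mean protein ~ {mean_explicit:6.1f}")
print(f"implicit dilution model:  mean protein ~ {mean_implicit:6.1f}")
```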

    Iceberg calving during transition from grounded to floating ice: Columbia Glacier, Alaska

    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/95521/1/grl27053-sup-0003-fs02.pdf
    http://deepblue.lib.umich.edu/bitstream/2027.42/95521/2/grl27053-sup-0002-fs01.pdf
    http://deepblue.lib.umich.edu/bitstream/2027.42/95521/3/grl27053-sup-0005-txts01.pdf
    http://deepblue.lib.umich.edu/bitstream/2027.42/95521/4/grl27053.pd

    Vacuum Fluctuations, Geometric Modular Action and Relativistic Quantum Information Theory

    A summary of some lines of ideas leading to model-independent frameworks of relativistic quantum field theory is given. It is followed by a discussion of the Reeh-Schlieder theorem and geometric modular action of Tomita-Takesaki modular objects associated with the quantum field vacuum state and certain algebras of observables. The distillability concept, which is significant in specifying useful entanglement in quantum information theory, is discussed within the setting of general relativistic quantum field theory.
    Comment: 26 pages. Contribution for the Proceedings of a Conference on Special Relativity held at Potsdam, 200
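    As a pointer to what "geometric modular action" refers to here (the standard Tomita-Takesaki / Bisognano-Wichmann setup, stated up to sign and parametrization conventions, not a result of this contribution):

```latex
% Tomita--Takesaki modular objects (\Delta, J) for a wedge algebra \mathcal{A}(W)
% with the vacuum \Omega as cyclic and separating vector (Reeh--Schlieder):
S A \Omega = A^{*} \Omega \quad (A \in \mathcal{A}(W)), \qquad S = J \, \Delta^{1/2},
% Bisognano--Wichmann: the modular group acts geometrically as the wedge-preserving
% boosts, \Delta^{it} = U(\Lambda_{W}(2\pi t)) (sign convention dependent), and J maps
% \mathcal{A}(W) onto its commutant \mathcal{A}(W)'.
```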