
    Correlation measurements in high-multiplicity events

    Requirements for correlation measurements in high-multiplicity events are discussed. Attention is focused on the detection of so-called hot spots, two-particle rapidity correlations, two-particle momentum correlations (for quantum interferometry) and higher-order correlations. The signal-to-noise ratio may become large in the high-multiplicity limit, allowing meaningful single-event measurements, only if the correlations are due to collective behavior. Comment: MN 55455, 20 pages, KSUCNR-011-92 and TPI-MINN-92/47-T (revised). Revised to correct typo in equation (30), and to fill in a few steps in calculations. Now published as Phys. Rev. C 47 (1993) 232
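    As a concrete illustration of the kind of observable discussed here, the sketch below estimates a two-particle rapidity correlation C(y1, y2) = <n(y1) n(y2)> - <n(y1)><n(y2)> from an ensemble of simulated events. The toy event generator, multiplicity, and bin choices are placeholder assumptions, not the paper's model; the event-wise rapidity shift stands in for a generic collective correlation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_events, n_bins = 5000, 20
edges = np.linspace(-3.0, 3.0, n_bins + 1)

def fake_event():
    """Placeholder event: Poisson multiplicity, Gaussian rapidities,
    with a common event-wise shift mimicking collective behavior."""
    mult = rng.poisson(200)
    shift = rng.normal(0.0, 0.3)
    return rng.normal(shift, 1.0, size=mult)

# Histogram every event, then form C2_ij = <n_i n_j> - <n_i><n_j>.
counts = np.array([np.histogram(fake_event(), bins=edges)[0]
                   for _ in range(n_events)], dtype=float)
mean_n = counts.mean(axis=0)
c2 = (counts[:, :, None] * counts[:, None, :]).mean(axis=0) \
     - np.outer(mean_n, mean_n)
print(c2[n_bins // 2, n_bins // 2])   # excess variance in the central bin
```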

    Mapping the Arnold web with a GPU-supercomputer

    Arnold diffusion is a dynamical phenomenon which may occur in the phase space of a non-integrable Hamiltonian system whenever the number of degrees of freedom is M ≥ 3. The diffusion is mediated by a web-like structure of resonance channels which penetrates the phase space and allows the system to explore the whole energy shell. Arnold diffusion is a slow process; consequently, mapping the web is a very time-consuming task. We demonstrate that exploring the Arnold web with a graphics processing unit (GPU) supercomputer can yield distinct speedups of two orders of magnitude compared to standard CPU-based simulations. Comment: 7 pages, 4 figures, a supplementary video is provided at http://www.physik.uni-augsburg.de/~seiberar/arnold/Energy15_HD_frontNback.av
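    The paper's Hamiltonian and chaos indicator are not reproduced here. As a hedged stand-in, the sketch below evolves a whole grid of initial conditions of the coupled standard (Froeschle-type) map at once and computes a finite-time Lyapunov estimate for each; this data-parallel pattern over the grid is exactly what a GPU implementation would exploit. All parameter values are assumptions for the sketch.

```python
import numpy as np

K1, K2, B = 0.6, 0.6, 0.05    # assumed map parameters, not from the paper
steps, eps = 1000, 1e-8

def step(x1, p1, x2, p2):
    """One iteration of two coupled standard maps (a 4D symplectic map)."""
    c = B * np.sin(x1 + x2)            # coupling term shared by both maps
    p1 = p1 + K1 * np.sin(x1) + c
    p2 = p2 + K2 * np.sin(x2) + c
    return x1 + p1, p1, x2 + p2, p2

# Whole grid of initial momenta evolved at once (data-parallel, as on a GPU).
p1, p2 = np.meshgrid(np.linspace(0, np.pi, 200), np.linspace(0, np.pi, 200))
x1 = np.zeros_like(p1); x2 = np.zeros_like(p2)
y1, q1, y2, q2 = x1 + eps, p1.copy(), x2.copy(), p2.copy()  # shadow orbits

lyap = np.zeros_like(p1)
for _ in range(steps):
    x1, p1, x2, p2 = step(x1, p1, x2, p2)
    y1, q1, y2, q2 = step(y1, q1, y2, q2)
    d = np.sqrt((y1-x1)**2 + (q1-p1)**2 + (y2-x2)**2 + (q2-p2)**2)
    lyap += np.log(d / eps)
    # Renormalize the separation back to eps along its current direction.
    s = eps / d
    y1 = x1 + (y1-x1)*s; q1 = p1 + (q1-p1)*s
    y2 = x2 + (y2-x2)*s; q2 = p2 + (q2-p2)*s
lyap /= steps   # large values trace the chaotic channels of the Arnold web
```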

    Estimation of the vertical profile of sulfur dioxide injection into the atmosphere by a volcanic eruption using satellite column measurements and inverse transport modeling

    An analytical inversion method has been developed to estimate the vertical profile of SO2 emissions from volcanic eruptions. The method uses satellite-observed total SO2 columns and an atmospheric transport model (FLEXPART) to exploit the fact that winds change with altitude; thus, the position and shape of the volcanic plume bear information on its emission altitude. The method finds the vertical emission distribution which minimizes the total difference between simulated and observed SO2 columns while also considering a priori information. We have tested the method with the eruption of Jebel at Tair on 30 September 2007, for which a comprehensive observational data set from various satellite instruments (AIRS, OMI, SEVIRI, CALIPSO) is available. Using satellite data from the first 24 h after the eruption for the inversion, we found an emission maximum near 16 km above sea level (a.s.l.) and secondary maxima near 5, 9, 12 and 14 km a.s.l.; 60% of the emission occurred above the tropopause. The emission profile obtained in the inversion was then used to simulate the transport of the plume over the following week. The modeled plume agrees very well with SO2 total columns observed by OMI, and its altitude and width agree mostly within 1-2 km with CALIPSO observations of stratospheric aerosol produced from the SO2. The inversion result is robust against various changes in both the a priori and the observations. Even when using only SEVIRI data from the first 15 h after the eruption, the emission profile was reasonably well estimated. The method is computationally very fast. It is therefore suitable for implementation within an operational environment, such as the Volcanic Ash Advisory Centers, to predict the threat posed by volcanic ash to air traffic. It could also be helpful for assessing the sulfur input into the stratosphere, be it in the context of volcanic processes or for proposed geo-engineering techniques to counteract global warming
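    The abstract describes a least-squares inversion constrained by a priori information. A minimal sketch of that general idea (not the paper's exact formulation) is a Tikhonov-regularized solve, where M maps emissions per altitude layer to observed columns and x_a is the a priori profile; all names, dimensions, and the weight lam below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_layers, n_obs = 25, 400

# M[i, j]: modeled column at pixel i per unit emission in layer j
# (in the paper this would come from transport-model runs; here random).
M = rng.random((n_obs, n_layers))
x_true = np.exp(-0.5 * ((np.arange(n_layers) - 16) / 2.0) ** 2)
y = M @ x_true + rng.normal(0, 0.05, n_obs)    # synthetic observations

x_a = np.full(n_layers, x_true.mean())         # flat a priori profile
lam = 1.0                                      # a priori weight

# Minimize ||M x - y||^2 + lam * ||x - x_a||^2 via the normal equations.
A = M.T @ M + lam * np.eye(n_layers)
x_hat = np.linalg.solve(A, M.T @ y + lam * x_a)
x_hat = np.clip(x_hat, 0.0, None)              # emissions are non-negative
```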

    What's Decidable About Sequences?

    We present a first-order theory of sequences with integer elements, Presburger arithmetic, and regular constraints, which can model significant properties of data structures such as arrays and lists. We give a decision procedure for the quantifier-free fragment, based on an encoding into the first-order theory of concatenation; the procedure has PSPACE complexity. The quantifier-free fragment of the theory of sequences can express properties such as sortedness and injectivity, as well as Boolean combinations of periodic and arithmetic facts relating the elements of the sequence and their positions (e.g., "for all even i's, the element at position i has value i+3 or 2i"). The resulting expressive power is orthogonal to that of the most expressive decidable logics for arrays. Some examples demonstrate that the fragment is also suitable to reason about sequence-manipulating programs within the standard framework of axiomatic semantics. Comment: Fixed a few lapses in the Mergesort example
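    To make the expressible properties concrete, here is an illustrative rendering (hypothetical executable checks, not the paper's decision procedure) of sortedness, injectivity, and the quoted periodic-arithmetic fact over a concrete sequence:

```python
def sorted_seq(a):
    """Sortedness: a[i] <= a[i+1] for all valid i."""
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def injective(a):
    """Injectivity: no two positions hold the same value."""
    return len(set(a)) == len(a)

def even_positions_ok(a):
    """'For all even i, the element at position i is i+3 or 2i.'"""
    return all(a[i] == i + 3 or a[i] == 2 * i
               for i in range(0, len(a), 2))

a = [3, 4, 4, 10, 8]       # a[0] = 0+3, a[2] = 2*2, a[4] = 2*4
print(sorted_seq(a), injective(a), even_positions_ok(a))
```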

    Synthetic Mudscapes: Human Interventions in Deltaic Land Building

    In order to defend infrastructure, economy, and settlement in Southeast Louisiana, we must construct new land to mitigate increasing risk. Links between urban environments and economic drivers have constrained the dynamic delta landscape for generations, now threatening to undermine the ecological fitness of the entire region. Static methods of measuring, controlling, and valuing land fail in an environment that is constantly in flux; change and indeterminacy are denied by traditional inhabitation. Multiple land building practices reintroduce deltaic fluctuation and strategic deposition of fertile material to form the foundations of a multi-layered defence strategy. Manufactured marshlands reduce exposure to storm surge further inland. Virtual monitoring and communication networks inform design decisions, and land use becomes determined by its ecological health. Mudscapes at the threshold of land and water place new value on former wastelands. The social, economic, and ecological evolution of the region is defended by an expanded web of growing land

    The value of multiple data set calibration versus model complexity for improving the performance of hydrological models in mountain catchments

    The assessment of snow, glacier, and rainfall runoff contributions to discharge in mountain streams is of major importance for adequate water resource management. Such contributions can be estimated via hydrological models, provided that the modeling adequately accounts for snow and glacier melt as well as rainfall runoff. We present a multiple data set calibration approach to estimate runoff composition using hydrological models with three levels of complexity. For this purpose, the code of the conceptual runoff model HBV-light was enhanced to allow calibration and validation of simulations against glacier mass balances, satellite-derived snow cover area, and measured discharge. The three levels of model complexity were applied to glacierized catchments in Switzerland ranging from 39 to 103 km². The results indicate that all three observational data sets are reproduced adequately by the model, allowing an accurate estimation of the runoff composition in the three mountain streams. However, calibration against runoff alone leads to unrealistic snow and glacier melt rates. Based on these results, we recommend using all three observational data sets to constrain model parameters and compute the snow, glacier, and rain contributions. Finally, based on the comparison of model performance across the different complexities, we postulate that the availability and use of different data sets to calibrate hydrological models may be more important than model complexity for achieving realistic estimates of runoff composition
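    A hedged sketch of the multiple-data-set idea: combine separate goodness-of-fit scores for discharge, snow cover area, and glacier mass balance into one calibration objective. The metric, weights, and synthetic series below are illustrative choices, not those used with HBV-light.

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, < 0 worse than the mean."""
    sim, obs = np.asarray(sim), np.asarray(obs)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def multi_objective(sims, observations, weights=(0.5, 0.3, 0.2)):
    """Weighted score over discharge, snow cover area, and mass balance;
    maximizing this constrains the melt parameters jointly."""
    return sum(w * nse(s, o)
               for w, s, o in zip(weights, sims, observations))

# Toy demonstration with synthetic series standing in for model output and
# for the discharge / snow-cover / mass-balance observations.
rng = np.random.default_rng(3)
obs = [rng.random(365), rng.random(52), rng.random(10)]
sim = [o + rng.normal(0, 0.1, o.size) for o in obs]
print(multi_objective(sim, obs))
```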

    A conservative reconstruction scheme for the interpolation of extensive quantities in the Lagrangian particle dispersion model FLEXPART

    Lagrangian particle dispersion models require interpolation of all meteorological input variables to the position in space and time of computational particles. The widely used model FLEXPART uses linear interpolation for this purpose, implying that the discrete input fields contain point values. As this is not the case for precipitation (and other fluxes), which represent cell averages or integrals, a preprocessing scheme is applied which ensures the conservation of the integral quantity with the linear interpolation in FLEXPART, at least for the temporal dimension. However, this mass conservation is not ensured per grid cell, and the scheme thus has undesirable properties such as temporal smoothing of the precipitation rates. Therefore, a new reconstruction algorithm was developed, in two variants. It introduces additional supporting grid points in each time interval and is to be used with a piecewise linear interpolation to reconstruct the precipitation time series in FLEXPART. It fulfils the desired requirements by preserving the integral precipitation in each time interval, guaranteeing continuity at interval boundaries, and maintaining non-negativity. The function values of the reconstruction algorithm at the sub-grid and boundary grid points constitute the degrees of freedom, which can be prescribed in various ways. With the requirements mentioned, it was possible to derive a suitable piecewise linear reconstruction. To improve the monotonicity behaviour, two versions of a filter were also developed that form a part of the final algorithm. Currently, the algorithm is meant primarily for the temporal dimension. It was shown to significantly improve the reconstruction of hourly precipitation time series from 3-hourly input data. Preliminary considerations for the extension to additional dimensions are also included, along with suggestions for a range of possible applications beyond the case of precipitation in a Lagrangian particle model
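    The following is a minimal sketch of a conservative piecewise-linear reconstruction in the spirit described here, with one added supporting point per interval. The boundary-value choice (the minimum of adjacent interval averages, which also guarantees non-negativity) is an assumption for the sketch, not the published algorithm; the midpoint value is then fixed by requiring that each interval's integral is preserved exactly.

```python
import numpy as np

def reconstruct(avgs, dt=3.0):
    """Piecewise-linear reconstruction of non-negative interval averages
    that preserves each interval's integral, is continuous at interval
    boundaries, and stays non-negative. Returns node times and values."""
    p = np.asarray(avgs, dtype=float)
    n = len(p)
    # Boundary values: min of adjacent averages (zero at the outer edges).
    padded = np.concatenate(([0.0], p, [0.0]))
    f = np.minimum(padded[:-1], padded[1:])          # n + 1 boundary values
    # Midpoint value fixing the integral over interval i:
    # (f_i + 2 m_i + f_{i+1}) / 4 = p_i  =>  m_i = (4 p_i - f_i - f_{i+1}) / 2
    m = (4.0 * p - f[:-1] - f[1:]) / 2.0
    t = np.empty(2 * n + 1); v = np.empty(2 * n + 1)
    t[0::2] = np.arange(n + 1) * dt;       v[0::2] = f
    t[1::2] = (np.arange(n) + 0.5) * dt;   v[1::2] = m
    return t, v

t, v = reconstruct([0.0, 2.4, 0.6, 0.0])   # 3-hourly averages, e.g. mm/h
# The trapezoid integral over each interval reproduces avg * dt exactly.
```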

    UV/Optical Detections of Candidate Tidal Disruption Events by GALEX and CFHTLS

    We present two luminous UV/optical flares from the nuclei of apparently inactive early-type galaxies at z = 0.37 and 0.33 that have the radiative properties of a flare from the tidal disruption of a star. In this paper we report the second candidate tidal disruption event discovered in the UV by the GALEX Deep Imaging Survey, and present simultaneous optical light curves from the CFHTLS Deep Imaging Survey for both UV flares. The first few months of the UV/optical light curves are well fitted with the canonical t^(-5/3) power-law decay predicted for emission from the fallback of debris from a tidally disrupted star. Chandra ACIS X-ray observations during the flares detect soft X-ray sources with T_bb = (2-5) x 10^5 K or Gamma > 3 and place limits on hard X-ray emission from an underlying AGN down to L_X (2-10 keV) <~ 10^41 ergs/s. Blackbody fits to the UV/optical spectral energy distributions of the flares indicate peak flare luminosities of > 10^44-10^45 ergs/s. The temperature, luminosity, and light curves of both flares are in excellent agreement with emission from a tidally disrupted main-sequence star onto a central black hole of several times 10^7 M_sun. The observed detection rate of our search over ~2.9 deg^2 of GALEX Deep Imaging Survey data spanning from 2003 to 2007 is consistent with tidal disruption rates calculated from dynamical models, and we use these models to make predictions for the detection rates of the next generation of optical synoptic surveys. Comment: 28 pages, 27 figures, 11 tables, accepted to ApJ, final corrections from proofs added
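    A hedged sketch of the canonical light-curve fit mentioned here: fitting L(t) = A (t - t_D)^(-5/3) to a flare's decay with scipy. The epochs, fluxes, and initial guesses are placeholders, not the GALEX/CFHTLS photometry.

```python
import numpy as np
from scipy.optimize import curve_fit

def fallback_decay(t, amp, t_d):
    """Canonical fallback light curve, L ~ (t - t_d)^(-5/3).
    The floor guards against negative bases while the fit explores t_d."""
    return amp * np.maximum(t - t_d, 1e-6) ** (-5.0 / 3.0)

# Placeholder epochs (days) and fluxes; real uncertainties would enter
# via curve_fit's sigma argument.
rng = np.random.default_rng(4)
t_obs = np.array([60.0, 90.0, 150.0, 240.0, 365.0])
f_obs = fallback_decay(t_obs, 1e4, 0.0) * (1 + 0.05 * rng.standard_normal(5))

popt, pcov = curve_fit(fallback_decay, t_obs, f_obs, p0=(1e4, -10.0))
amp_fit, t_d_fit = popt      # best-fit amplitude and disruption epoch
```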

    Morphology of ledge patterns during step flow growth of metal surfaces vicinal to fcc(001)

    The morphological development of step edge patterns in the presence of a meandering instability during step flow growth is studied by simulations and by numerical integration of a continuum model. It is demonstrated that the kink Ehrlich-Schwoebel barrier responsible for the instability leads to an invariant shape of the step profiles. The step morphologies change with increasing coverage from a somewhat triangular shape to a flatter, invariant steady-state form. The average pattern shape extracted from the simulations is shown to be in good agreement with that obtained from numerical integration of the continuum theory. Comment: 4 pages, 4 figures, RevTeX 3, submitted to Phys. Rev.
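    The abstract does not spell out the continuum model, so no attempt is made to reproduce it. As a generic stand-in with a qualitatively similar meandering-type instability, here is a minimal pseudo-spectral integrator for the Kuramoto-Sivashinsky equation, often used as a toy model of unstable front and step-edge dynamics; all numerical parameters are assumptions.

```python
import numpy as np

# Kuramoto-Sivashinsky: u_t = -u_xx - u_xxxx - u*u_x on a periodic domain,
# used here only as a generic stand-in for an unstable step-edge model.
N, L, dt, steps = 256, 100.0, 0.05, 20000
rng = np.random.default_rng(5)
x = np.linspace(0, L, N, endpoint=False)
k = 2 * np.pi * np.fft.rfftfreq(N, d=L / N)
lin = k**2 - k**4                    # linear growth rate of each mode

u = 0.01 * np.cos(2 * np.pi * x / L) + 0.001 * rng.standard_normal(N)
u_hat = np.fft.rfft(u)
for _ in range(steps):
    # Nonlinear term -u*u_x = -(u^2/2)_x, evaluated pseudo-spectrally.
    nonlin = -0.5j * k * np.fft.rfft(np.fft.irfft(u_hat, N) ** 2)
    # Semi-implicit Euler: linear term implicit, nonlinear term explicit.
    u_hat = (u_hat + dt * nonlin) / (1.0 - dt * lin)
u = np.fft.irfft(u_hat, N)           # late-time cellular profile
```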

    Heavy resonance production in high energy nuclear collisions

    We estimate freezeout conditions for s, c, and b quarks in high energy nuclear collisions. Freezeout is due either to loss of thermal contact or to particles "wandering" out of the region of hot matter. We then develop a thermal recombination model in which both single-particle (quark and antiquark) and two-particle (quark-antiquark) densities are conserved. Conservation of two-particle densities is necessary because quarks and antiquarks are always produced in coincidence, so that the local two-particle density can be much larger than the product of the single-particle densities. We use the freezeout conditions and recombination model to discuss heavy resonance production at zero baryon density in high energy nuclear collisions. Comment: revtex, 15 pages, no figures, KSUCNR-009-9
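    The claim that pair production makes the two-particle density exceed the product of singles can be checked with a toy Monte Carlo (an illustration, not the paper's recombination model): if quarks and antiquarks are always produced together, then n_q = n_qbar = N, so <n_q n_qbar> = <N^2> = <N>^2 + Var(N), which dominates <N>^2 when <N> is small. The mean pair number below is an assumed value.

```python
import numpy as np

rng = np.random.default_rng(2)
mean_pairs = 0.2          # assumed: rare heavy-quark pairs per event
N = rng.poisson(mean_pairs, size=1_000_000)   # pairs per event

n_q, n_qbar = N, N                    # produced strictly in coincidence
pair_density = (n_q * n_qbar).mean()  # <n_q n_qbar> = <N^2>
uncorrelated = n_q.mean() * n_qbar.mean()     # <n_q><n_qbar> = <N>^2

print(pair_density, uncorrelated)     # ~0.24 vs ~0.04: a 6x enhancement
```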