
    EU DataGRID testbed management and support at CERN

    In this paper we report on the first two years of running the CERN testbed site for the EU DataGRID project. The site consists of about 120 dual-processor PCs distributed over several testbeds used for different purposes: software development, system integration, and application tests. Activities at the site included test productions of Monte Carlo data for LHC experiments, tutorials and demonstrations of GRID technologies, and support for individual users' analyses. This paper focuses on node installation and configuration techniques, service management, and user support in a gridified environment, and includes considerations on scalability and security issues as well as comparisons with "traditional" production systems, as seen from the administrator's point of view.
    Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics conference (CHEP03), La Jolla, CA, USA, March 2003; 7 pages, LaTeX. PSN THCT00

    The soft fermion dispersion relation at next-to-leading order in hot QED

    We study next-to-leading order contributions to the soft static fermion dispersion relation in hot QED. We derive an expression for the complete next-to-leading order contribution to the retarded fermion self-energy. The real and imaginary parts of this expression give the next-to-leading order contributions to the mass and damping rate of the fermionic quasi-particle. Many of the terms that are expected to contribute according to the traditional power counting argument are actually subleading. We explain why the power counting method overestimates the contribution from these terms. For the electron damping rate in QED we obtain $\gamma_{QED} = \frac{e^2 T}{4\pi}(2.70)$. We check our method by calculating the next-to-leading order contribution to the damping rate for the case of QCD with two flavours and three colours. Our result agrees with the result obtained previously in the literature. The numerical evaluation of the next-to-leading order contribution to the mass is left to a future publication.
    Comment: 15 pages, 5 figures
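
    As an illustrative aside (not part of the paper), inserting the physical fine-structure constant, $\alpha \approx 1/137$, via $e^2 = 4\pi\alpha$ gives a feel for the size of this damping rate:

        \gamma_{QED} \;=\; \frac{e^2 T}{4\pi}\,(2.70) \;=\; 2.70\,\alpha\,T \;\approx\; \frac{2.70}{137}\,T \;\approx\; 0.020\,T .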

    Coherent resonant millennial-scale climate transitions triggered by massive meltwater pulses

    The role of mean and stochastic freshwater forcing on the generation of millennial-scale climate variability in the North Atlantic is studied using a low-order coupled atmosphere–ocean–sea ice model. It is shown that millennial-scale oscillations can be excited stochastically when the North Atlantic Ocean is fresh enough. This finding is used in order to interpret the aftermath of massive iceberg surges (Heinrich events) in the glacial North Atlantic, which are characterized by an excitation of Dansgaard–Oeschger events. Based on model results, it is hypothesized that Heinrich events trigger Dansgaard–Oeschger cycles and that furthermore the occurrence of Heinrich events is dependent on the accumulated climatic effect of a series of Dansgaard–Oeschger events. This scenario leads to a coupled ocean–ice sheet oscillation that shares many similarities with the Bond cycle. Further sensitivity experiments reveal that the timescale of the oscillations can be decomposed into stochastic, linear, and nonlinear deterministic components. A schematic bifurcation diagram is used to compare theoretical results with paleoclimatic data.
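
    The mechanism sketched above, noise-induced excitation of oscillations when the system sits close to a threshold, can be illustrated with a generic toy model. The sketch below is not the authors' coupled atmosphere–ocean–sea ice model: it is a stochastically forced FitzHugh–Nagumo-type excitable oscillator in Python, with all parameter values invented for illustration; the parameter a merely plays a role loosely analogous to the mean freshwater forcing.

import numpy as np

# Toy illustration (not the paper's model): a FitzHugh-Nagumo-type excitable
# oscillator driven by white noise.  For a > 1 the deterministic system is
# quiescent, but noise can excite large excursions ("events") when the system
# sits close enough to the oscillation threshold at a = 1.
def simulate(a=1.05, sigma=0.08, dt=0.01, n_steps=200_000, seed=0):
    rng = np.random.default_rng(seed)
    v, w = -1.0, -0.5                  # fast and slow variables
    trace = np.empty(n_steps)
    for i in range(n_steps):
        dv = v - v**3 / 3.0 - w
        dw = 0.08 * (v + a)            # slow recovery variable
        v += dt * dv + sigma * np.sqrt(dt) * rng.standard_normal()
        w += dt * dw
        trace[i] = v
    return trace

if __name__ == "__main__":
    quiet = simulate(a=1.3)            # far from threshold: few excursions
    excitable = simulate(a=1.05)       # near threshold: noise-excited events
    # count upward threshold crossings as a crude proxy for event frequency
    for name, tr in (("far from threshold", quiet), ("near threshold", excitable)):
        events = np.sum((tr[:-1] < 0.5) & (tr[1:] >= 0.5))
        print(f"{name}: {events} noise-excited events")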

    Edge Enhancement Investigations by Means of Experiments and Simulations

    Standard neutron imaging procedures are based on the “shadow” of the transmitted radiation, attenuated by the sample material. Under certain conditions significant deviations from pure transmission can be found in the form of enhancement or depression at the edges of the samples. These effects can limit the quantification process in the affected regions; on the other hand, they can be exploited to enhance and improve visibility, e.g. in defect analysis. In systematic studies we investigated the dependency of these effects on the specific material (mainly common metals), the sample-to-detector distance, the beam collimation, the material thickness, and the neutron energy. The beam lines ICON and BOA at PSI and ANTARES at TU München were used for these experiments due to their capability for neutron imaging with the highest possible spatial resolution (6.5 to 13.5 micrometre pixel size, respectively) and their cold beam spectra. In addition to the experimental data, we used a McStas tool to describe refraction and reflection features at the edges for comparison. Although minor contributions from coherent in-line propagation phase contrast are present, the major effect can be described by refraction of the neutrons at the sample-void interface. Ways to suppress and to magnify the edge effects can be derived from these findings.
    Affiliations: Lehmann, E. (Paul Scherrer Institut, Switzerland); Schulz, M. (Technische Universität München, Germany); Wang, Y. (China Institute of Atomic Energy, China); Tartaglione, Aureliano (Consejo Nacional de Investigaciones Científicas y Técnicas, Argentina).
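
    As a rough orientation for why refraction at the sample-void interface produces a sharp, distance-dependent edge signal, the short Python sketch below estimates the neutron refractive-index decrement and critical angle for a generic metal; the wavelength and material constants are illustrative assumptions (roughly aluminium), not values taken from the study.

import math

# Order-of-magnitude sketch (illustrative values, not from the study):
# the neutron refractive index of a material is n = 1 - delta with
#   delta = lambda^2 * N * b_c / (2 * pi),
# where N is the atomic number density and b_c the coherent scattering
# length.  Strong refraction / total reflection occurs below the critical
# grazing angle theta_c = sqrt(2 * delta).
wavelength = 4.0e-10          # m, a typical cold-neutron wavelength (assumed)
number_density = 6.0e28       # atoms per m^3, roughly aluminium (assumed)
b_coherent = 3.45e-15         # m, coherent scattering length, roughly Al (assumed)

delta = wavelength**2 * number_density * b_coherent / (2.0 * math.pi)
theta_c = math.sqrt(2.0 * delta)           # rad

# A ray deviated by an angle of order theta_c is displaced laterally by
# roughly L * theta_c after a sample-to-detector distance L, which is why
# the edge signal grows with that distance and with coarser collimation.
distance = 0.02                            # m, example sample-to-detector distance
print(f"delta       ~ {delta:.2e}")
print(f"theta_c     ~ {theta_c * 1e3:.2f} mrad")
print(f"edge spread ~ {distance * theta_c * 1e6:.1f} micrometres at {distance * 100:.0f} cm")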

    On the metal-insulator transition in the two-chain model of correlated fermions

    The doping-induced metal-insulator transition in two-chain systems of correlated fermions is studied using a solvable limit of the t-J model and the fact that various strong- and weak-coupling limits of the two-chain model are in the same phase, i.e. have the same low-energy properties. It is shown that the Luttinger-liquid parameter $K_\rho$ takes the universal value unity as the insulating state (half-filling) is approached, implying dominant d-type superconducting fluctuations, independently of the interaction strength. The crossover to insulating behavior of correlations as the transition is approached is discussed.
    Comment: 7 pages, 1 figure

    New spectral classification technique for X-ray sources: quantile analysis

    We present a new technique called "quantile analysis" to classify spectral properties of X-ray sources with limited statistics. Quantile analysis is superior to conventional approaches such as X-ray hardness ratios or X-ray color analysis for studying relatively faint sources, or for investigating a certain phase or state of a source in detail, where poor statistics do not allow spectral fitting using a model. Instead of working with predetermined energy bands, we determine the energy values that divide the detected photons into predetermined fractions of the total counts, such as the median (50%), terciles (33% & 67%), and quartiles (25% & 75%). We use these quantiles as indicators of the X-ray hardness or color of the source. We show that the median is an improved substitute for the conventional X-ray hardness ratio. The median and other quantiles form a phase space, similar to the conventional X-ray color-color diagrams. The quantile-based phase space is more evenly sensitive over various spectral shapes than the conventional color-color diagrams, and it is naturally arranged to properly represent the statistical similarity of various spectral shapes. We demonstrate the new technique in the 0.3-8 keV energy range using the Chandra ACIS-S detector response function and a typical aperture photometry involving background subtraction. The technique can be applied in any energy band, provided the energy distribution of photons can be obtained.
    Comment: 11 pages, 9 figures, accepted for publication in Ap
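
    The core idea, replacing fixed energy bands with data-driven quantiles of the detected photon energies, is easy to sketch. The following minimal Python illustration uses made-up photon lists and is not the authors' implementation, which additionally handles background subtraction and the detector response.

import numpy as np

def energy_quantiles(photon_energies_kev, fractions=(0.25, 1/3, 0.5, 2/3, 0.75)):
    """Energies that divide the detected photons into the given cumulative
    fractions of the total counts (quartiles, terciles, and the median)."""
    energies = np.asarray(photon_energies_kev, dtype=float)
    return {f: float(np.quantile(energies, f)) for f in fractions}

# Made-up photon lists in the 0.3-8 keV band: a soft and a hard source give
# clearly different medians and quartiles, the quantile-based analogue of a
# hardness ratio or X-ray colour.
rng = np.random.default_rng(1)
soft = np.clip(0.3 + rng.exponential(scale=1.0, size=500), 0.3, 8.0)
hard = np.clip(0.3 + rng.exponential(scale=3.0, size=500), 0.3, 8.0)

for name, photons in (("soft source", soft), ("hard source", hard)):
    q = energy_quantiles(photons)
    print(f"{name}: median = {q[0.5]:.2f} keV, "
          f"quartiles = ({q[0.25]:.2f}, {q[0.75]:.2f}) keV")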

    Current reversal and exclusion processes with history-dependent random walks

    A class of exclusion processes in which particles perform history-dependent random walks is introduced, stimulated by dynamic phenomena in some biological and artificial systems. The particles locally interact with the underlying substrate by breaking and reforming lattice bonds. We determine the steady-state current on a ring, and find current reversal as a function of particle density. This phenomenon is attributed to the non-local interaction between the walkers through their trails, which originates from strong correlations between the dynamics of the particles and the lattice. We rationalize our findings within an effective description in terms of quasi-particles which we call front barriers. Our analytical results are complemented by stochastic simulations.
    Comment: 5 pages, 6 figures
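
    A minimal, invented toy (not the model analysed in the paper; the trail rule, rates, and drift below are illustrative assumptions) shows the bookkeeping of such a simulation: particles hop on a ring with hard-core exclusion and leave trails that impede later hops, so the walkers interact non-locally through the substrate. The toy measures a signed current but is not expected to reproduce the current reversal reported above.

import numpy as np

# Toy exclusion process on a ring with a history-dependent substrate
# (illustration only): each site carries a "trail" that is reinforced when a
# particle leaves it and slowly decays; a strong trail on the target site
# impedes hops onto it, coupling the walkers through the lattice history.
def simulate(n_sites=100, density=0.3, steps=200_000, bias=0.1, decay=0.01, seed=0):
    rng = np.random.default_rng(seed)
    occupied = np.zeros(n_sites, dtype=bool)
    occupied[rng.choice(n_sites, size=int(density * n_sites), replace=False)] = True
    trail = np.zeros(n_sites)                # substrate memory left by the walkers
    signed_hops = 0
    for _ in range(steps):
        i = int(rng.integers(n_sites))       # pick a random site
        step = 1 if rng.random() < 0.5 + bias else -1   # weak intrinsic drift
        j = (i + step) % n_sites
        if occupied[i] and not occupied[j]:
            if rng.random() < np.exp(-trail[j]):        # trails impede hops
                occupied[i], occupied[j] = False, True
                trail[i] += 1.0              # leaving a site reinforces its trail
                signed_hops += step
        trail *= 1.0 - decay                 # trails slowly relax
    return signed_hops / steps

if __name__ == "__main__":
    for rho in (0.2, 0.5, 0.8):
        print(f"density {rho:.1f}: net current per attempt = {simulate(density=rho):+.4f}")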

    Putting bandits into context: How function learning supports decision making

    The authors introduce the contextual multi-armed bandit task as a framework to investigate learning and decision making in uncertain environments. In this novel paradigm, participants repeatedly choose between multiple options in order to maximize their rewards. The options are described by a number of contextual features which are predictive of the rewards through initially unknown functions. From their experience with choosing options and observing the consequences of their decisions, participants can learn about the functional relation between contexts and rewards and improve their decision strategy over time. In three experiments, the authors explore participants’ behavior in such learning environments. They predict participants’ behavior by context-blind (mean-tracking, Kalman filter) and contextual (Gaussian process and linear regression) learning approaches combined with different choice strategies. Participants are mostly able to learn about the context-reward functions and their behavior is best described by a Gaussian process learning strategy which generalizes previous experience to similar instances. In a relatively simple task with binary features, they seem to combine this learning with a probability of improvement decision strategy which focuses on alternatives that are expected to lead to an improvement upon a current favorite option. In a task with continuous features that are linearly related to the rewards, participants seem to more explicitly balance exploration and exploitation. Finally, in a difficult learning environment where the relation between features and rewards is nonlinear, some participants are again well-described by a Gaussian process learning strategy, whereas others revert to context-blind strategies.
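
    As a concrete sketch of the modelling approach described above, Gaussian process function learning over contextual features combined with a probability-of-improvement choice rule, the Python snippet below implements a bare-bones version with a squared-exponential kernel; the kernel, its hyperparameters, and the simulated task are illustrative assumptions rather than the authors' exact specification.

import numpy as np
from math import erf, sqrt

def rbf(a, b, length=0.5, var=1.0):
    """Squared-exponential kernel between the rows of a and b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / length**2)

def gp_posterior(X, y, X_star, noise=0.1):
    """Posterior mean and standard deviation of a GP with an RBF kernel."""
    K = rbf(X, X) + noise**2 * np.eye(len(X))
    K_s = rbf(X, X_star)
    K_ss = rbf(X_star, X_star)
    alpha = np.linalg.solve(K, y)
    mu = K_s.T @ alpha
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mu, np.sqrt(np.clip(np.diag(cov), 1e-12, None))

def normal_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Simulated contextual bandit: on each trial the learner sees one 2-feature
# context per arm, updates a GP over context -> reward, and chooses the arm
# with the highest probability of improving on the best reward seen so far.
rng = np.random.default_rng(0)
hidden_reward = lambda x: np.sin(3.0 * x[0]) + x[1]        # unknown to the learner
X_hist = np.empty((0, 2))
y_hist = np.empty(0)

for trial in range(50):
    contexts = rng.uniform(-1.0, 1.0, size=(4, 2))         # 4 arms, 2 features each
    if len(y_hist) < 2:
        choice = int(rng.integers(4))                      # initial random exploration
    else:
        mu, sd = gp_posterior(X_hist, y_hist, contexts)
        best = y_hist.max()
        prob_improve = normal_cdf((mu - best) / sd)        # probability of improvement
        choice = int(np.argmax(prob_improve))
    reward = hidden_reward(contexts[choice]) + rng.normal(0.0, 0.1)
    X_hist = np.vstack([X_hist, contexts[choice]])
    y_hist = np.append(y_hist, reward)

print(f"mean reward over 50 trials: {y_hist.mean():.2f}")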

    Surveys of Galaxy Clusters with the Sunyaev Zel'dovich Effect

    We have created mock Sunyaev-Zel'dovich effect (SZE) surveys of galaxy clusters using high-resolution N-body simulations. To the pure surveys we add 'noise' contributions appropriate to the instrument and primary CMB anisotropies. Applying various cluster finding strategies to these mock surveys, we generate catalogues which can be compared to the known positions and masses of the clusters in the simulations. We thus show that the completeness and efficiency that can be achieved depend strongly on the frequency coverage, noise and beam characteristics of the instruments, as well as on the candidate threshold. We study the effects of matched filtering techniques on completeness and bias, and suggest a gentler filtering method than matched filtering for single-frequency analyses. We summarize the complications that arise when analyzing the SZE signal at a single frequency, and assess the limitations of such an analysis. Our results suggest that some sophistication is required when searching for 'clusters' within an SZE map.
    Comment: 8 pages, 7 figures
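
    For readers unfamiliar with the filtering step, the Python snippet below shows a generic single-frequency, flat-sky matched filter applied to a toy map, built from an assumed Gaussian cluster profile and white-noise power spectrum; it is a textbook illustration, not the paper's pipeline nor the gentler alternative filter it proposes.

import numpy as np

# Generic flat-sky matched filter (illustration only): for an assumed source
# profile tau and noise power spectrum P, the Fourier-space filter
#   psi(k) = conj(tau(k)) / P(k)   (up to normalisation)
# maximises the signal-to-noise of isolated sources with that profile.
def matched_filter(image, profile, noise_power):
    f_img = np.fft.fft2(image)
    f_prof = np.fft.fft2(np.fft.ifftshift(profile))     # profile centred at pixel (0, 0)
    psi = np.conj(f_prof) / noise_power
    psi *= image.size / np.sum(np.abs(f_prof) ** 2 / noise_power)   # unit response
    return np.real(np.fft.ifft2(psi * f_img))

n = 128
y, x = np.indices((n, n)) - n // 2
profile = np.exp(-(x**2 + y**2) / (2.0 * 3.0**2))       # toy Gaussian 'cluster' shape

rng = np.random.default_rng(2)
sky = 0.05 * rng.standard_normal((n, n))                # toy white instrument noise
sky += 0.2 * np.roll(np.roll(profile, 40, axis=0), -30, axis=1)   # one faint cluster

noise_power = np.full((n, n), 0.05**2 * n * n)          # white-noise power spectrum
filtered = matched_filter(sky, profile, noise_power)
peak = np.unravel_index(int(np.argmax(filtered)), filtered.shape)
print("recovered peak at", peak, "with amplitude ~", round(float(filtered[peak]), 3))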