
    A probability-conserving cross-section biasing mechanism for variance reduction in Monte Carlo particle transport calculations

    In Monte Carlo particle transport codes, it is often important to adjust reaction cross sections to reduce the variance of calculations of relatively rare events, in a technique known as non-analogous Monte Carlo. We present the theory and sample code for a Geant4 process which allows the cross section of a G4VDiscreteProcess to be scaled, while adjusting track weights so as to mitigate the effects of altered primary beam depletion induced by the cross-section change. This makes it possible to increase the cross section of nuclear reactions by factors exceeding 10^4 (in appropriate cases), without distorting the results of energy deposition calculations or coincidence rates. The procedure is also valid for bias factors less than unity, which is useful, for example, in problems that involve computation of particle penetration deep into a target, such as occurs in atmospheric showers or in shielding.
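    A minimal sketch of the weight correction this describes, not the paper's Geant4 code: if the true macroscopic cross section Sigma is scaled by a bias factor b, free paths are sampled from b*Sigma, a track scoring an interaction at depth x is reweighted by (1/b)*exp((b-1)*Sigma*x), and a track surviving a distance L is reweighted by exp((b-1)*Sigma*L), so weighted tallies reproduce the analog expectation. The slab geometry and numbers below are assumptions for illustration.

```python
import math
import random

def interaction_fraction(sigma, thickness, n, bias=1.0, rng=random):
    """Estimate the probability that a particle interacts inside a slab.

    sigma     : true macroscopic cross section (1/length)
    thickness : slab thickness
    bias      : cross-section scaling factor b (b = 1 is the analog game)

    The biased game samples free paths from b*sigma and corrects the
    statistical weight so that the weighted estimate stays unbiased:
      interaction at depth x  -> w *= (1/b) * exp((b - 1) * sigma * x)
      survival over thickness -> w *= exp((b - 1) * sigma * thickness)
    """
    sigma_biased = bias * sigma
    total_weight = 0.0
    for _ in range(n):
        x = -math.log(1.0 - rng.random()) / sigma_biased   # sampled free path
        if x < thickness:                                   # interaction in slab
            total_weight += (1.0 / bias) * math.exp((bias - 1.0) * sigma * x)
        # Survivors score zero for this tally; their correction factor
        # exp((b-1)*sigma*thickness) would apply to any downstream tallies.
    return total_weight / n

random.seed(1)
exact = 1.0 - math.exp(-0.01 * 1.0)                  # sigma = 0.01/cm, T = 1 cm
print("exact :", exact)
print("analog:", interaction_fraction(0.01, 1.0, 200_000))
print("biased:", interaction_fraction(0.01, 1.0, 200_000, bias=100.0))
```

    With the rare-event settings above, the biased run scores an interaction in most histories instead of roughly one per hundred, which is where the variance reduction comes from.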

    Optimizing the yield of Sunyaev-Zel'dovich cluster surveys

    We consider the optimum depth of a cluster survey selected using the Sunyaev-Zel'dovich effect. By using simple models for the evolution of the cluster mass function and detailed modeling for a variety of observational techniques, we show that the optimum survey yield is achieved when the average size of the clusters selected is close to the size of the telescope beam. For a total power measurement, we compute the optimum noise threshold per beam as a function of the beam size and then discuss how our results can be used in more general situations. As a by-product, we gain some insight into the most advantageous instrumental set-up. In the case of beam-switching observations, little is lost if the noise threshold is set close to the point corresponding to the optimum yield. By defining a particular reference configuration, we show how our results can be applied to interferometer observations. Considering a variety of alternative scenarios, we discuss how robust our conclusions are to modifications in the cluster model and cosmological parameters. The precise optimum is particularly sensitive to the amplitude of fluctuations and the profile of the gas in the cluster.
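    The depth trade-off described here can be illustrated with a toy calculation (every functional form and number below is an assumption for illustration, not the paper's cluster or instrument model): for a fixed total observing time, integrating longer per field lowers the detection threshold but covers fewer fields, so the total yield peaks at an intermediate depth.

```python
import math

def survey_yield(t_field,
                 t_total=1000.0,    # total observing time, hours (assumed)
                 sigma_1hr=1.0,     # noise per beam after one hour (assumed)
                 q=5.0,             # detection threshold in units of sigma
                 n0=100.0,          # toy normalisation of the source counts
                 s_star=2.0):       # toy cutoff flux of the source counts
    """Expected clusters for the whole survey under a toy N(>S) = n0 * exp(-S/s_star)."""
    n_fields = t_total / t_field                       # fields observable in total
    s_lim = q * sigma_1hr / math.sqrt(t_field)         # flux limit per field
    return n_fields * n0 * math.exp(-s_lim / s_star)   # area x surface density

best_yield, best_t = max((survey_yield(t), t) for t in (0.5, 1, 2, 4, 8, 16, 32))
print(f"optimum around {best_t} h per field, yield about {best_yield:.0f} clusters")
```

    The same logic applies per beam size: once clusters are resolved, their signal is diluted over many beams, which is the intuition behind matching the beam to the typical selected cluster.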

    Metrication study for large space telescope

    Various approaches which could be taken in developing a metric-system design for the Large Space Telescope were investigated, considering potential penalties on development cost and time, commonality with other satellite programs, and contribution to national goals for conversion to the metric system of units. Information on the problems, potential approaches, and impacts of metrication was collected from published reports on previous aerospace-industry metrication-impact studies and through numerous telephone interviews. The approach to LST metrication recommended in this study calls for new components and subsystems to be designed in metric-module dimensions, but U.S. customary practice is allowed where U.S. metric standards and metric components are not available or would be unsuitable. Electrical/electronic-system design, which is presently largely metric, is considered exempt from further metrication. An important guideline is that metric design and fabrication should in no way compromise the effectiveness of the LST equipment.

    The Cosmic Microwave Background and the Ionization History of the Universe

    Details of how the primordial plasma recombined and how the universe later reionized are currently somewhat uncertain. This uncertainty can restrict the accuracy of cosmological parameter measurements from the Cosmic Microwave Background (CMB). More positively, future CMB data can be used to constrain the ionization history observationally. We first discuss how current uncertainties in the recombination history impact parameter constraints, and show how suitable parameterizations can be used to obtain unbiased parameter estimates from future data. Some parameters can be constrained robustly; however, there is clear motivation to model recombination more accurately with quantified errors. We then discuss constraints on the ionization fraction binned in redshift during reionization. Perfect CMB polarization data could in principle distinguish different histories that have the same optical depth. We discuss how well the Planck satellite may be able to constrain the ionization history, and show the currently very weak constraints from WMAP three-year data.
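    As a sketch of the degeneracy mentioned above (a minimal illustration, not the paper's machinery; the cosmological parameters are assumed fiducial values and helium ionization is ignored), the quantity the temperature data mainly fix is the Thomson optical depth, tau = sigma_T c \int n_H(z) x_e(z) (1+z)^2 / H(z) dz, which different binned ionization histories can share:

```python
import numpy as np

# Assumed fiducial flat LCDM background (illustrative values only).
H0 = 70.0 * 1000.0 / 3.0857e22           # Hubble constant, 1/s
OMEGA_M, OMEGA_L = 0.3, 0.7
OMEGA_B, X_H = 0.046, 0.76               # baryon density, hydrogen mass fraction
SIGMA_T, C_LIGHT = 6.6524e-29, 2.9979e8  # Thomson cross section (m^2), c (m/s)
M_P, G_NEWTON = 1.6726e-27, 6.674e-11

RHO_CRIT = 3.0 * H0**2 / (8.0 * np.pi * G_NEWTON)   # critical density, kg/m^3
N_H0 = X_H * OMEGA_B * RHO_CRIT / M_P                # hydrogen number density today

def hubble(z):
    return H0 * np.sqrt(OMEGA_M * (1.0 + z)**3 + OMEGA_L)

def tau(z_edges, xe_bins, n_sub=400):
    """Thomson optical depth for x_e held constant within redshift bins.

    Helium and the residual electron fraction left over from recombination
    are ignored, so this only sketches the reionization contribution.
    """
    total = 0.0
    for z_lo, z_hi, xe in zip(z_edges[:-1], z_edges[1:], xe_bins):
        dz = (z_hi - z_lo) / n_sub
        z = z_lo + (np.arange(n_sub) + 0.5) * dz     # midpoint rule in each bin
        total += np.sum(SIGMA_T * C_LIGHT * N_H0 * xe * (1.0 + z)**2 / hubble(z)) * dz
    return total

edges = np.array([0.0, 6.0, 10.0, 20.0])
print(tau(edges, [1.0, 1.0, 0.0]))       # sharp reionization at z ~ 10
print(tau(edges, [1.0, 0.5, 0.15]))      # extended high-z tail, similar tau
```

    Both histories give tau of roughly 0.07, which is why polarization data, sensitive to when the scattering happened, are needed to tell them apart.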

    Central bank intervention with limited arbitrage

    Shleifer and Vishny (1997) pointed out some of the practical and theoretical problems associated with assuming that rational risk-arbitrage would quickly drive asset prices back to long-run equilibrium. In particular, they showed that the possibility that asset-price disequilibrium would worsen before being corrected tends to limit the positions rational speculators are willing to take. Uniquely, Shleifer and Vishny (1997) showed that “performance-based asset management” would tend to reduce risk-arbitrage when it is needed most, when asset prices are furthest from equilibrium. We analyze a generalized Shleifer and Vishny (1997) model for central bank intervention. We show that the increasing availability of arbitrage capital has a pronounced effect on the dynamic intervention strategy of the central bank. Intervention is reduced during periods of moderate misalignment and amplified at times of extreme misalignment. This pattern is consistent with empirical observation.

    Predicting exchange rate volatility: genetic programming vs. GARCH and RiskMetrics

    This article investigates the use of genetic programming to forecast out-of-sample daily volatility in the foreign exchange market. Forecasting performance is evaluated relative to GARCH(1,1) and RiskMetrics models for two currencies, DEM and JPY. Although the GARCH/RiskMetrics models appear to have an inconsistent marginal edge over the genetic program under the mean-squared-error (MSE) and R^2 criteria, the genetic program consistently produces lower mean absolute forecast errors (MAE) at all horizons and for both currencies.
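    The genetic programs themselves cannot be reconstructed from this summary, but the benchmark side can be sketched (a minimal illustration on assumed placeholder data; substitute daily DEM or JPY returns to mirror the article's setting): the RiskMetrics forecast is an exponentially weighted moving average of squared returns with decay 0.94 for daily data, scored here with MSE against squared returns and MAE against absolute returns.

```python
import numpy as np

def riskmetrics_variance(returns, lam=0.94):
    """One-step-ahead RiskMetrics (EWMA) variance forecasts.

    sigma2[t] is the forecast for day t made with data through day t-1:
        sigma2[t] = lam * sigma2[t-1] + (1 - lam) * returns[t-1]**2
    """
    sigma2 = np.empty_like(returns)
    sigma2[0] = np.var(returns[:20])          # crude initialisation
    for t in range(1, len(returns)):
        sigma2[t] = lam * sigma2[t - 1] + (1.0 - lam) * returns[t - 1] ** 2
    return sigma2

def forecast_errors(forecast_var, returns):
    """MSE against squared returns, MAE against absolute returns."""
    mse = np.mean((forecast_var - returns ** 2) ** 2)
    mae = np.mean(np.abs(np.sqrt(forecast_var) - np.abs(returns)))
    return mse, mae

# Illustrative run on simulated daily log returns (assumed placeholder data).
rng = np.random.default_rng(0)
r = 0.006 * rng.standard_normal(2500)
mse, mae = forecast_errors(riskmetrics_variance(r), r)
print(f"MSE {mse:.2e}   MAE {mae:.2e}")
```

    RiskMetrics is the restricted GARCH(1,1) case with no constant and coefficients that sum to one, which is why the article can treat the two benchmarks together as "GARCH/RiskMetrics".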

    Technical analysis in the foreign exchange market

    This article introduces the subject of technical analysis in the foreign exchange market, with emphasis on its importance for questions of market efficiency. Technicians view their craft, the study of price patterns, as exploiting traders’ psychological regularities. The literature on technical analysis has established that simple technical trading rules on dollar exchange rates provided 15 years of positive, risk-adjusted returns during the 1970s and 1980s before those returns were extinguished. More recently, more complex and less studied rules have produced more modest returns for a similar length of time. Conventional explanations that rely on risk adjustment and/or central bank intervention are not plausible justifications for the observed excess returns from following simple technical trading rules. Psychological biases, however, could contribute to the profitability of these rules. We view the observed pattern of excess returns to technical trading rules as being consistent with an adaptive markets view of the world.
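    A minimal sketch of the kind of simple rule this literature studies (the rule parameters and data below are assumptions for illustration, not the article's specification): a moving-average crossover rule goes long the foreign currency when a short moving average of the log exchange rate sits above a long moving average and short otherwise, and its profitability is judged from the resulting daily returns (interest-rate differentials and transaction costs are ignored here).

```python
import numpy as np

def moving_average(x, n):
    """Simple trailing moving average; NaN until n observations exist."""
    out = np.full_like(x, np.nan)
    c = np.cumsum(x)
    out[n - 1:] = (c[n - 1:] - np.concatenate(([0.0], c[:-n]))) / n
    return out

def ma_crossover_positions(log_rate, short=5, long=200):
    """+1 (long foreign currency) when the short MA exceeds the long MA, else -1."""
    short_ma = moving_average(log_rate, short)
    long_ma = moving_average(log_rate, long)
    positions = np.where(short_ma > long_ma, 1.0, -1.0)
    positions[np.isnan(long_ma)] = 0.0        # stay flat before the long MA exists
    return positions

def rule_returns(log_rate, positions):
    """Yesterday's position applied to today's log exchange-rate change."""
    return positions[:-1] * np.diff(log_rate)

# Illustrative run on simulated data; substitute a daily dollar exchange-rate
# series to reproduce the kind of test described above.
rng = np.random.default_rng(1)
log_rate = np.cumsum(0.007 * rng.standard_normal(4000))
ret = rule_returns(log_rate, ma_crossover_positions(log_rate))
print(f"annualised mean rule return: {252 * ret.mean():.2%}")
```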