An alternative way of plotting the data and results of models of J/psi suppression
We propose an alternative way of looking at data on anomalous J/psi
suppression. The proposed method is in principle equivalent to the one used by
the NA50 Collaboration, but it makes it possible to visualize the separate contributions of
individual processes responsible for the disintegration of J/psi's produced by
a hard process in nuclear collisions. The method can be used provided that the
time sequence of contributing mechanisms is known or assumed. It offers an
alternative graphical presentation of the onset of anomalous J/psi suppression
in Pb-Pb interactions observed by the NA50 Collaboration at the CERN SPS and
might help to explain why different mechanisms, such as J/psi suppression
by the Quark-Gluon Plasma and by co-movers in the Dual Parton Model or in Monte
Carlo microscopic approaches, all lead to an approximate description of
anomalous J/psi suppression.
Bose-Einstein Correlations from Random Walk Models
We argue that strong final state rescattering among the secondary particles
created in relativistic heavy ion collisions is essential to understand the
measured Bose-Einstein correlations. The recently suggested ``random walk
models'' which contain only initial state scattering are unable to reproduce
the measured magnitude and K_\perp-dependence of R_\perp in Pb+Pb collisions
and the increase of R_l with increasing size of the collision system. (Revised version to be published in Phys. Lett.)
On energy densities reached in heavy-ion collisions at the CERN SPS
We present a few estimates of energy densities reached in heavy-ion
collisions at the CERN SPS. The estimates are based on data and models of
proton-nucleus and nucleus-nucleus interactions. In all of these estimates the
maximum energy density in central Pb+Pb interactions is larger than the
critical energy density of about 0.7 GeV/fm^3 following from lattice gauge
theory computations. In estimates which we consider as realistic the maximum
energy density is about twice the critical value. In this way our analysis
gives some support to claims that deconfined matter has been produced at the
CERN SPS. Any definite statement requires a deeper understanding of formation
times of partons and hadrons in nuclear collisions. We also compare our results
with implicit energy estimates contained in earlier models of anomalous J/psi
suppression in nuclear collisions.
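The abstract does not spell out how its estimates are constructed; one standard back-of-the-envelope approach (not necessarily the one used in the paper) is the Bjorken estimate, epsilon = (dE_T/dy) / (tau_0 * pi * R^2). A minimal sketch follows, where the transverse-energy rapidity density and the formation time are illustrative assumptions, not the paper's inputs:

```python
import math

# Bjorken-style energy-density estimate for central Pb+Pb at the SPS.
# All numerical inputs below are illustrative assumptions, not values
# taken from the paper summarized above.
A = 208                       # mass number of Pb
dET_dy = 400.0                # GeV; assumed transverse-energy rapidity density
tau0 = 1.0                    # fm/c; assumed parton/hadron formation time
R = 1.2 * A ** (1.0 / 3.0)    # fm; standard nuclear-radius parametrization

# epsilon = (dE_T/dy) / (tau0 * pi * R^2), in GeV/fm^3
epsilon = dET_dy / (tau0 * math.pi * R * R)

print(f"epsilon ~ {epsilon:.2f} GeV/fm^3")
print(f"ratio to critical (0.7 GeV/fm^3): {epsilon / 0.7:.1f}")
```

With these inputs the estimate comes out around 2.5 GeV/fm^3, a few times the quoted critical value of about 0.7 GeV/fm^3; as the abstract stresses, the result is sensitive to the assumed formation time tau0.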
Adaptive Learning and Monetary Policy: Lessons from Japan
Motivated by Japan's economic experiences and policy debates over the past two decades, this paper uses a dynamic general equilibrium open economy model to examine the volatility and welfare impact of alternative monetary policies. To capture the dynamic effects of likely structural breaks in the Japanese economy, we model agents' expectation formation process with an adaptive learning framework, and compare four Taylor-style policy rules that reflect concerns commonly raised in Japan's actual monetary policy debate. We first show that imperfect knowledge and the associated learning process induce higher volatility in the economy, while still retaining some of the policy conclusions from rational-expectations setups. In particular, explicit exchange rate stabilization is unwarranted; moreover, under volatile foreign disturbances, policymakers should consider targeting domestic price inflation rather than consumer price inflation. However, contrary to results based on rational expectations, we show that even though highly inflation-sensitive rules do raise output volatility, they may nevertheless improve overall welfare in an adaptive learning setting by smoothing inflation fluctuations. Our findings suggest that previous policy conclusions that are based on partial equilibrium analyses, or that ignore likely deviations from rational expectations, may not be robust.
The Role of Gluon Depletion in J/psi Suppression
The depletion of gluons as the parton flux traverses a nucleus in a heavy-ion
collision can influence the production rate of heavy-quark states. Thus the
suppression of J/psi can be due to gluon depletion in the initial state in
addition to nuclear and hadronic absorption in the final state. A formalism is
developed to describe the depletion effect. It is shown that, without
constraints from experimental facts other than the suppression data, it is not possible to determine the relative importance of depletion vs. absorption. Possible relevance to the enhanced suppression seen in the data is mentioned but not studied. (Submitted to Phys. Rev.)
Abnormal ECG search in long-term electrocardiographic recordings from an animal model of heart failure
Heart failure is one of the leading causes of death in the United States; five million Americans suffer from it. Advances in portable electrocardiogram (ECG) monitoring systems and large data storage capacity allow the ECG to be recorded continuously for long periods. Long-term monitoring could potentially lead to better diagnosis and treatment if the progression of heart failure could be followed. The challenge is to analyze the sheer mass of data; manual analysis using classical methods is impossible. In this dissertation, a framework for the analysis of long-term ECG recordings and methods for searching for abnormal ECGs are presented.

The data used in this research were collected from an animal model of heart failure. Chronic heart failure was gradually induced in rats by aldosterone infusion and a high-Na, low-Mg diet. The ECG was recorded continuously by radiotelemetry over the experimental period of 11-12 weeks, with the leads placed subcutaneously in a lead-II configuration. In the end, there were 80 GB of data from five animals. Besides the massive amount of data, noise and artifacts also complicated the analysis.

The framework includes data preparation, ECG beat detection, EMG noise detection, baseline fluctuation removal, ECG template generation, feature extraction, and abnormal ECG search. The raw data were converted from their original format and stored in a database for retrieval. The beat detection technique was improved over the original algorithm so that it is less sensitive to baseline jumps and more sensitive to variation in beat size. A method for estimating a parameter required for baseline fluctuation removal is proposed; it gives good results on test signals. A new algorithm for EMG noise detection was developed using morphological filters and a moving variance, with a resulting sensitivity and specificity of 94% and 100%, respectively.
A procedure for ECG template generation was proposed to capture gradual changes in ECG morphology and to manage the matching process when numerous ECG templates are created. RR intervals and heart-rate-variability parameters are extracted and plotted to display progressive changes as heart failure develops. In the abnormal ECG search, premature ventricular complexes, elevated ST segments, and split-R-wave ECGs are considered, with new features extracted from ECG morphology. Fisher linear discriminant analysis is used to classify normal and abnormal ECGs. The results give a classification rate, sensitivity, and specificity of 97.35%, 96.02%, and 98.91%, respectively.
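The abstract names Fisher linear discriminant analysis as the final classification step but does not show it. A minimal sketch of that classifier on synthetic two-dimensional features is given below; the feature vectors and the midpoint decision threshold are illustrative assumptions, not the dissertation's actual ECG features or tuning:

```python
import numpy as np

def fisher_lda(X0, X1):
    """Fisher's linear discriminant for two classes.

    Finds the projection w that maximizes between-class scatter relative
    to within-class scatter: w = Sw^-1 (m1 - m0). The decision threshold
    is placed at the midpoint of the projected class means (an assumption;
    other threshold choices are possible).
    """
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter matrix Sw.
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    w = np.linalg.solve(Sw, m1 - m0)
    threshold = w @ (m0 + m1) / 2.0
    return w, threshold

def classify(X, w, threshold):
    """Label each row of X: 1 (class 1, e.g. 'abnormal') if its projection
    onto w exceeds the threshold, else 0 ('normal')."""
    return (X @ w > threshold).astype(int)
```

A quick usage example with hypothetical, well-separated feature clusters:

```python
rng = np.random.default_rng(1)
normal = rng.normal(0.0, 1.0, size=(200, 2))    # synthetic "normal" features
abnormal = rng.normal(3.0, 1.0, size=(200, 2))  # synthetic "abnormal" features
w, thr = fisher_lda(normal, abnormal)
labels = classify(abnormal, w, thr)             # mostly 1s for this data
```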
Study of the Kalman filter for arrhythmia detection with intracardiac electrograms
Third-generation implantable antitachycardia devices offer tiered therapy to reverse ventricular fibrillation (VF) by defibrillation and ventricular tachycardia (VT) by low-energy cardioversion or antitachycardia pacing. The schemes for detecting cardiac arrhythmias often mistake nonpathologic tachycardias for serious arrhythmias and deliver false shocks. In this study, an arrhythmia classification technique was developed using a Kalman filter applied to a cyclostationary autoregressive model. The new algorithm was developed on a training set of 24 arrhythmia passages and tested on a separate set of 29 arrhythmia passages. It detects 100% of VF episodes on the test set; 77.8% of VTs were detected correctly, while 16.7% of VTs were diagnosed as sinus rhythm and 5.5% were detected as VF.
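The abstract pairs a Kalman filter with an autoregressive signal model but gives no detail. One common way to combine the two (a generic sketch, not the study's actual algorithm, which uses a cyclostationary model) is to treat the AR coefficients as a slowly drifting state and let the Kalman filter track them recursively; the AR order, noise variances, and random-walk state model below are all assumptions:

```python
import numpy as np

def kalman_ar2(y, q=1e-5, r=1.0):
    """Track time-varying AR(2) coefficients of a signal with a Kalman filter.

    Assumed model (illustrative): the state x = [a1, a2] follows a random
    walk with covariance q*I, and each sample obeys
        y[t] = a1*y[t-1] + a2*y[t-2] + noise,  noise variance r.
    Returns the coefficient estimates at every time step, shape (len(y), 2).
    """
    x = np.zeros(2)                 # current coefficient estimate
    P = np.eye(2)                   # estimate covariance
    Q = q * np.eye(2)               # state (random-walk) noise covariance
    est = np.zeros((len(y), 2))
    for t in range(2, len(y)):
        H = np.array([y[t - 1], y[t - 2]])  # observation row for this sample
        P = P + Q                           # predict: random-walk state
        S = H @ P @ H + r                   # innovation variance (scalar)
        K = P @ H / S                       # Kalman gain
        x = x + K * (y[t] - H @ x)          # correct with prediction error
        P = P - np.outer(K, H @ P)          # covariance update
        est[t] = x
    return est
```

In a detector built this way, the trajectory of the estimated coefficients (or the innovation sequence) would then feed a rhythm classifier; that downstream step is not sketched here.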
GETTING SANDY: CREATING COLLAPSING SAND EFFECTS FOR AN ODE TO LOVE
This thesis presents an artistic approach to creating collapsing sand effects in Brown Bag Films' animated short, An Ode To Love, directed by Matthew Darragh. A combination of the rigid-body and fluid simulation tools available in Houdini 3D animation software, version 13, was used to complete the task. The detailed design and implementation process used to achieve the effects is documented in this work.