Direct CP, T and/or CPT violations in the K^0-\bar{K^0} system - Implications of the recent KTeV results on decays -
The recent results on the CP-violating parameters Re(e'/e) and \Delta\phi = \phi_{00} - \phi_{+-} reported by the KTeV Collaboration are analyzed with a view to constraining CP, T and CPT violations in the decay process. Combining them with relevant data compiled by the Particle Data Group, we find Re(e_2 - e_0) = (0.85 \pm 3.11)*10^{-4} and Im(e_2 - e_0) = (3.2 \pm 0.7)*10^{-4}, where Re(e_I) and Im(e_I) represent, respectively, CP/CPT and CP/T violations in the decay of K^0 and \bar{K^0} into a 2\pi state with isospin I.
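For context (this is standard kaon phenomenology, not a formula quoted from the paper itself), the two KTeV observables enter through the usual parametrization of the amplitude ratios,

    \[ \eta_{+-} = \frac{A(K_L \to \pi^+\pi^-)}{A(K_S \to \pi^+\pi^-)} \simeq \epsilon + \epsilon', \qquad
       \eta_{00} = \frac{A(K_L \to \pi^0\pi^0)}{A(K_S \to \pi^0\pi^0)} \simeq \epsilon - 2\epsilon', \]

so that, to first order in \epsilon'/\epsilon, \Delta\phi = \phi_{00} - \phi_{+-} \simeq -3\,Im(\epsilon'/\epsilon). The directly measured Re(e'/e) thus fixes the real part of the direct-CP-violating ratio while \Delta\phi constrains its imaginary part, which is what lets both the real and imaginary parts of (e_2 - e_0) be extracted in the analysis above.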
Demonstration of the Dynamic Flowgraph Methodology using the Titan 2 Space Launch Vehicle Digital Flight Control System
Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement traditional approaches, which generally follow the philosophy of separating the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework, and the time-dependent behavior and switching logic are captured by the DFM model. In the modeling process, it was found that constructing decision tables for software subroutines is very time-consuming. A possible solution is suggested: use a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.
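The abstract stops short of code; the following is a minimal sketch of the suggested solution, assuming a subroutine can be treated as a smooth scalar function f (all names here are illustrative, not taken from the paper). Newton-Raphson solves f(x) = y "in reverse", recovering the input that yields a given output:

    def newton_invert(f, y, x0, tol=1e-10, max_iter=50, h=1e-6):
        """Solve f(x) = y for x by Newton-Raphson, i.e. run the
        subroutine 'in reverse': find the input that produces a
        given output y, starting from an initial guess x0."""
        x = x0
        for _ in range(max_iter):
            r = f(x) - y                           # residual of f(x) = y
            if abs(r) < tol:
                return x
            df = (f(x + h) - f(x - h)) / (2 * h)   # central-difference f'(x)
            x -= r / df                            # Newton update
        raise RuntimeError("Newton-Raphson did not converge")

    # Toy 'subroutine' g(x) = x**3 + 2x; find the x for which g(x) = 10.
    g = lambda x: x**3 + 2.0 * x
    x = newton_invert(g, y=10.0, x0=1.0)           # converges in a few steps

The finite-difference derivative avoids needing the subroutine's analytic form, which suits the decision-table setting where only input-output evaluations of the subroutine are available.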
Universality Class of O(3) Models
We point out that existing numerical data on the correlation length and magnetic susceptibility suggest that the two-dimensional O(3) model with standard action has a critical exponent \eta that is inconsistent with asymptotic freedom. The extracted value of \eta also differs from that of the Wess-Zumino-Novikov-Witten model that is supposed to correspond to the O(3) model at \theta = \pi.
GEANT4 low energy electromagnetic models for electrons and photons
A set of physics processes has been developed in the Geant4 Simulation Toolkit to describe the electromagnetic interactions of photons and electrons with matter down to 250 eV. Preliminary comparisons of the models with experimental data show satisfactory agreement.
Findings of a review of spacecraft fire safety needs
Discussions from a workshop convened to guide UCLA and NASA investigators on the state of knowledge and perceived needs in spacecraft fire safety and its risk management are reviewed as an introduction to an analytical and experimental project in this field. The report summarizes the workshop discussions and includes the visual aids used in the presentations. Probabilistic Safety Assessment (PSA) methods, which are currently not used, would be of great value to the design and operation of future human-crew spacecraft. Key points in the discussions were the importance of understanding and testing smoldering as a likely fire scenario in space and the need for smoke-damage modeling, since many fire-risk models ignore this mechanism and consider only heat damage.
Risk-based Spacecraft Fire Safety Experiments
Viewgraphs on risk-based spacecraft fire safety experiments are presented. Spacecraft fire risk can never be reduced to zero probability. Probabilistic risk assessment is a tool for reducing risk to an acceptable level.
An algorithm to optimize the tracking of ionizing particles
We describe a way to simulate electromagnetic showers that automatically optimises the number of tracks which need to be simulated in order to obtain the same physics results as a full shower simulation. The method is implemented in Geant4.
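The abstract does not spell out the optimisation scheme, so the following is a generic illustration rather than the authors' algorithm: Russian roulette with weight compensation, a standard way to cut the number of simulated shower tracks while leaving expected (weighted) physics results unchanged. All names and the energy-cut parameter are illustrative.

    import random

    def russian_roulette(tracks, e_cut, survival_p=0.25):
        """Keep every track above e_cut; below it, keep a track with
        probability survival_p and scale its statistical weight by
        1/survival_p, so weighted expectations are unchanged."""
        kept = []
        for energy, weight in tracks:
            if energy >= e_cut:
                kept.append((energy, weight))               # always track important particles
            elif random.random() < survival_p:
                kept.append((energy, weight / survival_p))  # survivor carries the culled weight
        return kept

    # Toy usage: 10000 sub-cut tracks of unit weight -> ~2500 survive,
    # but the total statistical weight stays ~10000.
    kept = russian_roulette([(0.5, 1.0)] * 10000, e_cut=1.0)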
Dehazing Ultrasound using Diffusion Models
Echocardiography has been a prominent tool for the diagnosis of cardiac
disease. However, these diagnoses can be heavily impeded by poor image quality.
Acoustic clutter emerges due to multipath reflections imposed by layers of
skin, subcutaneous fat, and intercostal muscle between the transducer and
heart. As a result, haze and other noise artifacts pose a real challenge to
cardiac ultrasound imaging. In many cases, especially with difficult-to-image patients such as those with obesity, B-Mode ultrasound imaging is effectively rendered unusable for diagnosis, forcing sonographers to resort to
contrast-enhanced ultrasound examinations or refer patients to other imaging
modalities. Tissue harmonic imaging has been a popular approach to combating haze, but in severe cases it too is heavily impacted. Meanwhile, denoising
algorithms are typically unable to remove highly structured and correlated
noise, such as haze. It remains a challenge to accurately describe the
statistical properties of structured haze, and develop an inference method to
subsequently remove it. Diffusion models have emerged as powerful generative
models and have shown their effectiveness in a variety of inverse problems. In
this work, we present a joint posterior sampling framework that combines two
separate diffusion models to model the distribution of both clean ultrasound
and haze in an unsupervised manner. Furthermore, we demonstrate techniques for
effectively training diffusion models on radio-frequency ultrasound data and
highlight the advantages over image data. Experiments on both \emph{in-vitro}
and \emph{in-vivo} cardiac datasets show that the proposed dehazing method
effectively removes haze while preserving signals from weakly reflected tissue.
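As a purely illustrative sketch of the joint posterior sampling idea (not the authors' implementation), the hazy measurement can be modeled as y = x + h, with one diffusion prior for the clean signal x and another for the haze h; annealed Langevin updates then follow each component's own score plus a shared data-consistency gradient. The toy analytic Gaussian scores below stand in for the two trained diffusion models, and every name and constant is an assumption made for the sketch.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-ins for the two trained score networks s_x(x, sigma) and
    # s_h(h, sigma): analytic scores of zero-mean Gaussian priors, a broad
    # one for the clean RF signal and a narrow one for the haze.
    def score_clean(x, sigma):
        return -x / (1.0**2 + sigma**2)

    def score_haze(h, sigma):
        return -h / (0.3**2 + sigma**2)

    def joint_posterior_sample(y, noise_sigma=0.05,
                               sigmas=np.geomspace(1.0, 0.01, 30), steps=20):
        """Annealed Langevin sampling of p(x, h | y) with y = x + h + noise.
        Each update adds the component's prior score to the gradient of the
        shared data-consistency term, annealed with the noise level sigma."""
        x = rng.standard_normal(y.shape)
        h = rng.standard_normal(y.shape)
        for sigma in sigmas:
            eps = 0.05 * sigma**2                               # annealed step size
            for _ in range(steps):
                dc = (y - x - h) / (noise_sigma**2 + sigma**2)  # data consistency
                x += eps * (score_clean(x, sigma) + dc) \
                     + np.sqrt(2 * eps) * rng.standard_normal(y.shape)
                h += eps * (score_haze(h, sigma) + dc) \
                     + np.sqrt(2 * eps) * rng.standard_normal(y.shape)
        return x, h                                             # dehazed signal and haze estimates

    # Toy usage on a 64-sample 'RF line'.
    y = rng.normal(0.0, 1.0, 64) + rng.normal(0.0, 0.3, 64)
    x_est, h_est = joint_posterior_sample(y)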