Single-ensemble-based eigen-processing methods for color flow imaging - Part I: The Hankel-SVD filter
Because of their adaptability to the slow-time signal contents, eigen-based filters have shown potential in improving the flow detection performance of color flow images. This paper proposes a new eigen-based filter called the Hankel-SVD filter that is intended to process each slow-time ensemble individually. The new filter is derived using the notion of principal Hankel component analysis, and it achieves clutter suppression by retaining only the principal components whose order is greater than the clutter eigenspace dimension estimated from a frequency-based analysis algorithm. To assess its efficacy, the Hankel-SVD filter was first applied to synthetic slow-time data (ensemble size: 10) simulated from two different sets of flow parameters that model: (1) arterial imaging (blood velocity: 0 to 38.5 cm/s, tissue motion: up to 2 mm/s, transmit frequency: 5 MHz, pulse repetition period: 0.4 ms) and (2) deep vessel imaging (blood velocity: 0 to 19.2 cm/s, tissue motion: up to 2 cm/s, transmit frequency: 2 MHz, pulse repetition period: 2.0 ms). In the simulation analysis, the post-filter clutter-to-blood signal ratio (CBR) was computed as a function of blood velocity. Results show that for the same effective stopband size (50 Hz), the Hankel-SVD filter has a narrower transition region in the post-filter CBR curve than that of another type of adaptive filter called the clutter-downmixing filter. The practical efficacy of the proposed filter was tested by application to in vivo color flow data obtained from the human carotid arteries (transmit frequency: 4 MHz, pulse repetition period: 0.333 ms, ensemble size: 10). The resulting power images show that the Hankel-SVD filter can better distinguish between blood and moving-tissue regions (about 9 dB separation in power) than the clutter-downmixing filter and a fixed-rank multi-ensemble-based eigen-filter (which showed a 2 to 3 dB separation). © 2006 IEEE.
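The per-ensemble processing is straightforward to prototype. Below is a minimal NumPy sketch of the core idea, assuming the clutter eigenspace dimension has already been estimated (the paper derives it from a frequency-based analysis; here it is simply an input, and the function name and window split are illustrative):

```python
import numpy as np

def hankel_svd_filter(x, clutter_rank):
    """Suppress clutter in one slow-time ensemble by discarding the
    dominant principal Hankel components (illustrative sketch).

    x            : complex slow-time ensemble, shape (N,)
    clutter_rank : estimated clutter eigenspace dimension
    """
    N = len(x)
    L = N // 2 + 1                       # Hankel window length (a choice)
    M = N - L + 1
    # Build the Hankel matrix H[i, j] = x[i + j]
    H = np.array([x[i:i + M] for i in range(L)])
    U, s, Vh = np.linalg.svd(H, full_matrices=False)
    # Keep only components whose order exceeds the clutter rank
    s[:clutter_rank] = 0.0
    Hf = (U * s) @ Vh
    # Reconstruct the filtered ensemble by averaging anti-diagonals
    y = np.zeros(N, dtype=complex)
    cnt = np.zeros(N)
    for i in range(L):
        for j in range(M):
            y[i + j] += Hf[i, j]
            cnt[i + j] += 1
    return y / cnt
```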
Single-ensemble-based eigen-processing methods for color flow imaging - Part II: The matrix pencil estimator
Parametric spectral estimators can potentially be used to obtain flow estimates directly from raw slow-time ensembles whose clutter has not been suppressed. We present a new eigen-based parametric flow estimation method called the matrix pencil, whose principles are based on the matrix formulation of the same name. The presented method models the slow-time signal as a sum of dominant complex sinusoids in the slow-time ensemble, and it computes the principal Doppler frequencies by using a generalized eigenvalue problem formulation and matrix rank reduction principles. Both fixed-rank (rank-one, rank-two) and adaptive-rank matrix pencil flow estimators are proposed, and their potential applicability to color flow signal processing is discussed. For the adaptive-rank estimator, the nominal rank was defined as the minimum eigen-structure rank that yields principal frequency estimates with a spread greater than a prescribed bandwidth. In our initial performance evaluation, the fixed-rank matrix pencil estimators were applied to raw color flow data (transmit frequency: 5 MHz; pulse repetition period: 0.175 ms; ensemble size: 14) acquired from a steady flow phantom (70 cm/s at centerline) that was surrounded by rigid tissue-mimicking material. These fixed-rank estimators produced velocity maps that are well correlated with the theoretical flow profile (correlation coefficient: 0.964 to 0.975). To facilitate further evaluation, the matrix pencil estimators were applied to synthetic slow-time data (transmit frequency: 5 MHz; pulse repetition period: 1.0 ms; ensemble size: 10) modeling flow scenarios without and with tissue motion (up to 1 cm/s). The bias and root-mean-squared error of the estimators were computed as a function of blood-signal-to-noise ratio and blood velocity. The matrix pencil flow estimators were found to be less biased than most existing frequency-based flow estimators, such as the lag-one autocorrelator. © 2006 IEEE.
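For readers unfamiliar with the technique, the following sketch shows a generic rank-reduced matrix pencil frequency estimator of the kind the abstract describes (a textbook subspace formulation, not the paper's exact algorithm; the pencil parameter choice is an assumption):

```python
import numpy as np

def matrix_pencil_freqs(x, rank, prp):
    """Estimate the principal Doppler frequencies of one slow-time
    ensemble with a rank-reduced matrix pencil (illustrative sketch).

    x    : complex slow-time ensemble, shape (N,)
    rank : number of dominant complex sinusoids retained
    prp  : pulse repetition period in seconds
    """
    N = len(x)
    L = N // 2                           # pencil parameter (a choice)
    # Hankel data matrix; each row is a length-L sliding snapshot
    Y = np.array([x[i:i + L] for i in range(N - L)])
    # Rank reduction: keep the 'rank' dominant singular directions
    U, s, Vh = np.linalg.svd(Y, full_matrices=False)
    V = Vh[:rank].conj().T               # signal subspace, shape (L, rank)
    V1, V2 = V[:-1], V[1:]               # shift-invariant sub-blocks
    # Generalized eigenvalues z_k = exp(j*2*pi*f_k*prp)
    z = np.linalg.eigvals(np.linalg.pinv(V1) @ V2)
    return np.angle(z) / (2 * np.pi * prp)
```

With rank set to one or two this plays the role of the fixed-rank estimators; an adaptive-rank variant would grow the rank until the spread of the returned frequencies exceeds the prescribed bandwidth.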
Lightweight process migration and memory prefetching in openMosix
We propose a lightweight process migration mechanism and an adaptive memory prefetching scheme called AMPoM (Adaptive Memory Prefetching in openMosix), whose goal is to reduce the migration freeze time in openMosix while ensuring the execution efficiency of migrants. To minimize the freeze time, our system transfers only a few pages to the destination node during process migration. After the migration, AMPoM analyzes the spatial locality of memory access and iteratively prefetches memory pages from the remote node to hide the latency of inter-node page faults. AMPoM adopts a unique algorithm to decide which and how many pages to prefetch. It tends to prefetch more aggressively when a sequential access pattern develops, when the paging rate of the process is high, or when the network is busy. This adaptive strategy makes AMPoM highly responsive to different application behaviors and system dynamics. The HPC Challenge benchmark results show that AMPoM can avoid 98% of migration freeze time while preventing 85-99% of page fault requests after the migration. Compared to openMosix, which incurs no remote page faults, AMPoM induces a modest overhead of 0-5% additional runtime. When the working set of a migrant is small, AMPoM outperforms openMosix considerably due to the reduced amount of data transfer. These results indicate that, by exploiting memory access locality and prefetching, process migration can be a lightweight operation with little software overhead in remote paging. © 2008 IEEE.
The 2008 IEEE International Symposium on Parallel and Distributed Processing (IPDPS 2008), Miami, FL, 14-18 April 2008. In Proceedings of the 22nd IPDPS, 2008, p. 1-1
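The abstract only names the signals that drive the prefetch decision, so the sketch below is a loose illustration of such an adaptive policy, not the published algorithm; every threshold and scaling factor here is invented for illustration:

```python
def prefetch_window(seq_ratio, fault_rate, net_busy, base=4, max_pages=64):
    """Illustrative prefetch-sizing heuristic in the spirit of AMPoM.

    seq_ratio  : fraction of recent page faults that were sequential (0..1)
    fault_rate : recent remote page faults per second
    net_busy   : fraction of link capacity currently in use (0..1)
    """
    window = base
    if seq_ratio > 0.5:                  # sequential pattern developing
        window *= 2 + int(4 * seq_ratio)
    window += int(fault_rate / 10)       # high paging rate: fetch more per trip
    if net_busy > 0.7:                   # busy network: batch to amortize round trips
        window *= 2
    return min(window, max_pages)
```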
Memory acquisition: A 2-Take approach
As more and more people recognize the value of volatile data, live forensics is gaining weight in digital forensics. It is often used in parallel with traditional pull-the-plug forensics to provide a more reliable result in forensic examination. One of the core components of live forensics is the collection and analysis of volatile memory data, during which the memory content is acquired to search for relevant evidential data or to investigate various computer processes and unveil the activities being performed by a user. However, this conventional method may have weaknesses because of the volatile nature of memory data and the absence of original data for validation. This may have implications for the admissibility of memory data in a court of law, which requires strict authenticity and reliability of evidence. In this paper, we discuss the impact of various memory acquisition methods and suggest a 2-Take approach which aims to enhance the confidence level of the acquired memory data for legal proceedings. © 2009 IEEE.
The 2009 International Workshop on Forensics for Future Generation Communication Environments (F2GC-09) in conjunction with CSA 2009, Jeju Island, Korea, 10-12 December 2009. In Proceedings of CSA, 2009, p. 1-
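The abstract does not spell out the mechanics of the 2-Take approach, but the underlying idea of gaining confidence by acquiring twice and cross-checking can be illustrated as follows (file interface and page granularity are assumptions):

```python
import hashlib

def compare_takes(take1_path, take2_path, page_size=4096):
    """Compare two successive memory acquisitions page by page and
    report the fraction of the image that is stable across both takes
    (illustrative sketch of the cross-validation idea).
    """
    stable = total = 0
    with open(take1_path, "rb") as f1, open(take2_path, "rb") as f2:
        while True:
            p1, p2 = f1.read(page_size), f2.read(page_size)
            if not p1 and not p2:
                break
            total += 1
            if hashlib.sha256(p1).digest() == hashlib.sha256(p2).digest():
                stable += 1
    return stable / total if total else 0.0
```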
An adaptive multipath protocol for efficient IP handoff in mobile wireless networks
Achieving IP handoff with a short latency and minimal packet loss is essential for mobile devices that roam across IP subnets. Many existing solutions require changes to the network or transport layer; they tend to suffer from long handoff latency in the soft or the hard handoff scenario, or both, and some are difficult to deploy in practice. We propose a new protocol, called the adaptive multipath protocol, to achieve efficient IP handoff. Based on link-layer signal strength measurements, two different schemes are used to handle soft and hard handoff respectively. Seamless IP handoff is achieved by using multiple transport-layer connections on top of persistent link-layer connectivity during soft handoff. To achieve low handoff latency during hard handoff, a set of distributed session repositories (SRs), which are independent of the end hosts, is employed. Simulation results clearly support our claims. In particular, the latency for hard handoff is found to be as low as 50% of that of Fast Handoff. © 2006 IEEE.
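As a rough illustration of the link-layer trigger described above, the following sketch separates the two cases by signal strength (the thresholds, names, and return values are assumptions for illustration, not values from the paper):

```python
def choose_handoff(rssi_current, rssi_candidate,
                   soft_threshold=-75, drop_threshold=-85):
    """Pick the handoff scheme from link-layer RSSI readings (dBm).

    Returns "soft" when both links are usable long enough to run
    parallel transport connections, and "hard" when the current link
    is about to drop and a session repository (SR) must bridge the gap.
    """
    if rssi_current > drop_threshold and rssi_candidate > soft_threshold:
        return "soft"   # overlap exists: open a second connection early
    return "hard"       # no overlap: recover session state from an SR
```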
A retrospective analysis of complications of treatment of nevus of Ota with the Q-switched Alexandrite (QS Alex) and Q-switched Neodymium:Yttrium-Aluminum-Garnet (QS Nd:YAG) lasers
Maintaining hard disk integrity with digital legal professional privilege (LPP) data
Probing quark compositeness at hadronic colliders: the case of polarized beams
A new handed interaction between subconstituents of quarks could be at the origin of some small parity-violating effects in one-jet inclusive production. Within a few years, the Relativistic Heavy Ion Collider (RHIC) will be used as a polarized proton-proton collider. In this context, we analyse the possibilities of disentangling some new parity-violating effects from the standard spin asymmetries which are expected due to the Standard Model QCD-Weak interference. We also explore the possibilities of placing more stringent limits on the quark compositeness scale thanks to measurements of such spin asymmetries.
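For context, the parity-violating observable in question is conventionally the single-helicity longitudinal asymmetry of the one-jet inclusive cross section (a standard definition, not a formula quoted from this abstract):

```latex
A_L \;=\; \frac{d\sigma_{(-)} - d\sigma_{(+)}}{d\sigma_{(-)} + d\sigma_{(+)}}
```

where the subscript denotes the helicity of the polarized proton beam. The QCD-Weak interference produces a nonzero Standard Model contribution to A_L, which is the background against which any compositeness-induced effect must be disentangled.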
Spin asymmetries in one-jet production at RHIC with polarized proton beams : the effects of a hadrophilic Z'
We show that the measurement of a parity-violating asymmetry in the production of a large-E_T jet could reveal the presence of a new hadrophilic Z' such as the one recently introduced to interpret possible departures from the Standard Model predictions both at LEP and at CDF. Such a measurement could be performed within a few years by the RHIC Spin Collaboration (RSC) using the Relativistic Heavy Ion Collider (RHIC) as a polarized proton-proton collider.
Analyzing storage media of digital cameras
Digital photography has become popular in recent years, and photographs have become common tools for people to record every tiny part of their daily lives. By analyzing the storage media of a digital camera, crime investigators may extract a lot of useful information to reconstruct the events. In this work, we discuss a few approaches to analyzing the storage media of digital cameras. A hypothetical crime case is used as a case study to demonstrate the concepts. © 2009 IEEE.
The 2009 International Workshop on Forensics for Future Generation Communication Environments (F2GC-09) in conjunction with CSA 2009, Jeju Island, Korea, 10-12 December 2009. In Proceedings of CSA, 2009, p. 1-
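As one concrete example of the kind of analysis meant here, a raw dump of a camera's memory card can be scanned for JPEG signatures to recover deleted photographs (a deliberately minimal sketch; production carvers also handle fragmentation, thumbnails, and file system metadata):

```python
def carve_jpegs(image_path):
    """Carve contiguous JPEG files out of a raw storage-media dump by
    locating start-of-image and end-of-image markers (illustrative)."""
    SOI, EOI = b"\xff\xd8\xff", b"\xff\xd9"   # JPEG magic markers
    with open(image_path, "rb") as f:
        data = f.read()
    found, pos = [], 0
    while (start := data.find(SOI, pos)) != -1:
        end = data.find(EOI, start)
        if end == -1:
            break
        found.append(data[start:end + 2])     # include the EOI marker
        pos = end + 2
    return found
```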
