445 research outputs found
Electrical Impedance Based Spectroscopy and Tomography Techniques for Obesity and Heart Diseases
Despite advances in diagnosis and therapy, atherosclerotic cardiovascular disease remains the leading cause of morbidity and mortality. Predicting metabolically active atherosclerotic lesions remains an unmet clinical need. Specifically, atherosclerotic plaques that are prone to rupture pose an extremely high risk: they can cause detrimental heart attacks and/or strokes, leading to sudden death. It has been shown that atherosclerosis is correlated with an individual's level of obesity [1]. In clinical practice, a doctor typically assesses a patient's risk factor based on his or her Body Mass Index (BMI) and a measurement of waist circumference. The level of fatty-droplet deposits in the liver is also an important biomarker for assessing a patient's risk factor; however, measuring it currently requires the patient to undergo imaging such as a CT or MRI scan.
The ability to distinguish vulnerable plaques that can lead to sudden rupture at an early stage remains largely lacking, so it is of great clinical interest to find improved diagnostic techniques that identify and localize such plaques. Meanwhile, lipid has significantly lower electrical impedance than the rest of the vessel tissue in certain frequency bands [2]. In this thesis we explore spectroscopic and tomographic methods to characterize such plaques. In addition, using the Electrical Impedance Tomography method, we propose a novel approach to detect fatty liver at an early stage in a non-radiating, non-invasive manner.
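The frequency-dependent impedance contrast mentioned above can be illustrated with a minimal sketch. This is not the thesis's model: it assumes a simple parallel resistor-capacitor element per tissue, with purely illustrative parameter values, and models lipid-rich tissue only as having lower resistance than the surrounding vessel wall.

```python
# Minimal sketch, assuming a parallel RC tissue model with
# illustrative (not measured) parameters: Z = R / (1 + j*w*R*C).
import math

def impedance_magnitude(r_ohm, c_farad, freq_hz):
    """|Z| of a parallel RC element at the given frequency."""
    w = 2 * math.pi * freq_hz
    return r_ohm / math.sqrt(1 + (w * r_ohm * c_farad) ** 2)

# Hypothetical values chosen so lipid has lower impedance in-band:
vessel = impedance_magnitude(r_ohm=500.0, c_farad=1e-9, freq_hz=10e3)
lipid = impedance_magnitude(r_ohm=100.0, c_farad=1e-9, freq_hz=10e3)
print(vessel > lipid)  # the lower-impedance lipid region stands out
```

In a spectroscopic measurement, sweeping `freq_hz` and recording |Z| gives the spectrum from which tissue types could be discriminated; real bioimpedance work uses richer models (e.g. Cole-type) than this single RC element.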
A general class of combinatorial filters that can be minimized efficiently
State minimization of combinatorial filters is a fundamental problem that
arises, for example, in building cheap, resource-efficient robots. But exact
minimization is known to be NP-hard. This paper conducts a more nuanced
analysis of this hardness than prior work, uncovering two factors that
contribute to the complexity. We show that each factor is a distinct source of
the problem's hardness and thereby shed some light on the role played
by (1) structure of the graph that encodes compatibility relationships, and (2)
determinism-enforcing constraints. Just as a line of prior work has sought to
introduce additional assumptions and identify sub-classes that lead to
practical state reduction, we next use this new, sharper understanding to
explore special cases for which exact minimization is efficient. We introduce a
new algorithm for constraint repair that applies to a large sub-class of
filters, subsuming three distinct special cases for which the possibility of
optimal minimization in polynomial time was known earlier. While the efficiency
in each of these three cases previously appeared to stem from seemingly
dissimilar properties, when seen through the lens of the present work, their
commonality now becomes clear. We also provide entirely new families of filters
that are efficiently reducible.
Comment: 9 pages, 3 figures
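The compatibility structure mentioned in (1) above can be made concrete with a small sketch: filter states become vertices of a compatibility graph, mergeable pairs become edges, and state reduction corresponds to covering the vertices with as few cliques as possible. This is a hypothetical toy (brute-force, exponential, and ignoring the determinism-enforcing constraints the paper analyzes), not the paper's algorithm.

```python
# Hypothetical sketch: minimum clique cover of a small compatibility
# graph by exhaustive partition search. Vertices are filter states;
# an edge means two states are compatible (candidates for merging).
from itertools import combinations

def is_clique(vertices, edges):
    """True if every pair of vertices is connected."""
    return all(frozenset(p) in edges for p in combinations(vertices, 2))

def min_clique_cover(vertices, edges):
    """Smallest partition of the vertices into cliques (brute force)."""
    def partitions(items):
        if not items:
            yield []
            return
        first, rest = items[0], items[1:]
        for part in partitions(rest):
            for i in range(len(part)):
                yield part[:i] + [[first] + part[i]] + part[i + 1:]
            yield [[first]] + part
    best = None
    for part in partitions(list(vertices)):
        if all(is_clique(block, edges) for block in part):
            if best is None or len(part) < len(best):
                best = part
    return best

# Illustrative 4-state filter: states 0, 1, 2 mutually compatible; 3 alone.
edges = {frozenset(e) for e in [(0, 1), (0, 2), (1, 2)]}
cover = min_clique_cover([0, 1, 2, 3], edges)
print(len(cover))  # → 2
```

Each clique in the cover corresponds to a merged state of the reduced filter; the hardness results discussed above concern exactly this covering problem once determinism constraints are layered on top.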
A fixed-parameter tractable algorithm for combinatorial filter reduction
What is the minimal information that a robot must retain to achieve its task?
To design economical robots, the literature dealing with reduction of
combinatorial filters approaches this problem algorithmically. As lossless
state compression is NP-hard, prior work has examined, along with minimization
algorithms, a variety of special cases in which specific properties enable
efficient solution. Complementing those findings, this paper refines the
present understanding from the perspective of parameterized complexity. We give
a fixed-parameter tractable algorithm for the general reduction problem by
exploiting a transformation into minimal clique covering. The transformation
introduces new constraints that arise from sequential dependencies encoded
within the input filter -- some of these constraints can be repaired, others
are treated through enumeration. Through this approach, we identify parameters
affecting filter reduction that are based upon inter-constraint couplings
(expressed as a notion of their height and width), which add to the structural
parameters present in the unconstrained problem of minimal clique covering.
Comment: 8 pages, 4 figures
- …