Quantitative magnetic resonance image analysis via the EM algorithm with stochastic variation
Quantitative Magnetic Resonance Imaging (qMRI) gives researchers insight into pathological and physiological alterations of living tissue, which they hope to use to predict (local) therapeutic efficacy early and to determine an optimal treatment schedule. However, the analysis of qMRI has been limited to ad hoc heuristic methods. Our research provides a powerful statistical framework for image analysis and sheds light on future localized adaptive treatment regimes tailored to the individual's response. We assume that, owing to measurement error and other unpredictable influences, qMRI shows us only a blurred and noisy version of the underlying pathological/physiological changes. We use a hidden Markov random field to model the spatial dependence in the data and develop a maximum likelihood approach via the Expectation--Maximization algorithm with stochastic variation. An important improvement over previous work is the assessment of variability in parameter estimation, which provides a valid basis for statistical inference. More importantly, we focus on the expected changes rather than on image segmentation. Our research shows that the approach is powerful in both simulation studies and on a real dataset, and quite robust in the presence of some model
assumption violations.
Comment: Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics. DOI: http://dx.doi.org/10.1214/07-AOAS157
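The paper's actual model couples EM with a hidden Markov random field; as a much-simplified illustration of the "stochastic variation" idea, here is a stochastic EM (StEM) for a two-component Gaussian mixture, where the E-step's expectations are replaced by sampled hard labels. All data and starting values below are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data standing in for voxel intensities: a two-component Gaussian mixture
# ("unchanged" tissue around 0, "changed" tissue around 3).
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(3.0, 1.0, 200)])

mu = np.array([-1.0, 1.0])      # component means (initial guesses)
sigma = np.array([1.0, 1.0])    # component standard deviations
pi = np.array([0.5, 0.5])       # mixing proportions

for _ in range(200):
    # E-step: posterior probability of each component for each observation
    dens = np.exp(-((x[:, None] - mu) ** 2) / (2 * sigma ** 2)) / sigma
    post = pi * dens
    post /= post.sum(axis=1, keepdims=True)
    # Stochastic step: draw hard labels instead of carrying expectations forward
    z = (rng.random(x.size) < post[:, 1]).astype(int)
    # M-step: re-estimate parameters from the sampled labels
    for k in (0, 1):
        sel = x[z == k]
        if sel.size:
            mu[k] = sel.mean()
            sigma[k] = max(sel.std(), 1e-3)
            pi[k] = sel.size / x.size

print(np.sort(mu))  # means should settle near the true values 0 and 3
```

Unlike plain EM, the estimates here keep fluctuating around the truth rather than converging deterministically, which is one reason assessing estimator variability matters.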
Longitudinal Image Analysis of Tumor/Brain Change in Contrast Uptake Induced by Radiation
This work is motivated by a quantitative Magnetic Resonance Imaging study of the differential tumor/healthy tissue change in contrast uptake induced by radiation. The goal is to determine the time at which there is maximal contrast uptake, a surrogate for permeability, in the tumor relative to healthy tissue. A notable feature of the data is its spatial heterogeneity. Zhang, Johnson, Little, and Cao (2008a, 2008b) discuss two parallel approaches to "denoise" a single image of change in contrast uptake from baseline to a single follow-up visit of interest. In this work we explore the longitudinal profile of the tumor/healthy tissue change in contrast uptake. In addition to the spatial correlation, we account for temporal correlation by jointly modeling multiple images of the individual subjects over time. We fit a two-stage model. First, we propose a longitudinal image model for each subject. This model simultaneously accounts for the spatial and temporal correlation and denoises the observed images by borrowing strength both across neighboring pixels and over time. We propose to use the area under the receiver operating characteristic (ROC) curve (AUC) to summarize the differential contrast uptake between tumor and healthy tissue. In the second stage, we fit a population model on the AUC values and estimate when it achieves its maximum.
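The AUC used as a per-visit summary here has a direct pairwise interpretation: it is the probability that a randomly drawn tumor pixel shows a larger uptake change than a randomly drawn healthy pixel. A minimal estimator over all pixel pairs (function name and data are illustrative, not from the paper):

```python
import numpy as np

def auc_mann_whitney(tumor, healthy):
    """AUC = P(tumor > healthy) + 0.5 * P(tie), estimated over all pixel pairs."""
    t = np.asarray(tumor, dtype=float)[:, None]
    h = np.asarray(healthy, dtype=float)[None, :]
    return float((t > h).mean() + 0.5 * (t == h).mean())

# Toy check: tumor uptake change stochastically larger than in healthy tissue.
rng = np.random.default_rng(1)
a = auc_mann_whitney(rng.normal(1.0, 1.0, 400), rng.normal(0.0, 1.0, 400))
print(a)  # close to Phi(1/sqrt(2)) ~ 0.76 for these two normals
```

An AUC of 0.5 means no differential uptake; values near 1 mean the tumor pixels almost always show the larger change.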
Favoring Eagerness for Remaining Items: Designing Efficient, Fair, and Strategyproof Mechanisms
In the assignment problem, the goal is to assign indivisible items to agents
who have ordinal preferences, efficiently and fairly, in a strategyproof
manner. In practice, first-choice maximality, i.e., assigning a maximal number
of agents their top items, is often identified as an important efficiency
criterion and measure of agents' satisfaction. In this paper, we propose a
natural and intuitive efficiency property,
favoring-eagerness-for-remaining-items (FERI), which requires that each item is
allocated to an agent who ranks it highest among remaining items, thereby
implying first-choice maximality. Using FERI as a heuristic, we design
mechanisms that satisfy ex-post or ex-ante variants of FERI together with
combinations of other desirable properties of efficiency (Pareto-efficiency),
fairness (strong equal treatment of equals and sd-weak-envy-freeness), and
strategyproofness (sd-weak-strategyproofness). We also explore the limits of
FERI mechanisms in providing stronger efficiency, fairness, or
strategyproofness guarantees through impossibility results.
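First-choice maximality has a simple operational reading when every item has a single copy: the largest achievable number of agents receiving their top item equals the number of distinct items that appear as someone's top choice. A minimal checker under that single-copy assumption (function and data names are illustrative, and this checks first-choice maximality only, not the stronger FERI property):

```python
def first_choice_maximal(prefs, allocation):
    """prefs: agent -> ranked list of items; allocation: agent -> assigned item.
    Assumes one copy of each item. Checks whether the allocation serves the
    maximum possible number of agents with their first choice."""
    top = {a: ranking[0] for a, ranking in prefs.items()}
    achieved = sum(allocation[a] == top[a] for a in prefs)
    max_possible = len(set(top.values()))  # at most one agent per distinct top item
    return achieved == max_possible

prefs = {"a": ["x", "y", "z"], "b": ["x", "z", "y"], "c": ["z", "x", "y"]}
print(first_choice_maximal(prefs, {"a": "x", "b": "y", "c": "z"}))  # True
print(first_choice_maximal(prefs, {"a": "y", "b": "z", "c": "x"}))  # False
```

In the first allocation two agents (a and c) get their top items, matching the two distinct top choices {x, z}; in the second, nobody does.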
Longitudinal image analysis of tumour–healthy brain change in contrast uptake induced by radiation
The work is motivated by a quantitative magnetic resonance imaging study of the differential tumour–healthy tissue change in contrast uptake induced by radiation. The goal is to determine the time at which there is maximal contrast uptake (a surrogate for permeability) in the tumour relative to healthy tissue. A notable feature of the data is its spatial heterogeneity. Zhang and co-workers have discussed two parallel approaches to "denoise" a single image of change in contrast uptake from baseline to one follow-up visit of interest. In this work we extend the image model to explore the longitudinal profile of the tumour–healthy tissue contrast uptake in multiple images over time. We fit a two-stage model. First, we propose a longitudinal image model for each subject. This model simultaneously accounts for the spatial and temporal correlation and denoises the observed images by borrowing strength both across neighbouring pixels and over time. We propose to use the Mann–Whitney U-statistic to summarize the tumour contrast uptake relative to healthy tissue. In the second stage, we fit a population model to the U-statistic and estimate when it achieves its maximum. Our initial findings suggest that the maximal contrast uptake of the tumour core relative to healthy tissue peaks around 3 weeks after initiation of radiotherapy, though this warrants further investigation. (Peer reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/79255/1/j.1467-9876.2010.00718.x.pd)
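The second-stage idea, fitting a population trend to the per-visit summaries and locating its maximum, can be sketched with invented numbers and a quadratic trend in time (the paper's actual population model may differ; all values below are hypothetical):

```python
import numpy as np

# Hypothetical per-visit summaries (e.g., a Mann-Whitney U rescaled to [0, 1]),
# indexed by weeks since the start of radiotherapy -- the values are made up.
weeks = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
summary = np.array([0.52, 0.60, 0.68, 0.71, 0.66, 0.58])

# Second-stage model: quadratic trend in time; the estimated peak is the vertex.
b2, b1, b0 = np.polyfit(weeks, summary, 2)
peak_week = -b1 / (2.0 * b2)
print(peak_week)
```

With a concave fit (b2 < 0), the vertex -b1/(2*b2) gives the estimated week of maximal differential uptake; for these toy values it lands a little before week 3, consistent with the kind of conclusion quoted above.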
NCACO-score: An effective main-chain dependent scoring function for structure modeling
Background: Development of effective scoring functions is a critical component of successful protein structure modeling, and many efforts have been dedicated to it. Despite these efforts, developing a scoring function that achieves both good accuracy and fast speed remains a grand challenge.

Results: Based on a coarse-grained representation of a protein structure using only four main-chain atoms (N, Cα, C and O), we develop a knowledge-based scoring function, called NCACO-score, that integrates different structural information to rapidly model protein structure from sequence. In testing on the Decoys'R'Us sets, we found that NCACO-score can effectively recognize native conformers among their decoys. Furthermore, we demonstrate that NCACO-score can effectively guide fragment assembly for protein structure prediction, achieving good performance in building structure models for hard targets from CASP8 in terms of both accuracy and speed.

Conclusions: Although NCACO-score is based on a coarse-grained model, it discriminates native conformers from decoy conformers with high accuracy. NCACO-score is a very effective scoring function for structure modeling.
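NCACO-score's actual statistical terms are not reproduced here; as a generic illustration of how a knowledge-based potential works, one can score observed distances against an inverse-Boltzmann log-odds of native versus reference bin frequencies. All tables and names below are invented for the sketch:

```python
import math

def inverse_boltzmann_score(distances, observed, reference, bin_width=1.0):
    """Sum of -log(p_obs / p_ref) over distance bins; lower = more native-like.
    'observed' and 'reference' map bin index -> count (hypothetical tables)."""
    obs_total = sum(observed.values())
    ref_total = sum(reference.values())
    score = 0.0
    for d in distances:
        b = int(d // bin_width)
        p_obs = observed.get(b, 1) / obs_total   # pseudocount for unseen bins
        p_ref = reference.get(b, 1) / ref_total
        score += -math.log(p_obs / p_ref)
    return score

observed = {3: 90, 8: 10}    # bin 3 is common in the (toy) native statistics
reference = {3: 50, 8: 50}   # flat background distribution
near = inverse_boltzmann_score([3.2, 3.4], observed, reference)
far = inverse_boltzmann_score([8.1, 8.4], observed, reference)
print(near, far)  # near < 0 < far: native-like distances score favourably
```

The speed appeal of such bin-lookup potentials is that scoring a conformation is a table lookup per atom pair, which is what makes them practical inside fragment-assembly loops.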
Multi-type Resource Allocation with Partial Preferences
We propose multi-type probabilistic serial (MPS) and multi-type random
priority (MRP) as extensions of the well known PS and RP mechanisms to the
multi-type resource allocation problem (MTRA) with partial preferences. In our
setting, there are multiple types of divisible items, and a group of agents who
have partial order preferences over bundles consisting of one item of each
type. We show that for the unrestricted domain of partial order preferences, no
mechanism satisfies both sd-efficiency and sd-envy-freeness. Notwithstanding
this impossibility result, our main message is positive: When agents'
preferences are represented by acyclic CP-nets, MPS satisfies sd-efficiency,
sd-envy-freeness, ordinal fairness, and upper invariance, while MRP satisfies
ex-post-efficiency, sd-strategy-proofness, and upper invariance, recovering the
properties of PS and RP.
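MPS extends the classic probabilistic serial (simultaneous eating) mechanism; as background, here is a sketch of single-type PS itself, not the multi-type or partial-preference version in the paper. Each agent "eats" its favourite remaining item at unit speed, and the accumulated shares form the random-assignment probabilities:

```python
from fractions import Fraction

def probabilistic_serial(prefs):
    """prefs: one ranked item list per agent. Returns, per agent, a dict of
    fractional shares computed by the simultaneous-eating algorithm."""
    items = {i for ranking in prefs for i in ranking}
    remaining = {i: Fraction(1) for i in items}          # one unit of each item
    shares = [dict.fromkeys(items, Fraction(0)) for _ in prefs]
    time_left = Fraction(1)                              # each agent eats 1 unit total
    while time_left > 0 and any(remaining.values()):
        # Each agent targets its favourite item that still has supply.
        choice = [next((i for i in p if remaining[i] > 0), None) for p in prefs]
        eaters = {}
        for fav in choice:
            if fav is not None:
                eaters[fav] = eaters.get(fav, 0) + 1
        if not eaters:
            break
        # Advance until some targeted item runs out (or eating time ends).
        dt = min(min(remaining[i] / k for i, k in eaters.items()), time_left)
        for agent, fav in enumerate(choice):
            if fav is not None:
                shares[agent][fav] += dt
                remaining[fav] -= dt
        time_left -= dt
    return shares

print(probabilistic_serial([["x", "y"], ["x", "y"]]))  # each agent: x 1/2, y 1/2
```

Exact `Fraction` arithmetic keeps the breakpoints of the eating process precise, which matters when checking properties like sd-efficiency on small examples.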
Defining and using microbial spectral databases
This work shows how fingerprints of mass spectral patterns from microbial isolates are affected by variations in instrumental condition, by sample environment, and by sample handling factors. It describes a novel method by which pattern distortions can be mathematically corrected for variations in factors not amenable to experimental control. One uncontrollable variable is "between-batch" differences in culture media. Another, relevant for determination of noncultured extracts, is differences between the cells' environmental experience (e.g., starved environmental extracts versus cultured standards). The method suggests that, after a single growth cycle on a solid medium (perhaps a selective one), pyrolysis MS spectra of microbial isolates can be algorithmically compensated and an unknown isolate identified using a spectral database defined by culture on a different (perhaps nonselective) medium. This reduces identification time to as few as 24 h from sample collection. The concept also proposes a possible way to compensate certain noncultured, nonisolated samples (e.g., cells concentrated from urine, impacted from aerosol, or semi-selectively extracted by immunoaffinity methods from heavily contaminated matrices) for identification within half an hour. Using the method, microbial mass spectra from different labs can be assembled into coherent databases similar to those routinely used to identify pure compounds. This type of data treatment is applicable for rapid detection in biowarfare and bioterror events as well as in forensic, research, and clinical laboratory contexts.
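The normalize-then-match step underlying any such spectral database can be caricatured as follows; cosine similarity is a conventional mass-spectral library-search measure, while the paper's actual distortion-compensation algorithm is more elaborate and is not reproduced here. All spectra and names below are invented:

```python
import numpy as np

def normalize(spectrum):
    """Total-intensity normalization -- a crude stand-in for batch compensation."""
    s = np.asarray(spectrum, dtype=float)
    return s / s.sum()

def identify(query, library):
    """Return the library entry whose normalized spectrum is most cosine-similar."""
    q = normalize(query)
    best_name, best_sim = None, -1.0
    for name, spec in library.items():
        r = normalize(spec)
        sim = float(q @ r / (np.linalg.norm(q) * np.linalg.norm(r)))
        if sim > best_sim:
            best_name, best_sim = name, sim
    return best_name, best_sim

# Toy library of two fingerprints over four m/z channels (values invented).
library = {"E. coli": [10, 80, 10, 0], "B. subtilis": [0, 10, 80, 10]}
name, sim = identify([12, 75, 13, 0], library)  # noisy, rescaled first pattern
print(name, round(sim, 3))
```

The point of the compensation step in the paper is precisely to make a query spectrum comparable to library entries acquired under different conditions before any such matching is attempted.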
- …