Performance Comparison of Knowledge-Based Dose Prediction Techniques Based on Limited Patient Data.
Purpose: The accuracy of dose prediction is essential for knowledge-based planning and automated planning techniques. We compare the dose prediction accuracy of three methods, statistical voxel dose learning, spectral regression, and support vector regression, trained on limited patient data.

Methods: Statistical voxel dose learning, spectral regression, and support vector regression were used to predict the dose of noncoplanar intensity-modulated radiation therapy (4π) and volumetric-modulated arc therapy head and neck plans, 4π lung plans, and volumetric-modulated arc therapy prostate plans. Twenty cases of each site were used for k-fold cross-validation, with k = 4. Statistical voxel dose learning bins voxels according to their Euclidean distance to the planning target volume and uses the median dose in each bin to predict the dose of new voxels. Distance to the planning target volume, polynomial combinations of the distance components, planning target volume, and organ-at-risk volume were used as features for spectral regression and support vector regression, for a total of 28 features. Principal component analysis was performed on the input features to test the effect of dimensionality reduction. For the coplanar volumetric-modulated arc therapy plans, separate models were trained for voxels within the same axial slice as planning target volume voxels and for voxels outside the primary beam. The effect of training a separate model for each organ at risk, rather than one model for all voxels collectively, was also tested. The root mean squared error was calculated to evaluate voxel dose prediction accuracy.

Results: Statistical voxel dose learning using separate models for each organ at risk had the lowest root mean squared error for all sites and modalities: 3.91 Gy (head and neck 4π), 3.21 Gy (head and neck volumetric-modulated arc therapy), 2.49 Gy (lung 4π), and 2.35 Gy (prostate volumetric-modulated arc therapy). Compared to using the original features, principal component analysis reduced the 4π prediction error for the head and neck spectral regression (-43.9%) and support vector regression (-42.8%) models and for the lung support vector regression (-24.4%) model. Principal component analysis was most effective when all or most of the principal components were retained. Separate organ-at-risk models were more accurate than training on all organ-at-risk voxels in all cases.

Conclusion: Compared with more sophisticated parametric machine learning methods with dimensionality reduction, statistical voxel dose learning is more robust to patient variability and provides the most accurate dose prediction.
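The statistical voxel dose learning idea described above, binning voxels by distance to the planning target volume (PTV) and predicting with per-bin medians, can be sketched as follows. This is an illustrative reconstruction, not the study's implementation; the bin width and the toy dose falloff are assumptions.

```python
import numpy as np

def fit_svdl(distances, doses, bin_width=2.0):
    """Bin training voxels by distance to the PTV (bin_width is an assumed
    parameter) and record the median dose per bin."""
    edges = np.arange(0.0, distances.max() + bin_width, bin_width)
    idx = np.digitize(distances, edges) - 1
    medians = np.array([
        np.median(doses[idx == b]) if np.any(idx == b) else np.nan
        for b in range(len(edges))
    ])
    return edges, medians

def predict_svdl(distances, edges, medians):
    """Predict the dose of new voxels by looking up their distance bin."""
    idx = np.clip(np.digitize(distances, edges) - 1, 0, len(medians) - 1)
    return medians[idx]

# Toy usage: synthetic voxels whose dose falls off with distance from the PTV.
rng = np.random.default_rng(0)
train_d = rng.uniform(0, 20, 1000)
train_dose = 60.0 * np.exp(-train_d / 8.0) + rng.normal(0, 1, 1000)
edges, medians = fit_svdl(train_d, train_dose)
pred = predict_svdl(np.array([1.0, 10.0]), edges, medians)
```

The per-organ variant found most accurate in the study would simply fit one such model per organ-at-risk structure rather than pooling all voxels.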
A general solution method for moral hazard problems
Principal-agent models are pervasive in theoretical and applied economics, but their analysis has largely been limited to the "first-order approach" (FOA), in which incentive compatibility is replaced by a first-order condition. This paper presents a new approach to solving a wide class of principal-agent problems that satisfy the monotone likelihood ratio property but may fail to meet the requirements of the FOA. Our approach solves the problem by tackling a max-min-max formulation over agent actions, alternate best responses by the agent, and contracts.
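As a schematic illustration in generic notation (not necessarily the paper's), a moral hazard problem with output $x$, wage contract $w(\cdot)$, action $a$, and effort cost $c(\cdot)$ can be written with incentive compatibility expressed as an inner minimization over the agent's alternate actions $a'$:

\[
\max_{w(\cdot),\, a}\; \mathbb{E}\!\left[\, v\big(x - w(x)\big) \,\middle|\, a \,\right]
\quad \text{s.t.} \quad
\min_{a'} \Big\{ \big(\mathbb{E}[u(w(x)) \mid a] - c(a)\big) - \big(\mathbb{E}[u(w(x)) \mid a'] - c(a')\big) \Big\} \ge 0 .
\]

Replacing this constraint by the first-order condition in $a$ is the FOA; retaining the full minimization over $a'$, while also searching over $a$ and $w$, is what gives rise to a max-min-max structure.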
Beyond Single-Deletion Correcting Codes: Substitutions and Transpositions
We consider the problem of designing low-redundancy codes in settings where one must correct deletions in conjunction with substitutions or adjacent transpositions; a combination of errors that is usually observed in DNA-based data storage. One of the most basic versions of this problem was settled more than 50 years ago by Levenshtein, who proved that binary Varshamov-Tenengolts codes correct one arbitrary edit error, i.e., one deletion or one substitution, with nearly optimal redundancy. However, this approach fails to extend to many simple and natural variations of the binary single-edit error setting. In this work, we make progress on the code design problem above in three such variations:
- We construct linear-time encodable and decodable length-n non-binary codes correcting a single edit error with nearly optimal redundancy log n+O(log log n), providing an alternative simpler proof of a result by Cai, Chee, Gabrys, Kiah, and Nguyen (IEEE Trans. Inf. Theory 2021). This is achieved by employing what we call weighted VT sketches, a new notion that may be of independent interest.
- We show the existence of a binary code correcting one deletion or one adjacent transposition with nearly optimal redundancy log n+O(log log n).
- We construct linear-time encodable and list-decodable binary codes with list-size 2 for one deletion and one substitution with redundancy 4 log n+O(log log n). This matches the existential bound up to an O(log log n) additive term.
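The binary Varshamov-Tenengolts codes mentioned above can be illustrated concretely. A minimal sketch: VT_a(n) is the set of binary words x of length n with weighted checksum sum_i i·x_i ≡ a (mod n+1), and any single deletion can be corrected. The decoder below is brute force for clarity; Levenshtein's actual decoder runs in linear time.

```python
from itertools import product

def vt_syndrome(x):
    """Weighted checksum sum_i i*x_i (1-indexed) mod (len(x)+1)."""
    return sum(i * b for i, b in enumerate(x, 1)) % (len(x) + 1)

def vt_decode(y, n, a):
    """Recover the unique codeword of VT_a(n) from which y arose by one
    deletion, by trying every insertion (brute force, for illustration)."""
    candidates = {
        tuple(y[:i] + [b] + y[i:])
        for i in range(n) for b in (0, 1)
    }
    hits = [c for c in candidates if vt_syndrome(c) == a]
    # The single-deletion-correcting property guarantees uniqueness.
    assert len(hits) == 1
    return list(hits[0])

# Enumerate VT_0(8), delete one bit from a codeword, and recover it.
n, a = 8, 0
code = [list(c) for c in product([0, 1], repeat=n) if vt_syndrome(c) == a]
x = code[3]
y = x[:2] + x[3:]          # delete the bit at position 2
recovered = vt_decode(y, n, a)
```

The weighted VT sketches introduced in the paper generalize this checksum idea to the non-binary setting; the snippet above covers only the classical binary case.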
Ethical Challenges in Data-Driven Dialogue Systems
The use of dialogue systems as a medium for human-machine interaction is an increasingly prevalent paradigm. A growing number of dialogue systems use conversation strategies that are learned from large datasets. There are well-documented instances where interactions with these systems have resulted in biased or even offensive conversations due to the data-driven training process. Here, we highlight potential ethical issues that arise in dialogue systems research, including: implicit biases in data-driven systems, the rise of adversarial examples, potential sources of privacy violations, safety concerns, special considerations for reinforcement learning systems, and reproducibility concerns. We also suggest areas stemming from these issues that deserve further investigation. Through this initial survey, we hope to spur research leading to robust, safe, and ethically sound dialogue systems.

Comment: In submission to the AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society.
Classical density functional theory & simulations on a coarse-grained model of aromatic ionic liquids.
A new classical density functional approach is developed to accurately treat a coarse-grained model of room-temperature aromatic ionic liquids. Our major innovation is the introduction of charge-charge correlations, which are treated in a simple phenomenological way. We test this theory on a generic coarse-grained model for aromatic RTILs with oligomeric forms for both cations and anions, approximating 1-alkyl-3-methyl imidazoliums and BF4(-), respectively. We find that predictions by the new density functional theory for fluid structures at charged surfaces are very accurate when compared with molecular dynamics simulations, across a range of surface charge densities and alkyl chain lengths. Predictions of interactions between charged surfaces are also presented.