Radiation therapy calculations using an on-demand virtual cluster via cloud computing
Computer hardware costs are the limiting factor in producing highly accurate
radiation dose calculations on convenient time scales. Because of this,
large-scale, full Monte Carlo simulations and other resource intensive
algorithms are often considered infeasible for clinical settings. The emerging
cloud computing paradigm promises to fundamentally alter the economics of such
calculations by providing relatively cheap, on-demand, pay-as-you-go computing
resources over the Internet. We believe that cloud computing will usher in a
new era, in which very large scale calculations will be routinely performed by
clinics and researchers using cloud-based resources. In this research, several
proof-of-concept radiation therapy calculations were successfully performed on
a cloud-based virtual Monte Carlo cluster. Performance evaluations were made of
a distributed processing framework developed specifically for this project. The
expected 1/n performance was observed with some caveats. The economics of
cloud-based virtual computing clusters versus traditional in-house hardware is
also discussed. For most situations, cloud computing can provide substantial
cost savings for distributed calculations.
Comment: 12 pages, 4 figures
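The expected 1/n scaling can be illustrated with a simple cost model (a sketch under assumed numbers, not the paper's actual framework): Monte Carlo histories are independent, so compute time divides evenly among n workers, while a fixed per-run overhead (VM provisioning, input staging) is the kind of caveat that limits speedup at large n.

```python
def wall_time(n_workers, serial_hours, overhead_hours=0.1):
    """Idealised cost model for an embarrassingly parallel Monte Carlo run.

    Independent histories split across n workers give the expected 1/n
    compute time; the fixed overhead term (illustrative value) models
    per-run costs such as cluster start-up and data transfer.
    """
    return serial_hours / n_workers + overhead_hours

def speedup(n_workers, serial_hours, overhead_hours=0.1):
    """Speedup relative to a single-machine run of the same workload."""
    return serial_hours / wall_time(n_workers, serial_hours, overhead_hours)

if __name__ == "__main__":
    # Speedup approaches 1/overhead, not n, as the cluster grows.
    for n in (1, 10, 100, 1000):
        print(n, round(speedup(n, serial_hours=100.0), 1))
```

In a pay-as-you-go model, total cost is roughly n × wall_time × hourly rate, so the overhead term is also what determines when adding nodes stops being economical.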
Charged hadron beam therapy: fast computational physics methods
Charged hadron beams have been investigated for use in radiation therapy of cancer since the 1940s due to their unique potential to place tightly conformal radiation doses deep inside tissue. This is achieved by exploiting the phenomenon of the so-called Bragg peak. In both research and clinical settings, fast and accurate radiation calculations play a crucial role in charged hadron therapy physics. Unfortunately, physicists are often faced with the fundamental trade-off of speed versus accuracy in their calculations. This dissertation addresses this trade-off by presenting three computational physics methods for specific and general charged hadron beam therapy calculations. In this dissertation the pseudo-Monte Carlo method of track repeating is adapted for fast calculations of linear energy transfer (LET) and for fast estimation of dose in the peripheral regions of the target volume (i.e. secondary dose estimation). Additionally, the first proof-of-concept framework for carrying out massively distributed parallel Monte Carlo calculations for radiation therapy using cloud computing is presented. Performance and accuracy assessments of each calculation method are also presented.
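The core idea of track repeating can be sketched as follows (hypothetical numbers, not the dissertation's implementation): a particle track is simulated once in full detail in water, then "repeated" in another material by rescaling each step length with that material's stopping power relative to water, reusing the per-step energy deposits unchanged. LET then falls out as energy deposited per unit path length.

```python
# A precomputed track: (step_length_mm, energy_deposit_MeV) pairs from a
# one-off full simulation in water. The rising dE/dx toward the end of
# range mimics the approach to the Bragg peak (illustrative values only).
WATER_TRACK = [(10.0, 5.0), (10.0, 6.0), (5.0, 9.0), (1.0, 20.0)]

def repeat_track(track, relative_stopping_power):
    """Replay a precomputed water track in another medium.

    A denser medium (relative stopping power > 1) shortens each step, so
    the track, and hence the Bragg peak, ends at a shallower depth.
    Returns (cumulative_depth_mm, energy_deposit_MeV) pairs.
    """
    depth, out = 0.0, []
    for step_mm, edep_mev in track:
        depth += step_mm / relative_stopping_power
        out.append((depth, edep_mev))
    return out

def track_let(track):
    """Per-step LET estimate (MeV/mm): deposit divided by step length."""
    return [edep / step for step, edep in track]
```

For example, repeating the water track in a medium twice as stopping halves the range (26 mm becomes 13 mm), which is the cheap range-scaling that makes the method fast compared with re-simulating every track.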
First results from the LUCID-Timepix spacecraft payload onboard the TechDemoSat-1 satellite in Low Earth Orbit
The Langton Ultimate Cosmic ray Intensity Detector (LUCID) is a payload
onboard the satellite TechDemoSat-1, used to study the radiation environment in
Low Earth Orbit (635 km). LUCID operated from 2014 to 2017, collecting
over 2.1 million frames of radiation data from its five Timepix detectors on
board. LUCID is one of the first uses of the Timepix detector technology in
open space, with the data providing useful insight into the performance of this
technology in new environments. It provides high-sensitivity imaging
measurements of the mixed radiation field, with a wide dynamic range in terms
of spectral response, particle type and direction. The data has been analysed
using computing resources provided by GridPP, with a new machine learning
algorithm that uses the Tensorflow framework. This algorithm provides a new
approach to processing Medipix data, using a training set of human labelled
tracks, providing greater particle classification accuracy than other
algorithms. For managing the LUCID data, we have developed an online platform
called Timepix Analysis Platform at School (TAPAS). This provides a swift and
simple way for users to analyse data that they collect using Timepix detectors
from both LUCID and other experiments. We also present some possible future
uses of the LUCID data and Medipix detectors in space.
Comment: Accepted for publication in Advances in Space Research
Virtual Reality Games for Motor Rehabilitation
This paper presents a fuzzy logic based method to track user satisfaction without the need for devices to monitor users’ physiological conditions. User satisfaction is the key to any product’s acceptance; computer applications and video games provide a unique opportunity to provide a tailored environment for each user to better suit their needs. We have implemented a non-adaptive fuzzy logic model of emotion, based on the emotional component of the Fuzzy Logic Adaptive Model of Emotion (FLAME) proposed by El-Nasr, to estimate player emotion in Unreal Tournament 2004. In this paper we describe the implementation of this system and present the results of one of several play tests. Our research contradicts the current literature that suggests physiological measurements are needed. We show that it is possible to use a software-only method to estimate user emotion.
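The general shape of such a fuzzy inference step can be sketched as follows. This is a toy illustration of the fuzzy-logic approach, not FLAME or the paper's rule base: an in-game statistic (a made-up "deaths per minute" signal) is fuzzified with triangular membership functions, two hypothetical rules fire, and a weighted average of output singletons defuzzifies the result into an emotion score.

```python
def triangular(x, a, b, c):
    """Triangular membership function: rises on [a, b], falls on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def estimate_frustration(deaths_per_minute):
    """Toy fuzzy rule base mapping a gameplay statistic to an emotion score.

    Rule 1: IF death rate is low  THEN player is content    (output 0.1)
    Rule 2: IF death rate is high THEN player is frustrated (output 0.9)
    Defuzzification: weighted average of the rule outputs.
    All thresholds and outputs here are illustrative assumptions.
    """
    low = triangular(deaths_per_minute, -1.0, 0.0, 2.0)   # "dying rarely"
    high = triangular(deaths_per_minute, 1.0, 3.0, 5.0)   # "dying often"
    content, frustrated = 0.1, 0.9                         # output singletons
    total = low + high
    if total == 0.0:
        return frustrated  # off the modelled scale: clamp to frustrated
    return (low * content + high * frustrated) / total
```

Because the inputs are ordinary game-state statistics rather than sensor readings, this kind of pipeline needs no physiological monitoring hardware, which is the point the abstract is making.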
The impact of cellular characteristics on the evolution of shape homeostasis
The importance of individual cells in a developing multicellular organism is
well known but precisely how the individual cellular characteristics of those
cells collectively drive the emergence of robust, homeostatic structures is
less well understood. For example, cell communication via a diffusible factor
allows for information to travel across large distances within the population,
and cell polarisation makes it possible to form structures with a particular
orientation, but how do these processes interact to produce a more robust and
regulated structure? In this study we investigate the ability of cells with
different cellular characteristics to grow and maintain homeostatic structures.
We do this in the context of an individual-based model where cell behaviour is
driven by an intra-cellular network that determines the cell phenotype. More
precisely, we investigated evolution with 96 different permutations of our
model, where cell motility, cell death, long-range growth factor (LGF),
short-range growth factor (SGF) and cell polarisation were either present or
absent. The results show that LGF has the largest positive impact on the
fitness of the evolved solutions. SGF and polarisation also contribute, but all
other capabilities essentially increase the search space, effectively making it
more difficult to achieve a solution. By perturbing the evolved solutions, we
found that they are highly robust to both mutations and wounding. In addition,
we observed that solutions evolved in more unstable environments produce
structures that are more robust and adaptive. In conclusion, our
results suggest that robust collective behaviour is most likely to evolve when
cells are endowed with long range communication, cell polarisation, and
selection pressure from an unstable environment.