226 research outputs found

    Secondary ionisations in a wall-less ion-counting nanodosimeter: quantitative analysis and the effect on the comparison of measured and simulated track structure parameters in nanometric volumes

    The object of investigation in nanodosimetry is the physical characteristics of the microscopic structure of ionising particle tracks, i.e. the sequence of the interaction types and interaction sites of a primary particle and all its secondaries, which reflects the stochastic nature of the radiation interaction. In view of the upcoming radiation therapy with protons and carbon ions, the ionisation structure of the ion track is of particular interest. Owing to limitations in current detector technology, the only way to determine the ionisation cluster size distribution in a DNA segment is to simulate the particle track structure in condensed matter. This is done using dedicated computer programs based on Monte Carlo procedures simulating the interaction of the primary ions with the target. Hence, there is a need to benchmark these computer codes using suitable experimental data. Ionisation cluster size distributions produced in the nanodosimeter's sensitive volume by monoenergetic protons and alpha particles (with energies between 0.1 MeV and 20 MeV) were measured at the PTB ion accelerator facilities. C3H8 and N2 were alternately used as the working gas. The measured data were compared with the simulation results obtained with the PTB Monte Carlo code PTra [B. Grosswendt, Radiat. Environ. Biophys. 41, 103 (2002); M.U. Bug, E. Gargioni, H. Nettelbeck, W.Y. Baek, G. Hilgers, A.B. Rosenfeld, H. Rabus, Phys. Rev. E 88, 043308 (2013)]. Measured and simulated characteristics of the particle track structure are generally in good agreement for protons over the entire energy range investigated. For alpha particles with energies higher than the Bragg peak energy, good agreement is also seen, whereas for energies lower than the Bragg peak energy differences of as much as 25% occur. Significant deviations are only observed for large ionisation cluster sizes. These deviations can be explained by a background consisting of secondary ions. These ions are produced in the region downstream of the extraction aperture by electrons with a kinetic energy of about 2.5 keV, which are themselves released by ions of the primary ionisation cluster hitting an electrode in the ion transport system. Including this background of secondary ions in the simulated cluster size distributions leads to a significantly better agreement between measured and simulated data, especially for large ionisation clusters.
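
    The abstract above attributes the improved agreement to folding a background of secondary ions, triggered by ions of the primary cluster, into the simulated cluster size distribution. The following is a minimal sketch of one way such a background could be added to simulated cluster sizes; the Poisson background model, the rate value, and the toy stand-in for the PTra output are illustrative assumptions, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def add_secondary_ion_background(simulated_sizes, bg_mean_per_ion=0.05):
        # Each ion of the primary cluster may release electrons at an electrode,
        # which in turn produce extra ions downstream of the extraction aperture.
        # bg_mean_per_ion is an illustrative mean count of such background ions
        # per primary-cluster ion (hypothetical value).
        simulated_sizes = np.asarray(simulated_sizes)
        background = rng.poisson(bg_mean_per_ion * simulated_sizes)
        return simulated_sizes + background

    def cluster_size_distribution(sizes):
        # Normalised probability P(nu) of observing ionisation cluster size nu.
        counts = np.bincount(np.asarray(sizes))
        return counts / counts.sum()

    # Toy usage: a stand-in for the simulated (PTra-like) cluster sizes.
    simulated = rng.poisson(3.0, size=100_000)
    corrected = add_secondary_ion_background(simulated)
    print(cluster_size_distribution(simulated)[:8])
    print(cluster_size_distribution(corrected)[:8])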

    Theory of continuum percolation I. General formalism

    The theoretical basis of continuum percolation has changed greatly since its beginning as little more than an analogy with lattice systems. Nevertheless, there is yet no comprehensive theory of this field. A basis for such a theory is provided here with the introduction of the Potts fluid, a system of interacting $s$-state spins which are free to move in the continuum. In the $s \to 1$ limit, the Potts magnetization, susceptibility and correlation functions are directly related to the percolation probability, the mean cluster size and the pair-connectedness, respectively. Through the Hamiltonian formulation of the Potts fluid, the standard methods of statistical mechanics can therefore be used in the continuum percolation problem. Comment: 26 pages, LaTeX.
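
    Schematically, the mapping described in this abstract can be written as the following correspondences in the $s \to 1$ limit (normalisation factors, which are not given in the abstract, are omitted here):

    \[
      P(\rho) \;\sim\; \lim_{s \to 1} \langle M \rangle, \qquad
      S(\rho) \;\sim\; \lim_{s \to 1} \chi, \qquad
      g^{\dagger}(\vec{x}, \vec{y}) \;\sim\; \lim_{s \to 1} \langle \lambda(\vec{x})\, \lambda(\vec{y}) \rangle_{c},
    \]

    where $P$ is the percolation probability, $S$ the mean cluster size, $g^{\dagger}$ the pair-connectedness, and $M$, $\chi$ and the connected spin-spin correlation are the corresponding Potts-fluid quantities; $\lambda$ denotes the Potts spin variable (notation chosen here for illustration only).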

    Exact solution of a one-dimensional continuum percolation model

    I consider a one-dimensional system of particles which interact through a hard core of diameter $\sigma$ and can connect to each other if they are closer than a distance $d$. The mean cluster size increases as a function of the density $\rho$ until it diverges at some critical density, the percolation threshold. This system can be mapped onto an off-lattice generalization of the Potts model which I have called the Potts fluid, and in this way the mean cluster size, pair connectedness and percolation probability can be calculated exactly. The mean cluster size is $S = 2\exp[\rho(d - \sigma)/(1 - \rho\sigma)] - 1$ and diverges only at the close-packing density $\rho_{cp} = 1/\sigma$. This is confirmed by the behavior of the percolation probability. These results should help in judging the effectiveness of approximations or simulation methods before they are applied to higher dimensions. Comment: 21 pages, LaTeX.
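
    As a quick illustration, the closed-form mean cluster size quoted above can be evaluated directly; the sketch below (with illustrative values of $\sigma$ and $d$, not taken from the paper) shows $S$ growing rapidly as the density approaches the close-packing value $1/\sigma$.

    import numpy as np

    def mean_cluster_size(rho, sigma, d):
        # S = 2*exp[rho*(d - sigma)/(1 - rho*sigma)] - 1, as given in the abstract.
        return 2.0 * np.exp(rho * (d - sigma) / (1.0 - rho * sigma)) - 1.0

    sigma, d = 1.0, 1.5          # illustrative hard-core diameter and connection range
    rho_cp = 1.0 / sigma         # close-packing density, where S diverges
    for rho in (0.2, 0.5, 0.8, 0.95, 0.99):
        print(f"rho = {rho:4.2f}   S = {mean_cluster_size(rho, sigma, d):.3g}")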

    Theory of continuum percolation II. Mean field theory

    I use a previously introduced mapping between the continuum percolation model and the Potts fluid to derive a mean field theory of continuum percolation systems. This is done by introducing a new variational principle, the basis of which has to be taken, for now, as heuristic. The critical exponents obtained are $\beta = 1$, $\gamma = 1$ and $\nu = 0.5$, which are identical with the mean field exponents of lattice percolation. The critical density in this approximation is $\rho_c = 1/V_e$, where $V_e = \int d\vec{x}\, p(\vec{x}) \{\exp[-v(\vec{x})/kT] - 1\}$; here $p(\vec{x})$ is the binding probability of two particles separated by $\vec{x}$ and $v(\vec{x})$ is their interaction potential. Comment: 25 pages, LaTeX.
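
    The critical-density expression quoted above can be evaluated numerically once $p$ and $v$ are specified. The sketch below integrates the formula exactly as written in the abstract for a purely illustrative, spherically symmetric toy model (hard core of diameter sigma, attractive square well of depth eps out to the binding range d, binding probability 1 within d); none of these model choices or parameter values come from the paper.

    import numpy as np
    from scipy.integrate import quad

    kT = 1.0
    sigma, d, eps = 1.0, 2.0, 1.0    # toy hard-core diameter, binding range, well depth

    def p(r):
        # Toy binding probability: particles closer than d are considered bound.
        return 1.0 if r < d else 0.0

    def boltzmann(r):
        # exp[-v(r)/kT] for a hard core with an attractive square well (toy choice).
        if r < sigma:
            return 0.0                   # v = +infinity inside the hard core
        if r < d:
            return np.exp(eps / kT)      # v = -eps inside the well
        return 1.0                       # v = 0 outside

    # V_e = integral d^3x p(x) {exp[-v(x)/kT] - 1}, reduced to a radial integral
    # by the assumed spherical symmetry.
    integrand = lambda r: 4.0 * np.pi * r**2 * p(r) * (boltzmann(r) - 1.0)
    V_e, _ = quad(integrand, 0.0, d, points=[sigma])
    print(f"V_e = {V_e:.2f}   mean-field rho_c = 1/V_e = {1.0 / V_e:.4f}")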

    Designing online, educational games about microbes, hand and respiratory hygiene and prudent antibiotics use for junior pupils across Europe


    A Formal Ontology of Subcellular Neuroanatomy

    The complexity of the nervous system requires high-resolution microscopy to resolve the detailed 3D structure of nerve cells and supracellular domains. The analysis of such imaging data to extract cellular surfaces and cell components often requires the combination of expert human knowledge with carefully engineered software tools. In an effort to make better tools to assist humans in this endeavor, to create a more accessible and permanent record of their data, and to aid the process of constructing complex and detailed computational models, we have created a core of formalized knowledge about the structure of the nervous system and have integrated that core into several software applications. In this paper, we describe the structure and content of a formal ontology whose scope is the subcellular anatomy of the nervous system (SAO), covering nerve cells, their parts, and interactions between these parts. Many applications of this ontology to image annotation, content-based retrieval of structural data, and integration of shared data across scales and researchers are also described.

    Feasibility of azacitidine added to standard chemotherapy in older patients with acute myeloid leukemia - a randomised SAL pilot study

    INTRODUCTION: Older patients with acute myeloid leukemia (AML) experience short survival despite intensive chemotherapy. Azacitidine has promising activity in patients with low proliferating AML. The aim of the dose-finding part of this trial was to evaluate the feasibility and safety of azacitidine combined with cytarabine- and daunorubicin-based chemotherapy in older patients with AML. TRIAL DESIGN: Prospective, randomised, open, phase II trial with parallel group design and fixed sample size. PATIENTS AND METHODS: Patients aged 61 years or older, with untreated acute myeloid leukemia, a leukocyte count of <20,000/µl at the time of study entry, and adequate organ function were eligible. Patients were randomised to receive azacitidine at either 37.5 (dose level 1) or 75 mg/sqm (dose level 2) for five days before each cycle of induction (7+3 cytarabine plus daunorubicin) and consolidation (intermediate-dose cytarabine) therapy. Dose-limiting toxicity was the primary endpoint. RESULTS: Six patients were randomised into each dose level and were evaluable for analysis. No dose-limiting toxicity occurred at either dose level. Nine serious adverse events occurred in five patients (three in the 37.5 mg arm, two in the 75 mg arm), with two fatal outcomes. Two patients at the 37.5 mg/sqm dose level and four patients at the 75 mg/sqm level achieved a complete remission after induction therapy. Median overall survival was 266 days and median event-free survival 215 days after a median follow-up of 616 days. CONCLUSIONS: The combination of azacitidine 75 mg/sqm with standard induction therapy is feasible in older patients with AML and was selected as an investigational arm in the randomised controlled part of this phase II study, which is currently halted due to increased cardiac toxicity observed in the experimental arm.

    The Neuroscience Information Framework: A Data and Knowledge Environment for Neuroscience

    With support from the Institutes and Centers forming the NIH Blueprint for Neuroscience Research, we have designed and implemented a new initiative for integrating access to and use of Web-based neuroscience resources: the Neuroscience Information Framework. The Framework arises from the expressed need of the neuroscience community for neuroinformatic tools and resources to aid scientific inquiry, builds upon prior development of neuroinformatics by the Human Brain Project and others, and directly derives from the Society for Neuroscience’s Neuroscience Database Gateway. Partnered with the Society, its Neuroinformatics Committee, and volunteer consultant-collaborators, our multi-site consortium has developed: (1) a comprehensive, dynamic inventory of Web-accessible neuroscience resources, (2) an extended and integrated terminology describing resources and contents, and (3) a framework accepting and aiding concept-based queries. Evolving instantiations of the Framework may be viewed at http://nif.nih.gov, http://neurogateway.org, and other sites as they come online.

    Generalized model for dynamic percolation

    We study the dynamics of a carrier, which performs a biased motion under the influence of an external field $E$, in an environment which is modeled by dynamic percolation and created by hard-core particles. The particles move randomly on a simple cubic lattice, constrained by hard-core exclusion, and they spontaneously annihilate and re-appear at some prescribed rates. Using a decoupling of the third-order correlation functions into the product of the pairwise carrier-particle correlations, we determine the density profiles of the "environment" particles, as seen from the stationary moving carrier, and calculate its terminal velocity, $V_c$, as a function of the applied field and other system parameters. We find that for sufficiently small driving forces the force exerted on the carrier by the "environment" particles shows a viscous-like behavior. An analog of the Stokes formula for such dynamic percolative environments and the corresponding friction coefficient are derived. We show that the density profile of the environment particles is strongly inhomogeneous: in front of the stationary moving carrier the density is higher than the average density, $\rho_s$, and approaches the average value as an exponential function of the distance from the carrier. Past the carrier the local density is lower than $\rho_s$, and the relaxation towards $\rho_s$ may proceed differently depending on whether the particle number is or is not explicitly conserved. Comment: LaTeX, 32 pages, 4 ps-figures, submitted to PR.
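
    A minimal sketch of the density profile described above, i.e. an enhancement in front of the moving carrier that relaxes exponentially towards the average density $\rho_s$; the amplitude and decay length used here are placeholders, since the abstract states the functional form but not their values.

    import numpy as np

    def density_ahead_of_carrier(x, rho_s, amplitude, decay_length):
        # rho(x) = rho_s + amplitude * exp(-x / decay_length): enhanced near the
        # carrier (x = 0) and relaxing to the average density rho_s at large x.
        # 'amplitude' and 'decay_length' stand in for the field- and
        # parameter-dependent expressions derived in the paper.
        return rho_s + amplitude * np.exp(-x / decay_length)

    x = np.linspace(0.0, 10.0, 6)
    print(density_ahead_of_carrier(x, rho_s=0.3, amplitude=0.1, decay_length=2.0))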