Modelling of gas dynamical properties of the KATRIN tritium source and implications for the neutrino mass measurement
The KATRIN experiment aims to measure the effective mass of the electron
antineutrino from the analysis of electron spectra stemming from the beta-decay
of molecular tritium with a sensitivity of 200 meV. Therefore, a daily
throughput of about 40 g of gaseous tritium is circulated in a windowless
source section. An accurate description of the gas flow through this section is
of fundamental importance for the neutrino mass measurement as it significantly
influences the generation and transport of beta-decay electrons through the
experimental setup. In this paper we present a comprehensive model consisting
of calculations of rarefied gas flow through the different components of the
source section ranging from viscous to free molecular flow. By connecting these
simulations with a number of experimentally determined operational parameters
the gas model can be refreshed regularly according to the measured operating
conditions. In this work, measurement and modelling uncertainties are
quantified with regard to their implications for the neutrino mass measurement.
We find that the systematic uncertainties related to the description of gas
flow are represented by eV,
and that the gas model is ready to be used in the analysis of upcoming KATRIN
data. Comment: 28 pages, 13 figures
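As background not stated in the abstract, the viscous and free molecular regimes it refers to are conventionally distinguished by the Knudsen number, the ratio of the gas mean free path $\lambda$ to a characteristic length $L$ of the geometry (e.g. the source tube diameter):

```latex
\mathrm{Kn} = \frac{\lambda}{L},
\qquad
\begin{cases}
\mathrm{Kn} \ll 1 & \text{viscous (hydrodynamic) flow},\\
\mathrm{Kn} \sim 1 & \text{transitional flow},\\
\mathrm{Kn} \gg 1 & \text{free molecular flow}.
\end{cases}
```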
Ischemic preconditioning attenuates portal venous plasma concentrations of purines following warm liver ischemia in man
Background/Aims: Degradation of adenine nucleotides to adenosine has been suggested to play a critical role in ischemic preconditioning (IPC). Thus, we questioned in patients undergoing partial hepatectomy whether (i) IPC will increase plasma purine catabolites and whether (ii) formation of purines in response to vascular clamping (Pringle maneuver) can be attenuated by prior IPC. Methods: 75 patients were randomly assigned to three groups: group I underwent hepatectomy without vascular clamping; group II was subjected to the Pringle maneuver during resection, and group III was preconditioned (10 min ischemia and 10 min reperfusion) prior to the Pringle maneuver for resection. Central, portal venous and arterial plasma concentrations of adenosine, inosine, hypoxanthine and xanthine were determined by high-performance liquid chromatography. Results: Duration of the Pringle maneuver did not differ between patients with or without IPC. Surgery without vascular clamping had only a minor effect on plasma purines; after preconditioning, purine concentrations transiently increased. After the Pringle maneuver alone, purine plasma concentrations were most increased. This strong rise in plasma purines caused by the Pringle maneuver, however, was significantly attenuated by IPC. When the portal venous minus arterial concentration difference was calculated for inosine or hypoxanthine, the respective differences became positive in patients subjected to the Pringle maneuver, an effect completely prevented by preconditioning. Conclusion: These data demonstrate that (i) IPC increases formation of adenosine, and that (ii) the unwanted degradation of adenine nucleotides to purines caused by the Pringle maneuver can be attenuated by IPC. Because IPC also induces a decrease of portal venous minus arterial purine plasma concentration differences, IPC might possibly decrease disturbances in the energy metabolism in the intestine as well. Copyright (C) 2005 S. Karger AG, Basel
Proving Safety with Trace Automata and Bounded Model Checking
Loop under-approximation is a technique that enriches C programs with
additional branches that represent the effect of a (limited) range of loop
iterations. While this technique can speed up the detection of bugs
significantly, it introduces redundant execution traces which may complicate
the verification of the program. This holds particularly true for verification
tools based on Bounded Model Checking, which incorporate simplistic heuristics
to determine whether all feasible iterations of a loop have been considered.
We present a technique that uses \emph{trace automata} to eliminate redundant
executions after performing loop acceleration. The method reduces the diameter
of the program under analysis, which is in certain cases sufficient to allow a
safety proof using Bounded Model Checking. Our transformation is precise---it
does not introduce false positives, nor does it mask any errors. We have
implemented the analysis as a source-to-source transformation, and present
experimental results showing the applicability of the technique
Efficient Interpolation for the Theory of Arrays
Existing techniques for Craig interpolation for the quantifier-free fragment
of the theory of arrays are inefficient for computing sequence and tree
interpolants: the solver needs to run for every partitioning of the
interpolation problem to avoid creating AB-mixed terms. We present a new
approach using Proof Tree Preserving Interpolation and an array solver based on
Weak Equivalence on Arrays. We give an interpolation algorithm for the lemmas
produced by the array solver. The computed interpolants have worst-case
exponential size for extensionality lemmas and worst-case quadratic size
otherwise. We show that these bounds are strict in the sense that there are
lemmas with no smaller interpolants. We implemented the algorithm and show that
the produced interpolants are useful to prove memory safety for C programs. Comment: long version of the paper at IJCAR 201
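For readers unfamiliar with the notion, a minimal array-free example of a Craig interpolant (the formulas are illustrative, not from the paper): for an unsatisfiable conjunction $A \wedge B$, an interpolant $I$ satisfies $A \models I$, $I \wedge B$ is unsatisfiable, and $I$ uses only symbols shared between $A$ and $B$.

```latex
A:\; (x = a) \wedge (a = 2), \qquad B:\; x < 1.
% Shared symbol: x (the symbol a occurs only in A).
% A valid interpolant:
I:\; x = 2,
\quad\text{since } A \models I \text{ and } I \wedge B \text{ is unsatisfiable.}
```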
Using a runway paradigm to assess the relative strength of rats' motivations for enrichment objects
Laboratory animals should be provided with enrichment objects in their cages; however, it is first necessary to
test whether the proposed enrichment objects provide benefits that increase the animals’ welfare. The two main
paradigms currently used to assess proposed enrichment objects are the choice test, which is limited to determining
relative frequency of choice, and consumer demand studies, which can indicate the strength of a preference but are complex to design. Here, we propose a third methodology: a runway paradigm, which can be used to assess the strength of an animal’s motivation for enrichment objects, is simpler to use than consumer demand studies, and is faster to complete than typical choice tests. Time spent with objects in a standard choice test was used to rank several enrichment objects in order to compare with the ranking found in our runway paradigm. The rats ran significantly more times, ran faster, and interacted longer with objects with which they had previously spent the most time. It was concluded that this simple methodology is suitable for measuring rats’ motivation to reach enrichment objects. This can be used to assess the preference for different types of enrichment objects or to measure reward system processes
Epigenetic deregulation of multiple S100 gene family members by differential hypomethylation and hypermethylation events in medulloblastoma
Deregulated expression of genes encoding members of the S100 family of calcium-binding proteins has been associated with the malignant progression of multiple tumour types. Using a pharmacological expression reactivation approach, we screened 16 S100 genes for evidence of epigenetic regulation in medulloblastoma, the most common malignant brain tumour of childhood. Four family members (S100A2, S100A4, S100A6 and S100A10) demonstrated evidence of upregulated expression in multiple medulloblastoma cell lines, following treatment with the DNA methyltransferase inhibitor, 5′-aza-2′-deoxycytidine. Subsequent analysis revealed methylation of critical CpG sites located within these four genes in an extended cell line panel. Assessment of these genes in the non-neoplastic cerebellum (from which medulloblastomas develop) revealed strong somatic methylation affecting S100A2 and S100A4, whereas S100A6 and S100A10 were unmethylated. Assessed against these normal tissue-specific methylation states, S100A6 and S100A10 demonstrated tumour-specific hypermethylation in medulloblastoma primary tumours (5 out of 40 and 4 out of 35, respectively, both 12%) and cell lines (both 7 out of 9, 78%), which was associated with their transcriptional silencing. Moreover, S100A6 hypermethylation was significantly associated with the aggressive large cell/anaplastic morphophenotype (P=0.026). In contrast, pro-metastatic S100A4 displayed evidence of hypomethylation relative to the normal cerebellum in a significant proportion of primary tumours (7 out of 41, 17%) and cell lines (3 out of 9, 33%), which was associated with its elevated expression. In summary, these data characterise complex patterns of somatic methylation affecting S100 genes in the normal cerebellum and demonstrate their disruption causing epigenetic deregulation of multiple S100 family members in medulloblastoma development.
Epigenetic events affecting S100 genes have potential clinical utility and merit further investigation as molecular biomarkers for this disease