A Simulated Annealing Approach to Approximate Bayes Computations
Approximate Bayes Computations (ABC) are used for parameter inference when
the likelihood function of the model is expensive to evaluate but relatively
cheap to sample from. In particle ABC, an ensemble of particles in the product
space of model outputs and parameters is propagated in such a way that its
output marginal approaches a delta function at the data and its parameter
marginal approaches the posterior distribution. Inspired by Simulated
Annealing, we present a new class of particle algorithms for ABC, based on a
sequence of Metropolis kernels, associated with a decreasing sequence of
tolerances w.r.t. the data. Unlike other algorithms, our class of algorithms is
not based on importance sampling. Hence, it does not suffer from a loss of
effective sample size due to re-sampling. We prove convergence under a
condition on the speed at which the tolerance is decreased. Furthermore, we
present a scheme that adapts the tolerance and the jump distribution in
parameter space according to some mean-fields of the ensemble, which preserves
the statistical independence of the particles, in the limit of infinite sample
size. This adaptive scheme aims at converging as close as possible to the
correct result with as few system updates as possible via minimizing the
entropy production in the system. The performance of this new class of
algorithms is compared against two other recent algorithms on two toy examples.Comment: 20 pages, 2 figure
Cancer Biomarker Discovery: The Entropic Hallmark
Background: It is a commonly accepted belief that cancer cells modify their transcriptional state during the progression of the disease. We propose that the progression of cancer cells towards malignant phenotypes can be efficiently tracked using high-throughput technologies that follow the gradual changes observed in the gene expression profiles, by employing Shannon's mathematical theory of communication. Methods based on Information Theory can then quantify the divergence of cancer cells' transcriptional profiles from those of normally appearing cells of the originating tissues. The relevance of the proposed methods can be evaluated using microarray datasets available in the public domain, but the method is in principle applicable to other high-throughput methods.

Methodology/Principal Findings: Using melanoma and prostate cancer datasets, we illustrate how Shannon Entropy and the Jensen-Shannon divergence can be employed to trace the transcriptional changes as the disease progresses. We establish how the variations of these two measures correlate with established biomarkers of cancer progression. The Information Theory measures allow us to identify novel biomarkers for both progressive and relatively more sudden transcriptional changes leading to malignant phenotypes. At the same time, the methodology was able to validate a large number of genes and processes that seem to be implicated in the progression of melanoma and prostate cancer.

Conclusions/Significance: We thus present a quantitative guiding rule, a new unifying hallmark of cancer: the cancer cell's transcriptome changes lead to measurable transitions of Normalized Shannon Entropy values (as measured by high-throughput technologies). At the same time, tumor cells increase their divergence from the normal tissue profile, increasing their disorder via the creation of states that we might not directly measure. This unifying hallmark allows us, via the Jensen-Shannon divergence, to identify the arrow of time of the processes from the gene expression profiles, and helps to map the phenotypical and molecular hallmarks of specific cancer subtypes. The deep mathematical basis of the approach allows us to suggest that this principle is, hopefully, of general applicability to other diseases.
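The two information-theoretic quantities the abstract builds on are simple to state concretely. A minimal Python sketch, using toy probability vectors rather than real expression profiles:

```python
import numpy as np

def normalized_shannon_entropy(p):
    """Shannon entropy of a discrete distribution, normalized to [0, 1]
    by dividing by log2 of the number of states."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    nz = p[p > 0]  # 0 * log 0 is taken as 0
    return float(-(nz * np.log2(nz)).sum() / np.log2(len(p)))

def jensen_shannon_divergence(p, q):
    """Base-2 Jensen-Shannon divergence; symmetric and bounded in [0, 1]."""
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    m = 0.5 * (p + q)

    def kl(a, b):
        mask = a > 0
        return float((a[mask] * np.log2(a[mask] / b[mask])).sum())

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Toy "expression profiles" over four genes (illustrative, not real data)
normal = [0.25, 0.25, 0.25, 0.25]   # maximally disordered profile
tumour = [0.70, 0.10, 0.10, 0.10]   # expression concentrated on one gene

print(normalized_shannon_entropy(normal))  # → 1.0 (uniform distribution)
print(normalized_shannon_entropy(tumour))
print(jensen_shannon_divergence(normal, tumour))
```

Tracking the entropy of a sample's profile over disease stages gives the "measurable transitions" of the abstract, while the Jensen-Shannon divergence quantifies how far a tumour profile has drifted from the normal tissue profile.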
Dynamic component test method for solar hot water systems. Final report VELS 2. Technical report (orig.: Dynamisches Komponententestverfahren fuer solare Warmwasseranlagen. Abschlussbericht VELS 2. Technischer Bericht)
The aim of the project was to generalise the dynamic short-term evaluation method, used to assess compact domestic hot water systems, to larger and more complex plants. To this end, short-term measurements were to be evaluated separately for each main component of the plant under test. The component parameters obtained in this way are used in a simulation program to forecast the annual yield. A method for dynamic collector field measurement, at the test rig or in situ, was developed and tested. For the analysis of the store, a dynamic model was developed and validated that also accounts for the effect of the circulation flow on the stratification. Computer programs for the evaluation by means of the TRNSYS simulation program were produced.

Available from TIB Hannover: F96B143+a. Funded by the Bundesministerium fuer Bildung, Wissenschaft, Forschung und Technologie, Bonn (Germany).
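The component approach described here — fit physical parameters to short dynamic measurements, then hand them to an annual simulation — can be sketched generically. The sketch below uses an assumed quasi-dynamic collector model with synthetic data; the model form, parameter names, and numbers are illustrative, not the project's actual procedure or TRNSYS code:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic short-term measurement series (assumed quasi-dynamic collector
# model: useful power q = eta0*G - a1*(Tm - Ta) - c_eff * dTm/dt)
n = 500
G = rng.uniform(200.0, 1000.0, n)       # solar irradiance [W/m^2]
dT = rng.uniform(0.0, 60.0, n)          # mean fluid minus ambient temp [K]
dTm_dt = rng.uniform(-0.02, 0.02, n)    # fluid temperature drift [K/s]

# "True" parameters used only to generate the synthetic measurements
eta0_true, a1_true, ceff_true = 0.78, 3.5, 7000.0
noise = rng.normal(0.0, 5.0, n)         # measurement noise [W/m^2]
q = eta0_true * G - a1_true * dT - ceff_true * dTm_dt + noise

# The model is linear in its parameters, so ordinary least squares
# identifies them from the dynamic measurement series
X = np.column_stack([G, -dT, -dTm_dt])
eta0, a1, ceff = np.linalg.lstsq(X, q, rcond=None)[0]

# These identified parameters would then feed a long-term yield simulation
print(eta0, a1, ceff)
```

The dynamic term `c_eff * dTm/dt` is what distinguishes short-term dynamic testing from steady-state testing: it lets transient data, where the collector is still warming up or cooling down, contribute to the parameter identification instead of being discarded.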