38 research outputs found

    Idade Média: Tempo do mundo, tempo dos homens, tempo de Deus. [Review]

    Review of: José Antônio DE CAMARGO RODRIGUES DE SOUZA (ed.), Idade Média: Tempo do mundo, tempo dos homens, tempo de Deus, EST Edições, Porto Alegre 2006, 535 pp.

    Sobre el cálculo del tamaño muestral [On the calculation of sample size]


    Experimental validation of gallium production and isotope-dependent positron range correction in PET

    Abstract: Positron range (PR) is one of the important factors that limit the spatial resolution of preclinical positron emission tomography (PET) images. Its blurring effect can be corrected to a large extent if the appropriate method is used during image reconstruction. Nevertheless, this correction requires an accurate modelling of the PR for the particular radionuclide and materials in the sample under study. In this work we investigate PET imaging with the 68Ga and 66Ga radioisotopes, which have a large PR and are being used in many preclinical and clinical PET studies. We produced a 68Ga and 66Ga phantom on a natural zinc target through (p,n) reactions, using the 9-MeV proton beam delivered by the 5-MV CMAM tandetron accelerator. The phantom was imaged in an ARGUS small-animal PET/CT scanner and reconstructed with a fully 3D iterative algorithm, with and without PR corrections. The reconstructed images at different time frames show a significant improvement in spatial resolution when the appropriate PR correction is applied for each frame, taking into account the relative amount of each isotope in the sample. With these results we validate our previously proposed PR correction method for isotopes with large PR. Additionally, we explore the feasibility of PET imaging with 68Ga and 66Ga radioisotopes in proton therapy. We acknowledge support from the Spanish MINECO through projects FPA2010-17142, FPA2013-41267-P, CSD-2007-00042 (CPAN), and the RTC-2015-3772-1 grant. We also acknowledge support from Comunidad de Madrid via the TOPUS S2013/MIT-3024 project.
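
    The frame-dependent part of this correction, namely weighting the positron-range model by the relative activity of each isotope at acquisition time, can be illustrated with a short sketch. This is not the reconstruction code used in the work: the half-lives are standard nuclear-data values (68Ga about 67.7 min, 66Ga about 9.49 h), while the Gaussian kernels and their widths are placeholder assumptions standing in for the actual PR model applied inside the iterative reconstruction.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Approximate half-lives from standard nuclear data: Ga-68 ~67.7 min, Ga-66 ~9.49 h.
    HALF_LIFE_MIN = {"Ga68": 67.7, "Ga66": 9.49 * 60.0}

    def activity_fractions(a0_ga68, a0_ga66, t_min):
        """Relative activity of each isotope t_min minutes after the initial assay."""
        a68 = a0_ga68 * 0.5 ** (t_min / HALF_LIFE_MIN["Ga68"])
        a66 = a0_ga66 * 0.5 ** (t_min / HALF_LIFE_MIN["Ga66"])
        total = a68 + a66
        return a68 / total, a66 / total

    def blended_pr_blur(image, f68, f66, sigma68_mm=1.2, sigma66_mm=2.3, voxel_mm=0.4):
        """Mix two hypothetical Gaussian PR kernels according to the isotope fractions.
        The real correction is applied within the iterative reconstruction, not as a
        post-filter; this only illustrates the frame-dependent weighting."""
        blur68 = gaussian_filter(image, sigma=sigma68_mm / voxel_mm)
        blur66 = gaussian_filter(image, sigma=sigma66_mm / voxel_mm)
        return f68 * blur68 + f66 * blur66

    # Example: equal initial activities, frame acquired 3 h after production.
    f68, f66 = activity_fractions(1.0, 1.0, t_min=180.0)
    phantom = np.zeros((64, 64)); phantom[32, 32] = 1.0   # point source
    blurred = blended_pr_blur(phantom, f68, f66)
    print(f"Ga-68 fraction: {f68:.2f}, Ga-66 fraction: {f66:.2f}")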

    On the Benefits of Transparent Compression for Cost-Effective Cloud Data Storage

    Infrastructure-as-a-Service (IaaS) cloud computing has revolutionized the way we think of acquiring computational resources: it allows users to deploy virtual machines (VMs) at large scale and pay only for the resources that were actually used throughout the runtime of the VMs. This new model raises new challenges in the design and development of IaaS middleware: excessive storage costs associated with both user data and VM images might make the cloud less attractive, especially for users that need to manipulate huge data sets and a large number of VM images. Storage costs result not only from storage space utilization, but also from bandwidth consumption: in typical deployments, a large number of data transfers between the VMs and the persistent storage are performed, all under high performance requirements. This paper evaluates the trade-off that results from transparently applying data compression to conserve storage space and bandwidth at the cost of a slight computational overhead. We aim to reduce storage space and bandwidth needs with minimal impact on data-access performance. Our solution builds on BlobSeer, a distributed data management service specifically designed to sustain a high throughput for concurrent accesses to huge data sequences distributed at large scale. Extensive experiments demonstrate that our approach achieves large reductions (at least 40%) in bandwidth and storage space utilization, while still attaining high performance levels that even surpass the original (no-compression) performance in several data-intensive scenarios.
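
    What "transparent" means at the storage-client boundary can be sketched as follows, assuming a generic key-value blob store rather than the actual BlobSeer API (the class and method names below are illustrative): data is compressed on write and decompressed on read, while callers keep using plain put/get.

    import zlib

    class TransparentCompressionStore:
        """Wrap a key-value blob store so that data is compressed on write and
        decompressed on read; callers keep the same put/get access pattern."""

        def __init__(self, backend, level=6):
            self.backend = backend   # any object exposing put(key, bytes) / get(key)
            self.level = level

        def put(self, key, data: bytes):
            compressed = zlib.compress(data, self.level)
            # Store the compressed form only if it actually saves space.
            if len(compressed) < len(data):
                self.backend.put(key, b"Z" + compressed)
            else:
                self.backend.put(key, b"R" + data)

        def get(self, key) -> bytes:
            blob = self.backend.get(key)
            return zlib.decompress(blob[1:]) if blob[:1] == b"Z" else blob[1:]

    # Toy in-memory backend, for demonstration only.
    class DictStore:
        def __init__(self):
            self.data = {}
        def put(self, key, value):
            self.data[key] = value
        def get(self, key):
            return self.data[key]

    store = TransparentCompressionStore(DictStore())
    store.put("vm-image-chunk", b"\x00" * 65536)      # highly compressible payload
    assert store.get("vm-image-chunk") == b"\x00" * 65536

    The per-chunk tag byte is one simple way to skip compression for incompressible data, which keeps the computational overhead bounded in the worst case.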

    The nuclear energy density functional formalism

    The present document focuses on the theoretical foundations of the nuclear energy density functional (EDF) method. As such, it does not aim at reviewing the status of the field, at covering all possible ramifications of the approach, or at presenting recent achievements and applications. The objective is to provide a modern account of the nuclear EDF formalism that is at variance with traditional presentations that rely, at one point or another, on a Hamiltonian-based picture. The latter is not general enough to encompass what the nuclear EDF method represents as of today. Specifically, the traditional Hamiltonian-based picture does not allow one to grasp the difficulties associated with the fact that currently available parametrizations of the energy kernel E[g',g] at play in the method do not derive from a genuine Hamilton operator, even if the latter were allowed to be an effective one. The method is formulated from the outset through the most general multi-reference, i.e. beyond-mean-field, implementation, such that the single-reference, i.e. "mean-field", version derives as a particular case. As such, a key point of the presentation provided here is to demonstrate that the multi-reference EDF method can indeed be formulated in a mathematically meaningful fashion even if E[g',g] does not derive from a genuine Hamilton operator. In particular, the restoration of symmetries can be entirely formulated without making any reference to a projected state, i.e. within a genuine EDF framework. However, and as is illustrated in the present document, a mathematically meaningful formulation does not guarantee that the formalism is sound from a physical standpoint. The price at which the latter can be enforced in the future is eventually alluded to. Comment: 64 pages, 8 figures, submitted to Euroschool Lecture Notes in Physics Vol. IV, Christoph Scheidenberger and Marek Pfutzner (editors).
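
    For orientation, the multi-reference energy referred to above can be written schematically as a GCM-like mixing of off-diagonal kernels; the notation below is illustrative and does not necessarily match the conventions of the lecture notes.

    % Schematic multi-reference EDF energy (GCM-like mixing of off-diagonal kernels).
    \begin{equation}
      E_{\mathrm{MR}}
        = \frac{\sum_{g',g} f^{*}_{g'}\, f_{g}\, E[g',g]\, N[g',g]}
               {\sum_{g',g} f^{*}_{g'}\, f_{g}\, N[g',g]},
      \qquad
      N[g',g] \equiv \langle \Phi(g') \,|\, \Phi(g) \rangle .
    \end{equation}
    % In a strictly Hamiltonian-based picture one would have
    % E[g',g] = \langle \Phi(g') | H | \Phi(g) \rangle / N[g',g]; the point stressed in
    % the text is that E[g',g] is parametrized directly and need not derive from any H.
    % The single-reference ("mean-field") limit corresponds to keeping only g' = g.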

    The comparative responsiveness of Hospital Universitario Princesa Index and other composite indices for assessing rheumatoid arthritis activity

    Objective: To evaluate the responsiveness, in terms of correlation, of the Hospital Universitario La Princesa Index (HUPI) compared with the traditional composite indices used to assess disease activity in rheumatoid arthritis (RA), and to compare the performance of HUPI-based response criteria with that of the EULAR response criteria. Methods: Secondary data analysis from the following studies: ACT-RAY (clinical trial), PROAR (early RA cohort) and EMECAR (pre-biologic era long-term RA cohort). Responsiveness was evaluated by: 1) comparing change from baseline (Delta) of HUPI with Delta in the other scores by calculating correlation coefficients; 2) calculating standardised effect sizes. The accuracy of response by HUPI and by EULAR criteria was analysed using linear regressions in which the dependent variable was change in global assessment by physician (Delta GDA-Phy). Results: Delta HUPI correlation with change in all other indices ranged from 0.387 to 0.791; HUPI's standardised effect size was larger than those of the other indices in each database used. In ACT-RAY, depending on the visit, between 65% and 80% of patients were equally classified by HUPI and EULAR response criteria. However, HUPI criteria were slightly more stringent, with a higher percentage of patients classified as non-responders, especially at early visits. HUPI response criteria showed slightly higher accuracy than EULAR response criteria when using Delta GDA-Phy as the gold standard. Conclusion: HUPI shows good responsiveness in terms of correlation in each studied scenario (clinical trial, early RA cohort, and established RA cohort). Response criteria by HUPI seem more stringent than EULAR's.
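
    The two responsiveness measures mentioned (correlation between change scores and a standardised effect size, here taken as mean change divided by the SD of change) reduce to a few lines of computation; the arrays below are made-up placeholders, not study data, and definitions of the effect size vary across papers.

    import numpy as np
    from scipy.stats import pearsonr

    # Hypothetical change-from-baseline values for two indices (NOT study data).
    hupi_change = np.array([-1.2, -0.8, -2.1, -0.3, -1.7])
    das28_change = np.array([-1.5, -0.6, -2.4, -0.2, -1.9])

    # 1) Responsiveness as the correlation between change scores of two indices.
    r, p = pearsonr(hupi_change, das28_change)

    # 2) Standardised effect size: mean change divided by the SD of change
    #    (some papers divide by the SD of baseline scores instead).
    ses = hupi_change.mean() / hupi_change.std(ddof=1)

    print(f"correlation of change scores: r = {r:.2f} (p = {p:.3f})")
    print(f"standardised effect size:     {ses:.2f}")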

    An Orchestration as a Service Infrastructure Using Grid Technologies and WS-BPEL

    No full text

    Is Today’s Public Cloud Suited to Deploy Hardcore Realtime Services?

    No full text