
    Modelling of Atmospheric Flow and Dispersion in the Wake of a Cylindrical Obstacle

    This paper presents computational simulations of atmospheric dispersion experiments conducted around isolated obstacles in the field. The computational tool used was the code ADREA-HF, developed specifically for simulating the dispersion of positively or negatively buoyant gases in complicated geometries. The field experiments simulated involve a single cylindrical obstacle normal to the mean wind direction and two upwind sources of ammonia and propane, with the ammonia source located at different lateral positions (Mavroidis et al., 2003). Concentrations and concentration fluctuations for both gases were calculated by the model and compared with the experimental results to evaluate model performance. Specific characteristics of dispersion were investigated using the computational tool. Comparisons of experimental and model results with the case of dispersion around an isolated cubical obstacle are also presented and discussed.
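
    The abstract does not name the statistics used in the model-performance comparison; a minimal sketch of the measures conventionally used in dispersion-model evaluation (fractional bias FB, normalised mean square error NMSE, and the fraction of predictions within a factor of two, FAC2) might look as follows in Python. The function name and interface are illustrative assumptions, not the paper's code.

        import numpy as np

        def evaluation_metrics(c_obs, c_pred):
            """Conventional dispersion-model evaluation statistics comparing
            observed and predicted concentrations (paired, same units)."""
            c_obs = np.asarray(c_obs, dtype=float)
            c_pred = np.asarray(c_pred, dtype=float)
            # Fractional bias: 0 is unbiased, positive means underprediction.
            fb = 2.0 * (c_obs.mean() - c_pred.mean()) / (c_obs.mean() + c_pred.mean())
            # Normalised mean square error: scatter relative to the means.
            nmse = np.mean((c_obs - c_pred) ** 2) / (c_obs.mean() * c_pred.mean())
            # Fraction of predictions within a factor of two of observations.
            ratio = c_pred / c_obs
            fac2 = np.mean((ratio >= 0.5) & (ratio <= 2.0))
            return {"FB": fb, "NMSE": nmse, "FAC2": fac2}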

    FASTCUDA: Open Source FPGA Accelerator & Hardware-Software Codesign Toolset for CUDA Kernels

    Using FPGAs as hardware accelerators that communicate with a central CPU is becoming common practice in the embedded design world, but there is as yet no standard methodology and toolset to facilitate this path. On the other hand, languages such as CUDA and OpenCL provide standard development environments for Graphics Processing Unit (GPU) programming. FASTCUDA is a platform that provides the necessary software toolset, hardware architecture, and design methodology to efficiently adapt the CUDA approach into a new FPGA design flow. With FASTCUDA, the kernels of a CUDA-based application are partitioned into two groups with minimal user intervention: those that are compiled and executed in parallel software, and those that are synthesized and implemented in hardware. A modern low-power FPGA can provide the processing power (via numerous embedded micro-CPUs) and the logic capacity for both the software and hardware implementations of the CUDA kernels. This paper describes the system requirements and the architectural decisions behind the FASTCUDA approach.
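
    The abstract does not describe how the kernel partitioning decision is made. As a hedged illustration of the kind of trade-off involved, the Python toy below ranks kernels by estimated speedup per unit of FPGA logic area under an area budget; every name, field and heuristic here is a hypothetical stand-in, not FASTCUDA's actual method.

        from dataclasses import dataclass

        @dataclass
        class Kernel:
            name: str
            sw_time_ms: float   # measured runtime as parallel software
            hw_time_ms: float   # estimated runtime when synthesized to logic
            hw_area_luts: int   # estimated logic area of the hardware version

        def partition(kernels, area_budget_luts):
            """Greedy sketch: put the kernels with the best time saving per
            LUT into hardware until the area budget is exhausted; everything
            else runs as software on the embedded micro-CPUs."""
            ranked = sorted(kernels,
                            key=lambda k: (k.sw_time_ms - k.hw_time_ms) / k.hw_area_luts,
                            reverse=True)
            hardware, software, used = [], [], 0
            for k in ranked:
                if k.sw_time_ms > k.hw_time_ms and used + k.hw_area_luts <= area_budget_luts:
                    hardware.append(k.name)
                    used += k.hw_area_luts
                else:
                    software.append(k.name)
            return hardware, software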

    Queue Management in Network Processors

    One of the main bottlenecks when designing a network processing system is very often its memory subsystem. This is mainly due to state-of-the-art network links operating at very high speeds and to the fact that a large number of independent queues is desirable in order to support advanced Quality of Service (QoS). In this paper we analyze the performance bottlenecks of various data memory managers integrated in typical Network Processing Units (NPUs). We expose the performance limitations of software implementations utilizing the RISC processing cores typically found in most NPU architectures, and we identify the requirements for hardware-assisted memory management in order to achieve wire-speed operation at gigabit-per-second rates. Furthermore, we describe the architecture and performance of a hardware memory manager that fulfills those requirements. Although implemented in reconfigurable technology, this memory manager can provide up to 6.2 Gbps of aggregate throughput while handling 32K independent queues.
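
    As a rough software model of the data structure such a queue manager maintains, the Python sketch below keeps per-queue head/tail pointers and a shared free-buffer list as linked lists over one buffer pool; hardware implementations typically hold the same state in SRAM tables. This is an illustration of the general structure under stated assumptions, not the architecture from the paper.

        class QueueManager:
            """Toy linked-list queue manager over a shared buffer pool."""

            def __init__(self, num_buffers, num_queues):
                # next[i] links buffer i to its successor; -1 terminates a list.
                self.next = list(range(1, num_buffers)) + [-1]
                self.free_head = 0                      # head of free-buffer list
                self.head = [-1] * num_queues           # per-queue head pointer
                self.tail = [-1] * num_queues           # per-queue tail pointer
                self.payload = [None] * num_buffers

            def enqueue(self, q, data):
                buf = self.free_head
                if buf == -1:
                    raise MemoryError("buffer pool exhausted")
                self.free_head = self.next[buf]         # pop a free buffer
                self.next[buf] = -1
                self.payload[buf] = data
                if self.tail[q] == -1:                  # queue was empty
                    self.head[q] = buf
                else:
                    self.next[self.tail[q]] = buf       # link at the tail
                self.tail[q] = buf

            def dequeue(self, q):
                buf = self.head[q]
                if buf == -1:
                    return None                         # queue empty
                self.head[q] = self.next[buf]
                if self.head[q] == -1:
                    self.tail[q] = -1
                data, self.payload[buf] = self.payload[buf], None
                self.next[buf] = self.free_head         # return buffer to free list
                self.free_head = buf
                return data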

    Return of the Tbx5: lineage tracing reveals ventricular cardiomyocyte-like precursors in the injured adult mammalian heart

    The single curative measure for heart failure patients is heart transplantation, which is limited by the shortage of donors, the need for immunosuppression and economic costs. Therefore, there is an urgent unmet need to identify cell populations capable of cardiac regeneration that can be traced and monitored. Injury to the adult mammalian cardiac muscle often leads to a heart attack through the irreversible loss of a large number of cardiomyocytes, owing to the heart's limited regenerative capability. Recent reports in zebrafish indicate that Tbx5a is a vital transcription factor for cardiomyocyte regeneration. Preclinical data underscore the cardioprotective role of Tbx5 upon heart failure. Data from our earlier murine developmental studies have identified a prominent unipotent Tbx5-expressing embryonic cardiac precursor cell population able to form cardiomyocytes in vivo, in vitro and ex vivo. Applying a developmental approach to an adult heart injury model, and employing a lineage-tracing mouse model together with single-cell RNA-seq technology, we identify a Tbx5-expressing ventricular cardiomyocyte-like precursor population in the injured adult mammalian heart. The transcriptional profile of that precursor cell population is closer to that of neonatal than embryonic cardiomyocyte precursors. Tbx5, a cardinal cardiac development transcription factor, lies at the center of a ventricular adult precursor cell population, which seems to be affected by neurohormonal spatiotemporal cues. The identification of a Tbx5-specific cardiomyocyte precursor-like cell population, capable of dedifferentiating and potentially deploying a cardiomyocyte regenerative program, provides a clear target cell population for translationally relevant heart interventional studies.

    Mathematical analysis of approximate biological effective dose (BED) calculation for multi-phase radiotherapy treatment plans

    Purpose: There is growing interest in the biological effective dose (BED) and its application in treatment plan evaluation, due to its stronger correlation with treatment outcome. An approximate biological effective dose (BEDA) equation was introduced in order to simplify BED calculations by treatment planning systems in multi-phase treatments. The purpose of this work is to reveal its mathematical properties relative to the true, multi-phase BED (BEDT) equation. Methods: The BEDT equation was derived and used to reveal the mathematical properties of BEDA. MATLAB (MathWorks, Natick, MA) was used to simulate and analyze common and extreme clinical multi-phase cases. In those cases, percent error and Bland-Altman analysis were used to study the significance of the inaccuracies of BEDA for different combinations of total doses, numbers of fractions, doses per fraction and α/β values. All the calculations were performed on a voxel basis in order to study how dose distributions would affect the accuracy of BEDA. Results: When the voxel doses per fraction (DPF) delivered by both phases are equal, BEDA and BEDT are equal (0% error). In heterogeneous dose distributions, which vary significantly between the phases, there are fewer occurrences of equal DPFs and hence the imprecision of BEDA is greater. It was shown that as the α/β ratio increased, the accuracy of BEDA improved. Examining twenty-four cases, it was shown that the range of DPF ratios for a percent error (Perror) of 3% varied from 0.32 to 7.50 Gy, whereas for a Perror of 1% the range varied from 0.50 to 2.96 Gy. Conclusion: The DPF between the different phases should be equal in order to render BEDA accurate. OARs typically receive heterogeneous dose distributions, hence the probability of equal DPFs is low. Consequently, the BEDA equation should only be used for targets or OARs that receive uniform or very similar dose distributions from the different treatment phases.
    Cite this article as: Kauweloa KI, Gutierrez AN, Bergamo A, Stathakis S, Papanikolaou N, Mavroidis P. Mathematical analysis of approximate biological effective dose (BED) calculation for multi-phase radiotherapy treatment plans. Int J Cancer Ther Oncol 2014; 2(2):020226. DOI: 10.14319/ijcto.0202.2
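
    The abstract does not reproduce the equations themselves. For orientation, the standard linear-quadratic BED for a single phase of n fractions of dose d per fraction, and the exact two-phase sum that the true BEDT refers to, read as follows (the precise form of the approximate BEDA is not given in the abstract and is not guessed at here):

        \mathrm{BED} = n\,d\left(1 + \frac{d}{\alpha/\beta}\right),
        \qquad
        \mathrm{BED}_T = n_1 d_1\left(1 + \frac{d_1}{\alpha/\beta}\right)
                       + n_2 d_2\left(1 + \frac{d_2}{\alpha/\beta}\right)

    When d_1 = d_2 the two phases combine into a single phase, consistent with the reported 0% error for equal voxel doses per fraction.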

    Evaluation of the generalized gamma as a tool for treatment planning optimization

    Purpose: The aim of this work is to study the theoretical behavior and merits of the Generalized Gamma (generalized dose-response gradient), as well as to investigate the usefulness of this concept in practical radiobiological treatment planning. Methods: In this study, the treatment planning system RayStation 1.9 (RaySearch Laboratories AB, Stockholm, Sweden) was used. Furthermore, radiobiological models that provide the tumor control probability (TCP), normal tissue complication probability (NTCP), complication-free tumor control probability (P+) and the Generalized Gamma were employed. The Generalized Gamma of TCP and NTCP was calculated for given heterogeneous dose distributions to different organs in order to verify the TCP and NTCP computations of the treatment planning system. In this process, a treatment plan was created in which the target and the organs at risk were included in the same ROI, in order to check the validity of the system regarding the objective function P+ and the Generalized Gamma. Subsequently, six additional treatment plans were created with the target organ and the organs at risk placed in the same or different ROIs. In these plans, the mean dose was increased in order to investigate the effect of dose change on tissue response and on the Generalized Gamma before and after the change in dose. By calculating these quantities theoretically, the agreement of different theoretical expressions with the values that the treatment planning system provides could be evaluated. Finally, the relative error between the real and approximate response values using the Poisson and the Probit models, for the case of a target organ consisting of two compartments in a parallel architecture and with the same number of clonogens, could be investigated and quantified. Results: The computations of RayStation regarding the values of the Generalized Gamma and the objective function (P+) were verified using independent software. Furthermore, it was shown that after a small change in dose, the organ affected most is the organ with the highest Generalized Gamma. Apart from that, the validity of the theoretical expressions that describe the change in response and the associated Generalized Gamma was verified, but only for small changes in dose. Especially for the case of 50% TCP and NTCP, the theoretical values (ΔPapprox.) and those calculated by RayStation show close agreement, which underlines the importance of the D50 parameter in specifying clinical response levels. Finally, the presented findings show that the behavior of ΔPapprox. is sensible because, for both of the models used (Poisson and Probit), it closely approaches the real ΔP around the region of 37% and 50% response. The present study evaluated the mathematical expression of the Generalized Gamma for the case of non-uniform dose delivery and the accuracy of RayStation in calculating its values for different organs. Conclusion: A very important finding of this work is the establishment of the usefulness and clinical relevance of the Generalized Gamma, because it gives the planner the opportunity to determine precisely which organ will be affected most after a small increase in dose; as a result, an optimal treatment plan regarding tumor control and normal tissue complications can be found.
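
    The abstract does not reproduce the underlying models. Assuming the Poisson dose-response model in its common parameterisation (50% response dose D50, normalized slope γ), a small Python sketch of how a response gradient can be estimated numerically as the change in response per relative change in dose, ΔP ≈ γ·ΔD/D, could read as follows; the function names and the numeric example are assumptions for illustration, not the paper's implementation.

        import numpy as np

        def poisson_response(dose, d50, gamma):
            """Poisson dose-response model: probability of effect at uniform
            dose, positioned by d50 and with normalized slope gamma."""
            return np.exp(-np.exp(np.e * gamma
                                  - (dose / d50) * (np.e * gamma - np.log(np.log(2.0)))))

        def normalized_gradient(response, dose, rel_step=1e-3):
            """Numerical normalized gradient dP/d(ln D): scale the dose by a
            small relative step and take a central difference."""
            return (response(dose * (1 + rel_step))
                    - response(dose * (1 - rel_step))) / (2 * rel_step)

        # Example (hypothetical numbers): response and gradient at 70 Gy
        # for a structure with D50 = 65 Gy and gamma = 2.0.
        tcp = poisson_response(70.0, 65.0, 2.0)
        g = normalized_gradient(lambda d: poisson_response(d, 65.0, 2.0), 70.0)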

    Information is Power? Transparency and fetishism in International Relations

    International actors, state and non-state, have embraced transparency as a solution to all manner of political problems. Theoretical analyses of these processes present transparency in a fetishistic manner, in which the social relations that generate transparency are misrecognized as the product of information itself. This paper outlines the theoretical problems that arise when transparency promotion is fetishized in International Relations theory. Examining the fetishism of transparency, we note the problematic conceptions of politics, the public sphere, and rationality that such accounts articulate. By confusing the relationship between data, information and knowledge, fetishized treatments of transparency muddy the historical dynamics responsible for the emergence of transparency as a political practice. This alters our understanding of the relationship between global governance institutions, their constituents, and the nature of knowledge production itself. Realizing the normative promise of transparency requires a reorientation of theoretical practice towards sociologically and historically sensitive approaches to the politics of knowledge.

    Low Q² Jet Production at HERA and Virtual Photon Structure

    The transition between photoproduction and deep-inelastic scattering is investigated in jet production at the HERA ep collider, using data collected by the H1 experiment. Measurements of the differential inclusive jet cross-sections dσ_ep/dE_T* and dσ_ep/dη*, where E_T* and η* are the transverse energy and the pseudorapidity of the jets in the virtual photon-proton centre-of-mass frame, are presented for 0 < Q² < 49 GeV² and 0.3 < y < 0.6. The interpretation of the results in terms of the structure of the virtual photon is discussed. The data are best described by QCD calculations which include a partonic structure of the virtual photon that evolves with Q². Comment: 20 pages, 5 figures
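
    For readers outside the field, the kinematic variables quoted above have their standard deep-inelastic-scattering definitions, which the abstract does not spell out:

        Q^2 \equiv -q^2 = -(k - k')^2,
        \qquad
        y \equiv \frac{P \cdot q}{P \cdot k},
        \qquad
        Q^2 \simeq x\,y\,s \;\; \text{(neglecting masses)}

    where k and k' are the incoming and scattered lepton four-momenta, q = k − k' the exchanged photon momentum, P the proton four-momentum and s the ep centre-of-mass energy squared; photoproduction corresponds to the limit Q² → 0.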