
    The hadronic interaction model SIBYLL 2.3c and Feynman scaling

    The Monte Carlo model Sibyll has been designed for efficient simulation of hadronic multiparticle production up to the highest energies, as needed for interpreting cosmic ray measurements. For more than 15 years, version 2.1 of Sibyll has been one of the standard models for air shower simulation. Motivated by data from LHC and fixed-target experiments and by a better understanding of the phenomenology of hadronic interactions, we have developed an improved version of this model, version 2.3, which was released in 2016. In this contribution we present a revised version of this model, called Sibyll 2.3c, that is further improved by adjusting particle production spectra to match the expectation of Feynman scaling in the fragmentation region. After a brief introduction to the changes implemented in Sibyll 2.3 and 2.3c with respect to Sibyll 2.1, the current predictions of the model for the depth of shower maximum, the number of muons at ground, and the energy spectrum of muons in extensive air showers are presented.
    Comment: 35th International Cosmic Ray Conference
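    Feynman scaling, invoked above for the fragmentation region, states that the normalised longitudinal spectrum dN/dx_F becomes independent of collision energy at large x_F. As a hedged illustration (the function names and toy histograms below are assumptions for exposition, not Sibyll internals), the scaling check can be expressed as:

    ```python
    def feynman_x(p_z, sqrt_s):
        """Feynman x: longitudinal momentum over its kinematic maximum (~ sqrt(s)/2)."""
        return 2.0 * p_z / sqrt_s

    def scaling_violation(spec_lo, spec_hi):
        """Maximum difference between two normalised x_F spectra on the same binning.
        Zero means the two energies obey Feynman scaling exactly."""
        n_lo, n_hi = float(sum(spec_lo)), float(sum(spec_hi))
        return max(abs(a / n_lo - b / n_hi) for a, b in zip(spec_lo, spec_hi))
    ```

    Two spectra that differ only by overall normalisation give a violation of zero; any energy dependence of the spectral shape shows up as a positive value.
    
    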

    Value of river discharge data for global-scale hydrological modeling

    This paper investigates the value of observed river discharge data for global-scale hydrological modeling of a number of flow characteristics that are required for assessing water resources, flood risk and habitat alteration of aquatic ecosystems. An improved version of WGHM (WaterGAP Global Hydrology Model) was tuned such that simulated and observed long-term average river discharges at each station become equal, using either the 724-station dataset (V1) against which former model versions were tuned or a new dataset (V2) of 1235 stations with often longer time series. WGHM is tuned by adjusting one model parameter (γ) that affects runoff generation from land areas and, where necessary, by applying one or two correction factors, which correct the total runoff in a sub-basin (areal correction factor) or the discharge at the station (station correction factor). The study results are as follows. (1) Comparing V2 to V1, the global land area covered by tuning basins increases by 5%, while the area where the model can be tuned by only adjusting γ increases by 8% (546 vs. 384 stations). However, the area where a station correction factor (and not only an areal correction factor) has to be applied more than doubles (389 vs. 93 basins), which is a strong drawback, as use of a station correction factor makes discharge discontinuous at the gauge and inconsistent with runoff in the basin. (2) The value of additional discharge information for representing the spatial distribution of long-term average discharge (and thus renewable water resources) with WGHM is high, particularly for river basins outside of the V1 tuning area and for basins where the average sub-basin area has decreased by at least 50% in V2 as compared to V1. For these basins, simulated long-term average discharge would differ from the observed one by a factor of, on average, 1.8 and 1.3, respectively, if the additional discharge information were not used for tuning.
    The value tends to be higher in semi-arid and snow-dominated regions, where hydrological models are less reliable than in humid areas. The deviation of the other simulated flow characteristics (e.g. low flow, inter-annual variability and seasonality) from the observed values also decreases significantly, but this is mainly due to the better representation of average discharge rather than of variability. (3) The optimal sub-basin size for tuning depends on the modeling purpose. On the one hand, small basins between 9000 and 20 000 km² show a much stronger improvement in model performance due to tuning than larger basins, which is related to their lower model performance both with and without tuning; basins over 60 000 km² perform best. On the other hand, tuning of small basins decreases model consistency, as almost half of them require a station correction factor.
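    The tuning logic described above (adjust γ until simulated long-term average discharge matches the observed value, falling back to a correction factor where γ alone cannot close the gap) can be sketched as follows. This is a minimal illustration: `simulate_discharge` is a toy stand-in, not the real WGHM, and the γ bounds are assumed values.

    ```python
    def simulate_discharge(gamma: float) -> float:
        """Toy stand-in for WGHM: discharge decreases monotonically as gamma grows."""
        return 100.0 / (1.0 + gamma)

    def tune_basin(observed, gamma_lo=0.1, gamma_hi=5.0, tol=1e-6):
        """Bisect gamma so simulated mean discharge equals `observed`.
        Returns (gamma, station_correction_factor)."""
        q_lo, q_hi = simulate_discharge(gamma_lo), simulate_discharge(gamma_hi)
        # If the gamma bounds cannot bracket the observation, pick the closer
        # bound and make up the remainder with a station correction factor.
        if not (min(q_lo, q_hi) <= observed <= max(q_lo, q_hi)):
            gamma = gamma_lo if abs(q_lo - observed) < abs(q_hi - observed) else gamma_hi
            return gamma, observed / simulate_discharge(gamma)
        lo, hi = gamma_lo, gamma_hi
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if simulate_discharge(mid) > observed:  # too much discharge: raise gamma
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi), 1.0
    ```

    A correction factor of 1.0 corresponds to a basin tunable by γ alone; any other value marks a basin that would count toward the paper's station-correction statistic.
    
    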

    Dynamic Load Balancing Strategy for Parallel Tumor Growth Simulations

    In this paper, we propose a parallel cellular automaton tumor growth model that includes load balancing of the cell distribution among computational threads, with the introduction of adjusting parameters. The obtained results show a fair reduction in execution time and improved speedup compared with the sequential tumor growth simulation program currently referenced in tumoral biology. The dynamic data structures of the model can be extended to address additional tumor growth characteristics such as angiogenesis and nutrient intake dependencies.
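    The core of such a load-balancing step is re-partitioning the occupied cells so each thread carries roughly equal work. A minimal sketch, assuming a contiguous row-strip decomposition (the paper's actual data structures and parameters are not reproduced here):

    ```python
    def rebalance(cells_per_row, n_threads):
        """Split rows into contiguous strips with approximately equal cell counts.
        Returns a list of (start_row, end_row) half-open intervals, one per thread."""
        total = sum(cells_per_row)
        target = total / n_threads
        bounds, acc, start = [], 0, 0
        for i, count in enumerate(cells_per_row):
            acc += count
            # Close the current strip once the running total reaches its fair
            # share of the load (leaving at least one strip for each thread left).
            if acc >= target * (len(bounds) + 1) and len(bounds) < n_threads - 1:
                bounds.append((start, i + 1))
                start = i + 1
        bounds.append((start, len(cells_per_row)))
        return bounds
    ```

    Calling this periodically as the tumor grows keeps thread workloads even as the occupied region expands unevenly.
    
    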

    Improving elevation resolution in phased-array inspections for NDT

    The Phased Array Ultrasonic Technique (PAUT) offers great advantages over the conventional ultrasound technique (UT), particularly because of its beam focusing, beam steering and electronic scanning capabilities. However, the 2D images obtained usually have low resolution in the direction perpendicular to the array elements, which limits the inspection quality of large components by mechanical scanning. This paper describes a novel approach to improve image quality in these situations by combining three ultrasonic techniques: Phased Array with dynamic depth focusing in reception, the Synthetic Aperture Focusing Technique (SAFT) and Phase Coherence Imaging (PCI). To allow application with conventional NDT arrays (1D and non-focused in elevation), a special mask producing a wide beam in the movement direction was designed and analysed by simulation and experimentally. The imaging algorithm is then presented and validated by the inspection of test samples. The quality of the obtained images is comparable to that of an equivalent matrix array, but using conventional NDT arrays and equipment, and the method runs in real time.
    Fil: Brizuela, Jose David. Consejo Nacional de Investigaciones Científicas y Técnicas; Argentina
    Fil: Camacho, J. Consejo Superior de Investigaciones Científicas; España
    Fil: Cosarinsky, Guillermo Gerardo. Comisión Nacional de Energía Atómica; Argentina
    Fil: Iriarte, Juan Manuel. Comisión Nacional de Energía Atómica; Argentina
    Fil: Cruza, Jorge F. Consejo Superior de Investigaciones Científicas; España
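    The SAFT-plus-PCI combination above amounts to coherently summing pre-delayed A-scans for each pixel and weighting the result by a phase-coherence factor, so that pixels where the element contributions disagree in phase are suppressed. A hedged per-pixel sketch (the coherence-factor definition and the assumption that delays have already been applied are simplifications of the published algorithm):

    ```python
    def saft_pci_pixel(samples):
        """One image pixel from pre-delayed signal values, one per array element.

        Coherent (SAFT-style) sum weighted by a coherence factor
        CF = |sum a_i|^2 / (N * sum |a_i|^2), which is 1 for perfectly
        in-phase contributions and 0 for fully incoherent ones.
        """
        n = len(samples)
        coherent = sum(samples)
        incoherent = sum(abs(s) ** 2 for s in samples)
        cf = 0.0 if incoherent == 0 else abs(coherent) ** 2 / (n * incoherent)
        return cf * abs(coherent)
    ```

    In-phase contributions pass through at full amplitude, while grating-lobe or noise contributions with scattered phases are strongly attenuated, which is the mechanism behind the resolution improvement.
    
    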

    Discrete-event simulation of process control in low volume high value industries

    This paper presents a new method of process control for set-up dominant processes. This new method, known as the Set-up Process Algorithm (SUPA), was compared with existing industrial practices and statistical techniques in the literature. To test the method's robustness, a generic discrete-event simulation model was built and used to test four different statistical approaches to process control. It was concluded that SUPA offers a method of process control for set-up dominant processes that is easier to apply than classically derived SPC approaches, using simple rules and a traffic light system based on the design specification. Simulation analysis shows that SUPA is more sensitive at detecting an incapable process, as it monitors more units when a process is less capable, and is more sensitive than PRE-Control at detecting mean shifts in a process. SUPA is also a nonparametric methodology and therefore robust against processes with non-Gaussian distributions.
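    A traffic-light rule driven by specification limits, of the kind SUPA uses, can be sketched as below. The zone boundaries here are illustrative assumptions, not the published algorithm: units well inside specification are green, units inside but near a limit are amber, and units outside are red.

    ```python
    def traffic_light(x, lsl, usl, green_fraction=0.5):
        """Classify a measured unit against the design specification [lsl, usl].

        green_fraction sets how much of the spec half-width counts as the
        central (green) zone; the remainder inside spec is amber.
        """
        if not lsl <= x <= usl:
            return "red"
        half_width = (usl - lsl) / 2.0
        centre = (lsl + usl) / 2.0
        if abs(x - centre) <= green_fraction * half_width:
            return "green"
        return "amber"
    ```

    Because the rule only compares each unit to fixed specification limits, it needs no distributional assumptions, matching the nonparametric character claimed for SUPA.
    
    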

    A Bayesian spatial random effects model characterisation of tumour heterogeneity implemented using Markov chain Monte Carlo (MCMC) simulation

    The focus of this study is the development of a statistical modelling procedure for characterising intra-tumour heterogeneity, motivated by recent clinical literature indicating that a variety of tumours exhibit a considerable degree of genetic spatial variability. A formal spatial statistical model has been developed and used to characterise the structural heterogeneity of a number of supratentorial primitive neuroectodermal tumours (PNETs), based on diffusion-weighted magnetic resonance imaging. Particular attention is paid to the spatial dependence of diffusion close to the tumour boundary, in order to determine whether the data provide statistical evidence to support the proposition that water diffusivity in the boundary region of some tumours exhibits a deterministic dependence on distance from the boundary, in excess of an underlying random 2D spatial heterogeneity in diffusion. Tumour spatial heterogeneity measures were derived from the diffusion parameter estimates obtained using a Bayesian spatial random effects model. The analyses were implemented using Markov chain Monte Carlo (MCMC) simulation, and posterior predictive simulation was used to assess the adequacy of the statistical model. The main observations are that the previously reported relationship between diffusion and boundary proximity remains observable and achieves statistical significance after adjusting for an underlying random 2D spatial heterogeneity in the diffusion model parameters. A comparison of the magnitude of the boundary-distance effect with the underlying random 2D boundary heterogeneity suggests that both are important sources of variation in the vicinity of the boundary. No consistent pattern emerges from a comparison of the boundary and core spatial heterogeneity, with no indication of a consistently greater level of heterogeneity in one region compared with the other.
    The results raise the possibility that DWI might provide a surrogate marker of intra-tumour genetic regional heterogeneity, which would provide a powerful tool with applications in both patient management and in cancer research.
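    The MCMC machinery underpinning such an analysis can be illustrated with a minimal random-walk Metropolis sampler. This is a sketch under strong simplifying assumptions: the target here is a toy one-dimensional Gaussian log-posterior, not the spatial random-effects model of the study.

    ```python
    import math
    import random

    def metropolis(log_post, x0, n_samples, step=0.5, seed=0):
        """Random-walk Metropolis: propose x' = x + N(0, step) and accept with
        probability min(1, exp(log_post(x') - log_post(x)))."""
        rng = random.Random(seed)
        x, chain = x0, []
        for _ in range(n_samples):
            proposal = x + rng.gauss(0.0, step)
            if math.log(rng.random()) < log_post(proposal) - log_post(x):
                x = proposal
            chain.append(x)
        return chain
    ```

    Run on a standard-normal target, the chain's post-burn-in mean and standard deviation recover 0 and 1; in the real application the state would be the full vector of spatial random effects, updated component-wise or in blocks.
    
    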

    Lattice Boltzmann simulation of droplet behaviour in microfluidic devices

    We developed a lattice Boltzmann model to investigate droplet dynamics in microfluidic devices. In our model, a stress-free boundary condition was proposed to conserve the total mass of the flow system and improve the numerical stability for flows with low Reynolds number. The model was extensively validated against benchmark cases including the Laplace law, static contact angles at a solid surface, and droplet deformation and breakup under simple shear flow. We applied our model to study the effects of the Peclet number, the Capillary number and wettability on droplet formation. The results showed that the Peclet number has little effect on droplet size, though it slightly affects the time of droplet formation. In the creeping flow regime, the Capillary number plays a dominating role in the droplet generation process. Wettability of fluids affects the position of droplet detachment and the droplet shape and size, and its impact becomes more significant as the Capillary number decreases. We also found that a hydrophobic surface generally produces smaller droplets.
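    The Laplace-law benchmark mentioned above is typically a post-processing check: for a 2D droplet the pressure jump should satisfy Δp = σ/R, so plotting Δp against 1/R for several droplet radii gives a line through the origin whose slope is the surface tension. A hedged sketch of that fit (the sample numbers in the test are made-up illustrative measurements, not simulation output):

    ```python
    def fit_surface_tension(radii, pressure_jumps):
        """Least-squares slope of dp versus 1/R through the origin.

        For a 2D droplet obeying the Laplace law dp = sigma / R, the fitted
        slope is an estimate of the surface tension sigma.
        """
        numerator = sum(dp / r for r, dp in zip(radii, pressure_jumps))
        denominator = sum(1.0 / (r * r) for r in radii)
        return numerator / denominator
    ```

    Agreement between this fitted σ and the value implied by the model's interaction parameters, across droplet sizes, is what validates the two-phase formulation.
    
    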