103 research outputs found

    Exploring the Gamma Ray Horizon with the next generation of Gamma Ray Telescopes. Part 3: Optimizing the observation schedule of gamma-ray sources for the extraction of cosmological parameters

    Full text link
    We discuss the optimization of the observation schedule of gamma-ray emitters by the new generation of Cherenkov telescopes, with the aim of extracting cosmological parameters from measurements of the Gamma Ray Horizon at different redshifts. It is shown that improvements of over 30% in the expected cosmological parameter uncertainties can be achieved if dedicated observation schedules are applied instead of allocating equal observation time to each source. Comment: 13 pages, 3 figures
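
    The paper's actual scheduling procedure is not reproduced here; as a minimal sketch of the idea, the following Python snippet (source fluxes, weights and the toy figure of merit are all invented for illustration) allocates a fixed total observation time across sources so as to minimise a combined parameter-uncertainty proxy, rather than splitting the time equally.

        # Toy illustration (hypothetical numbers): allocate a fixed total observation
        # time across gamma-ray sources to minimise a proxy for the combined
        # cosmological-parameter uncertainty, instead of equal time per source.
        import numpy as np
        from scipy.optimize import minimize

        flux = np.array([5.0, 2.0, 1.0, 0.5])      # relative source brightnesses (made up)
        weight = np.array([1.0, 1.5, 2.0, 3.0])    # how strongly each redshift constrains the fit (made up)
        T_total = 100.0                            # total available observation time (hours)

        def combined_variance(t):
            # Photon statistics: per-source variance shrinks as 1/(flux * time).
            return np.sum(weight / (flux * np.maximum(t, 1e-9)))

        x0 = np.full(flux.size, T_total / flux.size)            # equal-time starting point
        res = minimize(combined_variance, x0,
                       bounds=[(1e-3, T_total)] * flux.size,
                       constraints={"type": "eq", "fun": lambda t: t.sum() - T_total})

        print("equal-time variance:", combined_variance(x0))
        print("optimised variance: ", combined_variance(res.x))
        print("optimised schedule: ", res.x)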

    Heterogeneous Batch Distillation Processes: Real System Optimisation

    Get PDF
    In this paper, the optimisation of batch distillation processes is considered. It deals with real systems, with rigorous simulation of the processes through the resolution of the full MESH differential algebraic equations. A dedicated software architecture is developed, based on the BatchColumn® simulator and on both SQP and GA numerical algorithms; it is able to optimise sequential batch columns provided the column transitions are set. The efficiency of the proposed optimisation tool is illustrated by two case studies. The first concerns heterogeneous batch solvent recovery in a single distillation column and shows that significant economic gains are obtained along with improved process conditions. The second concerns the optimisation of two sequential homogeneous batch distillation columns and demonstrates the capacity to optimise several different sequential dynamic processes. For such complex multiobjective problems, GA is preferred to SQP, although SQP is able to refine specific GA solutions.
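
    The BatchColumn® simulator and the paper's actual problem formulation are not available here; the sketch below only illustrates, on a made-up two-variable objective, the hybrid strategy the abstract describes: a global evolutionary search followed by a local SQP refinement of its best solution (SciPy's differential evolution stands in for the GA).

        # Hypothetical stand-in objective: in the paper this would be the rigorous
        # MESH-based column simulation; here a simple analytic function is used.
        import numpy as np
        from scipy.optimize import differential_evolution, minimize

        def cost(x):
            reflux, boilup = x
            # Made-up trade-off between energy use and product recovery.
            return (reflux - 3.0) ** 2 + 0.5 * (boilup - 2.0) ** 2 + 0.1 * np.sin(5 * reflux)

        bounds = [(1.0, 10.0), (0.5, 5.0)]

        # Global, derivative-free exploration (evolutionary step).
        ga_like = differential_evolution(cost, bounds, seed=0, maxiter=50)

        # Local gradient-based refinement of the evolutionary solution (SQP step).
        refined = minimize(cost, ga_like.x, method="SLSQP", bounds=bounds)

        print("evolutionary optimum:", ga_like.x, ga_like.fun)
        print("SQP-refined optimum: ", refined.x, refined.fun)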

    La Habana del Este

    Get PDF
    The Atlas of Territorial Heritage in the Municipality of East Havana is a tool for building a thorough knowledge of a large Cuban municipality belonging to Havana. At the same time, the atlas provides an analytical basis for future urban planning interventions and transformations, focused on the idea of territorial heritage as an essential resource for self-sustaining development. The main objective is to describe the tangible and intangible components of this heritage and to stimulate and strengthen the community's awareness of the richness of the territory and its potential, so that communities can adapt to environmental, climatic, demographic and economic change. The Atlas was developed in several phases by an Italian-Cuban team of experts and has recently been updated. It contains the results of a detailed analysis and cataloguing effort, comprising a large amount of data on the various areas of research; the data were organised by typology and precisely located on maps. The multifaceted richness of Cuban culture is confirmed in this volume, which puts into practice cognitive tools for safeguarding and valorisation and provides a strong basis for meeting contemporary challenges.

    A Constrained Sequential-Lamination Algorithm for the Simulation of Sub-Grid Microstructure in Martensitic Materials

    Full text link
    We present a practical algorithm for partially relaxing multiwell energy densities such as pertain to materials undergoing martensitic phase transitions. The algorithm is based on sequential lamination, but the evolution of the microstructure during a deformation process is required to satisfy a continuity constraint, in the sense that the new microstructure should be reachable from the preceding one by a combination of branching and pruning operations. All microstructures generated by the algorithm are in static and configurational equilibrium. Owing to the continuity constraint imposed upon the microstructural evolution, the predicted material behavior may be path-dependent and exhibit hysteresis. In cases in which there is a strict separation of micro- and macrostructural length scales, the proposed relaxation algorithm may effectively be integrated into macroscopic finite-element calculations at the sub-grid level. We demonstrate this aspect of the algorithm by means of a numerical example concerned with the indentation of a Cu-Al-Ni shape memory alloy by a spherical indenter. Comment: 27 pages with 9 figures. To appear in Computer Methods in Applied Mechanics and Engineering. New version incorporates minor revisions from review.
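
    The following is only a very schematic toy model, not the paper's constrained relaxation algorithm: it shows how a continuity constraint of the kind described above can be encoded by allowing each update of a (flat, equal-fraction) laminate only through a single branching or pruning move relative to the previous microstructure, with the lowest-energy admissible candidate selected at every macroscopic strain increment. The wells, penalties and loading path are all invented.

        # Schematic illustration only (assumed toy model): the microstructure is a
        # tuple of variant indices with equal volume fractions; at each strain step
        # the only admissible updates are "keep", one "branch" (append a variant)
        # or one "prune" (remove a lamina) -- the continuity constraint.
        import numpy as np

        WELLS = np.array([-0.05, 0.0, 0.05])    # transformation strains of three toy variants
        INTERFACE_COST = 1e-4                   # penalty per lamina, favours pruning

        def energy(variants, macro_strain):
            avg = WELLS[list(variants)].mean()                      # average transformation strain
            return (avg - macro_strain) ** 2 + INTERFACE_COST * len(variants)

        def neighbours(variants):
            """Microstructures reachable by at most one branching or pruning operation."""
            out = [variants]                                                    # keep as is
            out += [variants + (v,) for v in range(len(WELLS))]                 # branch
            out += [variants[:i] + variants[i + 1:] for i in range(len(variants))
                    if len(variants) > 1]                                       # prune
            return out

        micro = (1,)                                    # start in the middle well
        for eps in np.linspace(0.0, 0.04, 9):           # loading path
            micro = min(neighbours(micro), key=lambda m: energy(m, eps))
            print(f"strain {eps:+.3f}  laminate {micro}")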

    Nuclear data requirements for the ADS conceptual design EFIT: Uncertainty and sensitivity study

    Full text link
    In this paper, we assess the impact of activation cross-section uncertainties on relevant fuel cycle parameters for a conceptual design of a modular European Facility for Industrial Transmutation (EFIT) with a “double strata” fuel cycle. Next, the nuclear data requirements are evaluated so that the parameters can meet the assigned design target accuracies. Different discharge burn-up levels are considered: a low burn-up, corresponding to the equilibrium cycle, and a high burn-up level, simulating the effects on the fuel of the multi-recycling scenario. In order to perform this study, we propose a two-step methodology. First, we compute the uncertainties in the system parameters by means of a Monte Carlo simulation, which is considered the most reliable approach to this problem. Second, the results are analysed with a sensitivity technique, in order to identify the relevant reaction channels and prioritise the data improvement needs. Cross-section uncertainties are taken from the EAF-2007/UN library, since it includes data for all the actinides potentially present in the irradiated fuel. Relevant uncertainties in some of the fuel cycle parameters have been obtained, and we conclude with recommendations for future nuclear data measurement programmes, beyond the specific results obtained with the present nuclear data files and the limited available covariance information. A comparison with the uncertainty and accuracy analysis recently published by the WPEC Subgroup 26 of the OECD using BOLNA covariance matrices is performed. Despite the differences in the transmuter reactor used for the analysis, some conclusions obtained by Subgroup 26 are qualitatively corroborated, and improvements for additional cross sections are suggested.
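
    The actual EFIT burn-up calculations cannot be reproduced here; the sketch below, with entirely made-up cross sections and a one-line surrogate response, only illustrates the two-step methodology the abstract outlines: sample the uncertain cross sections, propagate them through the model by Monte Carlo, then rank reaction channels with a simple sensitivity measure (here, the correlation between each sampled input and the output).

        # Two-step illustration with made-up numbers: (1) Monte Carlo propagation of
        # cross-section uncertainties through a surrogate fuel-cycle response,
        # (2) sensitivity ranking of the input channels by input/output correlation.
        import numpy as np

        rng = np.random.default_rng(0)
        channels = ["Am-241 (n,gamma)", "Pu-239 (n,f)", "Cm-244 (n,gamma)"]   # hypothetical
        nominal = np.array([1.8, 1.9, 0.9])      # nominal cross sections (barn, made up)
        rel_unc = np.array([0.10, 0.02, 0.25])   # relative 1-sigma uncertainties (made up)

        def fuel_cycle_parameter(xs):
            # Stand-in for the burn-up / activation calculation: a smooth response.
            return 100.0 * xs[..., 0] ** 0.5 + 20.0 * xs[..., 1] - 5.0 * xs[..., 2] ** 2

        samples = rng.normal(nominal, rel_unc * nominal, size=(20000, 3))    # step 1: sampling
        response = fuel_cycle_parameter(samples)                             # propagation

        print(f"parameter = {response.mean():.1f} +/- {response.std():.1f}")
        for j, name in enumerate(channels):                                  # step 2: sensitivity
            r = np.corrcoef(samples[:, j], response)[0, 1]
            print(f"{name:20s} correlation with output: {r:+.2f}")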

    Alkali and Alkaline Earth Metal Compounds: Core-Valence Basis Sets and Importance of Subvalence Correlation

    Full text link
    Core-valence basis sets for the alkali and alkaline earth metals Li, Be, Na, Mg, K, and Ca are proposed. The basis sets are validated by calculating spectroscopic constants of a variety of diatomic molecules involving these elements. Neglect of (3s,3p) correlation in K and Ca compounds will lead to erratic results at best, and chemically nonsensical ones if chalcogens or halogens are present. The addition of low-exponent p functions to the K and Ca basis sets is essential for smooth convergence of molecular properties. Inclusion of inner-shell correlation is important for accurate spectroscopic constants and binding energies of all the compounds. In basis set extrapolation/convergence calculations, the explicit inclusion of alkali and alkaline earth metal subvalence correlation at all steps is essential for K and Ca, strongly recommended for Na, and optional for Li and Mg, while in Be compounds, an additive treatment in a separate `core correlation' step is probably sufficient. Consideration of (1s) inner-shell correlation energy in first-row elements requires inclusion of (2s,2p) `deep core' correlation energy in K and Ca for consistency. The latter requires special CCVnZ `deep core correlation' basis sets. For compounds involving Ca bound to electronegative elements, additional d functions in the basis set are strongly recommended. For optimal basis set convergence in such cases, we suggest the sequence CV(D+3d)Z, CV(T+2d)Z, CV(Q+d)Z, and CV5Z on calcium. Comment: Molecular Physics, in press (W. G. Richards issue); supplementary material (basis sets in G98 and MOLPRO formats) available at http://theochem.weizmann.ac.il/web/papers/group12.htm
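
    As a small numerical illustration of the additive core-correlation treatment and the basis-set extrapolation mentioned above: the energies below are invented placeholders, and the two-point E(n) = E_CBS + A/n^3 formula is just one common extrapolation choice, not necessarily the one used in the paper.

        # Illustrative only: two-point n^-3 extrapolation of valence-only correlation
        # energies to the basis-set limit, plus an additive core-correlation
        # correction evaluated in a separate step.  All numbers are placeholders.

        def extrapolate_n3(e_small, e_large, n_small, n_large):
            """CBS estimate from E(n) = E_CBS + A / n**3 at two cardinal numbers n."""
            a = (e_small - e_large) / (n_small ** -3 - n_large ** -3)
            return e_large - a * n_large ** -3

        # Hypothetical valence correlation energies (hartree) at n = 3 (T) and n = 4 (Q).
        e_cbs_valence = extrapolate_n3(-0.3050, -0.3125, 3, 4)

        # Hypothetical additive core-correlation shift from a separate core-valence run.
        delta_core = -0.0042

        print("valence CBS estimate :", round(e_cbs_valence, 5))
        print("with core correction :", round(e_cbs_valence + delta_core, 5))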

    puma: a Bioconductor package for propagating uncertainty in microarray analysis

    Get PDF
    BACKGROUND: Most analyses of microarray data are based on point estimates of expression levels and ignore the uncertainty of such estimates. By determining uncertainties from Affymetrix GeneChip data and propagating these uncertainties to downstream analyses, it has been shown that we can improve results of differential expression detection, principal component analysis and clustering. Previously, implementations of these uncertainty propagation methods have only been available as separate packages, written in different languages. Previous implementations have also been computationally very costly and, in the case of differential expression detection, limited in the experimental designs to which they can be applied. RESULTS: puma is a Bioconductor package incorporating a suite of analysis methods for use on Affymetrix GeneChip data. puma extends the differential expression detection methods of previous work from the 2-class case to the multi-factorial case. puma can be used to automatically create design and contrast matrices for typical experimental designs, which can be used both within the package itself and in other Bioconductor packages. The implementation of differential expression detection methods has been parallelised, leading to significant decreases in processing time on a range of computer architectures. puma incorporates the first R implementation of an uncertainty propagation version of principal component analysis, and an implementation of a clustering method based on uncertainty propagation. All of these techniques are brought together in a single, easy-to-use package with clear, task-based documentation. CONCLUSION: For the first time, the puma package makes a suite of uncertainty propagation methods available to a general audience. These methods can be used to improve results from more traditional analyses of microarray data. puma also offers improvements in terms of scope and speed of execution over previously available methods. puma is recommended for anyone working with the Affymetrix GeneChip platform for gene expression analysis and can also be applied more generally.
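
    puma itself is an R/Bioconductor package and its API is not shown here; as a language-neutral sketch of the underlying idea only, the Python snippet below (synthetic data, hypothetical per-value standard errors) propagates measurement uncertainty into principal component analysis by Monte Carlo resampling of the expression matrix, giving a spread of sample scores rather than a single point estimate.

        # Illustration only (synthetic data, not puma's R API): propagate per-value
        # measurement uncertainty into PCA by resampling the expression matrix and
        # summarising the spread of the resulting sample scores.
        import numpy as np

        rng = np.random.default_rng(1)
        n_genes, n_samples = 200, 6
        expr = rng.normal(8.0, 1.0, size=(n_genes, n_samples))      # point estimates (log scale)
        se = rng.uniform(0.05, 0.5, size=(n_genes, n_samples))      # per-value standard errors

        def pc1_scores(matrix):
            centred = matrix - matrix.mean(axis=1, keepdims=True)   # centre each gene
            u, s, vt = np.linalg.svd(centred, full_matrices=False)
            return s[0] * vt[0]                                     # sample scores on PC1

        draws = np.array([pc1_scores(rng.normal(expr, se)) for _ in range(200)])
        # Align signs: SVD component signs are arbitrary from draw to draw.
        signs = np.sign(draws @ draws[0])
        signs[signs == 0] = 1
        draws *= signs[:, None]

        for j in range(n_samples):
            print(f"sample {j}: PC1 score {draws[:, j].mean():+.2f} +/- {draws[:, j].std():.2f}")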