
    Two-phase friction factor in gas-liquid pipe flow

    An improved friction factor prediction model for two-phase gas-liquid pipe flow is proposed. The model is based on a previous no-slip formulation in which a mixture Reynolds number was defined. In this study, the mixture Reynolds number is modified by introducing slip-ratio information through the inclusion of void-fraction and flow-pattern dependent models. An experimental database compiled from the available literature, together with new frictional pressure-drop data for horizontal air-water flow in a 0.0204 m I.D. pipe, is also presented. The full database covers several different flow conditions for horizontal two-phase flow of refrigerants and air-water mixtures. It was compared against predictions from models in the literature as well as the newly proposed model. We found that the proposed method and the Müller-Steinhagen and Heck method provide the best agreement with the current experimental database. It is shown that including void-fraction information in the previous mixture Reynolds number definition improves the friction-factor prediction.
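
    The modified mixture Reynolds number itself is not reproduced in the abstract, but the baseline no-slip (homogeneous) formulation it builds on can be sketched. The snippet below is a minimal illustration assuming the McAdams mixture-viscosity rule and a Blasius-type friction factor; the mixing rule, constants, and the absence of the paper's void-fraction and flow-pattern corrections are all assumptions, not the authors' model.

        # Minimal sketch of a no-slip (homogeneous) mixture Reynolds number and
        # friction factor for gas-liquid pipe flow. The McAdams viscosity rule
        # and the Blasius correlation are illustrative assumptions only; the
        # paper's void-fraction/flow-pattern corrections are not reproduced.

        def mixture_reynolds(G, D, x, mu_l, mu_g):
            """G: mass flux [kg/m2 s], D: pipe I.D. [m], x: gas quality [-],
            mu_l, mu_g: liquid/gas dynamic viscosities [Pa s]."""
            mu_m = 1.0 / (x / mu_g + (1.0 - x) / mu_l)  # McAdams no-slip mixture viscosity
            return G * D / mu_m

        def blasius_friction_factor(Re):
            """Fanning friction factor: 16/Re in laminar flow, Blasius above Re ~ 2300."""
            return 16.0 / Re if Re < 2300.0 else 0.079 * Re**-0.25

        # Example: air-water flow in the 0.0204 m I.D. pipe mentioned in the abstract.
        Re_m = mixture_reynolds(G=500.0, D=0.0204, x=0.05, mu_l=1.0e-3, mu_g=1.8e-5)
        f_m = blasius_friction_factor(Re_m)
        print(f"Re_m = {Re_m:.0f}, f = {f_m:.5f}")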

    Quantum Entanglement in Second-quantized Condensed Matter Systems

    The entanglement between occupation numbers of different single-particle basis states depends on the coupling between those basis states in the second-quantized Hamiltonian. Thus, in principle, interaction is not necessary for occupation-number entanglement to appear. However, in order to characterize quantum correlations caused by interaction, we use the eigenstates of the single-particle Hamiltonian as the single-particle basis upon which the occupation-number entanglement is defined. Using this proper single-particle basis, we discuss occupation-number entanglement in important eigenstates, especially ground states, of systems of many identical particles. The discussion of Fermi systems starts with the Fermi gas, the Hartree-Fock approximation, and the electron-hole entanglement in excitations. The entanglement in a quantum Hall state is quantified as -f ln f - (1-f) ln(1-f), where f is the proper fractional part of the filling factor. For BCS superconductivity, the entanglement is a function of the relative-momentum wavefunction of the Cooper pair, and is thus directly related to the superconducting energy gap. For a spinless Bose system, entanglement does not appear in the Hartree-Gross-Pitaevskii approximation, but becomes important in the Bogoliubov theory. Comment: 11 pages. Journal version.
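
    The quoted quantum Hall result has the form of a binary (two-outcome) entropy in the fractional part of the filling factor. A minimal sketch of evaluating it numerically follows; the example filling factors are chosen purely for illustration.

        import numpy as np

        def occupation_entanglement(nu):
            """Binary-entropy form -f ln f - (1-f) ln(1-f) quoted in the abstract,
            with f the fractional part of the filling factor nu (natural log)."""
            f = nu % 1.0
            if f == 0.0:  # integer filling: no occupation-number entanglement
                return 0.0
            return -f * np.log(f) - (1.0 - f) * np.log(1.0 - f)

        # Illustrative filling factors only.
        for nu in (1.0, 1.0 / 3.0, 5.0 / 2.0):
            print(f"nu = {nu:.3f}: S = {occupation_entanglement(nu):.4f}")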

    Radon and material radiopurity assessment for the NEXT double beta decay experiment

    The Neutrino Experiment with a Xenon TPC (NEXT), intended to investigate neutrinoless double beta decay using a high-pressure xenon gas TPC filled with Xe enriched in 136Xe at the Canfranc Underground Laboratory in Spain, requires ultra-low background conditions demanding an exhaustive control of material radiopurity and environmental radon levels. An extensive material screening process has been underway for several years, based mainly on gamma-ray spectroscopy using ultra-low background germanium detectors in Canfranc, but also on mass spectrometry techniques such as GDMS and ICPMS. Components from the shielding, pressure vessel, electroluminescence and high voltage elements, and energy and tracking readout planes have been analyzed, helping in the final design of the experiment and in the construction of the background model. The latest measurements carried out will be presented, and the implications of their results for NEXT will be discussed. The commissioning of the NEW detector, as a first step towards NEXT, has started in Canfranc; in-situ measurements of airborne radon levels were taken there to optimize the system for radon mitigation and will also be presented. Comment: Proceedings of the Low Radioactivity Techniques 2015 workshop (LRT2015), Seattle, March 2015.

    Gaia Data Release 1. Summary of the astrometric, photometric, and survey properties

    Context. About 1000 days after the launch of Gaia, we present the first Gaia data release, Gaia DR1, consisting of astrometry and photometry for over 1 billion sources brighter than magnitude 20.7. Aims. A summary of Gaia DR1 is presented along with illustrations of the scientific quality of the data, followed by a discussion of the limitations due to the preliminary nature of this release. Methods. The raw data collected by Gaia during the first 14 months of the mission have been processed by the Gaia Data Processing and Analysis Consortium (DPAC) and turned into an astrometric and photometric catalogue. Results. Gaia DR1 consists of three components: a primary astrometric data set which contains the positions, parallaxes, and mean proper motions for about 2 million of the brightest stars in common with the HIPPARCOS and Tycho-2 catalogues – a realisation of the Tycho-Gaia Astrometric Solution (TGAS) – and a secondary astrometric data set containing the positions for an additional 1.1 billion sources. The second component is the photometric data set, consisting of mean G-band magnitudes for all sources. The G-band light curves and the characteristics of ∼3000 Cepheid and RR-Lyrae stars, observed at high cadence around the south ecliptic pole, form the third component. For the primary astrometric data set the typical uncertainty is about 0.3 mas for the positions and parallaxes, and about 1 mas yr−1 for the proper motions. A systematic component of ∼0.3 mas should be added to the parallax uncertainties. For the subset of ∼94 000 HIPPARCOS stars in the primary data set, the proper motions are much more precise, at about 0.06 mas yr−1. For the secondary astrometric data set, the typical uncertainty of the positions is ∼10 mas. The median uncertainties on the mean G-band magnitudes range from the mmag level to ∼0.03 mag over the magnitude range 5 to 20.7. Conclusions. Gaia DR1 is an important milestone ahead of the next Gaia data release, which will feature five-parameter astrometry for all sources. Extensive validation shows that Gaia DR1 represents a major advance in the mapping of the heavens and the availability of basic stellar data that underpin observational astrophysics. Nevertheless, the very preliminary nature of this first Gaia data release does lead to a number of important limitations to the data quality which should be carefully considered before drawing conclusions from the data.
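
    As a usage note, the TGAS subset of Gaia DR1 described above is queryable through the public Gaia archive. The sketch below assumes the astroquery package and the gaiadr1.tgas_source table and column names from the public archive schema; none of this is taken from the paper itself.

        # Minimal sketch: query a few TGAS sources from Gaia DR1 via the Gaia archive.
        # Assumes astroquery is installed; table/column names follow the public
        # gaiadr1 schema and are assumptions here, not taken from the paper.
        from astroquery.gaia import Gaia

        query = """
        SELECT TOP 5 source_id, ra, dec, parallax, parallax_error,
               pmra, pmdec, phot_g_mean_mag
        FROM gaiadr1.tgas_source
        ORDER BY phot_g_mean_mag
        """
        job = Gaia.launch_job(query)
        print(job.get_results())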

    Nut production in Bertholletia excelsa across a logged forest mosaic: implications for multiple forest use

    Although many examples of multiple-use forest management may be found in tropical smallholder systems, few studies provide empirical support for the integration of selective timber harvesting with non-timber forest product (NTFP) extraction. Brazil nut (Bertholletia excelsa, Lecythidaceae) is one of the world's most economically important NTFP species, extracted almost entirely from natural forests across the Amazon Basin. An obligate out-crosser, the Brazil nut bears flowers pollinated by large-bodied bees, a process resulting in a hard, round fruit that takes up to 14 months to mature. As many smallholders turn to the financial security provided by timber, Brazil nut fruits are increasingly being harvested in logged forests. We tested the influence of tree and stand-level covariates (distance to nearest cut stump and local logging intensity) on total nut production at the individual tree level in five recently logged Brazil nut concessions covering about 4000 ha of forest in Madre de Dios, Peru. Our field team accompanied Brazil nut harvesters during the traditional harvest period (January-April 2012 and January-April 2013) in order to collect data on fruit production. Three hundred and ninety-nine (approximately 80%) of the 499 trees included in this study were at least 100 m from the nearest cut stump, suggesting that concessionaires avoid logging near adult Brazil nut trees. Yet even for those trees on the edge of logging gaps, distance to nearest cut stump and local logging intensity did not have a statistically significant influence on Brazil nut production at the applied logging intensities (typically 1-2 timber trees removed per ha). In one concession where at least 4 trees per ha were removed, however, the logging intensity covariate resulted in a marginally significant P value (0.09), highlighting a potential risk of a drop in nut production at higher intensities. While we do not suggest that logging activities should be completely avoided in Brazil nut rich forests, when a buffer zone cannot be observed, low logging intensities should be implemented. The sustainability of this integrated management system will ultimately depend on a complex series of socioeconomic and ecological interactions. Yet we submit that our study provides an important initial step in understanding the compatibility of timber harvesting with a high value NTFP, potentially allowing for diversification of forest use strategies in Amazonian Peru.
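
    The tree-level analysis described above (nut production modelled against distance to nearest cut stump and local logging intensity) is the kind of covariate test that could be sketched as a count regression. The data frame, column names, and Poisson model family below are assumptions for illustration only, not the authors' actual specification.

        # Illustrative sketch of a tree-level count regression: nut production vs.
        # distance to nearest cut stump and local logging intensity. File name,
        # column names, and the Poisson family are hypothetical.
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Hypothetical data frame: one row per Brazil nut tree.
        trees = pd.read_csv("brazil_nut_trees.csv")  # columns: nuts, dist_stump_m, logging_intensity

        model = smf.glm(
            "nuts ~ dist_stump_m + logging_intensity",
            data=trees,
            family=sm.families.Poisson(),
        ).fit()
        print(model.summary())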

    Updating the Food-Based Dietary Guidelines for the Spanish Population: The Spanish Society of Community Nutrition (SENC) Proposal

    Diet-related risk factors and physical inactivity are among the leading risk factors for disability and are responsible for a large proportion of the burden of chronic non-communicable diseases. Food-based dietary guidelines (FBDGs) are useful tools for nutrition policies and public health strategies to promote healthier eating and physical activity. In this paper, we discuss the process followed in developing the dietary guidelines for the Spanish population by the Spanish Society of Community Nutrition (SENC) and further explain the collaboration with primary healthcare practitioners as presented in the context of the NUTRIMAD 2018 international congress of SENC. Following a health-in-all-policies approach, SENC convened a group of experts in nutrition and public health to review the evidence on diet and health, nutrient intake, and food consumption in the Spanish population, as well as food preparation, determinants, and the impact of diet on environmental sustainability. The collaborative group drafted the document and designed the graphic icon, which was then subject to a consultation process, discussion, and qualitative evaluation. Next, a collaborative group was established to plan a dissemination strategy, involving delegates from all the primary healthcare scientific societies in Spain. A product of this collaboration was the release of an attractive, easy-to-understand publication.

    Radiopurity control in the NEXT-100 double beta decay experiment: procedures and initial measurements

    We have investigated the possibility of calibrating the PMTs of scintillation detectors using the primary scintillation produced by X-rays to induce a single-photoelectron response of the PMT. Under some conditions, the high-energy tail of this response can be approximated by an exponential function. In these cases, it is possible to determine the average gain for each PMT biasing voltage from the inverse of the exponent of the exponential fit to the tail, which can be done even if background and/or noise cover up most of the distribution. We have compared our results with those obtained by the commonly used single electron response (SER) method, which uses an LED to induce a single-photoelectron response of the PMT and determines the peak position of that response relative to the pedestal peak (the electronic noise peak, which corresponds to 0 photoelectrons). The results of the exponential fit method agree with those obtained by the SER method when the average number of photoelectrons reaching the first dynode per light/scintillation pulse is around 1.0. The SER method has higher precision, while the exponential fit method has the advantage of being useful in situations where the PMT is already in situ and it is difficult or even impossible to apply the SER method, e.g. in sealed scintillator/PMT devices.
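
    A minimal sketch of the tail-fit idea described above: fit an exponential to the high-charge tail of the single-photoelectron charge spectrum and take the gain estimate from the inverse of the fitted exponent. The input file, tail threshold, and charge units are assumptions for illustration only.

        # Sketch: estimate PMT gain from an exponential fit to the high-charge tail
        # of the single-photoelectron spectrum. Input data and tail range are
        # illustrative, not from the paper.
        import numpy as np
        from scipy.optimize import curve_fit

        def exp_tail(q, A, k):
            """Exponential model for the tail: A * exp(-k * q); gain ~ 1/k."""
            return A * np.exp(-k * q)

        # Hypothetical input: one pulse charge per line (e.g. in pC).
        charges = np.loadtxt("spe_charges.txt")
        counts, edges = np.histogram(charges, bins=200)
        centers = 0.5 * (edges[:-1] + edges[1:])

        # Fit only the high-charge tail, above a threshold chosen by inspection.
        tail = centers > 0.5  # illustrative threshold
        popt, pcov = curve_fit(exp_tail, centers[tail], counts[tail],
                               p0=(counts[tail].max(), 1.0))
        gain_estimate = 1.0 / popt[1]
        print(f"average gain (same units as the charge axis): {gain_estimate:.3f}")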

    Microorganisms and spatial distribution of the sinkholes of the Yucatan Peninsula, underestimated biotechnological potential?

    Research on the biotechnological potential of the microbial species that inhabit the cenotes of the Yucatan Peninsula. Abstract. Objective: To map the spatial distribution of the sinkholes of the Peninsula of Yucatan (SPY) and identify those cenotes where microorganisms have been recorded. Methods: The geographic coordinates of the SPYs were obtained from various databases, as well as from scientific publications using the terms 'sinkholes', 'karst systems' and 'cenotes'. All coordinates were transformed into the Universal Transverse Mercator (UTM) reference system with datum WGS84. An infrared composite image was created using a 4-3-2 RGB band combination from the Landsat 8 satellite. The point locations of the cenotes were imported into the TerrSet software. Results: A total of 1026 sinkhole coordinates were recorded in the Yucatan Peninsula. In 18 sinkholes (<2%), microorganisms have been recovered and identified at various taxonomic levels, and in only 6 sinkholes (<0.6%) has their biotechnological potential been evaluated. Conclusions: The microorganisms that inhabit the sinkholes of the Yucatan Peninsula are a reservoir with practically unexplored biotechnological potential. Funding: CONACYT.
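
    The coordinate handling described in the Methods (geographic coordinates transformed to UTM on the WGS84 datum) can be sketched as below. The pyproj library, the UTM zone 16N EPSG code, and the example coordinates are assumptions for illustration, since the abstract does not state the tooling or zone.

        # Sketch: transform cenote coordinates from geographic (lat/lon, WGS84) to UTM.
        # pyproj and EPSG:32616 (UTM zone 16N, covering most of the Yucatan Peninsula)
        # are assumptions; the paper does not state the tooling or zone used.
        from pyproj import Transformer

        to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32616", always_xy=True)

        # Hypothetical cenote locations as (longitude, latitude) pairs.
        cenotes = [(-89.62, 20.97), (-88.57, 20.68)]
        for lon, lat in cenotes:
            easting, northing = to_utm.transform(lon, lat)
            print(f"({lat:.2f}, {lon:.2f}) -> E {easting:.0f} m, N {northing:.0f} m")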

    Diagnostic accuracy of non-invasive tests for advanced fibrosis in patients with NAFLD: An individual patient data meta-analysis

    Objective Liver biopsy is still needed for fibrosis staging in many patients with non-alcoholic fatty liver disease. The aims of this study were to evaluate the individual diagnostic performance of liver stiffness measurement by vibration controlled transient elastography (LSM-VCTE), Fibrosis-4 Index (FIB-4) and NAFLD (non-alcoholic fatty liver disease) Fibrosis Score (NFS) and to derive diagnostic strategies that could reduce the need for liver biopsies. Design An individual patient data meta-analysis of studies evaluating LSM-VCTE against liver histology was conducted. FIB-4 and NFS were computed where possible. Sensitivity, specificity and area under the receiver operating characteristic curve (AUROC) were calculated. Biomarkers were assessed individually and in sequential combinations. Results Data were included from 37 primary studies (n=5735; 45% women; median age: 54 years; median body mass index: 30 kg/m²; 33% had type 2 diabetes; 30% had advanced fibrosis). AUROCs of individual LSM-VCTE, FIB-4 and NFS for advanced fibrosis were 0.85, 0.76 and 0.73, respectively. A sequential combination of FIB-4 cut-offs (<1.3; ≥2.67) followed by LSM-VCTE cut-offs (<8.0; ≥10.0 kPa) to rule in or rule out advanced fibrosis had sensitivity and specificity (95% CI) of 66% (63-68) and 86% (84-87), with 33% needing a biopsy to establish a final diagnosis. FIB-4 cut-offs (<1.3; ≥3.48) followed by LSM cut-offs (<8.0; ≥20.0 kPa) to rule out advanced fibrosis or rule in cirrhosis had a sensitivity of 38% (37-39) and specificity of 90% (89-91), with 19% needing biopsy. Conclusion Sequential combinations of markers with a lower cut-off to rule out advanced fibrosis and a higher cut-off to rule in cirrhosis can reduce the need for liver biopsies.
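
    The sequential rule described in the Results (a FIB-4 screen, with LSM-VCTE for the indeterminate cases) can be written out directly. The sketch below uses the advanced-fibrosis cut-offs quoted above and is an illustration of the decision flow, not clinical guidance.

        # Sketch of the sequential FIB-4 -> LSM-VCTE strategy quoted in the abstract:
        # FIB-4 < 1.3 rules out advanced fibrosis, FIB-4 >= 2.67 rules it in;
        # indeterminate cases go to LSM-VCTE (< 8.0 kPa rules out, >= 10.0 kPa rules in),
        # and anything still indeterminate is referred for biopsy. Illustration only.

        def sequential_fib4_lsm(fib4, lsm_kpa=None):
            if fib4 < 1.3:
                return "advanced fibrosis ruled out"
            if fib4 >= 2.67:
                return "advanced fibrosis ruled in"
            # Indeterminate FIB-4: fall back to LSM-VCTE if available.
            if lsm_kpa is None:
                return "indeterminate: LSM-VCTE needed"
            if lsm_kpa < 8.0:
                return "advanced fibrosis ruled out"
            if lsm_kpa >= 10.0:
                return "advanced fibrosis ruled in"
            return "indeterminate: liver biopsy to establish final diagnosis"

        print(sequential_fib4_lsm(1.1))        # ruled out on FIB-4 alone
        print(sequential_fib4_lsm(1.8, 9.2))   # indeterminate after both tests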