
    An Analysis of Renewable Energy Usage by Mobile Data Network Operators

    The exponential growth in mobile data traffic has resulted in massive energy usage and has therefore increased the carbon footprint of the Internet. Data network operators have taken significant initiatives to mitigate the negative impacts of carbon emissions (CE). Renewable Energy Sources (RES) have emerged as the most promising way to reduce carbon emissions. This article presents the role of renewable energy (RE) in minimizing the environmental impacts of mobile data communications to achieve a greener environment. An analysis of the energy consumption (EC) of selected mobile data network operators is presented. Based on the current statistics of different mobile network operators, future energy values are estimated. These carbon-emission estimates are based on the data traffic predicted for the coming years and the percentage of energy that the network operators consume from renewable sources. The analysis presented in this article should help develop and implement energy policies that accelerate the growth of renewable shares in total energy requirements. Increasing the share of renewable energy in total energy requirements can be a way forward to reaching Goal 7 of the United Nations Sustainable Development Goals (SDGs).
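The kind of projection the abstract describes can be sketched as a simple compounding model: energy scales with predicted traffic, and emissions scale with the non-renewable fraction of that energy. This is an illustrative sketch, not the authors' model; all parameter values below are hypothetical.

```python
# Illustrative sketch (not the authors' model): project an operator's annual
# energy consumption and carbon emissions from an assumed traffic growth
# factor and a rising renewable-energy share. All numbers are hypothetical.

def project_emissions(base_energy_gwh, traffic_growth, re_share_start,
                      re_share_step, ef_grid, years):
    """Return a list of (year, energy_gwh, emissions_kt) tuples.

    base_energy_gwh : current annual energy consumption (GWh)
    traffic_growth  : assumed annual growth factor of data traffic (e.g. 1.3)
    re_share_start  : current renewable share of total energy (0..1)
    re_share_step   : assumed annual increase of the renewable share
    ef_grid         : emission factor (kt CO2 per GWh of non-renewable energy)
    """
    results = []
    energy = base_energy_gwh
    re_share = re_share_start
    for year in range(1, years + 1):
        energy *= traffic_growth                 # energy scales with traffic
        re_share = min(1.0, re_share + re_share_step)
        emissions = energy * (1.0 - re_share) * ef_grid
        results.append((year, round(energy, 1), round(emissions, 2)))
    return results

# Hypothetical operator: 100 GWh/year, 30% annual traffic growth, a 20%
# renewable share rising 5 points per year, 0.4 kt CO2 per grid GWh.
projection = project_emissions(100.0, 1.3, 0.20, 0.05, 0.4, 5)
```

Whether emissions rise or fall over the horizon depends on the race between the traffic growth factor and the pace at which the renewable share increases, which is exactly the policy lever the abstract points at.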

    Composite vertices that lead to soft form factors

    The momentum-space cut-off parameter Λ of hadronic vertex functions is studied in this paper. We use a composite model in which we can measure the contributions of intermediate-particle propagation to Λ. We show that in many cases a composite vertex function has a much smaller cut-off than its constituent vertices, particularly when light constituents such as pions are present in the intermediate state. This suggests that composite meson-baryon-baryon vertex functions are rather soft, i.e., they have Λ considerably less than 1 GeV. We discuss the origin of this softening of form factors as well as the implications of our findings for the modeling of nuclear reactions.
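For context, hadronic vertex form factors are often parametrized with a monopole shape (an illustrative convention; the exact form used in the paper may differ):

```latex
F(\mathbf{q}^2) = \frac{\Lambda^2}{\Lambda^2 + \mathbf{q}^2}
```

A smaller cut-off Λ makes F fall off faster with momentum transfer, which is what "soft" means here; the abstract's claim is that composite vertices inherit an effectively smaller Λ than their constituent vertices.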

    Strontium- and Zinc-Containing Bioactive Glass and Alginates Scaffolds.

    With an increasingly elderly population, there is a proportionate increase in bone injuries requiring hospitalization. Clinicians are increasingly adopting tissue-engineering methods for treatment due to limitations in the use of autogenous and autologous grafts. The aim of this study was to synthesize a novel, bioactive, porous, mechanically stable bone graft substitute/scaffold. Strontium- and zinc-containing bioactive glasses were synthesized and used with varying amounts of alginate to form scaffolds. Differential scanning calorimetry (DSC), FTIR, XRD, and NMR techniques were used to characterize the scaffolds. SEM confirmed the adequate porous structure of the scaffolds required for osteoconductivity. The incorporation of the bioactive glass with alginate improved the compressive strength of the scaffolds. The bioactivity of the scaffolds was demonstrated by an increase in the pH of the medium after immersion of the scaffolds in a Tris/HCl buffer and by the formation of an orthophosphate precipitate on the scaffolds. The scaffolds were able to release calcium, strontium and zinc ions into the Tris/HCl buffer, which would have a positive impact on osteogenesis if tested in vivo.

    Parameter Inference in the Pulmonary Circulation of Mice

    This study focuses on parameter inference in a pulmonary blood circulation model for mice. It utilises a fluid dynamics network model that takes selected parameter values and aims to mimic features of the pulmonary haemodynamics under normal physiological and pathological conditions. This is of medical relevance as it allows monitoring of the progression of pulmonary hypertension. Constrained nonlinear optimization is successfully used to learn the parameter values.
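Constrained nonlinear optimization for parameter inference can be sketched on a toy model: fit a least-squares misfit between model and data subject to bounds on the parameters. This is a minimal illustration in the spirit of the study, not the pulmonary-circulation model itself; the two-parameter exponential decay, the data and the bounds below are all hypothetical.

```python
# Sketch of constrained nonlinear parameter inference on a toy model
# (a two-parameter exponential decay), using scipy.optimize.minimize
# with bounds. Model, data and bounds are hypothetical.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
true_amp, true_rate = 2.0, 3.0
data = true_amp * np.exp(-true_rate * t) + 0.01 * rng.standard_normal(t.size)

def loss(params):
    amp, rate = params
    model = amp * np.exp(-rate * t)
    return np.sum((model - data) ** 2)   # sum-of-squares misfit

# Physically motivated bounds keep the optimiser in a plausible region,
# mirroring the constrained formulation mentioned in the abstract.
result = minimize(loss, x0=[1.0, 1.0],
                  bounds=[(0.1, 10.0), (0.1, 10.0)], method="L-BFGS-B")
amp_hat, rate_hat = result.x
```

Bounds play the role of the physiological constraints: they exclude parameter regions the haemodynamic model could never occupy, which also stabilises the optimisation.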

    Affective Factors on Reliability of Laboratory Tests Based on ISO 17025:2005

    The reliability of laboratory tests is the main pillar of the applied side of quality in construction projects: the construction materials used in these projects cannot be accepted or rejected until they have passed the laboratory tests on which acceptance or rejection is based. Many organizations have sought to achieve competitive advantage by providing high-quality services through the implementation of total quality management standards, so it became necessary for construction laboratories to adopt quality management in their work, in particular the application of the ISO 17025:2005 standard, which improves the administrative and technical performance of these laboratories.
This research includes a statistical study of a sample of construction laboratories, construction contracting companies and project-implementation entities in some government departments, to determine the factors affecting the reliability of laboratory tests. The results rated the factors in each category as influential or very influential, according to the scores assigned to them. The methodology used in the research provides recommendations and suggestions that help laboratory staff focus on the factors influencing the reliability of tests and handle them according to the ISO 17025:2005 standard.

    General method for extracting the quantum efficiency of dispersive qubit readout in circuit QED

    We present and demonstrate a general three-step method for extracting the quantum efficiency of dispersive qubit readout in circuit QED. We use active depletion of post-measurement photons and optimal integration weight functions on two quadratures to maximize the signal-to-noise ratio of the non-steady-state homodyne measurement. We derive analytically and demonstrate experimentally that the method robustly extracts the quantum efficiency for arbitrary readout conditions in the linear regime. We use the proven method to optimally bias a Josephson traveling-wave parametric amplifier and to quantify different noise contributions in the readout amplification chain.
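The optimal linear integration weights mentioned above are commonly taken proportional to the difference of the averaged readout traces for the two qubit states. The toy sketch below illustrates that convention on synthetic single-quadrature traces; the trace shapes, noise level and shot counts are assumptions, not values from the paper.

```python
# Toy sketch of SNR extraction with matched integration weights for
# dispersive readout. Traces and noise parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_shots = 200, 2000

tt = np.linspace(0.0, 1.0, n_samples)
mean0 = np.zeros(n_samples)                          # mean response for |0>
mean1 = (1 - np.exp(-tt / 0.1)) * np.exp(-tt / 0.4)  # mean response for |1>

noise = 0.5
shots0 = mean0 + noise * rng.standard_normal((n_shots, n_samples))
shots1 = mean1 + noise * rng.standard_normal((n_shots, n_samples))

# Matched-filter weights: difference of the averaged traces per state.
w = shots1.mean(axis=0) - shots0.mean(axis=0)

# Weighted integration of each shot, then SNR of the two distributions.
s0, s1 = shots0 @ w, shots1 @ w
snr = abs(s1.mean() - s0.mean()) / np.sqrt(0.5 * (s0.var() + s1.var()))
```

Because the readout is not in steady state, the weights emphasise the samples where the two state-conditioned traces differ most, which is what maximises the SNR of the integrated signal.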

    Estimating flood levels for a flash flood in an urban area: a comparison of two hydrodynamic models on the Nîmes flood of October 1988

    During extreme floods in urban areas, a large share of the flow remains at the surface. To simulate these floods, two models are presented: the one-dimensional software REM2U aims to simulate the propagation of flood discharges through a whole street network, whereas the two-dimensional software Rubar 20 aims to provide more information on these flows. Computations with both packages were carried out on the October 1988 flood in a district of Nîmes. During this event, maximum water depths exceeded two metres at some points and velocities exceeded 2 m/s, leading to transitions to supercritical flow. From the data collected on the street cross sections, computational grids restricted to the street network were built for both packages to allow a detailed calculation. The comparison of the results with the flood marks shows situations that contrast strongly from one point to another, although the mean maximum water depth over the whole flooded area is correctly simulated. The deviation in this depth is, on average, 1 m, which stems from uncertainties in the observations, the topography and the boundary conditions, from approximations made in the modelling, and from local features that are not described. Between the two packages, the evolution of depths and velocities is generally very close although, as in the comparison with the flood marks, significant local differences are observed.
The hydraulic models that are used to simulate floods in rural areas are not suited to modelling floods through urban areas, because of details that may deviate flows and create strong discontinuities in the water levels, and because of the possible flow running in the sewage network. However, such modelling is strongly needed because damage is often concentrated in urban areas. It is therefore necessary to develop models specifically dedicated to such floods.
In the southern part of France, rains may have a high intensity but floods generally last only a few hours. During extreme events such as the October 1988 flood in the city of Nîmes, most of the flow remained on the ground with high water depths and high velocities, and the role of the sewage network could be neglected. A 1-D model and a 2-D model were used to calculate such flows, which may become supercritical. On the catchments of the streams which cross the city of Nîmes, the rainfall was estimated at 80 mm in one hour and 250 mm in six hours in October 1988, although some uncertainties remain. The return period can be estimated at between 150 and 250 years.
The zone selected to test the models was an area 1.2 km long and less than 1 km wide in the north-eastern part of the city. It includes a southern part with a high density of houses. The slope from the North (upstream) to the South (downstream) was more than 1% on average and decreased from North to South. Various topographical and hydrological data were obtained from the local authorities. The basic data consisted of 258 cross sections of 69 streets, with 11 to 19 points per cross section. Observations of the limits of the flooded areas and of the peak water levels at more than 80 points can be used to validate the calculation results. The inputs consisted of two discharge hydrographs, estimated with a rainfall-discharge model from rains with a return period of 100 years, which may result in an underestimate of these inputs. These two hydrographs correspond to the two main structures that cross the railway embankment, which constitutes an impervious upstream boundary of the modelled area. Whereas the western and eastern boundaries are well delimited by hills above the maximum water levels, the downstream southern boundary is somewhat more questionable because of possibilities of backwater and inflows from neighbouring areas.
The 1-D software REM2U solved the Saint-Venant equations on a meshed network.
At crossroads, continuity of discharge and of water heads was imposed. The hydraulic jump was modelled by a numerical diffusion applied wherever high water levels were found. The Lax-Wendroff numerical scheme was implemented. It included a prediction step and a correction step, which allowed precise solving of these very unsteady, hyperbolic problems. The software was validated on numerous test cases (Al Mikdad, 2000), which proved its suitability for calculations in a network of streets.
The 2-D software Rubar 20 solves the 2-D shallow-water equations by an explicit second-order Van Leer type finite-volume scheme on a computational grid made of triangles and quadrilaterals (Paquier, 1998). The discontinuities (hydraulic jumps, for instance) are treated as ordinary points through the solving of Riemann problems. For the Nîmes case, the grid was built from the cross sections of the streets. Four grids were built with 4, 5, 7 or 11 points per cross section respectively, these points corresponding to the main features of the cross section: the walls of the buildings, the sidewalks, the gutters and the middle point. The simplest crossroads were described from the crossings of the lines corresponding to these points, which yields 16, 25, 49 or 121 computational cells respectively. The space step was about 25 metres along the streets but went as low as 0.1 m in the crossroads; owing to the explicit scheme, which requires the Courant number to stay below 1, the time step was very small and a long computational time was required.
The computations were performed with a uniform Strickler coefficient of 40 m^(1/3)/s. Both the 1-D and the 2-D model provided results that agreed well with the observed water levels. The limits of the flooded area were also quite well simulated. However, locally, the differences between calculated and observed maximum water depths were high, resulting in an average deviation of about 1 metre.
The reasons for such deviations could come from three main causes. First, the uncertainty of the topographical data is relatively high, because of the interpolation between measured cross sections without a detailed complementary DEM (digital elevation model). Second, the observed levels were also uncertain and reflect local situations that are not reconstructed by the hydraulic models, which provide maximum water levels averaged over one cell that may not coincide with the exact location of the observations. Finally, modelling means a simplification of the processes, which implies neglecting the level variations due to obstacles, such as cars, that are not simple to identify.
In conclusion, both software packages can model a flood, even a flash flood, in an urbanised area. Research is still necessary to develop methods that fully use urban databases in order to describe details more precisely. The improvements to the 1-D software should include a better modelling of storage and of crossroads, with the integration of adapted relations for the head losses. The 2-D software has a greater potential, but the difficulty of building an optimal computational grid entails a long computational time, which limits the use of such software to small areas. For both software packages, methods still need to be developed to represent exchanges with the sewage network, storage inside buildings and inputs coming directly from rainfall.
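The Courant constraint mentioned for the explicit 2-D scheme ties the allowable time step to the smallest cell: the time step must not exceed the time a shallow-water wave takes to cross that cell. A minimal sketch of this stability bound, using the depth, velocity and minimum cell size reported in the abstract (the exact grid values remain assumptions):

```python
import math

def cfl_time_step(dx_min, depth_max, velocity_max, courant=1.0, g=9.81):
    """Largest stable time step for an explicit shallow-water scheme.

    The characteristic wave speed is |u| + sqrt(g*h); the
    Courant-Friedrichs-Lewy condition requires
    dt <= courant * dx / wave_speed for the smallest cell.
    """
    wave_speed = abs(velocity_max) + math.sqrt(g * depth_max)
    return courant * dx_min / wave_speed

# Values inspired by the study: cells down to 0.1 m in crossroads,
# depths up to 2 m and velocities up to 2 m/s.
dt = cfl_time_step(dx_min=0.1, depth_max=2.0, velocity_max=2.0)
```

With 0.1 m cells the stable time step is on the order of a hundredth of a second, which is why the abstract reports a very small time step and long computation times for the 2-D grid.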