1,074 research outputs found

    Navier-Stokes computation of compressible turbulent flows with a second order closure, part 1

    A second-order closure turbulence model for compressible flows is developed and implemented in a 2D Reynolds-averaged Navier-Stokes solver. From the initial stage, in which a kappa-epsilon turbulence model was implemented in MacCormack's bidiagonal implicit method (referred to as the MAC3 code), to the final stage of implementing a full second-order closure in the efficient line Gauss-Seidel algorithm, much work was done, individually and collectively. Beyond the collaboration itself, the final product of this work is a second-order closure derived from the Launder, Reece, and Rodi model to account for near-wall effects, called the FRAME model (for FRench-AMerican-Effort). During the reporting period, two problems were worked out. The first was to provide Ames researchers with a reliable compressible boundary-layer code including a wide collection of turbulence models for quick testing of new terms, both in two-equation models and in second-order closures (LRR and FRAME). The second was to complete the implementation of the FRAME model in the MAC5 code. The work related to these two contributions is reported, including the treatment of pressure dilatation in the presence of strong shocks. This work, conducted during a stay at the Center for Turbulence Research with Zeman, also aimed to cross-check earlier assumptions by Rubesin and Vandromme.

    Halal Tourism in the Context of Tourism Sector in Tunisia: Controversies, Challenges, and Opportunities

    This paper examines the development of the tourism industry in Tunisia from its independence in 1956 to contemporary Tunisia, with a special focus on the development of halal tourism. In assessing the tourism sector in Tunisia, a typology is used. This typology makes it clear that Tunisia is a prominent tourist destination in several types of tourism, including beach tourism, while in other types it remains underdeveloped. Regarding the emerging form of halal tourism, Tunisia is lagging behind in its development. This may come as a surprise given that Tunisia is a popular tourist destination and a Muslim-majority country. Although there are different reasons for this, I would argue that much can be explained by the political context in Tunisia. The reluctance of government actors hinders the profound development of halal tourism. The policies put forward in the past by Bourguiba and Ben Ali have had an undeniable impact on the general opinion of contemporary Tunisian society on this matter. Even though Tunisia is faced with challenges, it can still be a suitable country for halal tourism while maintaining other forms of tourism. Keywords: Tunisia, halal tourism, religious tourism, opportunities and challenges.

    Test code for the assessment and improvement of Reynolds stress models

    An existing two-dimensional, compressible-flow Navier-Stokes computer code, containing a full Reynolds stress turbulence model, was adapted for use as a test bed for assessing and improving turbulence models based on turbulence simulation experiments. To date, the results of using the code in comparison with simulated channel flow and flow over an oscillating flat plate have shown that the turbulence model used in the code needs improvement for these flows. It is also shown that direct simulations of turbulent flows over a range of Reynolds numbers are needed to guide subsequent improvement of turbulence models.

    Data acquisition from digital holograms of particles

    A technique is considered for data acquisition from digital holograms of particle ensembles, including preprocessing of the digital hologram, construction of a two-dimensional display of the holographic image of the investigated volume, and segmentation and measurement of particle characteristics. The proposed technique runs automatically and can work in real time. Results of validating the technique on digital holograms of sand, plankton particles in water, and air bubbles in oil are presented.
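The segmentation-and-measurement step described above can be sketched with standard image-processing primitives. This is an illustrative simplification, not the authors' actual pipeline; the fixed threshold, the function name, and the synthetic test image are all assumptions:

```python
import numpy as np
from scipy import ndimage

def measure_particles(image, threshold):
    """Segment a reconstructed hologram plane and measure particle sizes.

    `image` is a 2-D intensity array with dark particles on a bright
    background; returns equivalent-circle diameters in pixels.
    """
    mask = image < threshold                      # particles absorb light
    labels, n = ndimage.label(mask)               # connected components
    areas = ndimage.sum(mask, labels, index=list(range(1, n + 1)))
    # equivalent diameter of a circle with the same area as each blob
    return [2.0 * np.sqrt(a / np.pi) for a in areas]

# Synthetic test image: two dark square "particles" on a bright field.
img = np.ones((64, 64))
img[10:14, 10:14] = 0.0   # 16-pixel particle
img[40:46, 40:46] = 0.0   # 36-pixel particle
diameters = measure_particles(img, threshold=0.5)
```

A real system would repeat this over many reconstructed depth planes and select, per particle, the plane of best focus before measuring.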

    Response of a supersonic boundary layer to a compression corner

    On the basis of direct numerical simulations of rapidly compressed turbulence, Zeman and Coleman have developed a model to represent the contribution of rapid directional compression to the pressure-dilatation term in the turbulent kinetic energy equation. The model has been implemented in the CFD code for simulation of supersonic compression corner flow with an extended separated region. The computational results have shown a significant improvement with respect to the baseline solution given by the standard k-epsilon turbulence model, which does not contain any compressibility corrections.

    Progesterone in traumatic brain injury: time to move on to phase III trials

    There are several candidate neuroprotective agents that have been shown in preclinical testing to improve outcomes following traumatic brain injury (TBI). Xiao and colleagues have performed an in-hospital, double-blind, randomized controlled clinical trial utilizing progesterone in the treatment of patients sustaining TBI, evaluating safety and long-term clinical outcomes. These data, combined with the results of the previously published ProTECT trial, show progesterone to be safe and potentially efficacious in the treatment of TBI. Larger phase III trials will be necessary to verify these results prior to clinical implementation. Clinical trials networks devoted to the study of TBI are vital to the timely clinical testing of these candidate agents and need to be supported.

    An integrated approach coupling physically based models and probabilistic method to assess quantitatively landslide susceptibility at different scale: application to different geomorphological environments

    Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout, and intensity may occur within a given period. The first step in analyzing landslide hazard consists in assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (inventory-based and knowledge-driven methods) and (ii) quantitative approaches (data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of a probability of failure (via the safety factor) under specific environmental conditions. For some models it is possible to integrate land-use and climatic change. Conversely, their major drawbacks are the large amounts of reliable and detailed data required (especially material types, their thickness, and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account. This is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) the spatial variation of physical parameters, and (iii) different landslide types, the French Geological Survey (BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface.
The variability of mechanical parameters is handled by a Monte Carlo approach. The probability of obtaining a safety factor below 1 represents the probability of occurrence of a landslide for a given triggering event. The dispersion of the distribution gives the uncertainty of the result. Finally, a map is created displaying a probability of occurrence for each computing cell of the studied area. In order to take land-use change into account, a complementary module integrating the effects of vegetation on soil properties has recently been developed. In recent years, the model has been applied at different scales in different geomorphological environments: (i) at regional scale (1:50,000-1:25,000) in the French West Indies and French Polynesian islands; (ii) at local scale (1:10,000) for two complex mountainous areas; and (iii) at site-specific scale (1:2,000) for one landslide. For each study the 3D geotechnical model was adapted. These studies made it possible (i) to discuss the different factors included in the model, especially the initial 3D geotechnical models; (ii) to pinpoint the location of probable failures under different hydrological scenarios; and (iii) to test the effects of climatic change and land use on slopes for two cases. In this way, future changes in temperature, precipitation, and vegetation cover can be analyzed, making it possible to address the impacts of global change on landslides. Finally, the results show that it is possible to obtain reliable information about future slope failures at different scales of work for different scenarios with an integrated approach. The final information about landslide susceptibility (i.e. probability of failure) can be integrated in landslide hazard assessment and could be an essential information source for future land planning.
As performed in the ANR project SAMCO (Society Adaptation for coping with Mountain risks in a global change COntext), this analysis constitutes a first step in the risk assessment chain for different climate and economic development scenarios, used to evaluate the resilience of mountainous areas.
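The Monte Carlo step described above, sampling uncertain soil parameters and counting how often the safety factor drops below 1, can be illustrated with a deliberately simplified dry infinite-slope stability formula rather than the Morgenstern-Price analysis that ALICE actually uses; every parameter value below is hypothetical:

```python
import math
import random

def infinite_slope_fs(c, phi_deg, gamma=19e3, z=2.0, beta_deg=35.0):
    """Safety factor of a dry infinite slope: resisting over driving
    stress. A stand-in for the Morgenstern-Price method, for
    illustration only. c in Pa, angles in degrees, gamma in N/m^3."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c + gamma * z * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

def failure_probability(n=20_000, seed=42):
    """P(FS < 1) when cohesion and friction angle are uncertain,
    estimated by Monte Carlo sampling (hypothetical distributions)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        c = max(0.0, rng.gauss(5e3, 2e3))      # cohesion [Pa]
        phi = max(1.0, rng.gauss(30.0, 4.0))   # friction angle [deg]
        if infinite_slope_fs(c, phi) < 1.0:
            failures += 1
    return failures / n

p = failure_probability()   # probability of failure for one cell
```

In the model described above, this per-cell probability would be computed for every cell of the study area and mapped, with the spread of the sampled FS distribution expressing the uncertainty.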

    Consequential modeling of the energy consumption of a group of servers providing a cloud computing service, and attributional modeling of the benefits of virtualization

    Electricity is currently an essential element of modern societies, used daily by the entire population. However, its production contributes to environmental damage, including the release of greenhouse gases (GHG), recognized by a very large majority of the scientific community as the cause of global warming. Data centers, which can be regarded as supercomputers, account for 1.3% of global electricity consumption, which is striking considering the youth of this technology. Different optimization strategies exist to reduce their consumption, notably virtualization. Virtualization allows the creation of virtual machines (VMs), i.e. the virtual equivalent of an operating system possessing all the functionality of a physical server. Servers that host several VMs can therefore respond simultaneously to the requests of several clients, increasing resource utilization (compared with one server per client). However, virtualization has a cost tied to the use of a hypervisor, the program linking the VMs to the physical hardware of the servers. To choose adequately which optimization strategy to implement, environmental evaluation tools are needed that allow a fair comparison of the different optimization approaches and so quantify their environmental impacts and benefits. Life cycle assessment (LCA) appears to satisfy all these criteria, as it is a methodology for quantifying the potential environmental impacts of the entire life cycle of a product, process, or service.
Nevertheless, LCA is still a developing approach, as many methodological aspects remain to be improved. The scope of this thesis is the quantification of the environmental impacts linked to the installation of data centers in Canada and the evaluation of the environmental benefits of virtualization. The problem divides into two parts: first, how to model the future electricity consumption of data centers located in Canada, and second, how to quantify the environmental benefits of virtualization? Two main objectives were set to answer these two parts: to model, by combining consequential LCA with an economic model, the environmental impacts of the energy consumption of the use phase of a group of servers located in Canada; and to carry out an LCA evaluating the potential environmental benefits of virtualization. Two methodologies are used to meet these objectives. The first, which evaluates the impact of data center installation in Canada on the North American energy system, develops a consequential and prospective approach combining LCA with the economic model E3MC (which describes the North American energy system). An LCA is called prospective when it studies future systems and consequential when it studies the environmental impacts of changes to a system. This analysis is carried out in five steps. First, scenarios describing the evolution of data center consumption over time are developed, with a reference scenario in which no data center is installed. Second, the scenarios are implemented in the E3MC model.
Third, the marginal technologies, i.e. the technologies meeting the additional electricity demand of the new data centers, are identified from the results obtained with the E3MC model. Fourth, the environmental impacts of each scenario are calculated using the marginal technology mix provided by the previous step, the ecoinvent LCA database, and the IMPACT2002+ impact method. Finally, the results are interpreted and compared with those obtained from an attributional approach. The second methodology is a preliminary (screening) LCA, i.e. one using approximate data, of a videoconferencing service. The studied system is composed of four main elements: the computers and internet access equipment of the clients using the videoconferencing service, the energy consumption of the infrastructure transferring the information, and the server system containing the blade server hosting the videoconferencing service. A blade server contains all the electronic equipment of a server, arranged compactly to minimize the space required. A more detailed study of the server system's life cycle is carried out to better describe the environmental benefits the data center operator can achieve. Virtualization is evaluated using three usage scenarios for the blade server: in the first, the blade server hosting the videoconferencing service does not use virtualization; in the second, virtualization is used by the blade server; and in the third, two blade servers using virtualization host the videoconferencing service, thereby improving its reliability.
Finally, various sensitivity analyses are carried out to test the robustness of the conclusions. The results obtained from the approach combining LCA with the E3MC model indicate that the main future marginal electricity sources are natural gas and coal, with a small share of hydroelectricity. Moreover, the electricity demand of the new data centers induces a significant marginal decrease in Canadian exports to the United States, which triggers compensatory marginal fossil-fuel production in the United States. Finally, increasing the electricity demand of the new data centers increases the share of hydroelectricity and decreases that of coal in the marginal grid mix. However, although the use of the E3MC model increases the number of criteria considered, through its many economic, technological, demographic, and environmental parameters, it also increases the uncertainties and reduces the transparency of the study. Indeed, owing to its complexity, using the E3MC model required the participation of experts from Environment Canada; moreover, the model and the data used to calibrate it could not be obtained for confidentiality reasons. Thus, to develop the use of economic models in LCA (since they make it possible to take many phenomena into account), multidisciplinary work involving experts in both LCA and economics would be necessary. The results of the preliminary LCA clearly indicate non-negligible benefits at the server level when virtualization is used, but these benefits are tempered by the small contribution of the server system to the total impacts of a videoconference. Indeed, the preliminary LCA indicates that the impacts of a videoconference are mostly caused by the production of the laptop and by the electricity consumption of the data transfer and of the laptop. Nevertheless, applying virtualization on a larger scale, for applications requiring more computing capacity, could bring larger overall environmental gains. Moreover, virtualization increases the utilization of existing servers, thereby avoiding the construction of new data centers. The preliminary LCA thus brought a more global view to this type of environmental evaluation, considering both the entire life cycle and a large number of environmental criteria.
However, the uncertainties surrounding the results are large; to improve LCAs in the ICT sector, cooperation with the industries using and producing ICT should be increased in order to fill the significant data gap in the LCA description of ICT. ---------- Nowadays, electricity is an essential element of our modern society; people use it every day in their daily activities. However, its production is responsible for numerous environmental damages, accounting for 30% of global greenhouse gas (GHG) emissions, recognized by the majority of the scientific community to be the origin of climate change. Data centers can be considered supercomputers and already represent 1.3% of global electricity consumption, a very high share considering the youth of this technology. To decrease their consumption, different optimization strategies exist, for example virtualization, which makes it possible to install several operating systems, known as virtual machines (VMs), on a single server, increasing its utilization. But virtualization has a cost due to the use of a program, named a "hypervisor", which connects the VMs to the hardware. To adequately choose the strategy to apply, it is necessary to possess an environmental evaluation tool allowing the quantification of the environmental gains and costs of each optimization approach. Life cycle assessment (LCA) seems to satisfy these conditions because it is a methodology that quantifies the potential environmental impacts of the entire life cycle of a product or service. However, LCA is a methodological tool still in development and many issues are under discussion. This master's thesis aims to quantify the environmental impacts induced by the installation of data centers in Canada and to evaluate the environmental benefits of using virtualization.
The problem is divided into two parts: first, how to model the environmental impacts of new data centers installed in Canada, and second, how to quantify the benefits of virtualization? Two objectives were set to address these two parts: evaluate, by combining consequential LCA with an economic model, the environmental impacts of the energy consumption of data centers located in Canada, and evaluate with an LCA approach the environmental benefits of virtualization. The first part of the methodology develops a consequential and prospective approach, combining LCA with the economic model E3MC (which describes the North American energy sector and the Canadian economy), to evaluate the impact of data center installation in Canada on the North American energy sector. An LCA is prospective when a future system is evaluated and consequential when the impacts of a change due to a decision are evaluated. This analysis is divided into five steps. First, scenarios are developed describing the evolution of data center consumption between 2015 and 2030, with a reference scenario in which no data centers are installed. Second, the scenarios are modeled in the E3MC model. Third, the marginal technologies, those which supply the new data centers, are identified from the results of the E3MC model. Fourth, the environmental impacts are calculated with the marginal technologies, the impact method IMPACT2002+, and the LCA database ecoinvent. Finally, the results are interpreted and compared with those obtained with an attributional approach. The second part of the methodology is a screening-LCA study in which approximate data are used to draw a preliminary picture of the environmental impact of a videoconferencing service whose call management servers rely on virtualization.
The studied system is composed of four processes: the laptop and the internet access device of the customer, the energy consumption needed for the data transfer, and the server hosting the videoconference. A more detailed study of the server impact is done to obtain a better description of the benefits of virtualization. Three scenarios of server utilization are modeled: in the first, the server does not use virtualization; in the second, virtualization is used by the server; and in the third, two servers using virtualization share the videoconferencing management to increase the reliability of the service. Finally, different sensitivity analyses are performed to test the robustness of the conclusions. The results from the combination of LCA with the E3MC model indicate three major conclusions. First, the marginal technologies are mainly natural gas and coal, with a small share of hydroelectricity at the end of the simulation. Second, the perturbation induced an important compensatory production in the United States. Third, an increase of the perturbation induces an increase of the marginal production in Canada, resulting in an increase of hydroelectricity and a decrease of coal in the marginal technology share. The model allows the consideration of a large number of parameters but also increases the uncertainties and decreases the transparency of the results. Indeed, the complexity of recent economic models increases the need for economists to create, use, and understand them. We recommend increasing the cooperation between economists and LCA analysts in the future when economic models are used to determine marginal technologies. The screening LCA of the videoconferencing service clearly indicates a large decrease of server environmental impacts when virtualization is used, but the importance of these benefits is mitigated by the small contribution of the server to the total impacts of the videoconferencing service.
Indeed, the total impacts are mainly caused by the customer's laptop and the data transfer. However, applying virtualization at a larger scale and for applications requiring a higher computing capacity would bring more important environmental benefits. The screening LCA thus brought a more global vision to these types of problems by considering the entire life cycle and numerous environmental impacts. However, the uncertainties on the results are high, and we recommend involving industries producing and using ICT in future LCAs on the subject to decrease the existing data gap in this sector.
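Step four of the consequential methodology, computing scenario impacts from a marginal technology mix, amounts to a share-weighted sum of emission factors. The sketch below uses invented emission factors and mixes purely for illustration; a real study would take them from the ecoinvent database, an impact method such as IMPACT2002+, and the E3MC results:

```python
# Hypothetical emission factors [kg CO2-eq per kWh], for illustration only.
EF = {"natural_gas": 0.49, "coal": 1.00, "hydro": 0.004}

def impact_kg_co2(demand_kwh, mix):
    """GHG impact of an electricity demand given a technology mix
    (shares must sum to 1)."""
    assert abs(sum(mix.values()) - 1.0) < 1e-9
    return demand_kwh * sum(share * EF[tech] for tech, share in mix.items())

# Illustrative marginal mix (consequential view) vs. average grid mix
# (attributional view) -- both invented, not the thesis's actual results.
marginal = {"natural_gas": 0.55, "coal": 0.40, "hydro": 0.05}
average  = {"natural_gas": 0.10, "coal": 0.08, "hydro": 0.82}

demand = 1e6  # new data-center demand [kWh]
consequential = impact_kg_co2(demand, marginal)
attributional = impact_kg_co2(demand, average)
```

The gap between the two numbers illustrates why the thesis compares the consequential (marginal-mix) results against an attributional (average-mix) baseline: in a hydro-dominated grid the two approaches can differ by several-fold.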

    Harmonic Plus Noise Model for Concatenative Speech Synthesis

    This project develops a new harmonic plus noise model (HNM) applied to concatenative speech synthesis. The software is composed of an analysis part (an off-line process) applied to the initial database and a synthesis part (a real-time process) applied to the HNM database and the prosodic modifications from FESTIVAL. Future work consists of integration into HMM-based speech synthesis.
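The harmonic part of an HNM frame is a sum of sinusoids at multiples of the fundamental frequency, with a stochastic component added on top. The following sketch is a minimal illustration; the amplitudes, frame length, and unshaped Gaussian noise are simplifying assumptions, since a real HNM system estimates these parameters from the analysis database and spectrally shapes the noise:

```python
import numpy as np

def hnm_frame(f0, amps, noise_std, sr=16_000, dur=0.02, seed=0):
    """Synthesize one HNM frame: a sum of harmonics of f0 (the
    "harmonic" part) plus Gaussian noise (a crude stand-in for the
    spectrally shaped "noise" part)."""
    t = np.arange(int(sr * dur)) / sr
    harmonic = sum(a * np.cos(2 * np.pi * (k + 1) * f0 * t)
                   for k, a in enumerate(amps))
    noise = np.random.default_rng(seed).normal(0.0, noise_std, t.size)
    return harmonic + noise

# One 20 ms voiced frame at 120 Hz with three harmonics.
frame = hnm_frame(f0=120.0, amps=[1.0, 0.5, 0.25], noise_std=0.01)
```

At synthesis time, consecutive frames would be overlap-added, with f0 and the amplitudes adjusted to follow the prosodic targets supplied by FESTIVAL.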

    Advances in hollow fiber membrane technology for high density perfusion cell culture

    Continuous manufacturing, a trend in the production of biopharmaceuticals, holds significant promise for achieving higher productivity and lower cost. Next-generation cell culture perfusion processes are among the most sought after to deliver on this promise. While the concept of continuous cell culture is not new, hurdles in the robustness and consistency of cell retention devices slow current adoption. The inherent lot-to-lot variability in commercially available hollow-fiber membranes (HFMs) often requires operators to adjust process parameters during the course of the perfusion process. Reproducible process intensification using HFM devices for cell retention requires membranes with consistent flux (LMH), bubble point (BP), and effective filtration area (EFA), optimized for each cell culture process. The selection of suitable membranes must depend on our knowledge of flux and BP, rather than on arbitrary membrane porosity designations. We found that the lot-to-lot distribution range of flux/BP in HFM devices is essential and must be within narrow limits to ensure consistent process performance. We present data on a novel, well-characterized HFM device, SepraPorℱ, which is ideally suited to achieve reproducible results with constant process parameters.
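Flux in LMH, the consistency metric discussed above, is simply permeate volume per membrane area per time. A minimal sketch with hypothetical example numbers:

```python
def flux_lmh(volume_l, area_m2, hours):
    """Membrane flux in LMH (liters per square meter per hour)."""
    return volume_l / (area_m2 * hours)

# e.g. 12 L of permeate through a 0.3 m^2 hollow-fiber module over 2 h
f = flux_lmh(12.0, 0.3, 2.0)   # 20.0 LMH
```

Comparing this measured value (together with the bubble point) across membrane lots, rather than relying on nominal porosity ratings, is the selection criterion the abstract advocates.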