
    Effects of initial conditions and Mach number on turbulent mixing transition of shock-driven variable-density flow

    This thesis presents results on the effects of initial conditions (single- and multi-mode) and incident shock wave Mach number (M) on several mixing characteristics of Richtmyer-Meshkov instability (RMI) evolution. These goals are achieved through two experimental campaigns with incident shock Mach numbers of 1.9 and 1.55. Each campaign follows the interface evolution after interaction with the incident shock and with the shock reflected from the end wall (reshock). In addition, two different initial perturbations are imposed to study RMI evolution at each Mach number. The first perturbation is a predominantly single-mode, long-wavelength interface formed by inclining the entire tube to 80° relative to the horizontal, and it can therefore be considered half the wavelength of a triangular wave. The second initial condition is a multi-mode interface containing additional shorter-wavelength perturbations produced by imposing shear and buoyancy on the inclined perturbation of the first case. In both single- and multi-mode cases at each Mach number, the interface consists of a nitrogen-acetone mixture as the light gas over carbon dioxide as the heavy gas (Atwood number A ~ 0.22). The evolving density and velocity fields are measured simultaneously using planar laser-induced fluorescence (PLIF) and particle image velocimetry (PIV) to provide the first detailed turbulence statistics measurements (i.e., density, velocity, and density-velocity cross-statistics) obtained by ensemble averaging for shock-accelerated variable-density flows at M > 1.5, both before and after reshock. The evolution of mixing is investigated via the density fields by computing mixed mass and mixing-layer thickness, along with mixing width, mixedness, and the density self-correlation (DSC). The amount of mixing is shown to depend on both the initial conditions and the incident shock Mach number before reshock. The evolution of the density self-correlation is discussed, and the relative importance of the different DSC terms is shown through fields and spanwise-averaged profiles. The localized distribution of vorticity and the development of roll-up features in the flow are studied through the evolution of interface wrinkling and the length of the interface edge, indicating that the vorticity concentration depends strongly on the Mach number. The contributions of different terms in the Favre-averaged Reynolds stress are shown; while the mean density-velocity fluctuation correlation term is dominant, the turbulent mass-flux term shows a strong dependence on the initial condition and on reshock. Regarding the effects of initial conditions, the density and velocity data show that a distinct memory of the initial conditions is maintained in the flow before interaction with reshock. After reshock, the influence of the long-wavelength inclined perturbation present in both initial conditions is still apparent, but the distinction between the two cases becomes less evident as smaller scales emerge even in the single-mode case. Mixing transition is analyzed through two criteria: the Reynolds number (Dimotakis, 2000) and time-dependent length scales (Robey et al., 2003). The Reynolds number threshold is surpassed in all cases after reshock, and the Reynolds number is near the threshold range for the multi-mode, high Mach number case (M ~ 1.9) before reshock. However, the time-dependent length-scale threshold is surpassed by all cases only at the latest time after reshock, while all cases at early times after reshock, and the high Mach number case at the latest time before reshock, fall near the threshold. Scaling analysis of the turbulent kinetic energy spectra at the latest time after reshock, when the mixing-transition analysis suggests an inertial range has formed, indicates power-law scaling of -1.8±0.05 for the low Mach number case and -2.1±0.1 for the higher Mach number case. This is attributed to the high anisotropy observed in this flow, which results from strong, large-scale, streamwise fluctuations produced by large-scale shear. This work will help develop the capability to accurately predict and model extreme mixing, potentially leading to advances in a number of fields: energy, the environment (atmospheric and oceanographic), aerospace engineering, and, most pertinently, inertial confinement fusion (ICF).
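
    As a rough illustration of the ensemble-averaged mixing diagnostics named above, the sketch below computes a mixing width, a mixedness parameter, and the density self-correlation b = -⟨ρ′(1/ρ)′⟩ from a stack of planar density fields such as those obtained from PLIF. The definitions used (the 4∫⟨f1⟩⟨f2⟩dy integral width, the mixedness ratio, and the Besnard-type DSC) are common in the shock-driven mixing literature but are assumptions here; the array shapes, variable names, and axis conventions are illustrative and are not taken from the thesis.

    import numpy as np

    def mixing_diagnostics(rho, rho_light, rho_heavy, dy):
        """Ensemble-averaged mixing diagnostics from planar density fields.

        rho        : array (n_realizations, ny, nx), e.g. PLIF density fields
        rho_light  : density of the pure light gas (nitrogen-acetone mixture)
        rho_heavy  : density of the pure heavy gas (carbon dioxide)
        dy         : grid spacing along the mixing (streamwise) direction

        Definitions are common in the RMI literature and are assumptions
        here, not necessarily those used in the thesis.
        """
        # Light-gas fraction in [0, 1] for every pixel and realization.
        f1 = np.clip((rho_heavy - rho) / (rho_heavy - rho_light), 0.0, 1.0)
        f2 = 1.0 - f1

        # Ensemble + spanwise averages -> mean profiles along the mixing direction.
        f1_bar = f1.mean(axis=(0, 2))
        f2_bar = f2.mean(axis=(0, 2))

        # Integral mixing width, W = 4 * integral of <f1><f2> dy.
        width = 4.0 * np.sum(f1_bar * f2_bar) * dy

        # Mixedness, Theta = integral <f1 f2> dy / integral <f1><f2> dy.
        f1f2_bar = (f1 * f2).mean(axis=(0, 2))
        mixedness = np.sum(f1f2_bar) / np.sum(f1_bar * f2_bar)

        # Density self-correlation field, b = -<rho' (1/rho)'> (non-negative).
        rho_bar = rho.mean(axis=0)          # ensemble-mean density field
        v_bar = (1.0 / rho).mean(axis=0)    # ensemble-mean specific volume field
        b = -((rho - rho_bar) * (1.0 / rho - v_bar)).mean(axis=0)

        return width, mixedness, b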

    Theoretical foundations of organizational problem solving methodologies in Operational Research

    Paradigms are principles and assumptions that define the frameworks and research priorities of a discipline. Many believe that operational research (OR) is not a science and, unlike established sciences, does not have a paradigm. However, since OR is committed to scientific methods, it does contain a methodological paradigm. The purpose of this paper is to examine the theoretical methodologies in the field of OR. The paper reviews the existing literature using an interpretive hermeneutic approach. Based on this review, the study identifies four principles and assumptions for each OR paradigm, and a classification of the associated methods is presented. The results show that OR has four main paradigms and that the OR field offers many methods for tackling a particular problem, where each problem belongs to a particular paradigm. In addition, rather than applying a single method to a problem situation, a combination of methodologies can be implemented.

    Developing a novel method for predicting nearshore and offshore wave energy of the Portuguese western coast using DELFT3D

    Master's dissertation in Energy for Sustainability presented to the Faculdade de Ciências e Tecnologia da Universidade de Coimbra.

    A Methodology for Business Model Change Inspired by the Entrance of Mobile Technology into the Organization

    This research was designed to provide an appropriate methodology for changing and updating a business model when new information and communication technologies, such as mobile technology, are applied in an organization. After investigating the relevant literature in the fields of business models, ICT change, business model change, and the application of mobile technology in organizations, the research framework was derived. In later phases of the research, this framework was evaluated through expert judgment and a System Dynamics approach. The result of this research is a well-defined methodology for business model change covering five phases and thirty-seven detailed steps, with a relevant deliverable for each step.

    A Conceptual Framework for Knowledge Architecture in Large-Scale Organizations

    The main concern for most organizations in this age, which has been called the age of the knowledge-based economy, is their success and superiority in competitive markets. Reviewing the parameters that may affect the success of operationalizing a knowledge management project points to knowledge architecture as a potential factor. Given the significant effect of knowledge management on the efficiency of organizations, and the inclination of current organizations toward large scale, we aim to propose a suitable architectural framework for such organizations. The initial structure of the framework was established based on Zachman’s information architecture framework. We then arrived at the ideal knowledge architecture framework by adding a service abstraction and an architecture-level dimension, and by changing the scope of every cell in Zachman’s framework from information to knowledge. The research is descriptive in method, and its validity is confirmed through a case study and the opinions of several knowledge architecture experts. The results indicate the feasibility of applying the framework to other knowledge-intensive, large-scale firms, and the findings may be beneficial for architects in the knowledge area.

    Identification and Ranking of Critical Success Factors of Knowledge Management Using Fuzzy Quality Function Deployment Approach: A Case Study

    The main objective of this research is to rank the critical success factors of knowledge management using a fuzzy quality function deployment (QFD) approach at the MECO Company. The research used a three-step qualitative-quantitative-qualitative strategy. In the first step (first qualitative phase), the critical success factors of knowledge management and the knowledge management outcomes at MECO were characterized. The critical success factors identified in the first step were then ranked using fuzzy quality function deployment in the second step (quantitative phase). Finally, in the third step (second qualitative phase), solutions were proposed to realize and improve the critical success factors of knowledge management in the selected company. The first qualitative phase identified 11 critical success factors of knowledge management in addition to 4 knowledge management outcomes. The results of the quantitative phase indicate that “human resource management”, “support and leadership of management”, “organizational infrastructure”, “organizational culture”, and “activities and processes” are the five leading critical success factors of knowledge management in the chosen company. Furthermore, the second qualitative phase produced 23 solutions for realizing and improving the critical success factors of knowledge management in this company. To the best of the researchers’ knowledge, this is one of the first works to evaluate the key factors of successful knowledge management through a fuzzy quality function deployment approach. The proposed method is expected to provide an appropriate tool for enterprises that have decided to implement knowledge management, because it prioritizes the critical success factors based on the knowledge management outcomes.
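
    As a minimal, hypothetical sketch of how a fuzzy QFD ranking of this kind can be organized, the snippet below weights critical success factors (the HOWs) against knowledge management outcomes (the WHATs) using triangular fuzzy numbers defuzzified by a simple centroid. The linguistic scale, the outcome weights, and the relationship entries are invented for illustration and do not reproduce the MECO case-study data or the exact fuzzy arithmetic used in the paper.

    # Triangular fuzzy numbers (l, m, u); this linguistic scale is an illustrative assumption.
    SCALE = {"weak": (1, 3, 5), "moderate": (3, 5, 7), "strong": (5, 7, 9)}

    def defuzzify(tfn):
        """Centroid defuzzification of a triangular fuzzy number (l, m, u)."""
        l, m, u = tfn
        return (l + m + u) / 3.0

    def rank_csfs(outcome_weights, relationship):
        """Rank critical success factors (HOWs) against KM outcomes (WHATs).

        outcome_weights : {outcome: triangular fuzzy weight}
        relationship    : {(outcome, csf): linguistic strength from SCALE}
        Returns CSFs sorted by descending crisp importance score.
        """
        csfs = sorted({csf for (_, csf) in relationship})
        scores = {}
        for csf in csfs:
            score = 0.0
            for outcome, weight in outcome_weights.items():
                strength = relationship.get((outcome, csf))
                if strength is not None:
                    score += defuzzify(weight) * defuzzify(SCALE[strength])
            scores[csf] = score
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    # Hypothetical inputs -- not the data from the MECO case study.
    weights = {"knowledge sharing": (5, 7, 9), "innovation": (3, 5, 7)}
    relations = {
        ("knowledge sharing", "organizational culture"): "strong",
        ("knowledge sharing", "human resource management"): "moderate",
        ("innovation", "human resource management"): "strong",
        ("innovation", "organizational culture"): "weak",
    }
    print(rank_csfs(weights, relations))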

    Developing TOPSIS method using statistical normalization for selecting knowledge management strategies

    Purpose: Numerous companies expect their knowledge management (KM) to be performed effectively in order to leverage and transform knowledge into competitive advantages. However, this raises the critical issue of how companies can better evaluate and select a favorable KM strategy prior to a successful KM implementation. Design/methodology/approach: An extension of TOPSIS, a multi-attribute decision making (MADM) technique, to a group decision environment is investigated. TOPSIS is a practical and useful technique for ranking and selecting among a number of externally determined alternatives through distance measures. The entropy method is often used for assessing the weights in the TOPSIS method; entropy in information theory is a criterion used for measuring the amount of disorder represented by a discrete probability distribution. To reduce employees’ resistance to implementing a new strategy, it seems necessary to take all managers’ opinions into account. The normal distribution, considered the most prominent probability distribution in statistics, is used to normalize the gathered data. Findings: The results of this study show that, considering six criteria for evaluating the alternatives, the most appropriate KM strategy to implement in our company was “Personalization”. Research limitations/implications: The approach relies on some assumptions that might affect its accuracy, such as the normality of the sample and population; these assumptions can be revisited in future work. Originality/value: This paper proposes an effective solution based on a combined entropy and TOPSIS approach to help companies that need to evaluate and select KM strategies. In the presented solution, the opinions of all managers are gathered and normalized using the standard normal distribution and the central limit theorem.
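
    The entropy-weighted TOPSIS procedure the abstract describes can be sketched as below. This version uses the standard vector normalization of classical TOPSIS rather than the statistical (normal-distribution) normalization the paper proposes, and the criteria values for the two KM strategies are hypothetical; it is meant only to show how entropy weights and closeness coefficients are computed.

    import numpy as np

    def entropy_weights(X):
        """Entropy-based criteria weights for a decision matrix X (alternatives x criteria)."""
        P = X / X.sum(axis=0)
        m = X.shape[0]
        # Small epsilon guards against log(0) for zero entries.
        e = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(m)
        d = 1.0 - e                      # degree of diversification per criterion
        return d / d.sum()

    def topsis(X, weights, benefit):
        """Closeness coefficients of alternatives; higher is better.

        X       : decision matrix, rows = alternatives, columns = criteria
        weights : criteria weights summing to 1
        benefit : boolean array, True where larger criterion values are preferred
        """
        R = X / np.sqrt((X ** 2).sum(axis=0))        # vector normalization
        V = R * weights                              # weighted normalized matrix
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
        d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
        return d_neg / (d_pos + d_neg)

    # Hypothetical scores for two KM strategies against three criteria.
    X = np.array([[7.0, 5.0, 8.0],     # personalization
                  [6.0, 8.0, 4.0]])    # codification
    w = entropy_weights(X)
    cc = topsis(X, w, benefit=np.array([True, True, True]))
    print(dict(zip(["personalization", "codification"], cc.round(3))))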

    ANP-based decision support system for selecting ERP systems in oil industry by using balanced scorecard

    The use of information technology in today's world underpins the existence and survival of an organization's activities; without it, not only does the use of modern methods in the organization become impossible, but the ability to compete with other organizations is also lost. Enterprise resource planning (ERP) is one of the most important applications of IT in organizations and holds a special place among them. First, the key performance indicators are determined using the balanced scorecard (BSC). Using these indicators and the selection criteria described in the research literature, the best ERP system is determined in order to purchase the ERP software package. Because these indicators can affect one another, and because these mutual impacts can influence both the relative importance of the indicators and the selection of the best alternative, the analytic network process (ANP) method is used. The result of this research is the selection, from among the many available packages, of the ERP software package that best fits the organization's strategies and goals.
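
    The core computation behind an ANP-based ranking of this kind is the limit of a column-stochastic supermatrix that encodes the interdependencies among indicators and alternatives. The sketch below shows that step only; the 4x4 matrix, the split into two BSC-derived indicators and two ERP packages, and the numerical entries are hypothetical and are not taken from the study (real applications also involve pairwise-comparison judgments and consistency checks, which are omitted here).

    import numpy as np

    def limit_supermatrix(W, tol=1e-9, max_iter=10000):
        """Raise a column-stochastic ANP supermatrix to powers until it converges.

        W : (n, n) weighted supermatrix whose columns sum to 1.
        Returns the limit matrix; any column gives the global priorities.
        """
        assert np.allclose(W.sum(axis=0), 1.0), "columns must sum to 1"
        M = W.copy()
        for _ in range(max_iter):
            M_next = M @ W
            if np.abs(M_next - M).max() < tol:
                return M_next
            M = M_next
        return M

    # Hypothetical 4-element network: two BSC-derived indicators and two ERP packages.
    # Entries encode illustrative interdependencies, not data from the study.
    W = np.array([
        [0.0, 0.3, 0.2, 0.1],   # indicator 1
        [0.2, 0.0, 0.3, 0.4],   # indicator 2
        [0.5, 0.4, 0.0, 0.5],   # ERP package A
        [0.3, 0.3, 0.5, 0.0],   # ERP package B
    ])
    L = limit_supermatrix(W)
    print("global priorities:", L[:, 0].round(3))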
