Cormack Research Project: Glasgow University
The aim of this project was to investigate and improve upon existing methods of analysing data from COMPTEL, an instrument on the Compton Gamma Ray Observatory, for neutrons emitted during solar flares. In particular, a strategy has been developed for placing confidence intervals on neutron energy distributions arising from uncertainties in the response matrix. We have also demonstrated the superior performance of one of a range of possible statistical regularization strategies, and developed a method of generating plausible models of neutron energy distributions as a tool to this end. The project involved solving an inverse problem with noise added to the data in various ways. To achieve this, pre-existing C code was used to run Fortran subroutines which performed statistical regularization on the data.
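The regularized inversion described above can be sketched as follows; this is a minimal, hypothetical illustration (the response matrix, spectrum shape, and noise level are invented stand-ins, not COMPTEL data), using zeroth-order Tikhonov regularization as one representative statistical regularization strategy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical instrument response matrix R mapping a neutron energy
# spectrum x (40 energy bins) to expected counts d (20 detector channels).
n_channels, n_energies = 20, 40
rows = np.arange(n_channels)[:, None]
cols = np.arange(n_energies)[None, :]
R = np.exp(-0.5 * (rows - 0.5 * cols) ** 2)

x_true = np.exp(-np.arange(n_energies) / 10.0)       # assumed falling spectrum
d = R @ x_true + rng.normal(0.0, 0.01, n_channels)   # data with additive noise

# Zeroth-order Tikhonov regularization: minimize ||R x - d||^2 + lam ||x||^2,
# solved via the normal equations (R^T R + lam I) x = R^T d.
lam = 1e-2
x_reg = np.linalg.solve(R.T @ R + lam * np.eye(n_energies), R.T @ d)
```

Repeating the noisy solve with resampled noise or perturbed response matrices is one way such a setup yields confidence intervals on the recovered distribution.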
Reduced-Rank Multi-Fidelity Modeling of Complex Aerodynamic Flows
This thesis explores the efficacy of a reduced-rank bi-fidelity modeling framework when applied to aerodynamic flows characterized by high Reynolds numbers, complex geometries, interacting cross-flows, and highly unsteady turbulent separation. It asks the question: "How can we speed up parameter space exploration of aerodynamic flows whose complexity prohibits practitioners from carrying out more than a handful of costly high-fidelity simulations?"
The bi-fidelity method's viability is initially demonstrated on a two-dimensional NACA airfoil subject to geometric variation. The bi-fidelity model is able to predict pressure coefficient along the airfoil to within RANS validation error at a cost that is roughly 20x less than a full suite of high-fidelity simulations.
Two extensions to the bi-fidelity framework are then proposed and evaluated. The first is a general, formal procedure for assessing the model's viability for application to a novel flow. The second is a method for expanding the parameter space after a bi-fidelity model has already been constructed. Below a certain initial bi-fidelity model rank, the high-fidelity solutions used to construct the initial bi-fidelity model are equally useful for the expanded-space model, meaning computational resources expended on those initial high-fidelity solutions are not wasted and can be re-used.
The remainder of the thesis concerns the aggressive diffuser, which is a complex three-dimensional system representing an industrially-relevant flow to which the bi-fidelity modeling framework can be applied. A high-fidelity DDES model of the diffuser is constructed and extensively validated against time- and phase-averaged experimental data provided by Prof. Amitay's group at Rensselaer Polytechnic Institute (RPI). The dynamics of this flow are documented in detail using the DDES simulation's phase-averaged fields, and potential avenues for improving the flow control are suggested based on our enhanced understanding of the dynamics.
Finally, the reduced-rank bi-fidelity modeling framework is applied to a closely-related aggressive diffuser that uses a segmented tangential blower and corner suction to control separation. On average, one high-fidelity model evaluation requires over 800,000 core-hours on an IBM Blue Gene/Q supercomputer, which is over 350 times higher than the cost of one low-fidelity model evaluation. Bi-fidelity models of ranks 10 through 19 achieve qualitatively favorable predictions of both time- and ensemble-averaged velocity magnitude at the diffuser exit plane on this complex, industrially-relevant three-dimensional system at a cost reduction over a traditional brute-force approach of one to two orders of magnitude.
In summary, this thesis demonstrates that the reduced-rank bi-fidelity modeling framework can achieve favorable predictive accuracy at over an order of magnitude reduction in simulation cost when applied not only to simple two-dimensional aerodynamic test cases, but also to complex aerodynamic flows at the forefront of our modeling capabilities. Most importantly, this method is applicable to such systems where only a handful of high-fidelity model evaluations can be carried out due to their outsize computational cost. These results motivate further study of this bi-fidelity modeling framework and its application to industrially-relevant systems of interest to practitioners.
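As a rough illustration of the reduced-rank bi-fidelity idea, here is a minimal sketch with invented stand-ins for the two solvers (a real low-fidelity model would be a coarse or simplified simulation, the high-fidelity one a costly DDES run; the selection step below uses a simple greedy residual-norm pivot in place of the thesis's actual procedure): cheap low-fidelity snapshots are collected across the parameter space, a few informative parameter points are selected, the expensive model is run only there, and new predictions reuse low-fidelity interpolation coefficients.

```python
import numpy as np

# Invented low- and high-fidelity "solvers" sharing the same structure.
grid = np.arange(50)
def lofi(p):
    return np.sin(2 * np.pi * p * grid / 50.0)
def hifi(p):
    return 1.5 * np.sin(2 * np.pi * p * grid / 50.0) + 0.1 * np.cos(np.pi * p * grid / 50.0)

params = np.linspace(0.05, 1.0, 30)

# Step 1: cheap low-fidelity snapshots over the whole parameter space.
L = np.column_stack([lofi(p) for p in params])

# Step 2: greedy residual-norm pivoting (a simple stand-in for
# column-pivoted QR) selects r informative parameter points.
r = 5
Res = L.copy()
sel = []
for _ in range(r):
    j = int(np.argmax(np.linalg.norm(Res, axis=0)))
    sel.append(j)
    q = Res[:, j] / np.linalg.norm(Res[:, j])
    Res -= np.outer(q, q @ Res)          # deflate the chosen direction

# Step 3: run the expensive model only at those r parameter points.
H_sel = np.column_stack([hifi(params[i]) for i in sel])
L_sel = L[:, sel]

def bifi_predict(p):
    # Interpolation coefficients fit on the low-fidelity snapshots are
    # reused to combine the few high-fidelity snapshots.
    c, *_ = np.linalg.lstsq(L_sel, lofi(p), rcond=None)
    return H_sel @ c
```

The cost asymmetry in the thesis (over 800,000 core-hours per high-fidelity run versus roughly 1/350th of that for a low-fidelity run) is what makes limiting high-fidelity evaluations to the r selected points worthwhile.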
Synthesizing Qualitative Evidence: A Roadmap for Information Systems Research
Qualitative synthesis research is an approach that consolidates the output of different qualitative studies to create new subject knowledge. Such work can reveal more powerful explanations than those seen in a single study, thereby generating deeper understanding of a given phenomenon and greater generalizability of research findings. Based on a review of the literature and a survey of qualitative researchers, we found that the information systems (IS) domain lacks a clear understanding of qualitative synthesis methods and, as a result, has largely failed to take advantage of this powerful, high-potential methodological opportunity. To address this shortcoming, this paper is the first to provide a rigorous overview of the full suite of 35 qualitative synthesis methods, as well as guidelines that include a three-tiered selection framework. By using the guidelines and framework in tandem, IS researchers can select the qualitative synthesis method most appropriate for a given research study, particularly when the research objective involves knowledge integration/aggregation, interpretation/theory development, and/or informing IS practice.
Lean Resource Scheduling Algorithm with Maximized Resource Utilization Using Iterative Local Search
This thesis presents a lean resource scheduling algorithm which merges traditional machine scheduling problems with Lean Manufacturing concepts to determine the resource levels, such as employee headcount or number of machines used in production, and the corresponding schedule which minimize resource idle time while keeping scheduled makespan within a neighborhood around the takt time. The algorithm begins by solving a relaxed problem to find a satisfactory makespan via iterative local search, then solving a secondary problem to minimize the idle time subject to a makespan neighborhood constraint.
Experiments were conducted on a randomly generated dataset with six different factors, and both the overall program run time and the amount of idle time reduction between the first feasible solution and the final solution were measured. The algorithm executes in a relatively short time, even for moderately large problem instances, and the idle time reductions are promising, with a grand average reduction of twenty-five percent.
The results of the algorithm are promising on the test sets, although the method has not been tested in a practical case study. Given these results, further study of the underlying model and algorithm performance, together with testing in a practical application, is recommended.
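The two-phase idea can be sketched on a toy instance (identical parallel machines as the "resource"; the job data, takt time, neighborhood size, and local-search moves here are all invented and differ from the thesis's actual model):

```python
import random

random.seed(0)
jobs = [random.randint(1, 9) for _ in range(20)]   # invented job durations
takt = 30                                          # assumed takt time
eps = 0.15                                         # allowed makespan neighborhood

def makespan(assign, m):
    loads = [0] * m
    for dur, mach in zip(jobs, assign):
        loads[mach] += dur
    return max(loads)

def local_search(m, iters=2000):
    # Phase 1: iterative local search -- repeatedly move a random job to the
    # currently least-loaded machine, keeping moves that don't worsen makespan.
    assign = [i % m for i in range(len(jobs))]
    best = makespan(assign, m)
    for _ in range(iters):
        j = random.randrange(len(jobs))
        old = assign[j]
        loads = [0] * m
        for dur, mach in zip(jobs, assign):
            loads[mach] += dur
        assign[j] = loads.index(min(loads))
        cur = makespan(assign, m)
        if cur <= best:
            best = cur
        else:
            assign[j] = old
    return best

# Phase 2: pick the smallest resource level whose schedule stays inside the
# takt neighborhood; idle time = machines * makespan - total work.
total = sum(jobs)
plan = None
for m in range(1, len(jobs) + 1):      # fewer machines => less idle time
    ms = local_search(m)
    if ms <= takt * (1 + eps):
        plan = {"machines": m, "makespan": ms, "idle": m * ms - total}
        break
```

The break at the first feasible resource level reflects the lean objective: idle time, not makespan, is minimized once the makespan constraint around the takt time is met.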
Reviewing the Past for a Better Future: Reevaluating the IT Project Retrospective
This paper provides a commentary on previous research to inform our understanding of IT project retrospectives. The literature surrounding project retrospective outcomes, measurement, and processes is discussed, and critical factors necessary for project retrospective success are considered. Semi-structured interviews were then undertaken with experienced project managers to determine levels of agreement between the research and practitioner disciplines. Findings include that multiple project retrospective definitions are in use, that differing project retrospective outcomes are desired, that thirteen project retrospective processes are advocated, and that no project retrospective measurements are given to confirm whether these outcomes have been successfully achieved. Subsequently, project retrospective processes are presented such that each process has the capability to deliver on any outcome irrespective of its nature. Further research is suggested to pursue a more rigorous and relevant conceptual understanding of the IT project retrospective construct.
A Wearable System that Knows Who Wears It
Body-area networks of pervasive wearable devices are increasingly used for health monitoring, personal assistance, entertainment, and home automation. In an ideal world, a user would simply wear their desired set of devices with no configuration necessary: the devices would discover each other, recognize that they are on the same person, construct a secure communications channel, and recognize the user to which they are attached. In this paper we address a portion of this vision by offering a wearable system that unobtrusively recognizes the person wearing it. Because it can recognize the user, our system can properly label sensor data or personalize interactions.
Our recognition method uses bioimpedance, a measurement of how tissue responds when exposed to an electrical current. By collecting bioimpedance samples using a small wearable device we designed, our system can determine that (a) the wearer is indeed the expected person and (b) the device is physically on the wearer's body. Our recognition method works with 98% balanced accuracy under a cross-validation of a day's worth of bioimpedance samples from a cohort of 8 volunteer subjects. We also demonstrate that our system continues to recognize a subset of these subjects even several months later. Finally, we measure the energy requirements of our system as implemented on a Nexus S smartphone and a custom-designed module for the Shimmer sensing platform.
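As an illustration of the recognition step, here is a sketch assuming bioimpedance samples reduce to fixed-length feature vectors (all data below is synthetic, and nearest-centroid classification is a stand-in; the paper's actual features and classifier may differ), scored with balanced accuracy, i.e. the mean of per-subject recalls:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for bioimpedance features (e.g. magnitude/phase at
# several frequencies): 8 subjects, 40 samples each, 10 features.
n_subj, n_samp, n_feat = 8, 40, 10
centers = rng.normal(0.0, 1.0, (n_subj, n_feat))
X = (centers[:, None, :] + rng.normal(0.0, 0.3, (n_subj, n_samp, n_feat)))
X = X.reshape(-1, n_feat)
y = np.repeat(np.arange(n_subj), n_samp)

# Random train/test split, then nearest-centroid classification:
# assign each test sample to the subject with the closest training centroid.
mask = rng.random(len(y)) < 0.7
cent = np.array([X[mask & (y == s)].mean(axis=0) for s in range(n_subj)])
dists = ((X[~mask][:, None, :] - cent[None, :, :]) ** 2).sum(axis=-1)
pred = np.argmin(dists, axis=1)

# Balanced accuracy: average the recall of each subject, so that subjects
# with fewer test samples are not drowned out.
y_test = y[~mask]
recalls = [(pred[y_test == s] == s).mean() for s in range(n_subj)]
bal_acc = float(np.mean(recalls))
```

Balanced accuracy is the natural metric here because a per-subject class imbalance in the test samples would otherwise let one well-recognized subject inflate a plain accuracy score.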
The Delphi Method Research Strategy in Studies of Information Systems
In this paper, we discuss the nature and use of the Delphi methodology in information systems research. More specifically, we explore how and why it may be used. We discuss criteria for evaluating Delphi research and define characteristics useful for categorizing the studies. We review applications of the Delphi method in IS research over the last 23 years, summarize lessons learned from prior studies, offer suggestions for improvement, and present guidelines for employing this distinctly useful qualitative method in future information systems research studies.
Planning for Failure: An Exploratory Study of a Proactive IS Project Recovery Team
Despite extensive research on project management over the past several decades, numerous cases of IS project failure continue to surface, undermining organizational performance in almost every industry. The ongoing nature of this issue obliges the IS discipline to consider alternative approaches to avoiding failure before it is too late. In this paper, a proactive approach to project recovery is presented, one that involves a full-time recovery team responsible for turning around IS projects in distress. Using the findings gleaned from an in-depth case study inquiry, this paper analyzes the composition and structure of a dedicated project recovery team in a global organization. The investigation revealed (1) a process model of IS project recovery that comprises seven stages of evolution, (2) requisite attributes and skills of project recovery specialists, and (3) the differences between project recovery and project management. The implications arising from this novel study for both research and practice are discussed.