
    Correction to: Erbium 3-µm fiber lasers


    Value innovation modelling: Design thinking as a tool for business analysis and strategy

    This paper explores the use of multiple perspective problem framing (English 2008) as a tool to reveal hidden value and commercial opportunity for business. Creative thinking involves the interrelationship of parameters held open and fluid within the cognitive span of the creative mind. The recognition of new associations can create new value that can lead to innovation in designed products, intellectual property and business strategy. The ‘Ideas-lab’ process is based on the proposition that a company’s capacity for innovation is dependent on the way the business is able to see its problems and opportunities. In this process the attributes of a company and the experience of the researchers are considered as the parameters of a design problem. It is therefore important to acknowledge the commercial experience of the project researchers, all of whom have a proven track record in helping businesses develop, exploit and protect their know-how. Semi-structured interviews were carried out with key individuals in 34 companies. The resulting data were assessed on a company-by-company basis through a process of multiple perspective problem framing, enabling key nodes, patterns and relationships to be identified and explored. A ‘Cornerstones of Innovation’ report was prepared to inform each company of the observations made by the researchers. The paper describes the methods adopted and summarises the feedback from participating companies. Case studies are highlighted to demonstrate ways in which the process influenced the actions of particular businesses, and the commercial outcomes that resulted. Finally, the researchers reflect on the structure of the Ideas-lab process.

    Strong Cesàro Summability Factors


    A Parallel Solution Adaptive Implementation of the Direct Simulation Monte Carlo Method

    This thesis deals with the direct simulation Monte Carlo (DSMC) method of analysing gas flows. The DSMC method was initially proposed as a method for predicting rarefied flows where the Navier-Stokes equations are inaccurate. It has now been extended to near continuum flows. The method models gas flows using simulation molecules which represent a large number of real molecules in a probabilistic simulation to solve the Boltzmann equation. Molecules are moved through a simulation of physical space in a realistic manner that is directly coupled to physical time such that unsteady flow characteristics are modelled. Intermolecular collisions and molecule-surface collisions are calculated using probabilistic, phenomenological models. The fundamental assumption of the DSMC method is that the molecular movement and collision phases can be decoupled over time periods that are smaller than the mean collision time. Two obstacles to the widespread use of the DSMC method as an engineering tool are in the areas of simulation configuration, which is the configuration of the simulation parameters to provide a valid solution, and the time required to obtain a solution. For complex problems, the simulation will need to be run multiple times, with the simulation configuration being modified between runs based on the previous run's results, until the solution converges. This task is time-consuming and requires the user to have a good understanding of the DSMC method. Furthermore, the computational resources required by a DSMC simulation increase rapidly as the simulation approaches the continuum regime. Similarly, the computational requirements of three-dimensional problems are generally two orders of magnitude greater than those of two-dimensional problems. These large computational requirements significantly limit the range of problems that can be practically solved on an engineering workstation or desktop computer. 
The first major contribution of this thesis is the development of a DSMC implementation that automatically adapts the simulation. Rather than modifying the simulation configuration between solution runs, this thesis presents the formulation of algorithms that allow the simulation configuration to be automatically adapted during a single run. These adaptation algorithms adjust the three main parameters that affect the accuracy of a DSMC simulation, namely the solution grid, the time step and the simulation molecule number density. The second major contribution extends the parallelisation of the DSMC method. The implementation developed in this thesis combines the capability to use a cluster of computers to increase the maximum size of problem that can be solved while simultaneously allowing excess computational resources to decrease the total solution time. Results are presented to verify the accuracy of the underlying DSMC implementation, the utility of the solution adaptation algorithms and the efficiency of the parallelisation implementation.
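The decoupled move/collide cycle described in this abstract can be sketched as a toy one-dimensional loop. This is a minimal illustration only, not the thesis's implementation: the specular walls, fixed cell binning and constant collision probability are simplifications standing in for the probabilistic, phenomenological collision models the abstract mentions.

```python
import random

def dsmc_step(pos, vel, dt, cell_size, n_cells, collide_prob=0.1):
    """One decoupled move/collide step of a toy 1-D DSMC-style simulation.

    Valid only under the DSMC assumption that dt is smaller than the
    mean collision time, so movement and collisions can be decoupled.
    """
    length = n_cells * cell_size
    # Movement phase: advance every simulation molecule ballistically,
    # reflecting specularly at the domain walls [0, length].
    for i in range(len(pos)):
        pos[i] += vel[i] * dt
        if pos[i] < 0.0:
            pos[i], vel[i] = -pos[i], -vel[i]
        elif pos[i] > length:
            pos[i], vel[i] = 2.0 * length - pos[i], -vel[i]
    # Collision phase: bin molecules into cells, then collide random
    # pairs within each cell with a fixed probability (toy model).
    cells = [[] for _ in range(n_cells)]
    for i, x in enumerate(pos):
        cells[min(int(x / cell_size), n_cells - 1)].append(i)
    for members in cells:
        random.shuffle(members)
        for a, b in zip(members[::2], members[1::2]):
            if random.random() < collide_prob:
                # Velocity swap: conserves momentum and kinetic energy
                # within the pair, like a head-on elastic collision.
                vel[a], vel[b] = vel[b], vel[a]
    return pos, vel
```

Both the wall reflections and the pair swaps conserve total kinetic energy, which gives a simple sanity check on the step; a real DSMC code would instead sample post-collision velocities from a phenomenological scattering model and adapt the grid, time step and molecule count as the thesis describes.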

    Discovery of a lipid synthesising organ in the auditory system of an insect

    Weta possess typical Ensifera ears. Each ear comprises three functional parts: two equally sized tympanal membranes, an underlying system of modified tracheal chambers, and the auditory sensory organ, the crista acustica. This organ sits within an enclosed fluid-filled channel, previously presumed to contain hemolymph. The role this channel plays in insect hearing is unknown. We discovered that the fluid within the channel is not actually hemolymph, but a medium composed principally of lipids of a new class. Three-dimensional imaging of this lipid channel revealed a previously undescribed tissue structure within the channel, which we refer to as the olivarius organ. Investigations into the function of the olivarius reveal de novo lipid synthesis, indicating that it is producing these lipids in situ from acetate. The auditory role of this lipid channel was investigated using laser Doppler vibrometry of the tympanal membrane, which shows that the displacement of the membrane is significantly increased when the lipid is removed from the auditory system. Neural sensitivity of the system, however, decreased upon removal of the lipid, a surprising result considering that in a typical auditory system the mechanical and auditory sensitivities are positively correlated. These two results, coupled with 3D modelling of the auditory system, lead us to hypothesize a model for weta audition that relies strongly on the presence of the lipid channel. This is the first instance of lipids being associated with an auditory system outside of the Odontocete cetaceans, demonstrating convergence in the use of lipids in hearing.

    Energy recycling versus lifetime quenching in erbium-doped 3-µm fiber lasers

    Based on recently published spectroscopic measurements of the relevant energy-transfer parameters, we performed a detailed analysis of the population mechanisms and the characteristics of the output from Er3+-singly-doped and Er3+, Pr3+-codoped ZBLAN fiber lasers operating at 3 µm, for various Er3+ concentrations and pump powers. Whereas both approaches resulted in similar laser performance at Er3+ concentrations of up to 4 mol.% and absorbed pump powers of up to 10 W, it is theoretically shown here that the Er3+-singly-doped system will be advantageous for higher Er3+ concentrations and pump powers. In this case, energy recycling by energy-transfer upconversion from the lower to the upper laser level can increase the slope efficiency to values greater than the Stokes efficiency, as has been observed for a number of Er3+-doped crystal lasers. Output powers at 3 µm on the order of 10 W are predicted.
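Why recycling can beat the Stokes efficiency can be seen with a simple photon-budget sketch. The numbers and the ideal-recycling assumption below are ours, not the paper's: we assume a 980 nm pump, a 2.8 µm laser transition, and perfect energy-transfer upconversion (ETU) in which every pair of ions left in the lower laser level promotes one ion back to the upper level, giving a geometric series 1 + 1/2 + 1/4 + ... of laser photons per pump photon.

```python
def stokes_efficiency(pump_nm, laser_nm):
    """Stokes limit on slope efficiency: at most one laser photon per
    pump photon, scaled by the photon-energy (wavelength) ratio."""
    return pump_nm / laser_nm

def recycled_photons(rounds):
    """Laser photons per pump photon under idealised ETU recycling:
    each round, half of the ions dumped into the lower laser level are
    promoted back to the upper level, so the count is the partial sum
    1 + 1/2 + 1/4 + ..., which tends to 2."""
    return sum(0.5 ** k for k in range(rounds + 1))

# Assumed wavelengths (not values from the paper):
eta_stokes = stokes_efficiency(980, 2800)       # Stokes limit, 980/2800 = 0.35
eta_recycled = recycled_photons(50) * eta_stokes  # ideal ETU limit, ~2x Stokes
```

Under these idealised assumptions the slope-efficiency ceiling roughly doubles, from 0.35 to about 0.70, which is the sense in which recycling can push the slope efficiency above the Stokes efficiency; real lasers sit below this limit because the ETU process competes with other decay channels.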

    Erbium 3-µm fiber lasers

    With its recent breakthrough in terms of output power, the erbium 3-µm fiber laser has become an object of intense scientific research and an increasingly attractive tool for medical applications. This paper reviews the research on the erbium 3-µm fiber laser since its first demonstration. Its development is seen in relationship to the early success of the corresponding crystal laser system, to the foundations that were laid by the investigation of its spectroscopy and population mechanisms, and to the recent technological developments in related fields.

    Wiki Design for Student Assignments: Should it be Prescribed or Emergent?

    In this paper we examine how to approach the question of information and site design in the use of wikis for student group assignments. The popular literature about wikis proposes that they allow for “emergent, user-driven design”. We develop a model in order to analyse what approach to design might be appropriate in student group work. We gave one class of students a prescribed assignment layout with clear instructions regarding navigation menus, and another group the same assignment with little or no guidance about how to design their site. Initial results show that prescribing the design increases perceptions of self-efficacy. Whilst self-efficacy is correlated with higher perceived quality of the site and with use of a greater range of wiki functions, there is no correlation with perceived usefulness of the wiki as a tool.