7,617 research outputs found
Quantum dot-based superluminescent diodes and photonic crystal surface-emitting lasers
This thesis reports the design, fabrication, and electrical and optical characterisation of GaAs-based quantum dot (QD) photonic devices, specifically focusing on superluminescent diodes (SLDs) and photonic crystal surface-emitting lasers (PCSELs). Integrating QD active regions into these devices is advantageous owing to characteristics such as temperature insensitivity, feedback insensitivity, and the ability to utilise both the ground state (GS) and excited state (ES) of the dots.
In an initial study concerning the fabrication of QD-SLDs, the influence of ridge waveguide etch depth on the electrical and optical properties of the devices is investigated. It is shown that the output power and modal gain of shallow-etched ridge waveguides are higher than those of deep-etched waveguides. Subsequently, the thermal performance of the devices is analysed. As the temperature is increased above 170 °C, the spectral bandwidth increases dramatically owing to thermally excited carrier transitions into the excited states of the dots.
Following this, an investigation of a high dot density hybrid quantum well/quantum dot (QW/QD) active structure for broadband, high-modal-gain SLDs is presented. The influence of the number of QD layers on the modal gain of hybrid QW/QD structures is analysed. It is shown that a higher number of dot layers provides a higher modal gain; however, there is a lack of emission from the QW because a large number of carriers is required to saturate the QDs. Additionally, a comparison is made between “unchirped” and “chirped” QD hybrid QW/QD structures in terms of modal gain and spectral bandwidth. It is shown that “chirping” the QDs can improve the “flatness” of the spectral bandwidth.
Lastly, the use of self-assembled InAs QDs as the active material in epitaxially regrown GaAs-based PCSELs is explored for the first time. Initially, it is shown that both GS and ES lasing can be achieved for QD-PCSELs by changing the grating period of the photonic crystal (PC). Careful design of these grating periods allows lasing from neighbouring devices at the GS (~1230 nm) and ES (~1140 nm), 90 nm apart in wavelength. Following this, the effect of device area, PC etch depth, and PC atom shape (circle or triangle) and orientation on lasing performance is presented. It is shown that lower threshold current density and higher slope efficiency are achieved with increasing device size. Devices with a deeper PC etch have higher output power, attributed to a more suitable grating height and a smaller distance to the active region. The triangular atom shape has a slightly higher slope efficiency than the circular atom shape, which is attributed to the breaking of in-plane symmetry and increased out-of-plane emission.
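For context, the grating-period tuning described above follows from the band-edge (Γ-point) condition of a PCSEL, where the emission wavelength scales roughly as λ ≈ n_eff·a. The short sketch below illustrates this scaling for the two reported wavelengths; the effective index value is a hypothetical placeholder, not a figure from the thesis.

```python
# Illustrative Gamma-point PCSEL scaling: lambda ≈ n_eff * a.
# n_eff below is a hypothetical placeholder, not a value from the thesis.
n_eff = 3.4                      # assumed effective index of the guided mode

def lattice_period_nm(target_wavelength_nm: float, n_eff: float) -> float:
    """Lattice period needed for Gamma-point lasing at the target wavelength."""
    return target_wavelength_nm / n_eff

for label, wl in [("GS (~1230 nm)", 1230.0), ("ES (~1140 nm)", 1140.0)]:
    print(f"{label}: a ≈ {lattice_period_nm(wl, n_eff):.0f} nm")
```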
The determinants of value addition: a critical analysis of the global software engineering industry in Sri Lanka
It was evident from the literature that the perceived value delivery of the global software engineering industry is low due to various factors. This research therefore focuses on global software product companies in Sri Lanka, exploring the software engineering methods and practices that increase value addition. The overall aim of the study is to identify the key determinants of value addition in the global software engineering industry and to critically evaluate their impact on software product companies, helping them maximise value addition and ultimately assure the sustainability of the industry.
An exploratory research approach was used initially, since findings would emerge as the study unfolded. A mixed-methods design was employed because the literature alone was inadequate to investigate the problem effectively and formulate the research framework. Twenty-three face-to-face online interviews were conducted with subject matter experts covering all the disciplines in the targeted organisations; these were combined with the literature findings and with the outcomes of market research conducted by both government and non-government institutes. Data from the interviews were analysed using NVivo 12. The findings of the existing literature were verified through the exploratory study, and the outcomes were used to formulate the questionnaire for the public survey. After cleansing the total responses received, 371 responses were considered for data analysis in SPSS 21 at an alpha level of 0.05. An internal consistency test was performed before the descriptive analysis. After assuring the reliability of the dataset, correlation, multiple regression, and analysis of variance (ANOVA) tests were carried out to meet the research objectives.
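For illustration only, a minimal Python sketch of this quantitative pipeline (internal consistency, correlation, multiple regression, and one-way ANOVA at alpha = 0.05) is given below. The column names, the simulated data, and the grouping variable are hypothetical stand-ins for the cleansed survey dataset; the study itself performed the analysis in SPSS 21.

```python
# Illustrative sketch of the quantitative stage. Column names and simulated
# data are hypothetical stand-ins for the cleansed survey dataset (n = 371);
# the study itself used SPSS 21.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm

ALPHA = 0.05
rng = np.random.default_rng(42)
n = 371
predictors = ["staffing", "delivery_process", "tools", "governance", "infrastructure"]

# Simulated 5-point Likert-style determinant scores, an outcome, and a grouping variable
df = pd.DataFrame({p: rng.integers(1, 6, size=n) for p in predictors})
df["value_addition"] = df[predictors].mean(axis=1) + rng.normal(scale=0.5, size=n)
df["discipline"] = rng.choice(["engineering", "qa", "product"], size=n)

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal consistency across a set of scale scores."""
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

print("Cronbach's alpha:", round(cronbach_alpha(df[predictors]), 3))

# Pearson correlation of each determinant with perceived value addition
for p in predictors:
    r, p_val = stats.pearsonr(df[p], df["value_addition"])
    print(f"{p}: r = {r:.2f}, significant at {ALPHA}: {p_val < ALPHA}")

# Multiple regression: value addition on the five determinants
ols = sm.OLS(df["value_addition"], sm.add_constant(df[predictors])).fit()
print(ols.summary())

# One-way ANOVA of value addition across respondent disciplines
groups = [g["value_addition"].values for _, g in df.groupby("discipline")]
f_stat, p_val = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, significant at {ALPHA}: {p_val < ALPHA}")
```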
Five determinants of value addition were identified, along with the key themes for each area: staffing, delivery process, use of tools, governance, and technology infrastructure. Cross-functional, self-organised teams built around value streams, a properly interconnected software delivery process with the right governance in the delivery pipelines, the right selection of tools, and the right infrastructure all increase value delivery. Conversely, the constraints on value addition are poor interconnection of internal processes, rigid functional hierarchies, inaccurate selection and use of tools, inflexible team arrangements, and inadequate focus on technology infrastructure. The findings add to the existing body of knowledge on increasing value addition through effective processes, practices, and tools, and on the impacts of applying them inaccurately in the global software engineering industry.
Investigation of microparticle behavior in Newtonian, viscoelastic, and shear-thickening flows in straight microchannels
Sorting and separation of small substances such as cells, microorganisms, and micro- and nano-particles from a heterogeneous mixture is a common sample preparation step in many areas of biology, biotechnology, and medicine. The portability and inexpensive design of microfluidic-based sorting systems have benefited many of these biomedical applications. Accordingly, we have investigated microparticle hydrodynamics in fluids with various rheological behaviors (i.e., Newtonian, shear-thinning viscoelastic, and shear-thickening non-Newtonian) flowing in straight microchannels. Numerical models were developed to simulate particle trajectories in Newtonian water and shear-thinning polyethylene oxide (PEO) solutions. The validated models were then used to perform numerical parametric studies and non-dimensional analysis of the Newtonian inertia-magnetic and shear-thinning elasto-inertial focusing regimes. Finally, the straight microfluidic device that had been tested with Newtonian water and the shear-thinning viscoelastic PEO solution was adopted to experimentally study microparticle behavior in a SiO2/water shear-thickening nanofluid.
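The three carrier fluids differ mainly in how apparent viscosity responds to shear rate, which a generalized power-law (Ostwald-de Waele) description captures: n < 1 gives shear-thinning behavior (e.g., the PEO solution), n = 1 Newtonian (water), and n > 1 shear-thickening (the SiO2/water nanofluid). The sketch below is purely illustrative; the consistency and power-law index values are hypothetical, not parameters fitted in the thesis.

```python
# Illustrative power-law viscosity model: eta(gamma_dot) = K * gamma_dot**(n - 1)
# n < 1: shear-thinning, n = 1: Newtonian, n > 1: shear-thickening.
# K and n values below are hypothetical, not fitted thesis data.
import numpy as np

def apparent_viscosity(shear_rate, K, n):
    """Apparent viscosity [Pa.s] for consistency K [Pa.s^n] and index n [-]."""
    return K * np.asarray(shear_rate) ** (n - 1)

shear_rates = np.logspace(0, 4, 5)                      # 1 to 10^4 1/s
fluids = {
    "Newtonian (water)":       dict(K=1.0e-3, n=1.0),
    "shear-thinning (PEO)":    dict(K=5.0e-2, n=0.7),
    "shear-thickening (SiO2)": dict(K=5.0e-3, n=1.4),
}
for name, params in fluids.items():
    print(name, apparent_viscosity(shear_rates, **params))
```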
Predictive Maintenance of Critical Equipment for Floating Liquefied Natural Gas Liquefaction Process
Meeting global energy demand is a massive challenge, especially with the growing push towards sustainable and cleaner energy. Natural gas is viewed as a bridge fuel to renewable energy, and LNG, as a processed form of natural gas, is the fastest growing and cleanest form of fossil fuel. Recently, the unprecedented increase in LNG demand has pushed its exploration and processing offshore as Floating LNG (FLNG). Offshore topside gas processing and liquefaction have been identified as one of the great challenges of FLNG. Maintaining topside liquefaction process assets such as gas turbines is critical to the profitability, reliability, and availability of the process facilities. Given the shortcomings of the widely used reactive and time-based preventive maintenance approaches in meeting the reliability and availability requirements of oil and gas operators, this thesis presents a framework driven by AI-based learning approaches for predictive maintenance. The framework aims to leverage the value of condition-based maintenance to minimise the failures and downtime of critical FLNG equipment (aeroderivative gas turbines).
In this study, gas turbine thermodynamics were introduced, as well as some factors affecting gas turbine modelling. Important considerations in modelling a gas turbine system, such as the modelling objectives, modelling methods, and general approaches to modelling gas turbines, were investigated. These provide the basis and mathematical background for developing a simulated gas turbine model. The behaviour of a simple-cycle heavy-duty gas turbine (HDGT) was simulated using thermodynamic laws and operational data based on the Rowen model. A Simulink model was created from experimental data based on Rowen's model, aimed at exploring the transient behaviour of an industrial gas turbine. The results show the capability of the Simulink model to capture the nonlinear dynamics of the gas turbine system, although its use in further condition monitoring studies is constrained by the lack of some suitable, relevant correlated features required by the model.
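For orientation, the algebraic core of Rowen's simplified heavy-duty gas turbine model relates per-unit torque and exhaust temperature to fuel flow Wf and shaft speed N. The sketch below uses the coefficient values widely quoted in the literature for Rowen's model; they are not necessarily the parameters used in the thesis's Simulink implementation, and the governor, fuel-system lags, and combustor delay of the full model are omitted.

```python
# Sketch of the algebraic core of Rowen's simplified gas turbine model
# (per-unit quantities). Coefficients follow values widely quoted in the
# literature and may differ from the thesis's Simulink parameterisation.

def turbine_torque(wf: float, n: float) -> float:
    """Per-unit mechanical torque as a function of fuel flow wf and speed n."""
    return 1.3 * (wf - 0.23) + 0.5 * (1.0 - n)

def exhaust_temperature(wf: float, n: float, t_rated: float = 950.0) -> float:
    """Exhaust temperature [deg F] about an assumed rated value t_rated."""
    return t_rated - 700.0 * (1.0 - wf) + 550.0 * (1.0 - n)

# Example: near rated speed with 80 % fuel flow
print(turbine_torque(0.8, 1.0))          # ~0.74 pu
print(exhaust_temperature(0.8, 1.0))     # ~810 deg F
```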
AI-based models were found to perform well in predicting gas turbine failures. These capabilities were investigated in this thesis and validated using experimental data obtained from a gas turbine engine facility. The dynamic behaviour of gas turbines changes when they are exposed to different varieties of fuel. Diagnostics-based AI models were therefore developed to diagnose different gas turbine engine failures associated with exposure to various types of fuel. The capabilities of Principal Component Analysis (PCA) were harnessed to reduce the dimensionality of the dataset and extract good features for developing the diagnostic models.
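A minimal sketch of how such PCA-based feature extraction might look with scikit-learn is given below; the data shape, scaling step, and number of retained components are illustrative assumptions, not the thesis's actual dataset or settings.

```python
# Minimal sketch of PCA-based dimensionality reduction for diagnostic features.
# The data shape, scaling step, and component count are illustrative choices.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 24))          # placeholder: 1000 samples x 24 sensor channels

X_scaled = StandardScaler().fit_transform(X)   # PCA is sensitive to feature scale
pca = PCA(n_components=5)                      # keep the first 5 principal components
features = pca.fit_transform(X_scaled)

print("reduced shape:", features.shape)
print("explained variance ratio:", pca.explained_variance_ratio_)
```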
Signal processing-based techniques (time-domain, frequency-domain, and time-frequency-domain) were also used as feature extraction tools; they added significantly more correlated features to the dataset and influenced the prediction results obtained. Signal processing played a vital role in extracting good features for the diagnostic models when compared with PCA. The overall results obtained from both the PCA- and signal processing-based models demonstrated the capability of neural network-based models to predict gas turbine failures. Furthermore, a deep learning-based LSTM model was developed, which extracts features from the time-series dataset directly and hence does not require any feature extraction tool. The LSTM model achieved the highest performance and prediction accuracy, compared to both the PCA-based and signal processing-based models.
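For orientation, a minimal Keras sketch of a sequence-based LSTM fault classifier of the kind described is shown below; the window length, channel count, number of fault classes, and layer sizes are illustrative assumptions rather than the architecture reported in the thesis.

```python
# Minimal sketch of an LSTM fault classifier operating directly on time-series
# windows (no separate feature extraction). Shapes and layer sizes are
# illustrative, not the thesis's reported architecture.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

WINDOW, N_FEATURES, N_CLASSES = 50, 8, 4    # assumed window length, channels, fault classes

model = keras.Sequential([
    keras.Input(shape=(WINDOW, N_FEATURES)),
    layers.LSTM(64),                          # learns temporal features directly
    layers.Dropout(0.2),
    layers.Dense(32, activation="relu"),
    layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Placeholder data standing in for windowed gas turbine sensor readings
X = np.random.rand(256, WINDOW, N_FEATURES).astype("float32")
y = np.random.randint(0, N_CLASSES, size=256)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```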
In summary, it is concluded from this thesis that, despite some challenges with the gas turbine Simulink model not being fully integrated into gas turbine condition monitoring studies, data-driven models have shown strong potential and excellent performance in gas turbine CBM diagnostics. The models developed in this thesis can be used for design and manufacturing purposes for gas turbines applied to FLNG, especially for condition monitoring and fault detection. The results obtained provide valuable understanding and helpful guidance for researchers and practitioners implementing robust predictive maintenance models that will enhance the reliability and availability of FLNG critical equipment.
Petroleum Technology Development Fund (PTDF), Nigeria
From wallet to mobile: exploring how mobile payments create customer value in the service experience
This study explores how mobile proximity payments (MPP) (e.g., Apple Pay) create customer value in the service experience compared to traditional payment methods (e.g., cash and card). The main objectives were, firstly, to understand how customer value manifests as an outcome in the MPP service experience and, secondly, to understand how the customer's activities in the process of using MPP create customer value. To achieve these objectives a conceptual framework is built upon the Grönroos-Voima Value Model (Grönroos and Voima, 2013) and uses the Theory of Consumption Value (Sheth et al., 1991) to determine the customer value constructs for MPP, complemented with Script theory (Abelson, 1981) to determine the value-creating activities the consumer performs in the process of paying with MPP.
The study uses a sequential exploratory mixed-methods design, wherein the first, qualitative stage uses two methods: self-observations (n=200) and semi-structured interviews (n=18). The subsequent, quantitative stage uses an online survey (n=441) and Structural Equation Modelling to further examine the relationships and effects between the value-creating activities and the customer value constructs identified in stage one. The academic contributions include the development of a model of mobile payment services value creation in the service experience, the introduction of the concept of in-use barriers, which occur after adoption and constrain the consumer's existing use of MPP, and the demonstration of the importance of the mobile in-hand momentary condition as an antecedent state. Additionally, the customer value perspective of this thesis demonstrates an alternative to the dominant Information Technology approaches to researching mobile payments and broadens the view of technology from purely an object a user interacts with to an object that is immersed in consumers' daily life.
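Purely as an illustration of the quantitative stage, the sketch below shows how a latent-construct model of this kind could be specified and estimated with the semopy package in Python; the construct and item names, the simulated scores, and the single structural path are hypothetical stand-ins, not the thesis's measurement or structural model.

```python
# Illustrative SEM sketch with semopy (lavaan-style syntax). Constructs, items,
# and data are hypothetical stand-ins; the thesis's model is not reproduced.
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(1)
n = 441                                   # survey sample size reported in the study

# Simulated item scores: three indicators per hypothetical construct
convenience = rng.normal(size=n)
value = 0.6 * convenience + rng.normal(scale=0.8, size=n)
survey_df = pd.DataFrame({
    "conv1": convenience + rng.normal(scale=0.5, size=n),
    "conv2": convenience + rng.normal(scale=0.5, size=n),
    "conv3": convenience + rng.normal(scale=0.5, size=n),
    "val1": value + rng.normal(scale=0.5, size=n),
    "val2": value + rng.normal(scale=0.5, size=n),
    "val3": value + rng.normal(scale=0.5, size=n),
})

model_desc = """
Convenience =~ conv1 + conv2 + conv3
CustomerValue =~ val1 + val2 + val3
CustomerValue ~ Convenience
"""
model = semopy.Model(model_desc)
model.fit(survey_df)
print(model.inspect())                    # parameter estimates and p-values
```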
Vertical-cavity surface-emitting lasers (VCSELs) and VCSEL arrays for communication and sensing
Future generations of optical wireless communication and sensing systems require compact, low-cost, reliable, and highly efficient light sources capable of transmitting modulated beams across free space at gigabit per second (Gbps) data rates and pulsed beams with sub-nanosecond rise and fall times. The infrared vertical-cavity surface-emitting laser (VCSEL) is exactly one such light source. Fifth generation (5G) systems promise to connect billions of people and trillions of Internet of Things gadgets and sensors at 1 to beyond 20 Gbps via newly auctioned millimeter wave (30 GHz to 300 GHz) spectral bands. By circa 2030, sixth generation (6G) systems envision vast broadband capacity with zero latency – enabling real-time virtual and mixed realities, human-machine interfaces, autonomous vehicles, and much more. The 6G technology adds terahertz wave emitters including infrared VCSELs and VCSEL arrays to vastly increase data rates, boost energy and spectral efficiency, and take advantage of available and unregulated spectral bands. I design, fabricate, and test new experimental VCSEL diodes and novel two-dimensional (2D) VCSEL diode arrays. I study the physics and performance trade-offs of VCSEL light emitters aimed at 5G and 6G optical wireless communication and sensing applications. Via in-house computer modeling and simulation programs, I design VCSEL epitaxial structures – composed of nanometer-thick aluminum-gallium-arsenide, indium-gallium-arsenide, and gallium-arsenide-phosphide layers – with peak target emission wavelengths of 940 and 980 nanometers. A commercial foundry grows my experimental VCSEL epitaxial wafers by metal-organic vapor phase epitaxy on 3-inch diameter gallium-arsenide substrates. In my university cleanroom, I fabricate my VCSELs as quarter wafer test pieces using a new VCSEL Array 2018 mask set which contains single VCSELs, and several variations of novel 2D electrically parallel triple (3-element), septuple (7-element), and novemdecuple (19-element) geometric device designs. My fabricated devices feature high-frequency, coplanar ground-signal-ground metal contact pads, and top-epitaxial-surface emission. I perform all device tests in my university laser diode laboratory via direct, on-wafer electrical probing under computer control, starting with continuous wave light output power-current-voltage sweeps via a calibrated photodiode-integrating sphere and variable current source. For emission spectra and small-signal frequency response measurements, I collect the emitted VCSEL light with a standard OM1 multimode optical fiber (MMF) – connected to either an optical spectrum analyzer or a photoreceiver. For on-wafer data transmission tests across OM1 MMF patch cords, I modulate my VCSELs with non-return-to-zero, pseudorandom bit patterns in the form of 2-level pulse amplitude modulation. I achieve record combinations of optical output power, bandwidth, and efficiency for my large oxide aperture diameter (larger than 20 micrometers) VCSELs and for my VCSEL arrays. For example, I demonstrate 200 milliwatts of optical output power, a bandwidth of 18 GHz, and a wall-plug efficiency of 35 percent with a 19-element VCSEL array. I set several records for error-free data transmission, for example, 40 Gbps for my triple and septuple VCSEL arrays and 25 Gbps for my novemdecuple VCSEL arrays, well beyond the previous record of 10 Gbps.
My work is the first to investigate trade-offs in the highly nontrivial physics of VCSEL arrays aimed at high-power, high-bandwidth arrays for free-space data transmission – producing new guiding principles for further device optimization and product development.
DFG, 43659573, SFB 787: Halbleiter - Nanophotonik: Materialien, Modelle, Bauelemente
Photography and Aesthetics: a critical study on visual and textual narratives in the lifework of Sergio Larraín and its impact in 20th century Europe and Latin America
The main focus of this study is a theoretical exploration of critical approaches applicable to the work of the Chilean photographer Sergio Larraín (1931-2012). It presents analytical tools to contextualise and understand the importance and impact of his work in photographic studies and his portrayal of twentieth-century Latin American and European culture. It inspects in depth a large portion of his photographic work, which is still only partially published and mostly reduced to his "active" period as a photojournalist, aside from the personal photographic exploration of his early and late career (C. Mena). This extended material creates a broader scope for understanding his photographs and him as a canonical photographer. This study analyses the photographer's trajectory as discourses of recollection of historical memory in time (Mauad) to trace Larraín's collective memory associated with his visual production. Such analysis helps decode his visual imagery and his projection and impact on European and Latin American culture. This strategy helps solve a twofold problem: firstly, it generates an interpretive consistency to understand the Chilean's photographic practice; secondly, it explores the power of images as an aesthetic experience in the installation of nationalist ideologies and the creation of imaginaries (B. Anderson 163).
How to Be a God
When it comes to questions concerning the nature of Reality, Philosophers and Theologians have the answers.
Philosophers have the answers that can’t be proven right. Theologians have the answers that can’t be proven wrong.
Today’s designers of Massively-Multiplayer Online Role-Playing Games create realities for a living. They can’t spend centuries mulling over the issues: they have to face them head-on. Their practical experiences can indicate which theoretical proposals actually work in practice.
That’s today’s designers. Tomorrow’s will have a whole new set of questions to answer.
The designers of virtual worlds are the literal gods of those realities. Suppose Artificial Intelligence comes through and allows us to create non-player characters as smart as us. What are our responsibilities as gods? How should we, as gods, conduct ourselves?
How should we be gods?
A productive response to legacy system petrification
Requirements change. The requirements of a legacy information system change, often in unanticipated ways, and at a more rapid pace than the rate at which the information system itself can be evolved to support them. The capabilities of a legacy system progressively fall further and further behind their evolving requirements, in a degrading process termed petrification. As systems petrify, they deliver diminishing business value, hamper business effectiveness, and drain organisational resources. To address legacy systems, the first challenge is to understand how to shed their resistance to tracking requirements change. The second challenge is to ensure that a newly adaptable system never again petrifies into a change resistant legacy system. This thesis addresses both challenges. The approach outlined herein is underpinned by an agile migration process - termed Productive Migration - that homes in upon the specific causes of petrification within each particular legacy system and provides guidance upon how to address them. That guidance comes in part from a personalised catalogue of petrifying patterns, which capture recurring themes underlying petrification. These steer us to the problems actually present in a given legacy system, and lead us to suitable antidote productive patterns via which we can deal with those problems one by one. To prevent newly adaptable systems from again degrading into legacy systems, we appeal to a follow-on process, termed Productive Evolution, which embraces and keeps pace with change rather than resisting and falling behind it. Productive Evolution teaches us to be vigilant against signs of system petrification and helps us to nip them in the bud. The aim is to nurture systems that remain supportive of the business, that are adaptable in step with ongoing requirements change, and that continue to retain their value as significant business assets
Graphical scaffolding for the learning of data wrangling APIs
In order for students across the sciences to avail themselves of modern data streams, they must first know how to wrangle data: how to reshape ill-organised, tabular data into another format, and how to do this programmatically, in languages such as Python and R. Despite the cross-departmental demand and the ubiquity of data wrangling in analytical workflows, research on how to optimise its instruction has been minimal. Although data wrangling as a programming domain presents distinctive challenges - characterised by on-the-fly syntax lookup and code example integration - it also presents opportunities. One such opportunity is how easily tabular data structures can be visualised. To leverage the inherent visualisability of data wrangling, this dissertation evaluates three types of graphics that could be employed as scaffolding for novices: subgoal graphics, thumbnail graphics, and parameter graphics. Using a specially built e-learning platform, this dissertation documents a multi-institutional, randomised, and controlled experiment that investigates the pedagogical effects of these graphics. Our results indicate that the graphics are well received, that subgoal graphics boost the completion rate, and that thumbnail graphics improve navigability within a command menu. We also obtained several non-significant results, and indications that parameter graphics are counter-productive. We discuss these findings in the context of general scaffolding dilemmas, and how they fit into a wider research programme on data wrangling instruction.
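As a concrete instance of the kind of reshaping task such instruction targets, the sketch below converts a small wide-format table into long (tidy) format with pandas and back again; the column names and values are invented purely for illustration.

```python
# Illustrative data wrangling task: reshape a wide table into long (tidy) form.
# The table contents are invented for illustration only.
import pandas as pd

wide = pd.DataFrame({
    "site": ["A", "B"],
    "2019": [12, 7],
    "2020": [15, 9],
    "2021": [18, 11],
})

# Wide -> long: one row per (site, year) observation
long = wide.melt(id_vars="site", var_name="year", value_name="count")

# Long -> wide again, to show the inverse operation
back = long.pivot(index="site", columns="year", values="count").reset_index()

print(long)
print(back)
```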