
    INVESTIGATING THE APPLICATION OF SERVICE MODULARISATION IN AN INDUSTRY: “A CASE STUDY OF A MOBILE OPERATOR IN SIERRA LEONE (AIRTEL) AND A MONEY TRANSFER SERVICE IN THE UNITED KINGDOM (WESTERN UNION)”

    Abstract: There is an increasing demand for services, and organisations are moving towards modularisation of their services for efficiency, competitive advantage and customer satisfaction. Modularisation activities focus on standardisation, customisation and the reduction of system complexity to achieve efficiency, cost minimisation and better service for business growth. The main objective of this study is to examine how theory and practice relate, by drawing out concepts of modularity from the literature and examining practical cases in the service industry. Interest in the field of modularity is emerging, with the aim of discovering how its implementation can add value to research and practical application. This dissertation therefore seeks to arrive at the strategic decisions that should be considered in the implementation of modularisation and to identify the factors that influence its implementation in a money transfer service. To achieve this, two case studies were selected: a mobile money transfer service in Sierra Leone and the Western Union money transfer service in the United Kingdom. The results indicate that the application of modularisation in a service organisation can enhance service efficiency and customer satisfaction, and contributes greatly to business development through competitive advantage and increased profit.

    Statistical modelling to predict silicosis risk in deceased Southern African gold miners without medical evaluation

    The Qhubeka Trust was established in 2016 in a legal settlement on behalf of former gold miners seeking compensation for silicosis contracted on the South African mines. Settlements resulting from lawsuits on behalf of gold miners aim to provide fair compensation. However, occupational exposure and medical records kept by South African mining companies for their employees have been very limited. Some claimants to the Qhubeka Trust died before medical evaluation was possible, thus potentially disadvantaging their dependants from receiving any compensation. With medical evaluation no longer possible, a statistical approach to this problem was developed. The records for claimants with medical evaluation were used to develop a logistic regression prediction model for the likelihood of silicosis, based on the potential predictors: cumulative exposure to respirable dust, age, years since first exposure, years of life lost prematurely, vital status at 31 December 2019, and a history of tuberculosis diagnosis. The prediction model allowed estimation of the likelihood of silicosis for each miner who had died without medical evaluation and is a novel approach in this setting. In addition, we were able to quantitatively evaluate the trade-offs of different silicosis risk classification thresholds in terms of true and false positives and negatives.
    Significance:
    • A statistical approach can be used for risk estimation in settings where the outcome of interest is unknown for some members of a class.
    • The likelihood of silicosis in deceased miners without medical evaluation in the Qhubeka Trust can be accurately estimated, using information from finalised claims.
    • Strategies for classifying the silicosis status of deceased miners without medical evaluation in the Qhubeka Trust can be assessed in a rigorous, quantitative framework.
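    A minimal sketch of the modelling and threshold-evaluation steps described above, assuming hypothetical file and column names; the study's actual variables, data and cut-offs are not reproduced here.

        # Sketch only: fit a logistic regression on claimants who had a medical
        # evaluation, then score deceased claimants who did not. Column names,
        # file names and the example threshold are illustrative placeholders.
        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import confusion_matrix

        PREDICTORS = ["cum_dust_exposure", "age", "years_since_first_exposure",
                      "years_of_life_lost", "alive_2019", "tb_history"]

        evaluated = pd.read_csv("claimants_with_evaluation.csv")      # hypothetical file
        unevaluated = pd.read_csv("claimants_without_evaluation.csv") # hypothetical file

        model = LogisticRegression(max_iter=1000)
        model.fit(evaluated[PREDICTORS], evaluated["silicosis"])      # 0/1 outcome

        # Predicted likelihood of silicosis for miners who died before evaluation.
        unevaluated["p_silicosis"] = model.predict_proba(unevaluated[PREDICTORS])[:, 1]

        # Trade-off of a candidate classification threshold, assessed on the
        # evaluated claimants in terms of true/false positives and negatives.
        threshold = 0.5
        scores = model.predict_proba(evaluated[PREDICTORS])[:, 1]
        predicted = (scores >= threshold).astype(int)
        tn, fp, fn, tp = confusion_matrix(evaluated["silicosis"], predicted).ravel()
        print(f"threshold={threshold}: TP={tp} FP={fp} TN={tn} FN={fn}")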

    Cross-Cutting Computational Modeling Project: Exploration Medical Station Analysis

    Astronauts will be away from Earth-based medical care for long periods during future exploration missions. Thus, it will be necessary for astronauts to perform various medical tasks to monitor and maintain their health in the microgravity environment of space. Performance of these tasks will be constrained by the limited volume available to perform the task, the absence of gravity, and the limited resources and capabilities available in the medical work area. It is therefore necessary to evaluate exploration medical workstation designs for how well they will support crew performance of medical tasks. This evaluation featured two trained medical caregivers (99th percentile male, 26th percentile female) performing emergent care procedures (alone and in tandem) on a medical manikin. The procedures came from the International Space Station Medical Checklist and are designed for spaceflight. The objectives of the evaluation included determining the operational volume required to perform the tasks, examining the effect of constraining the operational volume with partitions, determining candidate locations for foot restraints and equipment placements, and determining the effect of single vs. dual caregivers on the operational volume. A marker-based motion capture system collected the motion data, which enabled computation of operational volumes and foot placement maps using custom Python code. Additional data collected included heart rate, time to perform the procedures, and feedback from the caregivers in the form of the NASA Task Load Index (TLX), the US Government System Usability Survey, and an open-ended questionnaire.
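    The operational-volume and foot-placement computations described above can be sketched as follows; the array layout, marker indices and file name are assumptions for illustration, not the study's actual pipeline. One plausible definition of operational volume is the convex hull of all marker positions recorded during a procedure.

        # Illustrative sketch, not the study's custom code: estimate an operational
        # volume as the convex hull of all marker positions, and build a simple
        # foot placement map from (assumed) ankle markers.
        import numpy as np
        from scipy.spatial import ConvexHull

        # markers: shape (n_frames, n_markers, 3), positions in metres (assumed layout)
        markers = np.load("caregiver_markers.npy")          # hypothetical capture export
        points = markers.reshape(-1, 3)
        points = points[~np.isnan(points).any(axis=1)]      # drop frames with lost markers

        hull = ConvexHull(points)
        print(f"operational volume: {hull.volume:.3f} m^3")

        # Foot placement map: 2D histogram of ankle-marker projections on the deck,
        # assuming (hypothetically) that markers 0 and 1 are the ankles.
        ankles = markers[:, :2, :2].reshape(-1, 2)
        ankles = ankles[~np.isnan(ankles).any(axis=1)]
        heatmap, xedges, yedges = np.histogram2d(ankles[:, 0], ankles[:, 1], bins=50)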

    Fast and Fourier: Extreme Mass Ratio Inspiral Waveforms in the Frequency Domain

    Extreme Mass Ratio Inspirals (EMRIs) are one of the key sources for future space-based gravitational wave interferometers. Measurements of EMRI gravitational waves are expected to determine the characteristics of their sources with sub-percent precision. However, their waveform generation is challenging due to the long duration of the signal and the high harmonic content. Here, we present the first ready-to-use Schwarzschild eccentric EMRI waveform implementation in the frequency domain for use with either graphics processing units (GPUs) or central processing units (CPUs). We present the overall waveform implementation and test the accuracy and performance of the frequency domain waveforms against the time domain implementation. On GPUs, the frequency domain waveform takes a median of 0.044 seconds to generate and is twice as fast to compute as its time domain counterpart when considering massive black hole masses ≥ 2×10^6 M⊙ and initial eccentricities e0 > 0.2. On CPUs, the median waveform evaluation time is 5 seconds, and it is five times faster in the frequency domain than in the time domain. Using a sparser frequency array can further speed up the waveform generation, reaching up to 0.3 seconds. This enables us to perform, for the first time, EMRI parameter inference with fully relativistic waveforms on CPUs. Future EMRI models which encompass wider source characteristics (particularly black hole spin and generic orbit geometries) will require significantly more harmonics. Frequency-domain models will be essential analysis tools for these astrophysically realistic and important signals.
    Comment: 23 pages, 6 figures
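    The timing comparison reported above can be sketched with a generic benchmarking pattern. The waveform generators below are toy stand-ins (a simple chirp and its FFT), not the paper's implementation, and the parameter names and sparse frequency grid are placeholders.

        # Sketch of benchmarking frequency-domain vs. time-domain waveform generation.
        # The "waveforms" here are toys; only the timing pattern is illustrative.
        import time
        import numpy as np

        def generate_td_waveform(M, mu, e0, T_yr, dt=10.0):
            # hypothetical stand-in: toy chirp, not a relativistic waveform
            t = np.arange(0.0, T_yr * 3.156e7, dt)
            return np.sin(2 * np.pi * (1e-3 + 1e-11 * t) * t)

        def generate_fd_waveform(M, mu, e0, T_yr, freqs=None, dt=10.0):
            # hypothetical stand-in: FFT of the toy time-domain signal,
            # optionally down-sampled onto a sparser user-supplied frequency array
            h = generate_td_waveform(M, mu, e0, T_yr, dt)
            hf = np.fft.rfft(h)
            f = np.fft.rfftfreq(h.size, dt)
            if freqs is not None:
                hf = np.interp(freqs, f, hf.real) + 1j * np.interp(freqs, f, hf.imag)
            return hf

        params = dict(M=2e6, mu=30.0, e0=0.3, T_yr=1.0)   # placeholder parameters

        t0 = time.perf_counter()
        h_td = generate_td_waveform(**params)
        print(f"time domain:            {time.perf_counter() - t0:.3f} s")

        t0 = time.perf_counter()
        h_fd = generate_fd_waveform(**params)
        print(f"frequency domain:       {time.perf_counter() - t0:.3f} s")

        sparse_freqs = np.linspace(1e-4, 1e-2, 10_000)    # sparser frequency array
        t0 = time.perf_counter()
        h_fd_sparse = generate_fd_waveform(**params, freqs=sparse_freqs)
        print(f"sparse frequency array: {time.perf_counter() - t0:.3f} s")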

    Spectroscopic investigations of intermediates in the reaction of cytochrome P450BM3–F87G with surrogate oxygen atom donors

    Rapid mixing of substrate-free ferric cytochrome P450BM3–F87G with m-chloroperoxybenzoic acid (mCPBA) resulted in the sequential formation of two high-valent intermediates. The first was spectrally similar to compound I species reported previously for P450CAM and CYP 119 using mCPBA as an oxidant, and it featured a low-intensity Soret absorption band characterized by a shoulder at 370 nm. This is the first direct observation of a P450 compound I intermediate in a type II P450 enzyme. The second intermediate, which was much more stable at pH values below 7.0, was characterized by an intense Soret absorption peak at 406 nm, similar to that seen with P450CAM [T. Spolitak, J.H. Dawson, D.P. Ballou, J. Biol. Chem. 280 (2005) 20300–20309]. Double mixing experiments in which NADPH was added to the transient 406 nm-absorbing intermediate resulted in rapid regeneration of the resting ferric state, with the flavins of the flavoprotein domain in their reduced state. EPR results were consistent with this stable intermediate species being a cytochrome c peroxidase compound ES-like species containing a protein-based radical, likely localized on a nearby Trp or Tyr residue in the active site. Iodosobenzene, peracetic acid, and sodium m-periodate also generated the intermediate at 406 nm, but not the 370 nm intermediate, indicating a probable kinetic barrier to accumulating compound I in reactions with these oxidants. The P450 ES intermediate has not been previously reported using iodosobenzene or m-periodate as the oxygen donor.

    Hyperboloidal discontinuous time-symmetric numerical algorithm with higher order jumps for gravitational self-force computations in the time domain

    Within the next decade the Laser Interferometer Space Antenna (LISA) is due to be launched, providing the opportunity to extract physics from stellar objects and systems, such as Extreme Mass Ratio Inspirals (EMRIs), that are otherwise undetectable by ground-based interferometers and Pulsar Timing Arrays (PTAs). Unlike previous sources detected by the currently available observational methods, these sources can only be simulated using an accurate computation of the gravitational self-force. Whereas the field has seen outstanding progress in the frequency domain, metric reconstruction and self-force calculations are still an open challenge in the time domain. Such computations would not only further corroborate frequency domain calculations and models, but also allow for fully self-consistent evolution of the orbit under the effect of the self-force. Given that we have a priori information about the local structure of the discontinuity at the particle, we show how to construct discontinuous spatial and temporal discretisations by operating on discontinuous Lagrange and Hermite interpolation formulae, and hence recover higher order accuracy. In this work we demonstrate how this technique, in conjunction with a well-suited gauge choice (hyperboloidal slicing) and numerical methods (discontinuous collocation with a time-symmetric integrator), can provide a relatively simple method-of-lines numerical algorithm for the problem. This is the first of a series of papers studying the behaviour of a point particle prescribing circular geodesic motion in Schwarzschild in the time domain. Here we describe the numerical machinery necessary for these computations and show that our approach is not only capable of highly accurate flux radiation measurements but is also suitable for evaluating the field and its derivatives at the particle limit.
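    A minimal sketch of the jump-corrected (discontinuous) interpolation idea mentioned above: nodal values on the far side of a known discontinuity at the particle position are corrected with the known jumps in the function and its derivatives before a standard Lagrange interpolant is formed, recovering smooth-function accuracy. The test function and jump values below are invented for illustration; they are not the paper's self-force fields.

        # Jump-corrected Lagrange interpolation across a known discontinuity at x_p.
        import numpy as np
        from math import factorial

        def lagrange_eval(x, nodes, values):
            """Evaluate the Lagrange interpolant through (nodes, values) at x."""
            total = 0.0
            for j, xj in enumerate(nodes):
                basis = np.prod([(x - xk) / (xj - xk)
                                 for k, xk in enumerate(nodes) if k != j])
                total += values[j] * basis
            return total

        def jump_corrected_eval(x, nodes, values, x_p, jumps):
            """Interpolate a piecewise-smooth f at x (taken to lie left of x_p),
            correcting nodes right of x_p with the known jumps [f], [f'], ... at x_p."""
            corrected = values.copy()
            for j, xj in enumerate(nodes):
                if xj > x_p:
                    corrected[j] -= sum(J * (xj - x_p) ** m / factorial(m)
                                        for m, J in enumerate(jumps))
            return lagrange_eval(x, nodes, corrected)

        # Toy example: f = sin(x) for x < x_p and f = sin(x) + 1 + (x - x_p) for x > x_p,
        # so the jumps at x_p are [f] = 1 and [f'] = 1.
        x_p = 0.5
        nodes = np.linspace(0.0, 1.0, 7)
        values = np.sin(nodes) + np.where(nodes > x_p, 1.0 + (nodes - x_p), 0.0)

        x = 0.45   # evaluation point on the left of the particle
        naive = lagrange_eval(x, nodes, values)
        corrected = jump_corrected_eval(x, nodes, values, x_p, jumps=[1.0, 1.0])
        print(naive - np.sin(x), corrected - np.sin(x))  # corrected error is far smaller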

    The Ionization Fraction in Dense Molecular Gas II: Massive Cores

    We present an observational and theoretical study of the ionization fraction in several massive cores located in regions that are currently forming stellar clusters. Maps of the emission from the J = 1 -> 0 transitions of C18O, DCO+, N2H+, and H13CO+, as well as the J = 2 -> 1 and J = 3 -> 2 transitions of CS, were obtained for each core. Core densities are determined via a large velocity gradient analysis, with values typically 10^5 cm^-3. With the use of observations to constrain variables in the chemical calculations, we derive electron fractions for our overall sample of 5 cores directly associated with star formation and 2 apparently starless cores. The electron abundances are found to lie within a small range, -7.3 < log10(x_e) < -6.9, and are consistent with previous work. We find no difference in the ionization fraction between cores with and without associated star formation activity, nor is any difference found in electron abundances between the edge and center of the emission region. Thus our models are in agreement with the standard picture of cosmic rays as the primary source of ionization for molecular ions. With the addition of previously determined electron abundances for low mass cores, and even more massive cores associated with O and B clusters, we systematically examine the ionization fraction as a function of star formation activity. This analysis demonstrates that the most massive sources stand out as having the lowest electron abundances (x_e < 10^-8).
    Comment: 35 pages (8 figures), using aaspp4.sty, to be published in Astrophysical Journal
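    The standard cosmic-ray ionization picture invoked above can be illustrated with the usual ionization-balance estimate x_e ≈ sqrt(ζ / (k n)), in which cosmic-ray ionization of H2 is balanced by dissociative recombination of molecular ions. The rate values below are assumed typical numbers, not quantities derived in this work.

        # Back-of-the-envelope ionization balance for a dense core.
        import numpy as np

        zeta = 3e-17   # cosmic-ray ionization rate per H2 [s^-1] (assumed typical value)
        k_rec = 2e-7   # dissociative recombination coefficient [cm^3 s^-1] (assumed)
        n_H2 = 1e5     # core density from the LVG analysis [cm^-3]

        x_e = np.sqrt(zeta / (k_rec * n_H2))
        print(f"x_e ~ {x_e:.1e}  (log10 x_e ~ {np.log10(x_e):.1f})")
        # -> log10(x_e) ~ -7.4, comparable to the -7.3 to -6.9 range quoted above.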