20 research outputs found

    Developing and teaching of a world-class online project management curriculum

      The evolution of the internet and collaboration tools has made it possible to enhance the range of online education and make it universally accessible and eminently affordable. Around 2000, the faculty at Boston University’s Metropolitan College proposed an online master’s degree in project management using the emerging learning management systems. The program grew quickly from 40 to 200 students and was one of the first in the United States to be accredited by the Project Management Institute’s Global Accreditation Committee. This academic model has now been extended to other disciplines and programs. It was expected from the outset that the BU online and classroom academic experiences would be completely equivalent. This presented several challenges, the first of which was developing online equivalents for the face-to-face pedagogical course components. Second, writing online courses, recording videos and developing innovative discussion topics is time-consuming, and we quickly realised that only full-time faculty had the commitment and motivation to devote the required effort to produce quality courses. Finally, the technological resources associated with course development and course operation required significant investment, beyond the faculty time, currently estimated at around $60,000 per course. We surveyed our students and alumni every two years and now have enough data to describe accurately the evolution in attitudes to online education. As one of the earlier and premier adopters of a rigorous academic online education model, BU has a vested interest in contributing to the growing debate about the academic quality and rigour of online education, the application of high pedagogical standards, and the innovative use of online teaching frameworks and tools. This paper addresses and documents these issues and aims to raise awareness of emerging “best practice” in the online education domain.

    Managing the trade-off implications of global supply

    The cost-versus-response trade-off is a growing logistics issue, as many markets are increasingly characterized by demand uncertainty and shorter product life cycles. This is exacerbated further as supply increasingly moves to low-cost global sources. However, the poor response implications of global supply are often not addressed, or even acknowledged, when undertaking such decisions. Consequently, various practical approaches to minimising, postponing or otherwise managing the impact of demand uncertainty are often adopted only retrospectively. Even though such generic solutions are documented through case examples, we lack effective tools and concepts to support the proactive identification and resolution of such trade-offs. This paper reports on case-based theory-building research, involving three cases from the UK and USA, used to develop a conceptual model with associated tools in support of such a process.

    EOQ extensions exploiting the Lambert W function

    We analyse several extensions to the Economic Order Quantity (EOQ) model: when the inventory deteriorates over time; when the demand contains a stock-dependent term; and when the present value, or discounted cost, is included. We derive exact analytical expressions for the order quantity that minimises the total cost, and in each case the Lambert W function arises, adding to the growing list of useful applications for this recently rediscovered function. The analytical solutions have immediate practical and pedagogic applications. [Received 06 September 2007; Revised 13 January 2008; Accepted 01 April 2008]
    Keywords: economic order quantity; EOQ model; inventory deterioration; present value; Lambert W function; discounted cost; stock-dependent terms.
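
    The abstract does not reproduce the closed-form solutions, but the role of the Lambert W function can be illustrated numerically. Below is a minimal Python sketch: the classic EOQ formula as a baseline, plus the use of scipy.special.lambertw to solve a transcendental condition of the generic form x * exp(x) = c, the shape to which such EOQ extensions typically reduce. The parameter names and the reduced equation are illustrative assumptions, not the paper's actual derivation.

        import numpy as np
        from scipy.special import lambertw  # principal branch W_0(z) by default

        def eoq_classic(D, K, h):
            """Classic EOQ baseline: Q* = sqrt(2*D*K / h).

            D: annual demand, K: fixed ordering cost, h: unit holding cost per year.
            """
            return np.sqrt(2.0 * D * K / h)

        def solve_transcendental(c):
            """Solve x * exp(x) = c for real x >= -1 via the Lambert W function.

            EOQ extensions (deterioration, discounting, stock-dependent demand)
            lead to first-order conditions of roughly this shape; this helper
            illustrates the numerical mechanics, not the paper's exact formulas.
            """
            x = lambertw(c)           # returns a complex value on the principal branch
            return float(np.real(x))  # real-valued for c >= -1/e

        if __name__ == "__main__":
            print(f"Classic EOQ: Q* = {eoq_classic(D=1200, K=50.0, h=2.5):.2f}")
            # Check the defining identity: W(c) * exp(W(c)) == c
            c = 3.0
            x = solve_transcendental(c)
            print(f"W({c}) = {x:.6f}, residual = {x * np.exp(x) - c:.2e}")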

    Improving the accuracy of project estimates at completion using the Gompertz function

    For ongoing projects, nonlinear regression-based growth models allow for refined duration and cost estimates at completion. In particular, the Gompertz sigmoidal function has been used in curve fitting and proven suitable for forecasting S-shaped cost profiles for projects experiencing overruns. In this paper, we follow the standard approach to both Earned Value Management and Earned Schedule and use the Gompertz function for the planned, earned, and actual cost profiles. A simple, linear expression is derived for the forecast of the duration estimate, and the theoretical formula is validated by application to many synthetic project data sets. The model's predictions are shown to be accurate, stable and reliable, thus validating the theoretical concepts and demonstrating their practical relevance. We conclude with practical guidance for project managers.
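
    The paper's own derivation is not reproduced in the abstract, but the core mechanic, fitting a Gompertz growth curve to cumulative cost data and reading off the asymptote as the estimate at completion, can be sketched as follows. This is a minimal illustration assuming the standard three-parameter Gompertz form G(t) = a * exp(-b * exp(-c*t)); the synthetic data and parameter names are assumptions, not the authors' data or notation.

        import numpy as np
        from scipy.optimize import curve_fit

        def gompertz(t, a, b, c):
            """Standard three-parameter Gompertz growth curve.

            a: asymptote (cost at completion), b: displacement, c: growth rate.
            """
            return a * np.exp(-b * np.exp(-c * t))

        # Synthetic cumulative cost observations for an in-progress project
        # (illustrative values only).
        t_obs = np.arange(1, 13)                   # reporting periods observed so far
        true_a, true_b, true_c = 1000.0, 5.0, 0.3  # hidden "true" parameters
        rng = np.random.default_rng(42)
        cost_obs = gompertz(t_obs, true_a, true_b, true_c) + rng.normal(0, 5, t_obs.size)

        # Nonlinear least-squares fit; p0 is a rough initial guess.
        params, _ = curve_fit(gompertz, t_obs, cost_obs, p0=[cost_obs[-1] * 2, 1.0, 0.1])
        a_hat, b_hat, c_hat = params

        # The fitted asymptote a_hat is the regression-based cost estimate at completion.
        print(f"Estimated cost at completion: {a_hat:.1f} (true value: {true_a})")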

    DUNE Offline Computing Conceptual Design Report

    This document describes Offline Software and Computing for the Deep Underground Neutrino Experiment (DUNE), in particular the conceptual design of the offline computing needed to accomplish its physics goals. Our emphasis in this document is the development of the computing infrastructure needed to acquire, catalog, reconstruct, simulate and analyze the data from the DUNE experiment and its prototypes. In this effort, we concentrate on developing the tools and systems that facilitate the development and deployment of advanced algorithms. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions as HEP computing evolves, and to provide computing that achieves the physics goals of the DUNE experiment.

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version: the simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
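
    To make the Numba workflow concrete, here is a minimal, hypothetical sketch of a per-pixel CUDA kernel in the style the abstract describes: ordinary Python compiled to a GPU kernel with numba.cuda.jit. The toy physics (scaling each charge deposit by a gain and atomically accumulating it onto a pixel) is an assumption for illustration, not the experiment's actual current-induction model, and running it requires a CUDA-capable GPU with the numba package installed.

        import numpy as np
        from numba import cuda

        @cuda.jit
        def deposit_kernel(pixel_ids, charges, gain, pixel_current):
            """Toy per-deposit kernel: one GPU thread per charge deposit.

            Each thread scales its deposit by a gain and atomically accumulates
            the result onto the target pixel (illustrative physics only).
            """
            i = cuda.grid(1)  # global thread index
            if i < charges.size:
                q = charges[i] * gain
                cuda.atomic.add(pixel_current, pixel_ids[i], q)

        # Illustrative inputs: 10^5 deposits spread over 10^3 pixels.
        n_deposits, n_pixels = 100_000, 1_000
        rng = np.random.default_rng(0)
        pixel_ids = rng.integers(0, n_pixels, n_deposits).astype(np.int32)
        charges = rng.random(n_deposits).astype(np.float32)
        pixel_current = np.zeros(n_pixels, dtype=np.float32)

        # Standard Numba launch configuration: kernel[blocks, threads](args).
        threads = 256
        blocks = (n_deposits + threads - 1) // threads
        deposit_kernel[blocks, threads](pixel_ids, charges, np.float32(0.5), pixel_current)

        print(f"Total accumulated current: {pixel_current.sum():.2f}")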
