
    Disentangling scale approaches in governance research: comparing monocentric, multilevel, and adaptive governance

    The question of how to govern multiscale problems in today’s network society is an important topic in public administration, political science, and the environmental sciences. How scales are defined, studied, and dealt with varies substantially within and across these fields. This paper aims to reduce the existing conceptual confusion regarding scales by disentangling three representative approaches that address both governance and scaling: monocentric governance, multilevel governance, and adaptive governance. It does so by analyzing differences in (1) underlying views on governing, (2) assumptions about scales, (3) dominant problem definitions regarding scales, and (4) preferred responses for dealing with multiple scales. Finally, the paper identifies research opportunities within and across these approaches.

    The Knowledge Application and Utilization Framework Applied to Defense COTS: A Research Synthesis for Outsourced Innovation

    Purpose – Militaries of developing nations face increasing budget pressures, a high operations tempo, a blitzing pace of technology, and adversaries that often meet or beat government capabilities using commercial off-the-shelf (COTS) technologies. The adoption of COTS products into defense acquisitions has been offered as a way to meet these challenges by essentially outsourcing new product development and innovation. This research summarizes extant research to develop a framework for managing the resulting innovation and knowledge flows.
    Design/Methodology/Approach – A literature review of 62 sources was conducted with the objectives of identifying antecedents (barriers and facilitators) and consequences of COTS adoption.
    Findings – The DoD COTS literature predominantly consists of industry case studies, and there is a strong need for further academically rigorous study. Extant rigorous research highlights the importance of knowledge management to government innovation that relies heavily on commercial suppliers.
    Research Limitations/Implications – Extant academically rigorous studies tend to depend on measures derived from information systems research, relying on user satisfaction as the outcome. Our findings indicate that user satisfaction has no relationship to COTS success; technically complex governmental purchases may be too distant from users or may have socio-economic goals that supersede user satisfaction. The knowledge acquisition and utilization framework worked well to explain the innovative process in COTS.
    Practical Implications – Where past research in the commercial context found technological knowledge to outweigh market knowledge in importance, our research found the opposite. Managers in government, or marketing to government, should be aware of the importance of market knowledge for defense COTS innovation, especially for commercial companies that work as system integrators.
    Originality/Value – From the literature emerged a framework of COTS product usage and a scale to measure COTS product appropriateness that should help guide COTS product adoption decisions and help manage COTS product implementations ex post.

    Right-Sized Risk-Based Deployment of a Cots Chromatography Data System

    As technology advances, computer software has taken a large role in the modern laboratory. The exponential growth of data produced in biopharmaceutical laboratories has forced a move from capturing data on paper, or storing it in spreadsheets and small, non-robust databases, to automated and secure data management platforms. In the November 2003 edition of the Scientific Computing & Instrumentation LIMS Guide, M. Elliott (2003) pointed out that laboratories have traditionally looked to Laboratory Information Management Systems (LIMS) to help manage the ever-increasing information workload. In the not-so-distant past, these LIMS and other systems were custom systems that largely delivered every user requirement, specific to each company’s internal processes. However, new regulations and reporting requirements have stretched this model, and the reality of long-term maintenance costs has brought about the integration of systems within laboratories, not only to collect data but also to manage these systems in a way that ensures long-term preservation and knowledge retention. This integration is not without its challenges, especially in a heavily regulated industry such as pharmaceuticals. Beyond the technical challenges of integration, this strict regulatory environment requires expensive, tedious validation of most software. Into the software validation minefield has entered the risk-based verbiage recently espoused by the United States Food and Drug Administration (FDA). This verbiage might be either the bane or the panacea for an industry that is trying hard to focus on making the next blockbuster drug, not on developing internal software. So, how does a large pharmaceutical company meet tightening FDA guidelines and still accomplish its true drug discovery goal?
    The solution might lie in another type of integration: integrating laboratory processes, risk-based software validation, and a commercial off-the-shelf (COTS) system. The resulting blend will almost certainly entail more initial deployment pain for the laboratory, as the COTS system cannot be modified to fit current laboratory processes completely. Often, however, the validation and compliance benefits might greatly outweigh the initial costs. The thesis project consisted of developing a right-sized, risk-based validation package for a COTS chromatography data system (CDS) and the subsequent deployment of the validated software. Validation began with a detailed risk assessment to guide right-sizing the validation effort, taking current regulatory guidance on risk-based software validation into account. This is the approach of a large pharmaceutical company seeking to minimize direct involvement in software development while minimizing the significant risks that come with software, whether developed internally or by an outside vendor. The project explored the various ways risk-based validation and COTS software vendor management can reduce validation, deployment, and maintenance costs, especially those associated with the testing and ongoing maintenance of a COTS package.
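A risk assessment that "right-sizes" validation effort is often scored FMEA-style. The sketch below is a hedged illustration only: the 1–5 scales, the risk-priority-number (RPN) thresholds, and the tier names are invented for the example and are not the thesis's actual criteria or any FDA requirement.

```python
# Illustrative FMEA-style risk scoring for right-sizing software validation.
# All scales and thresholds here are hypothetical, chosen for the example.

def risk_priority(severity: int, probability: int, detectability: int) -> int:
    """Risk priority number (RPN) on a 1-125 scale.
    Each input is scored 1 (low risk) to 5 (high risk); a failure that is
    hard to detect scores high on detectability."""
    for score in (severity, probability, detectability):
        if not 1 <= score <= 5:
            raise ValueError("each score must be on a 1-5 scale")
    return severity * probability * detectability

def validation_level(rpn: int) -> str:
    """Map an RPN to an illustrative validation depth (thresholds are assumptions)."""
    if rpn >= 60:
        return "full validation: documented IQ/OQ/PQ test scripts"
    if rpn >= 20:
        return "reduced validation: vendor audit plus targeted OQ tests"
    return "minimal validation: rely on vendor testing, verify installation"

# Example: a high-severity, likely, hard-to-detect failure mode.
print(validation_level(risk_priority(5, 4, 3)))
```

The point of such a scheme is that low-risk COTS functions lean on vendor testing, concentrating the laboratory's expensive, documented test effort on the few high-RPN failure modes.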

    Yes, we should! EU priorities for 2019-2024. EPC Challenge Europe Issue 24, April 2019

    The European Union (EU) is not alone in its struggle to grapple with the major headaches of our times; the Western world as a whole is affected. Inside the Union, the crises in and of its national societies and democracies have radiated to the EU level. Half of the member states have minority governments. If they are politically weak in their own countries, how can the Union be strong? The EU is, after all, also the sum of its member states.

    Book Review Button, K.J. and Hensher, D.A. (eds.) Handbook of transport strategy, policy and institutions

    They’ve done it again! Elsevier recently published the 6th Handbook for Transport, edited by Ken Button and David Hensher: the Handbook of Transport Strategy, Policy and Institutions. The handbook contains 834 pages, pleasantly split into 46 chapters of some 15–20 pages each (the perfect size for a train commuter with one hour of in-vehicle time per one-way trip), and is organized into sections covering:
    • Institutional settings and markets
    • Planning perspectives
    • Asset management and funding
    • Regulatory issues
    • Evaluation frameworks
    • National studies
    Reviewing books is one of those jobs academics do now and then because it is more or less a moral task: it is very time-consuming and does not result in any academic credits. The board of the European Journal of Transport and Infrastructure Research (EJTIR) was keen to publish a review of this handbook. Unfortunately, the first three people we asked to do the job kindly suggested asking someone else, so I finally decided to do it myself, with, to be honest, a bit of hesitation.

    SparsePak: A Formatted Fiber Field-Unit for The WIYN Telescope Bench Spectrograph. II. On-Sky Performance

    We present a performance analysis of SparsePak and the WIYN Bench Spectrograph for precision studies of stellar and ionized-gas kinematics of external galaxies. We focus on spectrograph configurations with echelle and low-order gratings yielding spectral resolutions of ~10,000 between 500–900 nm. These configurations are of general relevance to the spectrograph performance. Benchmarks include spectral resolution, sampling, vignetting, scattered light, and an estimate of the system's absolute throughput. Comparisons are made to other existing fiber feeds on the WIYN Bench Spectrograph. Vignetting and relative throughput are found to agree with a geometric model of the optical system. An aperture-correction protocol for spectrophotometric standard-star calibrations has been established using independent WIYN imaging data and the unique capabilities of the SparsePak fiber array. The WIYN point-spread function is well fit by a Moffat profile with a constant power-law outer slope of index -4.4. We use SparsePak commissioning data to debunk a long-standing myth concerning sky subtraction with fibers: by properly treating the multi-fiber data as a "long-slit", it is possible to achieve precision sky subtraction with signal-to-noise performance as good as or better than conventional long-slit spectroscopy. No beam-switching is required, and hence the method is efficient. Finally, we give several examples of science measurements that SparsePak now makes routine. These include Hα velocity fields of low-surface-brightness disks, gas and stellar velocity fields of nearly face-on disks, and stellar absorption-line profiles of galaxy disks at spectral resolutions of ~24,000.
    Comment: To appear in ApJSupp (Feb 2005); 19 pages text; 7 tables; 27 figures (embedded); high-resolution version at http://www.astro.wisc.edu/~mab/publications/spkII_pre.pd
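The quoted Moffat fit is concrete enough to sketch. A Moffat profile falls off as r^(-2*beta) in its wings, so the reported outer slope of -4.4 corresponds to beta = 2.2. In the minimal illustration below, only that power-law index comes from the abstract; the amplitude and core-width parameter alpha are arbitrary assumptions.

```python
import math

def moffat(r: float, i0: float = 1.0, alpha: float = 2.0, beta: float = 2.2) -> float:
    """Moffat profile: i0 * (1 + (r/alpha)**2)**(-beta).

    For r >> alpha the wings fall as r**(-2*beta); beta = 2.2 reproduces
    the -4.4 power-law outer slope reported for the WIYN PSF.
    i0 and alpha here are illustrative placeholders, not fitted values.
    """
    return i0 * (1.0 + (r / alpha) ** 2) ** (-beta)

# In the wings, doubling the radius should dim the profile by ~2**4.4.
slope = math.log(moffat(100.0) / moffat(200.0)) / math.log(2.0)
print(f"effective outer slope ~ -{slope:.2f}")
```

Unlike a Gaussian, these power-law wings never become negligible, which is why an aperture correction matters for spectrophotometric standard-star calibration.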

    SERVITIZATION OF MANUFACTURING: are we turning a blind eye?

    Servitization is a current organisational and, arguably, societal phenomenon which has received much attention in the Marketing, Management, and Operations literature. The term describes a process of business-model reconfiguration that allows traditional manufacturers to shift their focus from producing tangible goods to producing outcome-based services. The phenomenon is driven by the development of new technologies such as the Internet of Things (IoT) and communication and integration platforms. The effects of servitization are manifold and wide-reaching, including re- and de-skilling of employees, reconfiguration of supplier-customer relationships, changes to consumer behaviour, and the creation of new virtual entry barriers for manufacturers from developing countries. Yet Information Systems research has not engaged in the debate: comprehensive literature reviews from 2013, 2017, and 2019 mention no IS journals. The aim of this paper is to introduce the servitization phenomenon to the IS community and to spark a debate on our collective involvement.