Using ERP as a basis for Enterprise application integration
Architecting and implementing e-Business supply chain solutions across and within the modern enterprise is now a necessity for remaining competitive and adaptable to market needs. The integration of information and processes is therefore a vital step, using technologies such as Enterprise Resource Planning (ERP), Supply Chain Management (SCM), and enterprise portal platforms. The effective sharing of resource planning and other enterprise-related data across and within the enterprise is typically seen as a facet of a business-to-business (B2B) platform. However, such infrastructures typically involve tight integration across intra- and inter-organisational systems. This paper examines an Enterprise Application Integration (EAI) initiative taken by a global manufacturer of industrial automation products, which attempted to use ERP as an integration tool across its internal B2B infrastructure to achieve such an aim. The paper discusses the integration considerations and complexities experienced by the case company upon embarking on an EAI programme through the adoption of a core ERP as a catalyst for organisational change. In doing so, the authors present an analysis of the inherent risks and limitations of this approach in terms of previously published literature on technology-driven organisational change and on EAI impact and adoption frameworks.
Disentangling scale approaches in governance research: comparing monocentric, multilevel, and adaptive governance
The question of how to govern the multiscale problems in today's network society is an important topic in the fields of public administration, political science, and environmental science. How scales are defined, studied, and dealt with varies substantially within and across these fields. This paper aims to reduce the existing conceptual confusion regarding scales by disentangling three representative approaches that address both governance and scaling: monocentric governance, multilevel governance, and adaptive governance. It does so by analyzing the differences in (1) underlying views on governing, (2) assumptions about scales, (3) dominant problem definitions regarding scales, and (4) preferred responses for dealing with multiple scales. Finally, this paper identifies research opportunities within and across these approaches.
The Knowledge Application and Utilization Framework Applied to Defense COTS: A Research Synthesis for Outsourced Innovation
Purpose – Militaries of developing nations face increasing budget pressures, a high operations tempo, a blistering pace of technology, and adversaries that often meet or beat government capabilities using commercial off-the-shelf (COTS) technologies. The adoption of COTS products into defense acquisitions has been offered as a way to meet these challenges by essentially outsourcing new product development and innovation. This research synthesizes extant work to develop a framework for managing the associated innovation and knowledge flows.
Design/Methodology/Approach – A literature review of 62 sources was conducted with the objectives of identifying antecedents (barriers and facilitators) and consequences of COTS adoption.
Findings – The DoD COTS literature consists predominantly of industry case studies, and there is a strong need for further academically rigorous study. Extant rigorous research implicates the importance of knowledge management to government innovative thinking that relies heavily on commercial suppliers.
Research Limitations/Implications – Extant academically rigorous studies tend to depend on measures derived from information systems research, relying on user satisfaction as the outcome. Our findings indicate that user satisfaction has no relationship to COTS success; technically complex governmental purchases may be too distant from users, or may have socio-economic goals that supersede user satisfaction. The knowledge acquisition and utilization framework worked well to explain the innovation process in COTS.
Practical Implications – Where past research in the commercial context found technological knowledge to outweigh market knowledge in importance, our research found the opposite. Managers in government, or marketing to government, should be aware of the importance of market knowledge for defense COTS innovation, especially for commercial companies that work as system integrators.
Originality/Value – From the literature emerged a framework of COTS product usage and a scale to measure COTS product appropriateness, which should help guide COTS product adoption decisions and help manage COTS product implementations ex post.
Right-Sized, Risk-Based Deployment of a COTS Chromatography Data System
As technology advances, computer software has taken a large position in the modern laboratory. The exponential growth of data produced in biopharmaceutical laboratories today has forced a move from capturing data on paper, or storing it in spreadsheets and small, non-robust databases, to an automated and secure data management platform. In the November 2003 edition of the Scientific Computing & Instrumentation LIMS Guide, M. Elliott (2003) pointed out that laboratories have traditionally looked to Laboratory Information Management Systems (LIMS) to assist in managing the ever-increasing information workload. In the not-so-distant past, these LIMS and other systems were custom systems that largely delivered every user requirement, specific to each company's internal processes. However, new regulations and reporting requirements have stretched this model, and the reality of long-term maintenance costs has brought about the integration of systems within laboratories, not only to collect data but also to manage these systems in a way that ensures long-term preservation and knowledge retention. This integration is not without its challenges, especially when it occurs in a heavily regulated industry such as pharmaceuticals. While there are certainly technical challenges associated with this integration, the strict regulatory environment in particular requires expensive, tedious validation of most software. Into this software validation minefield has entered the risk-based approach recently espoused by the United States Food and Drug Administration (FDA). This approach might be either the bane of or the panacea for an industry that is trying hard to focus on making the next blockbuster drug, not on developing internal software.
So, how does a large pharmaceutical company meet tightening FDA guidelines and still accomplish its true drug discovery goal? The solution might lie in another type of integration: namely, integrating laboratory processes, risk-based software validation, and a commercial off-the-shelf (COTS) system. The resulting blend will almost certainly hold more initial deployment pain for the laboratory, as the COTS system cannot be modified to fit the current laboratory processes completely. Often, however, the validation and compliance benefits might greatly outweigh the initial costs.
The thesis project consisted of developing a right-sized, risk-based validation package for a COTS chromatography data system (CDS) and the subsequent deployment of the validated software. Validation began with developing a detailed risk assessment to guide right-sizing of the validation effort, taking current regulatory guidance on risk-based software validation into account. This is the approach of a large pharmaceutical company seeking to minimize direct involvement in software development while minimizing the significant risks that come with software, whether developed internally or by an outside vendor. The project explored the various ways risk-based validation and COTS software vendor management can reduce validation, deployment, and maintenance costs, especially those associated with the testing and ongoing maintenance of a COTS package.
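The "right-sizing" step described in this abstract is, in GAMP-style practice, often implemented as a simple severity × probability × detectability scoring exercise. The sketch below illustrates that general idea only; the scales, thresholds, and validation tiers are hypothetical and not taken from the thesis.

```python
# Illustrative FMEA-style risk scoring for right-sizing a validation effort.
# All scales and tier thresholds here are hypothetical examples.

def risk_priority(severity: int, probability: int, detectability: int) -> int:
    """Each factor is scored 1 (low) to 3 (high); a higher risk priority
    number (RPN) calls for more validation rigor."""
    for score in (severity, probability, detectability):
        if not 1 <= score <= 3:
            raise ValueError("scores must be between 1 and 3")
    return severity * probability * detectability

def validation_level(rpn: int) -> str:
    """Map an RPN to a (hypothetical) validation tier."""
    if rpn >= 18:
        return "full validation: requirements tracing, IQ/OQ/PQ, negative testing"
    if rpn >= 6:
        return "standard validation: IQ/OQ with targeted functional tests"
    return "minimal validation: vendor audit plus installation verification"

# Example: a GxP-critical calculation function in the CDS.
rpn = risk_priority(severity=3, probability=2, detectability=3)
print(rpn, "->", validation_level(rpn))  # 18 -> full validation tier
```

The point of such a scheme is exactly the one the abstract makes: low-risk COTS functions can lean on vendor testing and an audit, so the expensive, tedious validation effort is concentrated where product quality or data integrity is actually at stake.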
An exploration of the IGA method for efficient reservoir simulation
Novel numerical methods present exciting opportunities to improve the efficiency of reservoir simulators. Because potentially significant gains in computational speed and accuracy may be obtained, it is worthwhile to explore alternative computational algorithms, for both general and case-by-case application, for discretizing the equations of porous media flow, fluid-structure interaction, and production. In the present work, the fairly new concept of isogeometric analysis (IGA) is evaluated for its suitability to reservoir simulation via direct comparison with the industry-standard finite difference (FD) method and the first-order standard finite element method (SFEM). To this end, two main studies are carried out to observe IGA's performance with regard to geometrical modeling and its ability to capture steep saturation fronts. The first study explores IGA's ability to model complex reservoir geometries, observing L2 error convergence rates under a variety of refinement schemes. The numerical experimental setup includes an 'S'-shaped line sink of varying curvature from which water is produced in a 2D homogeneous domain. The accompanying study simplifies the domain to 1D but adds multiphase physics that traditionally introduce difficulties associated with modeling a moving saturation front. Results overall demonstrate promise for the IGA method as a particularly effective tool for handling geometrically difficult features while also managing typically challenging numerical phenomena.
Yes, we should! EU priorities for 2019-2024. EPC Challenge Europe Issue 24, April 2019
The European Union (EU) is not alone in its struggle to grapple with the major headaches of our times. The Western world as a whole is affected. Inside the Union, the crises in and of its national societies and democracies have radiated to the EU level. Half of the member states have minority governments. If they are politically weak in their own countries, how can the Union be strong? The EU is, after all, also the sum of its member states.
Book Review Button, K.J. and Hensher, D.A. (eds.) Handbook of transport strategy, policy and institutions
They’ve done it again! Elsevier recently published the 6th Handbook for Transport, edited by Ken Button and David Hensher: the Handbook of Transport Strategy, Policy and Institutions. The handbook contains 834 pages, pleasantly split into 46 chapters of some 15-20 pages each (the perfect size for a train commuter with one hour of in-vehicle time per one-way trip), and is organized into sections covering:
• Institutional settings and markets
• Planning perspectives
• Asset management and funding
• Regulatory issues
• Evaluation frameworks
• National studies
Reviewing books is one of those jobs academics do now and then because it is more or less a moral task. It is very time consuming and does not result in any academic credit. The board of the European Journal of Transport and Infrastructure Research (EJTIR) was keen to publish a book review of this handbook. Unfortunately, the first three persons we asked to do the job kindly suggested asking someone else, so I finally decided to do it myself, with, to be honest, a bit of hesitation.
SparsePak: A Formatted Fiber Field-Unit for The WIYN Telescope Bench Spectrograph. II. On-Sky Performance
We present a performance analysis of SparsePak and the WIYN Bench Spectrograph for precision studies of stellar and ionized gas kinematics of external galaxies. We focus on spectrograph configurations with echelle and low-order gratings yielding spectral resolutions of ~10,000 between 500-900 nm. These configurations are of general relevance to the spectrograph performance. Benchmarks include spectral resolution, sampling, vignetting, scattered light, and an estimate of the system absolute throughput. Comparisons are made to other, existing fiber feeds on the WIYN Bench Spectrograph. Vignetting and relative throughput are found to agree with a geometric model of the optical system. An aperture-correction protocol for spectrophotometric standard-star calibrations has been established using independent WIYN imaging data and the unique capabilities of the SparsePak fiber array. The WIYN point-spread function is well fit by a Moffat profile with a constant power-law outer slope of index -4.4. We use SparsePak commissioning data to debunk a long-standing myth concerning sky subtraction with fibers: by properly treating the multi-fiber data as a "long-slit", it is possible to achieve precision sky subtraction with signal-to-noise performance as good as or better than conventional long-slit spectroscopy. No beam-switching is required, and hence the method is efficient. Finally, we give several examples of science measurements which SparsePak now makes routine. These include Hα velocity fields of low surface-brightness disks, gas and stellar velocity fields of nearly face-on disks, and stellar absorption-line profiles of galaxy disks at spectral resolutions of ~24,000.
Comment: To appear in ApJ Supp (Feb 2005); 19 pages text; 7 tables; 27 figures (embedded); high-resolution version at http://www.astro.wisc.edu/~mab/publications/spkII_pre.pd
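Under the standard Moffat form I(r) = I0·(1 + (r/α)²)^(−β), the profile falls off as r^(−2β) far from the core, so the quoted outer slope of −4.4 corresponds to β = 2.2. The sketch below checks this numerically; the core width α and normalization are arbitrary illustrative values, not fitted WIYN parameters.

```python
import math

def moffat(r, alpha=1.0, beta=2.2, i0=1.0):
    """Standard Moffat profile I(r) = I0 * (1 + (r/alpha)^2)^(-beta).
    For r >> alpha the profile behaves as r^(-2*beta), so an outer
    power-law index of -4.4 implies beta = 2.2."""
    return i0 * (1.0 + (r / alpha) ** 2) ** (-beta)

def loglog_slope(f, r, dr=1e-4):
    """Numerical d(log I)/d(log r), i.e. the local power-law index."""
    return (math.log(f(r + dr)) - math.log(f(r - dr))) / \
           (math.log(r + dr) - math.log(r - dr))

for r in (1.0, 10.0, 100.0):
    print(f"r = {r:6.1f}  local slope = {loglog_slope(moffat, r):+.2f}")
# far from the core the local slope approaches -4.4 (= -2*beta)
```

Unlike a Gaussian, whose log-log slope steepens without bound, the Moffat profile settles onto this constant power law in its wings, which is why it is the natural model for extended PSF halos like the one measured here.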
SERVITIZATION OF MANUFACTURING: are we turning a blind eye?
Servitization is a current organisational and, arguably, societal phenomenon which has received much attention in the Marketing, Management, and Operations literature. The term describes a process of business model reconfiguration which allows traditional manufacturers to shift their focus from producing tangible goods to producing outcome-based services. The phenomenon is driven by the development of new technologies such as the Internet of Things (IoT) and communication and integration platforms. The effects of servitization are manifold and wide-reaching, including re- and de-skilling of employees, reconfiguration of supplier-customer relationships, changes to consumer behaviours, and the creation of new virtual entry barriers for manufacturers from developing countries. Yet Information Systems (IS) research has not engaged in the debate: comprehensive literature reviews from 2013, 2017, and 2019 mention no IS journals. The aim of this paper is to introduce the servitization phenomenon to the IS community and to spark a debate on our collective involvement.