Design dis-integration: Silent, Partial, and Disparate Design
Michael Porter’s frameworks for analysing and planning competitive differentiation (Porter 1980, 1985) are established ‘textbook’ tools, widely taught to business students today. As the claim of design’s strategic importance is increasingly heard, we ask where design fits in established strategy thinking.
This paper documents a proposed conceptual model based on Porter’s value chain model for strategic planning. The concept outlined is the result of the first stage of a larger study of design’s potential role at strategic level and the difficulties faced by organisations in exploiting design strategically. This exploratory phase comprised a review of literature on design management and models of strategy, followed by nineteen interviews with senior design professionals. These then informed a novel revision of the value chain diagram reflecting the strategic role of design, and the identification of three key phenomena concerning design integration (silent design, partial design and disparate design). These phenomena are also represented in modified versions of the value chain.
This overall project follows a research approach based on the design research method and on procedural action research, and aims to develop a tool or method to help organisations increase design integration. This project is ongoing, and the results will be published separately.
Keywords:
Strategic; value chain; silent; partial; disparate; integrated
Contingent Valuation, Hypothetical Bias, and Experimental Economics
Although the contingent valuation method has been widely used to value a diverse array of nonmarket environmental and natural resource commodities, recent empirical evidence suggests it may not accurately estimate real economic values. The hypothetical nature of environmental valuation surveys typically results in responses that are significantly greater than actual payments. Economists have had mixed success in developing techniques designed to control for this "hypothetical bias." This paper highlights the role of experimental economics in addressing hypothetical bias, and identifies a gap in the existing literature by focusing on the underlying causes of this bias. Most of the calibration techniques used today lack a theoretical justification, and therefore these procedures need to be used with caution. We argue that future experimental research should investigate the reasons hypothetical bias persists. A better understanding of the causes should enhance the effectiveness of calibration techniques.
Automating class definitions from OWL to English
Text definitions for entities within bio-ontologies are a cornerstone of the effort to gain a consensus in understanding and usage of those ontologies. Writing these definitions is, however, a considerable effort and there is often a lag between specification of the entities in the ontology and the development of the text-based definitions. As well as these text definitions, there can also be logical descriptions and definitions of an ontology's entities. The goal of natural language generation (NLG) from ontologies is to take the logical description of entities and generate fluent natural language. We should be able to use NLG to automatically provide text-based definitions from an ontology that has logical descriptions of its entities and thus avoid the bottleneck of authoring these definitions by hand. In this paper we present some early work in using NLG to provide such text definitions for the Experimental Factor Ontology (EFO). We present our results, discuss issues in generating text definitions, and highlight some future work.
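The template-based flavour of this kind of NLG can be illustrated with a minimal sketch. This is illustrative only, not the paper's implementation: the genus-differentia template and the class and property labels below are hypothetical, not drawn from EFO.

```python
# Minimal sketch of template-based NLG from a logical class definition.
# A definition of the form "EquivalentTo: Genus and (property some Filler)"
# is rendered in the Aristotelian pattern "A <term> is a <genus> that <differentia>."

def definition_to_english(term, genus, restrictions):
    """Render a genus-differentia definition as an English sentence.

    restrictions: list of (property_label, filler_label) pairs taken from
    the class's existential restrictions.
    """
    clauses = [f"{prop} {filler}" for prop, filler in restrictions]
    differentia = " and ".join(clauses)
    return f"A {term} is a {genus} that {differentia}."

# Hypothetical EFO-style example (labels are illustrative):
print(definition_to_english(
    "cell line",
    "material entity",
    [("is derived from", "a cell culture")],
))
# -> A cell line is a material entity that is derived from a cell culture.
```

In practice the labels would be read from the ontology's annotation properties and the restrictions from its axioms; the fluency issues the paper discusses arise precisely where such flat templates break down.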
MAZI Deliverable Report D2.5: Design, progress and evaluation of the Deptford CreekNet pilot (version 2)
In this deliverable, the second in a series of three, we report on progress in the CreekNet pilot. We describe progress towards tasks identified in the Description of Work (DoW) for Task 2.2, focusing on activities in Year 2 (2017: months 13-24), and look forward to Year 3. The CreekNet pilot consists of four phases. This year, our focus has been on consolidating initial contacts made in Year 1 (Phase 1), and continuing community engagement activities alongside carrying out an initial deployment of the MAZI toolkit with a number of engaged community groups and individuals (Phase 2). In the second half of the year, as the toolkit was developed and an integrated set of tools established, these groups and others were invited to engage in further trials, and feedback was gathered to further inform onward development (Phase 3). We have continued with our efforts to build upon existing relationships in Deptford Creek and further afield to help us explore the different ways in which DIY networking in the broadest sense, and the use of the MAZI toolkit in particular, might help address local challenges. We have reassessed some of our foci through seeking out new opportunities for engagement and trialling the MAZI toolkit. A major activity was planning and running the two-day MAZI London Cross-fertilisation symposium. This created the opportunity for CreekNet participants to share their experiences and engage with the other MAZI pilots, bringing together existing community contacts in Deptford Creek and MAZI partners, and attracting new contributors. Through our activities, working with the emerging MAZI toolkit that evolved through several iterations during the year, we have better understood local circumstances and the complexity involved in the conceptualisation of ‘DIY networking’: it cannot be assumed to be a single notion. We have identified that both social and technological concerns can restrict its uptake, and consider routes to overcoming these challenges.
We provide an analysis of the work carried out so far, and look towards future activities.
Specialty chemicals manufacturing SMEs: Toolbox to support Environmental and Sustainable Systems (TESS)
Measuring Expert Performance at Manually Classifying Domain Entities under Upper Ontology Classes
Classifying entities in domain ontologies under upper ontology classes is a recommended task in ontology engineering to facilitate semantic interoperability and modelling consistency. Integrating upper ontologies this way is difficult and, despite emerging automated methods, remains a largely manual task. Little is known about how well experts perform at upper ontology integration. To develop methodological and tool support, we first need to understand how well experts do this task. We designed a study to measure the performance of human experts at manually classifying classes in a general knowledge domain ontology with entities in the Basic Formal Ontology (BFO), an upper ontology used widely in the biomedical domain. We conclude that manually classifying domain entities under upper ontology classes is indeed very difficult to do correctly. Given the importance of the task and the high degree of inconsistent classifications we encountered, we further conclude that it is necessary to improve the methodological framework surrounding the manual integration of domain and upper ontologies.
"Steel Production in Scotland: Strategic Considerations for the 1990s" : a review
This paper reviews a recent report prepared for Strathclyde Regional Council by Glasgow University on the future of plate production in Scotland.
A flat trend of star-formation rate with X-ray luminosity of galaxies hosting AGN in the SCUBA-2 Cosmology Legacy Survey
© 2019 The Author(s). Published by Oxford University Press on behalf of the Royal Astronomical Society.
Feedback processes from active galactic nuclei (AGN) are thought to play a crucial role in regulating star formation in massive galaxies. Previous studies using Herschel have resulted in conflicting conclusions as to whether star formation is quenched, enhanced, or not affected by AGN feedback. We use new deep 850 μm observations from the SCUBA-2 Cosmology Legacy Survey (S2CLS) to investigate star formation in a sample of X-ray selected AGN, probing galaxies up to L(0.5-7 keV) = 10^46 erg s^-1. Here, we present the results of our analysis on a sample of 1957 galaxies at 1 < z < 3, using both S2CLS and ancillary data at seven additional wavelengths (24-500 μm) from Herschel and Spitzer. We perform a stacking analysis, binning our sample by redshift and X-ray luminosity. By fitting analytical spectral energy distributions (SEDs) to decompose contributions from cold and warm dust, we estimate star formation rates (SFRs) for each 'average' source. We find that the average AGN in our sample resides in a star-forming host galaxy, with SFRs ranging from 80 to 600 M⊙ yr^-1. Within each redshift bin, we see no trend of SFR with X-ray luminosity, instead finding a flat distribution of SFR across ~3 orders of magnitude of AGN luminosity. By studying instantaneous X-ray luminosities and SFRs, we find no evidence that AGN activity affects star formation in host galaxies.
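The binned stacking step described in the abstract can be sketched in a few lines. This is an illustrative sketch under simple assumptions (inverse-variance-weighted mean flux per bin), not the S2CLS pipeline; the array names and bin edges are hypothetical.

```python
import numpy as np

# Illustrative sketch: stack 850-um flux densities of AGN hosts in bins of
# redshift and X-ray luminosity, taking the inverse-variance-weighted mean
# flux in each (z, log L_X) bin.

def stack_fluxes(z, log_lx, flux, flux_err, z_edges, lx_edges):
    """Return the weighted mean flux in each (redshift, luminosity) bin.

    Bins with no sources are left as NaN.
    """
    nz, nl = len(z_edges) - 1, len(lx_edges) - 1
    stacked = np.full((nz, nl), np.nan)
    iz = np.digitize(z, z_edges) - 1       # redshift bin index per source
    il = np.digitize(log_lx, lx_edges) - 1  # luminosity bin index per source
    for i in range(nz):
        for j in range(nl):
            sel = (iz == i) & (il == j)
            if sel.any():
                w = 1.0 / flux_err[sel] ** 2
                stacked[i, j] = np.sum(w * flux[sel]) / np.sum(w)
    return stacked
```

Each stacked flux would then feed the SED fit that separates the cold (star-forming) and warm (AGN-heated) dust components.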