760 research outputs found

    Routines and representations at work - observing the architecture of conceptual design

    Keywords: routines, representations, artifacts, product development, workplace observation, evolutionary economics, chip manufacturing

    An investigation of the effects of IT investment on firm performance: The role of complementarity.

    The concept of complementarity has been introduced into research on IT-based firm performance to address the inconsistent magnitudes of IT-investment impacts reported across studies. This dissertation seeks to understand the scope of IT investment complementarities, to examine the different ways in which different complementarities affect the payoff from an IT investment, and to empirically test the effects of complementary investments in the context of investments in SCM and CRM. The knowledge-based view of the firm (KBV) is employed to delineate the boundary and the different roles of complementarity. The KBV sees organizational capabilities as arising from the aggregation of knowledge into capabilities and the deployment of knowledge assets in the form of capabilities. Knowledge aggregation requires individuals' specialized knowledge (human capital) and the aggregation mechanisms of structural, social, and community capital. The combination of these three forms of capital with human capital constitutes organizational capabilities. Once constituted, capabilities must be deployed in complementary fashion: foundational capability must be in place for the focal IT investment to deliver value, synergistic capability amplifies the economic benefits of the focal IT investment, and management capability is managers' organizing vision and ability to deploy the focal IT investment successfully.
    The research findings show that three forms of capital (structural, community, and human) have highly significant impacts on firm performance as measured by Net Cash Flow, Gross Profit, and EBITDA. Synergistic capabilities and management capabilities are found to be highly significant moderators between the three forms of capital and the firm performance measures.
    The data for this study were drawn from secondary sources: annual reports, press releases, and news articles. The dependent variables are drawn from COMPUSTAT. The data collection method for the independent variables was a keyword search. The research sampling frame is confined to a single value chain; however, distinctly different industry categories are represented within it. This sampling strategy yielded a total of 111 firms that had invested in SCM and 45 firms that had invested in CRM.
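    The moderation effects described above can be sketched as an interaction-term regression. This is a generic illustration on synthetic data, not the dissertation's actual model; all variable names and coefficient values below are hypothetical.

```python
import numpy as np

# Hypothetical moderated regression: a "synergistic capability" score m
# moderates the effect of a capital measure x on a performance outcome y
# through the interaction term x*m. Data are simulated for illustration.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)          # e.g., a structural-capital score
m = rng.normal(size=n)          # e.g., a synergistic-capability score
b0, b1, b2, b3 = 1.0, 0.5, 0.3, 0.8
y = b0 + b1 * x + b2 * m + b3 * x * m + rng.normal(scale=0.5, size=n)

# Design matrix: intercept, main effects, and the interaction column.
X = np.column_stack([np.ones(n), x, m, x * m])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# A significant coefficient on x*m is the usual evidence of moderation.
print(coef)
```

    In studies of this kind, it is the size and significance of the interaction coefficient (here `coef[3]`), not the main effects alone, that indicates whether one capability amplifies the payoff of another.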

    Report of the Stanford Linked Data Workshop

    The Stanford University Libraries and Academic Information Resources (SULAIR), with the Council on Library and Information Resources (CLIR), conducted a week-long workshop on the prospects for a large-scale, multi-national, multi-institutional prototype of a Linked Data environment for discovery of and navigation among the rapidly, chaotically expanding array of academic information resources. As preparation for the workshop, CLIR sponsored a survey by Jerry Persons, Chief Information Architect emeritus of SULAIR, that was published originally for workshop participants as background and is now publicly available. The original intention of the workshop was to devise a plan for such a prototype. However, such was the diversity of knowledge, experience, and views of the potential of Linked Data approaches that the participants turned to two more fundamental goals: building common understanding and enthusiasm on the one hand, and identifying the opportunities and challenges to be confronted in preparing and operating the intended prototype on the other. In pursuit of those objectives, the workshop participants produced:
    1. a value statement addressing the question of why a Linked Data approach is worth prototyping;
    2. a manifesto for Linked Libraries (and Museums and Archives and …);
    3. an outline of the phases in a life cycle of Linked Data approaches;
    4. a prioritized list of known issues in generating, harvesting, and using Linked Data;
    5. a workflow, with notes, for converting library bibliographic records and other academic metadata to URIs;
    6. examples of potential “killer apps” using Linked Data; and
    7. a list of next steps and potential projects.
    This report includes a summary of the workshop agenda, a chart showing the use of Linked Data in cultural heritage venues, and short biographies and statements from each of the participants.

    Death and Deterrence Redux: Science, Law and Causal Reasoning on Capital Punishment

    A recent cohort of studies reports deterrent effects of capital punishment that substantially exceed almost all previous estimates of lives saved by execution. Some of the new studies go further to claim that pardons, commutations, and exonerations cause murders to increase, as does trial delay. This putative life-life tradeoff is the basis for claims by legal academics and advocates of a moral imperative to aggressively prosecute capital crimes, brushing off evidentiary doubts as unreasonable cautions that place potential beneficiaries at risk of severe harm. Challenges to this new deterrence literature find that the evidence is too unstable and unreliable to support policy choices on capital punishment. This article identifies numerous technical and conceptual errors in the new deterrence studies that further erode their reliability: inappropriate methods of statistical analysis, failures to consider several factors such as drug epidemics that drive murder rates, missing data on key variables in key states, the tyranny of a few outlier states and years, weak to nonexistent tests of concurrent effects of incarceration, inadequate instruments to disentangle statistical confounding of murder rates with death sentences and other punishments, failure to consider the general performance of the criminal justice system as a competing deterrent, artifactual results from truncated time frames, and the absence of any direct test of the components of contemporary theoretical constructions of deterrence. Re-analysis of one of the data sets shows that even simple adjustments to the data produce contradictory results, while alternate statistical methods produce contrary estimates.
    But the central mistake in this enterprise is one of causal reasoning: the attempt to draw causal inferences from a flawed and limited set of observational data, the absence of direct tests of the moving parts of the deterrence story, and the failure to address important competing influences on murder. There is no reliable, scientifically sound evidence that pits execution against a robust set of competing explanations to identify whether it exerts a deterrent effect that is uniquely and sufficiently powerful to overwhelm the recurring epidemic cycles of murder. This and other rebukes remind us to invoke tough, neutral social-science standards and commonsense causal reasoning before expanding the use of execution, with its attendant risks and costs.
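    The specification-sensitivity problem the article describes can be illustrated with a toy omitted-variable example: when a driver of murder rates (such as a drug-epidemic measure) is correlated with the regressor of interest and left out of the model, the estimated coefficient can change sign. The data and names below are synthetic and hypothetical, not the article's.

```python
import numpy as np

# Toy illustration (synthetic data): omitting a relevant covariate z that
# is correlated with the regressor of interest x can flip the sign of x's
# estimated coefficient. z stands in for an omitted driver of the outcome
# (e.g., a drug-epidemic measure); all names here are hypothetical.
rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)               # regressor of interest
z = x + 0.3 * rng.normal(size=n)     # confounder correlated with x
y = 0.5 * x - 2.0 * z + rng.normal(scale=0.5, size=n)

def ols(columns, y):
    """Least-squares fit with an intercept; returns the coefficient vector."""
    X = np.column_stack([np.ones(len(y))] + list(columns))
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

full = ols([x, z], y)   # correctly specified model
short = ols([x], y)     # z omitted: x's slope absorbs z's effect

print(full[1], short[1])
```

    In the full model the slope on `x` recovers its true positive value, while the misspecified model yields a large negative slope, since the omitted-variable bias term is roughly `-2 * cov(z, x) / var(x)`. This is the mechanism by which "simple adjustments to the data produce contradictory results".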

    Knowledge Reuse for Customization: Metamodels in an Open Design Community for 3d Printing

    Theories of knowledge reuse posit two distinct processes: reuse for replication and reuse for innovation. We identify another distinct process, reuse for customization. Reuse for customization is a process in which designers manipulate the parameters of metamodels to produce models that fulfill their personal needs. We test hypotheses about reuse for customization in Thingiverse, a community of designers that shares files for three-dimensional printing. 3D metamodels are reused more often than the 3D models they generate, and the reuse of metamodels is amplified when the metamodels are created by designers with greater community experience. Metamodels make the community's design knowledge available for reuse for customization, or for further extension of the metamodels, a kind of reuse for innovation.
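    The metamodel/model distinction above can be sketched in a few lines: a metamodel exposes named parameters, and each customization of those parameters derives a concrete model. This is a minimal hypothetical illustration, not Thingiverse's actual file format or API.

```python
# Hypothetical sketch of reuse for customization: a shared parametric
# design (the metamodel) is reused by filling in personal parameter
# values, each yielding a distinct concrete model.
from dataclasses import dataclass

@dataclass(frozen=True)
class BracketMetamodel:
    """Parametric L-bracket: designers tweak parameters, not geometry."""
    default_width_mm: float = 20.0
    default_hole_d_mm: float = 4.0

    def customize(self, width_mm=None, hole_d_mm=None):
        # Any parameter left unset falls back to the metamodel's default.
        return {
            "width_mm": width_mm if width_mm is not None else self.default_width_mm,
            "hole_d_mm": hole_d_mm if hole_d_mm is not None else self.default_hole_d_mm,
        }

meta = BracketMetamodel()
model_a = meta.customize(width_mm=25.0)   # one designer's need
model_b = meta.customize(hole_d_mm=5.0)   # another designer's need
print(model_a, model_b)
```

    The point of the pattern is that the community shares one `BracketMetamodel` while each derived model embodies a personal need; extending the metamodel itself with new parameters would be the "reuse for innovation" case.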

    Interweaving Objects, Gestures, and Talk in Context

    In a large French hospital, a group of professional experts (including physicians and software engineers) is working on the computerization of a blood-transfusion traceability device. Focusing on a particular moment in this slow design process, we analyze their collaborative practices during a work session. The analysis takes a praxeological and interactionist approach and is inspired by current discussions of the role of artifacts in social practices within several research frameworks: activity theory, distributed cognition, conversation analysis, and actor-network theory. After a brief presentation of the place of objects and artifacts in these approaches to action and human cognition, we show how the collective activity analyzed here is generated by the interweaving of discursive, gestural, and artifactual resources.