8 research outputs found

    Supply chains need to develop immunity to natural disasters

    The major disruptions of the global economy are a function of events that are part of the natural evolution of supply chains, writes Rob Handfield.

    Using not-for-profit innovation networks to transition new technologies across the Valley of Death

    Purpose: This paper seeks to answer the question: what are the relevant factors that allow not-for-profit innovation networks to successfully transition new technologies from proof of concept to commercialisation?

    Design/methodology/approach: This question is examined using the knowledge-based view and network orchestration theory. Data are collected from 35 interviews with managers and engineers working within the seven centres that comprise the High Value Manufacturing Catapult (HVMC). These centres constitute a not-for-profit innovation network where suppliers, customers and competitors collaborate to help transition new technologies across the “Valley of Death” (the gap between establishing a proof of concept and commercialisation).

    Findings: Network orchestration theory suggests that a hub firm facilitates the exchange of knowledge amongst network members (knowledge mobility), enabling these members to profit from innovation (innovation appropriability). The hub firm ensures positive network growth and also allows for the entry and exit of network members (network stability). This study of not-for-profit innovation networks suggests that the role of a network orchestrator is to help ensure that intellectual property becomes a public resource that enhances the productivity of the domestic economy. The authors observed how network stability was achieved by the HVMC's seven centres employing a loosely coupled hybrid network configuration. This configuration nevertheless ensured that new technology development teams, comprising suppliers, customers and competitors, remained tightly coupled to enable co-development of innovative technologies. Matching internal technical and sectoral expertise with complementary experience from network members allowed knowledge to flow across organisational boundaries and throughout the network. Matrix organisational structures and distributed decision-making authority created opportunities for knowledge integration to occur. Actively moving individuals and teams between centres also helped to diffuse knowledge to network members, while regular meetings between senior management ensured network coordination and removed resource redundancies.

    Originality/value: The study contributes to knowledge-based theory by moving beyond existing understanding of knowledge integration in firms, identifying how knowledge is exchanged and aggregated within not-for-profit innovation networks. The findings contribute to network orchestration theory by challenging the notion that network orchestrators should enact and enforce appropriability regimes (patents, licences, copyrights) to allow members to profit from innovations. Instead, the authors find that not-for-profit innovation networks can overcome the frictions that appropriability regimes often create when exchanging knowledge during new technology development. This is achieved by pre-defining the terms of network membership/partnership and setting out clear pathways for innovation scaling, which embodies newly generated intellectual property as a public resource. The findings inform a framework that is useful for policy makers, academics and managers interested in using not-for-profit networks to transition new technologies across the Valley of Death.

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that covers a variety of research fields, such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium, consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article/s. The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title and title/abstract-based search engines for relevant articles in biomedical research.
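    One of the baseline methods named above, Term Frequency-Inverse Document Frequency, can be sketched in a few lines of Python. This is a minimal illustration of TF-IDF weighting with cosine-similarity ranking against a seed document; the toy corpus and token lists are invented for illustration and are not drawn from the RELISH data.

    ```python
    import math
    from collections import Counter

    def tfidf_vectors(docs):
        """Build sparse TF-IDF vectors (dicts) for a corpus of tokenised documents."""
        n = len(docs)
        df = Counter()                      # document frequency of each term
        for doc in docs:
            df.update(set(doc))
        idf = {t: math.log(n / df[t]) for t in df}
        return [{t: tf * idf[t] for t, tf in Counter(doc).items()} for doc in docs]

    def cosine(a, b):
        """Cosine similarity between two sparse vectors represented as dicts."""
        dot = sum(w * b.get(t, 0.0) for t, w in a.items())
        na = math.sqrt(sum(w * w for w in a.values()))
        nb = math.sqrt(sum(w * w for w in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    corpus = [
        "document similarity detection in biomedical literature".split(),   # seed
        "benchmarking document similarity methods for literature search".split(),
        "supply chain disruption and geopolitical risk".split(),
    ]
    vecs = tfidf_vectors(corpus)
    # Rank the candidate documents by similarity to the first ("seed") document.
    scores = [cosine(vecs[0], v) for v in vecs[1:]]
    ```

    The first candidate shares several weighted terms with the seed and scores higher than the unrelated one; production systems such as the BM25 baseline refine this scheme with term-saturation and length normalisation.
    
    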

    Preparing for the Era of the Digitally Transparent Supply Chain: A Call to Research in a New Kind of Journal

    We are commencing a new era in global trade: the era of the digitized supply chain […]

    Redesigning global supply chains during compounding geopolitical disruptions: the role of supply chain logics

    Purpose: Why do managers redesign global supply chains in a particular manner when faced with compounding geopolitical disruptions? In answering this research question, our study identifies a constrained system of reasoning (a decision-making logic) employed by managers when they redesign their supply chains in situations of heightened uncertainty.

    Design/methodology/approach: We conducted 40 elite interviews with senior supply chain executives in 28 companies across nine industries from November 2019 to June 2020, when the United Kingdom was preparing to leave the European Union, the US-China trade war was escalating, and Covid-19 was spreading rapidly around the globe.

    Findings: When redesigning global supply chains, we find that managerial decision-making logic is constrained by three distinct environmental conditions: (1) the perceived intensity of institutional pressures; (2) the relative mobility of suppliers and supply chain assets; and (3) the perceived severity of the potential disruption risk. Intense government pressure and persistent geopolitical risk tend to impact firms in the same industry, resulting in similar approaches to decision-making regarding supply chain design. However, where suppliers are relatively immobile and supply chain assets are relatively fixed, a dominant logic is consistently present.

    Originality/value: Building on an institutional logics perspective, our study finds that managerial decision-making under heightened uncertainty is guided not solely by institutional pressures but also by perceptions of supply chain disruption risk and the immobility of supply chain assets. These findings support the theoretical development of a novel construct that we term ‘supply chain logics’. Finally, our study provides a decision-making framework for senior executives competing in an increasingly complex and unstable business environment.
