63 research outputs found

    Complexity Heliophysics: A lived and living history of systems and complexity science in Heliophysics

    Full text link
    In this piece we study complexity science in the context of Heliophysics, describing it not as a discipline, but as a paradigm. In the context of Heliophysics, complexity science is the study of a star, interplanetary environment, magnetosphere, upper and terrestrial atmospheres, and planetary surface as interacting subsystems. Complexity science studies entities in a system (e.g., electrons in an atom, planets in a solar system, individuals in a society), their interactions, and the nature of what emerges from these interactions. It is a paradigm that employs systems approaches and is inherently multi- and cross-scale. Heliophysics processes span at least 15 orders of magnitude in space and another 15 in time, and their reach extends well beyond our own solar system and Earth's space environment to touch planetary, exoplanetary, and astrophysical domains. It is an uncommon domain within which to explore complexity science. After first outlining the dimensions of complexity science, the review proceeds in three epochal parts: 1) a pivotal year in the Complexity Heliophysics paradigm, 1996; 2) the transitional years that established the foundations of the paradigm (1996-2010); and 3) the emergent literature, largely beyond 2010. This review article excavates the lived and living history of complexity science in Heliophysics. The intention is to provide inspiration, help researchers think more coherently about ideas of complexity science in Heliophysics, and guide future research. It will be instructive not only to Heliophysics researchers, but to any reader interested in or hoping to advance the frontier of systems and complexity science.

    New product development portfolio management: a systematic literature review

    Get PDF
    Product innovation is a key driver of any company’s growth. The biggest challenge in managing product innovation is determining the most promising new product development (NPD) projects from the many ideas generated, a task known as portfolio management. In practice, NPD portfolio management still suffers from several problems, including a focus on portfolio selection rather than on managing the entire process, vague links between the process and business strategy, and a lack of formal process. Therefore, a study that looks at NPD portfolio management from different perspectives is required. NPD portfolio management deals with dynamic decision-making processes, involving not only selection decisions, but also decisions to delay, continue or even terminate projects. To understand this integrative process, a systematic literature review was carried out across four knowledge domains: NPD portfolio management, decision-making, strategy and organisational routines. It covered 40 articles published between 1981 and 2012. The review focused on revealing how decision-making processes in NPD portfolio management are conducted and how they relate to the strategy process and organisational routines. The key findings show that decisions in the NPD portfolio management process are made through the interaction of cognitive and political factors, overlooking organisational factors in the process. Furthermore, the extant literature does not explicitly explain how to link the NPD portfolio management process to the strategy process. The findings also indicate that the concept of organisational routines had not been used when investigating NPD portfolio management. These research gaps led to three research questions: 1) How are organisational factors involved with the cognitive and political factors in the decision-making processes in NPD portfolio management? 2) How do the decision-making processes in NPD portfolio management link to the business strategy? 3) To what extent are organisational routines related to the decision-making processes in NPD portfolio management?

    Model Averaging and its Use in Economics

    Get PDF
    The method of model averaging has become an important tool for dealing with model uncertainty, for example in situations where a large number of competing theories exist, as is common in economics. Model averaging is a natural and formal response to model uncertainty in a Bayesian framework, and most of the paper deals with Bayesian model averaging. The important role of the prior assumptions in these Bayesian procedures is highlighted. In addition, frequentist model averaging methods are also discussed. Numerical methods to implement these procedures are explained, and I point the reader to some freely available computational resources. The main focus is on uncertainty regarding the choice of covariates in normal linear regression models, but the paper also covers other, more challenging, settings, with particular emphasis on sampling models commonly used in economics. Applications of model averaging in economics are reviewed and discussed across a wide range of areas, including growth economics, production modelling, finance and the forecasting of macroeconomic quantities.
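
    A brief sketch of the Bayesian model averaging machinery the abstract refers to (standard textbook form; the notation is assumed here, not taken from the paper): given candidate models $M_1,\dots,M_K$ and data $y$, inference on a quantity of interest $\Delta$ averages the model-specific posteriors, weighted by posterior model probabilities obtained from the marginal likelihoods:

    \[
    p(\Delta \mid y) = \sum_{k=1}^{K} p(\Delta \mid M_k, y)\, p(M_k \mid y),
    \qquad
    p(M_k \mid y) = \frac{p(y \mid M_k)\, p(M_k)}{\sum_{j=1}^{K} p(y \mid M_j)\, p(M_j)},
    \]

    where $p(y \mid M_k) = \int p(y \mid \theta_k, M_k)\, p(\theta_k \mid M_k)\, d\theta_k$. In the covariate-selection setting the paper emphasises, a normal linear regression with $p$ candidate covariates yields $K = 2^p$ models, which is why the priors over models and parameters play such an important role.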

    Congress UPV Proceedings of the 21st International Conference on Science and Technology Indicators

    Get PDF
    This is the book of proceedings of the 21st Science and Technology Indicators Conference, which took place in València (Spain) from the 14th to the 16th of September 2016. This year's conference theme, ‘Peripheries, frontiers and beyond’, aimed to study the development and use of Science, Technology and Innovation indicators in spaces that have not been the focus of current indicator development, for example the Global South or the Social Sciences and Humanities. The exploration of the margins and beyond proposed by the theme brought to the STI Conference an interesting array of new contributors from a variety of fields and geographies. This year's conference had a record 382 registered participants from 40 different countries: 23 European, 9 American, 4 Asia-Pacific, and 4 from Africa and the Near East. About 26% of participants came from outside Europe. There were also many participants (17%) from organisations outside academia, including governments (8%), businesses (5%), foundations (2%) and international organisations (2%). This is particularly important in a field that is practice-oriented. The chapters of the proceedings attest to the breadth of issues discussed: infrastructure, benchmarking and use of innovation indicators, societal impact and mission-oriented research, mobility and careers, social sciences and the humanities, participation and culture, gender, and altmetrics, among others. We hope that the diversity of this Conference has fostered productive dialogues and synergistic ideas and made a contribution, small as it may be, to the development and use of indicators that, being more inclusive, will foster a fairer and more inclusive world.

    Study on open science: The general state of the play in Open Science principles and practices at European life sciences institutes

    Get PDF
    Nowadays, open science is a hot topic at all levels and is one of the priorities of the European Research Area. Components that are commonly associated with open science are open access, open data, open methodology, open source, open peer review, open science policies and citizen science. Open science may have great potential to connect and influence the practices of researchers, funding institutions and the public. In this paper, we evaluate the level of openness based on public surveys at four European life sciences institutes.

    Coopetition in an open-source way: lessons from mobile and cloud computing infrastructures

    Get PDF
    An increasing amount of technology is no longer developed in-house. Instead, we are in a new age in which technology is developed by a networked community of individuals and organizations, who base their relations to each other on mutual interest. Advances arising from research on platforms, ecosystems, and infrastructures can provide valuable knowledge for better understanding and explaining technology development among a network of firms. More surprisingly, recent research suggests that technology can be jointly developed by rival competing firms in an open-source way. For instance, it is known that the mobile device makers Apple and Samsung continued collaborating in open-source projects while running expensive patent wars in the courts. Building on multidisciplinary theory in open-source software, cooperation among competitors (aka coopetition) and digital infrastructures, I (and my coauthors) explored how rival firms cooperate in the joint development of open-source infrastructures. While assimilating a wide variety of paradigms and analytical approaches, this doctoral research combined the qualitative analysis of naturally occurring data (QA) with the mining of software repositories (MSR) and social network analysis (SNA) within a set of case studies. Turning to the mobile and cloud computing industries in general, and the WebKit and OpenStack open-source infrastructures in particular, we found that qualitative ethnographic materials, combined with social network visualizations, provide a rich medium that enables a better understanding of the competitive and cooperative issues that are simultaneously present and interconnected in open-source infrastructures. Our research contributes to the managerial literature on coopetition strategy, but more importantly to Information Systems, by addressing both cooperation and competition within the development of highly networked open-source infrastructures.
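
    A minimal sketch of how MSR and SNA can be combined in this spirit (an illustration of the general technique, not the dissertation's actual pipeline; the author emails and file paths below are made-up stand-ins): mine commit authorship from a repository's history and project it into a developer network.

    import networkx as nx

    # (author, file) pairs; in practice these would be mined from a repository,
    # e.g. via `git log --name-only --pretty=format:%ae`. The records below are
    # illustrative stand-ins, not data from the studied projects.
    commits = [
        ("alice@example.org", "Source/WebCore/page/Page.cpp"),
        ("bob@example.com", "Source/WebCore/page/Page.cpp"),
        ("bob@example.com", "Source/JavaScriptCore/jit/JIT.cpp"),
        ("carol@example.org", "Source/JavaScriptCore/jit/JIT.cpp"),
        ("alice@example.org", "Source/WTF/wtf/Vector.h"),
    ]

    # Bipartite author-file graph; the projection links two developers
    # whenever they have touched the same file.
    G = nx.Graph()
    for author, path in commits:
        G.add_edge(("dev", author), ("file", path))

    devs = [n for n in G if n[0] == "dev"]
    P = nx.bipartite.weighted_projected_graph(G, devs)

    # Degree centrality can hint at brokers bridging contributors of rival firms.
    for (_, dev), c in sorted(nx.degree_centrality(P).items()):
        print(dev, round(c, 2))

    In the thesis, such network views were read alongside qualitative ethnographic material; this sketch only shows the quantitative half.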

    Industry competition in the long-run with particular reference to the petrochemical industry

    Get PDF

    Development of Diffusion MRI Methodology to Quantify White Matter Integrity Underlying Post-Stroke Anomia

    Get PDF
    In 1909 the German neurologist Korbinian Brodmann wrote, “functional localization of the cerebral cortex without the lead of anatomy is impossible... In all domains, physiology has its firmest foundations in anatomy” [1]. While histology is the current gold standard for studying brain microstructure, it is primarily a post-mortem technique with an average resolution of one micrometer, making it impractical for studying the entire brain. Diffusion Magnetic Resonance Imaging (dMRI) is ideally suited to studying whole-brain tissue microstructure because it sensitizes the MRI contrast to water diffusion, which has a length scale on the order of micrometers. Even though dMRI is applied clinically for the detection of acute ischemia, the relation between tissue microstructure and the dMRI signal is complex and not fully understood. The focus of this dissertation was the validation and development of a new biophysical model of the dMRI signal. Nevertheless, it is important to keep in mind the potential clinical applications of these models, so in parallel we studied the relationship between white matter integrity and language impairments in post-stroke anomia. This application is of interest because response to language treatment is variable and it is currently difficult to predict which patients will benefit; a better understanding of the underlying brain damage could help inform on functionality and recovery potential. Our work resulted in 9 peer-reviewed papers in international journals and 13 abstracts in proceedings at national and international conferences. Using data collected from 32 chronic stroke patients with language impairments, we studied the relation between baseline naming impairments and the microstructural integrity of the residual white matter. An existing dMRI technique, Diffusional Kurtosis Imaging (DKI), was used to assess tissue microstructure along the length of two major white matter bundles: the Inferior Longitudinal Fasciculus (ILF) and the Superior Longitudinal Fasciculus (SLF). The frequency of semantic paraphasias was strongly associated with ILF axonal loss, whereas phonemic paraphasias were strongly associated with SLF axonal loss. This double dissociation between semantic and phonological processing is in agreement with the dual stream model of language processing and corroborates the concept that, during speech production, knowledge association (semantics) depends on the integrity of ventral pathways (ILF), whereas form encoding (phonological encoding) is more localized to dorsal pathways (SLF). Using a smaller dataset of 8 chronic stroke subjects who underwent speech entrainment therapy, we assessed whether naming improvements were supported by underlying changes in microstructure. Remarkably, we saw that a decrease in semantic errors during confrontational naming was related to a renormalization of the microstructure of the ILF. Together, these two studies support the idea that white matter integrity (in addition to regional gray matter damage) impacts baseline stroke impairments and disease progression. Acquiring accurate information about a patient’s linguistic disorder and the underlying neuropathology is often integral to developing an appropriate intervention strategy. However, DKI metrics describe the general physical process of diffusion, which can be difficult to interpret biologically. Different pathological processes could lead to similar DKI changes, further complicating interpretation and possibly decreasing its specificity to disease.
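
    For reference, the standard DKI signal representation behind these metrics (a textbook form, not specific to this dissertation) expands the log-signal to second order in the diffusion weighting $b$ along a given gradient direction:

    \[
    \ln S(b) = \ln S_0 - b\,D + \tfrac{1}{6}\, b^2 D^2 K,
    \]

    where $D$ is the apparent diffusion coefficient and $K$ the apparent diffusional kurtosis along that direction; $K$ captures the non-Gaussianity of diffusion but, as noted above, maps onto many possible microstructural causes.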
    A multitude of biophysical models have been developed to improve the specificity of dMRI. Due to the complexity of biological tissue, assumptions are necessary, and these can differ in stringency depending on the dMRI data at hand. One such assumption is that axons can be approximated by water confined to impermeable thin cylinders. In this dissertation, we provide evidence for this “stick model”. Using data from 2 healthy controls, we show that the dMRI signal decay behaves as predicted by theory, particularly at strong diffusion weightings. This work validated the foundation of a biophysical model known as Fiber Ball Imaging (FBI), which allows for the calculation of the angular dependence of fiber bundles. Here, we extend FBI by introducing Fiber Ball White Matter (FBWM) modeling, which in addition provides estimates of the Axonal Water Fraction (AWF) and compartmental diffusivities. The ability to accurately estimate compartment-specific diffusion dynamics could provide the opportunity to distinguish between disease processes that affect axons differently than the extra-axonal environment (e.g. gliosis). Lastly, we were able to show that FBI data can also be used to calculate compartmental transverse relaxation times (T2). These metrics can be used as biomarkers, aid in the calculation of myelin content, or be used to reduce bias in diffusion modeling metrics. Future work should focus on the application of FBI and FBWM to the study of white matter in post-stroke anomia. Since FBWM offers the advantage of isolating the diffusion dynamics of the intra- and extra-axonal environments, it could be used to distinguish between pathological processes such as glial cell infiltration and axonal degeneration. A more specific assessment of the structural integrity underlying anomia could provide information on an individual’s recovery potential and could pave the way for more targeted treatment strategies. The isolation of intra-axonal water is also beneficial for a technique known as dMRI tractography, which delineates the pathways of fiber bundles in the brain. dMRI tractography is a popular research tool for studying brain networks, but it is notoriously challenging to perform in post-stroke brains. In damaged brain tissue, the high extracellular water content masks the directionality of fibers; however, since FBI provides the orientational dependence of solely the intra-axonal water, it is not affected by this phenomenon. Caution should nevertheless be taken when applying biophysical models (FBWM/FBI vs. DKI) to the diseased brain, as the validation provided in this work was only for healthy white matter; these experiments should be repeated in pathological white matter.
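
    As a worked illustration of the “stick model” prediction referred to above (a standard result in the literature, restated in our own notation rather than quoted from the dissertation): if axons are impermeable thin cylinders with intra-axonal diffusivity $D_a$, a fiber at angle $\theta$ to the gradient direction contributes $e^{-b D_a \cos^2\theta}$, and averaging over uniformly distributed orientations gives

    \[
    \bar{S}(b)/S_0 = \int_0^1 e^{-b D_a x^2}\, dx
    = \sqrt{\frac{\pi}{4\, b\, D_a}}\;\operatorname{erf}\!\left(\sqrt{b\, D_a}\right)
    \;\longrightarrow\; \sqrt{\frac{\pi}{4\, b\, D_a}} \quad (b D_a \gg 1),
    \]

    so the direction-averaged signal decays as $b^{-1/2}$ at strong diffusion weightings, which is the asymptotic behavior that FBI builds on and against which the healthy-control data were tested.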

    Empirical essays on global pharmaceutical innovation

    Get PDF
    The economic impact of the pharmaceutical industry is incontestable. The total drug bill across OECD countries has risen continuously over the last decades (OECD, 2011), and, consequently, healthcare payers are increasingly implementing stricter policies that promote access to cheaper treatments. These policies, however, may hinder incentives to innovate, especially in diseases of substantial public health importance for which there is either under-investment or where innovation is difficult. Therefore, it is fundamental that health systems design policies that strike the right balance between promoting the development of affordable drugs and allowing sufficient rents to innovators in order to incentivise R&D investment. The debate on this balance has never been more pertinent, with a slowdown in the number of drugs in the pipeline for potential market launch. Around 90% of drug candidates do not successfully complete the mid-stage of drug discovery (Paul et al., 2010; Mestre-Ferrandiz et al., 2012), contributing to the escalation of R&D costs with potentially significant social welfare consequences. The aim of this thesis is to contribute to this debate by exploring the nature of the R&D process and assessing the factors associated with decreased productivity across disease areas, together with its equity implications. We survey the literature on the determinants of pharmaceutical innovation and critically appraise the evidence on factors that influence the innovation of new therapies. We identify gaps and contribute conceptually to the understanding of the determinants of innovation. We depart from the existing literature in three significant ways. The major contribution is the analysis of failure at all stages of the R&D process, using a unique global panel dataset built by merging data on the lifecycle of industry innovation processes with global health data (Chapter 4). Secondly, we use methodological approaches that model the dynamic nature of R&D decisions to forecast drug availability in the coming decades (Chapter 5). Thirdly, we are the first to assess the global impact of innovation on equity of access to new therapies (Chapter 6). Acknowledging the implications of data limitations (Chapter 3), we produce insights that contribute to understanding the determinants of failure in innovation and their implications for the future availability of new therapies. Results suggest that these determinants differ across the R&D stages. Furthermore, market competition may intensify the level of failure if too many young drugs are competing in the market, whilst collaboration between firms has an unclear effect on innovation. Moreover, the distribution of R&D activity and disease burden has not changed significantly over the last two decades, with a concentration of innovation in more commercially attractive disease areas associated with high mortality in the richest countries. Finally, these equity concerns are likely to persist unless new policy interventions are designed to address inequalities in R&D and access to new therapies.

    Dancing on the shoulders of giants: knowledge dynamics of renewable energy technologies

    Get PDF