    Long-Term Growth Driven by a Sequence of General Purpose Technologies

    We present a Schumpeterian model of endogenous growth with General Purpose Technologies (GPTs) that captures two important historical stylized facts: first, GPTs have arrived at an increasing frequency from the beginning of mankind until today, and second, every GPT has depended heavily on previous technologies. In our model, the arrival of GPTs is endogenous and occurs stochastically, with a probability that depends on the currently available stock of applied knowledge. This way of endogenizing the arrival of new GPTs yields a model that is more in tune with historical reality than the existing GPT models.

    Keywords: Schumpeterian growth; research and development; general purpose technologies
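
    As a rough illustration of this mechanism, the toy simulation below draws GPT arrivals from a per-period hazard proportional to the applied knowledge stock. The functional forms and parameter values are assumptions for illustration, not taken from the paper; because the knowledge stock keeps growing, inter-arrival gaps shrink, reproducing the first stylized fact.

```python
# Minimal sketch, under assumed functional forms: the arrival hazard of the
# next GPT is proportional to the applied knowledge stock A, which grows
# over time, so arrivals accelerate and each GPT rests on accumulated
# knowledge. All parameters below are illustrative assumptions.
import random

random.seed(1)

A = 1.0        # applied knowledge stock (assumed initial level)
g = 0.01       # per-period growth rate of applied knowledge (assumption)
phi = 0.0005   # hazard scale: per-period arrival probability = phi * A (assumption)

arrivals = []
for t in range(600):
    A *= 1.0 + g
    if random.random() < min(1.0, phi * A):
        arrivals.append(t)

gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
print("GPT arrival periods:", arrivals)
print("inter-arrival gaps:", gaps)  # gaps shrink: GPTs arrive ever faster
```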

    The Calm Before the Storm? - Anticipating the Arrival of General Purpose Technologies

    This paper presents a Schumpeterian quality-ladder model incorporating the impact of new General Purpose Technologies (GPTs). GPTs are breakthrough technologies with a wide range of applications that open up new innovational complementarities. In contrast to most existing models, which focus on the events after the arrival of a new GPT, the model developed in this paper focuses on the events before the arrival, assuming that R&D firms know the arrival date and the technological impact of this drastic innovation. In this framework we show that the economy goes through three main phases: first, the economy is in its old steady state; second, there are transitional dynamics; and finally, the economy reaches a new steady state with higher growth rates. The transitional dynamics are characterized by oscillating cycles. Shortly before the arrival of a new GPT, R&D activity and growth rise even beyond their old steady-state levels, and immediately before the arrival there is a large slump in R&D activities using the old GPT.

    Keywords: Schumpeterian growth, research and development, general purpose technologies
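
    The slump immediately before the arrival has a simple intuition, which the toy calculation below illustrates under assumed functional forms (the profit flow pi_flow, discount rate r, and arrival date T are hypothetical; the paper's full model also generates the pre-arrival boom, which this sketch does not capture): if innovations on the old quality ladder become obsolete at the known date T, their discounted value collapses as t approaches T.

```python
# Toy sketch, not the paper's model: the value of an innovation on the old
# GPT's quality ladder is a profit flow pi discounted at rate r, earned
# only until the new GPT arrives at the known date T:
#   V(t) = (pi / r) * (1 - exp(-r * (T - t)))
import math

pi_flow, r, T = 1.0, 0.05, 50.0   # hypothetical parameters

def old_gpt_innovation_value(t):
    # Discounted profits are truncated at the anticipated arrival date T.
    return (pi_flow / r) * (1.0 - math.exp(-r * (T - t)))

for t in [0, 25, 40, 45, 49, 50]:
    print(f"t = {t:>2}: V = {old_gpt_innovation_value(t):6.2f}")
# V falls toward zero as t -> T, so R&D aimed at the old GPT collapses
# immediately before the new GPT arrives, matching the described slump.
```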

    Evaluation of Proclarix in the diagnostic work‐up of prostate cancer

    Objectives: The use of multiparametric magnetic resonance imaging (mpMRI) has been widely adopted in the diagnostic work‐up of suspected prostate cancer (PCa) and is recommended in most current guidelines. However, mpMRI lesions are often indeterminate and/or turn out to be false‐positive on prostate biopsy. The aim of this work was to evaluate Proclarix, a biomarker test for the detection of relevant PCa, regarding its diagnostic value in all men before biopsy and in men with indeterminate lesions on mpMRI (PI‐RADS 3) during the work‐up for PCa.

    Materials and Methods: Men undergoing mpMRI‐targeted and systematic biopsy of the prostate were prospectively enrolled. The Proclarix test was evaluated for its accuracy in detecting clinically significant PCa (csPCa), defined as Grade Group ≥ 2, and for its association with mpMRI results. Proclarix was also adapted to prostate volume (Proclarix density) and its performance compared with that of PSA density (PSAD).

    Results: A total of 150 men with a median age of 65 years and a median PSA of 5.8 ng/mL were included in this study. CsPCa was diagnosed in 65 (43%) men. Proclarix was significantly associated with csPCa and with higher PI‐RADS scores (p < 0.001). At the pre‐defined cut‐off of 10%, Proclarix's sensitivity for csPCa was 94%, specificity 19%, negative predictive value 80% and positive predictive value 47%. Proclarix density showed the highest AUC for the detection of csPCa, 0.77 (95% CI: 0.69–0.85), compared with PSA, PSAD and Proclarix alone. Proclarix identified all six csPCa cases in men with PI‐RADS 3 lesions (n = 28), whereas PSAD missed two of the six. At optimized cut‐offs, Proclarix density outperformed PSAD by potentially avoiding 41% of unnecessary biopsies.

    Conclusion: Proclarix demonstrates high sensitivity in detecting csPCa but may still lead to unnecessary biopsies. Proclarix density, however, outperformed both PSAD and Proclarix alone and was found to be useful in men with PI‐RADS 3 findings, safely avoiding unnecessary biopsies without missing csPCa.
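
    As a quick arithmetic check, the reported predictive values follow from the reported sensitivity, specificity, and csPCa prevalence via a 2×2 table (counts are rounded to whole men; all inputs come from the abstract):

```python
# Consistency check: 65 of 150 men had csPCa; at the 10% cut-off, Proclarix
# had sensitivity 94% and specificity 19%. Bayes' rule on the implied 2x2
# table recovers the reported predictive values.
n, n_pos = 150, 65
sens, spec = 0.94, 0.19

tp = round(sens * n_pos)          # 61 true positives
fn = n_pos - tp                   # 4 false negatives
tn = round(spec * (n - n_pos))    # 16 true negatives
fp = (n - n_pos) - tn             # 69 false positives

ppv = tp / (tp + fp)              # positive predictive value
npv = tn / (tn + fn)              # negative predictive value
print(f"PPV = {ppv:.0%}, NPV = {npv:.0%}")  # -> PPV = 47%, NPV = 80%
```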

    Neuronal diversity of the amygdala and the bed nucleus of the stria terminalis

    The amygdala complex is a diverse group of more than 13 nuclei, segregated into five major groups: the basolateral (BLA), central (CeA), medial (MeA), cortical (CoA), and basomedial (BMA) amygdala nuclei. These nuclei can be distinguished by their cytoarchitectonic properties, connectivity, genetic and molecular identity, and, most importantly, their functional role in animal behavior. The extended amygdala includes the CeA and the bed nucleus of the stria terminalis (BNST). The CeA and the BNST share a similar cellular organization, including common neuron types, reciprocal connectivity, and many overlapping downstream targets. In this section, we describe advances in our knowledge of neuronal diversity in the amygdala complex and the BNST, based on recent functional studies performed at the genetic, molecular, physiological, and anatomical levels in rodent models, especially rats and mice. Molecular and connectivity properties can be used separately, or in combination, to define neuronal populations, leading to a multiplexed neuronal diversity supporting different functional roles.

    Global, regional, and national burden of disorders affecting the nervous system, 1990–2021: a systematic analysis for the Global Burden of Disease Study 2021

    Background: Disorders affecting the nervous system are diverse and include neurodevelopmental disorders, late-life neurodegeneration, and newly emergent conditions such as cognitive impairment following COVID-19. Previous publications from the Global Burden of Diseases, Injuries, and Risk Factors Study estimated the burden of 15 neurological conditions in 2015 and 2016, but these analyses did not include neurodevelopmental disorders, as defined by the International Classification of Diseases (ICD)-11, or a subset of cases of congenital, neonatal, and infectious conditions that cause neurological damage. Here, we estimate nervous system health loss caused by 37 unique conditions and their associated risk factors globally, regionally, and nationally from 1990 to 2021.

    Methods: We estimated mortality, prevalence, years lived with disability (YLDs), years of life lost (YLLs), and disability-adjusted life-years (DALYs), with corresponding 95% uncertainty intervals (UIs), by age and sex in 204 countries and territories, from 1990 to 2021. We included morbidity and deaths due to neurological conditions, for which health loss is directly due to damage to the CNS or peripheral nervous system. We also isolated neurological health loss from conditions for which nervous system morbidity is a consequence, but not the primary feature, including a subset of congenital conditions (ie, chromosomal anomalies and congenital birth defects), neonatal conditions (ie, jaundice, preterm birth, and sepsis), infectious diseases (ie, COVID-19, cystic echinococcosis, malaria, syphilis, and Zika virus disease), and diabetic neuropathy. By conducting a sequela-level analysis of the health outcomes for these conditions, only cases where nervous system damage occurred were included, and YLDs were recalculated to isolate the non-fatal burden directly attributable to nervous system health loss. A comorbidity correction was used to calculate the total prevalence of all conditions that affect the nervous system combined.

    Findings: Globally, the 37 conditions affecting the nervous system were collectively ranked as the leading group cause of DALYs in 2021 (443 million, 95% UI 378–521), affecting 3·40 billion (3·20–3·62) individuals (43·1%, 40·5–45·9 of the global population); global DALY counts attributed to these conditions increased by 18·2% (8·7–26·7) between 1990 and 2021. Age-standardised rates of deaths per 100 000 people attributed to these conditions decreased from 1990 to 2021 by 33·6% (27·6–38·8), and age-standardised rates of DALYs attributed to these conditions decreased by 27·0% (21·5–32·4). Age-standardised prevalence was almost stable, with a change of 1·5% (0·7–2·4). The ten conditions with the highest age-standardised DALYs in 2021 were stroke, neonatal encephalopathy, migraine, Alzheimer's disease and other dementias, diabetic neuropathy, meningitis, epilepsy, neurological complications due to preterm birth, autism spectrum disorder, and nervous system cancer.

    Interpretation: As the leading cause of overall disease burden in the world, with increasing global DALY counts, disorders affecting the nervous system require effective prevention, treatment, and rehabilitation strategies.
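
    For readers unfamiliar with the GBD summary measures, the short sketch below illustrates, with hypothetical numbers, how DALYs combine YLLs and YLDs and how an age-standardised rate weights age-specific rates by a fixed standard population (the study itself works at sequela level, with uncertainty intervals and a comorbidity correction):

```python
# Minimal sketch of the two summary measures, with made-up numbers.
# DALYs add fatal and non-fatal burden; age-standardised rates weight
# age-specific rates by a fixed standard population so that countries
# with different age structures can be compared.
age_specific_rates = [5.0, 20.0, 120.0]    # deaths per 100 000 in three age bands (hypothetical)
standard_pop_weights = [0.35, 0.45, 0.20]  # standard-population shares (hypothetical; sum to 1)

age_standardised = sum(w * r for w, r in zip(standard_pop_weights, age_specific_rates))
print(f"age-standardised rate: {age_standardised:.1f} per 100 000")  # 34.8

ylls, ylds = 1.2e6, 0.8e6  # hypothetical years of life lost / lived with disability
dalys = ylls + ylds        # DALYs = YLLs + YLDs
print(f"DALYs: {dalys:,.0f}")  # 2,000,000
```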

    Decision Support Systems: Smart Workflows in the Context of Telecommunication Networks

    This thesis, conducted at Ericsson Software Research, investigates how workflows can be made more dynamic. It further aims to recommend how such workflows can be applied in the context of network operation centres (NOCs), covering the steps needed to automate trouble-ticket handling. An agent-based solution is considered, in which each agent implements a workflow governed by conventional methods, but the agents communicate in a way that allows new agents to appear, and existing ones to disconnect and reconnect, without disturbing other parts of the system. Furthermore, the artificial intelligence algorithms best suited to automatic information gathering are investigated, mainly for application in troubleshooting environments in telecommunication network management. The intention is to provide the workflows with the tools needed to automate routine tasks, in a way users can easily understand and follow. The conclusion here is that Naïve Bayesian networks are preferred: they are easy to train, scale well, and inference from a Naïve net is easy to understand intuitively. Finally, ZeroMQ is recommended when designing a decoupled workflow system.
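
    A minimal sketch of the recommended decoupled pattern, using ZeroMQ publish/subscribe via the pyzmq bindings (the topic name, port, and agent below are illustrative, not from the thesis): the publisher never tracks its subscribers, so agents can join late, leave, and reconnect without the rest of the system noticing.

```python
# Decoupled-workflow sketch with ZeroMQ PUB/SUB (illustrative names).
# Agents subscribe to a topic stream; they can appear and disappear
# without disturbing the publisher or each other.
import threading
import time
import zmq

ctx = zmq.Context.instance()

def ticket_agent():
    # A workflow agent: subscribes only to trouble-ticket events.
    sub = ctx.socket(zmq.SUB)
    sub.connect("tcp://127.0.0.1:5556")
    sub.setsockopt_string(zmq.SUBSCRIBE, "ticket.")
    for _ in range(3):
        print("agent received:", sub.recv_string())
    sub.close()

pub = ctx.socket(zmq.PUB)
pub.bind("tcp://127.0.0.1:5556")

agent = threading.Thread(target=ticket_agent)
agent.start()
time.sleep(0.5)  # give the late-joining subscriber time to connect

for i in range(10):  # publish regardless of who is listening
    pub.send_string(f"ticket.created id={i}")
    time.sleep(0.1)

agent.join()
pub.close()
ctx.term()
```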
