244 research outputs found

    Commercialising university inventions for sustainability—a case study of (non-)intermediating ‘cleantech’ at Aalto University

    Get PDF
    The challenge to transform towards more sustainable societies requires action on multiple levels, including the commercialisation of inventions created in universities. We examine intermediation in the pre-commercialisation phase of cleantech inventions developed at Aalto University, Finland, focusing on the activities of a university innovation intermediary, Aalto Centre for Entrepreneurship (ACE), and how it operationalises the sustainability aims of the university. The roles of ACE are discussed in the context of the university innovation ecosystem and through three cases of cleantech inventions. Surprisingly, we find that ACE has no operational means of integrating sustainability. The consideration of sustainability in commercialisation projects is case specific and fully dependent on other actors. As a result, we propose people- and process-oriented alternatives for how sustainability could be integrated into university innovation support functions. We also propose that innovation ecosystems should be broadened to include public actors for the benefit of co-creating for sustainability.

    AdaChain: A Learned Adaptive Blockchain

    Full text link
    This paper presents AdaChain, a learning-based blockchain framework that adaptively chooses the best permissioned blockchain architecture to optimize effective throughput for dynamic transaction workloads. AdaChain addresses a key challenge of Blockchain-as-a-Service (BaaS) environments, where a large variety of smart contracts are deployed with different workload characteristics. AdaChain adapts automatically to an underlying, dynamically changing workload through the use of reinforcement learning. When a promising architecture is identified, AdaChain switches from the current architecture to the promising one at runtime in a way that respects correctness and security concerns. Experimentally, we show that AdaChain converges quickly to optimal architectures under changing workloads and significantly outperforms fixed architectures in the number of successfully committed transactions, all while incurring low additional overhead.
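    The core idea — learn online which architecture maximizes effective throughput — can be sketched as a multi-armed bandit. This is a minimal illustration, not AdaChain's actual algorithm or configuration space: the architecture names, throughput numbers, and epsilon-greedy policy are all assumptions for the sketch.

```python
import random

# Hypothetical candidate architectures; names are illustrative only.
ARCHITECTURES = ["order-execute", "execute-order-validate", "simulate-order-validate"]

class ArchitectureSelector:
    """Epsilon-greedy bandit over candidate permissioned-blockchain
    architectures, rewarding observed effective throughput (committed tx/s)."""

    def __init__(self, arms, epsilon=0.1):
        self.arms = arms
        self.epsilon = epsilon
        self.counts = {a: 0 for a in arms}
        self.values = {a: 0.0 for a in arms}  # running mean reward per arm

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(self.arms)          # explore
        return max(self.arms, key=self.values.get)   # exploit best-so-far

    def update(self, arm, reward):
        # Incremental running-mean update.
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Simulated workload in which one architecture dominates.
random.seed(0)
true_throughput = {"order-execute": 900.0,
                   "execute-order-validate": 1200.0,
                   "simulate-order-validate": 700.0}
sel = ArchitectureSelector(ARCHITECTURES)
for _ in range(500):
    arm = sel.choose()
    reward = true_throughput[arm] + random.gauss(0, 50)  # noisy measurement
    sel.update(arm, reward)

best = max(sel.values, key=sel.values.get)
print(best)  # converges to the highest-throughput architecture
```

    The paper's setting is harder than this sketch — rewards shift as the workload changes, and switching architectures at runtime must preserve correctness — but the explore/measure/exploit loop is the same shape.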

    Challenges and advanced concepts for the assessment of learning and memory function in mice

    Get PDF
    The mechanisms underlying the formation and retrieval of memories are still an active area of research and discussion. Manifold models have been proposed and refined over the years, with most assuming a dichotomy between memory processes involving non-conscious and conscious mechanisms. Despite our incomplete understanding of the underlying mechanisms, tests of memory and learning count among the most frequently performed behavioral experiments. Here, we discuss available protocols for testing learning and memory using the example of the most prevalent animal species in research, the laboratory mouse. A wide range of protocols has been developed in mice to test, e.g., object recognition, spatial learning, procedural memory, sequential problem solving, operant and fear conditioning, and social recognition. These assays are carried out with individual subjects in apparatuses such as arenas and mazes, which allow for a high degree of standardization across laboratories and straightforward data interpretation, but are not without caveats and limitations. In animal research, there is growing concern about the translatability of study results and animal welfare, leading to novel approaches beyond established protocols. Here, we present some of the more recent developments and more advanced concepts in learning and memory testing, such as multi-step sequential lockboxes, assays involving groups of animals, and home cage-based assays supported by automated tracking solutions, and weigh their potential and limitations against those of established paradigms. Shifting the focus of learning tests from the classical experimental chamber to settings that are more natural for rodents comes with a new set of challenges for behavioral researchers, but also offers the opportunity to understand memory formation and retrieval in a more conclusive way than has been attainable with conventional test protocols.
We predict and embrace an increase in studies relying on methods involving a higher degree of automatization, more naturalistic and home cage-based experimental settings, and more integrated learning tasks. We are confident these trends will alleviate the burden on animal subjects and improve study designs in memory research.

    Gradient Descent in Materio

    Get PDF
    Deep learning, a multi-layered neural network approach inspired by the brain, has revolutionized machine learning. One of its key enablers has been backpropagation, an algorithm that computes the gradient of a loss function with respect to the weights in the neural network model, in combination with its use in gradient descent. However, the implementation of deep learning in digital computers is intrinsically wasteful, with energy consumption becoming prohibitively high for many applications. This has stimulated the development of specialized hardware, ranging from neuromorphic CMOS integrated circuits and integrated photonic tensor cores to unconventional, material-based computing systems. Learning in these material systems, which takes place, e.g., through artificial evolution or surrogate neural network modelling, is still a complicated and time-consuming process. Here, we demonstrate an efficient and accurate homodyne gradient extraction method for performing gradient descent on the loss function directly in the material system. We demonstrate the method in our recently developed dopant network processing units, where we readily realize all Boolean gates. This shows that gradient descent can in principle be fully implemented in materio using simple electronics, opening the way to autonomously learning material systems.
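    The homodyne (lock-in) principle behind the method can be illustrated in software: modulate each parameter with a small sinusoid at a distinct frequency, then recover each partial derivative by correlating the measured loss with the corresponding reference tone. This is a numerical sketch under stated assumptions — the quadratic test loss, the perturbation amplitude, and the tone frequencies are all illustrative choices, not the paper's hardware setup.

```python
import math

def homodyne_gradient(loss, w, amp=1e-3, n=4000, dt=1e-3):
    """Estimate dL/dw_i by lock-in detection: perturb each parameter
    with amp*sin(2*pi*f_i*t) at a distinct frequency f_i, multiply the
    observed loss by each reference tone, and time-average."""
    freqs = [37.0 + 11.0 * i for i in range(len(w))]  # distinct tones (Hz)
    grad = [0.0] * len(w)
    for k in range(n):
        t = k * dt
        # Apply all sinusoidal perturbations simultaneously.
        wm = [wi + amp * math.sin(2 * math.pi * f * t)
              for wi, f in zip(w, freqs)]
        y = loss(wm)
        for i, f in enumerate(freqs):
            # Lock-in: correlate the output with each reference tone.
            grad[i] += y * math.sin(2 * math.pi * f * t)
    # <L(w + a*sin) * sin> ~ (a/2) * dL/dw, so rescale the averages.
    return [2.0 * g / (n * amp) for g in grad]

# Check against a loss with a known gradient: L = (w0-1)^2 + (w1+2)^2,
# so at w = [0, 0] the analytic gradient is [-2, 4].
loss = lambda w: (w[0] - 1.0) ** 2 + (w[1] + 2.0) ** 2
g = homodyne_gradient(loss, [0.0, 0.0])
print(g)  # ≈ analytic gradient [-2.0, 4.0]
```

    The appeal in materio is that the "loss measurement" is a physical readout, so this multiply-and-average loop needs only simple electronics rather than a digital model of the device.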

    Use of Coronary Computed Tomographic Angiography to guide management of patients with coronary disease

    Get PDF
    Background In a prospective, multicenter, randomized controlled trial, 4,146 patients were randomized to receive standard care or standard care plus coronary computed tomography angiography (CCTA). Objectives The purpose of this study was to explore the consequences of CCTA-assisted diagnosis on invasive coronary angiography, preventive treatments, and clinical outcomes. Methods In post hoc analyses, we assessed changes in invasive coronary angiography, preventive treatments, and clinical outcomes using national electronic health records. Results Despite similar overall rates (409 vs. 401; p = 0.451), invasive angiography was less likely to demonstrate normal coronary arteries (20 vs. 56; hazard ratio [HR]: 0.39 [95% confidence interval (CI): 0.23 to 0.68]; p < 0.001) but more likely to show obstructive coronary artery disease (283 vs. 230; HR: 1.29 [95% CI: 1.08 to 1.55]; p = 0.005) in those allocated to CCTA. More preventive therapies (283 vs. 74; HR: 4.03 [95% CI: 3.12 to 5.20]; p < 0.001) were initiated after CCTA, with each drug commencing at a median of 48 to 52 days after clinic attendance. From the median time for preventive therapy initiation (50 days), fatal and nonfatal myocardial infarction was halved in patients allocated to CCTA compared with those assigned to standard care (17 vs. 34; HR: 0.50 [95% CI: 0.28 to 0.88]; p = 0.020). Cumulative 6-month costs were slightly higher with CCTA: difference $462 (95% CI: $303 to $621). Conclusions In patients with suspected angina due to coronary heart disease, CCTA leads to more appropriate use of invasive angiography and alterations in preventive therapies that were associated with a halving of fatal and nonfatal myocardial infarction. (Scottish COmputed Tomography of the HEART Trial [SCOT-HEART]; NCT01149590)

    Frontiers in Pigment Cell and Melanoma Research

    Full text link
    We identify emerging frontiers in clinical and basic research of melanocyte biology and its associated biomedical disciplines. We describe challenges and opportunities in clinical and basic research of normal and diseased melanocytes that impact current approaches to research in melanoma and the dermatological sciences. We focus on four themes: (1) clinical melanoma research, (2) basic melanoma research, (3) clinical dermatology, and (4) basic pigment cell research, with the goal of outlining current highlights, challenges, and frontiers associated with pigmentation and melanocyte biology. Significantly, this document encapsulates important advances in melanocyte and melanoma research including emerging frontiers in melanoma immunotherapy, medical and surgical oncology, dermatology, vitiligo, albinism, genomics and systems biology, epidemiology, pigment biophysics and chemistry, and evolution

    Coronary CT Angiography and 5-Year Risk of Myocardial Infarction.

    Get PDF
    BACKGROUND: Although coronary computed tomographic angiography (CTA) improves diagnostic certainty in the assessment of patients with stable chest pain, its effect on 5-year clinical outcomes is unknown. METHODS: In an open-label, multicenter, parallel-group trial, we randomly assigned 4146 patients with stable chest pain who had been referred to a cardiology clinic for evaluation to standard care plus CTA (2073 patients) or to standard care alone (2073 patients). Investigations, treatments, and clinical outcomes were assessed over 3 to 7 years of follow-up. The primary end point was death from coronary heart disease or nonfatal myocardial infarction at 5 years. RESULTS: The median duration of follow-up was 4.8 years, which yielded 20,254 patient-years of follow-up. The 5-year rate of the primary end point was lower in the CTA group than in the standard-care group (2.3% [48 patients] vs. 3.9% [81 patients]; hazard ratio, 0.59; 95% confidence interval [CI], 0.41 to 0.84; P=0.004). Although the rates of invasive coronary angiography and coronary revascularization were higher in the CTA group than in the standard-care group in the first few months of follow-up, overall rates were similar at 5 years: invasive coronary angiography was performed in 491 patients in the CTA group and in 502 patients in the standard-care group (hazard ratio, 1.00; 95% CI, 0.88 to 1.13), and coronary revascularization was performed in 279 patients in the CTA group and in 267 in the standard-care group (hazard ratio, 1.07; 95% CI, 0.91 to 1.27). However, more preventive therapies were initiated in patients in the CTA group (odds ratio, 1.40; 95% CI, 1.19 to 1.65), as were more antianginal therapies (odds ratio, 1.27; 95% CI, 1.05 to 1.54). There were no significant between-group differences in the rates of cardiovascular or noncardiovascular deaths or deaths from any cause. 
CONCLUSIONS: In this trial, the use of CTA in addition to standard care in patients with stable chest pain resulted in a significantly lower rate of death from coronary heart disease or nonfatal myocardial infarction at 5 years than standard care alone, without resulting in a significantly higher rate of coronary angiography or coronary revascularization. (Funded by the Scottish Government Chief Scientist Office and others; SCOT-HEART ClinicalTrials.gov number, NCT01149590.)
