
    The role of a firm's absorptive capacity and the technology transfer process in clusters: How effective are technology centres in low-tech clusters?

    This paper analyses how the internal resources of small- and medium-sized enterprises determine access (learning processes) to technology centres (TCs) or industrial research institutes (innovation infrastructure) in traditional low-tech clusters. These interactions essentially represent traded (market-based) transactions, which constitute important sources of knowledge in clusters. The paper addresses the role of TCs in low-tech clusters, drawing on semi-structured interviews with 80 firms in a manufacturing cluster. The results indicate that producer–user interactions are the most frequent: the more knowledge-intensive a sector's base, the more likely firms are to use the available research infrastructure. Conversely, sectors with less knowledge-intensive structures, i.e. less absorptive capacity (AC), show weak linkages to TCs and frequently prefer to interact with suppliers, who act as transceivers of knowledge. Therefore, not all firms in a cluster can fully exploit the available research infrastructure, and their AC moderates this engagement. In addition, the existence of TCs alone is not sufficient; firms must also actively pursue search strategies to initiate interactions and remain open to available sources of knowledge. The study has implications for policymakers and academia.

    Chemical vapour deposition synthetic diamond: materials, technology and applications

    Substantial developments have been achieved in the synthesis of chemical vapour deposition (CVD) diamond in recent years, providing engineers and designers with access to a large range of new diamond materials. CVD diamond has a number of outstanding material properties that can enable exceptional performance in applications as diverse as medical diagnostics, water treatment, radiation detection, high-power electronics, consumer audio, magnetometry and novel lasers. The material is often synthesized in planar form; however, non-planar geometries are also possible and enable a number of key applications. This article reviews the material properties and characteristics of single crystal and polycrystalline CVD diamond, and how these can be utilized, focusing particularly on optics, electronics and electrochemistry. It also summarizes how CVD diamond can be tailored for specific applications, based on the ability to synthesize a consistent and engineered high-performance product.

    What to consider when pseudohypoparathyroidism is ruled out: IPPSD and differential diagnosis

    Background: Pseudohypoparathyroidism (PHP) is a rare disease whose phenotypic features are rather difficult to identify in some cases. Thus, although these patients may present with the Albright's hereditary osteodystrophy (AHO) phenotype, which is characterized by small stature, obesity with a rounded face, subcutaneous ossifications, mental retardation and brachydactyly, its manifestations are somewhat variable. Indeed, some patients present with a complete phenotype, whereas others show only subtle manifestations. In addition, the features of the AHO phenotype are not specific to it, and a similar phenotype is also commonly observed in other syndromes. Brachydactyly type E (BDE) is the most specific and objective feature of the AHO phenotype, and several genes have been associated with syndromic BDE in the past few years. Moreover, these syndromes have a skeletal and endocrinological phenotype that overlaps with AHO/PHP. In light of the above, we have developed an algorithm to aid in genetic testing of patients with clinical features of AHO but with no causative molecular defect at the GNAS locus. Starting with the feature of brachydactyly, this algorithm allows the differential diagnosis to be broadened and, with the addition of other clinical features, can guide genetic testing. Methods: We reviewed our series of patients (n = 23) with a clinical diagnosis of AHO and with brachydactyly type E or a similar pattern who were negative for GNAS anomalies, classified them according to the diagnostic algorithm, and then proposed and analysed the most probable gene(s) in each case. Results: A review of the clinical data for our series of patients, and subsequent analysis of the candidate gene(s), allowed detection of the underlying molecular defect in 12 out of 23 patients: five patients harboured a mutation in PRKAR1A, one in PDE4D, four in TRPS1 and two in PTHLH. Conclusions: This study confirmed that screening of other genes implicated in syndromes with BDE and an AHO or similar phenotype is very helpful for establishing a correct genetic diagnosis in patients who have been misdiagnosed with an "AHO-like phenotype" of unknown genetic cause, and also for better describing the characteristic and differential features of these less common syndromes.
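    The tiered candidate-gene logic described above lends itself to a simple rule-based sketch. The snippet below is an illustrative simplification, not the authors' published algorithm: the feature-to-gene mapping (hormone resistance or facial dysostosis pointing towards the acrodysostosis genes PRKAR1A/PDE4D, sparse hair and a bulbous nose towards TRPS1, BDE with short stature towards PTHLH) is an assumption used here only to show how such a decision aid could be encoded.

```python
# Illustrative, simplified gene-prioritisation sketch for GNAS-negative patients
# with brachydactyly type E (BDE). The branching criteria are assumptions for
# demonstration only and do not reproduce the paper's published algorithm.

def candidate_genes(features: set[str]) -> list[str]:
    """Return candidate genes to screen, ordered by (assumed) priority."""
    candidates = []
    if {"hormone_resistance", "facial_dysostosis"} & features:
        candidates += ["PRKAR1A", "PDE4D"]        # acrodysostosis-like presentations
    if {"sparse_hair", "bulbous_nose"} & features:
        candidates.append("TRPS1")                 # tricho-rhino-phalangeal-like features
    if "short_stature" in features or not candidates:
        candidates.append("PTHLH")                 # BDE with short stature / default fallback
    return candidates

# Example: a GNAS-negative patient with BDE, short stature and hormone resistance
print(candidate_genes({"brachydactyly_E", "short_stature", "hormone_resistance"}))
# -> ['PRKAR1A', 'PDE4D', 'PTHLH']
```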

    A global multinational survey of cefotaxime-resistant coliforms in urban wastewater treatment plants

    The World Health Organization Global Action Plan recommends integrated surveillance programs as crucial strategies for monitoring antibiotic resistance. Although several national surveillance programs are in place for clinical and veterinary settings, no such schemes exist for monitoring antibiotic-resistant bacteria in the environment. In this transnational study, we developed, validated, and tested a low-cost, easy-to-implement surveillance approach to evaluate antibiotic resistance in wastewater treatment plants (WWTPs) by targeting cefotaxime-resistant (CTX-R) coliforms as indicators. The rationale for this approach was: i) coliform quantification methods are internationally accepted as indicators of fecal contamination in recreational waters and are therefore routinely applied in analytical labs; ii) CTX-R coliforms are clinically relevant, associated with extended-spectrum β-lactamases (ESBLs), and are rare in pristine environments. We analyzed 57 WWTPs in 22 countries across Europe, Asia, Africa, Australia, and North America. CTX-R coliforms were ubiquitous in raw sewage and their relative abundance varied significantly (<0.1% to 38.3%), being positively correlated (p < 0.001) with regional atmospheric temperatures. Although most WWTPs removed large proportions of CTX-R coliforms, loads over 10³ colony-forming units per mL were occasionally observed in final effluents. We demonstrate that CTX-R coliform monitoring is a feasible and affordable approach to assess wastewater antibiotic resistance status.
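    As a minimal sketch of the bookkeeping such a surveillance scheme involves, the snippet below computes the relative abundance of CTX-R coliforms, the log10 removal across a plant, and a flag for effluents above 10³ CFU/mL. The counts are illustrative assumptions, not values reported by the study.

```python
import math

# Hypothetical plate counts (CFU per mL) for one wastewater treatment plant.
# Values are illustrative only; the study reports relative abundances of
# <0.1% to 38.3% in raw sewage and occasional effluent loads above 1e3 CFU/mL.
influent_total, influent_ctx_r = 2.0e6, 1.5e5   # total vs cefotaxime-resistant coliforms
effluent_total, effluent_ctx_r = 4.0e3, 1.2e3

relative_abundance_in = 100 * influent_ctx_r / influent_total   # % CTX-R in raw sewage
relative_abundance_out = 100 * effluent_ctx_r / effluent_total  # % CTX-R in final effluent
log_removal = math.log10(influent_ctx_r / effluent_ctx_r)       # log10 reduction of CTX-R coliforms

print(f"CTX-R relative abundance: {relative_abundance_in:.1f}% (in), {relative_abundance_out:.1f}% (out)")
print(f"log10 removal of CTX-R coliforms: {log_removal:.2f}")
print("Effluent exceeds 1e3 CFU/mL:", effluent_ctx_r > 1e3)
```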

    Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition)

    In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to delete or knock down more than one autophagy-related gene. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, so not all Atg proteins can be used as specific markers for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
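    One common way to operationalise the static-count versus flux distinction described above is a marker turnover comparison: quantify an autophagosome marker with and without a blocker of lysosomal degradation and take the difference as an estimate of flux. The sketch below illustrates only that arithmetic; the condition names and numbers are assumptions for demonstration and do not stand in for any specific assay recommended in the guidelines.

```python
# Minimal sketch: distinguishing autophagosome accumulation from autophagic flux.
# 'level' stands for any quantified autophagosome marker (arbitrary units);
# values and condition names are hypothetical.

def autophagic_flux(level_plus_inhibitor: float, level_minus_inhibitor: float) -> float:
    """Estimate flux as the marker accumulated when lysosomal degradation is blocked."""
    return level_plus_inhibitor - level_minus_inhibitor

conditions = {
    # (marker level with lysosomal inhibitor, marker level without inhibitor)
    "control":             (3.0, 1.0),  # flux estimate = 2.0
    "induced autophagy":   (8.0, 2.0),  # more autophagosomes AND higher flux = 6.0
    "blocked degradation": (4.1, 4.0),  # many autophagosomes but flux ~0.1: accumulation, not more autophagy
}

for name, (plus_inh, minus_inh) in conditions.items():
    print(f"{name:>20}: static level = {minus_inh:.1f}, "
          f"flux estimate = {autophagic_flux(plus_inh, minus_inh):.1f}")
```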

    Global overview of the management of acute cholecystitis during the COVID-19 pandemic (CHOLECOVID study)

    Background: This study provides a global overview of the management of patients with acute cholecystitis during the initial phase of the COVID-19 pandemic. Methods: CHOLECOVID is an international, multicentre, observational comparative study of patients admitted to hospital with acute cholecystitis during the COVID-19 pandemic. Data on management were collected for a 2-month study interval coincident with the WHO declaration of the SARS-CoV-2 pandemic and compared with an equivalent pre-pandemic time interval. Mediation analysis examined the influence of SARS-CoV-2 infection on 30-day mortality. Results: This study collected data on 9783 patients with acute cholecystitis admitted to 247 hospitals across the world. The pandemic was associated with reduced availability of surgical workforce and operating facilities globally, a significant shift to worse severity of disease, and increased use of conservative management. There was a reduction (both absolute and proportionate) in the number of patients undergoing cholecystectomy, from 3095 patients (56.2 per cent) pre-pandemic to 1998 patients (46.2 per cent) during the pandemic, but there was no difference in 30-day all-cause mortality after cholecystectomy between the two intervals (13 patients (0.4 per cent) pre-pandemic versus 13 patients (0.6 per cent) during the pandemic; P = 0.355). In mediation analysis, admission with acute cholecystitis during the pandemic was associated with a non-significant increase in the risk of death (OR 1.29, 95 per cent c.i. 0.93 to 1.79, P = 0.121). Conclusion: CHOLECOVID provides a unique overview of the treatment of patients with cholecystitis across the globe during the first months of the SARS-CoV-2 pandemic. The study highlights the need for system resilience in retention of elective surgical activity. Cholecystectomy was associated with a low risk of mortality, and deferral of treatment results in an increase in avoidable morbidity that represents the non-COVID cost of this pandemic.
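    To make the headline comparison concrete, the sketch below recomputes the 30-day post-cholecystectomy mortality rates from the counts quoted in the abstract and derives a crude unadjusted odds ratio with a Wald 95% confidence interval. This is a back-of-the-envelope check on the quoted figures, not a reproduction of the study's mediation analysis or its reported P value.

```python
import math

# Counts quoted in the abstract (30-day deaths after cholecystectomy).
deaths_pre, ops_pre = 13, 3095        # pre-pandemic interval
deaths_pan, ops_pan = 13, 1998        # pandemic interval

rate_pre = 100 * deaths_pre / ops_pre      # ~0.4 per cent
rate_pan = 100 * deaths_pan / ops_pan      # ~0.6 per cent

# Unadjusted odds ratio (pandemic vs pre-pandemic) with a Wald 95% CI.
a, b = deaths_pan, ops_pan - deaths_pan
c, d = deaths_pre, ops_pre - deaths_pre
odds_ratio = (a / b) / (c / d)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"Mortality: {rate_pre:.1f}% pre-pandemic vs {rate_pan:.1f}% pandemic")
print(f"Unadjusted OR {odds_ratio:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```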

    The genetic architecture of the human cerebral cortex

    The cerebral cortex underlies our complex cognitive capabilities, yet little is known about the specific genetic loci that influence human cortical structure. To identify genetic variants that affect cortical structure, we conducted a genome-wide association meta-analysis of brain magnetic resonance imaging data from 51,665 individuals. We analyzed the surface area and average thickness of the whole cortex and 34 regions with known functional specializations. We identified 199 significant loci and found significant enrichment for loci influencing total surface area within regulatory elements that are active during prenatal cortical development, supporting the radial unit hypothesis. Loci that affect regional surface area cluster near genes in Wnt signaling pathways, which influence progenitor expansion and areal identity. Variation in cortical structure is genetically correlated with cognitive function, Parkinson's disease, insomnia, depression, neuroticism, and attention deficit hyperactivity disorder.
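    The abstract summarises a genome-wide association meta-analysis; a standard way to combine per-cohort results for a single variant is fixed-effect inverse-variance weighting, sketched below. This is a generic illustration with invented effect sizes and standard errors, not the consortium's actual pipeline, which involves many additional steps (harmonisation, genomic control, regional phenotypes).

```python
import math

# Hypothetical per-cohort results for one variant's effect on a cortical phenotype:
# (effect size beta, standard error). Values are invented for illustration.
cohort_results = [(0.021, 0.010), (0.015, 0.008), (0.030, 0.015)]

# Fixed-effect inverse-variance-weighted meta-analysis.
weights = [1 / se**2 for _, se in cohort_results]
beta_meta = sum(w * beta for (beta, _), w in zip(cohort_results, weights)) / sum(weights)
se_meta = math.sqrt(1 / sum(weights))
z = beta_meta / se_meta
# Two-sided p-value from the normal distribution; compare against the
# conventional genome-wide significance threshold of 5e-8.
p = math.erfc(abs(z) / math.sqrt(2))

print(f"meta beta = {beta_meta:.4f}, SE = {se_meta:.4f}, Z = {z:.2f}, p = {p:.2e}")
print("genome-wide significant:", p < 5e-8)
```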

    Cognitive and psychiatric symptom trajectories 2–3 years after hospital admission for COVID-19: a longitudinal, prospective cohort study in the UK

    Background: COVID-19 is known to be associated with increased risks of cognitive and psychiatric outcomes after the acute phase of disease. We aimed to assess whether these symptoms can emerge or persist more than 1 year after hospitalisation for COVID-19, to identify which early aspects of COVID-19 illness predict longer-term symptoms, and to establish how these symptoms relate to occupational functioning. Methods: The Post-hospitalisation COVID-19 study (PHOSP-COVID) is a prospective, longitudinal cohort study of adults (aged ≥18 years) who were hospitalised with a clinical diagnosis of COVID-19 at participating National Health Service hospitals across the UK. In the C-Fog study, a subset of PHOSP-COVID participants who consented to be recontacted for other research were invited to complete a computerised cognitive assessment and clinical scales between 2 years and 3 years after hospital admission. Participants completed eight cognitive tasks, covering eight cognitive domains, from the Cognitron battery, in addition to the 9-item Patient Health Questionnaire for depression, the Generalised Anxiety Disorder 7-item scale, the Functional Assessment of Chronic Illness Therapy Fatigue Scale, and the 20-item Cognitive Change Index (CCI-20) questionnaire to assess subjective cognitive decline. We evaluated how the absolute risks of symptoms evolved between follow-ups at 6 months, 12 months, and 2–3 years, and whether symptoms at 2–3 years were predicted by earlier aspects of COVID-19 illness. Participants completed an occupation change questionnaire to establish whether their occupation or working status had changed and, if so, why. We assessed which symptoms at 2–3 years were associated with occupation change. People with lived experience were involved in the study. Findings: 2469 PHOSP-COVID participants were invited to participate in the C-Fog study, and 475 participants (191 [40·2%] females and 284 [59·8%] males; mean age 58·26 [SD 11·13] years) who were discharged from one of 83 hospitals provided data at the 2–3-year follow-up. Participants had worse cognitive scores than would be expected on the basis of their sociodemographic characteristics across all cognitive domains tested (average score 0·71 SD below the mean [IQR 0·16–1·04]; p<0·0001). Most participants reported at least mild depression (263 [74·5%] of 353), anxiety (189 [53·5%] of 353), fatigue (220 [62·3%] of 353), or subjective cognitive decline (184 [52·1%] of 353), and more than a fifth reported severe depression (79 [22·4%] of 353), fatigue (87 [24·6%] of 353), or subjective cognitive decline (88 [24·9%] of 353). Depression, anxiety, and fatigue were worse at 2–3 years than at 6 months or 12 months, with evidence of both worsening of existing symptoms and emergence of new symptoms. Symptoms at 2–3 years were not predicted by the severity of acute COVID-19 illness, but were strongly predicted by the degree of recovery at 6 months (explaining 35·0–48·8% of the variance in anxiety, depression, fatigue, and subjective cognitive decline); by a biocognitive profile linking acutely raised D-dimer relative to C-reactive protein with subjective cognitive deficits at 6 months (explaining 7·0–17·2% of the variance in anxiety, depression, fatigue, and subjective cognitive decline); and by anxiety, depression, fatigue, and subjective cognitive deficit at 6 months. Objective cognitive deficits at 2–3 years were not predicted by any of the factors tested, except for cognitive deficits at 6 months, which explained 10·6% of their variance. 95 of 353 participants (26·9% [95% CI 22·6–31·8]) reported occupational change, with poor health being the most common reason for this change. Occupation change was strongly and specifically associated with objective cognitive deficits (odds ratio [OR] 1·51 [95% CI 1·04–2·22] for every SD decrease in overall cognitive score) and subjective cognitive decline (OR 1·54 [1·21–1·98] for every point increase in CCI-20). Interpretation: Psychiatric and cognitive symptoms appear to increase over the first 2–3 years post-hospitalisation, due both to worsening of symptoms already present at 6 months and to emergence of new symptoms. New symptoms occur mostly in people with other symptoms already present at 6 months. Early identification and management of symptoms might therefore be an effective strategy to prevent later onset of a complex syndrome. Occupation change is common and associated mainly with objective and subjective cognitive deficits. Interventions to promote cognitive recovery or to prevent cognitive decline are therefore needed to limit the functional and economic impacts of COVID-19. Funding: National Institute for Health and Care Research Oxford Health Biomedical Research Centre, Wolfson Foundation, MQ Mental Health Research, MRC-UK Research and Innovation, and National Institute for Health and Care Research.
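    The reported odds ratio of 1·51 per SD decrease in overall cognitive score compounds multiplicatively. As a minimal sketch under that assumption, the snippet below converts a baseline probability of occupation change into the implied probability at 1 and 2 SD below the mean; the ~27% baseline used here is simply the cohort-wide rate quoted above, not a covariate-adjusted estimate from the paper.

```python
# Minimal sketch: interpreting "OR 1.51 per SD decrease in cognitive score".
# The baseline probability is an illustrative assumption (cohort-wide rate ~26.9%),
# not an adjusted estimate from the study.

OR_PER_SD = 1.51
baseline_p = 0.269                       # probability of occupation change at the mean score

def prob_after_or(p: float, odds_ratio: float) -> float:
    """Apply an odds ratio to a probability and return the new probability."""
    odds = p / (1 - p)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

for sd_below_mean in (0, 1, 2):
    p = prob_after_or(baseline_p, OR_PER_SD ** sd_below_mean)
    print(f"{sd_below_mean} SD below mean cognitive score: "
          f"implied probability of occupation change ≈ {p:.1%}")
```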