
    Green fluorescent protein as an indicator of cryoinjury in tissues.

    The fluorescence intensity of Green Fluorescent Protein (GFP) has previously been demonstrated to be an accurate indicator of cellular viability following cryoinsult in individual GFP-transfected cells. To ascertain whether GFP fluorescence intensity may also serve as a viability indicator following cryogenic insults in whole tissues, this study examines the transient fluorescence intensity of GFP-transfected mouse hepatic tissue ex vivo following cryoinsult and compares the observed trends with diffusion-based models. The fluorescence intensity of the exposed tissues exhibited slow exponential decay, while the solution in which the tissues were placed correspondingly gained fluorescence. This slow decay (~3 h) contrasts with the rapid loss of fluorescence intensity (seconds) seen in GFP-transfected cell cultures following cryoinsult. These trends suggest that mass diffusion of GFP through the interstitial space, and ultimately into the surrounding medium, is the primary mechanism determining fluorescence loss in cryoinjured tissues. GFP-transfected tissues may therefore be used effectively as indicators of cryoinjury, and hence viability, following hypothermal insult, provided that a sufficiently long incubation period precedes observation. For the tissues used in this study, a meaningful observation (a 15% reduction in fluorescence) could be made three hours after cryoinjury.
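    As a rough illustration of the timescale involved, the two numbers quoted above can be combined under a single-exponential decay model. The sketch below assumes I(t) = I0*exp(-t/tau); this model and the variable names are illustrative, not taken from the paper itself.

        import numpy as np

        # Assumed model: I(t) = I0 * exp(-t / tau). The 15% reduction in
        # fluorescence observed 3 h after cryoinjury then fixes tau.
        t_obs = 3.0                  # hours to a meaningful observation
        fraction_remaining = 0.85    # 15% reduction in fluorescence
        tau = -t_obs / np.log(fraction_remaining)
        print(f"implied decay constant: tau ~= {tau:.1f} h")

        # Under the same assumption, the time to any other threshold follows:
        t_half = -tau * np.log(0.5)  # 50% fluorescence loss
        print(f"time to 50% loss: {t_half:.1f} h")

    The implied time constant of roughly 18 h underlines how much slower this diffusion-limited washout is than the seconds-scale fluorescence loss seen in cell cultures.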

    The Potential for Student Performance Prediction in Small Cohorts with Minimal Available Attributes

    The measurement of student performance during their progress through university study provides academic leadership with critical information on each student’s likelihood of success. Academics have traditionally used their interactions with individual students through class activities and interim assessments to identify those “at risk” of failure or withdrawal. However, modern university environments, offering easy online availability of course material, may see reduced lecture/tutorial attendance, making such identification more challenging. Modern data mining and machine learning techniques provide increasingly accurate predictions of student examination marks, although these approaches have focused on large student populations and wide ranges of data attributes per student. Many university modules, however, comprise relatively small student cohorts, with institutional protocols limiting the student attributes available for analysis, and very little research attention has been devoted to this area of analysis and prediction. We describe an experiment conducted on a final-year university module cohort of 23 students, where individual student data were limited to lecture/tutorial attendance, virtual learning environment accesses and intermediate assessments. We found potential for predicting individual student interim and final assessment marks in small student cohorts with very limited attributes, and that these predictions could be useful in supporting module leaders to identify students potentially “at risk”.
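    The prediction task itself is straightforward to prototype. The sketch below shows one plausible setup for a cohort of 23 students with only three attributes each, using leave-one-out cross-validation so that every student is held out exactly once; the feature set, regressor and threshold are illustrative assumptions, not the pipeline used in the study.

        import numpy as np
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import LeaveOneOut

        # X: one row per student, e.g. [attendance %, VLE accesses, interim mark];
        # placeholder data stands in for the real (confidential) records.
        rng = np.random.default_rng(0)
        X = rng.random((23, 3))
        y = rng.random(23) * 100          # final marks out of 100

        errors = []
        for train_idx, test_idx in LeaveOneOut().split(X):
            model = Ridge(alpha=1.0).fit(X[train_idx], y[train_idx])
            pred = model.predict(X[test_idx])[0]
            errors.append(abs(pred - y[test_idx][0]))

        print(f"leave-one-out mean absolute error: {np.mean(errors):.1f} marks")
        # Students whose predicted mark falls below a pass threshold could
        # then be flagged as potentially "at risk".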

    The NILE Project — Advances in the Conversion of Lignocellulosic Materials into Ethanol

    NILE ("New Improvements for Lignocellulosic Ethanol") was an integrated European project (2005-2010) devoted to the conversion of lignocellulosic raw materials to ethanol. The main objectives were to design novel enzymes suitable for the hydrolysis of cellulose to glucose and new yeast strains able to efficiently converting all the sugars present in lignocellulose into ethanol. The project also included testing these new developments in an integrated pilot plant and evaluating the environmental and socio-economic impacts of implementing lignocellulosic ethanol on a large scale. Two model raw materials – spruce and wheat straw – both preconditioned with similar pretreatments, were used. Several approaches were explored to improve the saccharification of these pretreated raw materials such as searching for new efficient enzymes and enzyme engineering. Various genetic engineering methods were applied to obtain stable xylose- and arabinose-fermenting Saccharomyces cerevisiae strains that tolerate the toxic compounds present in lignocellulosic hydrolysates. The pilot plant was able to treat 2 tons of dry matter per day, and hydrolysis and fermentation could be run successively or simultaneously. A global model integrating the supply chain was used to assess the performance of lignocellulosic ethanol from an economical and environmental perspective. It was found that directed evolution of a specific enzyme of the cellulolytic cocktail produced by the industrial fungus, Trichoderma reesei, and modification of the composition of this cocktail led to improvements of the enzymatic hydrolysis of pretreated raw material. These results, however, were difficult to reproduce at a large scale. A substantial increase in the ethanol conversion yield and in specific ethanol productivity was obtained through a combination of metabolic engineering of yeast strains and fermentation process development. Pilot trials confirmed the good behaviour of the yeast strains in industrial conditions as well as the suitability of lignin residues as fuels. The ethanol cost and the greenhouse gas emissions were highly dependent on the supply chain but the best performing supply chains showed environmental and economic benefits. From a global standpoint, the results showed the necessity for an optimal integration of the process to co-develop all the steps of the process and to test the improvements in a flexible pilot plant, thus allowing the comparison of various configurations and their economic and environmental impacts to be determined. <br> Le projet NILE, acronyme de "New Improvements for Lignocellulosic Ethanol", était un projet européen (2005-2010) consacré à la conversion des matières premières lignocellulosiques en éthanol. Ses principaux objectifs étaient de concevoir de nouvelles enzymes adaptées à l’hydrolyse de la cellulose en glucose et de nouvelles souches de levure capables de convertir efficacement tous les sucres présents dans la lignocellulose en éthanol. Une autre partie du projet consistait à tester ces nouveaux systèmes dans une installation pilote et à évaluer les impacts environnementaux et socio-économiques de la production et utilisation à grande échelle d’éthanol lignocellulosique. Deux matières premières modèles (l’épicéa et la paille de blé) prétraitées de façon semblable, ont été étudiées. Différentes approches ont été tentées pour améliorer la saccharification de ces matières premières, par exemple, la recherche de nouvelles enzymes efficaces ou l’ingénierie d’enzymes. 
Plusieurs stratégies d’ingénierie génétique ont été utilisées pour obtenir des souches stables de Saccharomyces cerevisiae capables de fermenter le xylose et l’arabinose, et de tolérer les composés toxiques présents dans les hydrolysats lignocellulosiques. L’installation pilote pouvait traiter 2 tonnes de matières sèches par jour, et l’hydrolyse et la fermentation pouvaient être menées successivement ou simultanément. Un modèle global intégrant la chaîne d’approvisionnement en matière première a servi à évaluer les performances économiques et environnementales de la production d’éthanol lignocellulosique. L’évolution dirigée d’une enzyme du cocktail cellulolytique produit par le champignon Trichoderma reesei, et la modification de la composition de ce cocktail améliorent l’hydrolyse enzymatique des matières premières prétraitées. Cependant, ces résultats n’ont pu être reproduits à grande échelle. Le rendement de conversion et la productivité spécifique en éthanol ont été sensiblement augmentés grâce à l’ingénierie métabolique des souches de levure et au développement d’un procédé optimal de fermentation. Les essais en pilote ont confirmé le bon comportement de ces souches de levure en conditions industrielles ainsi que la possibilité d’utiliser les résidus riches en lignine comme combustible. Le coût de production de l’éthanol et le bilan des émissions de gaz à effet de serre étaient très dépendants des sources d’énergie utilisées. D’un point de vue plus global, les résultats ont montré que l’optimisation du procédé nécessite de codévelopper toutes les étapes de façon intégrée et de valider les améliorations dans une installation pilote, afin notamment de pouvoir comparer différentes configurations et d’en déterminer les effets sur l’économie du procédé et ses impacts environnementaux
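    The two fermentation metrics highlighted above reduce to simple ratios. The sketch below shows how they are conventionally computed; the 0.511 g/g stoichiometric maximum follows from C6H12O6 -> 2 C2H5OH + 2 CO2, while the input numbers are invented placeholders, not NILE pilot data.

        # Conversion yield and specific ethanol productivity for a
        # fermentation run; all input values below are placeholders.
        ethanol_g_per_l = 45.0          # ethanol produced (g/L)
        sugar_consumed_g_per_l = 100.0  # glucose + xylose + arabinose used (g/L)
        biomass_g_per_l = 5.0           # yeast dry mass (g/L)
        time_h = 48.0                   # fermentation time (h)

        yield_g_per_g = ethanol_g_per_l / sugar_consumed_g_per_l
        pct_of_theoretical = yield_g_per_g / 0.511 * 100
        specific_productivity = ethanol_g_per_l / (biomass_g_per_l * time_h)

        print(f"conversion yield: {yield_g_per_g:.3f} g/g "
              f"({pct_of_theoretical:.0f}% of theoretical)")
        print(f"specific productivity: {specific_productivity:.3f} g ethanol/g cells/h")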

    Overlapping Chronic Pain Conditions: Implications for Diagnosis and Classification

    There is increasing recognition that many, if not most, common chronic pain conditions are heterogeneous, with a high degree of overlap or coprevalence with other common pain conditions, along with influences from biopsychosocial factors. At present, very little attention is given to this overlap when recruiting for clinical trials, so many, if not most, patients enrolled into clinical studies are not representative of the wider chronic pain population. The failure to account for the heterogeneous and overlapping nature of most common pain conditions may result in treatment responses of small effect size when these treatments are administered to the patients with chronic overlapping pain conditions (COPCs) represented in the general population. In this brief review we describe the concept of COPCs and the putative mechanisms underlying them. Finally, we present a series of recommendations that will advance our understanding of COPCs.

    Perspective: This brief review describes the concept of COPCs. A mechanism-based heuristic model is presented, along with current knowledge and evidence for COPCs. Finally, a set of recommendations is provided to advance our understanding of COPCs.

    Neutral Evolution as Diffusion in phenotype space: reproduction with mutation but without selection

    The process of ‘Evolutionary Diffusion’, i.e. reproduction with local mutation but without selection in a biological population, resembles standard diffusion in many ways. However, Evolutionary Diffusion allows the formation of local peaks with a characteristic width that undergo drift, even in the infinite-population limit. We analytically calculate the mean peak width and the effective random-walk step size, and obtain the distribution of the peak width, which has a power-law tail. We find that independent local mutations act as a diffusion of interacting particles with increased step size.
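    The process is also easy to reproduce numerically. The following Monte Carlo sketch implements reproduction with local mutation and no selection: each generation, every individual copies a uniformly random parent and adds a Gaussian mutation, so the population forms a localized peak whose centre drifts. All parameter values are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 1000        # population size
        sigma = 0.1     # local mutation scale
        T = 2000        # generations

        x = np.zeros(N)              # phenotypes, initially identical
        centers, widths = [], []
        for _ in range(T):
            parents = rng.integers(0, N, size=N)           # neutral reproduction
            x = x[parents] + rng.normal(0, sigma, size=N)  # local mutation
            centers.append(x.mean())
            widths.append(x.std())

        # The peak width equilibrates while its centre random-walks.
        print(f"mean peak width (late-time): {np.mean(widths[T // 2:]):.3f}")
        print(f"net drift of peak centre: {centers[-1] - centers[0]:+.3f}")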

    Working in the Public Interest? What must planners do differently? Critical thoughts on the state of planning

    The current moment is generating huge challenges and raising significant questions about how our societies operate and the future of our cities and countryside. Economic shutdowns are bringing structural inequalities into sharp relief even as they illustrate the daunting scale of the transformations required to reduce our environmental impacts. Many pieces have already been written about how we might not just adapt to a post-Covid world but take the opportunity to build better, healthier, fairer, greener cities. Any hopes for significant change would entail fundamental shifts in the role of planning. At the same time, however, powerful property lobbies threaten a return to a business-as-usual model of development led not by care for people and place but by the greedy hand of an ever less fettered free market. In England, this is symbolised by a new Conservative government promising yet again to radically streamline a planning system it sees as an impediment to economic recovery. Current circumstances therefore also challenge us to think more broadly about what planning and being a planner really mean in 2020. What is the purpose of planning? Do planners have the tools, resources, and capabilities to address significant societal challenges, and are they trusted to do so? What role should public authorities have, and how might this interface with the logics of the market and private-sector-driven development? And finally, what is the ‘public interest’ that planners often invoke as the foundation for their work, and how might it be compromised by the nature of the systems we operate in and the places where we work? The ESRC-funded Working in the Public Interest project has been seeking answers to these questions over the past three years. The project team from the University of Sheffield, Newcastle University and University College London has been engaging closely with contemporary planning practice in both the public and private sectors, focusing attention on what planners do all day. In-depth interviews, focus groups discussing contemporary challenges in planning, and extensive, engaged ethnography have yielded a rich set of insights into the state of planning and the nature of contemporary planning work across the UK. In this booklet we offer a series of brief overviews of the key themes this research has highlighted. Our aim is not to offer a definition or detailed theoretical discussion of the public interest; instead, we hope to explore how different facets of planning work are changing. At a broad level, our argument is that a much wider range of issues and practices, including, for example, work-life balance and organisational change, need to be considered alongside professionalism and ethics when thinking about what it means to work in the public interest. In doing so we hope to stimulate broader debate within and beyond the planning profession about the nature and value of planning. We also aim to highlight a series of key questions and challenges that are shaping planners’ work and that will have significant implications for the future.

    Laser-Assisted Cryosurgery in ex vivo Mice Hepatic Tissue: Viability Assays Using Green Fluorescent Protein

    An experimental investigation is carried out to develop a novel approach to cryosurgery in which laser heating counteracts tissue freezing to better confine damage to the targeted cancerous tissue within a lethal low-temperature isothermal boundary, an approach we refer to as laser-assisted cryosurgery (LAC). The advantage of this procedure over conventional cryosurgery assisted by urethral warmers or cryoheaters is that laser heating provides volumetric rather than superficial heating, which leads to deeper penetration, more homogeneous tissue protection and better demarcation of the destructive freezing effect within a well-defined targeted volume. Tissue viability assays are performed using green fluorescent protein (GFP) as a viability marker and correlated with temperature history after performing LAC procedures on ex vivo mouse hepatic tissue. The limit for cell denaturation at the irradiated surface predicted by GFP analysis is further confirmed using reverse transcription polymerase chain reaction (RT-PCR). In addition, the correlation between GFP fluorescence and cell viability, and the loss of GFP fluorescence in non-viable cells, have been tested and validated by histological analysis using a standard cell viability method (hematoxylin and eosin staining). Analysis of our experimental measurements shows that reproducible thermal gradients (of 236 °C/cm) and predictable tissue necrosis can be reliably produced by LAC without exceeding temperature thresholds for cell denaturation (Tsurf ≈ 48 °C) beyond preset tissue boundaries (with a resolution of 0.1 °C/mm). The results demonstrate the feasibility of controlling temperatures at specified tissue locations to prevent hyperthermal or freezing damage.
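    For a sense of scale, the quoted gradient converts directly into distances between isotherms, assuming a locally linear temperature profile near the irradiated surface. In the sketch below, the -40 degC lethal isotherm is a common cryosurgery rule of thumb, not a value reported in this study.

        # Distance spanned between the denaturation limit at the laser-heated
        # surface and an assumed lethal freezing isotherm, given the measured
        # gradient of 236 degC/cm (= 23.6 degC/mm).
        gradient_degc_per_mm = 23.6
        t_surface = 48.0     # degC, denaturation threshold at the surface
        t_lethal = -40.0     # degC, assumed lethal isotherm (not from the paper)

        distance_mm = (t_surface - t_lethal) / gradient_degc_per_mm
        print(f"surface-to-lethal-isotherm distance: {distance_mm:.1f} mm")
        # (48 - (-40)) / 23.6 ~= 3.7 mm spans the protected-to-lethal range.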

    Measurement of immunoglobulin concentration in cell culture supernates by computer-assisted ELISA

    The enzyme-linked immunosorbent assay (ELISA) is used extensively in immunologic research to obtain quantitative estimates of immunoglobulin concentration in cell culture supernates. By incorporating a microcomputer for data acquisition, storage and rapid calculation of results, a substantial reduction in total assay time may be realized. Described here is a set of menu-driven programs written in BASIC for the IBM-PC which offer advantages over existing software in simplicity, versatility and accuracy. Hardware requirements are minimal. These programs should encourage greater flexibility in the size and complexity of experimental designs.
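    The original programs were written in BASIC for the IBM-PC, but the core calculation they automate is compact in any language: fit a standard curve to wells of known concentration, then invert it to read unknowns off their optical densities. The sketch below uses a four-parameter logistic (4PL) model, a common choice for ELISA standard curves, as an assumed stand-in for the paper's actual curve-fitting routine; all data values are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(conc, a, b, c, d):
            """Optical density as a 4PL function of concentration."""
            return d + (a - d) / (1.0 + (conc / c) ** b)

        # Standards: known concentrations (ng/mL) and measured optical densities.
        std_conc = np.array([1, 5, 25, 125, 625, 3125], dtype=float)
        std_od = np.array([0.05, 0.15, 0.45, 1.00, 1.60, 1.90])

        params, _ = curve_fit(four_pl, std_conc, std_od,
                              p0=[0.02, 1.0, 100.0, 2.0], maxfev=10000)
        a, b, c, d = params

        def od_to_conc(od):
            """Invert the fitted curve to estimate concentration from OD."""
            return c * ((a - d) / (od - d) - 1.0) ** (1.0 / b)

        print(f"sample at OD 0.80 ~= {od_to_conc(0.80):.0f} ng/mL")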