
    Personalised Finite-Element Models using Image Registration in Parametric Space

    Heart failure (HF) is a chronic clinical condition in which the heart fails to pump enough blood to meet the metabolic needs of the body. Patients have reduced physical performance and can see their quality of life severely impaired; around 40-70% of patients diagnosed with HF die within the first year following diagnosis. It is estimated that 900,000 people in the UK currently suffer from HF. HF has a large impact on the NHS, representing 1 million inpatient bed days and 5% of all emergency medical admissions to hospital, and costing 2% of the total NHS budget. The annual incidence of new diagnoses is reported as 93,000 people in England alone, and this figure is increasing at a rate above that at which the population is ageing [1]. Cardiac resynchronisation therapy (CRT) has become established as an effective treatment for selected patients with HF.

    The research presented in this thesis has been conducted as part of a large EPSRC-funded project on the theme of Grand Challenges in Healthcare, with co-investigators from King's College London (KCL), Imperial College London, University College London (UCL) and the University of Sheffield. The aim is to develop and apply modelling techniques to simulate ventricular mechanics and CRT in patient cohorts from Guy's Hospital (London) and from the Sheffield Teaching Hospitals Trust. This will lead to improved understanding of cardiac physiological behaviour and of how disease affects normal cardiac performance, and to improved therapy planning by allowing candidate interventions to be simulated before they are applied to patients. The clinical workflow within the hospital manages the patient through the processes of diagnosis, therapy planning and follow-up. The first part of this thesis focuses on the development of a formal process for the integration of a computational analysis workflow, including medical imaging, segmentation, model construction, model execution and analysis, into the clinical workflow. During the early stages of the project, as the analysis workflow was being compiled, a major bottleneck was identified: the time required to build accurate, patient-specific geometrical meshes from the segmented images. The second part of this thesis therefore focuses on the development of a novel approach, based on image registration, to improve the construction of a high-quality personalised finite element mesh for an individual patient.

    Chapter 1 summarises the clinical context and introduces the tools and processes that are applied in this thesis. Chapter 2 describes the challenges and the implementation of a computational analysis workflow and its integration into a clinical environment. Chapter 3 describes the theoretical underpinnings of the image registration algorithm that has been developed to address the problem of constructing high-quality personalised meshes. The approach includes the use of regularisation terms that are designed to improve the mesh quality; the selection and implementation of the regularisation terms are discussed in detail in Chapter 4. Chapter 5 describes the application of the method to a series of test problems, whilst Chapter 6 describes the application to the patient cohort in the clinical study. Chapter 7 demonstrates that the method, developed for robust mesh construction, can readily be applied to determine boundary conditions for computational fluid dynamics (CFD) analysis. Chapter 8 provides a summary of the achievements of the thesis, together with suggestions for further work.
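    A registration-based mesh personalisation of the kind summarised above typically balances an image-similarity term against regularisation terms that penalise poor element quality. As an illustrative sketch only (the symbols and the specific form are assumptions, not taken from the thesis), such an objective can be written as

        E(\phi) = D(I_{patient}, I_{template} \circ \phi) + \lambda \, R_{quality}(\phi)

    where \phi is the transformation mapping a template mesh onto the patient image, D is an image dissimilarity measure, R_{quality} penalises distorted elements, and \lambda weights the regularisation against the image match.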

    Topological specification of connections between prefrontal cortex and hypothalamus in rhesus monkey

    The hypothalamus is a subcortical brain region whose limits and constituent nuclei lack consensus. The hypothalamus has been linked to emotion and different states of stress, providing critical feedback about the internal environment to the prefrontal cortex, a region known for executive function in the human cortex. An understanding of the developmental origin of the hypothalamus can provide a basis for defining which limits and nuclei are ontologically hypothalamic, and which are not, as well as a framework for understanding its connectional relationship with other brain regions. The Prosomeric Model (Rubenstein et al. 1994; Puelles and Rubenstein 2003; Nieuwenhuys and Puelles 2016; Puelles 2018) explains the embryological development of the central nervous system (CNS) shared by all vertebrates as a Bauplan. As a primary event, the early neural plate is patterned by intersecting longitudinal plates and transverse segments, forming a mosaic of progenitor units. The hypothalamus is specified by three prosomeres [hp1, hp2, and the acroterminal domain (At)] of the secondary prosencephalon with corresponding alar and basal plate parts, which develop apart from the diencephalon. Mounting evidence suggests that progenitor units within the alar and basal plate parts of hp1 and hp2 give rise to distinct hypothalamic nuclei, which preserve their relative invariant positioning (topology) in the adult brain. Nonetheless, the principles of the Prosomeric Model have not been applied to the hypothalamus of adult primates. The Structural Model (Barbas 1986; Barbas and Rempel-Clower 1997) highlights the variation of laminar structure in the grey matter of the prefrontal cortex as a basis for predicting specific cortico-cortical connections. The areas of the prefrontal cortex vary along a spectrum by number of layers, laminar definition, and cellularity of those layers. The systematic laminar patterns of different areas of the prefrontal cortex seem to be associated with differential rates of development or maturation. A topographical analysis of bidirectional projections between the prefrontal cortex and the hypothalamus was previously performed using the Structural Model (Rempel-Clower and Barbas 1998). The authors found that the prefrontal cortex has highly specific projections to the hypothalamus, originating mostly from limbic orbital and medial prefrontal areas, which have lower laminar definition than other prefrontal areas. In addition, the hypothalamus has relatively specific patterns of projection to the prefrontal cortex. We previously lacked an organizing principle to examine the specific pattern of connections between the hypothalamus and prefrontal cortex in the adult rhesus monkey. In the present study, hypothalamic nuclei in the rhesus monkey (Macaca mulatta) were parcellated using classic architectonic boundaries and stains. The topological relations of hypothalamic nuclei and adjacent hypothalamic landmarks were then analyzed, using homology across rodent and primate species, to trace the origin of adult hypothalamic nuclei to the alar or basal plate components of hp1 and hp2. A novel atlas of the hypothalamus of the adult rhesus monkey was generated with developmental ontologies for each hypothalamic nucleus. This atlas was then applied to a topological analysis of the strength and pattern of connections between the hypothalamus and prefrontal cortex in the adult rhesus monkey.
The result is a systematic reinterpretation of the adult hypothalamus of the rhesus monkey, whose prosomeric ontology was used to study the connections and neuraxial pathways linking the hypothalamus and prefrontal cortex. The convergence of the Prosomeric and Structural Models provides a developmental framework for explaining the structural patterns found in the adult primate cortex and hypothalamus, and the likely consequences of their disruption.

    Gene expression profile of cells in successive stages of corneal stem cell lineage

    In this study the complete transcriptome of groups of cells at specific successive stages of the corneal stem cell hierarchy was revealed. The cornea presents a linear differential distribution of each cell type of this hierarchy, with stem cells, transient amplifying cells and mature cells residing predominantly in the basal limbal, peripheral and central corneal epithelium, respectively. In order to reveal the complete set of genes that are up- or down-regulated at each stage of the corneal stem cell lineage, a Laser Microdissection and Pressure Catapulting method was optimised that allows isolation of the desired cell type from the specific areas in which it predominates, in a manner that does not compromise the integrity of its mRNA, as determined by 3'-5' relative ratios estimated by semi-quantitative RT-PCR. To analyse the relative abundance of every gene in each of the isolated cell types, a linear mRNA amplification method was optimised, as assessed by comparing the relative abundances of specific endogenous and exogenous gene transcripts before and after the amplification reaction using high-density oligonucleotide arrays. In order to amplify the mRNA to a degree that could be analysed by high-density oligonucleotide arrays, an in-vitro transcription based amplification method was employed and optimised. The method entailed the generation of double-stranded cDNA reverse transcripts carrying the T7 RNA polymerase promoter, followed by in-vitro transcription that yielded large amounts of linearly amplified mRNA (aRNA). The large volume of data that was produced was analysed by appropriate mathematical methods, such as Robust Multiarray Analysis, in order to obtain the set of genes that are up- or down-regulated specifically in each cell type. Principal Component Analysis confirmed the hypothesis that the variance in gene expression arose from the fact that different cell types were analysed. The results were validated by semi-quantitative RT-PCR analysis, which confirmed the sensitivity of the arrays. Additionally, several protein targets indicated by the array analysis were studied by immunohistochemical methods. The putative differential mechanisms regulating corneal epithelial stem cell as well as transient amplifying cell fate and corneal homeostasis are discussed. The results of this study are likely to augment efforts to understand corneal epithelial stem cells, and possibly other adult stem cells, and thereby assist future research and therapeutic interventions involving stem cells.
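    The array analysis step described above (RMA normalisation followed by Principal Component Analysis across cell types) can be illustrated with a minimal sketch. This is an assumed, generic workflow in Python, not the thesis's actual pipeline; the file name and matrix layout are hypothetical.

        import numpy as np
        from sklearn.decomposition import PCA

        # Hypothetical input: samples x genes matrix of RMA-normalised log2 intensities
        expression = np.loadtxt("rma_expression_matrix.txt")

        # Project samples onto the leading principal components; samples of the same
        # cell type (limbal, peripheral, central) are expected to cluster together if
        # cell type is the dominant source of variance.
        pca = PCA(n_components=2)
        scores = pca.fit_transform(expression)
        print(pca.explained_variance_ratio_)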

    Design and performance of lead systems for the analysis of atrial signal components in the ECG

    For over a century, electrocardiology has been observing human cardiac activity through recordings of electrocardiograms (ECG). The potential differences derived from the nine electrodes of the standard 12-lead ECG, placed at their designated positions, are the expression of the electric dynamics of which the heart is the source. According to well-defined protocols and established criteria of diagnosis, the signals of the electrocardiogram are used as indicators of cardiac pathology. However, of the four chambers of the human heart, each of which has a specific function, most attention in cardiology has traditionally been placed on the ventricles. This has meant that the conventional ECG system is focused on the observation of ventricular activity, and might not be optimal for studying the activity of the atria. The increasing prevalence of atrial fibrillation (AF) in the general population, with its inherent severe complications as well as the known social and economic impacts of the disease, has elicited studies investigating body surface potentials of atrial arrhythmias, invariably pivoted on the standard ECG. The aim of this thesis is the conception and validation of a lead system targeted at the analysis of atrial fibrillation. This new lead system should be dedicated to, and optimized for, capturing a maximal amount of information about the atrial electric activity taking place during fibrillation, but at the same time be well anchored to the standard ECG configuration, in view of its application in clinical practice. This constraint has led to the use of the same number of electrodes, nine, while leaving at least half of these, five, in their initial positions.

In the first part of this thesis, observations of body surface potential maps during normal atrial activity are discussed. The objective was to study the involvement of atrial repolarization in body surface potentials. While studying ECG signals recorded with 64-lead systems from 73 patients, special attention was devoted to the processing of low-amplitude signals. The local potential extremes were found at positions not sampled by the standard leads. Moreover, the PQ segment was found not to be electrically silent, the time course of the potential distribution being very similar to that during the P wave but with a reversed polarity and about 3-fold lower magnitudes. The results demonstrate a significant involvement of atrial repolarization during the PQ interval, and a small dispersion of atrial action potential durations.

In the second part, the design and evaluation of a new optimized lead system (OACG) dedicated to atrial fibrillation is presented, based on a biophysical-model study. Considering the material constraint mentioned above, the locations of four of the six precordial electrodes were optimized while leaving the remaining five electrodes of the standard ECG system in place. The analysis was based on episodes of eleven different variants of AF simulated by a biophysical model of the atria positioned inside an inhomogeneous thorax. The optimization criterion used was derived from the singular value decomposition of the data matrices. The four new electrode positions increased the ratio of the eighth to the first singular value of the data matrices of the new configuration about five-fold compared to that of the conventional electrode positions. The OACG lead system thus produces a more complete view of AF than the standard 12-lead system.
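    A minimal sketch of this type of SVD-based criterion is shown below; it assumes a generic leads-by-samples data matrix and is meant only to illustrate the ratio of the eighth to the first singular value, not to reproduce the thesis's actual optimization procedure.

        import numpy as np

        def singular_value_ratio(data, k=8):
            """Ratio of the k-th to the first singular value of a leads x samples matrix.

            A larger ratio suggests that the lead set captures more independent
            components of the (atrial) activity.
            """
            s = np.linalg.svd(data, compute_uv=False)
            return s[k - 1] / s[0]

        # Hypothetical usage: compare two candidate electrode configurations.
        # standard = ...   # 9 x n_samples matrix from the standard electrode positions
        # candidate = ...  # 9 x n_samples matrix from the optimized positions
        # print(singular_value_ratio(candidate) / singular_value_ratio(standard))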
The third part addresses the evaluation of the newly designed OACG lead system in its application to clinical signals. Atrial fibrillation signals were recorded in patients at the nine electrode positions of 1) the standard 12-lead ECG, 2) a heuristically designed lead system, the ACG lead system, and 3) the OACG lead system. After cancellation of the ventricular signals, an information measure was derived from the singular value decomposition of the atrial signals. The resulting values obtained from the three lead systems were compared. For the limited number of recordings made available from the OACG lead system, consistently higher values of the information measure were obtained with the OACG lead system than with the standard ECG or the ACG. The standard ECG is clearly suboptimal for the analysis of atrial fibrillation, and the OACG lead system provides a more complete view of its complex dynamics.

The electric cardiac activity can be represented as the time course of a current dipole source placed inside a homogeneous thorax, the vectorcardiogram (VCG). The fourth and final topic of this thesis concerns the design of a VCG lead system dedicated to atrial fibrillation. Body surface potentials during atrial fibrillation were simulated using a biophysical model of the human atria and thorax. The XYZ components of the equivalent dipole were derived from the Gabor-Nelson equations. These served as the gold standard while searching for methods to derive the vectorcardiogram from a limited number of electrode positions and their transfer coefficients. Six electrode configurations and dedicated matrices were tested using episodes of simulated atrial fibrillation and 25 different thorax models. The OACG lead system, including one electrode on the back, reduced the RMS-based relative estimation error in comparison with that of the well-known Frank lead system. The Frank lead system was found to be suboptimal for estimating the VCG during AF; alternative electrode configurations should include at least one electrode on the back.

The overall conclusion of these results is the suboptimality of the standard 12-lead ECG system with respect to the analysis of atrial fibrillation. The key features of atrial activity are well present in body surface potentials, but appear at locations not covered by the standard lead system. While anchoring more than half of its electrodes at their conventional positions, the lead system with four new electrode positions optimized for information extraction exhibited higher performance in a biophysical-model study. The application of such an adapted lead system, with its customized transfer coefficients, to clinical signals promises a considerable improvement in the analysis of atrial fibrillation. The lead system with one electrode on the back of the thorax, allowing a three-dimensional capture of the complex dynamics of atrial fibrillation signals, demonstrated its utility for deriving the VCG representation of the estimated source.
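    As a hedged illustration of this VCG derivation and its evaluation, the sketch below estimates the dipole components as a linear combination of electrode potentials through a transfer-coefficient matrix and computes an RMS-based relative error against a gold-standard VCG. The matrix shapes and names are assumptions for illustration and do not reproduce the thesis's actual coefficients.

        import numpy as np

        def estimate_vcg(potentials, transfer):
            """Estimate the XYZ dipole components from electrode potentials.

            potentials: (n_electrodes, n_samples) body-surface signals
            transfer:   (3, n_electrodes) lead-system transfer coefficients
            """
            return transfer @ potentials

        def rms_relative_error(estimate, reference):
            """RMS-based relative estimation error with respect to the gold-standard VCG."""
            return np.linalg.norm(estimate - reference) / np.linalg.norm(reference)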

    Analysis, Visualization, and Machine Learning of Epigenomic Data

    The goal of the Encyclopedia of DNA Elements (ENCODE) project has been to characterize all the functional elements of the human genome. These elements include expressed transcripts and genomic regions that are bound by transcription factors (TFs), occupied by nucleosomes, occupied by nucleosomes with modified histones, or hypersensitive to DNase I cleavage, among others. Chromatin immunoprecipitation followed by sequencing (ChIP-seq) is an experimental technique for detecting TF binding in living cells, and the genomic regions bound by TFs are called ChIP-seq peaks. ENCODE has performed and compiled results from tens of thousands of experiments, including ChIP-seq, DNase-seq, RNA-seq and Hi-C. These efforts have culminated in two web-based resources from our lab, Factorbook and SCREEN, for the exploration of epigenomic data for both human and mouse. Factorbook is a peak-centric resource presenting data such as motif enrichment and histone modification profiles for transcription factor binding sites computed from ENCODE ChIP-seq data. SCREEN provides an encyclopedia of ~2 million regulatory elements, including promoters and enhancers, identified using ENCODE ChIP-seq and DNase data, with an extensive UI for searching and visualization. While we have successfully utilized the thousands of available ENCODE ChIP-seq experiments to build the Encyclopedia and visualizers, we have also struggled with the practical and theoretical inability to assay every possible experiment on every possible biosample under every conceivable biological scenario. We have used machine learning techniques to predict TF binding sites and enhancer locations, and demonstrate that machine learning is critical for deciphering the functional regions of the genome.
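    As a hedged sketch of the kind of supervised prediction described above (not the actual models behind Factorbook or SCREEN), one could train a classifier on per-region features such as DNase signal and motif scores to predict whether a region overlaps a ChIP-seq peak. The file names and feature layout below are hypothetical.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.model_selection import train_test_split

        # Hypothetical inputs: per-region feature matrix and binary peak labels
        X = np.load("region_features.npy")   # e.g. DNase signal, motif scores per region
        y = np.load("peak_labels.npy")       # 1 if the region overlaps a ChIP-seq peak

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
        clf = GradientBoostingClassifier().fit(X_tr, y_tr)
        print("held-out accuracy:", clf.score(X_te, y_te))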

    Data mining techniques for large-scale gene expression analysis

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2011. By Nathan Patrick Palmer. Cataloged from PDF version of thesis. Includes bibliographical references (p. 238-256).

Modern computational biology is awash in large-scale data mining problems. Several high-throughput technologies have been developed that enable us, with relative ease and little expense, to evaluate the coordinated expression levels of tens of thousands of genes, evaluate hundreds of thousands of single-nucleotide polymorphisms, and sequence individual genomes. The data produced by these assays have provided the research and commercial communities with the opportunity to derive improved clinical prognostic indicators, as well as to develop an understanding, at the molecular level, of the systemic underpinnings of a variety of diseases. Aside from the statistical methods used to evaluate these assays, another, more subtle challenge is emerging. Despite the explosive growth in the amount of data being generated and submitted to the various publicly available data repositories, very little attention has been paid to managing the phenotypic characterization of their samples (i.e., managing class labels in a controlled fashion). If sense is to be made of the underlying assay data, the samples' descriptive metadata must first be standardized in a machine-readable format. In this thesis, we explore these issues, specifically within the context of curating and analyzing a large DNA microarray database. We address three main challenges. First, we acquire a large subset of a publicly available microarray repository and develop a principled method for extracting phenotype information from free-text sample labels, then use that information to generate an index of each sample's medically relevant annotation. The indexing method we develop, Concordia, incorporates pre-existing expert knowledge relating to the hierarchical relationships between medical terms, allowing queries of arbitrary specificity to be efficiently answered. Second, we describe a highly flexible approach to answering the question: "Given a previously unseen gene expression sample, how can we compute its similarity to all of the labeled samples in our database, and how can we utilize those similarity scores to predict the phenotype of the new sample?" Third, we describe a method for identifying phenotype-specific transcriptional profiles within the context of this database, and explore a method for measuring the relative strength of those signatures across the rest of the database, allowing us to identify molecular signatures that are shared across various tissues and diseases. These shared fingerprints may form a quantitative basis for optimal therapy selection and drug repositioning for a variety of diseases.
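    The second challenge above (scoring a new expression sample against all labelled samples and using those scores to predict its phenotype) can be illustrated with a minimal nearest-neighbour-style sketch. This is an assumed, generic formulation, not the specific method developed in the thesis.

        import numpy as np

        def predict_phenotype(query, database, labels, k=5):
            """Score a new expression profile against every labelled sample and
            vote among the k most similar ones.

            query:    (n_genes,) expression profile of the new sample
            database: (n_samples, n_genes) labelled expression profiles
            labels:   (n_samples,) phenotype label for each database sample
            """
            sims = np.array([np.corrcoef(query, s)[0, 1] for s in database])
            top = np.argsort(sims)[-k:]                       # indices of the k most similar samples
            values, counts = np.unique(np.asarray(labels)[top], return_counts=True)
            return values[np.argmax(counts)]                  # majority vote among neighbours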

    Psr1p interacts with SUN/sad1p and EB1/mal3p to establish the bipolar spindle

    Regular Abstracts - Sunday Poster Presentations: no. 382. During mitosis, interpolar microtubules from the two spindle pole bodies (SPBs) interdigitate to create an antiparallel microtubule array that accommodates numerous regulatory proteins. Among these proteins, the kinesin-5 cut7p/Eg5 is the key player responsible for sliding apart antiparallel microtubules and thus helps establish the bipolar spindle. At the onset of mitosis, the two SPBs are adjacent to one another with most microtubules running nearly parallel toward the nuclear envelope, creating an unfavorable microtubule configuration for kinesin-5. Therefore, how the cell organizes the antiparallel microtubule array in the first place at mitotic onset remains enigmatic. Here, we show that a novel protein, psr1p, localizes to the SPB and plays a key role in organizing the antiparallel microtubule array. The absence of psr1+ leads to a transient monopolar spindle and massive chromosome loss. Further functional characterization demonstrates that psr1p is recruited to the SPB through interaction with the conserved SUN protein sad1p, and that psr1p physically interacts with the conserved microtubule plus tip protein mal3p/EB1. These results suggest a model in which psr1p serves as a linking protein between sad1p/SUN and mal3p/EB1, allowing microtubule plus ends to be coupled to the SPBs for organization of an antiparallel microtubule array. Thus, we conclude that psr1p is involved in organizing the antiparallel microtubule array at mitotic onset through its interactions with SUN/sad1p and EB1/mal3p, thereby establishing the bipolar spindle.