Modelling Human Regulatory Variation in Mouse: Finding the Function in Genome-Wide Association Studies and Whole-Genome Sequencing
An increasing body of literature from genome-wide association studies and human whole-genome sequencing highlights the identification of large numbers of candidate regulatory variants of potential therapeutic interest in numerous diseases. Our relatively poor understanding of the functions of non-coding genomic sequence, and the slow and laborious process of experimentally validating the functional significance of human regulatory variants, limit our ability to fully benefit from this information in our efforts to comprehend human disease. Humanized mouse models (HuMMs), in which human genes are introduced into the mouse, suggest an approach to this problem. In the past, HuMMs have been used successfully to study human disease variants; e.g., the complex genetic condition arising from Down syndrome, common monogenic disorders such as Huntington disease and β-thalassemia, and cancer susceptibility genes such as BRCA1. In this commentary, we highlight a novel method for high-throughput, single-copy, site-specific generation of HuMMs entitled High-throughput Human Genes on the X Chromosome (HuGX). This method can be applied to most human genes for which a bacterial artificial chromosome (BAC) construct can be derived and a mouse-null allele exists. The strategy comprises (1) the use of recombineering technology to create a human variant-harbouring BAC, (2) knock-in of this BAC into the mouse genome using Hprt docking technology, and (3) allele comparison by interspecies complementation. We demonstrate the throughput of the HuGX method by generating a series of seven different alleles for the human NR2E1 gene at Hprt. Finally, we consider the current limitations of experimental approaches and call for a concerted effort by the genetics community, for both human and mouse, to solve the challenge of the functional analysis of human regulatory variation.
Automated Data Processing (ADP) research and development
Monitoring a comprehensive test ban treaty (CTBT) will require screening tens of thousands of seismic events each year. Reliable automated data analysis will be essential in keeping up with the continuous stream of events that a global monitoring network will detect. We are developing automated event location and identification algorithms by looking at the gaps and weaknesses in conventional ADP systems and by taking advantage of modern computational paradigms. Our research focuses on three areas: developing robust algorithms for signal feature extraction, integrating the analysis of critical measurements, and exploiting joint estimation techniques that combine data from acoustic, hydroacoustic, and seismic sensors. We identify several important problems for research and development; e.g., event location with approximate velocity models and event identification in the presence of outliers. We are employing both linear and nonlinear methods and advanced signal transform techniques to solve these event monitoring problems. Our goal is to increase event-interpretation throughput by employing the power and efficiency of modern computational techniques, and to improve the reliability of automated analysis by reducing the rates of false alarms and missed detections.
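The abstract mentions robust signal feature extraction for screening seismic events. One classic feature in this setting is the short-term/long-term average (STA/LTA) energy ratio, which flags impulsive arrivals in continuous data. The sketch below is illustrative only, not the authors' ADP system; the window lengths and trigger threshold are assumed tuning parameters, and the input trace is synthetic.

```python
import numpy as np

def sta_lta(trace, fs, sta_win=1.0, lta_win=30.0):
    """STA/LTA detector ratio for a single seismic trace.

    A high ratio indicates a burst of short-term energy relative to the
    background, the standard cue for an arriving phase. Window lengths
    (seconds) are illustrative defaults, not values from the abstract.
    """
    sta_n = int(sta_win * fs)
    lta_n = int(lta_win * fs)
    energy = np.asarray(trace, dtype=float) ** 2
    # Moving averages computed from cumulative sums for efficiency.
    csum = np.cumsum(np.insert(energy, 0, 0.0))
    sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n
    lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n
    # Keep only samples where both windows are full, aligned by window end.
    n = min(len(sta), len(lta))
    return sta[-n:] / np.maximum(lta[-n:], 1e-12)

# Synthetic example: Gaussian noise with an embedded impulsive "event".
rng = np.random.default_rng(0)
fs = 100.0                                   # samples per second
trace = rng.normal(0.0, 1.0, int(120 * fs))  # 2 minutes of noise
trace[6000:6200] += 8.0 * rng.normal(0.0, 1.0, 200)  # high-energy arrival
ratio = sta_lta(trace, fs)
triggered = ratio.max() > 5.0  # threshold 5.0 is an assumed tuning choice
```

In a production pipeline this per-channel trigger would be only the first stage; the abstract's emphasis on joint estimation implies that such features from acoustic, hydroacoustic, and seismic channels would then be combined before declaring an event.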