On the number of founding germ cells in humans
BACKGROUND: The number of founding germ cells (FGCs) in mammals is of fundamental significance to the fidelity of gene transmission between generations, but estimates from various methods vary widely. In this paper we obtain a new estimate of the value in humans by using a mathematical model of germ cell development that depends on available oocyte counts for adult women. RESULTS: The germline-development model derives from the assumption that oogonial proliferation in the embryonic stage starts with a founding cells at t = 0 and that the subsequent proliferation can be described as a simple stochastic birth process. It follows that the population size X(t) at the end of germline expansion (around the 5th month of pregnancy in humans; t = 0.42 years) is a random variable with a negative binomial distribution. A formula based on the expectation and variance of this random variable yields a moment-based estimate of a that is insensitive to the progressive reduction in oocyte numbers due to their utilization and apoptosis at later stages of life. In addition, we describe an algorithm for computing the maximum likelihood estimate of the FGC population size (a), as well as the rates of oogonial division and loss to apoptosis. Applying both of these approaches to available oocyte-counting data, we obtain an estimate of a = 2-3 for Homo sapiens. CONCLUSION: The estimated number of founding germ cells in humans corresponds well with values previously derived from chimeric or mosaic mouse data. These findings suggest that the large variation in oocyte numbers between individual women is consistent with a smaller founding germ cell population size than has been estimated by cytological analyses.
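The birth-process argument can be illustrated with a small simulation. In a Yule (pure-birth) process started from a founder cells with division rate lambda, each founder's clone size at time t is geometric with parameter p = exp(-lambda*t), so the total X(t) is negative binomial with mean a/p and variance a(1-p)/p^2; eliminating p gives the moment estimator a_hat = m^2/(m + v). The sketch below is illustrative only: the rate, time, and sample size are invented, and the paper's maximum-likelihood algorithm is not reproduced.

```python
import numpy as np

# Sketch of the moment-based founder-count estimator, assuming a Yule
# pure-birth process started from `a` founding cells. Each founder's
# clone size at time t is geometric with p = exp(-lambda*t), so the
# total X(t) is negative binomial with E[X] = a/p and
# Var[X] = a(1-p)/p^2, which yields a_hat = mean^2 / (mean + variance).
# All parameter values here are invented for illustration.

rng = np.random.default_rng(0)

def simulate_population(a, lam, t, n_reps):
    """Total germ-cell count X(t) for n_reps independent embryos."""
    p = np.exp(-lam * t)
    # sum of `a` i.i.d. geometric(p) clone sizes (support 1, 2, ...)
    return rng.geometric(p, size=(n_reps, a)).sum(axis=1)

def moment_estimate(counts):
    """Estimate the number of founders from sample mean and variance."""
    m, v = counts.mean(), counts.var(ddof=1)
    return m * m / (m + v)

counts = simulate_population(a=3, lam=1.0, t=1.0, n_reps=100_000)
print(round(moment_estimate(counts)))  # recovers the true a = 3
```

The estimator's appeal, as the abstract notes, is that later proportional losses of oocytes rescale the mean and standard deviation together, leaving the founder count identifiable.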
Biologically based multistage modeling of radiation effects
This past year we have made substantial progress in modeling the contribution of homeostatic regulation to low-dose radiation effects and carcinogenesis. We have worked to refine and apply our multistage carcinogenesis models to explicitly incorporate cell cycle states, simple and complex damage, checkpoint delay, slow and fast repair, differentiation, and apoptosis, in order to study the effects of low-dose ionizing radiation in mouse intestinal crypts as well as in other tissues. We have one paper accepted for publication in "Advances in Space Research" and another manuscript in preparation describing this work. I also wrote a chapter describing our combined cell-cycle and multistage carcinogenesis model that will be published in a book on stochastic carcinogenesis models edited by Wei-Yuan Tan. In addition, we organized and held a workshop, "Biologically Based Modeling of Human Health Effects of Low-Dose Ionizing Radiation", July 28-29, 2005, at the Fred Hutchinson Cancer Research Center in Seattle, Washington. We had over 20 participants, including Mary Helen Barcellos-Hoff as keynote speaker, talks by most of the low-dose modelers in the DOE low-dose program, and experimentalists including Les Redpath (and Mary Helen), Noelle Metting from DOE, and Tony Brooks. It appears that homeostatic regulation may be central to understanding low-dose radiation phenomena. The primary effects of ionizing radiation (IR) are cell killing, delayed cell cycling, and induction of mutations. However, homeostatic regulation causes cells that are killed or damaged by IR to eventually be replaced. Cells with an initiating mutation may have a replacement advantage, leading to clonal expansion of these initiated cells. Thus we have focused particularly on modeling effects that disturb homeostatic regulation as early steps in the carcinogenic process. There are two primary considerations that support our focus on homeostatic regulation.
First, a number of epidemiologic studies using multistage carcinogenesis models that incorporate the "initiation, promotion, and malignant conversion" paradigm of carcinogenesis indicate that promotion of initiated cells is the most important cellular mechanism driving the shape of the age-specific hazard for many types of cancer. Second, we have realized that many of the genes modified in early stages of the carcinogenic process contribute to one or more of four general cellular pathways that confer a promotional advantage to cells when these pathways are disrupted.
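Why promotion dominates the hazard shape can be seen in a stripped-down multistage approximation (not the authors' calibrated model). Ignoring stochastic extinction of clones, an initiated clone seeded at age s has expected size exp(gamma*(t-s)) at age t, giving the approximate hazard h(t) = mu1*N*mu2*(exp(gamma*t) - 1)/gamma, where mu1 and mu2 are initiation and malignant-conversion rates, N is the normal cell count, and gamma is the net promotion rate. All parameter values below are invented for illustration.

```python
import numpy as np

# Hedged sketch of an initiation-promotion-conversion hazard
# approximation. Ignoring clone extinction, a clone initiated at age s
# has expected size exp(gamma*(t-s)) at age t, so integrating over
# seeding times gives h(t) = mu1*N*mu2*(exp(gamma*t) - 1)/gamma.
# Parameter values are illustrative assumptions, not fitted values.

def hazard(t, mu1=1e-7, N=1e8, mu2=1e-7, gamma=0.15):
    """Approximate cancer hazard at age t (per year)."""
    return mu1 * N * mu2 * np.expm1(gamma * t) / gamma

ages = np.array([40, 60, 80])
slow, fast = hazard(ages, gamma=0.10), hazard(ages, gamma=0.15)
# a modest change in the promotion rate gamma increasingly dominates
# late-life risk, while mu1, mu2, and N only rescale the curve
print(fast / slow)
```

Because mu1, mu2, and N enter only as a multiplicative constant while gamma sits in the exponent, small shifts in promotion reshape the age-specific hazard far more than comparable shifts in mutation rates.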
Does folic acid supplementation prevent or promote colorectal cancer? Results from model-based predictions.
Folate is essential for nucleotide synthesis, DNA replication, and methyl group supply. Low folate status has been associated with increased risks of several cancer types, suggesting a chemopreventive role of folate. However, recent findings on giving folic acid to patients with a history of colorectal polyps raise concerns about the efficacy and safety of folate supplementation and the long-term health effects of folate fortification. Results suggest that undetected precursor lesions may progress under folic acid supplementation, consistent with the role of folate in nucleotide synthesis and cell proliferation. To better understand the possible trade-offs between the protective effects of folic acid due to decreased mutation rates and possibly concomitant detrimental effects due to increased cell proliferation, we used a biologically based mathematical model of colorectal carcinogenesis. We predict changes in cancer risk based on the timing of treatment start and the potential effect of folic acid on cell proliferation and mutation rates. Changes in colorectal cancer risk in response to folic acid supplementation are likely a complex function of treatment start, duration, and the effects on cell proliferation and mutation rates. Predicted colorectal cancer incidence rates under supplementation are mostly higher than rates without folic acid supplementation unless supplementation is initiated early in life (before age 20 years). To the extent that this model predicts reality, it indicates that the effect on cancer risk of starting folic acid supplementation late in life is small, yet mostly detrimental. Experimental studies are needed to provide direct evidence for this dual role of folate in colorectal cancer and to validate and improve the model predictions.
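The timing dependence described above can be caricatured in a few lines. In this toy calculation (not the authors' calibrated colorectal model), each year of life seeds initiated clones at a mutation rate mu, every existing clone then expands at a net proliferation rate gamma, and folic acid is assumed to cut mu by 20% while raising gamma by 1% from the start age onward; all numbers are invented purely to illustrate the qualitative trade-off.

```python
import numpy as np

def lifetime_risk(start_age, T=80, mu0=1e-6, gamma0=0.12,
                  mu_factor=0.8, gamma_factor=1.01):
    """Toy risk proxy: year s seeds initiated clones at rate mu[s];
    a clone seeded in year s then grows at whatever net rate gamma[u]
    is in effect in each later year u. Supplementation from start_age
    lowers the mutation rate but raises proliferation.
    All parameter values are invented for illustration."""
    years = np.arange(T)
    on = years >= start_age
    mu = np.where(on, mu0 * mu_factor, mu0)
    gamma = np.where(on, gamma0 * gamma_factor, gamma0)
    # expected size at age T of a clone seeded in year s:
    # exp of the summed growth rates from year s to the end of life
    growth = np.exp(np.cumsum(gamma[::-1])[::-1])
    return float((mu * growth).sum())

risk_early = lifetime_risk(start_age=0)   # supplement for life
risk_late = lifetime_risk(start_age=60)   # supplement from age 60
risk_never = lifetime_risk(start_age=80)  # no supplementation
# early start protective, late start slightly adverse
print(risk_early < risk_never < risk_late)
```

Even this crude version reproduces the abstract's qualitative point: a late start boosts the growth of lesions that were seeded long before, while the mutation-rate benefit accrues only to clones not yet initiated.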
Impact of Reduced Tobacco Smoking on Lung Cancer Mortality in the United States During 1975–2000
Background: Considerable effort has been expended on tobacco control strategies in the United States since the mid-1950s. However, we have little quantitative information on how changes in smoking behaviors have impacted lung cancer mortality. We quantified the cumulative impact of changes in smoking behaviors that started in the mid-1950s on lung cancer mortality in the United States over the period 1975–2000. Methods: A consortium of six groups of investigators used common inputs consisting of simulated cohort-wise smoking histories for the birth cohorts of 1890 through 1970 and independent models to estimate the number of US lung cancer deaths averted during 1975–2000 as a result of changes in smoking behavior that began in the mid-1950s. We also estimated the number of deaths that could have been averted had tobacco control been completely effective in eliminating smoking after the Surgeon General’s first report on Smoking and Health in 1964. Results: Approximately 795,851 US lung cancer deaths were averted during the period 1975–2000: 552,574 among men and 243,277 among women. In the year 2000 alone, approximately 70,218 lung cancer deaths were averted: 44,135 among men and 26,083 among women. However, these numbers are estimated to represent approximately 32% of the lung cancer deaths that could have potentially been averted during the period 1975–2000, 38% of the lung cancer deaths that could have been averted in 1991–2000, and 44% of the lung cancer deaths that could have been averted in 2000. Conclusions: Our results reflect the cumulative impact of changes in smoking behavior since the 1950s. Despite a large impact of changing smoking behaviors on lung cancer deaths, lung cancer remains a major public health problem. Continued efforts at tobacco control are critical to further reduce the burden of this disease.
Comparing Benefits from Many Possible Computed Tomography Lung Cancer Screening Programs: Extrapolating from the National Lung Screening Trial Using Comparative Modeling
Background: The National Lung Screening Trial (NLST) demonstrated that in current and former smokers aged 55 to 74 years, with at least 30 pack-years of cigarette smoking history and who had quit smoking no more than 15 years ago, 3 annual computed tomography (CT) screens reduced lung cancer-specific mortality by 20% relative to 3 annual chest X-ray screens. We compared the benefits achievable with 576 lung cancer screening programs that varied CT screen number and frequency, ages of screening, and eligibility based on smoking. Methods and Findings: We used five independent microsimulation models with lung cancer natural history parameters previously calibrated to the NLST to simulate life histories of the US cohort born in 1950 under all 576 programs. ‘Efficient’ (within model) programs prevented the greatest number of lung cancer deaths, compared to no screening, for a given number of CT screens. Among 120 ‘consensus efficient’ (identified as efficient across models) programs, the average starting age was 55 years, the stopping age was 80 or 85 years, the average minimum pack-years was 27, and the maximum years since quitting was 20. Among consensus efficient programs, 11% to 40% of the cohort was screened, and 153 to 846 lung cancer deaths were averted per 100,000 people. In all models, annual screening based on age and smoking eligibility in NLST was not efficient; continuing screening to age 80 or 85 years was more efficient. Conclusions: Consensus results from five models identified a set of efficient screening programs that include annual CT lung cancer screening using criteria like NLST eligibility but extended to older ages. Guidelines for screening should also consider harms of screening and individual patient characteristics.
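The 'efficient program' selection the abstract describes is an efficiency-frontier computation: a program is kept only if no program with fewer CT screens averts at least as many lung cancer deaths. A minimal sketch, with invented program names and numbers (not the study's results):

```python
def efficient_programs(programs):
    """Keep only programs on the efficiency frontier: the most lung
    cancer deaths averted for a given number of CT screens.
    `programs` is a list of (name, ct_screens, deaths_averted) tuples;
    all values here are hypothetical illustrations."""
    frontier = []
    # scan in order of increasing screening burden; ties favor the
    # program that averts more deaths
    for prog in sorted(programs, key=lambda p: (p[1], -p[2])):
        # keep only programs that avert strictly more deaths than
        # every cheaper program already on the frontier
        if not frontier or prog[2] > frontier[-1][2]:
            frontier.append(prog)
    return frontier

# invented illustrative programs: (name, CT screens, deaths averted)
programs = [
    ("A", 100, 150), ("B", 200, 300), ("C", 150, 120), ("D", 250, 280),
]
print([p[0] for p in efficient_programs(programs)])  # ['A', 'B']
```

Here C is dominated by the cheaper A, and D by the cheaper B; applying the same dominance test across all five models yields the study's 'consensus efficient' set.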