
    Mediterranean radiocarbon offsets and calendar dates for prehistory

    A single Northern Hemisphere calibration curve has formed the basis of radiocarbon dating in Europe and the Mediterranean for five decades, setting the time frame for prehistory. However, as measurement precision increases, there is mounting evidence for small but substantive regional offsets (partly reflecting growing-season differences) in same-year radiocarbon levels. Controlling for interlaboratory variation, we compare radiocarbon data from Europe and the Mediterranean in the second to earlier first millennia BCE. Consistent with recent findings in the second millennium CE, these data suggest that some small but critical periods of variation in Mediterranean radiocarbon levels exist, especially associated with major reversals or plateaus in the atmospheric radiocarbon record. At high precision, these variations potentially affect calendar dates for prehistory by up to a few decades, including, for example, Egyptian history and the much-debated Thera/Santorini volcanic eruption.
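The calibration step the abstract refers to can be sketched simply: a measured conventional 14C age is matched against a calibration curve to find likely calendar years, and a small regional offset in 14C years shifts the resulting calendar estimate. The sketch below is purely illustrative, using a hypothetical linear curve; real work uses the non-linear IntCal20 curve, whose reversals and plateaus are exactly what magnify such offsets.

```python
import math

def curve_c14_age(cal_bce):
    """Toy calibration curve: calendar year BCE -> conventional 14C age (BP).
    Hypothetical linear form chosen for the demo; not a real curve."""
    return 3000.0 + 0.9 * (cal_bce - 1200)

def calibrate(measured_bp, sigma, offset=0.0):
    """Return the maximum-likelihood calendar year (BCE) for a measured
    14C age, after subtracting a regional offset (in 14C years)."""
    adjusted = measured_bp - offset
    best_year, best_like = None, -1.0
    for cal in range(1000, 1500):  # scan a window of calendar years BCE
        mu = curve_c14_age(cal)
        like = math.exp(-0.5 * ((adjusted - mu) / sigma) ** 2)
        if like > best_like:
            best_year, best_like = cal, like
    return best_year

base = calibrate(3100.0, 15.0)
shifted = calibrate(3100.0, 15.0, offset=10.0)  # a 10 14C-yr regional offset
print(base, shifted, base - shifted)
```

With a curve slope near one, a ~10 14C-year regional offset moves the maximum-likelihood calendar date by roughly a decade; on a plateau, where the curve flattens, the same offset can move it much further.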

    A Multi-Proxy Assessment of the Impact of Environmental Instability on Late Holocene (4500-3800 BP) Native American Villages of the Georgia Coast

    Circular shell rings along the South Atlantic Coast of North America are the remnants of some of the earliest villages that emerged during the Late Archaic (5000–3000 BP). Many of these villages, however, were abandoned during the Terminal Late Archaic (ca. 3800–3000 BP). We combine Bayesian chronological modeling with mollusk shell geochemistry and oyster paleobiology to understand the nature and timing of environmental change associated with the emergence and abandonment of circular shell ring villages on Sapelo Island, Georgia. Our Bayesian models indicate that Native Americans occupied the three Sapelo shell rings at varying times with some generational overlap. By the end of the complex's occupation, only Ring III remained in use before abandonment ca. 3845 BP. Ring III also contains statistically smaller oysters, harvested from less saline estuaries, compared with earlier occupations. Integrating shell biochemical and paleobiological data with recent tree-ring analyses shows a clear pattern of environmental fluctuations throughout the period in which the rings were occupied. We argue that as the environment became unstable around 4300 BP, aggregation at villages provided a way to effectively manage fisheries that are highly sensitive to environmental change. However, with the eventual collapse of oyster fisheries and a subsequent rebound in environmental conditions after ca. 3800 BP, people dispersed from the shell rings and shifted to non-marine subsistence economies and other types of settlements. This study provides the most comprehensive evidence to date for correlations between large-scale environmental change and societal transformations on the Georgia coast during the Late Archaic period.

    Social stratification without genetic differentiation at the site of Kulubnarti in Christian Period Nubia

    Relatively little is known about Nubia’s genetic landscape prior to the influence of the Islamic migrations that began in the late 1st millennium CE. Here, we increase the number of ancient individuals with genome-level data from the Nile Valley from three to 69, reporting data for 66 individuals from two cemeteries at the Christian Period (~650–1000 CE) site of Kulubnarti, where multiple lines of evidence suggest social stratification. The Kulubnarti Nubians had ~43% Nilotic-related ancestry (individual variation of ~36–54%), with the remaining ancestry consistent with being introduced through Egypt and ultimately deriving from an ancestry pool like that found in the Bronze and Iron Age Levant. The Kulubnarti gene pool – shaped over a millennium – harbors disproportionately female-associated West Eurasian-related ancestry. Genetic similarity among individuals from the two cemeteries supports a hypothesis of social division without genetic distinction. Seven pairs of inter-cemetery relatives suggest fluidity between cemetery groups. Present-day Nubians are not directly descended from the Kulubnarti Nubians, attesting to additional genetic input since the Christian Period. K.A.S. was supported by a Doctoral Dissertation Research Improvement Grant from the National Science Foundation (BCS-1613577). D.R. was funded by NSF HOMINID grant BCS-1032255; NIH (NIGMS) grant GM100233; the Allen Discovery Center program, a Paul G. Allen Frontiers Group advised program of the Paul G. Allen Family Foundation; the John Templeton Foundation grant 61220; and the Howard Hughes Medical Institute.

    Contact-Era Chronology Building in Iroquoia: Age Estimates for Arendarhonon Sites and Implications for Identifying Champlain's Cahiagué

    Radiocarbon dating is rarely used in historical or contact-era North American archaeology because of idiosyncrasies of the calibration curve that result in ambiguous calendar dates for this period. We explore the potential and requirements for radiocarbon dating and Bayesian analysis to create a time frame for early contact-era sites in northeast North America independent of the assumptions and approximations involved in temporal constructs based on trade goods and other archaeological correlates. To illustrate, we use Bayesian chronological modeling to analyze radiocarbon dates on short-lived samples and a post from four Huron-Wendat Arendarhonon sites (Benson, Sopher, Ball, and Warminster) to establish an independent chronology. We find that Warminster was likely occupied in 1615–1616, making it the most likely candidate for the site of Cahiagué visited by Samuel de Champlain in 1615–1616, versus the other main suggested alternative, Ball, which dates earlier, as do the Sopher and Benson sites. In fact, the Benson site seems likely to date ~50 years earlier than currently thought. We present the methods employed to arrive at these new, independent age estimates and argue that absolute redating of historic-era sites is necessary to accurately assess existing interpretations based on relative dating and associated regional narratives.
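One building block of the Bayesian chronological modeling described above can be illustrated simply: when several radiocarbon measurements are assumed to date the same event (for example, multiple short-lived samples from one occupation), they are typically pooled into an error-weighted mean before calibration, as in OxCal's R_Combine. This is a minimal sketch with hypothetical measurements, not the paper's actual data or full model.

```python
import math

def combine_c14(ages, sigmas):
    """Error-weighted mean of multiple 14C measurements assumed to date
    the same event: the standard pre-calibration pooling step."""
    weights = [1.0 / s ** 2 for s in sigmas]          # inverse-variance weights
    mean = sum(a * w for a, w in zip(ages, weights)) / sum(weights)
    sigma = math.sqrt(1.0 / sum(weights))             # pooled uncertainty
    return mean, sigma

# Hypothetical measurements (14C yr BP) on short-lived samples from one occupation:
mean, sigma = combine_c14([310.0, 295.0, 305.0], [20.0, 25.0, 15.0])
print(round(mean, 1), round(sigma, 1))
```

Note that the pooled uncertainty (here ~10.8 14C years) is smaller than any individual measurement error, which is what makes high-precision work on the plateau-ridden contact-era curve feasible at all.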

    AAV Exploits Subcellular Stress Associated with Inflammation, Endoplasmic Reticulum Expansion, and Misfolded Proteins in Models of Cystic Fibrosis

    Barriers to infection act at multiple levels to prevent viruses, bacteria, and parasites from commandeering host cells for their own purposes. An intriguing hypothesis is that if a cell experiences stress, such as that elicited by inflammation, endoplasmic reticulum (ER) expansion, or misfolded proteins, then subcellular barriers will be less effective at preventing viral infection. Here we have used models of cystic fibrosis (CF) to test whether subcellular stress increases susceptibility to adeno-associated virus (AAV) infection. In human airway epithelium cultured at an air/liquid interface, physiological conditions of subcellular stress and ER expansion were mimicked using supernatant from mucopurulent material derived from CF lungs. Using this inflammatory stimulus to recapitulate stress found in diseased airways, we demonstrated that AAV infection was significantly enhanced. Since over 90% of CF cases are associated with a misfolded variant of Cystic Fibrosis Transmembrane Conductance Regulator (ΔF508-CFTR), we then explored whether the presence of misfolded proteins could independently increase susceptibility to AAV infection. In these models, AAV was an order of magnitude more efficient at transducing cells expressing ΔF508-CFTR than cells expressing wild-type CFTR. Rescue of misfolded ΔF508-CFTR under low-temperature conditions restored viral transduction efficiency to that demonstrated in controls, suggesting that effects related to protein misfolding were responsible for increasing susceptibility to infection. By testing other CFTR mutants, G551D, D572N, and 1410X, we have shown this phenomenon is common to other misfolded proteins and not related to loss of CFTR activity. The presence of misfolded proteins did not affect cell surface attachment of virus or influence expression levels from promoter transgene cassettes in plasmid transfection studies, indicating exploitation occurs at the level of virion trafficking or processing. Thus, we surmised that factors enlisted to process misfolded proteins such as ΔF508-CFTR in the secretory pathway also act to restrict viral infection. In line with this hypothesis, we found that AAV trafficked to the microtubule organizing center and localized near Golgi/ER transport proteins. Moreover, AAV infection efficiency could be modulated with siRNA-mediated knockdown of proteins involved in processing ΔF508-CFTR or sorting retrograde cargo from the Golgi and ER (calnexin, KDEL-R, β-COP, and PSMB3). In summary, our data support a model where AAV exploits a compromised secretory system and, importantly, underscore the gravity with which a stressed subcellular environment, under internal or external insults, can impact infection efficiency.

    Children’s and adolescents’ rising animal-source food intakes in 1990–2018 were impacted by age, region, parental education and urbanicity

    Animal-source foods (ASF) provide key nutrients for children’s and adolescents’ physical and cognitive development. Here, we use data from the Global Dietary Database and Bayesian hierarchical models to quantify global, regional and national ASF intakes between 1990 and 2018 by age group across 185 countries, representing 93% of the world’s child population. Mean ASF intake was 1.9 servings per day, with 16% of children consuming at least three daily servings. Intake was similar between boys and girls, but higher among urban children with educated parents. Consumption varied by age, from 0.6 servings per day at <1 year to 2.5 servings per day at 15–19 years. Between 1990 and 2018, mean ASF intake increased by 0.5 servings per week, with increases in all regions except sub-Saharan Africa. In 2018, total ASF consumption was highest in Russia, Brazil, Mexico and Turkey, and lowest in Uganda, India, Kenya and Bangladesh. These findings can inform policy to address malnutrition through targeted ASF consumption programmes.

    Incident type 2 diabetes attributable to suboptimal diet in 184 countries

    The global burden of diet-attributable type 2 diabetes (T2D) is not well established. This risk assessment model estimated T2D incidence among adults attributable to direct and body weight-mediated effects of 11 dietary factors in 184 countries in 1990 and 2018. In 2018, suboptimal intake of these dietary factors was estimated to account for 14.1 million (95% uncertainty interval (UI), 13.8–14.4 million) incident T2D cases, representing 70.3% (68.8–71.8%) of new cases globally. The largest T2D burdens were attributable to insufficient whole-grain intake (26.1% (25.0–27.1%)), excess refined rice and wheat intake (24.6% (22.3–27.2%)) and excess processed meat intake (20.3% (18.3–23.5%)). Across regions, the highest proportional burdens were in central and eastern Europe and central Asia (85.6% (83.4–87.7%)) and Latin America and the Caribbean (81.8% (80.1–83.4%)), and the lowest were in South Asia (55.4% (52.1–60.7%)). Proportions of diet-attributable T2D were generally larger in men than in women and were inversely correlated with age. Diet-attributable T2D was generally larger among urban versus rural residents and among higher versus lower educated individuals, except in high-income countries and in central and eastern Europe and central Asia, where burdens were larger among rural residents and lower educated individuals. Compared with 1990, global diet-attributable T2D increased by 2.6 absolute percentage points (8.6 million more cases) in 2018, with variation in these trends by world region and dietary factor. These findings inform nutritional priorities and clinical and public health planning to improve dietary quality and reduce T2D globally.
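The notion of an "attributable" share of cases used above can be illustrated with the standard population attributable fraction (Levin's formula). The study's comparative risk assessment is considerably more elaborate (multiple dietary factors, direct and body weight-mediated pathways, uncertainty propagation), so the inputs below are purely hypothetical.

```python
def paf(p_exposed, rr):
    """Population attributable fraction for a single dichotomous risk
    factor (Levin's formula): the share of cases that would not occur
    if the exposure were removed from the population."""
    excess = p_exposed * (rr - 1.0)
    return excess / (excess + 1.0)

# Hypothetical inputs: 60% of a population below optimal whole-grain
# intake, with a relative risk of 1.5 for T2D versus optimal intake.
fraction = paf(0.60, 1.5)
cases = 20_000_000  # hypothetical annual incident T2D cases
print(round(fraction, 3), round(fraction * cases))
```

Summing such fractions naively across correlated dietary factors would overcount, which is one reason the study's joint model is needed rather than factor-by-factor arithmetic.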

    Market share and recent hiring trends in anthropology faculty positions.

    Between 1985 and 2014, the number of US doctoral graduates in anthropology increased from about 350 to 530 graduates per year. This rise in doctorates entering the workforce, along with an overall decrease in the number of tenure-track academic positions, has resulted in a highly competitive academic job market. We estimate that approximately 79% of US anthropology doctorates do not obtain tenure-track positions at BA/BS, MA/MS, or PhD institutions in the US. Here, we examine where US anthropology faculty obtained their degrees and where they ultimately end up teaching as tenure-track faculty. Using data derived from the 2014–2015 AnthroGuide and anthropology departmental web pages, we identify and rank PhD programs in terms of the number of graduates who have obtained tenure-track academic jobs; examine long-term and ongoing trends in the programs producing doctorates for the discipline as a whole, as well as for the subfields of archaeology, bioanthropology, and sociocultural anthropology; and discuss gender inequity in academic anthropology within the US.

    Choosing a path to the ancient world in a modern market: the reality of faculty jobs in archaeology

    Over the past 30 years, the number of US doctoral anthropology graduates has increased by about 70%, but there has not been a corresponding increase in the availability of new faculty positions. Consequently, doctoral degree-holding archaeologists face more competition than ever before when applying for faculty positions. Here we examine where US and Canadian anthropological archaeology faculty originate and where they ultimately end up teaching. Using data derived from the 2014–2015 AnthroGuide, we rank doctoral programs whose graduates in archaeology have been most successful in the academic job market; identify long-term and ongoing trends in doctoral programs; and discuss gender division in academic archaeology in the US and Canada. We conclude that success in obtaining a faculty position upon graduation is predicated in large part on where one attends graduate school.

    Self-reported diabetes mellitus in the city of São Paulo: prevalence and inequality

    This report analyzes the prevalence of self-reported diabetes mellitus in the city of São Paulo, Brazil. The data were obtained from the household survey conducted by the Brazilian Multicenter Study on the Prevalence of Diabetes Mellitus in 1986–1988. The São Paulo sample consisted of 2,007 individuals aged 30–69 years, of both sexes, selected from three areas with distinct socio-economic levels in which care programs for diabetics were being implemented. The estimated prevalence, using a 75 g glucose load and measurement of two-hour capillary glycemia, was 9.7%. The prevalence of self-reported diabetes was 4.7%, increasing with age and closely associated with a reported family history of diabetes. There was a significant difference between sexes (3.5% in men and 5.7% in women), with higher rates of self-reported diabetes at higher socio-economic levels among men and at lower socio-economic levels among women.