
    Twenty Thousand-Year-Old Huts at a Hunter-Gatherer Settlement in Eastern Jordan

    Ten thousand years before Neolithic farmers settled in permanent villages, hunter-gatherer groups of the Epipalaeolithic period (c. 22,000–11,600 cal BP) inhabited much of southwest Asia. The latest Epipalaeolithic phase (Natufian) is well known for the appearance of stone-built houses, complex site organization, a sedentary lifestyle and social complexity, precursors of a Neolithic way of life. In contrast, pre-Natufian sites are much less well known and are generally considered campsites for small groups of seasonally mobile hunter-gatherers. Work at the Early and Middle Epipalaeolithic aggregation site of Kharaneh IV in eastern Jordan shows that some of these earlier sites were large aggregation base camps not unlike those of the Natufian, and contributes to ongoing debates on their duration of occupation. Here we discuss the excavation of two 20,000-year-old hut structures at Kharaneh IV that pre-date the renowned stone houses of the Natufian. Exceptionally dense and extensive occupational deposits exhibit repeated habitation over prolonged periods and contain structural remains associated with exotic and potentially symbolic caches of objects (shell, red ochre, and burnt horn cores), indicating substantial settlement of the site that pre-dates the Natufian and lies outside the Natufian homeland as currently understood.

    Principal component analysis of ensemble recordings reveals cell assemblies at high temporal resolution

    Simultaneous recordings of many single neurons reveal unique insights into network processing, spanning timescales from single spikes to global oscillations. Neurons dynamically self-organize into subgroups of coactivated elements referred to as cell assemblies. Furthermore, these cell assemblies are reactivated, or replayed, preferentially during subsequent rest or sleep episodes, a proposed mechanism for memory trace consolidation. Here we employ Principal Component Analysis to isolate such patterns of neural activity. In addition, a measure is developed to quantify the similarity of instantaneous activity with a template pattern, and we derive theoretical distributions for the null hypothesis of no correlation between spike trains, allowing one to evaluate the statistical significance of instantaneous coactivations. Hence, when applied in an epoch different from the one in which the patterns were identified (e.g., subsequent sleep), this measure allows one to identify the times and intensities of reactivation. The distribution of this measure provides information on the dynamics of reactivation events: in sleep these occur as transients rather than as a continuous process.
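    The PCA-based assembly-detection approach described in this abstract can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the function names are invented, and the Marchenko-Pastur eigenvalue bound is used here as a stand-in for the paper's derived null distributions.

```python
import numpy as np

def assembly_patterns(spike_counts):
    """Extract putative cell-assembly patterns from binned spike counts.

    spike_counts: (n_neurons, n_bins) array.
    Returns the principal components whose eigenvalues exceed the
    Marchenko-Pastur upper bound, i.e. the null expectation for
    uncorrelated spike trains (an assumption of this sketch).
    """
    n_neurons, n_bins = spike_counts.shape
    # z-score each neuron's binned spike train
    z = spike_counts - spike_counts.mean(axis=1, keepdims=True)
    z /= spike_counts.std(axis=1, keepdims=True)
    corr = z @ z.T / n_bins                       # neuron-pair correlation matrix
    eigvals, eigvecs = np.linalg.eigh(corr)       # ascending eigenvalues
    lam_max = (1 + np.sqrt(n_neurons / n_bins)) ** 2
    return eigvecs[:, eigvals > lam_max]          # significant patterns only

def reactivation_strength(pattern, z_epoch):
    """Similarity of instantaneous activity with a template pattern.

    z_epoch: z-scored (n_neurons, n_bins) activity from another epoch
    (e.g. sleep). Returns one similarity value per time bin.
    """
    proj = np.outer(pattern, pattern)
    np.fill_diagonal(proj, 0)    # discard single-neuron (diagonal) contributions
    return np.einsum('ib,ij,jb->b', z_epoch, proj, z_epoch)
```

    Applying `reactivation_strength` to z-scored sleep activity, with patterns identified during waking, yields a time series whose transient peaks would correspond to candidate replay events.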

    The efficacy of high-throughput sequencing and target enrichment on charred archaeobotanical remains

    The majority of archaeological plant material is preserved in a charred state. Obtaining reliable ancient DNA data from these remains has presented challenges due to high rates of nucleotide damage, short DNA fragment lengths, low endogenous DNA content and the potential for modern contamination. It has been suggested that high-throughput sequencing (HTS) technologies coupled with DNA enrichment techniques may overcome some of these limitations. Here we report the findings of HTS and target enrichment on four important archaeological crops (barley, grape, maize and rice) performed in three different laboratories, presenting the largest HTS assessment of charred archaeobotanical specimens to date. Rigorous analysis of our data, excluding false positives due to background contamination or incorrect index assignments, indicated a lack of endogenous DNA in nearly all samples, except for one lightly charred maize cob. Even with target enrichment, this sample failed to yield the data required to address fundamental questions in archaeology and biology. We further reanalysed part of an existing dataset on charred plant material and found that all purported endogenous DNA sequences were likely to be spurious. We suggest these technologies are not suitable for use with charred archaeobotanicals and urge great caution when interpreting data obtained by HTS of these remains.

    Factors associated with intentions to adhere to colorectal cancer screening follow-up exams

    BACKGROUND: To increase the rate of adherence to recommendations for follow-up after abnormal colorectal cancer (CRC) screening results, factors that inhibit and facilitate follow-up must be identified. The purpose of this study was to identify the factors associated with intention to adhere to CRC screening follow-up exams. METHODS: During a 4-week period in October 2003, this survey was conducted with 426 subjects participating in a community-based CRC screening program in Nagano, Japan. Study measures included intention to adhere to the recommendation for clinical follow-up in the event of an abnormal fecal occult blood test (FOBT) result, perceived susceptibility to and severity of CRC, perceived benefits of and barriers to undergoing follow-up examination, social support, knowledge of CRC risk factors, health status, previous CRC screening, personality and sociodemographic characteristics. Univariate and multivariate logistic regression analyses of intention to adhere to recommendations for follow-up were performed. RESULTS: Among the 288 individuals analyzed, 74.7% indicated that they would definitely adhere to recommendations for follow-up. After controlling for age, gender, marital status, education, economic status, trait anxiety, bowel symptoms, family history of CRC, and previous screening FOBT, the analyses revealed that lower levels of perceived barriers, higher levels of perceived benefits and greater knowledge of CRC risk factors were each significantly associated with high intention. CONCLUSION: The results of this study suggest that future interventions should focus on reducing modifiable barriers by clarifying misperceptions about follow-up, promoting the acceptance of complete diagnostic evaluations, addressing psychological distress, and making follow-up testing more convenient and accessible. Moreover, educating the public about the risk factors of CRC and increasing understanding of the benefits of follow-up are also important.
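    The multivariate logistic regression analysis described in this abstract can be illustrated schematically. Everything below is an assumption for illustration only: the data are synthetic, the predictor names and effect sizes are hypothetical, and the study itself used its own survey dataset and covariate set.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Fit a logistic regression by gradient ascent on the log-likelihood.

    X: (n, p) predictor matrix; y: binary outcome (intention to adhere).
    Returns weights; w[0] is the intercept, w[1:] are log-odds per unit.
    """
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])   # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))           # predicted probabilities
        w += lr * Xb.T @ (y - p) / len(y)           # log-likelihood gradient step
    return w

# Synthetic data mirroring the reported direction of effects: intention
# rises with perceived benefits and knowledge, falls with perceived barriers.
rng = np.random.default_rng(1)
n = 500
benefits, barriers, knowledge = rng.normal(size=(3, n))
logit = 0.5 + 1.0 * benefits - 1.2 * barriers + 0.8 * knowledge
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)
w = fit_logistic(np.column_stack([benefits, barriers, knowledge]), y)
```

    On data generated this way, the fitted signs of `w[1:]` recover the assumed directions, which is the qualitative pattern the study reports for barriers, benefits and knowledge.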

    Increased cortical surface area and gyrification following long-term survival from early monocular enucleation

    Purpose: Retinoblastoma is typically diagnosed before 5 years of age and is often treated by enucleation (surgical removal) of the cancerous eye. Here, we sought to characterize morphological changes of the cortex following long-term survival from early monocular enucleation. Methods: Nine adults with early right-eye enucleation (≤48 months of age) due to retinoblastoma were compared to 18 binocularly intact controls. Surface area, cortical thickness, and gyrification estimates were obtained from T1-weighted images and group differences were examined. Results: Early monocular enucleation was associated with increased surface area and/or gyrification in visual (i.e., V1, inferior temporal), auditory (i.e., supramarginal), and multisensory (i.e., superior temporal, inferior parietal, superior parietal) cortices compared with controls. Visual cortex increases were restricted to the right hemisphere contralateral to the remaining eye, consistent with previous subcortical data showing asymmetrical lateral geniculate nucleus volume following early monocular enucleation. Conclusions: Altered morphological development of visual, auditory, and multisensory regions occurs subsequent to long-term survival from early eye loss.

    Effects of Neonatal Neural Progenitor Cell Implantation on Adult Neuroanatomy and Cognition in the Ts65Dn Model of Down Syndrome

    As much of the aberrant neural development in Down syndrome (DS) occurs postnatally, an early opportunity exists to intervene and influence life-long cognitive development. Recent success using neural progenitor cells (NPC) in models of adult neurodegeneration indicates that such therapy may be a viable option in diseases such as DS. Murine NPC (mNPC, C17.2 cell line) or saline were implanted bilaterally into the dorsal hippocampus of postnatal day 2 (PND 2) Ts65Dn pups to explore the feasibility of early postnatal treatment in this mouse model of DS. Disomic littermates provided karyotype controls for trisomic pups. Pups were monitored for developmental milestone achievement and then underwent adult behavior testing at 14 weeks of age. We found that implanted mNPC survived into adulthood and migrated beyond the implant site in both karyotypes. The implantation of mNPC resulted in a significant increase in the density of dentate granule cells. However, mNPC implantation did not elicit cognitive changes in trisomic mice either neonatally or in adulthood. To the best of our knowledge, these results constitute the first assessment of mNPC as an early intervention for cognitive ability in a DS model.

    Characterization of PTZ-Induced Seizure Susceptibility in a Down Syndrome Mouse Model That Overexpresses CSTB

    Down syndrome (DS) is a complex genetic syndrome characterized by intellectual disability, dysmorphism and variable additional physiological traits. Current research has begun to decipher the neural mechanisms underlying cognitive impairment, leading to new therapeutic perspectives. Pentylenetetrazol (PTZ) has recently been found to have positive effects on the learning and memory capacities of a DS mouse model and is foreseen as a treatment for DS patients. However, PTZ is also known to be a convulsant drug at higher doses, and people with DS are more prone to epileptic seizures than the general population. This raises concerns over the long-term effects of treatment in the DS population. The cause of the increased propensity for epilepsy in the DS population, and which Hsa21 gene(s) are implicated, remain unknown. Among Hsa21 candidate genes in epilepsy, CSTB, coding for the cysteine protease inhibitor cystatin B, is involved in progressive myoclonus epilepsy and ataxia in both mice and humans. We therefore aimed to evaluate the effect of an increase in Cstb gene dosage on spontaneous epileptic activity and susceptibility to PTZ-induced seizure. To this end, we generated a new mouse model trisomic for Cstb by homologous recombination. We verified that increasing the copy number of Cstb from trisomy (Ts) to tetrasomy (Tt) drove overexpression of the gene in the brain, checked transgenic animals for locomotor activity and electroencephalogram (EEG) abnormalities characteristic of myoclonic epilepsy, and tested whether those animals were prone to PTZ-induced seizure. Overall, the results of the analysis show that an increase in Cstb neither induces spontaneous epileptic activity nor alters the propensity of Ts and Tt mice to myoclonic seizures, suggesting that Cstb dosage should not interfere with PTZ treatment.

    Persistent Place-Making in Prehistory: the Creation, Maintenance, and Transformation of an Epipalaeolithic Landscape

    Most archaeological projects today integrate, at least to some degree, how past people engaged with their surroundings: how they strategized resource use, organized technological production, or scheduled movements within a physical environment, as well as how they constructed cosmologies around, or created symbolic connections to, places in the landscape. However, there are many ways in which archaeologists approach the creation, maintenance, and transformation of human-landscape interrelationships. This paper explores some of these approaches for reconstructing the Epipalaeolithic (ca. 23,000–11,500 years BP) landscape of Southwest Asia, using macro- and microscale geoarchaeological approaches to examine how everyday practices left traces of human-landscape interactions in northern and eastern Jordan. The case studies presented here demonstrate that these Epipalaeolithic groups engaged in complex and far-reaching social landscapes. Examination of the Early and Middle Epipalaeolithic (EP) highlights that the notion of "Neolithization" is somewhat misleading, as many of the features we use to define this transition were already well-established patterns of behavior by the Neolithic. Instead, these features and practices were enacted within a hunter-gatherer world and worldview.

    Absence of N addition facilitates B cell development, but impairs immune responses

    The programmed, stepwise acquisition of immunocompetence that marks the development of the fetal immune response proceeds during a period when both T cell receptor and immunoglobulin (Ig) repertoires exhibit reduced junctional diversity due to physiologic terminal deoxynucleotidyl transferase (TdT) insufficiency. To test the effect of N addition on humoral responses, we transplanted bone marrow from TdT-deficient (TdT−/−) and wild-type (TdT+/+) BALB/c mice into recombination activation gene 1-deficient BALB/c hosts. Mice transplanted with TdT−/− cells exhibited diminished humoral responses to the T-independent antigens α-1-dextran and (2,4,6-trinitrophenyl) hapten conjugated to AminoEthylCarboxymethyl-FICOLL, to the T-dependent antigens NP19CGG and hen egg lysozyme, and to Enterobacter cloacae, a commensal bacterium that can become an opportunistic pathogen in immature and immunocompromised hosts. An exception to this pattern of reduction was the T-independent anti-phosphorylcholine response to Streptococcus pneumoniae, which is normally dominated by the N-deficient T15 idiotype. Most of the humoral immune responses in the recipients of TdT−/− bone marrow were impaired, yet population of the blood with B and T cells occurred more rapidly. To further test the effect of N deficiency on B cell and T cell population growth, we placed transplanted TdT-sufficient and -deficient BALB/c IgMa and congenic TdT-sufficient CB17 IgMb bone marrow in competition. TdT−/− cells demonstrated an advantage in populating the bone marrow, the spleen, and the peritoneal cavity. TdT deficiency, which characterizes fetal lymphocytes, thus appears to facilitate filling both central and peripheral lymphoid compartments, but at the cost of altered responses to a broad set of antigens.

    Human malarial disease: a consequence of inflammatory cytokine release

    Malaria causes an acute systemic human disease that bears many similarities, both clinically and mechanistically, to diseases caused by bacteria, rickettsia, and viruses. Over the past few decades, a literature has emerged arguing that most of the pathology seen in all of these infectious diseases can be explained by activation of the inflammatory system, with the balance between pro- and anti-inflammatory cytokines tipped towards the onset of systemic inflammation. Although not often expressed in energy terms, there is, when reduced to biochemical essentials, wide agreement that infection with falciparum malaria is often fatal because mitochondria are unable to generate enough ATP to maintain normal cellular function. Most, however, would contend that this largely occurs because sequestered parasitized red cells prevent sufficient oxygen from getting to where it is needed. This review considers the evidence that an equally or more important way in which ATP deficiency arises in malaria, as well as in these other infectious diseases, is an inability of mitochondria, through the effects of inflammatory cytokines on their function, to utilise available oxygen. This activity of these cytokines, plus their capacity to control the pathways through which oxygen supply to mitochondria is restricted (particularly through directing sequestration and driving anaemia), combine to make falciparum malaria primarily an inflammatory cytokine-driven disease.