
    Extreme pCO2 Variability in a Macrotidal Eelgrass Meadow Mediated by Tidal and Diurnal Cycles

    It has been suggested that photosynthetic activity of macrophytes in coastal areas can decrease pCO2 and may provide refuge for organisms sensitive to ocean acidification. To assess the effect of a large eelgrass meadow on water chemistry, discrete samples were collected hourly over several 24-hour cycles in Padilla Bay, WA. Calculated pCO2 ranged from less than 100 ppm to greater than 700 ppm, often over the course of only a few hours. Aragonite saturation, DIC, and pH were also highly variable. These data, together with weather station data and in-situ sensors (Padilla Bay National Estuarine Research Reserve), were used to develop a model that estimates pCO2 for the summer season. Tidal height and photosynthetically active radiation were the most significant predictors of pH and pCO2, with salinity and dissolved oxygen (DO) also contributing. Model estimates suggest that an even wider range of pCO2 values is common in this estuary, especially in early summer. Data from a mooring in 20 meters of water, over a kilometer from the intertidal eelgrass environment, provide some indication of the spatial extent of this influence.
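The abstract describes a multiple-regression model that predicts pCO2 from tidal height, photosynthetically active radiation (PAR), salinity, and DO. A minimal sketch of that kind of model is shown below; the variable names, coefficients, and synthetic data are all assumptions for illustration, not the authors' actual model or measurements.

```python
import numpy as np

# Synthetic predictors (hypothetical units; not Padilla Bay data).
rng = np.random.default_rng(0)
n = 200
tide = rng.uniform(0.0, 3.5, n)        # tidal height (m)
par = rng.uniform(0.0, 2000.0, n)      # PAR (umol photons m^-2 s^-1)
sal = rng.normal(30.0, 0.5, n)         # salinity (PSU)
do = rng.normal(8.0, 1.0, n)           # dissolved oxygen (mg L^-1)

# Synthetic response: pCO2 falls with light (photosynthesis drawing down CO2)
# and falls with tidal height, plus noise.  Coefficients are invented.
pco2 = (400.0 - 0.12 * par - 60.0 * tide
        + 5.0 * (30.0 - sal) - 10.0 * (do - 8.0)
        + rng.normal(0.0, 25.0, n))

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), tide, par, sal, do])
coef, *_ = np.linalg.lstsq(X, pco2, rcond=None)

# Coefficient of determination for the fit.
pred = X @ coef
r2 = 1.0 - np.sum((pco2 - pred) ** 2) / np.sum((pco2 - pco2.mean()) ** 2)
print(coef, r2)
```

With strong tide and light signals relative to the noise, the fit recovers negative coefficients for both tide and PAR, mirroring the abstract's finding that those two variables dominate.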

    Rapid, efficient functional characterization and recovery of HIV-specific human CD8+ T cells using microengraving

    The nature of certain clinical samples (tissue biopsies, fluids) or the subjects themselves (pediatric subjects, neonates) often constrains the number of cells available to evaluate the breadth of functional T-cell responses to infections or therapeutic interventions. The methods most commonly used to assess this functional diversity ex vivo and to recover specific cells to expand in vitro usually require more than 10⁶ cells. Here we present a process to identify antigen-specific responses efficiently ex vivo from 10⁴–10⁵ single cells from blood or mucosal tissues using dense arrays of subnanoliter wells. The approach combines on-chip imaging cytometry with a technique for capturing secreted proteins—called “microengraving”—to enumerate antigen-specific responses by single T cells in a manner comparable to conventional assays such as ELISpot and intracellular cytokine staining. Unlike those assays, however, the individual cells identified can be recovered readily by micromanipulation for further characterization in vitro. Applying this method to assess HIV-specific T-cell responses demonstrates that it is possible to establish clonal CD8+ T-cell lines that represent the most abundant specificities present in circulation using 100- to 1,000-fold fewer cells than traditional approaches require and without extensive genotypic analysis a priori. This rapid (<24 h), efficient, and inexpensive process should improve the comparative study of human T-cell immunology across ages and anatomic compartments.

    Profiling Human Antibody Responses by Integrated Single-Cell Analysis

    Comprehensive characterization of the antigen-specific B cells induced during infections or following vaccination would facilitate the discovery of novel antibodies and inform how interventions shape protective humoral responses. The analysis of human B cells and their antibodies has been performed using flow cytometry to evaluate memory B cells and expanded plasmablasts, while microtechnologies have also provided a useful tool to examine plasmablasts/plasma cells after vaccination. Here we present an integrated analytical platform, using arrays of subnanoliter wells (nanowells), for constructing detailed profiles for human B cells comprising the immunophenotypes of these cells, the distribution of isotypes of the secreted antibodies, the specificity and relative affinity for defined antigens, and for a subset of cells, the genes encoding the heavy and light chains. The approach combines on-chip image cytometry, microengraving, and single-cell RT-PCR. Using clinical samples from HIV-infected subjects, we demonstrate that the method can identify antigen-specific neutralizing antibodies, is compatible with both plasmablasts/plasma cells and activated memory B cells, and is well-suited for characterizing the limited numbers of B cells isolated from tissue biopsies (e.g., colon biopsies). The technology should facilitate detailed analyses of human humoral responses for evaluating vaccines and their ability to raise protective antibody responses across multiple anatomical compartments.

    Induced pseudoscalar coupling of the proton weak interaction

    The induced pseudoscalar coupling g_p is the least well known of the weak coupling constants of the proton's charged-current interaction. Its size is dictated by chiral symmetry arguments, and its measurement represents an important test of quantum chromodynamics at low energies. During the past decade a large body of new data relevant to the coupling g_p has been accumulated. These data include measurements of radiative and nonradiative muon capture on targets ranging from hydrogen and few-nucleon systems to complex nuclei. Herein the authors review the theoretical underpinnings of g_p, the experimental studies of g_p, and the procedures and uncertainties in extracting the coupling from data. Current puzzles are highlighted and future opportunities are discussed. (Comment: 58 pages, LaTeX/RevTeX 4; prepared for Reviews of Modern Physics.)
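For context, the benchmark against which experimental extractions of g_p are usually compared is the pion-pole (PCAC) prediction. The leading-order expression below is standard textbook material rather than a result specific to this review, and the numerical values quoted are approximate:

```latex
% Pion-pole (PCAC) dominance of the induced pseudoscalar coupling,
% evaluated at the momentum transfer of ordinary muon capture on the proton:
g_p(q^2) \;\simeq\; \frac{2\, m_\mu\, m_N\, g_A(q^2)}{m_\pi^2 - q^2},
\qquad q_0^2 = -0.88\, m_\mu^2
```

Evaluated at q_0^2, the pole term alone gives g_p on the order of 8.5; heavy-baryon chiral perturbation theory adds a small correction, bringing the commonly quoted prediction to roughly 8.26.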

    Nanotools for Neuroscience and Brain Activity Mapping

    Neuroscience is at a crossroads. Great effort is being invested into deciphering specific neural interactions and circuits. At the same time, there exist few general theories or principles that explain brain function. We attribute this disparity, in part, to limitations in current methodologies. Traditional neurophysiological approaches record the activities of one neuron or a few neurons at a time. Neurochemical approaches focus on single neurotransmitters. Yet, there is an increasing realization that neural circuits operate at emergent levels, where the interactions between hundreds or thousands of neurons, utilizing multiple chemical transmitters, generate functional states. Brains function at the nanoscale, so tools to study brains must ultimately operate at this scale, as well. Nanoscience and nanotechnology are poised to provide a rich toolkit of novel methods to explore brain function by enabling simultaneous measurement and manipulation of activity of thousands or even millions of neurons. We and others refer to this goal as the Brain Activity Mapping Project. In this Nano Focus, we discuss how recent developments in nanoscale analysis tools and in the design and synthesis of nanomaterials have generated optical, electrical, and chemical methods that can readily be adapted for use in neuroscience. These approaches represent exciting areas of technical development and research. Moreover, unique opportunities exist for nanoscientists, nanotechnologists, and other physical scientists and engineers to contribute to tackling the challenging problems involved in understanding the fundamentals of brain function.

    In quest of a systematic framework for unifying and defining nanoscience

    This article proposes a systematic framework for unifying and defining nanoscience based on historic first principles and step logic that led to a “central paradigm” (i.e., unifying framework) for traditional elemental/small-molecule chemistry. As such, a nanomaterials classification roadmap is proposed, which divides all nanomatter into Category I: discrete, well-defined and Category II: statistical, undefined nanoparticles. We consider only Category I, well-defined nanoparticles that are >90% monodisperse as a function of Critical Nanoscale Design Parameters (CNDPs) defined according to: (a) size, (b) shape, (c) surface chemistry, (d) flexibility, and (e) elemental composition. Classified as either hard (H) (i.e., inorganic-based) or soft (S) (i.e., organic-based) categories, these nanoparticles were found to manifest pervasive atom mimicry features that included: (1) a dominance of zero-dimensional (0D) core–shell nanoarchitectures, (2) the ability to self-assemble or chemically bond as discrete, quantized nanounits, and (3) well-defined nanoscale valencies and stoichiometries reminiscent of atom-based elements. These discrete nanoparticle categories are referred to as hard or soft particle nanoelements. Many examples describing chemical bonding/assembly of these nanoelements have been reported in the literature. We refer to these hard:hard (H-n:H-n), soft:soft (S-n:S-n), or hard:soft (H-n:S-n) nanoelement combinations as nanocompounds. Due to their quantized features, many nanoelement and nanocompound categories are reported to exhibit well-defined nanoperiodic property patterns. These periodic property patterns are dependent on their quantized nanofeatures (CNDPs) and dramatically influence intrinsic physicochemical properties (i.e., melting points, reactivity/self-assembly, sterics, and nanoencapsulation), as well as important functional/performance properties (i.e., magnetic, photonic, electronic, and toxicologic properties).
    We propose this perspective as a modest first step toward more clearly defining synthetic nanochemistry as well as providing a systematic framework for unifying nanoscience. With further progress, one should anticipate the evolution of future nanoperiodic table(s) suitable for predicting important risk/benefit boundaries in the field of nanoscience.

    The Science Performance of JWST as Characterized in Commissioning

    This paper characterizes the actual science performance of the James Webb Space Telescope (JWST), as determined from the six-month commissioning period. We summarize the performance of the spacecraft, telescope, science instruments, and ground system, with an emphasis on differences from pre-launch expectations. Commissioning has made clear that JWST is fully capable of achieving the discoveries for which it was built. Moreover, almost across the board, the science performance of JWST is better than expected; in most cases, JWST will go deeper faster than expected. The telescope and instrument suite have demonstrated the sensitivity, stability, image quality, and spectral range that are necessary to transform our understanding of the cosmos through observations spanning from near-Earth asteroids to the most distant galaxies.

    Whole-genome sequencing reveals host factors underlying critical COVID-19

    Critical COVID-19 is caused by immune-mediated inflammatory lung injury. Host genetic variation influences the development of illness requiring critical care [1] or hospitalization [2–4] after infection with SARS-CoV-2. The GenOMICC (Genetics of Mortality in Critical Care) study enables the comparison of genomes from individuals who are critically ill with those of population controls to find underlying disease mechanisms. Here we use whole-genome sequencing in 7,491 critically ill individuals compared with 48,400 controls to discover and replicate 23 independent variants that significantly predispose to critical COVID-19. We identify 16 new independent associations, including variants within genes that are involved in interferon signalling (IL10RB and PLSCR1), leucocyte differentiation (BCL11A) and blood-type antigen secretor status (FUT2). Using transcriptome-wide association and colocalization to infer the effect of gene expression on disease severity, we find evidence that implicates multiple genes—including reduced expression of a membrane flippase (ATP11A), and increased expression of a mucin (MUC1)—in critical disease. Mendelian randomization provides evidence in support of causal roles for myeloid cell adhesion molecules (SELE, ICAM5 and CD209) and the coagulation factor F8, all of which are potentially druggable targets. Our results are broadly consistent with a multi-component model of COVID-19 pathophysiology, in which at least two distinct mechanisms can predispose to life-threatening disease: failure to control viral replication; or an enhanced tendency towards pulmonary inflammation and intravascular coagulation. We show that comparison between cases of critical illness and population controls is highly efficient for the detection of therapeutically relevant mechanisms of disease.
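The Mendelian randomization mentioned above uses genetic variants as instruments to test whether an exposure (e.g., gene expression) causally affects an outcome (disease risk). The simplest such estimator, the Wald ratio, can be sketched as follows; all effect sizes below are invented for illustration and are not the GenOMICC estimates.

```python
# Wald-ratio sketch of single-instrument Mendelian randomization:
# the causal effect of an exposure on an outcome is estimated as the
# variant's effect on the outcome divided by its effect on the exposure.

def wald_ratio(beta_exposure, beta_outcome, se_outcome):
    """Return (estimate, se): causal effect per unit of exposure, with a
    leading-order (delta-method) standard error that ignores uncertainty
    in the exposure association."""
    estimate = beta_outcome / beta_exposure
    se = abs(se_outcome / beta_exposure)
    return estimate, se

# Invented numbers: a hypothetical variant that lowers expression of a gene
# by 0.30 SD and raises the log-odds of critical illness by 0.09.
est, se = wald_ratio(beta_exposure=-0.30, beta_outcome=0.09, se_outcome=0.02)
print(round(est, 3), round(se, 3))  # → -0.3 0.067
```

A negative estimate here would mean higher expression is protective, consistent in direction with the reduced-expression association the abstract describes for ATP11A; in practice many variants are combined and sensitivity analyses guard against pleiotropy.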

    20-Year Risks of Breast-Cancer Recurrence after Stopping Endocrine Therapy at 5 Years

    The administration of endocrine therapy for 5 years substantially reduces recurrence rates during and after treatment in women with early-stage, estrogen-receptor (ER)-positive breast cancer. Extending such therapy beyond 5 years offers further protection but has additional side effects. Obtaining data on the absolute risk of subsequent distant recurrence if therapy stops at 5 years could help determine whether to extend treatment.