
    Forging a national diet : beef and the political economy of plenty in postwar America

    Few food items are more associated with the United States than beef, yet it was not until the 1950s that Americans ate more beef than any other meat. The triumph of mass beef consumption was neither accidental nor preordained. As this dissertation argues, beef became the most consumed meat in America because of a policy enacted by a succession of presidential administrations and aided by popular demand. Beef policy, as understood by its enactors, was an attempt to create a nation undifferentiated by diet and unified by eating a meal fit for the leader of the free world. Drawing on primary research materials found at the National Archives in College Park, MD, and at five presidential archives, along with government publications and beef industry literature, this work shines a light on a policy of domestic security that went unnamed and uncelebrated yet had a profound effect on how Americans ate. The five presidential administrations between 1945 and 1974 shared a dedication to securing economic peace between producers and consumers as each side battled over the shape of the economy after World War II. This work situates beef policy within several historical fields, including the history of policy and politics, food studies, environmental history, social history, and women's history. By drawing on this diverse group of fields, the dissertation uncovers the complex factors that transformed a nation of aspirational beef eaters into literal ones. Includes bibliographical references.

    Extensive hydrogen supersaturations in the western South Atlantic Ocean suggest substantial underestimation of nitrogen fixation

    The nitrogen cycle is fundamental to Earth's biogeochemistry. Yet major uncertainties of quantification remain, particularly regarding the global oceanic nitrogen fixation rate. Hydrogen is produced during nitrogen fixation and will become supersaturated in surface waters if there is net release from diazotrophs. Ocean surveys of hydrogen supersaturation thus have the potential to illustrate the spatial and temporal distribution of nitrogen fixation, and to guide the far more onerous but quantitative methods for measuring it. Here we present the first transect of high-resolution measurements of hydrogen supersaturation in surface waters along a meridional 10,000 km cruise track through the Atlantic. We compare measured saturations with published measurements of nitrogen fixation rates and also with model-derived values. If the primary source of excess hydrogen is nitrogen fixation with a hydrogen release ratio similar to that of Trichodesmium, our hydrogen measurements would point to similar rates of fixation in the North and South Atlantic, roughly consistent with modelled fixation rates but not with measured rates, which are lower in the south. Possible explanations include substantial nitrogen fixation by newly discovered diazotrophs, particularly any having a hydrogen release ratio similar to or exceeding that of Trichodesmium; under-sampling of nitrogen fixation south of the equator related to excessive focus on Trichodesmium; and methodological shortcomings of nitrogen fixation techniques that cause a bias towards colonial diazotrophs relative to unicellular forms. Alternatively, our data may be affected by an unknown hydrogen source that is greater in the southern half of the cruise track than in the northern half.
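    The inference step the abstract describes, from a hydrogen excess to a fixation rate, can be sketched as a back-of-the-envelope steady-state calculation. Everything below is an illustrative assumption rather than the study's method: it assumes the sea-to-air flux (gas-transfer velocity k times the excess concentration ΔC) balances net H2 release by diazotrophs in the mixed layer, and it ignores vertical mixing and microbial H2 consumption. The parameter values are invented for the example.

```python
def n2_fixation_from_h2(delta_c_h2, k_gas, release_ratio):
    """Steady-state estimate of N2 fixation from excess dissolved hydrogen.

    delta_c_h2    : H2 excess over saturation, nmol m^-3 (hypothetical value)
    k_gas         : gas-transfer velocity, m day^-1 (hypothetical value)
    release_ratio : mol H2 released per mol N2 fixed (assumed, Trichodesmium-like)

    Returns an areal fixation rate in nmol N2 m^-2 day^-1.
    """
    h2_flux = k_gas * delta_c_h2        # sea-to-air flux, nmol H2 m^-2 day^-1
    return h2_flux / release_ratio      # divide out the release ratio


# Illustrative numbers only: 0.2 nmol m^-3 excess, k = 3 m/day, ratio = 0.5
rate = n2_fixation_from_h2(0.2, 3.0, 0.5)
```

    Note the sensitivity to the release ratio: halving it doubles the inferred fixation rate, which is why the abstract's conclusions hinge on whether unsampled diazotrophs release more or less H2 than Trichodesmium.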

    Human Proteome Project Mass Spectrometry Data Interpretation Guidelines 3.0

    The Human Proteome Organization’s (HUPO) Human Proteome Project (HPP) developed Mass Spectrometry (MS) Data Interpretation Guidelines that have been applied since 2016. These guidelines have helped ensure that the emerging draft of the complete human proteome is highly accurate, with low numbers of false-positive protein identifications. Here, we describe an update to these guidelines based on consensus-reaching discussions with the wider HPP community over the past year. The revised 3.0 guidelines address several major and minor identified gaps. We have added guidelines for emerging data independent acquisition (DIA) MS workflows and for use of the new Universal Spectrum Identifier (USI) system being developed by the HUPO Proteomics Standards Initiative (PSI). In addition, we discuss updates to the standard HPP pipeline for collecting MS evidence for all proteins in the HPP, including refinements to minimum evidence. We present a new plan for incorporating MassIVE-KB into the HPP pipeline for the next (HPP 2020) cycle in order to obtain more comprehensive coverage of public MS data sets. The main checklist has been reorganized under headings and subitems, and related guidelines have been grouped. In sum, Version 2.1 of the HPP MS Data Interpretation Guidelines has served well, and this timely update to version 3.0 will aid the HPP as it approaches its goal of collecting and curating MS evidence of translation and expression for all predicted ∼20 000 human proteins encoded by the human genome. This work was funded in part by the National Institutes of Health grants R01GM087221 (EWD/RLM), R24GM127667 (EWD), U54EB020406 (EWD), R01HL133135 (RLM), U19AG02312 (RLM), U54ES017885 (GSO), U24CA210967-01 (GSO), R01LM013115 (NB) and P41GM103484 (NB); National Science Foundation grants ABI-1759980 (NB), DBI-1933311 (EWD), and IOS-1922871 (EWD); Canadian Institutes of Health Research 148408 (CMO); Korean Ministry of Health and Welfare HI13C2098 (YKP); French Ministry of Higher Education, Research and Innovation, ProFI project, ANR-10-INBS-08 (YV); also in part by the National Eye Institute (NEI), National Human Genome Research Institute (NHGRI), National Heart, Lung, and Blood Institute (NHLBI), National Institute of Allergy and Infectious Diseases (NIAID), National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), National Institute of General Medical Sciences (NIGMS), and National Institute of Mental Health (NIMH) of the National Institutes of Health under Award Number U24HG007822 (SO) (the content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health).
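    The USI mentioned above is a colon-delimited identifier of the general form mzspec:<collection>:<MS run>:<index type>:<index>[:<interpretation>]. As a rough illustration of the fields involved, and not an implementation of the full PSI specification (which defines additional rules and encodings), a minimal parser might look like this:

```python
def parse_usi(usi: str) -> dict:
    """Split a Universal Spectrum Identifier into its named fields.

    Simplified sketch: assumes the first four separators delimit the
    prefix, collection, MS run, index type, and index; anything after
    the fifth colon is treated as the peptidoform interpretation.
    """
    parts = usi.split(":")
    if parts[0] != "mzspec" or len(parts) < 5:
        raise ValueError("not a valid USI")
    fields = {
        "collection": parts[1],   # e.g. a ProteomeXchange dataset accession
        "ms_run": parts[2],       # MS run (file) name
        "index_type": parts[3],   # e.g. "scan"
        "index": parts[4],        # spectrum index within the run
    }
    if len(parts) > 5:
        # interpretation may itself contain colons, so rejoin the remainder
        fields["interpretation"] = ":".join(parts[5:])
    return fields
```

    For example, `parse_usi("mzspec:PXD000561:Adult_Frontalcortex_bRP_Elite_85_f09:scan:17555:VLHPLEGAVVIIFK/2")` yields the dataset accession, run name, scan number, and peptidoform/charge as separate fields.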

    Thermodynamic Computing

    The hardware and software foundations laid in the first half of the 20th Century enabled the computing technologies that have transformed the world, but these foundations are now under siege. The current computing paradigm, which underpins much of the standard of living we now enjoy, faces fundamental limitations that are evident from several perspectives. In terms of hardware, devices have become so small that we are struggling to eliminate the effects of thermodynamic fluctuations, which are unavoidable at the nanometer scale. In terms of software, our ability to imagine and program effective computational abstractions and implementations is clearly challenged in complex domains. In terms of systems, currently five percent of the power generated in the US is used to run computing systems; this astonishing figure is neither ecologically sustainable nor economically scalable. Economically, the cost of building next-generation semiconductor fabrication plants has soared past $10 billion. All of these difficulties - device scaling, software complexity, adaptability, energy consumption, and fabrication economics - indicate that the current computing paradigm has matured and that continued improvements along this path will be limited. If technological progress is to continue and corresponding social and economic benefits are to continue to accrue, computing must become much more capable, energy efficient, and affordable. We propose that progress in computing can continue under a united, physically grounded computational paradigm centered on thermodynamics. Herein we propose a research agenda to extend these thermodynamic foundations into complex, non-equilibrium, self-organizing systems and apply them holistically to future computing systems that will harness nature's innate computational capacity. We call this type of computing "Thermodynamic Computing" or TC. Comment: A Computing Community Consortium (CCC) workshop report, 36 pages.

    Tax Evasion, Tax Avoidance and Tax Planning in Australia: The participation in mass-marketed tax avoidance schemes in the Pilbara region of Western Australia in the 1990s

    This paper will examine the development of mass-marketed tax avoidance schemes in Australia. It will consider changes in approach to tax avoidance from the ‘bottom of the harbour’ schemes of the 1960s and 1970s to the mass-marketed tax avoidance schemes of the 1990s. It will examine the changing structure of tax avoidance from individually crafted structures designed by accountants and lawyers for high wealth individuals to mass-produced structures targeted at highly paid, and therefore highly taxed, blue collar workers in Australia’s mining industry in the 1990s. In the latter half of the twentieth century, ‘unacceptable’ tax planning went from highly expensive, individually ‘tailor made’ structures afforded and used only by the very wealthy, to inexpensive replicated structures marketed to skilled and unskilled tradespeople and labourers. By 1998 over 42 000 Australian taxpayers were engaged in tax avoidance schemes, with the highest proportion concentrated in the mining regions of Western Australia. In the remote and inhospitable mining community of Pannawonica, which has one of the highest paid workforces in Australia, the Australian Taxation Office identified that as many as one in five taxpayers were engaged in a mass-marketed tax avoidance scheme. The paper will identify the causes of these changes, including the advent of computerised information technology, which permitted ‘mass production’ of business structures designed to exploit business incentives in the Australian taxation system in the 1990s. It will also set these developments within the broader context of the tax compliance culture prevailing in Australia and overseas during this period.

    A review of nitrogen isotopic alteration in marine sediments

    Key Points: (1) the use of sedimentary nitrogen isotopes is examined; (2) on average, sediment 15N/14N increases by approximately 2 per mil during early burial; (3) isotopic alteration scales with water depth. Abstract: Nitrogen isotopes are an important tool for evaluating past biogeochemical cycling from the paleoceanographic record. However, bulk sedimentary nitrogen isotope ratios, which can be determined routinely and at minimal cost, may be altered during burial and early sedimentary diagenesis, particularly outside of continental margin settings. The causes and detailed mechanisms of isotopic alteration are still under investigation. Case studies of the Mediterranean and South China Seas underscore the complexities of investigating isotopic alteration. In an effort to evaluate the evidence for alteration of the sedimentary N isotopic signal and to quantify the net effect, we have compiled and compared data demonstrating alteration from the published literature. A >100-point comparison of sediment trap and surface sedimentary nitrogen isotope values demonstrates that, at sites located off the continental margins, an increase in sediment 15N/14N occurs during early burial, likely at the seafloor. The extent of isotopic alteration appears to be a function of water depth. Depth-related differences in oxygen exposure time at the seafloor are likely the dominant control on the extent of N isotopic alteration. Moreover, the compiled data suggest that the degree of alteration is likely to be uniform through time at most sites, so that bulk sedimentary isotope records likely provide a good means for evaluating relative changes in the global N cycle.
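    The depth scaling described above implies a simple first-order correction scheme for core-top records. The sketch below is purely illustrative: the linear offset of 0.5 per mil per km below an assumed 1000 m "margin" cutoff is a hypothetical parameterization invented for this example, not one proposed by the review (which reports only an average increase of about 2 per mil during early burial).

```python
def correct_d15n(measured_d15n, water_depth_m,
                 offset_per_km=0.5, margin_depth_m=1000.0):
    """Remove a hypothetical depth-scaled diagenetic offset from bulk d15N.

    Sites shallower than margin_depth_m are assumed unaltered (offset 0);
    deeper sites are corrected linearly with depth below that cutoff.
    All parameter values are assumptions for illustration.
    """
    excess_km = max(0.0, (water_depth_m - margin_depth_m) / 1000.0)
    return measured_d15n - offset_per_km * excess_km


# A core at 3000 m with measured d15N of 6.0 per mil would be corrected
# downward by 1.0 per mil under these assumed parameters.
corrected = correct_d15n(6.0, 3000.0)
```

    Because the review finds the degree of alteration to be roughly uniform through time at a given site, a constant per-site offset like this would shift absolute values while preserving relative downcore changes, which is the quantity most records aim to interpret.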

    Genetic Networks of Liver Metabolism Revealed by Integration of Metabolic and Transcriptional Profiling

    Although numerous quantitative trait loci (QTL) influencing disease-related phenotypes have been detected through gene mapping and positional cloning, identification of the individual gene(s) and molecular pathways leading to those phenotypes is often elusive. One way to improve understanding of genetic architecture is to classify phenotypes in greater depth by including transcriptional and metabolic profiling. In the current study, we have generated and analyzed mRNA expression and metabolic profiles in liver samples obtained in an F2 intercross between the diabetes-resistant C57BL/6 leptin(ob/ob) and the diabetes-susceptible BTBR leptin(ob/ob) mouse strains. This cross, which segregates for genotype and physiological traits, was previously used to identify several diabetes-related QTL. Our current investigation includes microarray analysis of over 40,000 probe sets, plus quantitative mass spectrometry-based measurements of sixty-seven intermediary metabolites in three different classes (amino acids, organic acids, and acyl-carnitines). We show that liver metabolites map to distinct genetic regions, thereby indicating that tissue metabolites are heritable. We also demonstrate that genomic analysis can be integrated with liver mRNA expression and metabolite profiling data to construct causal networks for control of specific metabolic processes in liver. As a proof of principle of the practical significance of this integrative approach, we illustrate the construction of a specific causal network that links gene expression and metabolic changes in the context of glutamate metabolism, and demonstrate its validity by showing that genes in the network respond to changes in glutamine and glutamate availability. Thus, the methods described here have the potential to reveal regulatory networks that contribute to chronic, complex, and highly prevalent diseases and conditions such as obesity and diabetes.
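    One standard ingredient of causal network construction of the kind described above is a conditional-independence check on the chain genotype → transcript → metabolite: if the transcript mediates the genetic effect, the genotype–metabolite correlation should largely vanish after conditioning on the transcript. The sketch below uses simulated data and ordinary least squares; it illustrates the idea only and is not the authors' actual method or data.

```python
import numpy as np


def partial_corr(x, y, z):
    """Correlation of x and y after regressing out z from both."""
    design = np.column_stack([np.ones_like(z), z])
    rx = x - design @ np.linalg.lstsq(design, x, rcond=None)[0]
    ry = y - design @ np.linalg.lstsq(design, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]


# Simulated F2-style example: transcript t mediates the effect of
# genotype g on metabolite m, so corr(g, m) is sizeable while the
# partial correlation given t should be near zero.
rng = np.random.default_rng(0)
g = rng.binomial(2, 0.5, 2000).astype(float)   # allele counts 0/1/2
t = g + 0.5 * rng.normal(size=2000)            # transcript driven by genotype
m = t + 0.5 * rng.normal(size=2000)            # metabolite driven by transcript
raw = np.corrcoef(g, m)[0, 1]
mediated = partial_corr(g, m, t)
```

    In a full analysis this comparison would be embedded in a model-selection framework that also scores the reverse ordering (metabolite causal to transcript) and an independent model, but the residualization step shown here is the core computation.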

    Multiple novel prostate cancer susceptibility signals identified by fine-mapping of known risk loci among Europeans

    Genome-wide association studies (GWAS) have identified numerous common prostate cancer (PrCa) susceptibility loci. We have fine-mapped 64 GWAS regions known at the conclusion of the iCOGS study using large-scale genotyping and imputation in 25 723 PrCa cases and 26 274 controls of European ancestry. We detected evidence for multiple independent signals at 16 regions, 12 of which contained additional newly identified significant associations. A single signal comprising a spectrum of correlated variation was observed at 39 regions, 35 of which are now described by a novel, more significantly associated lead SNP, while the originally reported variant remained the lead SNP in only 4 regions. We also confirmed two association signals in Europeans that had previously been reported only in East-Asian GWAS. Based on statistical evidence and linkage disequilibrium (LD) structure, we have curated and narrowed down the list of the most likely candidate causal variants for each region. Functional annotation using data from ENCODE filtered for PrCa cell lines and eQTL analysis demonstrated significant enrichment for overlap with bio-features within this set. By incorporating the novel risk variants identified here alongside the refined data for existing association signals, we estimate that these loci now explain ∼38.9% of the familial relative risk of PrCa, an 8.9% improvement over the previously reported GWAS tag SNPs. This suggests that a significant fraction of the heritability of PrCa may have been hidden during the discovery phase of GWAS, in particular due to the presence of multiple independent signals within the same region.
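    The "fraction of familial relative risk explained" quoted above is commonly estimated by summing per-locus contributions on a log scale. The sketch below uses one widely cited approximation under a multiplicative model with loci in linkage equilibrium; the formula choice, the illustrative SNP values, and the assumed overall familial relative risk of 2.5 are assumptions for this example, not figures taken from the study.

```python
import math


def locus_lambda(p, r):
    """Approximate familial relative risk contributed by one biallelic locus.

    p : risk allele frequency
    r : per-allele relative risk (multiplicative model)
    The per-allele term is squared for the two alleles of a genotype.
    """
    q = 1.0 - p
    lam_allele = (p * r * r + q) / (p * r + q) ** 2
    return lam_allele ** 2


def frr_fraction(snps, lambda_familial=2.5):
    """Fraction of the overall familial relative risk explained by `snps`.

    snps            : iterable of (allele frequency, per-allele relative risk)
    lambda_familial : assumed overall familial relative risk (illustrative)
    """
    total = sum(math.log(locus_lambda(p, r)) for p, r in snps)
    return total / math.log(lambda_familial)


# Two hypothetical risk SNPs; real estimates sum over all fine-mapped signals.
fraction = frr_fraction([(0.3, 1.2), (0.1, 1.3)])
```

    Under this additive-in-logs accounting, each newly resolved independent signal within a region adds its own term to the numerator, which is how fine-mapping can raise the explained fraction without discovering any new region.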

    Systemic HIV and SIV latency reversal via non-canonical NF-κB signalling in vivo

    Long-lasting, latently infected resting CD4+ T cells are the greatest obstacle to obtaining a cure for HIV infection, as these cells can persist despite decades of treatment with antiretroviral therapy (ART). Estimates indicate that more than 70 years of continuous, fully suppressive ART would be needed to eliminate the HIV reservoir [1]. Alternatively, induction of HIV from its latent state could accelerate the decrease in the reservoir, thus reducing the time to eradication. Previous attempts to reactivate latent HIV in preclinical animal models and in clinical trials have measured HIV induction in the peripheral blood with minimal focus on tissue reservoirs and have had limited effect [2–9]. Here we show that activation of the non-canonical NF-κB signalling pathway by AZD5582 results in the induction of HIV and SIV RNA expression in the blood and tissues of ART-suppressed bone-marrow–liver–thymus (BLT) humanized mice and rhesus macaques infected with HIV and SIV, respectively. Analysis of resting CD4+ T cells from tissues after AZD5582 treatment revealed increased SIV RNA expression in the lymph nodes of macaques and robust induction of HIV in almost all tissues analysed in humanized mice, including the lymph nodes, thymus, bone marrow, liver and lung. This promising approach to latency reversal—in combination with appropriate tools for systemic clearance of persistent HIV infection—greatly increases opportunities for HIV eradication.

    National intelligence systems: origins, logic of expansion, and current configuration
