887 research outputs found

    Non-Market Food Practices Do Things Markets Cannot: Why Vermonters Produce and Distribute Food That's Not For Sale

    Researchers tend to portray food self-provisioning in high-income societies as a coping mechanism for the poor or a hobby for the well-off. They describe food charity as a regrettable band-aid. Vegetable gardens and neighborly sharing are considered remnants of precapitalist tradition. These are non-market food practices: producing food that is not for sale and distributing food in ways other than selling it. Recent scholarship challenges those standard understandings by showing (i) that non-market food practices remain prevalent in high-income countries, (ii) that people in diverse social groups engage in these practices, and (iii) that they articulate diverse reasons for doing so.

    In this dissertation, I investigate the persistent pervasiveness of non-market food practices in Vermont. To go beyond explanations that rely on individual motivation, I examine the roles these practices play in society. First, I investigate the prevalence of non-market food practices. Several surveys with large, representative samples reveal that more than half of Vermont households grow, hunt, fish, or gather some of their own food. Respondents estimate that they acquire 14% of the food they consume through non-market means, on average. For reference, commercial local food makes up about the same portion of total consumption.

    Then, drawing on the words of 94 non-market food practitioners I interviewed, I demonstrate that these practices serve functions that markets cannot. Interviewees attested that non-market distribution is special because it feeds the hungry, strengthens relationships, builds resilience, puts edible-but-unsellable food to use, and aligns with a desired future in which food is not for sale. Hunters, fishers, foragers, scavengers, and homesteaders said that these activities contribute to their long-run food security as a skills-based safety net. Self-provisioning allows them to eat from the landscape despite disruptions to their ability to access market food, such as job loss, supply chain problems, or a global pandemic. Additional evidence from vegetable growers suggests that non-market settings liberate production from financial discipline, making space for work that is meaningful, playful, educational, and therapeutic. Non-market food practices mend holes in the social fabric torn by the commodification of everyday life.

    Finally, I synthesize scholarly critiques of markets as institutions for organizing the production and distribution of food. Markets send food toward money rather than hunger. Producing for market compels farmers to prioritize financial viability over other values such as stewardship. Historically, people rarely if ever sell each other food until external authorities coerce them to do so through taxation, indebtedness, cutting off access to the means of subsistence, or extinguishing non-market institutions. Today, more humans than ever suffer from chronic undernourishment even as the scale of commercial agriculture pushes environmental pressures past critical thresholds of planetary sustainability. This research substantiates that alternatives to markets exist and have the potential to address their shortcomings.

    UMSL Bulletin 2023-2024

    The 2023-2024 Bulletin and Course Catalog for the University of Missouri–St. Louis.

    Backpropagation Beyond the Gradient

    Automatic differentiation is a key enabler of deep learning: previously, practitioners were limited to models for which they could manually compute derivatives. Now, they can create sophisticated models with almost no restrictions and train them using first-order, i.e. gradient, information. Popular libraries like PyTorch and TensorFlow compute this gradient efficiently, automatically, and conveniently with a single line of code. Under the hood, reverse-mode automatic differentiation, or gradient backpropagation, powers the gradient computation in these libraries. Their entire design centers around gradient backpropagation. These frameworks are specialized around one specific task: computing the average gradient over a mini-batch. This specialization often complicates the extraction of other information, like higher-order statistical moments of the gradient or higher-order derivatives like the Hessian. It limits practitioners and researchers to methods that rely on the gradient. Arguably, this hampers the field from exploring the potential of higher-order information, and there is evidence that focusing solely on the gradient has not led to significant recent advances in deep learning optimization. To advance algorithmic research and inspire novel ideas, information beyond the batch-averaged gradient must be made available at the same level of computational efficiency, automation, and convenience.

    This thesis presents approaches to simplify experimentation with rich information beyond the gradient by making it more readily accessible. We present an implementation of these ideas as an extension to the backpropagation procedure in PyTorch. Using this newly accessible information, we demonstrate possible use cases by (i) showing how it can inform our understanding of neural network training by building a diagnostic tool, and (ii) enabling novel methods to efficiently compute and approximate curvature information.

    First, we extend gradient backpropagation for sequential feedforward models to Hessian backpropagation, which enables computing approximate per-layer curvature. This perspective unifies recently proposed block-diagonal curvature approximations. Like gradient backpropagation, the computation of these second-order derivatives is modular, and therefore simple to automate and extend to new operations. Based on the insight that rich information beyond the gradient can be computed efficiently and at the same time, we extend the backpropagation in PyTorch with the BackPACK library. It provides efficient and convenient access to statistical moments of the gradient and approximate curvature information, often at a small overhead compared to computing just the gradient.

    Next, we showcase the utility of such information to better understand neural network training. We build the Cockpit library, which visualizes what is happening inside the model during training through various instruments that rely on BackPACK’s statistics. We show how Cockpit provides a meaningful statistical summary report to the deep learning engineer to identify bugs in their machine learning pipeline, guide hyperparameter tuning, and study deep learning phenomena.

    Finally, we use BackPACK’s extended automatic differentiation functionality to develop ViViT, an approach to efficiently compute curvature information, in particular curvature noise. It uses the low-rank structure of the generalized Gauss-Newton approximation to the Hessian and addresses shortcomings in existing curvature approximations. Through monitoring curvature noise, we demonstrate how ViViT’s information helps in understanding the challenges of making second-order optimization methods work in practice. This work develops new tools to experiment more easily with higher-order information in complex deep learning models. These tools have impacted works on Bayesian applications with Laplace approximations, out-of-distribution generalization, differential privacy, and the design of automatic differentiation systems. They constitute one important step towards developing and establishing more efficient deep learning algorithms.
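
    To make the workflow above concrete, here is a minimal sketch of how such extended backpropagation is used in practice. It follows BackPACK's documented usage pattern (extend the model and loss function, then request extra quantities inside a backpack context); the extension names and result attributes shown (Variance, DiagGGNExact, p.variance, p.diag_ggn_exact) reflect the library's public API as best I recall and should be read as illustrative rather than authoritative.

        # Sketch: per-parameter gradient variance and a GGN diagonal via BackPACK.
        # Extension and attribute names follow BackPACK's documented API; treat
        # them as assumptions if the installed version differs.
        import torch
        from torch.nn import CrossEntropyLoss, Linear, ReLU, Sequential

        from backpack import backpack, extend
        from backpack.extensions import DiagGGNExact, Variance

        X, y = torch.randn(32, 784), torch.randint(0, 10, (32,))

        model = extend(Sequential(Linear(784, 64), ReLU(), Linear(64, 10)))
        lossfunc = extend(CrossEntropyLoss())

        loss = lossfunc(model(X), y)
        with backpack(Variance(), DiagGGNExact()):
            loss.backward()  # one backward pass, now with extra quantities attached

        for name, p in model.named_parameters():
            print(name, p.grad.shape)            # usual mini-batch gradient
            print(name, p.variance.shape)        # gradient variance over the batch
            print(name, p.diag_ggn_exact.shape)  # diagonal of the generalized Gauss-Newton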

    UMSL Bulletin 2022-2023

    The 2022-2023 Bulletin and Course Catalog for the University of Missouri–St. Louis.

    The present and future status of heavy neutral leptons

    [Article written by a large number of authors; only the first-listed author, the authors affiliated with the UAM, and the name of the collaboration, if any, are referenced.]
    The existence of nonzero neutrino masses points to the likely existence of multiple Standard Model neutral fermions. When such states are heavy enough that they cannot be produced in oscillations, they are referred to as heavy neutral leptons (HNLs). In this white paper, we discuss the present experimental status of HNLs, including colliders, beta decay, and accelerators, as well as astrophysical and cosmological impacts. We discuss the importance of continuing to search for HNLs and the potential impact on our understanding of key fundamental questions, and additionally we outline the future prospects for next-generation experiments and upcoming accelerator run scenarios.

    Enhancing the forensic comparison process of common trace materials through the development of practical and systematic methods

    An ongoing advancement in forensic trace evidence has driven the development of new and objective methods for comparing various materials. While many standard guides have been published for use in trace laboratories, different areas require a more comprehensive understanding of error rates, and there is an urgent need to harmonize methods of examination and interpretation. Two critical areas are the forensic examination of physical fits and the comparison of spectral data, which depend highly on the examiner’s judgment. The long-term goal of this study is to advance and modernize the comparative process of physical fit examinations and spectral interpretation. This goal is fulfilled through several avenues: 1) improvement of quantitative-based methods for various trace materials, 2) scrutiny of the methods through interlaboratory exercises, and 3) addressing fundamental aspects of the discipline using large experimental datasets, computational algorithms, and statistical analysis.

    A substantial new body of knowledge has been established by analyzing population sets of nearly 4,000 items representative of casework evidence. First, this research identifies material-specific relevant features for duct tapes and automotive polymers. Then, this study develops reporting templates to facilitate thorough and systematic documentation of an analyst’s decision-making process and minimize risks of bias. It also establishes criteria for utilizing a quantitative edge similarity score (ESS) for tapes and automotive polymers that yield relatively high accuracy (85% to 100%) and, notably, no false positives. Finally, the practicality and performance of the ESS method for duct tape physical fits are evaluated by forensic practitioners through two interlaboratory exercises. Across these studies, accuracy using the ESS method ranges from 95% to 99%, and again no false positives are reported. The practitioners’ feedback demonstrates the method’s potential to assist in training and improve peer verifications.

    This research also develops and trains computational algorithms to support analysts making decisions on sample comparisons. The automated algorithms in this research show the potential to provide objective and probabilistic support for determining a physical fit and demonstrate accuracy comparable to that of the analyst. Furthermore, additional models are developed to extract edge feature information from the systematic comparison templates of tapes and textiles to provide insight into the relative importance of each comparison feature. A decision tree model is developed to assist physical fit examinations of duct tapes and textiles and demonstrates performance comparable to that of the trained analysts. The computational tools also evaluate the suitability of partial sample comparisons that simulate situations where portions of the item are lost or damaged.

    Finally, an objective approach to interpreting complex spectral data is presented. A comparison metric consisting of spectral angle contrast ratios (SCAR) is used as a model to assess more than 94 different-source and 20 same-source electrical tape backings. The SCAR metric results in a discrimination power of 96% and demonstrates the capacity to capture information on the variability between different-source samples and the variability within same-source samples. Application of the random-forest model allows for the automatic detection of primary differences between samples. The developed threshold could assist analysts with making decisions on the spectral comparison of chemically similar samples.

    This research provides the forensic science community with novel approaches to comparing materials commonly seen in forensic laboratories. The outcomes of this study are anticipated to offer forensic practitioners new and accessible tools for incorporation into current workflows to facilitate systematic and objective analysis and interpretation of forensic materials and support analysts’ opinions.
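
    The spectral comparison step above rests on an angle-based measure of similarity between spectra. The sketch below computes the standard spectral angle (the arccosine of the normalized dot product between two intensity vectors); it illustrates the kind of quantity underlying a metric like SCAR, not the dissertation's exact contrast-ratio formulation, and the threshold value shown is a hypothetical placeholder rather than the one derived in the study.

        import numpy as np

        def spectral_angle(a, b):
            """Angle (radians) between two spectra treated as intensity vectors.

            Simplified illustration of angle-based spectral comparison; the SCAR
            metric described in the text adds a contrast-ratio formulation that
            is not reproduced here.
            """
            a = np.asarray(a, dtype=float)
            b = np.asarray(b, dtype=float)
            cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
            return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

        # Hypothetical usage: small angles suggest chemically similar spectra.
        reference = np.array([0.12, 0.45, 0.80, 0.33, 0.05])
        questioned = np.array([0.10, 0.43, 0.82, 0.30, 0.07])
        THRESHOLD = 0.05  # placeholder decision value, not the study's threshold
        angle = spectral_angle(reference, questioned)
        decision = "associate" if angle < THRESHOLD else "differentiate"
        print(f"spectral angle = {angle:.4f} rad -> {decision}")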

    2023-2024 Catalog

    The 2023-2024 Governors State University Undergraduate and Graduate Catalog is a comprehensive listing of current information regarding: Degree Requirements, Course Offerings, and Undergraduate and Graduate Rules and Regulations.

    Peering into the Dark: Investigating dark matter and neutrinos with cosmology and astrophysics

    The ΛCDM model of modern cosmology provides a highly accurate description of our universe. However, it relies on two mysterious components, dark matter and dark energy. The cold dark matter paradigm does not provide a satisfying description of its particle nature, nor any link to the Standard Model of particle physics. I investigate the consequences for cosmological structure formation in models with a coupling between dark matter and Standard Model neutrinos, as well as probes of primordial black holes as dark matter. I examine the impact that such an interaction would have through both linear perturbation theory and nonlinear N-body simulations. I present limits on the possible interaction strength from cosmic microwave background, large-scale structure, and galaxy population data, as well as forecasts of future sensitivity. I provide an analysis of what is necessary to distinguish the cosmological impact of interacting dark matter from similar effects. Intensity mapping of the 21 cm line of neutral hydrogen at high redshift using next-generation observatories, such as the SKA, would provide the strongest constraints yet on such interactions, and may be able to distinguish between different scenarios causing suppressed small-scale structure. I also present a novel type of probe of structure formation, using the cosmological gravitational wave signal of high-redshift compact binary mergers to provide information about structure formation, and thus the behaviour of dark matter. Such observations would also provide competitive constraints. Finally, I investigate primordial black holes as an alternative dark matter candidate, presenting an analysis and framework for the evolution of extended mass populations over cosmological time and computing the present-day gamma ray signal, as well as the allowed local evaporation rate. This is used to set constraints on the allowed population of low-mass primordial black holes, and the likelihood of witnessing an evaporation.
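
    The primordial black hole constraints mentioned above hinge on Hawking evaporation setting a characteristic lifetime for each initial mass. As a rough, order-of-magnitude illustration only, the sketch below uses the idealized photons-only lifetime t ≈ 5120 π G² M³ / (ħ c⁴) to estimate which initial masses evaporate within the age of the universe; the full analysis described in the abstract tracks extended mass populations, all emitted species, and greybody factors, none of which are modelled here.

        # Order-of-magnitude sketch: idealized (photons-only) Hawking lifetime of a
        # Schwarzschild black hole, t ~ 5120 * pi * G^2 * M^3 / (hbar * c^4), and the
        # initial mass that evaporates within the age of the universe. A realistic
        # treatment includes all emitted species and greybody factors.
        import math

        G = 6.674e-11          # gravitational constant [m^3 kg^-1 s^-2]
        HBAR = 1.055e-34       # reduced Planck constant [J s]
        C = 2.998e8            # speed of light [m s^-1]
        T_UNIVERSE = 4.35e17   # age of the universe [s], ~13.8 Gyr

        def lifetime(mass_kg):
            """Idealized evaporation time [s] for an initial mass [kg]."""
            return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)

        def critical_mass(t_seconds):
            """Initial mass [kg] whose idealized lifetime equals t_seconds."""
            return (t_seconds * HBAR * C**4 / (5120 * math.pi * G**2)) ** (1 / 3)

        m_star = critical_mass(T_UNIVERSE)
        print(f"mass evaporating today ~ {m_star:.2e} kg ({m_star * 1e3:.1e} g)")
        print(f"idealized lifetime of a 1e12 kg black hole ~ {lifetime(1e12):.2e} s")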

    Towards a muon collider

    A muon collider would enable the big jump ahead in energy reach that is needed for a fruitful exploration of fundamental interactions. The challenges of producing muon collisions at high luminosity and 10 TeV centre of mass energy are being investigated by the recently formed International Muon Collider Collaboration. This Review summarises the status of and recent advances in muon collider design, physics, and detector studies. The aim is to provide a global perspective of the field and to outline directions for future work.
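
    One way to make the luminosity challenge concrete is the muon's 2.2 μs rest-frame lifetime: even with large relativistic time dilation, stored beams decay continuously, which is central to the machine-design difficulties the Review discusses. The back-of-the-envelope sketch below (standard relativistic kinematics; the 5 TeV beam energy simply follows from the abstract's 10 TeV centre-of-mass figure) estimates the dilated lifetime and decay length, and is illustrative only.

        # Back-of-the-envelope: time dilation for a 5 TeV muon beam (10 TeV centre of
        # mass). Standard relativistic kinematics; the numbers are illustrative only.
        MUON_MASS_GEV = 0.10566      # muon rest mass [GeV]
        MUON_LIFETIME_S = 2.197e-6   # muon rest-frame lifetime [s]
        C = 2.998e8                  # speed of light [m/s]

        beam_energy_gev = 5000.0                 # one beam of a 10 TeV collider
        gamma = beam_energy_gev / MUON_MASS_GEV  # Lorentz factor, roughly 4.7e4
        lab_lifetime = gamma * MUON_LIFETIME_S   # dilated lifetime in the lab frame
        decay_length = lab_lifetime * C          # ~ gamma * c * tau, since beta ~ 1

        print(f"gamma ~ {gamma:.3e}")
        print(f"lab-frame lifetime ~ {lab_lifetime:.3e} s")
        print(f"mean decay length ~ {decay_length / 1e3:.0f} km")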

    2017 GREAT Day Program

    SUNY Geneseo’s Eleventh Annual GREAT Day.