1,067 research outputs found

    Geodesic bicombings on some hyperspaces

    Get PDF
    We show that if (X, d) is a metric space which admits a consistent convex geodesic bicombing, then we can construct a conical bicombing on CB(X), the hyperspace of nonempty, closed, bounded, and convex subsets of X (with the Hausdorff metric). If X is a normed space, this same method produces a consistent convex bicombing on CB(X). We follow this by examining a geodesic bicombing on the nonempty compact subsets of X, assuming X is a proper metric space.
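
    For reference, the notions the abstract relies on can be stated as follows; these are the standard definitions from the literature on bicombings, not necessarily the paper's exact conventions. A geodesic bicombing on (X, d) selects a constant-speed geodesic σ_{xy} from x to y for every pair of points, and the conical, convex, and consistent properties constrain how the selected geodesics vary:

    \begin{aligned}
    &\sigma : X \times X \times [0,1] \to X, \qquad \sigma_{xy}(0)=x,\quad \sigma_{xy}(1)=y,\\
    &d\bigl(\sigma_{xy}(s),\sigma_{xy}(t)\bigr) = |s-t|\,d(x,y) && \text{(geodesic bicombing)}\\
    &d\bigl(\sigma_{xy}(t),\sigma_{x'y'}(t)\bigr) \le (1-t)\,d(x,x') + t\,d(y,y') && \text{(conical)}\\
    &t \mapsto d\bigl(\sigma_{xy}(t),\sigma_{x'y'}(t)\bigr)\ \text{is convex on } [0,1] && \text{(convex)}\\
    &\sigma_{pq}(\lambda) = \sigma_{xy}\bigl((1-\lambda)s+\lambda t\bigr)\ \text{for } p=\sigma_{xy}(s),\ q=\sigma_{xy}(t) && \text{(consistent)}
    \end{aligned}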

    EFFECTS OF FOOD ASSISTANCE AND NUTRITION PROGRAMS ON NUTRITION AND HEALTH, VOLUME 2, DATA SOURCES

    Get PDF
    This is the second of four reports completed by Abt Associates Inc. under the contract "The Nutrition and Health Outcome Study." This report evaluates various data sources for their potential to support analysis of the impacts of USDA's food assistance and nutrition programs (FANPs). Data sources are evaluated against three criteria: coverage of both program participants and nonparticipants; identification of participants and determination of eligibility among nonparticipants; and availability of impact measures. Each data source is classified into one of four categories: principal, potential, recognized, and insufficient. Principal and potential sources are discussed and profiled in this report.
    Keywords: USDA Food Assistance and Nutrition Programs, data sources, program participation, nutrition outcomes, health outcomes, Food Consumption/Nutrition/Food Safety, Food Security and Poverty

    One-Sided Derivative of Distance to a Compact Set

    Get PDF
    We give a complete and self-contained proof of a folklore theorem which says that in an Alexandrov space the distance between a point γ(t) on a geodesic γ and a compact set K is a right-differentiable function of t. Moreover, the value of this right-derivative is given by the negative cosine of the minimal angle between the geodesic and any shortest path to the compact set (Theorem 4.3). Our treatment serves as a general introduction to metric geometry and relies only on the basic elements, such as comparison triangles and upper angles.
    Comment: 22 pages, 8 figures
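
    In symbols, the result described in the abstract can be written as below; the notation (θ_min for the minimal upper angle, γ⁺ for the forward direction of the geodesic) is chosen here for illustration and may differ from the paper's:

    \frac{d^{+}}{dt}\,\mathrm{dist}\bigl(\gamma(t),K\bigr)\Big|_{t=t_{0}}
      \;=\; -\cos\theta_{\min},
    \qquad
    \theta_{\min} \;=\; \inf_{\eta}\,\angle\bigl(\gamma^{+}(t_{0}),\,\eta\bigr),

    where η ranges over the shortest paths from γ(t₀) to the nearest points of K.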

    A case study of colliding tornadic storms

    Get PDF
    Abstract only available. Tornadoes occur frequently across the United States each year, causing millions of dollars in damage. Meteorologists are constantly searching for new and improved methods for predicting these weather phenomena in order to increase public awareness and warning times. In this case study, one event was found in which two storm cells collided and produced a tornado over the Kansas City, Missouri area, causing an extensive amount of damage. The goals of this study are, first, to determine what caused the collision between the two storm cells and, second, to determine whether the collision increased the intensity of the tornado, using NSSL/SPC (National Severe Storms Laboratory/Storm Prediction Center) meteorologist Stephen F. Corfidi's "vector approach," a method that uses the mean of the wind directions throughout the cloud layers in the storms together with the location of the low-level jet. Radar imagery was also used in determining the location, time, intensity, and other details of the two storm cells. It is our hope that the completion of this study will produce results that are conducive to the development of more innovative methods for forecasting this type of event. Louis Stokes Alliance for Minority Participation
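
    As a rough illustration of the kind of calculation the "vector approach" involves, the sketch below averages hypothetical cloud-layer winds and estimates a storm-system motion vector as the mean cloud-layer wind minus the low-level jet vector (the classic Corfidi formulation for upwind-propagating systems). The wind values, layer choices, and function names are illustrative assumptions and are not taken from the study.

    import numpy as np

    def wind_to_uv(speed_kt, direction_deg):
        """Convert a meteorological wind (speed, direction it blows FROM) to u/v components."""
        rad = np.deg2rad(direction_deg)
        return np.array([-speed_kt * np.sin(rad), -speed_kt * np.cos(rad)])

    # Hypothetical cloud-layer winds (knots, degrees) at several levels, and a low-level jet.
    cloud_layer_winds = [(35, 240), (45, 250), (55, 255), (60, 260)]
    low_level_jet = wind_to_uv(40, 180)

    # Mean cloud-layer wind: simple average of the u/v components.
    mean_wind = np.mean([wind_to_uv(s, d) for s, d in cloud_layer_winds], axis=0)

    # Corfidi-style estimate for upwind-propagating systems:
    # system motion = mean cloud-layer wind - low-level jet vector.
    system_motion = mean_wind - low_level_jet

    speed = np.hypot(*system_motion)
    direction = np.rad2deg(np.arctan2(-system_motion[0], -system_motion[1])) % 360
    print(f"Estimated system motion: {speed:.0f} kt from {direction:.0f} degrees")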

    PEX11β induces peroxisomal gene expression and alters peroxisome number during early Xenopus laevis development

    Get PDF
    Background: Peroxisomes are organelles whose roles in fatty acid metabolism and reactive oxygen species elimination have attracted much attention in the study of their origin and biogenesis. Many studies have shown that de novo peroxisome biogenesis is an important regulatory process, while yeast studies suggest that total peroxisome numbers are in part regulated by proteins such as Pex11, which can facilitate the division of existing peroxisomes. Although de novo biogenesis and division are likely important mechanisms, the regulation of peroxisome numbers during embryonic development is poorly understood. Peroxisome number and function are particularly crucial in oviparous animals such as frogs, where large embryonic yolk and fatty acid stores must be quickly metabolized and the resulting reactive oxygen species eliminated. Here we elucidate the role of Pex11β in regulating peroxisomal gene expression and number in Xenopus laevis embryogenesis.
    Results: Microinjecting haemagglutinin (HA)-tagged Pex11β into early embryos resulted in increased RNA levels for the peroxisome-related genes PMP70 and catalase at developmental stages 10 and 20, versus uninjected embryos. Catalase and PMP70 proteins were found in punctate structures at stage 20 in control embryos, whereas the injection of ectopic HA-Pex11β induced their earlier localization in punctate structures at stage 10. Furthermore, the peroxisomal marker GFP-SKL, which was found localized to peroxisome-like structures at stage 20, was similarly found at stage 10 when co-microinjected with HA-Pex11β.
    Conclusions: Overexpressed Pex11β altered peroxisomal gene levels and induced the early formation of peroxisome-like structures during development, both of which demonstrate that Pex11β may be a key regulator of peroxisome number in early Xenopus embryos.

    Contamination in complex healthcare trials: the falls in care homes (FinCH) study experience

    Get PDF
    BACKGROUND: Trials are at risk of contamination bias, which can occur when participants in the control group are inadvertently exposed to the intervention. This is a particular risk in rehabilitation studies, where it is easy for trial interventions to be either intentionally or inadvertently adopted in control settings. The Falls in Care Homes (FinCH) trial is used in this paper as an example of a large randomised controlled trial of a complex intervention to explore the potential risks of contamination bias. We outline the FinCH trial design, present the potential risks from contamination bias, and describe the strategies used in the design of the trial to minimise or mitigate them. The FinCH trial was a multi-centre randomised controlled trial, with embedded process evaluation, which evaluated whether systematic training in the use of the Guide to Action Tool for Care Homes reduced falls in care home residents. Data were collected from a number of sources to explore contamination in the FinCH trial. Where specific procedures were adopted to reduce the risk of, or mitigate against, contamination, this was recorded. Data were collected from study e-mails, meetings with clinicians, research assistant and clinician network communications, and an embedded process evaluation in six intervention care homes. During the FinCH trial, there were six new falls prevention initiatives implemented outside the study which could have contaminated our intervention and findings. Methods used to minimise contamination were: cluster randomisation at the level of the care home; engagement with the clinical community to highlight the risks of early adoption; establishing local collaborators in each site familiar with the local context; signing agreements with NHS falls specialists that they would maintain confidentiality regarding details of the intervention; opening additional research sites; and raising awareness among participants about the importance of contamination in research. CONCLUSION: Complex rehabilitation trials are at risk of contamination bias. The potential for contamination bias in studies can be minimised by strengthening collaboration and dialogue with the clinical community. Researchers should recognise that clinicians may contaminate a study through lack of research expertise.

    Transferring a molecular foundation model for polymer property predictions

    Full text link
    Transformer-based large language models have remarkable potential to accelerate design optimization for applications such as drug development and materials discovery. Self-supervised pretraining of transformer models requires large-scale datasets, which are often sparsely populated in topical areas such as polymer science. State-of-the-art approaches for polymers conduct data augmentation to generate additional samples, but this unavoidably incurs extra computational costs. In contrast, large-scale open-source datasets are available for small molecules and provide a potential solution to data scarcity through transfer learning. In this work, we show that transformers pretrained on small molecules and fine-tuned on polymer properties achieve accuracy comparable to those trained on augmented polymer datasets for a series of benchmark prediction tasks.
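
    A minimal sketch of the transfer-learning recipe the abstract describes, assuming a Hugging Face-style transformer pretrained on small-molecule SMILES; the checkpoint name, the toy polymer data, and the hyperparameters are illustrative assumptions, not the authors' actual setup.

    # Fine-tune a small-molecule pretrained transformer for polymer property regression.
    from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                              Trainer, TrainingArguments)
    from datasets import Dataset

    checkpoint = "seyonec/ChemBERTa-zinc-base-v1"  # assumed small-molecule pretrained checkpoint
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(
        checkpoint, num_labels=1, problem_type="regression"  # single scalar property head
    )

    # Toy polymer dataset: repeat-unit SMILES with a scalar property (placeholder values).
    data = Dataset.from_dict({
        "smiles": ["*CC(*)c1ccccc1", "*CC(*)C(=O)OC"],
        "labels": [100.0, 20.0],
    })

    def tokenize(batch):
        return tokenizer(batch["smiles"], truncation=True, padding="max_length", max_length=128)

    data = data.map(tokenize, batched=True)

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="polymer-finetune", num_train_epochs=3,
                               per_device_train_batch_size=2, learning_rate=2e-5),
        train_dataset=data,
    )
    trainer.train()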

    Time-reversal symmetry relation for nonequilibrium flows ruled by the fluctuating Boltzmann equation

    Full text link
    A time-reversal symmetry relation is established for out-of-equilibrium dilute or rarefied gases described by the fluctuating Boltzmann equation. The relation is obtained from the associated coarse-grained master equation ruling the random numbers of particles in cells of given position and velocity in the single-particle phase space. The symmetry relation concerns the fluctuating particle and energy currents of the gas flowing between reservoirs or thermalizing surfaces at given particle densities or temperatures.
    Comment: The result of this paper has been presented at the conference "Boltzmann equation: mathematics, modeling and simulations" in memory of Carlo Cercignani at the Henri Poincaré Institute, Paris, February 9-11, 201
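
    For orientation only: symmetry relations of this type are usually stated as a fluctuation theorem for the currents. The generic form below is a standard statement from the literature on current fluctuation relations, given here as an illustration rather than as the paper's precise result; J_k denote the time-averaged particle and energy currents between the reservoirs and A_k the corresponding thermodynamic affinities determined by the reservoir densities and temperatures:

    \frac{P\bigl(\{J_k\};\,t\bigr)}{P\bigl(\{-J_k\};\,t\bigr)}
      \;\simeq\; \exp\!\Bigl(t\sum_k A_k J_k\Bigr)
    \qquad (t \to \infty).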