
    Toward Realistic Dosimetry In Vitro: Determining Effective Concentrations of Test Substances in Cell Culture and Their Prediction by an In Silico Mass Balance Model

    Nominal concentrations (CNom) in cell culture media are routinely used to define concentration–effect relationships in in vitro toxicology. The actual concentration in the medium (CMedium) can be affected by adsorption, evaporation, or degradation of the chemical. We therefore measured the total and free concentrations of 12 chemicals, covering a wide range of lipophilicity (log KOW from −0.07 to 6.84), in the culture medium (CMedium) and in cells (CCell) after incubation with Balb/c 3T3 cells for up to 48 h. Measured values were compared with predictions from an as yet unpublished in silico mass balance model that combines relevant equations from similar models published by others. The total CMedium was similar to CNom for all chemicals except tamoxifen (TAM); the exception was attributed to the cellular uptake of TAM and its accumulation in lysosomes. The free (i.e., unbound) CMedium of chemicals with low or no protein binding was similar to CNom, whereas that of all moderately to highly protein-bound chemicals was less than 30% of CNom. Of the 12 chemicals, the two most hydrophilic, acetaminophen (APAP) and caffeine (CAF), were the only ones for which CCell equaled CNom. CCell for all other chemicals tended to increase over time and was 2- to 274-fold higher than CNom. Measurements of CCytosol, using a digitonin method to release the cytosol, compared well with CCell (obtained by a freeze–thaw method) for four chemicals (CAF, APAP, FLU, and KET), indicating that either method could be used. The mass balance model predicted the total CMedium within 30% of the measured values for 11 chemicals, and the free CMedium of all 12 chemicals within 3-fold of the measured values. Predictions of CCell were poorer, with a median overprediction of 3- to 4-fold. In conclusion, although the number of chemicals studied is limited, the study demonstrates the large differences between CNom and the total and free CMedium and CCell, which were also relatively well predicted by the mass balance model.
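
    The mass balance model itself is unpublished and not reproduced above; its core step, however, is a closed-form equilibrium partitioning of the nominal dose over free-medium, protein-bound, and cellular compartments. Below is a minimal sketch of that kind of calculation; the compartments, parameter names, and example values are illustrative assumptions, not the paper's equations or fitted parameters.

```python
# Minimal sketch of an equilibrium mass-balance partitioning of a
# nominal dose over free-medium, protein-bound, and cell compartments.
# All parameter names and values are illustrative assumptions, not the
# paper's (unpublished) model equations or fitted parameters.

def mass_balance(c_nom, v_medium, v_cells, k_protein, k_cell):
    """Return (free C_Medium, total C_Medium, C_Cell) in the dose units.

    c_nom:     nominal concentration (e.g. uM)
    v_medium:  medium volume (L); v_cells: total cell volume (L)
    k_protein: bound/free ratio in the medium (protein binding capacity)
    k_cell:    cell/medium partition coefficient
    """
    amount = c_nom * v_medium  # total amount dosed
    # Mass balance: amount = C_free*V_m + C_free*k_protein*V_m + C_free*k_cell*V_c
    c_free = amount / (v_medium * (1 + k_protein) + k_cell * v_cells)
    return c_free, c_free * (1 + k_protein), c_free * k_cell

# A lipophilic, protein-bound chemical: the free fraction drops well below
# nominal while the cells accumulate the chemical many-fold.
free, total, cell = mass_balance(10.0, 2e-4, 1e-6, k_protein=4.0, k_cell=200.0)
print(f"free={free:.2f}  total_medium={total:.2f}  cell={cell:.0f}")
```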

    Housing Intervention and Neighbourhood Development: Harnessing Change in West Broadway

    During the period leading up to the early 1990s, the West Broadway area of inner-city Winnipeg showed many signs of neighbourhood decline, such as residential fires, housing abandonment, and structural deterioration. From the mid-1990s, considerable volunteer energy, public funding, and philanthropic resources were devoted to turning the neighbourhood around, with efforts focused on community development, employment training, arts programs, housing upgrading, and other themes. Many individuals and organizations combined their capabilities in an attempt to create an inclusive and diverse community. The study Housing Intervention and Neighbourhood Development was grounded in the need to take stock of changes in the neighbourhood and to relate them to what is known about the nature of neighbourhood change. The intent was to enable an informed assessment of whether dynamics such as gentrification, disinvestment, and stabilization appear to be operating in parts of the neighbourhood. This assessment, in turn, would support discussion of strategies that could be implemented to help guide how the neighbourhood unfolds.

    Applications of deep convolutional neural networks to digitized natural history collections

    Natural history collections contain data that are critical for many scientific endeavors. Recent efforts in mass digitization are generating large datasets from these collections that can provide unprecedented insight. Here, we present examples of how deep convolutional neural networks can be applied in analyses of imaged herbarium specimens. We first demonstrate that a convolutional neural network can detect mercury-stained specimens across a collection with 90% accuracy. We then show that such a network can correctly distinguish two morphologically similar plant families 96% of the time. Discarding the most challenging specimen images increases accuracy to 94% and 99%, respectively. These results highlight the importance of mass digitization and deep learning approaches and show how, together, they can deliver powerful new investigative tools.
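
    As a concrete illustration of the kind of pipeline described, the sketch below fine-tunes a pretrained convolutional network for a two-class specimen task using PyTorch/torchvision. The directory layout, class labels, and hyperparameters are assumptions for illustration; the paper's architecture and training details may differ.

```python
# Transfer-learning sketch for a two-class herbarium image task
# (e.g. mercury-stained vs. clean). Data layout "herbarium/train/<class>/"
# and all hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("herbarium/train", transform=tfm)
loader = torch.utils.data.DataLoader(train_ds, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # replace head: 2 classes

opt = torch.optim.Adam(model.fc.parameters(), lr=1e-4)  # train head only
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # one epoch shown for brevity
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    opt.step()
```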

    Algorithm for J-Integral Measurements by Digital Image Correlation

    This work tests an algorithm for calculating the J-integral from displacement vector fields constructed by the digital image correlation (DIC) method. A comparative analysis was performed between J-integral values calculated using DIC and instrumental data obtained in accordance with ASTM E1820, "Standard Test Method for Measurement of Fracture Toughness". It is shown that this approach can be used in cases where the standard technique for measuring the J-integral cannot be applied, or where it does not achieve the required accuracy for determining the integral in local areas of the loaded material.
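
    The abstract does not reproduce the formula; for reference, the classical contour form of Rice's J-integral, which DIC displacement and strain fields make it possible to evaluate, is given below, where Γ is a contour enclosing the crack tip, W is the strain energy density, T_i the traction vector, u_i the displacement field, and ds an element of arc length. (The paper's specific discretization may differ.)

```latex
J = \int_{\Gamma} \left( W \,\mathrm{d}y \;-\; T_i \,\frac{\partial u_i}{\partial x} \,\mathrm{d}s \right)
```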

    Reproducible big data science: A case study in continuous FAIRness.

    Big biomedical data create exciting opportunities for discovery, but they make it difficult to capture analyses and outputs in forms that are findable, accessible, interoperable, and reusable (FAIR). In response, we describe tools that make it easy to capture, and assign identifiers to, data and code throughout the data lifecycle. We illustrate the use of these tools via a case study involving a multi-step analysis that creates an atlas of putative transcription factor binding sites from terabytes of ENCODE DNase I hypersensitive sites sequencing data. We show how the tools automate routine but complex tasks, capture analysis algorithms in understandable and reusable forms, and harness fast networks and powerful cloud computers to process data rapidly, all without sacrificing usability or reproducibility, thus ensuring that big data are not hard-to-(re)use data. We evaluate our approach via a user study and show that 91% of participants were able to replicate a complex analysis involving considerable data volumes.
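
    The specific tools are named in the full paper rather than the abstract. As a generic illustration of one building block of such pipelines, content-derived identifiers, the sketch below hashes a file's bytes so the identifier changes whenever the data change. The function name, identifier prefix, and input file are hypothetical; this is not the paper's toolchain.

```python
# Generic sketch of a content-derived identifier: hash the file's bytes
# so the identifier is tied to the exact data version. Illustrative only;
# not the identifier scheme or API used in the paper.
import hashlib

def content_identifier(path: str, prefix: str = "id:") -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
            h.update(chunk)
    return prefix + h.hexdigest()[:16]

print(content_identifier("dnase_peaks.bed"))  # hypothetical input file
```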

    Using paired serology and surveillance data to quantify dengue transmission and control during a large outbreak in Fiji.

    Dengue is a major health burden, but it can be challenging to examine transmission and evaluate control measures because outbreaks depend on multiple factors, including human population structure, prior immunity, and climate. We combined population-representative paired sera, collected before and after the 2013/14 dengue-3 outbreak in Fiji, with surveillance data to determine how such factors influence transmission and control in island settings. Our results suggested that the 10–19-year-old age group had the highest risk of infection, but we did not find strong evidence that other demographic or environmental risk factors were linked to seroconversion. A mathematical model jointly fitted to the surveillance and serological data suggested that herd immunity and seasonally varying transmission alone could not explain the observed dynamics. However, the model showed evidence of an additional reduction in transmission coinciding with a vector clean-up campaign, which may have contributed to the decline in cases in the later stages of the outbreak.
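
    The sketch below illustrates the kind of model structure described: a seasonally forced transmission model with a step reduction in transmission from the day of the clean-up campaign onward. It is a minimal caricature, not the paper's fitted model, and all parameter values are illustrative.

```python
# Deterministic SIR with sinusoidal seasonal forcing and a step
# reduction in transmission from `control_day` onward (clean-up
# campaign). All parameter values are illustrative assumptions.
import numpy as np

def simulate(beta0=0.3, amp=0.2, gamma=1 / 7.0, control_day=120,
             control_effect=0.5, days=365, N=900_000, I0=10.0):
    S, I = N - I0, I0
    incidence = []
    for t in range(days):
        beta = beta0 * (1.0 + amp * np.sin(2.0 * np.pi * t / 365.0))
        if t >= control_day:
            beta *= (1.0 - control_effect)  # campaign reduces transmission
        new_inf = beta * S * I / N          # new infections on day t
        S -= new_inf
        I += new_inf - gamma * I
        incidence.append(new_inf)
    return np.array(incidence)

epi = simulate()
print("peak day:", epi.argmax(), " attack rate:", epi.sum() / 900_000)
```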

    Estimation of Rift Valley fever virus spillover to humans during the Mayotte 2018–2019 epidemic

    Rift Valley fever (RVF) is an emerging zoonotic arboviral hemorrhagic fever threatening livestock and humans, mainly in Africa. RVF is of global concern, having expanded its geographical range over the last decades. The impact of control measures on epidemic dynamics has not previously been assessed using empirical data. Here, we fitted a mathematical model to livestock seroprevalence and human RVF case data from the 2018–2019 epidemic in Mayotte to estimate viral transmission among livestock, and spillover from livestock to humans through both direct contact and vector-mediated routes. Model simulations were used to assess the impact of vaccination on reducing the epidemic size. The rate of spillover by direct contact was about twice as high as that by vector transmission. Assuming 30% of the population were farmers, the two transmission routes contributed 45% and 55% of human infections, respectively. Reactive vaccination immunizing 20% of the livestock population reduced the number of human cases by 30%. Vaccinating one month later required 50% more vaccine doses for a similar reduction, and vaccinating only farmers required 10 times as many doses for a similar reduction in human cases. Finally, with 52.0% (95% credible interval [CrI] 42.9–59.4) of livestock immune at the end of the epidemic wave, viral reemergence in the next rainy season (2019–2020) is unlikely. Coordinated human and animal health surveillance and timely livestock vaccination appear to be key to controlling RVF in this setting. We furthermore demonstrate the value of a quantitative One Health approach to the surveillance and control of zoonotic infectious diseases.
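
    The 45%/55% split follows from simple arithmetic on the two routes: the per-person contact rate applies only to the roughly 30% of the population who are farmers, while the vector route applies to everyone. The sketch below reproduces that arithmetic; the rate values are chosen only so the split lands near the reported figures and are not the paper's fitted estimates.

```python
# Splitting human RVF infections between the direct-contact route
# (farmers only) and the vector route (everyone). Rates are illustrative,
# chosen so the split lands near the reported 45%/55%; they are not the
# paper's fitted estimates.
def route_contributions(pop, farmer_frac, rate_contact, rate_vector,
                        livestock_prev):
    contact = rate_contact * livestock_prev * pop * farmer_frac
    vector = rate_vector * livestock_prev * pop
    total = contact + vector
    return contact / total, vector / total

c, v = route_contributions(pop=260_000, farmer_frac=0.30,
                           rate_contact=2.7, rate_vector=1.0,
                           livestock_prev=0.05)
print(f"direct contact: {c:.0%}, vector-mediated: {v:.0%}")
```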

    Fermi Large Area Telescope Constraints on the Gamma-ray Opacity of the Universe

    The Extragalactic Background Light (EBL) comprises photons with wavelengths from the ultraviolet to the infrared, which effectively attenuate gamma rays with energies above ~10 GeV propagating from sources at cosmological distances. This results in a redshift- and energy-dependent attenuation of the gamma-ray flux of extragalactic sources such as blazars and gamma-ray bursts (GRBs). The Large Area Telescope on board Fermi has detected a sample of gamma-ray blazars with redshifts up to z ~ 3 and GRBs with redshifts up to z ~ 4.3. Using photons above 10 GeV collected by Fermi from these sources over more than one year of observations, we investigate the effect of gamma-ray flux attenuation by the EBL. We place upper limits on the gamma-ray opacity of the Universe at various energies and redshifts and compare them with predictions from well-known EBL models. We find that an EBL intensity in the optical–ultraviolet wavelengths as great as that predicted by the "baseline" model of Stecker et al. (2006) can be ruled out with high confidence.
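
    The opacity limits constrain the optical depth τ(E, z) in the standard attenuation relation below, where F_int is the intrinsic source flux and F_obs the flux observed after propagation through the EBL.

```latex
F_{\mathrm{obs}}(E, z) = F_{\mathrm{int}}(E) \, e^{-\tau(E, z)}
```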