16 research outputs found
Impact of actual and self-perceived body type on visual perception of distances
We investigate whether physical body size and one's self-perceived body type affect the ability to make accurate judgements of distances. Data collected include subjects' guesses of the distances of four cones set 10, 15, 20, and 25 meters away, along with the weight, BMI, and self-perception of body image for each of 67 subjects. Interest lies in determining which covariates are most important in explaining one's ability to accurately judge distances, and whether weight or BMI is the better explainer among the physical body size predictors. We utilize linear mixed models to account for correlation among each subject's own distance guesses and to allow flexible modeling of subject-specific effects. Flexibility is further promoted through model averaging techniques, which account for the model selection uncertainty inherent in typical approaches in which an analyst selects only one model from which inferences are made. A generalization of the coefficient of determination from ordinary linear models to the linear mixed model setting (R²LMM) provides an additional goodness-of-fit measure for the fixed effects collectively and for individual fixed effects. Baseline differences among subjects' ability to accurately judge distances are so vast that extracting the importance of the fixed effects becomes difficult. Body size is found to be a significant predictor of subjects' ability to accurately judge distances at the 0.05 significance level, but body image is not. We recommend choosing weight over BMI as a predictor of guessing behavior based on information criteria, model averaging, and the generalized R²LMM; specifically, heavier individuals tend to guess more accurately.
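The R²LMM idea described above can be illustrated with a Nakagawa-style decomposition, in which a marginal R² credits only the fixed effects and a conditional R² also credits the subject random effects. The sketch below uses simulated data whose variable names (weight, subject intercepts) mirror the study design but whose values are entirely hypothetical; it shows how large between-subject baseline variation shrinks the marginal R² even when a fixed effect is real.

```python
import numpy as np

# Simulated illustration of a generalized R^2 for a linear mixed model:
# marginal R^2 uses fixed effects only; conditional R^2 adds random effects.
# All numbers are hypothetical, not the study's data.
rng = np.random.default_rng(42)

n_subj = 67
weight = rng.normal(80.0, 15.0, n_subj)      # kg, hypothetical
u = rng.normal(0.0, 3.0, n_subj)             # subject random intercepts (large baseline spread)
beta0, beta1 = 5.0, -0.04                    # heavier -> smaller mean guessing error (hypothetical)
sigma_e = 1.0                                # residual standard deviation

# four distance guesses per subject (the four cones)
fixed = beta0 + beta1 * weight               # fixed-effect contribution per subject
y = np.repeat(fixed + u, 4) + rng.normal(0.0, sigma_e, n_subj * 4)

var_f = np.var(np.repeat(fixed, 4))          # variance explained by fixed effects
var_u = np.var(np.repeat(u, 4))              # between-subject variance
var_e = sigma_e ** 2                         # residual variance

r2_marginal = var_f / (var_f + var_u + var_e)
r2_conditional = (var_f + var_u) / (var_f + var_u + var_e)
print(round(r2_marginal, 3), round(r2_conditional, 3))
```

Here the subject intercepts dominate the variance, so the conditional R² is large while the marginal R² is small, mirroring the abstract's point that vast baseline differences make the fixed effects hard to extract.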
Challenges and opportunities for agroforestry practitioners to participate in state preferential property tax programs for agriculture and forestry
All 50 states offer preferential property tax programs that lower the taxes paid on enrolled agricultural and/or forest lands. While agroforestry is a land use that combines elements of both agriculture and forestry, eligibility criteria and other rules and regulations may prevent landowners from enrolling agroforestry practices in one or more of the agricultural and forestry tax programs. This pilot-scale study developed conceptual and methodological frameworks to identify the current barriers to and opportunities in preferential tax policies applicable to agroforestry practices. We conducted an extensive review of state preferential property tax programs relevant to agroforestry practices, followed by focus group discussions with regional experts in five selected states across the United States: North Carolina, Nebraska, Wisconsin, New York, and Oregon. Based on a systematic review of statutes and their supporting documents, we developed a database of program provisions that support or create barriers to enrollment of agroforestry practitioners. We found that agricultural tax assessments were more likely to favor multi-use agriculture and forestry systems than the preferential tax assessments of forestlands in the five states. Forest farming and silvopasture, followed by alley cropping, windbreaks, and riparian forest buffers, were found to be the most common agroforestry practices allowed under preferential tax classifications in the study states. This study provides a framework for cataloging and analyzing preferential property tax programs to document barriers and facilitators to agroforestry practices in the United States.
Investigation of risk factors for introduction of highly pathogenic avian influenza H5N1 virus onto table egg farms in the United States, 2022: a case–control study
Introduction: The 2022–2023 highly pathogenic avian influenza (HPAI) H5N1 outbreak in the United States (U.S.) is the most geographically extensive and costly animal health event in U.S. history. In 2022 alone, over 57 million commercial and backyard poultry in 47 U.S. states were affected. Over 75% of affected poultry were part of the commercial table egg production sector. Methods: We conducted a case–control study to identify potential risk factors for introduction of HPAI virus onto commercial table egg operations. Univariate and multivariable analyses were conducted to compare farm characteristics, management, and biosecurity factors on case and control farms. Results: Factors associated with increased risk of infection included being in an existing control zone, sightings of wild waterfowl, mowing or bush hogging vegetation fewer than four times per month, having an off-site method of daily mortality disposal (off-site composting or burial, rendering, or landfill), and wild bird access to feed/feed ingredients at least some of the time. Protective factors included a high level of vehicle washing for trucks and trailers entering the farm (a composite variable that included having a permanent wash station), having designated personnel assigned to specific barns, having a farm entrance gate, and requiring a change of clothing for workers entering poultry barns. Discussion: Study results improve our understanding of risk factors for HPAI infection and control measures for preventing HPAI on commercial U.S. table egg farms.
Investigation of risk factors for introduction of highly pathogenic avian influenza H5N1 infection among commercial turkey operations in the United States, 2022: a case-control study
Introduction: The 2022–2023 highly pathogenic avian influenza (HPAI) H5N1 outbreak in the United States (U.S.) is the largest and most costly animal health event in U.S. history. Approximately 70% of commercial farms affected during this outbreak have been turkey farms. Methods: We conducted a case-control study to identify potential risk factors for introduction of HPAI virus onto commercial meat turkey operations. Data were collected from 66 case farms and 59 control farms in 12 states. Univariate and multivariable analyses were conducted to compare management and biosecurity factors on case and control farms. Results: Factors associated with increased risk of infection included being in an existing control zone, having both brooders and growers, having toms, seeing wild waterfowl or shorebirds in the closest field, and using rendering for dead bird disposal. Protective factors included having a restroom facility, including portable units, available to crews that visit the farm, and workers having access to and using a shower at least some of the time when entering a specified barn. Discussion: Study results provide a better understanding of risk factors for HPAI infection and can be used to inform prevention and control measures for HPAI on U.S. turkey farms.
Coxiella burnetii in domestic doe goats in the United States, 2019–2020
Coxiella burnetii is a bacterial pathogen capable of causing serious disease in humans and abortions in goats. Infected goats can shed C. burnetii through urine, feces, and parturient byproducts, which can lead to infections in humans when the bacteria are inhaled. Goats are important C. burnetii reservoirs, as evidenced by goat-related outbreaks across the world. To better understand the current landscape of C. burnetii infection in the domestic goat population, 4,121 vaginal swabs from 388 operations across the United States were analyzed for the presence of C. burnetii by IS1111 PCR as part of the United States Department of Agriculture, Animal and Plant Health Inspection Service, Veterinary Services' National Animal Health Monitoring System Goats 2019 Study. In total, 1.5% (61/4,121) of swabs, representing 10.3% (40/388) of operations (weighted estimate of 7.8%, 95% CI 4.4–13.5), were positive for C. burnetii DNA. The quantity of C. burnetii on positive swabs was low, with an average Ct of 37.9. Factors associated with greater odds of testing positive included suspected Q fever in the herd in the previous 3 years, the presence of wild deer or elk on the operation, and the utilization of hormones for estrus synchronization. Factors associated with reduced odds of testing positive included the presence of kittens and treatment of herds with high tannin concentrate plants, diatomaceous earth, and tetrahydropyrimidines. In vitro analysis demonstrated an inhibitory effect of the tetrahydropyrimidine pyrantel pamoate on the growth of C. burnetii in axenic media at concentrations as low as 1 μg/mL. The final multivariable logistic regression modeling identified the presence of wild predators on the operation or adjacent property (OR = 9.0, 95% CI 1.3–61.6, p value = 0.0248) as a risk factor for C. burnetii infection.
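Odds ratios and confidence intervals like the OR = 9.0 (95% CI 1.3–61.6) reported above are conventionally obtained by exponentiating a logistic regression coefficient and its Wald interval. The sketch below back-calculates a coefficient and standard error that approximately reproduce the reported interval; both values are illustrative reconstructions, not figures taken from the study.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Wald 95% CI for an odds ratio from a logistic regression coefficient."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# beta and se chosen so the result lands near the reported OR = 9.0
# (95% CI 1.3-61.6); both are hypothetical back-calculations.
beta = math.log(9.0)   # coefficient implied by OR = 9.0
se = 0.984             # standard error implied by the CI width
or_, lo, hi = odds_ratio_ci(beta, se)
print(round(or_, 1), round(lo, 1), round(hi, 1))
```

Note how wide the interval is: with a standard error near 1 on the log-odds scale, the data are consistent with anything from a modest to a very large effect, which is typical when a risk factor is observed on relatively few operations.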
Foot-and-Mouth Disease Infection Dynamics in Contact-Exposed Pigs Are Determined by the Estimated Exposure Dose
The quantitative relationship between the exposure dose of foot-and-mouth disease virus (FMDV) and subsequent infection dynamics has been demonstrated through controlled inoculation studies in various species. However, similar quantitation of viral doses has not been achieved during contact exposure experiments due to the intrinsic difficulty of measuring the virus quantities exchanged between animals. In the current study, novel modeling techniques were utilized to investigate FMDV infection dynamics in groups of pigs that had been contact-exposed to FMDV-infected donors shedding varying levels of virus, as well as in pigs inoculated via the intra-oropharyngeal (IOP) route. Estimated virus exposure doses were modeled and were found to be statistically significantly associated with the dynamics of FMDV RNA detection in serum and oropharyngeal fluid (OPF), and with the time to onset of clinical disease. The minimum estimated shedding quantity in OPF that defined infectiousness of donor pigs was 6.55 log10 genome copy numbers (GCN)/ml (95% CI 6.11, 6.98), delineating the transition from the latent to the infectious phase of disease, which occurred during the incubation phase. This quantity corresponded to a minimum estimated exposure dose of 5.07 log10 GCN/ml (95% CI 4.25, 5.89) in contact-exposed pigs. Thus, we demonstrated that a threshold quantity of FMDV detection in donor pigs was necessary in order to achieve transmission by direct contact. The outcomes from this investigation demonstrate that the variability of infection dynamics which occurs during the progression of FMD in naturally exposed pigs can be partially attributed to variations in exposure dose. Moreover, these modeling approaches for dose quantitation may be retrospectively applied to contact-exposure experiments or field scenarios. Hence, robust information could be incorporated into models used to evaluate FMD spread and control.
Quantitative impacts of incubation phase transmission of foot-and-mouth disease virus
The current investigation applied a Bayesian modeling approach to a unique experimental transmission study to estimate the occurrence of transmission of foot-and-mouth disease (FMD) during the incubation phase amongst group-housed pigs. The primary outcome was that transmission occurred approximately one day prior to the development of visible signs of disease (posterior median 21 hours, 95% CI: 1.1–45.0). Updated disease state durations were incorporated into a simulation model to examine the importance of addressing preclinical transmission in the face of robust response measures. Simulation of FMD outbreaks in the US pig production sector demonstrated that including a preclinical infectious period of one day would result in a 40% increase in the median number of farms affected (166 additional farms and 664,912 pigs euthanized) compared to the scenario of no preclinical transmission, assuming suboptimal outbreak response. These findings emphasize the importance of considering transmission of FMD during the incubation phase in modeling and response planning.
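The two simulation figures quoted above jointly imply the baseline outbreak size, which the abstract does not state directly. As a rough consistency check (derived arithmetic only, not a figure from the paper):

```python
# If 166 additional farms correspond to a 40% increase in the median
# number of farms affected, the implied no-preclinical baseline follows
# directly. This is back-of-envelope arithmetic, not a reported result.
additional_farms = 166
relative_increase = 0.40

baseline = additional_farms / relative_increase   # implied median without preclinical transmission
with_preclinical = baseline + additional_farms    # implied median with a one-day preclinical period
print(round(baseline), round(with_preclinical))
```

So the simulated baseline is roughly 415 farms, rising to roughly 581 when one day of preclinical infectiousness is included.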
Investigation of at-vent dynamics and dilution using thermal infrared radiometers at Masaya volcano, Nicaragua
In order to develop a detailed understanding of the dynamics of gas puffing (gas release as a series of distinct pulses) and more sustained degassing (steady plumes of gas) during persistent volcanic degassing, measurements of gas mass flux are required in the vicinity of the volcanic vent. Masaya volcano (Nicaragua), a persistently degassing system, is an ideal location for measuring the dynamics of releases of volcanic gas in the first few seconds of their propagation. We carried out two field experiments during February 2002 and March 2003, during which thermal infrared thermometers were targeted into the degassing vent at Masaya to record thermal variations related to variations in the at-vent gas emission over short (on the order of seconds) time scales. The thermometers recorded an oscillating signal as gas puffs passed through the field of view, detailing variations in the degassing process developing over time scales of seconds. These data were processed to extract puff frequencies, amplitudes, durations, emission velocities, and volumes. These data showed that, over time periods of hours, the total gas flux was stable with little variation in the puffing frequency. However, between the 2002 and 2003 data sets we noted an increase in mean plume temperature, puffing frequency, puff emission velocity, and puff volume, as well as a decrease in mean puff duration. These changes were consistent with a thermal-data-derived increase in emitted gas flux from 4.2 × 10⁷ m³ d⁻¹ to 6.4 × 10⁷ m³ d⁻¹ between the two campaigns. Turbulent gas puffs entrain surrounding air, and quantifying the magnitude of air entrainment, or dilution, represents a major challenge for the measurement of total volcanic gas emissions. Our observations of small gas puffs suggest that they behave as turbulent buoyant thermals, and we use equations for mass, momentum, and buoyancy, coupled with the standard entrainment assumption for turbulent buoyant flows, to estimate the gas puff dilution.
The theoretically calculated dilution factors of 0.09 and 0.24 between emission and detection yield total SO2 mass fluxes of 209 t d⁻¹ and 864 t d⁻¹ for 2002 and 2003, respectively. This compares well with UV-spectrometer SO2 fluxes of 470 and 680 t d⁻¹ for February 2002 and March 2003, respectively.
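One common way to formalize the "standard entrainment assumption" invoked above is the Morton–Taylor–Turner closure for a discrete buoyant thermal; the sketch below is a textbook formulation under assumed conditions, and the paper's exact equations and coefficient value may differ.

```latex
% Entrainment assumption for a discrete turbulent thermal:
% the radius grows linearly with distance from the source,
\frac{dR}{dz} = \alpha \quad\Rightarrow\quad R(z) \approx \alpha z,
% with an entrainment coefficient often quoted as \alpha \approx 0.25
% for thermals. Since the thermal's volume scales as R^3, the fraction
% of emitted gas remaining in the puff (the dilution factor) falls as
\frac{V_0}{V(z)} \approx \left(\frac{R_0}{R(z)}\right)^{3}.
```

Under this scaling, a puff only a few source radii above the vent is already dominated by entrained air, which is consistent with the small dilution factors (0.09 and 0.24) reported above.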
Using a Portfolio Approach to Evaluate Animal Health Surveillance Portfolios in the United States
Selecting the optimal level of surveillance to implement for an animal disease is important when decision-makers are allocating resources within a surveillance portfolio (the collection of all surveillance activities for a species). Decision-makers should consider economically efficient options that meet the effectiveness requirements of a surveillance system (e.g., disease detection capability and timeliness). In this research, we look at components in two disease surveillance systems within a species portfolio and compare current surveillance testing levels with four other optional levels. Option 1 does not meet the detection capability thresholds, while Option 2 meets the thresholds for one disease but not the other. Options 3 and 4 meet the detection capability thresholds and result in a cost savings compared to current levels. We conclude that Option 3 would be the optimal level of surveillance, as it has a lower cost-effectiveness ratio compared to Option 4 and the current level, as well as a cost savings of $637,500.
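The selection logic described above (screen options against a detection threshold, then rank the survivors by cost-effectiveness ratio) can be sketched as follows. All costs and detection probabilities are hypothetical; the costs are chosen so the gap between the current level and the winning option matches the reported $637,500 savings, purely for illustration, and the single threshold simplifies the study's two-disease setting.

```python
# Sketch of a surveillance-portfolio comparison: among options meeting a
# detection-capability threshold, pick the lowest cost-effectiveness ratio.
# All numbers are hypothetical.
THRESHOLD = 0.95  # assumed required probability of detecting the disease

options = {
    "current":  {"cost": 2_637_500, "detection": 0.99},
    "option_1": {"cost": 1_000_000, "detection": 0.80},  # fails threshold
    "option_2": {"cost": 1_500_000, "detection": 0.90},  # fails threshold
    "option_3": {"cost": 2_000_000, "detection": 0.96},
    "option_4": {"cost": 2_200_000, "detection": 0.95},
}

feasible = {k: v for k, v in options.items() if v["detection"] >= THRESHOLD}
cer = {k: v["cost"] / v["detection"] for k, v in feasible.items()}  # $ per unit detection
best = min(cer, key=cer.get)
savings = options["current"]["cost"] - options[best]["cost"]
print(best, savings)
```

With these illustrative inputs, option_3 wins on cost-effectiveness ratio while still clearing the threshold, mirroring the abstract's conclusion.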