
    Virtual soil testing – what is it?

    Non-Peer Reviewed

    Currently, at best only ten percent of the total arable land in western Canada is soil tested, and the percentage of farmers who soil test every year is even lower. Providing recommendations to the farming community for non-tested land presents both a challenge and an opportunity. Virtual soil testing started as an idea to utilize information collected from soil-tested fields in order to provide more qualified recommendations for fields that were not tested. Virtual soil testing (VST®) is in essence a modeling technique that reverses the soil testing process, i.e., it uses crop production characteristics in association with chemical tests to predict soil nutrient levels for a subsequent crop (Karamanos and Cannon 2002). It is based on the Fertility Analysis and Recommendations Management (F.A.R.M.) model (Kruger et al. 1994), developed by Henry (1990; 1991) and subsequently adapted to Saskatchewan, Manitoba and Alberta conditions by Karamanos and Henry (1991) and Karamanos et al. (1992a,b), respectively. F.A.R.M. recognizes three sources of nitrogen contributing to plant N uptake: soil available N as determined by soil testing, net mineralizable N, and fertilizer N. Target yields are based on moisture use efficiency crop production equations (Karamanos and Henry, 1991) and are estimated for 75, 50 and 25 percent probability of precipitation in a given Soil Climatic Zone (Meyers and Karamanos, 1997). Recommendations for the remaining nutrients are simply based on "available" nutrient ranges and are presented in table format. This system of recommendations was introduced in the Province of Saskatchewan in 1991 and is currently used by Enviro-Test Laboratories in all three Prairie Provinces. Development of the VST process required modifications to the F.A.R.M. model, especially in relation to the soil mineralization component; these modifications are discussed by Karamanos and Cannon (2002).
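    The three-source nitrogen balance described above can be sketched as a toy calculation: fertilizer N makes up the gap between crop demand (derived from the target yield) and the N supplied by the soil test plus net mineralization. The uptake coefficient and use-efficiency values below are illustrative assumptions, not figures from the F.A.R.M. model.

    ```python
    # Hypothetical sketch of a F.A.R.M.-style nitrogen balance.
    # n_uptake_per_bu and efficiency are illustrative assumptions,
    # not coefficients from the published model.

    def fertilizer_n_recommendation(target_yield_bu_ac, soil_test_n, mineralizable_n,
                                    n_uptake_per_bu=1.5, efficiency=0.5):
        """Fertilizer N (lb/ac): crop N demand minus soil-supplied N, scaled by use efficiency."""
        crop_demand = target_yield_bu_ac * n_uptake_per_bu
        soil_supply = soil_test_n + mineralizable_n
        deficit = max(0.0, crop_demand - soil_supply)
        return deficit / efficiency

    # e.g. a 40 bu/ac target yield with 20 lb/ac soil test N and 15 lb/ac mineralizable N
    print(round(fertilizer_n_recommendation(40, 20, 15), 1))  # -> 50.0
    ```

    In the actual model the target yield itself would come from the moisture use efficiency equations and the chosen precipitation probability; here it is simply an input.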

    Single measurement to predict potential mineralizable nitrogen

    Non-Peer Reviewed

    Although soil nitrate nitrogen (N) has been used as the basis for N fertilizer recommendations in western Canada, potential mineralizable N is likely a more accurate indicator of the N-supplying power of the soil. Potential mineralizable N, analyzed by extraction with hot KCl, and organic matter content were determined at the AESA Soil Quality Benchmark Sites in Alberta. Using these results, we developed an approach to estimate Nt from soil organic matter, based on the equation Nt = N0(1 − e^(−kt)), and validated the calculated Nt against the hot KCl extracted N. Results indicated that the potential mineralizable N released from soil differed among ecoregions and slope positions. Potential mineralizable N is lower in southern Alberta than in central Alberta, and lower slopes released more N than higher slope positions. Nt released over the growing season correlated well with hot KCl extracted N at all three slope positions. However, variability of Nt at the upper slope position was greater than at the middle and lower slopes due to a shallow A horizon and variable soil moisture during the growing season. After removal of outliers (9% of the total data set), the R2 values (regression of hot KCl N against calculated Nt) were 0.529, 0.576 and 0.627 for the upper, middle and lower slope positions, respectively. Using the calculated Nt results, a potential mineralizable N map of Alberta has been developed. This map will guide producers in managing soil as well as fertilizer N.
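    As a minimal sketch, the first-order model above can be evaluated directly; the N0, k, and season-length values below are illustrative assumptions, not the Alberta benchmark figures.

    ```python
    import math

    # First-order mineralization model from the abstract: Nt = N0 * (1 - e^(-k*t)).
    # N0 (potential mineralizable N, mg/kg), k (rate constant, per week) and the
    # 16-week season length are illustrative assumptions, not measured values.

    def mineralized_n(n0_mg_kg: float, k_per_week: float, t_weeks: float) -> float:
        """Cumulative N mineralized (mg/kg) after t_weeks."""
        return n0_mg_kg * (1.0 - math.exp(-k_per_week * t_weeks))

    print(round(mineralized_n(100.0, 0.05, 16), 1))  # -> 55.1
    ```

    The model saturates at N0 as t grows, which is why a single organic-matter-based estimate of N0 can stand in for repeated incubation measurements.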

    Evaluation of irradiation and Termin-8® addition to spray-dried animal plasma, base mix and/or whole diet on growth performance of nursery pigs

    Two studies were conducted to evaluate the effects of irradiation of spray-dried animal plasma and of Termin-8 treatment of spray-dried animal plasma, base mix (specialty protein products, milk products, ground oat groats, soy flour, flow agent, vitamins, and minerals), or whole diet on nursery pig performance. Overall (d 0 to 14) in Exp. 1, pigs fed diets containing irradiated plasma had increased ADG, and pigs fed Termin-8®-treated plasma had increased ADG and ADFI compared with pigs fed diets with regular plasma or whole diets (containing either regular or irradiated plasma) treated with Termin-8. No differences in F/G were observed among treatments. In Exp. 2, pigs fed SEW diets containing either animal plasma or base mix treated with Termin-8 had improved ADG and F/G from d 0 to 13 compared with no Termin-8 treatment, but no differences were observed overall (d 0 to 40). Therefore, the use of irradiated spray-dried animal plasma and of Termin-8-treated spray-dried animal plasma and base mix improves growth performance of nursery pigs during the initial period after weaning.

    Blue Straggler Stars: Early Observations that Failed to Solve the Problem

    In this chapter, I describe early ideas on blue stragglers, and various observations (some published, some not) that promised but failed to resolve the question of their origin. I review the data and ideas that were circulating from Allan Sandage's original discovery in 1953 of "anomalous blue stars" in the globular cluster M3, up until about 1992, when what seems to have been the only previous meeting devoted to Blue Straggler Stars (BSSs) was held at the Space Telescope Science Institute.

    Comment: Chapter 2 in Ecology of Blue Straggler Stars, H.M.J. Boffin, G. Carraro & G. Beccari (Eds), Astrophysics and Space Science Library, Springer.

    Dynamics of tree diversity in undisturbed and logged subtropical rainforest in Australia

    In subtropical rainforest in eastern Australia, changes in tree diversity were compared under natural conditions and eight silvicultural regimes over 35 years. In the treated plots, basal area remaining after logging ranged from 12 to 58 m² per ha. In three control plots, richness differed little over this period. In the eight treated plots, richness per plot generally declined after intervention and then gradually increased to above its original level. After logging there was a reduction in richness per plot and an increase in species richness per stem in all but the lightest selective treatments. The change in species diversity was related to the intensity of the logging; however, the time taken for species richness to return to pre-logging levels was similar in all silvicultural treatments and was not affected by the intensity of treatment. These results suggest that light selective logging in these forests mainly affects dominant species. The return to high diversity after only a short time under all silvicultural regimes suggests that sustainability and the manipulation of species composition for desired management outcomes are possible.

    The Global Functioning: Social and Role scales - further validation in a large sample of adolescents and young adults at clinical high risk for psychosis

    Objective: Traditional measures for assessing functioning in adult patients with schizophrenia have been shown to be insufficient for assessing the issues that occur in adolescents and young adults at clinical high risk (CHR) for psychosis. The current study provides an expanded validation of the Global Functioning: Social (GF:Social) and Role (GF:Role) scales developed specifically for use with CHR individuals, and explores the reliability and accuracy of the ratings, the validity of the scores in comparison to other established clinical measures, the stability of functioning over a 2-year period, and psychosis predictive ability. Methods: Seven hundred fifty-five CHR individuals and 277 healthy control (HC) participants completed the GF:Social and Role scales at baseline as part of the North American Prodrome Longitudinal Study (NAPLS2). Results: Inter-rater reliability and accuracy were high for both scales. Correlations between the GF scores and other established clinical measures demonstrated acceptable convergent and discriminant validity. In addition, GF:Social and Role scores were unrelated to positive symptoms. CHR participants showed large impairments in social and role functioning over 2 years, relative to the HCs, even after adjusting for age, IQ, and attenuated positive symptoms. Finally, social decline prior to baseline was more pronounced in CHR converters, relative to non-converters. Conclusions: The GF scales can be administered in a large-scale multi-site study with excellent inter-rater reliability and accuracy. CHR individuals showed social and role functioning impairments over time that were not confounded by positive symptom severity levels. The results of this study demonstrate that social decline is a particularly effective predictor of conversion outcome.

    Young and Intermediate-age Distance Indicators

    Distance measurements beyond geometrical and semi-geometrical methods rely mainly on standard candles. As the name suggests, these objects have known luminosities by virtue of their intrinsic properties and play a major role in our understanding of modern cosmology. The main caveats associated with standard candles are their absolute calibration, contamination of the sample from other sources, and systematic uncertainties. The absolute calibration mainly depends on their chemical composition and age. To understand the impact of these effects on the distance scale, it is essential to develop methods based on different samples of standard candles. Here we review the fundamental properties of young and intermediate-age distance indicators such as Cepheids, Mira variables and Red Clump stars, and recent developments in their application as distance indicators.

    Comment: Review article, 63 pages (28 figures), accepted for publication in Space Science Reviews (Chapter 3 of a special collection resulting from the May 2016 ISSI-BJ workshop on Astronomical Distance Determination in the Space Age).

    Potentially important periods of change in the development of social and role functioning in youth at clinical high risk for psychosis

    The developmental course of daily functioning prior to first psychosis onset remains poorly understood. This study explored age-related periods of change in social and role functioning. The longitudinal study included youth (aged 12-23, mean follow-up years = 1.19) at clinical high risk (CHR) for psychosis (converters [CHR-C], n = 83; nonconverters [CHR-NC], n = 275) and a healthy control group (n = 164). Mixed-model analyses were performed to determine age-related differences in social and role functioning. We limited our analyses to functioning before psychosis conversion; thus, data of CHR-C participants gathered after psychosis onset were excluded. In controls, social and role functioning improved over time. From at least age 12, functioning in CHR was poorer than in controls, and this lag persisted over time. Between ages 15 and 18, social functioning in CHR-C stagnated and diverged from that of CHR-NC, who continued to improve (p = .001). Subsequently, CHR-C lagged behind in improvement between ages 21 and 23, further distinguishing them from CHR-NC (p < .001). A similar period of stagnation was apparent for role functioning, but to a lesser extent (p = .007). The results remained consistent when we accounted for the time to conversion. Our findings suggest that CHR-C start lagging behind CHR-NC in social and role functioning in adolescence, followed by a period of further stagnation in adulthood.

    Social decline in the psychosis prodrome: Predictor potential and heterogeneity of outcome

    Background: While an established clinical outcome of high importance, social functioning has been emerging as possibly having a broader significance to the evolution of psychosis and long-term disability. In the current study we explored the association between social decline, conversion to psychosis, and functional outcome in individuals at clinical high risk (CHR) for psychosis. Methods: 585 subjects collected in the North American Prodrome Longitudinal Study (NAPLS2) were divided into 236 Healthy Controls (HCs) and CHR subjects that either developed psychosis (CHR + C, N = 79) or did not (non-converters, CHR-NC, N = 270). CHR + C subjects were further divided into those that experienced an atypical decline in social functioning prior to baseline (beyond typical impairment levels) when in mid-to-late adolescence (CHR + C-SD, N = 39) and those that did not undergo such a decline (CHR + C-NSD, N = 40). Results: Rates of poor functional outcome increased across the CHR subgroups: CHR-NC (Poor Social 36.3%, Poor Role 42.2%), CHR + C-NSD (Poor Social 50%, Poor Role 67.5%), and CHR + C-SD (Poor Social 76.9%, Poor Role 89.7%). The two converter subgroups had comparable positive symptoms at baseline. At 12 months, the CHR + C-SD group stabilized, but social functioning levels remained significantly lower than in the other two subgroups. Conclusions: The current study demonstrates that pre-baseline social decline in mid-to-late adolescence predicts psychosis. In addition, we found that this social decline in converters is strongly associated with especially poor functional outcome and overall poorer prognosis. Role functioning, in contrast, has not shown similar predictor potential and instead appears to be an illness indicator that worsens over time.