26 research outputs found
Effects of Nitrogen and Planting Seed Size on Cotton Growth, Development, and Yield
A standardized experiment was conducted during 2009 and 2010 at 20 location-years across U.S. cotton (Gossypium hirsutum L.)-producing states to compare the N use requirements of contemporary cotton cultivars based on their planting seed size. Treatments consisted of three cotton varieties with planting seed of differing size (seeds per kg) and N rates of 0, 45, 90, and 134 kg ha⁻¹. Soil at each trial location was sampled and tested for nitrate. High levels of soil nitrate (>91 kg NO₃⁻-N ha⁻¹) were found in Arizona and western Texas, and soil nitrate in the range of 45 to 73 kg NO₃⁻-N ha⁻¹ was found at locations in the central United States. Cotton lint yield responded to applied N at 11 of 20 locations. Considering only sites that responded to applied N, the highest lint yields were achieved with 112 to 224 kg ha⁻¹ of applied plus pre-plant residual soil NO₃, translating to an optimal N requirement of 23 kg ha⁻¹ per 218 kg bale of lint produced. Among the varieties tested, those with medium-sized seed produced higher yields in response to N than did larger- and smaller-seeded varieties. Varieties with larger seed had longer and stronger fibers, higher fiber length uniformity, and lower micronaire than small-seeded varieties. Seed protein increased and seed oil decreased slightly in response to increasing amounts of soil nitrate plus applied N.
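The reported optimum of 23 kg N per 218 kg bale lends itself to a simple yield-goal calculation. The sketch below is a hypothetical illustration of that arithmetic, not the study's recommendation procedure; the function name and the "total requirement minus residual nitrate" approach are assumptions.

```python
# Hypothetical sketch of the N-rate arithmetic implied by the abstract:
# ~23 kg N per 218 kg bale of lint, supplied by fertilizer N plus
# pre-plant residual soil nitrate. Illustrative only.

def n_recommendation(yield_goal_bales_per_ha, residual_no3_kg_ha):
    """Estimate fertilizer N (kg/ha) as total N requirement minus residual soil nitrate."""
    N_PER_BALE = 23.0  # kg N per 218 kg bale of lint (reported optimum)
    total_n = yield_goal_bales_per_ha * N_PER_BALE
    return max(0.0, total_n - residual_no3_kg_ha)

# Example: a 7-bale/ha yield goal with 70 kg/ha residual nitrate
print(n_recommendation(7, 70))  # 7*23 - 70 = 91.0 kg N/ha
```

At the high residual-nitrate sites reported for Arizona and western Texas, the same arithmetic returns zero, consistent with the lack of yield response to applied N at some locations.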
Is Canada ready for patient accessible electronic health records? A national scan
Background
Access to personal health information through the electronic health record (EHR) is an innovative means to enable people to be active participants in their own health care. Currently this is not an available option for consumers of health care. The absence of a key technology, the EHR, is a significant obstacle to providing patient-accessible electronic records. To assess readiness for the implementation and adoption of EHRs in Canada, a national scan was conducted to determine organizational readiness and willingness for patient-accessible electronic records.
Methods
A survey was conducted of Chief Executive Officers (CEOs) of Canadian public and acute care hospitals.
Results
Two hundred thirteen emails were sent to CEOs of Canadian general and acute care hospitals, with a 39% response rate. Over half (54.2%) of hospitals had some form of EHR, but few had a record that was predominantly electronic. Financial resources were identified as the most important barrier to providing patients access to their EHR. There was also a divergence between healthcare providers' perceptions and what they thought patients would want: providers were less willing to provide access, while patients were thought to desire greater access to the full record.
Conclusion
As the use of EHRs becomes more commonplace, organizations should explore the possibility of responding to patient needs for clinical information by providing access to their EHR. The best way to achieve this is still being debated.
31st Annual Meeting and Associated Programs of the Society for Immunotherapy of Cancer (SITC 2016) : part two
Background
The immunological escape of tumors represents one of the main obstacles to the treatment of malignancies. The blockade of PD-1 or CTLA-4 receptors represented a milestone in the history of immunotherapy. However, immune checkpoint inhibitors seem to be effective in specific cohorts of patients. It has been proposed that their efficacy relies on the presence of an immunological response. Thus, we hypothesized that disruption of the PD-L1/PD-1 axis would synergize with our oncolytic vaccine platform PeptiCRAd.
Methods
We used murine B16OVA in vivo tumor models and flow cytometry analysis to investigate the immunological background.
Results
First, we found that high-burden B16OVA tumors were refractory to combination immunotherapy. However, with a more aggressive schedule, tumors with a lower burden were more susceptible to the combination of PeptiCRAd and PD-L1 blockade. The therapy significantly increased the median survival of mice (Fig. 7). Interestingly, the reduced growth of contralaterally injected B16F10 cells suggested the presence of a long-lasting immunological memory also against non-targeted antigens. Concerning the functional state of tumor-infiltrating lymphocytes (TILs), we found that all the immune therapies enhanced the percentage of activated (PD-1pos TIM-3neg) T lymphocytes and reduced the amount of exhausted (PD-1pos TIM-3pos) cells compared to placebo. As expected, we found that PeptiCRAd monotherapy could increase the number of antigen-specific CD8+ T cells compared to other treatments. However, only the combination with PD-L1 blockade could significantly increase the ratio between activated and exhausted pentamer-positive cells (p = 0.0058), suggesting that by disrupting the PD-1/PD-L1 axis we could decrease the amount of dysfunctional antigen-specific T cells. We observed that anatomical location deeply influenced the state of CD4+ and CD8+ T lymphocytes. In fact, TIM-3 expression was increased 2-fold on TILs compared to splenic and lymphoid T cells. In the CD8+ compartment, surface expression of PD-1 seemed to be restricted to the tumor microenvironment, while CD4+ T cells also showed high expression of PD-1 in lymphoid organs. Interestingly, we found that levels of PD-1 were significantly higher on CD8+ T cells than on CD4+ T cells in the tumor microenvironment (p < 0.0001).
Conclusions
In conclusion, we demonstrated that the efficacy of immune checkpoint inhibitors might be strongly enhanced by their combination with cancer vaccines. PeptiCRAd was able to increase the number of antigen-specific T cells, and PD-L1 blockade prevented their exhaustion, resulting in long-lasting immunological memory and increased median survival.
Reflectance-based Nitrogen Fertilizer Management for Irrigated Cotton
Water and nitrogen are the first and second constraints to cotton production in the arid southwestern U.S., respectively (Morrow and Krieg, 1990). Subsurface drip irrigation (SDI) area in cotton is currently estimated at 300,000 ac and growing (Jim Bordovsky, personal communication). Efficiency of water application to cotton in SDI systems is about 90% (Bordovsky and Lyle, 1998). However, N management research for cotton in SDI has not kept pace with the water management research. Improving N fertilizer use efficiency would allow producers to use lower rates of N fertilizer without hurting lint yields. The reduced input costs from improved fertilizer efficiency would help keep cotton farmers competitive in the world marketplace. Additionally, residual nitrate (NO3) can be leached to groundwater and impact water quality. Improving N fertilizer use efficiency therefore also protects the environment of the West Texas region.
Timing of N application is an important management tool that can improve N use efficiency in cotton. Norton and Silvertooth (1998) reported a reduction in the N fertilizer needed and increased N use efficiency when pre-plant N was avoided in irrigated cotton in Arizona. Based on that research, the Cooperative Extension of the University of Arizona states that the main window for N applications to cotton is centered at peak bloom, or about 2200 heat units (base 60°F), when the rate of N uptake in cotton is apparently at its maximum (Silvertooth, 2001). Previous research conducted in this area has indicated that timing N fertilizer injections in SDI cotton systems based on canopy reflectance assessments of in-season N status can save up to 90 lb N/ac without hurting yields (Bronson et al., 2003; Chua et al., 2003). We also observed in earlier work that modifying the timing of in-season N applications, applying N only when chlorophyll meter readings were low, resulted in reduced N fertilizer applications and reduced residual soil NO3-N (Chua et al., 2003). However, more research is needed on basing the timing and rates of N fertilizer injections to SDI cotton on spectral reflectance. In the previous work (Chua et al., 2003), our SDI system was not set up for fertigation treatments, but our present SDI system is. In addition to the reflectance treatments and their associated reference treatments (i.e., 1.5 × soil-test N rate), we added a low, 0.5 × soil-test N rate to provide information on a wider range of N fertilizer inputs.
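The decision rule behind reflectance-based N timing can be sketched as a sufficiency-index comparison against a well-fertilized reference strip. This is an illustrative sketch, not the authors' algorithm; the 0.95 threshold and the use of NDVI as the canopy index are assumptions.

```python
# Illustrative sketch (not the study's exact method) of a reflectance-based
# fertigation trigger: compare a plot's canopy index to a well-fertilized
# reference treatment and inject N only when the plot falls below a
# sufficiency threshold. The 0.95 threshold is an assumption.

def should_inject_n(plot_ndvi, reference_ndvi, threshold=0.95):
    """Return True when the sufficiency index indicates in-season N stress."""
    sufficiency = plot_ndvi / reference_ndvi
    return sufficiency < threshold

print(should_inject_n(0.62, 0.70))  # 0.886 < 0.95 -> True, inject N
print(should_inject_n(0.69, 0.70))  # 0.986 -> False, skip this injection
```

Skipping injections whenever the canopy tracks the reference is what allows the reported fertilizer savings without a yield penalty.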
Use of GIS-Based Site-Specific Nitrogen Management for Improving Energy Efficiency
To our knowledge, geographical information system (GIS)-based site-specific nitrogen management (SSNM) techniques have not been used to assess agricultural energy costs and efficiency. This chapter uses SSNM case studies for corn (Zea mays L.) grown in Missouri and cotton (Gossypium hirsutum L.) grown in Texas. In five case studies, the impact of SSNM is compared with blanket N fertilizer recommendations. The five case studies investigate (1) the impact of N on energy produced in cotton production, (2) the impact of variable-rate N for cotton production based on soil nitrate and crop reflectance, (3) the feasibility of variable-rate N based on corn crop reflectance, (4) the use of corn management zones and crop reflectance for improving N recommendations and energy efficiency, and (5) the use of aerial photographs to improve N recommendations in corn.
Managing Crop Residue with Green Manure, Urea, and Tillage in a Rice–Wheat Rotation
Most double-crop grain farmers in South Asia remove or burn crop residue to facilitate seedbed preparation and to avoid possible yield reductions. This results in loss of soil organic matter (SOM) and nutrients. In this study, we determined whether incorporating wheat (Triticum aestivum L.) residue, rice (Oryza sativa L.) residue, and sesbania (Sesbania aculeata L.) green manure with urea fertilizer N in a rice–wheat cropping system can improve grain yields, N use efficiency, and SOM. We incorporated wheat residue (6 Mg ha⁻¹, C/N = 94), rice residue (6 Mg ha⁻¹, C/N = 63), or both, with and without green manure (20 or 40 Mg fresh ha⁻¹, C/N = 19), in a field experiment with irrigated rice and wheat grown each year in rotation on a Tolewal sandy loam (Typic Ustochrept) in the Punjab of India. Rice and wheat residue did not affect grain yields of wheat and rice, but residue incorporation did reduce the recovery efficiency of urea N and green manure N. Rice production was greater with wheat residue incorporation when an average of 86 kg N ha⁻¹ of a prescribed 120 kg N ha⁻¹ dose was applied as green manure N and the balance as urea N, versus 120 kg urea N ha⁻¹ alone. Despite a wider C/N ratio than rice residue, wheat residue additions to flooded rice resulted in greater C sequestration in soil than with rice residue or 40 Mg green manure ha⁻¹. These results demonstrate that a green manure crop and/or incorporating crop residue in a rice–wheat system has the potential to increase SOM while maintaining high grain yields.
Cotton Irrigation Scheduling Using a Crop Growth Model and FAO-56 Methods: Field and Simulation Studies
Crop growth simulation models can address a variety of agricultural problems, but their use to directly assist in-season irrigation management decisions is less common. Confidence in model reliability can be increased if models are shown to provide improved in-season management recommendations, which are explicitly tested in the field. The objective of this study was to compare the CSM-CROPGRO-Cotton model (with recently updated ET routines) to a well-tested FAO-56 irrigation scheduling spreadsheet by (1) using both tools to schedule cotton irrigation during 2014 and 2015 in central Arizona and (2) conducting a post-hoc simulation study to further compare outputs from these tools. Two replications of each irrigation scheduling treatment and a water-stressed treatment were established on a 2.6 ha field. Irrigation schedules were developed on a weekly basis and administered via an overhead lateral-move sprinkler irrigation system. Neutron moisture meters were used weekly to estimate soil moisture status and crop water use, and destructive plant samples were routinely collected to estimate cotton leaf area index (LAI) and canopy weight. Cotton yield was estimated using two mechanical cotton pickers with differing capabilities: (1) a two-row picker that facilitated manual collection of yield samples from 32 m² areas and (2) a four-row picker equipped with a sensor-based cotton yield monitoring system. In addition to statistical testing of field data via mixed models, the data were used for post-hoc reparameterization and fine-tuning of the irrigation scheduling tools. Post-hoc simulations were conducted to compare measured and simulated evapotranspiration, crop coefficients, root zone soil moisture depletion, cotton growth metrics, and yield for each irrigation treatment.
While total seasonal irrigation amounts were similar between the two scheduling tools, the crop model recommended more water during anthesis and less during the early season, which led to higher cotton fiber yield in both seasons (p < 0.05). The tools calculated cumulative evapotranspiration similarly, with root mean squared errors (RMSEs) less than 13%; however, FAO-56 crop coefficient (Kc) plots demonstrated subtle differences in daily evapotranspiration calculations. Root zone soil moisture depletion was better calculated by CSM-CROPGRO-Cotton, perhaps due to its more complex soil profile simulation; however, RMSEs for depletion always exceeded 20% for both tools and reached 149% for the FAO-56 spreadsheet in 2014. CSM-CROPGRO-Cotton simulated cotton LAI, canopy weight, canopy height, and yield with RMSEs less than 21%, while the FAO-56 spreadsheet had no capability for such outputs. Through field verification and thorough post-hoc data analysis, the results demonstrated that the CSM-CROPGRO-Cotton model with updated FAO-56 ET routines could match or exceed the accuracy and capability of an FAO-56 spreadsheet tool for cotton water use calculations and irrigation scheduling.
Cotton Incorporated
This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at [email protected]
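The root zone depletion that both tools track follows the FAO-56 daily water balance. The sketch below is a minimal, simplified version of that bookkeeping (ignoring runoff and capillary rise); the numeric values and the readily-available-water fraction p are illustrative, not from the study.

```python
# Minimal sketch of the FAO-56 daily root-zone water balance an irrigation
# scheduling spreadsheet implements: depletion Dr grows with crop ET and
# shrinks with rain and irrigation, bounded between 0 and total available
# water (TAW). Runoff and capillary rise are ignored; values are illustrative.

def update_depletion(dr_prev, etc, rain, irrigation, taw):
    """Advance root-zone depletion Dr (mm) by one day (simplified FAO-56 balance)."""
    dr = dr_prev + etc - rain - irrigation
    return min(max(dr, 0.0), taw)  # excess water drains; Dr cannot go negative

# Irrigate when Dr exceeds readily available water (RAW = p * TAW)
taw, p = 150.0, 0.65
dr = update_depletion(90.0, etc=8.0, rain=0.0, irrigation=0.0, taw=taw)
print(dr, dr > p * taw)  # 98.0 True -> schedule an irrigation
```

A weekly schedule of the kind described in the study amounts to running this update each day and flagging the days on which depletion crosses the RAW threshold.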
Growth, water use, and crop coefficients of direct-seeded guayule with furrow and subsurface drip irrigation in Arizona
Crop establishment costs of guayule (Parthenium argentatum A. Gray), a perennial desert shrub that produces natural rubber, can be significantly reduced using direct seeding rather than the traditional practice of transplanting greenhouse-grown seedlings. However, information regarding irrigation application, crop evapotranspiration (ETc), and crop coefficients (Kc) for managing direct-seeded guayule crops has not been available. In this study, guayule was direct-seeded in Apr. 2018 in fields at two locations in Arizona, Maricopa (sandy loam soil) and Eloy (clay soil), and harvested 23–24 months later in 2020. At each location, five irrigation rates were applied with subsurface drip irrigation (SDI), ranging from 50 to 150% replacement of ETc (denoted as treatments D50 to D150). A sixth treatment using furrow irrigation at 100% ETc replacement (F100) was included. Treatments were replicated three times. The ETc was estimated for the first 74–84 days of crop establishment; thereafter, actual ETc (ETc act) was determined weekly to biweekly for the D100 and F100 treatments using a soil water balance. The objectives were to evaluate the responses of dry biomass (DB), rubber yield (RY), and resin yield (ReY) to water application rate, develop irrigation management criteria for the two soil types, and determine the ETc and crop coefficients for the 100% treatments. The total irrigation applied across treatments ranged from 1830–1910 mm to 5090–5470 mm and averaged 3590 and 3320 mm for the 100% SDI (D100) and furrow (F100) treatments at Maricopa and Eloy, respectively. The summed estimated ETc plus ETc act for the D100 and F100 treatments was 3663 and 3506 mm, respectively, at Maricopa and 3428 and 3320 mm, respectively, at Eloy. Average measured mid-season Kc in the first year varied from 1.20 to 1.26. Average measured mid-season Kc in the second year was higher for D100 (≈1.30) than for F100 (≈1.23).
Adjusted to the standard climate proposed in FAO-56, mid-season Kc values in the second year are 1.24 for D100 and 1.17 for F100. Average DB at Eloy (28.6 Mg ha⁻¹) was not significantly higher than at Maricopa (24.0 Mg ha⁻¹). However, RY and ReY were both significantly higher at Maricopa. At each location, rubber content was significantly higher for F100 and the two lowest SDI rates than for the other treatments. The highest mean RY and ReY were achieved with D100 at Maricopa and D75 at Eloy. These two treatments also had significantly greater water productivity (WP; DB, RY, and ReY per unit of total water applied) than those at higher SDI rates and the F100 treatments. RY and ReY and their WP were generally higher for D100 than F100 in the sandy loam but not in the clay soil. For direct-seeded guayule in clay soils, furrow irrigation should be considered due to the lower rubber content and higher costs associated with SDI.
24 month embargo; available online 14 July 2021
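The mid-season crop coefficients reported above come from a soil water balance. The sketch below illustrates that derivation in its simplest form; the interval values plugged in are fabricated for illustration and the function name is an assumption, but the relation Kc = ETc act / ETo is the standard one.

```python
# Sketch of deriving a crop coefficient from a simple soil water balance
# over a measurement interval: actual ETc is inferred from inputs minus
# storage change and drainage, then divided by reference ET. Numbers below
# are illustrative, not measurements from the study.

def kc_from_water_balance(irrigation, rain, delta_storage, drainage, eto):
    """Interval Kc: ETc_act = I + P - (change in storage) - drainage, over reference ET."""
    etc_act = irrigation + rain - delta_storage - drainage
    return etc_act / eto

# One week: 70 mm irrigation, no rain, soil storage rose 5 mm,
# 2 mm drainage, 50 mm cumulative reference ET
print(round(kc_from_water_balance(70, 0, 5, 2, 50), 2))  # 1.26
```

Repeating this weekly to biweekly over the season, as done for the D100 and F100 treatments, yields the seasonal Kc curve from which the mid-season averages are taken.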
Estimation of direct-seeded guayule cover, crop coefficient, and yield using UAS-based multispectral and RGB data
Guayule (Parthenium argentatum A. Gray), a perennial desert shrub, produces high-quality natural rubber and is targeted as a domestic natural rubber source in the U.S. While commercialization efforts for guayule are ongoing, crop management requires plant growth monitoring, irrigation requirement assessment, and final yield estimation. Such assistance for guayule management could be provided by remote sensing (RS) data. In this study, field and RS data, collected via drones, from a 2-year guayule irrigation experiment conducted at Maricopa, Arizona, were evaluated. In-season field measurements included fractional canopy cover (fc), basal (Kcb) and single (Kc) crop coefficients, and final yields of dry biomass (DB), rubber (RY), and resin (ReY). The objectives of this paper were to compare vegetation indices from multispectral (MS) data (NDVI) and RGB data (triangular greenness index, TGI) and to derive linear prediction models for estimating fc, Kcb, Kc, and yield as functions of the MS and RGB indices. The NDVI and TGI showed similar seasonal trends and were correlated at a coefficient of determination (r²) of 0.52 and a root mean square error (RMSE) of 0.11. The prediction of measured fc as a linear function of NDVI (r² = 0.90) was better than by TGI (r² = 0.50). In contrast to TGI, the measured fc was highly correlated with estimated fc based on RGB image evaluation (r² = 0.96). Linear models of Kcb and Kc, developed over the two years of guayule growth, had similar r² values vs. NDVI (r² = 0.46 and 0.41, respectively) and vs. TGI (r² = 0.48 and 0.40, respectively). Final DB, RY, and ReY were predicted by both NDVI (r² = 0.75, 0.53, and 0.70, respectively) and TGI (r² = 0.72, 0.48, and 0.65, respectively).
The RS-based models enable estimation of irrigation requirements and yields in guayule production fields in the U.S.
NIFA
24 month embargo; available online: 11 February 2022
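The linear prediction models described above amount to ordinary least squares on paired field and drone observations. The sketch below shows that fitting step in pure Python; the paired fc/NDVI observations are fabricated for illustration, and only the method mirrors the study.

```python
# Hedged sketch of fitting a linear prediction model for fractional canopy
# cover (fc) as a function of NDVI via ordinary least squares. The paired
# observations below are fabricated; only the method mirrors the abstract.

def fit_linear(x, y):
    """Return (slope, intercept) of the least-squares line y ~ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

ndvi = [0.20, 0.35, 0.50, 0.65, 0.80]  # drone-derived NDVI (illustrative)
fc   = [0.10, 0.28, 0.47, 0.66, 0.85]  # measured canopy cover (illustrative)

slope, intercept = fit_linear(ndvi, fc)
predicted_fc = slope * 0.55 + intercept  # predict fc for a new NDVI reading
```

Once fitted on a season of paired observations, such a model lets NDVI maps stand in for destructive fc measurements across a production field.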
Smarca4 Inactivation Promotes Lineage-Specific Transformation and Early Metastatic Features in the Lung
SMARCA4/BRG1 encodes one of two mutually exclusive ATPases present in mammalian SWI/SNF chromatin remodeling complexes and is frequently mutated in human lung adenocarcinoma. However, the functional consequences of SMARCA4 mutation for tumor initiation, progression, and chromatin regulation in lung cancer remain poorly understood. Here, we demonstrate that loss of Smarca4 sensitizes club cell secretory protein-positive cells within the lung, in a cell type-dependent fashion, to malignant transformation and tumor progression, resulting in highly advanced dedifferentiated tumors and increased metastatic incidence. Consistent with these phenotypes, Smarca4-deficient primary tumors lack lung lineage transcription factor activities and resemble a metastatic cell state. Mechanistically, we show that Smarca4 loss impairs the function of all three classes of SWI/SNF complexes, resulting in decreased chromatin accessibility at lung lineage motifs and ultimately accelerating tumor progression. Thus, we propose that the SWI/SNF complex, via Smarca4, acts as a gatekeeper for lineage-specific cellular transformation and metastasis during lung cancer evolution.
SIGNIFICANCE: We demonstrate cell-type specificity in the tumor-suppressive functions of SMARCA4 in the lung, pointing toward a critical role of the cell-of-origin in driving SWI/SNF-mutant lung adenocarcinoma. We further show the direct effects of SMARCA4 loss on SWI/SNF function and chromatin regulation that cause aggressive malignancy during lung cancer evolution.
This article is highlighted in the In This Issue feature, p. 275