A simulation study of corn production and low-temperature drying for Central Iowa
Two computer models, CORNSIM and FALDRY, were developed and used to conduct a simulation study of corn production and low-temperature drying for Central Iowa conditions. CORNSIM simulates a complete corn production enterprise: given a specific management strategy, machinery capacity, and cropping season, it simulates planting, crop development, yield, and harvesting, and was developed to provide the simulated flow of harvested grain. FALDRY simulates a system of low-temperature corn drying bins. Its inputs include bin specifications, weather data, and the incoming flow of harvested grain. FALDRY was developed to predict the success or failure of a low-temperature corn drying system and the amount of electrical energy it would consume.

The simulation study addressed three major objectives: (1) determine the relative benefits of additional fan power versus the use of supplemental heat to enhance the performance of a low-temperature drying system; (2) test the feasibility of designing a low-temperature drying system to match the corn harvesting capacity of a typical Central Iowa farming enterprise; and (3) develop the optimum daily filling strategy and design recommendations for a low-temperature corn drying system for Central Iowa.

The following conclusions are given for Central Iowa conditions; the findings are expected to be similar for most of the Corn Belt. (1) CORNSIM is a valid simulation model of a corn production enterprise and can be used to determine the relative effects of changes in production strategy. (2) FALDRY is a valid simulation model of a low-temperature corn drying system; it has the flexibility to accommodate layer-filling and is able to predict the grain moisture profile. (3) The most efficient way to use electrical energy to increase drying rate, improve the probability of drying in the fall, and reduce grain deterioration is to increase fan power. (4) If the daily harvest rate is 1/16 or less of the total production, it is feasible to match the capacities of the harvesting and low-temperature drying systems. (5) The following design criteria and management strategy are recommended for low-temperature corn drying systems: (a) a maximum grain depth of 16-20 feet; (b) a fan-power-to-grain ratio of 1.25 to 1.75 horsepower per 1000 bushels of bin storage capacity; (c) a high-performance axial fan (a large bin may require two or more fans); and (d) the controlled-filling strategy.
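The fan-sizing recommendation in conclusion (5b) is a simple ratio of fan power to bin storage capacity. The sketch below is purely illustrative and is not part of CORNSIM or FALDRY; the function name and the example bin size are hypothetical.

```python
# Minimal sketch (not part of CORNSIM or FALDRY): apply the recommended
# fan-power-to-grain ratio of 1.25-1.75 hp per 1000 bushels of storage capacity.

def recommended_fan_power(bin_capacity_bu: float) -> tuple[float, float]:
    """Return the (low, high) recommended fan power in horsepower
    for a bin of the given storage capacity in bushels."""
    low_ratio, high_ratio = 1.25, 1.75   # hp per 1000 bu (design criterion b)
    return (bin_capacity_bu / 1000 * low_ratio,
            bin_capacity_bu / 1000 * high_ratio)

if __name__ == "__main__":
    # Example: a 20,000-bushel bin would call for roughly 25-35 hp of fan power,
    # likely split across two or more axial fans (design criterion c).
    low, high = recommended_fan_power(20_000)
    print(f"Recommended fan power: {low:.0f}-{high:.0f} hp")
```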
INVESTMENT ANALYSIS OF ALTERNATIVE FRUIT TREE SPRAYERS IN MICHIGAN ORCHARDS
Changing orchard sprayer technology and rising pesticide costs for fruit growers raise the need to analyze the profitability of alternative sprayer investments. This study analyzes investments in four orchard sprayers for use in Michigan apple production: an air blast sprayer, a tower boom sprayer, a tower boom sprayer equipped with electronic sensors that activate spray nozzles when foliage is detected, and an air curtain sprayer that targets spray with a layer of forced air. Assuming equal pest control efficacy, the study calculates the annualized net present cost per acre of owning and operating each sprayer for ten years, using a baseline discount rate of 10 percent, over 200 acres of semi-dwarf apple trees. The annualized net present cost per acre ranged from $312 for the tower boom sprayer with electronic sensors (least) to $391 for the conventional air blast sprayer (greatest). Sensitivity analysis revealed that the ranking of these cost results was sensitive to farm size, but not to the percentage of funds borrowed, the discount rate, the loan interest rate, or pesticide costs within the ranges investigated. The air curtain sprayer was lowest cost for orchards of 25 acres or more; the conventional air blast sprayer was lowest cost for 10-acre orchards.
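As an illustration of the cost framework under the stated baseline (10 percent discount rate, ten-year horizon, 200 acres), a minimal sketch of one way to compute the annualized net present cost per acre is given below. The function name and all input figures are placeholders, not the study's data.

```python
# Illustrative sketch of the annualized net present cost (NPC) per acre
# calculation described above. The cost figures below are placeholders,
# not the study's data; only the structure of the calculation is shown.

def annualized_npc_per_acre(purchase_price: float,
                            annual_operating_cost: float,
                            salvage_value: float,
                            acres: float,
                            discount_rate: float = 0.10,
                            years: int = 10) -> float:
    """Annualized net present cost per acre of owning and operating a sprayer."""
    r, n = discount_rate, years
    # Net present cost: purchase price plus discounted operating costs,
    # less the discounted salvage value at the end of the planning horizon.
    npc = purchase_price
    npc += sum(annual_operating_cost / (1 + r) ** t for t in range(1, n + 1))
    npc -= salvage_value / (1 + r) ** n
    # Capital recovery factor converts the NPC into an equivalent annual cost.
    crf = r * (1 + r) ** n / ((1 + r) ** n - 1)
    return npc * crf / acres

# Example with placeholder values for a 200-acre orchard:
print(round(annualized_npc_per_acre(25_000, 45_000, 3_000, 200), 2))
```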
Robust Detection and Genotyping of Single Feature Polymorphisms from Gene Expression Data
Affymetrix microarrays are widely used to predict genome-wide gene expression and genome-wide genetic polymorphisms from RNA and genomic DNA hybridization experiments, respectively. It has recently been proposed to integrate the two predictions by using RNA microarray data only. Although the ability to detect single feature polymorphisms (SFPs) from RNA microarray data has many practical implications for genome study in both sequenced and unsequenced species, it raises substantial challenges for the statistical modelling and analysis of microarray gene expression data. Several methods have been proposed to predict SFPs from gene expression profiles, but their performance is highly vulnerable to differential gene expression: the SFPs they predict end up reflecting differentially expressed genes rather than genuine sequence polymorphisms. To address this problem, we developed a novel statistical method that separates, within the perfect-match hybridization values of Affymetrix gene expression data, the binding affinity between a transcript and its targeting probe from the parameter measuring transcript abundance. We implemented a Bayesian approach to detect SFPs and to genotype a segregating population at the detected SFPs. Based on analysis of three Affymetrix microarray datasets, we demonstrate that the present method confers significantly improved robustness and accuracy in detecting SFPs that carry genuine sequence polymorphisms, compared with rival methods in the literature. The method developed in this paper will provide experimental genomicists with advanced analytical tools for appropriate and efficient analysis of their microarray experiments, and biostatisticians with insightful interpretation of Affymetrix microarray data.
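The key modelling step is that, on the log scale, a perfect-match intensity can be viewed as the sum of a probe-specific binding affinity and a sample-specific transcript abundance; probes whose residuals separate by genotype then become SFP candidates. The sketch below is a simplified, non-Bayesian illustration of that decomposition, not the authors' implementation; the function name is hypothetical.

```python
# Simplified illustration (not the paper's Bayesian model) of separating
# probe-specific binding affinity from sample-specific transcript abundance
# in a matrix of log2 perfect-match (PM) intensities for one probe set.
import numpy as np

def decompose_pm(log_pm: np.ndarray, n_iter: int = 10):
    """Median-polish-style fit of log_pm[i, j] ~ affinity[i] + abundance[j].

    Rows are probes, columns are samples. Returns (affinity, abundance,
    residuals); probes whose residuals cluster by genotype are SFP candidates.
    """
    affinity = np.zeros(log_pm.shape[0])
    abundance = np.zeros(log_pm.shape[1])
    resid = log_pm.copy()
    for _ in range(n_iter):
        row_med = np.median(resid, axis=1)   # probe effects
        affinity += row_med
        resid -= row_med[:, None]
        col_med = np.median(resid, axis=0)   # sample effects
        abundance += col_med
        resid -= col_med[None, :]
    return affinity, abundance, resid
```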
A VLP-based vaccine targeting domain III of the West Nile virus E protein protects from lethal infection in mice
Background. Since its first appearance in the USA in 1999, West Nile virus (WNV) has spread in the Western hemisphere and continues to represent an important public health concern. In the absence of effective treatment, there is a medical need for the development of a safe and efficient vaccine. Live attenuated WNV vaccines have shown promise in preclinical and clinical studies but might carry inherent risks due to the possibility of reversion to more virulent forms. Subunit vaccines based on the large envelope (E) glycoprotein of WNV have therefore been explored as an alternative approach. Although these vaccines were shown to protect from disease in animal models, multiple injections and/or strong adjuvants were required to reach efficacy, underscoring the need for more immunogenic, yet safe DIII-based vaccines.

Results. We produced a conjugate vaccine against WNV consisting of recombinantly expressed domain III (DIII) of the E glycoprotein chemically cross-linked to virus-like particles derived from the recently discovered bacteriophage AP205. In contrast to isolated DIII protein, which required three administrations to induce detectable antibody titers in mice, high titers of DIII-specific antibodies were induced after a single injection of the conjugate vaccine. These antibodies were able to neutralize the virus in vitro and provided partial protection from a challenge with a lethal dose of WNV. Three injections of the vaccine induced high titers of virus-neutralizing antibodies and completely protected mice from WNV infection.

Conclusions. The immunogenicity of DIII can be strongly enhanced by conjugation to virus-like particles of the bacteriophage AP205. The superior immunogenicity of the conjugate vaccine with respect to other DIII-based subunit vaccines, its anticipated favourable safety profile, and its low production costs highlight its potential as an efficacious and cost-effective prophylaxis against WNV.
Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV
The performance of muon reconstruction, identification, and triggering in CMS
has been studied using 40 inverse picobarns of data collected in pp collisions
at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection
criteria covering a wide range of physics analysis needs have been examined.
For all considered selections, the efficiency to reconstruct and identify a
muon with a transverse momentum pT larger than a few GeV is above 95% over the
whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4,
while the probability to misidentify a hadron as a muon is well below 1%. The
efficiency to trigger on single muons with pT above a few GeV is higher than
90% over the full eta range, and typically substantially better. The overall
momentum scale is measured to a precision of 0.2% with muons from Z decays. The
transverse momentum resolution varies from 1% to 6% depending on pseudorapidity
for muons with pT below 100 GeV and, using cosmic rays, it is shown to be
better than 10% in the central region up to pT = 1 TeV. Observed distributions
of all quantities are well reproduced by the Monte Carlo simulation.
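The momentum-scale figure quoted above relies on comparing the reconstructed dimuon invariant mass peak in Z to mu mu events with the known Z boson mass. The sketch below is a generic illustration of that mass calculation (massless-muon approximation) and is not part of the CMS software; the function name and example kinematics are hypothetical.

```python
# Illustrative only (not CMS software): the momentum-scale check with Z -> mu mu
# events rests on the dimuon invariant mass, computed here from the two muons'
# transverse momenta, pseudorapidities, and azimuthal angles.
import math

def dimuon_mass(pt1, eta1, phi1, pt2, eta2, phi2):
    """Invariant mass of a muon pair, neglecting the muon mass."""
    return math.sqrt(2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2)))

# Comparing the reconstructed mass peak of Z -> mu mu candidates with the known
# Z boson mass constrains the overall muon momentum scale.
print(round(dimuon_mass(45.0, 0.3, 0.1, 44.0, -0.5, 3.0), 1))
```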
Azimuthal anisotropy of charged particles at high transverse momenta in PbPb collisions at sqrt(s[NN]) = 2.76 TeV
The azimuthal anisotropy of charged particles in PbPb collisions at
nucleon-nucleon center-of-mass energy of 2.76 TeV is measured with the CMS
detector at the LHC over an extended transverse momentum (pt) range up to
approximately 60 GeV. The data cover both the low-pt region associated with
hydrodynamic flow phenomena and the high-pt region where the anisotropies may
reflect the path-length dependence of parton energy loss in the created medium.
The anisotropy parameter (v2) of the particles is extracted by correlating
charged tracks with respect to the event-plane reconstructed by using the
energy deposited in forward-angle calorimeters. For the six bins of collision
centrality studied, spanning the range of 0-60% most-central events, the
observed v2 values are found to first increase with pt, reaching a maximum
around pt = 3 GeV, and then to gradually decrease to almost zero, with the
decline persisting up to at least pt = 40 GeV over the full centrality range
measured.
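As context for the event-plane extraction described above, v2 is the average of cos 2(phi - Psi_EP) over selected tracks, corrected by the event-plane resolution. The sketch below is a generic illustration of that formula and is not the CMS analysis code; the function name and inputs are hypothetical.

```python
# Generic illustration of the event-plane method for v2 (not CMS analysis code).
# v2_obs = < cos 2(phi - Psi_EP) > over charged tracks, then corrected by the
# event-plane resolution R so that v2 = v2_obs / R.
import numpy as np

def v2_event_plane(track_phis: np.ndarray, psi_ep: float, resolution: float) -> float:
    """Anisotropy parameter v2 from track azimuthal angles and the event plane.

    track_phis : azimuthal angles of selected charged tracks (radians)
    psi_ep     : second-order event-plane angle from forward calorimeters (radians)
    resolution : event-plane resolution correction factor R (0 < R <= 1)
    """
    v2_observed = np.mean(np.cos(2.0 * (track_phis - psi_ep)))
    return v2_observed / resolution
```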
Search for new physics with same-sign isolated dilepton events with jets and missing transverse energy
A search for new physics is performed in events with two same-sign isolated
leptons, hadronic jets, and missing transverse energy in the final state. The
analysis is based on a data sample corresponding to an integrated luminosity of
4.98 inverse femtobarns produced in pp collisions at a center-of-mass energy of
7 TeV collected by the CMS experiment at the LHC. This constitutes a factor of
140 increase in integrated luminosity over previously published results. The
observed yields agree with the standard model predictions and thus no evidence
for new physics is found. The observations are used to set upper limits on
possible new physics contributions and to constrain supersymmetric models. To
facilitate the interpretation of the data in a broader range of new physics
scenarios, information on the event selection, detector response, and
efficiencies is provided.