A CLASSIFICATION OF UNREPLICATED FACTORIAL EXPERIMENTS FOR USE WITH THE ANALYSIS OF DETERMINISTIC SIMULATION MODELS
Deterministic simulation models are important in agricultural applications, and their use is becoming increasingly common. Statistical procedures that interpret the output and evaluate the performance of deterministic models are therefore necessary. Because deterministic computer simulation experiments cannot be replicated, several procedures developed for unreplicated factorial experiments become applicable. We discuss a classification scheme that selects an appropriate technique for most deterministic simulation experiments. The value of these techniques lies in their ability to estimate the experimental error variance for unreplicated computer experiments. Using these error estimates, model developers and practitioners can more thoroughly analyze their deterministic simulation experiments.
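One well-known technique of this kind is Lenth's pseudo standard error (PSE), which estimates effect variability directly from the factorial effect contrasts; the sketch below is illustrative and is not necessarily the specific technique the classification scheme would select:

```python
from statistics import median

def lenth_pse(contrasts):
    """Estimate error variability for an unreplicated factorial
    via Lenth's pseudo standard error (PSE).

    contrasts: estimated factorial effect contrasts.
    """
    abs_c = [abs(c) for c in contrasts]
    # Initial robust scale estimate from all effects.
    s0 = 1.5 * median(abs_c)
    # Trim effects that appear active, then re-estimate the scale.
    trimmed = [a for a in abs_c if a < 2.5 * s0]
    return 1.5 * median(trimmed)
```

Effects whose absolute contrast exceeds a t-quantile multiple of the PSE are then declared active; the remainder supply the error estimate.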
Scalar radius of the pion in the Kroll-Lee-Zumino renormalizable theory
The Kroll-Lee-Zumino renormalizable Abelian quantum field theory of pions and
a massive rho-meson is used to calculate the scalar radius of the pion at
next-to-leading (one-loop) order in perturbation theory. Due to
renormalizability, this determination involves no free parameters. The
resulting scalar radius fixes $\bar{\ell}_4$, the low-energy constant of
chiral perturbation theory, as well as the ratio $F_\pi/F$, where $F$ is the
pion decay constant in the chiral limit. Given the level of accuracy in
the masses and the coupling, the only sizable uncertainty in this
result is due to the (uncalculated) NNLO contribution.
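For orientation, the standard one-loop chiral perturbation theory relation connecting the scalar radius to $\bar{\ell}_4$ is (quoted from the ChPT literature; the numerical values elided in the abstract are not reconstructed here):

$$\langle r^2 \rangle_s^\pi \;=\; \frac{3}{8\pi^2 F_\pi^2}\left(\bar{\ell}_4 - \frac{13}{12}\right)$$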
Pion form factor in the Kroll-Lee-Zumino model
The renormalizable Abelian quantum field theory model of Kroll, Lee, and
Zumino is used to compute the one-loop vertex corrections to the tree-level,
Vector Meson Dominance (VMD) pion form factor. These corrections, together with
the known one-loop vacuum polarization contribution, lead to a substantial
improvement over VMD. The resulting pion form factor in the space-like region
is in excellent agreement with data in the whole range of accessible momentum
transfers. The time-like form factor, known to reproduce the Gounaris-Sakurai
formula at and near the rho-meson peak, is unaffected by the vertex correction
at order $g_{\rho\pi\pi}^2$.
Comment: Revised version corrects a misprint in Eq. (1).
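For reference, the tree-level Vector Meson Dominance form factor that these one-loop corrections improve upon is the standard single-pole expression:

$$F_\pi^{\mathrm{VMD}}(q^2) \;=\; \frac{M_\rho^2}{M_\rho^2 - q^2}$$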
Mobile Computing in Physics Analysis - An Indicator for eScience
This paper presents the design and implementation of a Grid-enabled physics
analysis environment for handheld and other resource-limited computing devices
as one example of the use of mobile devices in eScience. Handheld devices offer
great potential because they provide ubiquitous access to data and
round-the-clock connectivity over wireless links. Our solution aims to provide
users of handheld devices with the capability to launch heavy computational
tasks on computational and data Grids, monitor job status during execution, and
retrieve results after job completion. Users carry their jobs on their handheld
devices in the form of executables (and associated libraries). Users can
transparently view the status of their jobs and get back their outputs without
having to know where they are being executed. In this way, our system is able
to act as a high-throughput computing environment where devices ranging from
powerful desktop machines to small handhelds can employ the power of the Grid.
The results shown in this paper are readily applicable to the wider eScience
community.
Comment: 8 pages, 7 figures. Presented at the 3rd Int. Conf. on Mobile Computing & Ubiquitous Networking (ICMU06), London, October 2006.
Design Patterns for Description-Driven Systems
In data modelling, product information has most often been handled separately
from process information. The integration of product and process models in a
unified data model could provide the means by which information could be shared
across an enterprise throughout the system lifecycle from design through to
production. Recently, attempts have been made to integrate these two separate
views of systems through identifying common data models. This paper relates
description-driven systems to multi-layer architectures and reveals where
existing design patterns facilitate the integration of product and process
models and where patterns are missing or where existing patterns require
enrichment for this integration. It reports on the construction of a so-called
description-driven system which integrates Product Data Management (PDM) and
Workflow Management (WfM) data models through a common meta-model.
Comment: 14 pages, 13 figures. Presented at the 3rd Enterprise Distributed Object Computing (EDOC'99) conference, Mannheim, Germany, September 1999.
INFORMATION TECHNOLOGIES AND THE DESIGN AND ANALYSIS OF SITE-SPECIFIC EXPERIMENTS WITHIN COMMERCIAL COTTON FIELDS
Information products derived from multi-spectral remote sensing images, LIDAR elevations, or data products from other sensor systems (soil electrical conductivity measurements, yield monitors, etc.) characterize potential crop productivity by mapping biophysical aspects of cropland variability. These sensor systems provide spectral, spatial, and temporal measurements at resolutions and accuracies that describe in-field physical variability as well as management practices, from cropland preparation and the selection of crop cultivars to variable-rate applications of inputs. In addition, DGPS-equipped (differential global positioning system) harvesters monitor yield response at closely spaced, georeferenced points. Geographic information system and image processing techniques fuse these diverse information sources to spatially characterize cropland, describe management practices, and quantify the variable yield response. Following fusion, the effectiveness of spatially applied management practices may be evaluated by designed experiments that assess impacts on yield arising from geo-referenced relationships between (1) uncontrollable spatial components (the environment) and (2) controllable management practices (cultivar selection, fertility management, and applications of herbicides, insecticides, plant growth regulators, etc.). Such experiments can be designed because farming equipment can be computer-controlled through DGPS, giving farmers the ability to change applied treatments continuously across many farming operations. A mixed linear model, involving both uncontrollable and controllable management attributes attached as spatial descriptors to yield-monitor points, evaluates the effects of management practices on yield. An example based upon cotton production demonstrates the methodology. Additional strategies for designing studies involving spatial information in commercial cotton fields are discussed.
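The mixed linear model described above can be written in the standard form (generic notation, not taken from the paper):

$$\mathbf{y} = X\boldsymbol{\beta} + Z\mathbf{u} + \boldsymbol{\varepsilon}, \qquad \mathbf{u} \sim N(\mathbf{0}, G), \quad \boldsymbol{\varepsilon} \sim N(\mathbf{0}, R),$$

where $\mathbf{y}$ holds the yield-monitor observations, the fixed effects $\boldsymbol{\beta}$ carry the controllable management attributes, and the random effects $\mathbf{u}$ absorb the uncontrollable spatial components.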
APPLICATION OF COMPUTER INTENSIVE METHODS TO EVALUATE THE PERFORMANCE OF A SAMPLING DESIGN FOR USE IN COTTON INSECT PEST MANAGEMENT
A scouting protocol for cotton insect pests was developed which combines high-resolution, multispectral remotely sensed imagery with a belt transect that crosses rows of cotton. Imagery was used to determine sample site selection while estimating plant bug abundance in a cotton field of more than 200 acres in 1997. Tarnished plant bug (Lygus lineolaris) counts were acquired using a standard drop cloth for each of eight rows along a transect. The sample data indicated that plant bug population densities vary spatially as a function of the different spectral (color) classes present on the imagery. We postulate that such classified images correlate with differences in crop phenology, and that plant bug populations (especially from early to mid-season) aggregate according to these habitat differences. Therefore, the population dynamics of Lygus, and possibly other species, can be better understood by combining the transect-based sampling plan with remotely sensed imagery. To verify and validate this claim, a computer-intensive approach was used to simulate the performance of different sampling plans. The comparison is accomplished with a combinatorial algorithm that exhaustively enumerates the original data into unique subsets. These subsets correspond to results that could be expected from traditional or alternative sampling plans and are compared to results from the candidate plan actually used. The results of the enumerative analysis show the benefit of multi-band remotely sensed imagery combined with large sample units to improve sampling efficiency, without the need for large sample sizes. Notably, the enumerative algorithm provided answers to these questions of interest without requiring additional fieldwork.
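The exhaustive enumeration step can be sketched as follows; the drop-cloth counts and subset size here are hypothetical, and the actual analysis compares full sampling plans rather than simple subset means:

```python
from itertools import combinations
from statistics import mean

def subset_mean_distribution(counts, k):
    """Enumerate every k-site subset of the transect counts and
    return the sampling distribution of the subset mean density."""
    return [mean(subset) for subset in combinations(counts, k)]

# Hypothetical drop-cloth counts from eight transect rows.
counts = [3, 7, 2, 9, 4, 6, 1, 8]
dist = subset_mean_distribution(counts, 4)
# A candidate plan's estimate can then be judged against this
# distribution, e.g. by its percentile rank among all subsets.
```

Because every possible subset is enumerated, the candidate plan's performance is assessed against the complete reference distribution rather than a Monte Carlo sample.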