
    A CLASSIFICATION OF UNREPLICATED FACTORIAL EXPERIMENTS FOR USE WITH THE ANALYSIS OF DETERMINISTIC SIMULATION MODELS

    Deterministic simulation models are important in agricultural applications and their use is becoming increasingly common. Therefore, statistical procedures that interpret the output and evaluate the performance of deterministic models are necessary. Because deterministic computer simulation experiments cannot be replicated, several procedures developed for unreplicated factorial experiments become applicable. We discuss a classification scheme that selects the correct technique for most deterministic simulation experiments. The value of these techniques is their capability to estimate the experimental error variance for unreplicated computer experiments. Using these estimates of error, model developers and practitioners can more thoroughly analyze their deterministic simulation experiments.
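
The error-variance estimation mentioned above can be illustrated with Lenth's pseudo standard error, one widely used technique for unreplicated factorials. The abstract does not name a specific method, so this is a minimal sketch with made-up effect estimates:

```python
import numpy as np

def lenth_pse(effects):
    """Lenth's pseudo standard error for unreplicated factorial effects.

    Estimates the error scale from the effect estimates themselves,
    assuming most effects are null (effect sparsity).
    """
    abs_e = np.abs(np.asarray(effects, dtype=float))
    s0 = 1.5 * np.median(abs_e)            # initial robust scale estimate
    trimmed = abs_e[abs_e < 2.5 * s0]      # drop likely-active effects
    return 1.5 * np.median(trimmed)

# Hypothetical effect estimates from a 2^4 unreplicated design:
# mostly noise, with the first and fourth effects clearly active.
effects = [21.0, 1.2, -0.8, 9.5, 0.3, -1.1, 0.6, 0.9,
           -0.4, 1.0, -0.7, 0.2, 0.5, -0.9, 0.8]
pse = lenth_pse(effects)
for name, e in zip(["A", "B", "C", "AB"], effects[:4]):
    print(f"{name}: t-like ratio = {e / pse:.1f}")
```

Effects whose ratio to the PSE is large are flagged as active; the PSE plays the role of the error standard deviation that replication would normally provide.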

    Scalar radius of the pion in the Kroll-Lee-Zumino renormalizable theory

    The Kroll-Lee-Zumino renormalizable Abelian quantum field theory of pions and a massive rho-meson is used to calculate the scalar radius of the pion at next-to-leading (one-loop) order in perturbation theory. Due to renormalizability, this determination involves no free parameters. The result is ⟨r²⟩_s = 0.40 fm². This value gives ℓ̄_4 = 3.4 for the corresponding low-energy constant of chiral perturbation theory, and F_π/F = 1.05, where F is the pion decay constant in the chiral limit. Given the level of accuracy in the masses and the ρππ coupling, the only sizable uncertainty in this result is due to the (uncalculated) NNLO contribution.
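
For context, the three quoted numbers are tied together by the standard one-loop chiral perturbation theory relations (not stated in the abstract; these are textbook results):

```latex
\langle r^2\rangle_s
  = \frac{3}{8\pi^2 F_\pi^2}\left(\bar{\ell}_4 - \frac{13}{12}\right),
\qquad
\frac{F_\pi}{F}
  = 1 + \frac{M_\pi^2}{16\pi^2 F_\pi^2}\,\bar{\ell}_4 + \mathcal{O}(M_\pi^4).
```

With ℓ̄_4 = 3.4 and F_π ≈ 92 MeV, the first relation gives ⟨r²⟩_s ≈ 0.40 fm² and the second gives F_π/F ≈ 1.05, consistent with the values quoted above.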

    Design Patterns for Description-Driven Systems

    In data modelling, product information has most often been handled separately from process information. The integration of product and process models in a unified data model could provide the means by which information could be shared across an enterprise throughout the system lifecycle, from design through to production. Recently, attempts have been made to integrate these two separate views of systems through identifying common data models. This paper relates description-driven systems to multi-layer architectures and reveals where existing design patterns facilitate the integration of product and process models, where patterns are missing, and where existing patterns require enrichment for this integration. It reports on the construction of a so-called description-driven system which integrates Product Data Management (PDM) and Workflow Management (WfM) data models through a common meta-model. Comment: 14 pages, 13 figures. Presented at the 3rd Enterprise Distributed Object Computing (EDOC'99) conference, Mannheim, Germany, September 1999
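
The description-driven idea, in which base-level items get their structure from meta-level descriptions that are themselves ordinary runtime data, can be sketched as follows (a hypothetical minimal model, not the paper's actual PDM/WfM meta-model):

```python
class Description:
    """Meta-level object: describes the structure of items one layer below."""
    def __init__(self, name, fields):
        self.name = name
        self.fields = fields  # field names the described items must carry

    def instantiate(self, **values):
        missing = set(self.fields) - values.keys()
        if missing:
            raise ValueError(f"missing fields: {missing}")
        return Item(self, values)

class Item:
    """Base-level object: its structure is defined by its description."""
    def __init__(self, description, values):
        self.description = description
        self.values = values

# Product and process (workflow) models share the same meta-model:
# both are just descriptions interpreted at run time, so new kinds of
# parts or workflow steps can be added without changing the code.
part_desc = Description("Part", ["part_id", "material"])
step_desc = Description("WorkflowStep", ["step_id", "operation"])

part = part_desc.instantiate(part_id="P-42", material="Cu")
step = step_desc.instantiate(step_id="S-1", operation="assemble")
print(part.description.name, step.description.name)
```

The design choice this illustrates is that descriptions are data, not classes, so the system's schema can evolve at run time rather than at compile time.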

    INFORMATION TECHNOLOGIES AND THE DESIGN AND ANALYSIS OF SITE-SPECIFIC EXPERIMENTS WITHIN COMMERCIAL COTTON FIELDS

    Information products derived from multi-spectral remote sensing images, LIDAR elevations, or data products from other sensor systems (soil electrical conductivity measurements, yield monitors, etc.) characterize potential crop productivity by mapping biophysical aspects of cropland variability. These sensor systems provide spectral, spatial, and temporal measurements at resolutions and accuracies describing the variability of in-field physical characteristics, including management practices from cropland preparation, selection of crop cultivars, and variable-rate applications of inputs. In addition, DGPS-equipped (differential global positioning system) harvesters monitor yield response at closely spaced, georeferenced points. Geographic information system and image processing techniques fuse diverse information sources to spatially characterize cropland, describe management practices, and quantify the variable yield response. Following fusion of information sources, the effectiveness of spatially applied management practices may be evaluated by designed experiments assessing impacts on yield caused by geo-referenced relationships between (1) uncontrollable spatial components (the environment) and (2) controllable management practices (cultivar selection, fertility management, herbicide, insecticide, and plant growth regulator applications, etc.). These kinds of experiments can be designed because farming equipment can be computer controlled through DGPS, giving farmers the ability to continuously change applied treatments for many farming operations. A mixed linear model involving both uncontrollable and controllable management attributes, attached as spatial descriptors to yield monitor points, evaluates effects of management practices on yield. An example based upon cotton production demonstrates the methodology. Additional strategies for designing studies in commercial cotton fields involving spatial information are discussed.
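
A rough sense of the analysis can be given with a fixed-effects approximation: regress synthetic yield-monitor points on one uncontrollable spatial descriptor and one controllable management attribute. The paper's actual analysis uses a mixed linear model with spatially correlated errors; this sketch omits that random component, and all numbers are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical yield-monitor points: each georeferenced observation
# carries an uncontrollable spatial descriptor (elevation) and a
# controllable management attribute (nitrogen rate).
n = 200
elevation = rng.normal(0.0, 1.0, n)             # uncontrollable environment
nitrogen = rng.choice([60.0, 90.0, 120.0], n)   # controllable treatment
yield_lb = 800 + 3.0 * nitrogen - 25.0 * elevation + rng.normal(0, 20, n)

# Ordinary least squares on the fused attributes; a mixed model would
# add a spatially correlated random effect on top of this.
X = np.column_stack([np.ones(n), nitrogen, elevation])
beta, *_ = np.linalg.lstsq(X, yield_lb, rcond=None)
print(f"nitrogen effect ≈ {beta[1]:.2f}, elevation effect ≈ {beta[2]:.1f}")
```

The estimated coefficients recover the simulated treatment and environment effects, which is the separation of controllable from uncontrollable influences that the designed experiments above aim for.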

    APPLICATION OF COMPUTER INTENSIVE METHODS TO EVALUATE THE PERFORMANCE OF A SAMPLING DESIGN FOR USE IN COTTON INSECT PEST MANAGEMENT

    A scouting protocol for cotton insect pests was developed which combines high-resolution, multispectral remotely sensed imagery with a belt transect that crosses rows of cotton. Imagery was used to determine sample site selection while estimating plant bug abundance in a cotton field of more than 200 acres in 1997. Tarnished plant bug (Lygus lineolaris) counts were acquired using a standard drop cloth for each of eight rows along a transect. The sample data indicated that plant bug population densities vary spatially as a function of different spectral (color) classes present on the imagery. We postulate that such classified images correlate to differences in crop phenology, and plant bug populations (especially from early to mid-season) aggregate themselves by these habitat differences. Therefore, the population dynamics of Lygus, and possibly other species, can be better understood by combining the transect-based sampling plan with remotely sensed imagery. To verify and validate this claim, a computer intensive approach was utilized to simulate the performance of different sampling plans. The comparison is accomplished with a combinatorial algorithm that exhaustively enumerates the original data into unique subsets. These subsets correspond to results that could be expected from traditional or alternative sampling plans and are compared to results from the candidate plan actually used. The results of the enumerative analysis show the benefit of multi-band, remotely sensed imagery combined with the use of large sample units to improve sampling efficiency, without the need for large sample sizes. It is of great benefit that the enumerative algorithm provided answers to questions of interest without having to complete additional fieldwork.
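
The enumerative idea can be sketched with a toy version: exhaustively form every subset of a fixed size from the transect counts and examine the resulting sampling distribution of the mean. The counts below are hypothetical; the actual study enumerates far larger designs:

```python
from itertools import combinations
from statistics import mean, pstdev

# Hypothetical drop-cloth plant bug counts from eight rows along a transect.
counts = [4, 6, 3, 7, 12, 10, 9, 5]

# Exhaustively enumerate every possible 4-row sample: each subset is one
# outcome an alternative sampling plan could have produced.
estimates = [mean(s) for s in combinations(counts, 4)]
true_mean = mean(counts)

print(f"subsets enumerated: {len(estimates)}")          # C(8, 4) = 70
print(f"true mean: {true_mean:.2f}")
print(f"estimator sd across all plans: {pstdev(estimates):.2f}")
```

Because every possible subset is scored against the known whole-transect value, sampling plans can be compared without any additional fieldwork, which is the point made above.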

    Intrinsic and extrinsic x-ray absorption effects in soft x-ray diffraction from the superstructure in magnetite

    We studied the (0 0 1/2) diffraction peak in the low-temperature phase of magnetite (Fe3O4) using resonant soft x-ray diffraction (RSXD) at the Fe-L2,3 and O-K resonances. We studied both molecular-beam-epitaxy (MBE) grown thin films and in-situ cleaved single crystals. From the comparison we have been able to determine quantitatively the contribution of intrinsic absorption effects, thereby arriving at a consistent result for the (0 0 1/2) diffraction peak spectrum. Our data also allow for the identification of extrinsic effects, e.g. a detailed modeling of the spectra in the case where a "dead" surface layer is present that absorbs photons but does not contribute to the scattering signal. Comment: to appear in Phys. Rev.
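
The extrinsic dead-layer effect can be sketched with a toy attenuation model: a surface layer absorbs the incident and scattered beams over the path length through it but contributes no diffraction signal. All numbers here are hypothetical, and the paper's quantitative modeling is more involved:

```python
import numpy as np

def dead_layer_attenuation(mu, d, theta_deg):
    """Attenuation of a diffraction signal by a non-scattering surface layer.

    A "dead" layer of thickness d absorbs the incoming and outgoing beams
    (path length d / sin(theta) each way at scattering angle theta) but
    adds nothing to the scattered intensity.
    """
    theta = np.radians(theta_deg)
    return np.exp(-2.0 * mu * d / np.sin(theta))

# Hypothetical numbers: absorption length 1/mu = 100 nm at resonance,
# a 5 nm dead layer, and a 30-degree scattering geometry.
factor = dead_layer_attenuation(mu=1 / 100.0, d=5.0, theta_deg=30.0)
print(f"signal reduced to {factor:.2f} of the dead-layer-free value")
```

Because mu varies strongly across a resonance, such a layer distorts the measured peak spectrum most exactly where absorption peaks, which is why separating intrinsic from extrinsic absorption matters.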