
    Seriation and Multidimensional Scaling: A Data Analysis Approach to Scaling Asymmetric Proximity Matrices

    A number of model-based scaling methods have been developed that apply to asymmetric proximity matrices. A flexible data analysis approach is proposed that combines two psychometric procedures: seriation and multidimensional scaling (MDS). The method uses seriation to define an empirical ordering of the stimuli, and then uses MDS to scale the two separate triangles of the proximity matrix defined by this ordering. The MDS solution contains directed distances, which define an "extra" dimension that would not otherwise be portrayed, because the dimension comes from relations between the two triangles rather than within triangles. The method is particularly appropriate for the analysis of proximities containing temporal information. A major difficulty is the computational intensity of existing seriation algorithms, which is handled by defining a nonmetric seriation algorithm that requires only one complete iteration. The procedure is illustrated using a matrix of co-citations between recent presidents of the Psychometric Society.
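    The two-stage pipeline the abstract describes (seriate, then scale each triangle separately) can be sketched as follows. This is a minimal illustration, not the paper's method: the one-pass nonmetric seriation algorithm is replaced here by a crude row-sum ordering, classical (Torgerson) MDS stands in for whatever MDS variant the authors use, and the toy matrix is random.

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS on a symmetric dissimilarity matrix."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    top = np.argsort(vals)[::-1][:k]
    scale = np.sqrt(np.maximum(vals[top], 0.0))
    return vecs[:, top] * scale

rng = np.random.default_rng(0)
P = rng.uniform(1, 10, size=(6, 6))          # toy asymmetric dissimilarity matrix
np.fill_diagonal(P, 0.0)

# Crude seriation stand-in: order stimuli by total dissimilarity
# (the paper instead uses a dedicated one-iteration nonmetric algorithm).
order = np.argsort(P.sum(axis=1))
P = P[np.ix_(order, order)]

# Scale the two triangles of the reordered matrix separately; the
# differences between the two solutions carry the directed information.
upper = np.triu(P, 1)
lower = np.tril(P, -1)
X_upper = classical_mds(upper + upper.T)
X_lower = classical_mds(lower + lower.T)
```

    Comparing `X_upper` and `X_lower` point by point then exposes the "extra" directed dimension that a single symmetric analysis would average away.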

    Electromagnetic vertex function of the pion at T > 0

    The matrix element of the electromagnetic current between pion states is calculated in quenched lattice QCD at a temperature of T = 0.93 T_c. The nonperturbatively improved Sheikholeslami-Wohlert action is used together with the corresponding O(a) improved vector current. The electromagnetic vertex function is extracted for pion masses down to 360 MeV and momentum transfers Q^2 ≀ 2.7 GeV^2. Comment: 17 pages, 8 figures

    The Effect of a Mycotoxin Deactivation Product on Growth of Juvenile Rainbow Trout Fed Distillers Dried Grains

    Distillers dried grains (DDG) with solubles (DDGS) is a product that has shown potential as a protein source for some fish species, but high inclusion rates of DDGS have not always been successfully achieved for Rainbow Trout Oncorhynchus mykiss. Our objective was to determine whether inclusion of a mycotoxin deactivation product (Biofix Plus) could improve the ability of high-protein DDG (HPDDG) to replace a portion of the fish meal in diets for Rainbow Trout. The 2 × 2 factorial feeding trial examined protein source (menhaden fish meal [MFM] or HPDDG) with or without Biofix Plus. A control diet (42% digestible protein, 20% crude lipid, 25% MFM) was compared to a test diet in which HPDDG replaced 12% of the total MFM on a digestible-protein basis (24% HPDDG inclusion). Diets were fed to juvenile Rainbow Trout (initial weight: mean ± SE = 30.5 ± 1.6 g) in four replicate tanks per treatment for 9 weeks in a 15°C recirculating system. At the conclusion of the feeding trial, we observed no negative effects of fish meal replacement on growth or feed conversion ratio; no benefit of Biofix Plus supplementation was observed. These data indicate that when Rainbow Trout diets containing a high-quality DDGS product are balanced for digestible protein, lysine, methionine, and threonine, dietary fish meal levels can be successfully reduced to 13% without compromising growth and without the need for mycotoxin deactivator inclusion.

    An Experimental Investigation of Colonel Blotto Games

    "This article examines behavior in the two-player, constant-sum Colonel Blotto game with asymmetric resources in which players maximize the expected number of battlefields won. The experimental results support all major theoretical predictions. In the auction treatment, where winning a battlefield is deterministic, disadvantaged players use a 'guerilla warfare' strategy which stochastically allocates zero resources to a subset of battlefields. Advantaged players employ a 'stochastic complete coverage' strategy, allocating random, but positive, resource levels across the battlefields. In the lottery treatment, where winning a battlefield is probabilistic, both players divide their resources equally across all battlefields." (author's abstract)"Dieser Artikel untersucht das Verhalten von Individuen in einem 'constant-sum Colonel Blotto'-Spiel zwischen zwei Spielern, bei dem die Spieler mit unterschiedlichen Ressourcen ausgestattet sind und die erwartete Anzahl gewonnener Schlachtfelder maximieren. Die experimentellen Ergebnisse bestĂ€tigen alle wichtigen theoretischen Vorhersagen. Im Durchgang, in dem wie in einer Auktion der Sieg in einem Schlachtfeld deterministisch ist, wenden die Spieler, die sich im Nachteil befinden, eine 'Guerillataktik' an, und verteilen ihre Ressourcen stochastisch auf eine Teilmenge der Schlachtfelder. Spieler mit einem Vorteil verwenden eine Strategie der 'stochastischen vollstĂ€ndigen Abdeckung', indem sie zufĂ€llig eine positive Ressourcenmenge auf allen Schlachtfeldern positionieren. Im Durchgang, in dem sich der Gewinn eines Schlachtfeldes probabilistisch wie in einer Lotterie bestimmt, teilen beide Spieler ihre Ressourcen gleichmĂ€ĂŸig auf alle Schlachtfelder auf." (Autorenreferat

    Detection of inconsistencies in geospatial data with geostatistics

    Almost every researcher has encountered observations that "drift" from the rest of the sample, suggesting some inconsistency. The aim of this paper is to propose a new inconsistent-data detection method for continuous geospatial data based on geostatistics, regardless of the generative cause (measurement and execution errors, or inherent data variability). Geostatistics was chosen for its suitable characteristics, such as avoiding systematic errors. A new inconsistency detection method is needed because some existing methods used on geospatial data rely on theoretical assumptions that are rarely satisfied. Likewise, the data set was chosen because of the importance of LiDAR (Light Detection and Ranging) technology in the production of Digital Elevation Models (DEMs). With the new methodology it was possible to detect and map discrepant data. Comparing it to a widely used detection method, the BoxPlot, confirmed the importance and utility of the new method, since the BoxPlot did not detect any data classified as discrepant. The proposed method flagged, on average, 1.2% of the data as possible regionalized lower outliers and, on average, 1.4% as possible regionalized upper outliers, relative to the data set used in the study.
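    The baseline the paper compares against is the classic Tukey boxplot rule, which flags values outside [Q1 − k·IQR, Q3 + k·IQR] while ignoring spatial structure entirely. A minimal sketch of that rule, on made-up elevation values (the paper's geostatistical, kriging-based alternative is not reproduced here):

```python
import numpy as np

def boxplot_outliers(z, k=1.5):
    """Tukey boxplot rule: flag values outside [Q1 - k*IQR, Q3 + k*IQR].
    Purely global: the spatial position of each observation is ignored."""
    q1, q3 = np.percentile(z, [25, 75])
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return (z < lo) | (z > hi)

# Hypothetical LiDAR elevation sample with one gross error (130.0 m)
elev = np.array([101.2, 101.5, 101.3, 101.8, 101.4, 101.6, 130.0])
flags = boxplot_outliers(elev)
```

    A regionalized method instead compares each observation with its spatial neighbours, so a value that is globally unremarkable but locally implausible can still be flagged, which is why the boxplot missed outliers the geostatistical method found.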

    Towards Machine Wald

    The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with limited information. Although computers have made the numerical evaluation of sophisticated statistical models possible, these models are still designed by humans, because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to think as humans can when faced with uncertainty is challenging in several major ways: (1) finding optimal statistical models has yet to be formulated as a well-posed problem when information on the system of interest is incomplete and comes as a complex combination of sample data, partial knowledge of constitutive relations, and a limited description of the distribution of input random variables; (2) the space of admissible scenarios, along with the space of relevant information, assumptions, and/or beliefs, tends to be infinite-dimensional, whereas calculus on a computer is necessarily discrete and finite. To this end, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification, and Information-Based Complexity. Comment: 37 pages

    Amoeba Techniques for Shape and Texture Analysis

    Morphological amoebas are image-adaptive structuring elements for morphological and other local image filters, introduced by Lerallut et al. Their construction is based on combining spatial distance with contrast information into an image-dependent metric. Amoeba filters show interesting parallels to image filtering methods based on partial differential equations (PDEs), which can be confirmed by asymptotic equivalence results. In computing amoebas, graph structures are generated that hold information about local image texture. This paper reviews and summarises the work of the author and his coauthors on morphological amoebas, particularly their relations to PDE filters and texture analysis. It presents some extensions and points out directions for future investigation on the subject. Comment: 38 pages, 19 figures; v2: minor corrections and rephrasing, Section 5 (pre-smoothing) extended
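    The image-dependent metric can be sketched as a shortest-path computation on the pixel graph: each step between 4-neighbours costs a spatial unit plus a contrast penalty, and the amoeba is the ball of a given radius around the seed pixel. This is a simplified illustration of the Lerallut et al. construction; the step cost, the toy image, and the parameter values are choices made for this sketch.

```python
import heapq
import numpy as np

def amoeba(img, seed, radius, lam=1.0):
    """Mask of pixels reachable from `seed` within amoeba distance `radius`,
    where a 4-neighbour step costs 1 + lam * |contrast difference|
    (a simplified version of the Lerallut et al. amoeba metric)."""
    h, w = img.shape
    dist = np.full((h, w), np.inf)
    dist[seed] = 0.0
    heap = [(0.0, seed)]
    while heap:                                   # Dijkstra on the pixel graph
        d, (y, x) = heapq.heappop(heap)
        if d > dist[y, x]:
            continue
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                nd = d + 1.0 + lam * abs(img[ny, nx] - img[y, x])
                if nd < dist[ny, nx] and nd <= radius:
                    dist[ny, nx] = nd
                    heapq.heappush(heap, (nd, (ny, nx)))
    return dist <= radius

img = np.array([[0., 0., 9., 9.],
                [0., 0., 9., 9.],
                [0., 0., 9., 9.]])
mask = amoeba(img, (1, 1), radius=2.5, lam=1.0)
# The amoeba spreads through the low-contrast left block but the contrast
# penalty stops it at the edge to the bright right block.
```

    An amoeba filter then replaces the seed pixel by, say, the median of the image values inside `mask`, so the structuring element adapts to edges instead of straddling them as a fixed disc would.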
    • 
