
    Alien Registration- Preston, Simon (Baileyville, Washington County)


    Improved classification for compositional data using the α-transformation

    In compositional data analysis an observation is a vector containing non-negative values, only the relative sizes of which are considered to be of interest. Without loss of generality, a compositional vector can be taken to be a vector of proportions that sum to one. Data of this type arise in many areas including geology, archaeology, biology, economics and political science. In this paper we investigate methods for classification of compositional data. Our approach centres on the idea of using the α-transformation to transform the data and then to classify the transformed data via regularised discriminant analysis and the k-nearest neighbours algorithm. Using the α-transformation generalises two rival approaches in compositional data analysis, one (when α=1) that treats the data as though they were Euclidean, ignoring the compositional constraint, and another (when α=0) that employs Aitchison's centred log-ratio transformation. A numerical study with several real datasets shows that whether using α=1 or α=0 gives better classification performance depends on the dataset, and moreover that using an intermediate value of α can sometimes give better performance than using either 1 or 0. Comment: This is a 17-page preprint and has been accepted for publication at the Journal of Classification.
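    The transformation at the heart of this approach can be sketched in a few lines. This is a minimal illustration, assuming the commonly used form z_j = (D x_j^α / Σ_i x_i^α − 1)/α, whose α → 0 limit is Aitchison's centred log-ratio; the function name and example composition are hypothetical, not taken from the paper:

```python
import numpy as np

def alpha_transform(x, alpha):
    """α-transformation of a composition x (non-negative parts summing to one).

    For alpha != 0: z_j = (D * x_j**alpha / sum_i x_i**alpha - 1) / alpha.
    As alpha -> 0 this tends to Aitchison's centred log-ratio (clr) transform.
    """
    x = np.asarray(x, dtype=float)
    D = x.size
    if alpha == 0.0:
        g = np.exp(np.mean(np.log(x)))   # geometric mean of the parts
        return np.log(x / g)             # centred log-ratio
    p = x ** alpha
    return (D * p / p.sum() - 1.0) / alpha

comp = np.array([0.2, 0.3, 0.5])
print(alpha_transform(comp, 1.0))   # alpha = 1: the raw proportions, centred
print(alpha_transform(comp, 0.0))   # alpha = 0: the clr transform
```

    The transformed vectors (rather than the raw proportions) would then be fed to an off-the-shelf classifier such as regularised discriminant analysis or k-nearest neighbours, with α treated as a tuning parameter.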

    A data-based power transformation for compositional data

    Compositional data analysis is carried out either by neglecting the compositional constraint and applying standard multivariate data analysis, or by transforming the data using the logs of the ratios of the components. In this work we examine a more general transformation which includes both approaches as special cases. It is a power transformation and involves a single parameter, α. The transformation has two equivalent versions. The first is the stay-in-the-simplex version, which is the power transformation as defined by Aitchison in 1986. The second version, which is a linear transformation of the power transformation, is a Box-Cox type transformation. We discuss a parametric way of estimating the value of α, which is maximization of its profile likelihood (assuming multivariate normality of the transformed data), and the equivalence between the two versions is exhibited. Other ways include maximization of the correct classification probability in discriminant analysis and maximization of the pseudo R-squared (as defined by Aitchison in 1986) in linear regression. We examine the relationship between the α-transformation, the raw data approach and the isometric log-ratio transformation. Furthermore, we also define a suitable family of metrics corresponding to the family of α-transformations and consider the corresponding family of Fréchet means. Comment: Published in the proceedings of the 4th international workshop on Compositional Data Analysis. http://congress.cimne.com/codawork11/frontal/default.as
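    The two equivalent versions described above can be sketched as follows. This is a minimal sketch: the stay-in-the-simplex version follows Aitchison's definition (raise each part to the power α, then re-close), while the exact scaling of the Box-Cox type linear version varies between papers, so the form used here is an assumption:

```python
import numpy as np

def power_transform(x, alpha):
    """Stay-in-the-simplex version: each component is raised to the power
    alpha and the result is re-closed so the parts again sum to one."""
    p = np.asarray(x, dtype=float) ** alpha
    return p / p.sum()

def alpha_boxcox(x, alpha):
    """Box-Cox type version (assumed scaling): a linear rescaling of the
    power transformation, (D * power_transform(x, alpha) - 1) / alpha,
    which at alpha = 1 recovers the raw proportions, centred."""
    D = len(x)
    return (D * power_transform(x, alpha) - 1.0) / alpha

x = np.array([0.2, 0.3, 0.5])
print(power_transform(x, 0.5))   # still a composition: non-negative, sums to one
print(alpha_boxcox(x, 1.0))      # alpha = 1: D*x - 1, the centred raw data
```

    Estimating α by profile likelihood then amounts to maximising a multivariate normal log-likelihood of the transformed data over a grid of α values.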

    Alejo Carpentier, Gabriel García Márquez, Salman Rushdie : three moments in the problematics of magic realism

    Chapter One begins by outlining the space magic occupies in Western culture, clarifying what I mean by the term "magic". I examine aspects of indigenous American sacred traditions which have influenced and which prefigure magic realism. I review the development of the aesthetic in its Latin American context, touching on the Chronicles, the role of nationalism and erotic rhetoric, the influence of European modernism and the role of the intellectual in Latin American society. Chapter Two examines the development of a realist aesthetic in Europe since the Enlightenment. This review of its manifestations and counter-traditions in European culture is founded upon a discussion of aspects of the philosophy of Kant. I focus on the influence of Surrealism which is particularly illuminating of Latin American magic realism. The impacts of anthropology and psychoanalysis on Latin American writers are also reviewed. Chapter Two includes a review of formulations of magic realism influential in the field of English studies and concludes with a working definition which is used as a basis for the discussions of the three novels analysed in this study. Chapter Three is a study of the development of Alejo Carpentier's version of magic realism culminating in the writing of The Kingdom of this World in 1949. Through using both European and indigenous American techniques and perspectives he hoped to create a literature which could represent the complex realities of Latin American life and establish a mythology for the founding of a unified Latin American identity

    Event series prediction via non-homogeneous Poisson process modelling

    Data streams whose events occur at random arrival times rather than at the regular, tick-tock intervals of traditional time series are increasingly prevalent. Event series are continuous, irregular and often highly sparse, differing greatly in nature from the regularly sampled time series traditionally the concern of the hard sciences. As mass sets of such data have become more common, so interest in predicting future events in them has grown. Yet repurposing of traditional forecasting approaches has proven ineffective, in part due to issues such as sparsity, but often due to inapplicable underpinning assumptions such as stationarity and ergodicity. In this paper we derive a principled new approach to forecasting event series that avoids such assumptions, based upon: 1. the processing of event series datasets in order to produce a parameterized mixture model of non-homogeneous Poisson processes; and 2. application of a technique called parallel forecasting that uses these processes' rate functions to directly generate accurate temporal predictions for new query realizations. This approach uses forerunners of a stochastic process to shed light on the distribution of future events, not for themselves, but for realizations that subsequently follow in their footsteps.
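    The basic building block here, generating event times from a non-homogeneous Poisson process with a given rate function, is commonly implemented by Lewis-Shedler thinning. The sketch below is a minimal illustration of that standard technique, not the paper's method; the rate function and parameter values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_nhpp(rate, rate_max, t_end):
    """Simulate event times on [0, t_end] from a non-homogeneous Poisson
    process with intensity rate(t), via Lewis-Shedler thinning.
    rate_max must bound rate(t) from above on the whole interval."""
    times, t = [], 0.0
    while True:
        # candidate arrival from a homogeneous process of rate rate_max
        t += rng.exponential(1.0 / rate_max)
        if t > t_end:
            break
        # accept the candidate with probability rate(t) / rate_max
        if rng.uniform() < rate(t) / rate_max:
            times.append(t)
    return np.array(times)

# hypothetical periodic intensity, e.g. a daily cycle of activity
rate = lambda t: 5.0 + 4.0 * np.sin(2 * np.pi * t)
events = simulate_nhpp(rate, rate_max=9.0, t_end=10.0)
print(len(events))  # the expected count is the integral of rate(t), here 50
```

    A mixture model over several such rate functions, fitted to observed series, could then score which component a new partial realization most resembles before extrapolating its future events.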

    Electron scattering in isotonic chains as a probe of the proton shell structure of unstable nuclei

    Electron scattering on unstable nuclei is planned in future facilities of the GSI and RIKEN upgrades. Motivated by this fact, we study theoretical predictions for elastic electron scattering in the N=82, N=50, and N=14 isotonic chains from very proton-deficient to very proton-rich isotones. We compute the scattering observables by performing Dirac partial-wave calculations. The charge density of the nucleus is obtained with a covariant nuclear mean-field model that accounts for the low-energy electromagnetic structure of the nucleon. For the discussion of the dependence of scattering observables at low-momentum transfer on the gross properties of the charge density, we fit Helm model distributions to the self-consistent mean-field densities. We find that the changes shown by the electric charge form factor along each isotonic chain are strongly correlated with the underlying proton shell structure of the isotones. We conclude that elastic electron scattering experiments in isotones can provide valuable information about the filling order and occupation of the single-particle levels of protons. Comment: 13 pages; 19 figures.
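    The Helm model used in the fitting step has a closed-form charge form factor, F(q) = 3 j1(qR0)/(qR0) exp(-(σq)²/2), a hard sphere of diffraction radius R0 folded with a Gaussian of surface width σ. A minimal sketch of evaluating it follows; the radius and width values are hypothetical round numbers, not fitted values from the paper:

```python
import numpy as np

def helm_form_factor(q, R0, sigma):
    """Helm model charge form factor:
    F(q) = 3 j1(q R0) / (q R0) * exp(-(sigma * q)**2 / 2),
    with j1 the first-order spherical Bessel function."""
    x = q * R0
    j1 = np.sin(x) / x**2 - np.cos(x) / x   # spherical Bessel j1(x)
    return 3.0 * j1 / x * np.exp(-0.5 * (sigma * q) ** 2)

# hypothetical parameters, roughly the scale of a medium-mass nucleus (fm)
q = np.linspace(0.1, 3.0, 300)          # momentum transfer in fm^-1
F = helm_form_factor(q, R0=5.0, sigma=1.0)
```

    The positions of the diffraction minima of |F(q)| track R0, and the fall-off at larger q tracks σ, which is what makes the two-parameter fit to mean-field densities informative about gross changes along an isotonic chain.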

    Local government authority attitudes to road traffic CO2 emissions modelling: a British case study

    Local government authorities (LGAs) play a key role in facilitating mitigation of road traffic CO2 emissions and must engage in emissions modelling to quantify the impact of transport interventions. Existing Emissions Model (EM) methodologies range from aggregate to disaggregate approaches, with more detail normally entailing more resources. However, it is not clear which approaches LGAs actually utilise. This article reports results of a survey designed to discover the level of detail considered practical by British LGAs (n = 34). Results show that resource scarcity is important, with particular importance attached to EM reusability and convenient input data sources. Most LGA EMs use traffic variable inputs (predominantly traffic flow and traffic average speed), this approach being the best fit for LGA resources. Link-by-link sources of data rated highly for convenience are road traffic models and urban traffic control systems.