
    Recent Developments in Complex and Spatially Correlated Functional Data

    As high-dimensional and high-frequency data are collected on a large scale, the development of new statistical models is being pushed forward. Functional data analysis provides the statistical methods required to deal with large-scale and complex data by assuming that data are continuous functions, e.g., realizations of a continuous process (curves) or of continuous random fields (surfaces), and that each curve or surface is treated as a single observation. Here, we provide an overview of functional data analysis when data are complex and spatially correlated. We provide definitions and estimators of the first and second moments of the corresponding functional random variable. We present two main approaches: the first assumes that data are realizations of a functional random field, i.e., each observation is a curve with a spatial component; we call these 'spatial functional data'. The second approach assumes that data are continuous deterministic fields observed over time; in this case, one observation is a surface or manifold, and we call these 'surface time series'. For both approaches, we describe the software available for the statistical analysis, and we present a data illustration using a high-resolution simulated wind speed dataset as an example of the two approaches. The functional data approach offers a new paradigm of data analysis, in which continuous processes or random fields are treated as single entities. We consider this approach to be very valuable in the context of big data.
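
The abstract refers to estimators of the first and second moments of a functional random variable. As a minimal sketch (not taken from the paper), the standard pointwise estimators for curves observed on a common grid can be written as follows; the function name, grid, and synthetic data are assumptions for illustration only.

```python
# Minimal sketch of first- and second-moment estimators for functional data:
# given n curves X_1, ..., X_n observed on a common grid of T points,
#   mu_hat(t)   = (1/n) * sum_i X_i(t)
#   C_hat(s, t) = (1/(n-1)) * sum_i (X_i(s) - mu_hat(s)) * (X_i(t) - mu_hat(t))
import numpy as np

def mean_and_covariance(curves: np.ndarray):
    """curves: (n, T) array, one discretized curve per row (hypothetical layout)."""
    mu_hat = curves.mean(axis=0)                           # (T,) mean function
    centered = curves - mu_hat                             # center each curve
    c_hat = centered.T @ centered / (curves.shape[0] - 1)  # (T, T) covariance surface
    return mu_hat, c_hat

# Usage with synthetic curves standing in for, e.g., wind-speed profiles:
rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 50)
sample = np.sin(2 * np.pi * grid) + 0.1 * rng.standard_normal((30, 50))
mu, C = mean_and_covariance(sample)
```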

    Training as a Growth Strategy for Trade SMEs in the Veracruz-Boca del Río Conurbation

    Based on official statistics, this research shows that the main entities generating employment are small and medium-sized enterprises; data from the National Institute of Statistics and Geography (INEGI), together with those from the National Micro-business Survey (ENAMIN), support this. With respect to training, the results of this study show that training for workers is not a priority and, consequently, that workers do not receive it, even though it is established as a worker's right and an obligation of the business sector by the Ley Federal del Trabajo. The results also show the companies' complacency regarding the dual benefit of training, since this fundamental element, which could allow them to achieve greater competitiveness in the market, is absent from their planning.

    Replication of Micro Laser Textures by Injection Molding

    Micro technology is becoming increasingly important for developing new products with high added value. These so-called micro technologies allow the manufacture of precision components; these new micro components must carry out the functions previously performed by larger parts. Micro-injection molding is one of these technologies, with the capacity to produce parts in different materials, both plastic and metal, for a range of industries and applications. The main objective of this paper is to determine the capability of plastic injection molds to replicate micro textures. For our samples, ABS plastic is injected into four aluminum cavities carrying different laser textures, produced using different laser technologies. To analyze how the mold texture affects the parts, optical interferometry was selected for the measurements. The surface topography obtained was processed with MountainsMap™ software to assess the replicability of the injected parts. Scanning electron microscopy (SEM) was also used to photographically evaluate the mold textures and the injected parts.
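
The abstract reports that replicability was assessed from the measured surface topography but does not state the metric used. As a hypothetical illustration of one common choice, the sketch below takes the ratio of a roughness parameter (e.g., Sa) measured on the part to the same parameter measured on the mold cavity; the function name and the numbers are made up and are not from the paper.

```python
# Hypothetical replicability metric: fraction of the mold texture reproduced
# on the injected part, based on an areal roughness parameter such as Sa.
def replication_ratio(sa_part_um: float, sa_mold_um: float) -> float:
    """Ratio of part roughness to mold roughness (1.0 = perfect replication)."""
    return sa_part_um / sa_mold_um

# Example (made-up values): a cavity with Sa = 2.0 um replicated as
# Sa = 1.6 um on the part corresponds to 80 % replication fidelity.
print(f"{replication_ratio(1.6, 2.0):.0%}")
```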

    Improving randomness characterization through Bayesian model selection

    Nowadays, random number generation plays an essential role in technology, with important applications in areas ranging from cryptography, which lies at the core of current communication protocols, to Monte Carlo methods and other probabilistic algorithms. In this context, a crucial scientific endeavour is to develop effective methods that allow the characterization of random number generators. However, commonly employed methods either lack formality (e.g. the NIST test suite) or are inapplicable in principle (e.g. the characterization derived from the Algorithmic Theory of Information (ATI)). In this letter we present a novel method based on Bayesian model selection, which is both rigorous and effective, for characterizing randomness in a bit sequence. We derive analytic expressions for a model's likelihood, which is then used to compute its posterior probability distribution. Our method proves to be more rigorous than NIST's suite and the Borel-Normality criterion, and its implementation is straightforward. We have applied our method to an experimental device based on the process of spontaneous parametric downconversion, implemented in our laboratory, to confirm that it behaves as a genuine quantum random number generator (QRNG). As our approach relies on Bayesian inference, which entails model generalizability, our scheme transcends individual sequence analysis, leading to a characterization of the source of the random sequences itself.
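
The abstract describes computing a model's posterior probability from an analytic likelihood. As a hedged sketch of the general idea (not the paper's actual models), the code below compares an unbiased-bits model against a Bernoulli model with a uniform prior on the bias via their marginal likelihoods; the function name and the synthetic sequences are assumptions for illustration.

```python
# Sketch of Bayesian model comparison for a bit sequence (illustrative models):
#   M0 = i.i.d. unbiased bits (p = 1/2), marginal likelihood m0 = (1/2)^n
#   M1 = i.i.d. Bernoulli(p) with uniform Beta(1, 1) prior on p,
#        marginal likelihood m1 = B(k + 1, n - k + 1), with k ones in n bits
# With equal prior model weights, P(M0 | data) = m0 / (m0 + m1).
import numpy as np
from scipy.special import betaln

def posterior_prob_unbiased(bits: np.ndarray) -> float:
    """Posterior probability that the bits come from an unbiased source."""
    n = bits.size
    k = int(bits.sum())                  # number of ones
    log_m0 = -n * np.log(2.0)            # log marginal likelihood under M0
    log_m1 = betaln(k + 1, n - k + 1)    # log marginal likelihood under M1
    # Computed in log space to avoid underflow for long sequences.
    return float(1.0 / (1.0 + np.exp(log_m1 - log_m0)))

rng = np.random.default_rng(1)
print(posterior_prob_unbiased(rng.integers(0, 2, size=10_000)))    # near 1 for fair bits
print(posterior_prob_unbiased(rng.binomial(1, 0.6, size=10_000)))  # near 0 for biased bits
```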