
    Application of Supercomputer Technologies for Simulation of Socio-Economic Systems

    To date, extensive experience has been accumulated in studying problems of quality, the assessment of management systems, and the modeling of economic system sustainability. This work has laid the foundation for a new research area, the Economics of Quality, whose tools make it possible to build mathematical models that adequately reflect the role of quality in the natural, technical, and social regularities governing complex socio-economic systems. In our view, the broad application and further development of such models, together with system modeling on supercomputers, will raise research on social and economic systems to an essentially new level. Moreover, the present research contributes to the simulation of multi-agent social systems and, no less important, belongs to the priority areas of science and technology development in our country. This article is devoted to the application of supercomputer technologies in the social sciences, above all the technical implementation of large-scale agent-focused models (AFM). The essence of this tool is that the growth of computing power now makes it possible to describe the behavior of the many separate components that make up a complex system such as a socio-economic one. The article also reviews the experience of foreign researchers and practitioners in running AFM on supercomputers, and analyzes an AFM developed at CEMI RAS, including the stages and methods of efficiently mapping the computational kernel of a multi-agent system onto the architecture of a modern supercomputer. The article concludes with simulation experiments forecasting the population of St. Petersburg under three scenarios, population being one of the major factors shaping the development of the socio-economic system and the quality of life of its inhabitants.
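    As a rough illustration of the kind of scenario-based, agent-level population projection the abstract describes, the following Python sketch advances a toy population of agents under three hypothetical rate scenarios. It is not the CEMI RAS model; the demographic rates, scenario labels, and scaled-down population size are all assumptions made for demonstration.

```python
# Toy agent-based population projection under alternative scenarios.
# Illustrative sketch only; all rates and scenario definitions are assumed.
import random

SCENARIOS = {  # assumed annual per-capita rates
    "pessimistic": {"birth": 0.009, "death": 0.014, "net_migration": -0.002},
    "baseline":    {"birth": 0.011, "death": 0.012, "net_migration":  0.003},
    "optimistic":  {"birth": 0.013, "death": 0.011, "net_migration":  0.006},
}

def project(initial_ages, years, rates, seed=0):
    """Advance a list of agent ages year by year and return the final head count."""
    rng = random.Random(seed)
    agents = list(initial_ages)              # each agent is represented by its age
    for _ in range(years):
        survivors = [age + 1 for age in agents if rng.random() > rates["death"]]
        births = [0] * sum(rng.random() < rates["birth"] for _ in survivors)
        net = int(rates["net_migration"] * len(survivors))
        if net >= 0:                          # net in-migration adds working-age agents
            agents = survivors + births + [rng.randint(18, 45) for _ in range(net)]
        else:                                 # net out-migration removes agents
            agents = survivors[:net] + births
    return len(agents)

initial = [random.Random(1).randint(0, 90) for _ in range(10_000)]  # scaled-down city
for name, rates in SCENARIOS.items():
    print(name, project(initial, years=10, rates=rates))
```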

    Are a set of microarrays independent of each other?

    Having observed an $m \times n$ matrix $X$ whose rows are possibly correlated, we wish to test the hypothesis that the columns are independent of each other. Our motivation comes from microarray studies, where the rows of $X$ record expression levels for $m$ different genes, often highly correlated, while the columns represent $n$ individual microarrays, presumably obtained independently. The presumption of independence underlies all the familiar permutation, cross-validation and bootstrap methods for microarray analysis, so it is important to know when independence fails. We develop nonparametric and normal-theory testing methods. The row and column correlations of $X$ interact with each other in a way that complicates test procedures, essentially by reducing the accuracy of the relevant estimators.

    Comment: Published at http://dx.doi.org/10.1214/09-AOAS236 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
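    To make the hypothesis concrete, here is a simple, hedged Python illustration of one way to probe column independence: a permutation check that compares an observed column-correlation statistic against a null built by permuting entries within each row. This is not the nonparametric or normal-theory test developed in the paper, and the naive permutation also disturbs the row correlations whose interaction with the column correlations is exactly the complication the authors address.

```python
# Illustrative permutation check for column independence of an m x n matrix X
# whose rows may be correlated. Not the paper's test; a toy sketch.
import numpy as np

def column_dependence_stat(X):
    """Mean squared off-diagonal column correlation."""
    C = np.corrcoef(X, rowvar=False)                # n x n column correlations
    off_diag = C[~np.eye(C.shape[0], dtype=bool)]
    return np.mean(off_diag ** 2)

def permutation_pvalue(X, n_perm=999, seed=0):
    """Break column dependence by permuting entries within each row.
    Caveat: this also destroys row correlations, which changes the null."""
    rng = np.random.default_rng(seed)
    observed = column_dependence_stat(X)
    null = []
    for _ in range(n_perm):
        Xp = np.array([rng.permutation(row) for row in X])
        null.append(column_dependence_stat(Xp))
    return (1 + sum(s >= observed for s in null)) / (n_perm + 1)

# Example: 200 "genes" (rows) on 20 "arrays" (columns) with a shared column effect.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 20)) + rng.normal(size=(1, 20))   # common column shift
print(permutation_pvalue(X, n_perm=199))
```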

    A Path to Alignment: Connecting K-12 and Higher Education via the Common Core and the Degree Qualifications Profile

    The Common Core State Standards (CCSS), which aim to assure competency in English/language arts and mathematics through the K-12 curriculum, define necessary but not sufficient preparedness for success in college. The Degree Qualifications Profile (DQP), which describes what a college degree should signify, regardless of major, offers useful but not sufficient guidance to high school students preparing for college study. A coordinated strategy to prepare students to succeed in college would align these two undertakings and thus bridge an unfortunate and harmful cultural chasm between the K-12 world and that of higher education. Chasms call for bridges, and the bridge proposed by this white paper could create a vital thoroughfare. The white paper begins with a description of the CCSS and an assessment of their significance. A following analysis then explains why the CCSS, while necessary, are not sufficient as a platform for college success. A corresponding explanation of the DQP clarifies the prompts that led to its development, describes its structure, and offers some guidance for interpreting the outcomes that it defines. Again, a following analysis considers the potential of the DQP and the limitations that must be addressed if that potential is to be more fully realized. The heart of the white paper lies in sections 5 and 6, which provide a crosswalk between the CCSS and the DQP. These sections show how alignments and differences between the two may point to a comprehensive preparedness strategy. They also offer a proposal for a multifaceted strategy to realize the potential synergy of the CCSS and the DQP for the benefit of high school and college educators and their students -- and the nation.

    Task Specific Uncertainty in Coordinate Measurement

    Task-specific uncertainty is the measurement uncertainty associated with the measurement of a specific feature using a specific measurement plan. This paper surveys techniques developed to model and estimate task-specific uncertainty for coordinate measuring systems, primarily coordinate measuring machines using contacting probes. Sources of uncertainty are also reviewed.
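    One widely discussed family of techniques in such surveys is simulation based: propagate an assumed probing-error model through the measurement plan and the feature-fitting software many times, and take the spread of the results as the task-specific uncertainty. The sketch below applies that idea to a single illustrative task, a least-squares circle diameter from eight probed points; the noise model, point count, and sigma value are assumptions for demonstration, not values from the paper.

```python
# Monte Carlo sketch of task-specific uncertainty for one measurement task:
# estimating a circle's diameter from a few probed points on a CMM.
# The probing-error model (isotropic Gaussian noise, assumed sigma) is illustrative.
import numpy as np

def fit_circle(points):
    """Least-squares circle fit; returns (center_x, center_y, radius)."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x**2 + y**2
    cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
    return cx, cy, np.sqrt(c + cx**2 + cy**2)

def simulate_diameter_uncertainty(radius=25.0, n_points=8, sigma=0.002,
                                  n_trials=5000, seed=0):
    """Std. dev. of the fitted diameter under the assumed probing-noise model (mm)."""
    rng = np.random.default_rng(seed)
    angles = np.linspace(0, 2 * np.pi, n_points, endpoint=False)
    nominal = radius * np.column_stack([np.cos(angles), np.sin(angles)])
    diameters = []
    for _ in range(n_trials):
        probed = nominal + rng.normal(scale=sigma, size=nominal.shape)
        diameters.append(2 * fit_circle(probed)[2])
    return np.std(diameters)

print(f"u(diameter) ~ {simulate_diameter_uncertainty():.4f} mm")
```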

    Teaching telecommunication standards: bridging the gap between theory and practice

    Telecommunication standards have become a reliable mechanism to strengthen collaboration between industry and research institutions in order to accelerate the evolution of communications systems. Standards are needed to enable cooperation while promoting competition. Within the framework of a standard, the companies involved in the standardization process contribute and agree on appropriate technical specifications to ensure diversity and compatibility and to facilitate worldwide commercial deployment and evolution. Those parts of the system that can create competitive advantages are intentionally left open in the specifications. Such specifications are extensive, complex, and minimalistic. This makes telecommunication standards education a difficult endeavor, but one much demanded by industry and governments to spur economic growth. This article describes a methodology for teaching wireless communications standards. We define our methodology around six learning stages that mirror the standardization process and identify key learning objectives for each. Enabled by software-defined radio technology, we describe a practical learning environment that facilitates developing many of the needed technical and soft skills without the inherent difficulty and cost associated with radio frequency components and regulation. Using only open source software and commercial off-the-shelf computers, this environment is portable and can easily be recreated at other educational institutions and adapted to their educational needs and constraints. We discuss our and our students' experiences when employing the proposed methodology for 4G LTE standard education at Barcelona Tech.

    Temporal Feature Selection with Symbolic Regression

    Building and discovering useful features when constructing machine learning models is the central task for the machine learning practitioner. Good features are useful not only for increasing the predictive power of a model but also for illuminating the underlying drivers of a target variable. In this research we propose a novel feature learning technique in which symbolic regression is endowed with a "Range Terminal" that allows it to explore functions of aggregates of variables over time. We test the Range Terminal on a synthetic data set and on a real-world data set in which we predict seasonal greenness using satellite-derived temperature and snow data over a portion of the Arctic. On the synthetic data set we find that symbolic regression with the Range Terminal outperforms standard symbolic regression and Lasso regression. On the Arctic data set it outperforms standard symbolic regression and fails to beat Lasso regression, but it finds useful features describing the interaction between Land Surface Temperature, snow, and seasonal vegetative growth in the Arctic.
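    The core idea of the Range Terminal, an aggregate of a variable over a window of time lags that a symbolic-regression tree can use as a leaf, can be sketched in a few lines. The toy data, the window, and the aggregate function below are assumptions for illustration and do not reproduce the paper's genetic-programming setup or its Arctic data.

```python
# Illustrative sketch of a "Range Terminal": a feature that aggregates a
# time-indexed variable over a window of lags, so a symbolic-regression tree
# can use functions of a range of past values rather than a single lag.
import numpy as np

def range_terminal(series, start_lag, end_lag, agg=np.mean):
    """Aggregate `series` over lags [start_lag, end_lag] at each time step."""
    out = np.full(len(series), np.nan)
    for t in range(end_lag, len(series)):
        out[t] = agg(series[t - end_lag: t - start_lag + 1])
    return out

# Toy example: "greenness" driven by mean temperature over the previous 30 steps.
rng = np.random.default_rng(0)
temperature = rng.normal(10, 5, size=400)
greenness = 0.8 * range_terminal(temperature, 1, 30) + rng.normal(0, 0.5, 400)

# Candidate features a GP tree (or a Lasso baseline) could draw on:
single_lag = np.roll(temperature, 1)               # ordinary lagged terminal
window_mean = range_terminal(temperature, 1, 30)   # range terminal
mask = ~np.isnan(greenness) & ~np.isnan(window_mean)
print("corr(single lag, target):   ",
      np.corrcoef(single_lag[mask], greenness[mask])[0, 1].round(2))
print("corr(range terminal, target):",
      np.corrcoef(window_mean[mask], greenness[mask])[0, 1].round(2))
```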