
    21st Century Ergonomic Education, From Little e to Big E

    Despite intense efforts, contemporary educational systems are not enabling individuals to function optimally in modern society. The main reason is that reformers are trying to improve systems that are not designed to take advantage of the centuries of history behind today's societies. Nor do they recognize the implications of the millions of years of history of life on earth, in which humans are the latest edition of learning organisms. The contemporary educational paradigm of "education for all" is based on a 17th century model of "printing minds" for passing on static knowledge. This characterizes most of K-12 education. In contrast, 21st Century education demands a new paradigm, which we call Ergonomic Education. This is an education system that is designed to fit students of any age instead of forcing students to fit the education system. It takes into account in a fundamental way what students want to learn -- the concept "wanting to learn" refers to the innate ability and desire to learn that is characteristic of humans. The Ergonomic Education paradigm shifts to education based on coaching students as human beings who are hungry for productive learning throughout their lives, from their very earliest days. Comment: plain LaTeX, 13 pages, 1 table.

    Structured Review of Code Clone Literature

    This report presents the results of a structured review of code clone literature. The aim of the review is to assemble a conceptual model of clone-related concepts, which helps us to reason about clones. This conceptual model unifies clone concepts from a wide range of literature, so that findings about clones can be compared with each other.

    Km4City Ontology Building vs Data Harvesting and Cleaning for Smart-city Services

    Presently, a very large number of public and private data sets are available from local governments. In most cases, they are not semantically interoperable, and a huge human effort would be needed to create integrated ontologies and knowledge bases for a smart city. No smart-city ontology has yet been standardized, and considerable research is needed to identify models that can support data reconciliation, manage complexity, and allow reasoning over the data. In this paper, a system for data ingestion and reconciliation of smart-city-related aspects such as the road graph, services available along the roads, and traffic sensors is proposed. The system manages a large volume of data coming from a variety of sources, covering both static and dynamic data. These data are mapped to a smart-city ontology, called KM4City (Knowledge Model for City), and stored in an RDF store, where they are available to applications via SPARQL queries so that public administrations and enterprises can offer new services to users. The paper presents the process adopted to produce the ontology and the big-data architecture that feeds the knowledge base from open and private data, along with the mechanisms adopted for data verification, reconciliation, and validation. Some examples of possible uses of the resulting knowledge base, accessible through the RDF store and related services, are also offered. The article also presents the work performed on reconciliation algorithms and their comparative assessment and selection.
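
    The pipeline described above ends with applications retrieving data from the RDF store via SPARQL. As a rough illustration of that last step, the Python sketch below issues a SPARQL query with SPARQLWrapper; the endpoint URL, the km4c: namespace, and the Service class and name property are hypothetical placeholders, not the actual Km4City vocabulary.

```python
# Minimal sketch of querying a smart-city RDF store over SPARQL.
# Endpoint URL, namespace, class and property names are hypothetical
# placeholders rather than the real Km4City schema.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("http://example.org/km4city/sparql")  # hypothetical endpoint
sparql.setQuery("""
    PREFIX km4c: <http://example.org/km4city#>
    SELECT ?service ?name
    WHERE {
        ?service a km4c:Service ;
                 km4c:name ?name .
    }
    LIMIT 10
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["service"]["value"], binding["name"]["value"])
```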

    Using Global Constraints and Reranking to Improve Cognates Detection

    Global constraints and reranking have not been used in cognates detection research to date. We propose methods that apply global constraints by rescoring the score matrices produced by state-of-the-art cognates detection systems. Rescoring with global constraints is complementary to existing detection methods and yields significant improvements over current state-of-the-art performance on publicly available datasets, across different language pairs and under varied conditions, including different levels of baseline performance and different data sizes, with larger and more realistic data sizes than have been evaluated in the past. Comment: 10 pages, 6 figures, 6 tables; published in the Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, pages 1983-1992, Vancouver, Canada, July 2017.
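
    The rescoring described above operates on whole score matrices rather than on individual word pairs. As a generic illustration of what a global constraint over a score matrix can look like (explicitly not the method proposed in the paper), the sketch below enforces a one-to-one pairing with the Hungarian algorithm and boosts the scores of the globally consistent pairs; the boost value and the random matrix are made up for the example.

```python
# Illustrative sketch: applying a one-to-one global constraint to a cognate
# score matrix via optimal assignment, then boosting the selected pairs.
# This is a generic stand-in, not the rescoring method of the paper.
import numpy as np
from scipy.optimize import linear_sum_assignment

def rescore_with_global_constraint(scores: np.ndarray, boost: float = 0.1) -> np.ndarray:
    """scores[i, j]: a baseline system's cognate score for source word i and target word j."""
    # Find the globally optimal one-to-one pairing (maximum total score).
    rows, cols = linear_sum_assignment(scores, maximize=True)
    rescored = scores.copy()
    rescored[rows, cols] += boost  # reward pairs that fit a consistent global alignment
    return rescored

if __name__ == "__main__":
    base_scores = np.random.rand(5, 5)  # stand-in for a real detector's output
    print(rescore_with_global_constraint(base_scores))
```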

    Launching the Grand Challenges for Ocean Conservation

    The ten most pressing Grand Challenges in Oceans Conservation were identified at the Oceans Big Think and described in a detailed working document:
    1. A Blue Revolution for Oceans: Reengineering Aquaculture for Sustainability
    2. Ending and Recovering from Marine Debris
    3. Transparency and Traceability from Sea to Shore: Ending Overfishing
    4. Protecting Critical Ocean Habitats: New Tools for Marine Protection
    5. Engineering Ecological Resilience in Near Shore and Coastal Areas
    6. Reducing the Ecological Footprint of Fishing through Smarter Gear
    7. Arresting the Alien Invasion: Combating Invasive Species
    8. Combatting the Effects of Ocean Acidification
    9. Ending Marine Wildlife Trafficking
    10. Reviving Dead Zones: Combating Ocean Deoxygenation and Nutrient Runoff

    A Multiple Migration and Stacking Algorithm Designed for Land Mine Detection

    This paper describes a modification to a standard migration algorithm for land mine detection with a ground-penetrating radar (GPR) system. High antenna directivity requires an aperture that is large relative to the operating wavelength, but at the frequencies at which GPR operates this would result in a large and impractical antenna. For operator convenience, most GPR antennas are small and therefore exhibit low directivity and a wide beamwidth. This causes the GPR image to bear little resemblance to the actual target scattering centers. Migration algorithms attempt to reduce this effect by focusing the scattered energy back onto the source reflector, consequently improving the target detection rate. However, varying operational conditions mean that the migration algorithm requires vastly different calibration parameters. To combat this effect, the proposed scheme stacks multiple versions of the same migrated data computed with different velocity values, whereas other migration schemes typically use a single velocity value.
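
    The key idea in the last sentence is velocity stacking: the same data are migrated several times with different propagation velocities and the results are summed, so focusing does not hinge on a single calibration value. A minimal sketch of that idea follows; the diffraction-summation operator is a generic, unweighted stand-in for a real migration algorithm, and the velocity range and sampling parameters are illustrative assumptions rather than values from the paper.

```python
# Minimal sketch of multi-velocity migration stacking for a GPR B-scan.
# The diffraction-summation operator below is a generic, unweighted stand-in
# for a real migration algorithm; all parameters are illustrative only.
import numpy as np

def diffraction_summation_migrate(bscan, dx, dt, velocity):
    """bscan[i, k]: sample k of trace i; dx = trace spacing (m), dt = sample interval (s)."""
    n_traces, n_samples = bscan.shape
    out = np.zeros_like(bscan, dtype=float)
    x = np.arange(n_traces) * dx
    trace_idx = np.arange(n_traces)
    for i0 in range(n_traces):
        for k0 in range(n_samples):
            t0 = k0 * dt
            # Two-way travel time along the diffraction hyperbola of a point
            # scatterer located beneath trace i0 at time t0.
            t = np.sqrt(t0 ** 2 + (2.0 * (x - x[i0]) / velocity) ** 2)
            k = np.rint(t / dt).astype(int)
            valid = k < n_samples
            out[i0, k0] = bscan[trace_idx[valid], k[valid]].sum()
    return out

def migrate_and_stack(bscan, dx, dt, velocities):
    """Stack the same B-scan migrated with several candidate velocities."""
    stacked = np.zeros_like(bscan, dtype=float)
    for v in velocities:
        stacked += diffraction_summation_migrate(bscan, dx, dt, v)
    return stacked / len(velocities)

# Example usage with a plausible range of soil propagation velocities (m/s):
# image = migrate_and_stack(raw_bscan, dx=0.02, dt=1e-10,
#                           velocities=np.linspace(7e7, 1.3e8, 7))
```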

    Automated analysis of feature models: Quo vadis?

    Feature models have been used since the 1990s to describe software product lines as a way of reusing common parts in a family of software systems. In 2010, a systematic literature review was published summarizing the advances in, and laying the foundations of, the area of Automated Analysis of Feature Models (AAFM). Since then, different studies have applied AAFM in different domains. In this paper, we provide an overview of the evolution of the field since 2010 by performing a systematic mapping study of 423 primary sources. We found six variability facets in which AAFM is being applied and which define the current tendencies: product configuration and derivation; testing and evolution; reverse engineering; multi-model variability analysis; variability modelling; and variability-intensive systems. We also confirmed that industrial evidence is lacking in most cases. Finally, we report where and when the papers have been published and which authors and institutions are contributing to the field. The maturity of the field is shown by the growing number of journal publications over the years, as well as by the diversity of conferences and workshops in which papers appear. We also suggest synergies with other areas, such as cloud or mobile computing, that can motivate further research in the future. Funding: Ministerio de Economía y Competitividad TIN2015-70560-R; Junta de Andalucía TIC-186.