    Packet Delivery: An Investigation of Educational Video Games for Computer Science Education

    The field of educational video games has grown rapidly since the 1970s, mostly producing video games that teach core education concepts such as mathematics, natural science, and English. Recently, various research groups have developed educational games that address elective topics such as finance and health. Educational video games often target grade school audiences and rarely target high school students, college students, or adults. Computer science topics are not a common theme among educational video games; the games that do address computer science teach computer fundamentals, such as typing or basic programming, to young audiences. Packet Delivery, an educational video game for introductory computer science students, is an investigation into the use of apprenticeship learning, constructivism, and scaffolding learning paradigms to teach the Domain Name System (DNS) lookup process. In Packet Delivery, the player's primary task is delivering letters without addresses to recipients via a search mechanism that emulates the DNS lookup process. Through practice and in-game upgrades, the player's goal is to learn the basics of DNS lookup and its optimizations. To analyze the comprehension and retention of students playing Packet Delivery, a study containing three tests was given to participants over the course of a few weeks: a pretest gauging prior knowledge, a post-test gauging immediate comprehension, and a follow-up post-test gauging retention. The study provided a proof of concept that educational video games not only have a significant place in higher education, but also that apprenticeship learning, constructivism, and scaffolding are highly effective learning paradigms for use within educational video games. Adviser: Shruti Bolma
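
    The abstract's central mechanic mirrors the iterative DNS lookup process. As a point of reference (not taken from the game itself), a minimal Python sketch of that delegation chain might look like the following; the zone data, server names, and the resolve function are hypothetical, chosen only to illustrate the three-step root, TLD, authoritative walk.

        # Minimal sketch of iterative DNS resolution, the process Packet Delivery emulates.
        # Zone contents and server names are hypothetical, for illustration only.
        ROOT = {"com.": "tld-server-com"}                                  # root server: knows the TLD servers
        TLD = {"tld-server-com": {"example.com.": "ns1.example.com."}}    # TLD server: knows authoritative servers
        AUTHORITATIVE = {"ns1.example.com.": {"www.example.com.": "93.184.216.34"}}

        def resolve(hostname: str) -> str:
            """Walk the delegation chain root -> TLD -> authoritative to find an address."""
            tld = hostname.rsplit(".", 2)[-2] + "."         # e.g. "com."
            domain = ".".join(hostname.split(".")[-3:])     # e.g. "example.com."
            tld_server = ROOT[tld]                          # step 1: ask a root server
            auth_server = TLD[tld_server][domain]           # step 2: ask the TLD server
            return AUTHORITATIVE[auth_server][hostname]     # step 3: ask the authoritative server

        print(resolve("www.example.com."))                  # -> 93.184.216.34

    Caching the answers from earlier steps is the kind of lookup optimization the abstract alludes to; a real resolver would also handle misses, timeouts, and recursion.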

    PhD Students Perceptions of the Relationship between Philosophy and Research: A Qualitative Investigation

    This study explored, described, and discovered meaning in the lived experiences of PhD students regarding two courses: Philosophy of Science and Qualitative Methods. The philosophical underpinning was constructivism. The phenomenological methodology employed a structured questionnaire, delivered on mailed computer disks, to collect data. Twenty of 43 students returned the disks. Content analysis and QSR N6 software were employed in the data analysis. Findings fell into three broad areas: Thinking about Thinking, The Ah-Ha of Me and Thee, and The Never-Ending Journey of Darkness to Light. Philosophy of Science appears to have value for students in every aspect of their lives. Recognizing the strengths and limitations of various paradigms could lead to different and new ways of approaching research. Philosophy of Science was a useful course for the participants.

    Concepts in Action

    This open access book is a timely contribution in presenting recent issues, approaches, and results that are not only central to the highly interdisciplinary field of concept research but also particularly important to newly emergent paradigms and challenges. The contributors present a unique, holistic picture for the understanding and use of concepts from a wide range of fields including cognitive science, linguistics, philosophy, psychology, artificial intelligence, and computer science. The chapters focus on three distinct points of view that lie at the core of concept research: representation, learning, and application. The contributions present a combination of theoretical, experimental, computational, and applied methods that will appeal to students and researchers working in these fields.

    An Empirical Evaluation of a Historical Data Warehouse

    Computing is widely regarded as a scientific discipline that emphasizes three different perspectives: mathematics, present in the development of formalisms, theories, and algorithms; engineering, linked to the goal of making things better, faster, smaller, and cheaper; and, finally, science, which can be defined as the activity of developing general and predictive theories that can be evaluated and tested. However, research in software engineering rarely makes explicit its research paradigms or the standards used to assess the quality of its results. Due to a growing understanding in the computer science community that empirical studies are needed to improve processes, methods, and tools for the development and maintenance of software, an emerging subarea of software engineering has developed: Empirical Software Engineering. This subarea makes more modest claims to scientific rigor, but it aims to address this shortcoming. The objective of this work is to conduct an empirical corroboration of a method for developing a Historical Data Warehouse, its temporal data model, and the associated query interface. Sociedad Argentina de Informática e Investigación Operativa (SADIO)
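
    As a concrete illustration of the kind of temporal data model a historical data warehouse rests on, the following is a minimal Python sketch of a valid-time dimension row and an "as-of" query; the field names, table contents, and the as_of helper are assumptions made here for illustration and are not taken from the evaluated method.

        # Minimal sketch of a valid-time (historical) dimension and an "as-of" query.
        # Field names and data are hypothetical, for illustration only.
        from dataclasses import dataclass
        from datetime import date

        @dataclass
        class CustomerVersion:
            customer_id: int
            city: str
            valid_from: date      # first day this version of the row was true
            valid_to: date        # day the version stopped being true (exclusive)

        history = [
            CustomerVersion(1, "Rosario", date(2018, 1, 1), date(2020, 6, 1)),
            CustomerVersion(1, "Buenos Aires", date(2020, 6, 1), date(9999, 12, 31)),
        ]

        def as_of(customer_id: int, when: date) -> CustomerVersion:
            """Return the version of the row that was valid on the given date."""
            return next(v for v in history
                        if v.customer_id == customer_id and v.valid_from <= when < v.valid_to)

        print(as_of(1, date(2019, 3, 15)).city)   # -> Rosario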

    What is a quantum computer, and how do we build one?

    The DiVincenzo criteria for implementing a quantum computer have been seminal in focussing both experimental and theoretical research in quantum information processing. These criteria were formulated specifically for the circuit model of quantum computing. However, several new models for quantum computing (paradigms) have been proposed that do not seem to fit the criteria well. The question, therefore, is what the general criteria for implementing a quantum computer are. To this end, a formal operational definition of a quantum computer is introduced. It is then shown that, according to this definition, a device is a quantum computer if it obeys the following four criteria: any quantum computer must (1) have a quantum memory; (2) facilitate a controlled quantum evolution of the quantum memory; (3) include a method for cooling the quantum memory; and (4) provide a readout mechanism for subsets of the quantum memory. The criteria are met when the device is scalable and operates fault-tolerantly. We discuss various existing quantum computing paradigms and how they fit within this framework. Finally, we lay out a roadmap for selecting an avenue towards building a quantum computer. This is summarized in a decision tree intended to help experimentalists determine the most natural paradigm given a particular physical implementation.

    Particular: A Functional Approach to 3D Particle Simulation

    Simulating large bodies of entities in various environments is an old science that traces back decades in computer science. Existing software frameworks provide well-built mathematical models for approximating various environments; however, these frameworks rest on imperative programming fundamentals, often following an object-oriented paradigm. This thesis presents Particular, a 3D particle simulation software library for simulating the movements of independent entities on a time-dependent three-dimensional vector field using a functional approach. Particular uses functional programming paradigms to create a highly customizable, flexible, and maintainable library based on lambda functions, with all relevant parameters encapsulated in closures. Particular uses a functional implementation of an Entity Component System software architecture, usually found in game development, to create a highly performant, flexible, data-oriented design that decouples the data from the aforementioned lambda functions that dictate particle behaviour. According to the evaluations, Particular shows a significant improvement in execution time compared to other contemporary trajectory simulation frameworks such as OpenDrift, with some evaluations showing 900% faster execution under certain conditions.
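
    The abstract does not show Particular's interface, so as a rough illustration of the functional style it describes (behaviour expressed as functions that close over their parameters, kept separate from plain particle data), a minimal Python sketch might look like the following; the names, the simple Euler step, and the example vector field are assumptions, not the library's actual API.

        # Minimal sketch of a functional, data-oriented particle step: behaviour lives in a
        # closure, particle state is plain tuples. Illustrative only, not Particular's API.
        from typing import Callable, List, Tuple

        Vec3 = Tuple[float, float, float]
        Particle = Tuple[Vec3, Vec3]          # (position, velocity) -- pure data, no methods

        def make_drift(field: Callable[[Vec3, float], Vec3],
                       dt: float) -> Callable[[Particle, float], Particle]:
            """Return a behaviour function with the vector field and time step captured in its closure."""
            def step(p: Particle, t: float) -> Particle:
                (x, y, z), _ = p
                vx, vy, vz = field((x, y, z), t)                 # sample the time-dependent field
                return ((x + vx * dt, y + vy * dt, z + vz * dt), (vx, vy, vz))
            return step

        # A simple time-dependent vector field: a uniform current that grows with time.
        current = lambda pos, t: (0.1 * t, 0.0, 0.0)
        drift = make_drift(current, dt=1.0)

        particles: List[Particle] = [((0.0, 0.0, 0.0), (0.0, 0.0, 0.0)) for _ in range(3)]
        for t in range(5):                                       # advance every particle purely
            particles = [drift(p, float(t)) for p in particles]
        print(particles[0][0])                                   # final position of the first particle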