
    The Progress of Computing

    The present study analyzes computer performance over the last century and a half. Three results stand out. First, there has been a phenomenal increase in computer power over the twentieth century. Performance in constant dollars or in terms of labor units has improved since 1900 by a factor on the order of 1 trillion to 5 trillion, which represents compound growth rates of over 30 percent per year for a century. Second, there were relatively small improvements in efficiency (perhaps a factor of ten) in the century before World War II. Around World War II, however, there was a substantial acceleration in productivity, and growth in computer power from 1940 to 2001 averaged 55 percent per year. Third, this study develops estimates of the growth in computer power that rely on performance rather than on the input-based measures typically used by official statistical agencies. The price declines implied by performance-based measures are markedly larger than those reported in the official statistics.

    Keywords: productivity, hedonic pricing, history of computing
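    The improvement factors quoted above do imply the stated compound rates; a quick back-of-the-envelope check (the century span and factors are taken from the abstract):

```python
# A total performance improvement factor F over n years implies a
# compound annual growth rate (CAGR) of F**(1/n) - 1.

def cagr(factor, years):
    """Compound annual growth rate implied by a total improvement factor."""
    return factor ** (1 / years) - 1

# Factor of 1 to 5 trillion over roughly a century (1900 onward)
low = cagr(1e12, 100)   # ≈ 0.318, i.e. about 32% per year
high = cagr(5e12, 100)  # ≈ 0.340, i.e. about 34% per year
print(f"{low:.1%} to {high:.1%}")  # both exceed the 30%/year cited
```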

    Trends in the Development of Basic Computer Education at Universities

    Basic computer education at universities faces serious problems. On the one hand, the amount of knowledge a university graduate must master is growing very quickly. On the other hand, student cohorts vary greatly in their level of preparation and motivation, and this differentiation keeps increasing. As a result, teaching becomes more complex and dropout rates rise. Scientists and educators are looking for solutions in several directions: revising the knowledge required at university so that it can realistically be acquired in the allotted time; using new information technologies to simplify the learning process and improve its quality; and developing new teaching methods that take these realities into account. This paper presents a strategic document in the field of computer education at universities, Computing Curricula 2020, as well as an overview of development areas in basic computer education, such as learning with artificial intelligence, virtual laboratories, microprocessor kits and robotics, web systems for distance and blended learning, mobile application development, visual programming, gamification, computer architecture and organization, programming languages, and learning technologies. In addition, the author shares his experience of and vision for teaching basic computer education at universities.

    High School Stem Curriculum and Example of Laboratory Work That Shows How Microcomputers Can Help in Understanding of Physical Concepts

    We are witnessing the rapid development of technologies that change the world around us. However, curriculums and teaching processes are often slow to adapt to this change; it takes time, money, and expertise to implement technology in the classroom. Therefore, the University of Split, Croatia, partnered with the local Marko Marulić High School and created the project "Modern competence in modern high schools", as part of which five curriculums for STEM areas were developed. One of the curriculums combines information technology with physics. The main idea was to teach students how to use different circuits and microcomputers to explore nature and physical phenomena. As a result, using electrical circuits, students are able to recreate in the classroom the phenomena they observe every day in their environment. Until now, high school students have had very little opportunity to perform experiments independently, and in particular those physics experiments did not involve ICT. This project is therefore of great importance, because students will finally get a chance to develop in line with modern technologies. This paper presents new methods of teaching physics that will help students develop experimental skills through the study of the deterministic nature of physical laws. Students will learn how to formulate hypotheses, model physical problems using electronic circuits, and evaluate their results. While doing so, they will also acquire useful problem-solving skills.

    A brief network analysis of Artificial Intelligence publication

    In this paper, we present an illustrated history of Artificial Intelligence (AI) through a statistical analysis of publications since 1940. We collected and mined the IEEE publication database to analyze the geographical and chronological variation in the activeness of AI research. The connections between different institutes are shown. The results show that the leading communities of AI research are mainly in the USA, China, Europe, and Japan. The key institutes, authors, and research hotspots are revealed. We find that the research institutes in fields such as Data Mining, Computer Vision, Pattern Recognition, and other fields of Machine Learning are quite consistent, implying strong interaction between the communities of these fields. We also show that research in Electronic Engineering and in industrial or commercial applications is very active in California, and that Japan publishes many papers in robotics. Due to limitations of the data source, the results might be overly influenced by the raw number of published articles; we mitigate this by applying network key-node analysis to the research community instead of merely counting publications.
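    The abstract does not specify how the key-node analysis was performed; one common approach is degree centrality on the institutional collaboration network. A minimal sketch, with invented institute names and edges purely for illustration:

```python
# Hypothetical collaboration edges between institutes (names invented);
# each edge represents a co-authored publication linking two institutions.
edges = [
    ("Inst A", "Inst B"), ("Inst A", "Inst C"),
    ("Inst B", "Inst C"), ("Inst A", "Inst D"),
    ("Inst D", "Inst E"),
]

# Build an undirected adjacency map.
neighbors = {}
for u, v in edges:
    neighbors.setdefault(u, set()).add(v)
    neighbors.setdefault(v, set()).add(u)

# Degree centrality: number of distinct collaborators per institute.
# Key nodes are the best-connected institutes, regardless of how many
# papers each one published on its own.
degree = {node: len(nbrs) for node, nbrs in neighbors.items()}
key_nodes = sorted(degree, key=degree.get, reverse=True)
print(key_nodes[0])  # the best-connected institute in this toy graph
```

    Real analyses typically use richer centrality measures (betweenness, PageRank) on the full co-authorship graph, but the principle is the same.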

    Using heterogeneous wireless sensor networks in a telemonitoring system for healthcare

    Ambient intelligence has acquired great importance in recent years and requires the development of new, innovative solutions. This paper presents a distributed telemonitoring system aimed at improving healthcare and assistance for dependent people in their homes. The system implements a platform based on a service-oriented architecture, which allows heterogeneous wireless sensor networks to communicate in a distributed way, independent of time and location restrictions. This approach gives the system a greater ability to recover from errors and better flexibility to change its behavior at execution time. Preliminary results are presented in this paper.

    Index Terms: ambient intelligence (AmI), healthcare, service-oriented architectures (SOAs), wireless sensor networks (WSNs)

    Validating non-motivated methods and equipment for studying mouse olfactory behavior

    Mouse olfactory behavior has traditionally been difficult to assess due, in part, to the expense of behavioral equipment and the lengthy process of training animals. The present study aims to validate a new behavioral paradigm requiring no prior animal training using existing liquid-dilution behavioral olfactometers. We also aim to validate the detergent Triton X-100 as a new anosmia-inducing agent, as well as self-built, do-it-yourself (DIY) behavioral olfactometers. Equipment and methods were tested using a variety of common discrimination and detection-threshold assays. Difficulties maintaining stimulus control arose during testing, as mice routinely detected volume-to-volume concentrations of amyl acetate diluted in mineral oil below reported thresholds (1×10⁻⁸: n = 8, p < 0.05). Stimulus control was corrected by using individual vials for each odor presentation. These results demonstrate that non-motivated behavior using existing equipment is an effective alternative to traditional training methods when stimulus control is properly accounted for. Furthermore, intranasal irrigation with 0.1% Triton successfully induced recoverable anosmia in mice (Day 6: p > 0.05, n = 6; Day 7: p < 0.05, n = 6; PBS: p < 0.05, n = 6). Finally, a behavioral olfactometer was successfully constructed from Arduino microcontrollers for ~$750. At a fraction of the cost, our DIY behavioral olfactometer produced behavioral data comparable to commercial equipment in common olfactory assays. We hope this cost-effective, easy-to-use equipment will be used for both research and teaching purposes.
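    For readers unfamiliar with volume-to-volume dilutions: a 1×10⁻⁸ v/v concentration is typically reached by serial dilution. A small sketch of the arithmetic (the step sizes here are illustrative, not taken from the paper's protocol):

```python
# Reaching a 1e-8 v/v odorant concentration by serial dilution, a common
# bench technique: each 1:100 transfer multiplies concentration by 0.01.

def serial_dilution(start=1.0, step=0.01, n_steps=4):
    """Concentration after n_steps successive `step`-fold dilutions."""
    conc = start
    for _ in range(n_steps):
        conc *= step
    return conc

final = serial_dilution()  # four 1:100 dilutions of the neat odorant
print(f"{final:.0e}")      # prints "1e-08"
```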

    The Design and Implementation of an Extensible Brain-Computer Interface

    An implantable brain-computer interface (BCI) includes tissue-interface hardware, signal-conditioning circuitry, analog-to-digital conversion (ADC) circuitry, and some sort of computing hardware to discriminate desired waveforms from noise. Within an experimental paradigm, the tissue interface and ADC hardware will rarely change. Recent literature suggests it is often the specific implementation of waveform discrimination that limits the usefulness and lifespan of a particular BCI design. If the discrimination techniques are implemented in on-board software, experimenters gain a level of flexibility not currently available in published designs. To this end, I have developed a firmware library to acquire data sampled from an ADC, discriminate the signal for desired waveforms using a user-defined function, and perform arbitrary tasks. I then used this design to develop an embedded BCI built on the popular Texas Instruments MSP430 microcontroller platform. This system can operate on multiple channels simultaneously and is not fundamentally limited in the number of channels that can be processed. The resulting system represents a viable platform that can ease the design, development, and use of BCI devices for a variety of applications.
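    The acquire-discriminate-act loop described above can be sketched in a few lines. This is not the author's MSP430 firmware (which would be C); it is a hypothetical Python illustration of the architecture, with an invented amplitude-threshold discriminator standing in for the user-defined function:

```python
# Core loop as described in the abstract: acquire per-channel samples,
# pass each to a user-supplied discriminator, and run an arbitrary task
# whenever a desired waveform is detected. All names are illustrative.

def run_bci(sample_stream, discriminate, on_detect):
    """Feed (channel, sample) pairs to a user-defined discriminator."""
    for channel, sample in sample_stream:
        if discriminate(channel, sample):
            on_detect(channel, sample)

# Example: a simple per-channel amplitude threshold. Because the
# discriminator is just a function, swapping in template matching or
# spike sorting requires no change to the acquisition loop.
thresholds = {0: 0.8, 1: 0.6}
detections = []

run_bci(
    sample_stream=[(0, 0.2), (1, 0.7), (0, 0.9), (1, 0.1)],
    discriminate=lambda ch, s: s > thresholds[ch],
    on_detect=lambda ch, s: detections.append((ch, s)),
)
print(detections)  # [(1, 0.7), (0, 0.9)]
```

    Decoupling discrimination from acquisition in this way is exactly what lets the same hardware outlive any single discrimination technique.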