
    Profiling a decade of Information Systems Frontiers' research

    This article analyses the first ten years of research published in Information Systems Frontiers (ISF), from 1999 to 2008. The analysis of the published material examines variables such as the most productive authors, citation analysis, the universities associated with the most publications, geographic diversity, authors' backgrounds and research methods. The keyword analysis suggests that ISF research has evolved from establishing the concepts and domain of information systems (IS), technology and management to contemporary issues such as outsourcing, web services and security. The analysis presented in this paper identifies intellectually significant studies that have contributed to the development and accumulation of ISF's intellectual wealth. It also identifies authors published in other journals whose work largely shaped and guided the research published in ISF. This research has implications for researchers, journal editors, and research institutions.

    Profiling research published in the Journal of Enterprise Information Management (JEIM)

    Purpose – The purpose of this paper is to analyse research published in the Journal of Enterprise Information Management (JEIM) in the last ten years (1999 to 2008). Design/methodology/approach – Employing a profiling approach, the analysis of the 381 JEIM publications examines variables such as the most active authors, geographic diversity, authors' backgrounds, co-author analysis, research methods and keyword analysis. Findings – All the findings relate to the period of analysis (1999 to 2008). (a) Research categorised under descriptive, theoretical and conceptual methods is the most dominant research approach followed by JEIM authors, followed by case study research. (b) The largest proportion of contributions came from researchers and practitioners with an information systems background, followed by those with a background in business and in computer science and IT. (c) The keyword analysis suggests that ‘information systems’, ‘electronic commerce’, ‘internet’, ‘logistics’, ‘supply chain management’, ‘decision making’, ‘small to medium-sized enterprises’, ‘information management’, ‘outsourcing’, and ‘modelling’ were the most frequently investigated keywords. (d) The paper presents and discusses the findings obtained from the citation analysis, which determines the impact of the research published in the JEIM. Originality/value – The primary value of this paper lies in extending the understanding of the evolution and patterns of IS research. This has been achieved by analysing and synthesising existing JEIM publications.

    Strategies and mechanisms for electronic peer review

    This article, presented at the October 2000 Frontiers in Education Conference, discusses strategies and mechanisms for electronic peer review. It outlines Peer Grader, a peer-grading system for reviewing student assignments over the World-Wide Web. The system allows authors and reviewers to communicate, and authors to update their submissions. It facilitates collaborative learning and makes it possible to break a large project into smaller portions. The article summarises a unique and innovative method of peer review. Educational levels: Graduate or professional.

    Introductory programming: a systematic literature review

    As computing becomes a mainstream discipline embedded in the school curriculum and acts as an enabler for an increasing range of academic disciplines in higher education, the literature on introductory programming is growing. Although there have been several reviews that focus on specific aspects of introductory programming, there has been no broad overview of the literature exploring recent trends across the breadth of introductory programming. This paper is the report of an ITiCSE working group that conducted a systematic review in order to gain an overview of the introductory programming literature. Partitioning the literature into papers addressing the student, teaching, the curriculum, and assessment, we explore trends, highlight advances in knowledge over the past 15 years, and indicate possible directions for future research.

    Asynchronous spiking neurons, the natural key to exploit temporal sparsity

    Inference of deep neural networks for stream-signal (video/audio) processing on edge devices remains challenging. Unlike most state-of-the-art inference engines, which are efficient for static signals, our brain is optimised for real-time dynamic signal processing. We believe one important feature of the brain, asynchronous stateful processing, is the key to its excellence in this domain. In this work, we show how asynchronous processing with stateful neurons allows exploitation of the sparsity present in natural signals. This paper explains three different types of sparsity and proposes an inference algorithm that exploits all of them in the execution of already-trained networks. Our experiments in three different applications (handwritten digit recognition, autonomous steering and hand-gesture recognition) show that this model of inference reduces the number of operations required for sparse input data by one to two orders of magnitude. Additionally, because processing is fully asynchronous, this type of inference can run on fully distributed and scalable neuromorphic hardware platforms.
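    The abstract gives no implementation details, but its core idea — skipping computation for inputs that have not changed between consecutive frames — can be sketched as a delta-driven layer update. Everything below (function names, the toy network, the operation count) is an illustrative assumption, not the authors' code:

```python
import numpy as np

def delta_layer_update(W, state_in, new_in, accum):
    """Event-driven update: only inputs that changed contribute.

    W        : (out, in) weight matrix of one layer
    state_in : previously seen input vector (the layer's state)
    new_in   : current input vector
    accum    : running pre-activation of the output (updated in place)
    """
    delta = new_in - state_in
    changed = np.nonzero(delta)[0]       # indices of inputs that changed
    # Only these columns of W are touched; for temporally sparse input
    # this is far cheaper than a full matrix-vector product.
    accum += W[:, changed] @ delta[changed]
    return new_in.copy(), len(changed)   # new state, proxy for work done

# Toy demonstration: two nearly identical frames.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))
frame0 = rng.standard_normal(8)
frame1 = frame0.copy()
frame1[2] += 1.0                         # only one input changes

accum = W @ frame0                       # dense pass for the first frame
state, n_ops = delta_layer_update(W, frame0, frame1, accum)
assert n_ops == 1                        # only one column was processed
assert np.allclose(accum, W @ frame1)    # result matches a full dense pass
```

    In an asynchronous neuromorphic setting, each nonzero delta would be a discrete event routed to downstream neurons, so the cost scales with the number of changes rather than the layer size.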

    Accuracy and feasibility of an Android-based digital assessment tool for post-stroke visual disorders - The StrokeVision App

    Background: Visual impairment affects up to 70% of stroke survivors. We designed an app (StrokeVision) to facilitate screening for common post-stroke visual issues (acuity, visual fields and visual inattention). We sought to describe the test time, feasibility, acceptability and accuracy of our app-based digital visual assessments against (a) current methods used for bedside screening, and (b) gold-standard measures. Methods: Patients were prospectively recruited from acute stroke settings. Index tests were app-based assessments of fields and inattention performed by a trained researcher. We compared these against usual clinical screening practice of visual fields to confrontation, including inattention assessment (simultaneous stimuli). We also compared the app to gold-standard assessments of formal kinetic perimetry (Goldmann or Octopus visual field assessment) and pencil-and-paper tests of inattention (Albert's, Star Cancellation, and Line Bisection). Results of inattention and field tests were adjudicated by a specialist neuro-ophthalmologist. All assessors were masked to each other's results. Participants and assessors graded acceptability using a bespoke scale ranging from 0 (completely unacceptable) to 10 (perfect acceptability). Results: Of 48 stroke survivors recruited, the complete battery of index and reference tests for fields was successfully completed in 45. Similar acceptability scores were observed for app-based testing (assessor median score 10 [IQR: 9-10]; patient 9 [IQR: 8-10]) and traditional bedside testing (assessor 10 [IQR: 9-10]; patient 10 [IQR: 9-10]). Median test time was longer for app-based testing (combined time to completion of all digital tests 420 seconds [IQR: 390-588]) than for conventional bedside testing (70 seconds [IQR: 40-70]), but shorter than for gold-standard testing (1260 seconds [IQR: 1005-1620]). Compared with gold-standard assessments, usual screening practice demonstrated 79% sensitivity and 82% specificity for detection of a stroke-related field defect, compared with 79% sensitivity and 88% specificity for the StrokeVision digital assessment. Conclusion: StrokeVision shows promise as a screening tool for visual complications in the acute phase of stroke. The app is at least as good as usual screening and offers other functionality that may make it attractive for use in acute stroke.
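    The reported accuracy figures follow the standard definitions of sensitivity and specificity against a gold-standard reference. As a quick reference (the counts below are made up for illustration and are not the study's data):

```python
def sensitivity(tp, fn):
    # Proportion of true field defects the screening test detects.
    return tp / (tp + fn)

def specificity(tn, fp):
    # Proportion of defect-free patients correctly screened negative.
    return tn / (tn + fp)

# Illustrative counts only (not from the study):
# 19 of 24 true defects detected, 21 of 24 defect-free patients cleared.
print(round(sensitivity(19, 5), 2))   # 0.79
print(round(specificity(21, 3), 2))   # 0.88
```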