
    Innovation and Employability in Knowledge Management Curriculum Design

    During 2007/8, Southampton Solent University worked on a Leadership Foundation project focused on the utility of the multi-functional team approach as a vehicle to deliver innovation in strategic and operational terms in higher education (HE). The Task-Orientated Multi-Functional Team Approach (TOMFTA) project took two significant undertakings for Southampton Solent as key areas for investigation, one academic and one administrative in focus. The academic project was the development of an innovative and novel degree programme in knowledge management (KM). The new KM Honours degree programme is timely both in its recognition of the increasing importance to organisations of knowledge as a commodity and in its adoption of a distinctive structure and pedagogy. The methodology for the KM curriculum design brings together student-centred and market-driven approaches: it positions the programme around the interests of students and the requirements of employers, rather than just the capabilities of staff, while exploring ways that courses can be delivered more flexibly, e.g. in accelerated and block modes, with level-differentiated activities, common cross-year content, and material that can be reused in short courses. To provide shared context across levels, a graduate skills strand is taught separately as part of the University’s business-facing education agenda. The KM portfolio offers a programme of practically based courses integrating key themes in knowledge management, business, information distribution and development of the media. These courses develop problem-solving, communication, teamwork and other employability skills, as well as the domain skills needed for emerging information management technologies. The new courses are built on activities that focus on different aspects of KM, drawing on existing content as a knowledge base. This paper presents the ongoing development of the KM programme through the key aspects of its conception and design.

    Stochastic Data Clustering

    In 1961 Herbert Simon and Albert Ando published the theory behind the long-term behavior of a dynamical system that can be described by a nearly uncoupled matrix. Over the past fifty years this theory has been used in a variety of contexts, including queueing theory, brain organization, and ecology. In all these applications, the structure of the system is known and the point of interest is the various stages the system passes through on its way to some long-term equilibrium. This paper looks at this problem from the other direction. That is, we develop a technique for using the evolution of the system to tell us about its initial structure, and we use this technique to develop a new algorithm for data clustering.
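    The paper develops the algorithm rigorously; as a rough illustration of the idea only, the sketch below builds a row-stochastic similarity matrix over the data, evolves it, and groups points whose rows have merged during the metastable (Simon-Ando) phase before global equilibrium. The Gaussian kernel, tolerance, and merging test are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def _merge_rows(Pk, tol):
    """Label rows of Pk that have become numerically indistinguishable."""
    n = len(Pk)
    labels = -np.ones(n, dtype=int)
    groups = 0
    for i in range(n):
        if labels[i] >= 0:
            continue
        labels[i] = groups
        for j in range(i + 1, n):
            if labels[j] < 0 and np.abs(Pk[i] - Pk[j]).max() < tol:
                labels[j] = groups
        groups += 1
    return labels, groups

def stochastic_cluster(X, sigma=1.0, tol=1e-3, max_steps=100):
    """Cluster rows of X by watching a row-stochastic similarity matrix evolve.

    Sketch of the Simon-Ando idea: rows belonging to the same nearly
    uncoupled block merge during a long metastable phase before the chain
    reaches global equilibrium, so we keep the non-trivial grouping that
    persists for the largest number of steps.
    """
    # Row-stochastic matrix from a Gaussian similarity kernel (an assumed
    # choice, not the paper's construction).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    P = np.exp(-d2 / (2.0 * sigma ** 2))
    P /= P.sum(axis=1, keepdims=True)

    Pk = P.copy()
    best = (0, np.arange(len(X)))            # (persistence, labels) fallback
    prev_groups, run = None, 0
    for _ in range(max_steps):
        Pk = Pk @ P                          # evolve the system one step
        labels, groups = _merge_rows(Pk, tol)
        run = run + 1 if groups == prev_groups else 1
        if 1 < groups < len(X) and run > best[0]:
            best = (run, labels)
        prev_groups = groups
    return best[1]

# Toy check: two well-separated blobs should form the most persistent grouping.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
print(stochastic_cluster(X))
```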

    Mapping National Innovation Systems in the OECD Area

    The purpose of this paper is to present new findings about the structure and the organization of innovative activities in selected OECD countries. Using the approach of national systems of innovation as a conceptual framework and applying multivariate data analysis techniques, this paper aims to add new insights into the specific structures of the eighteen national systems of innovation under study. A central result of this comparative study is a categorisation of national systems of innovation into different clusters, with each cluster representing distinctive cross-national structural similarities. By accounting for sectoral specifics, the commonly taken perspective on national innovation systems is extended, yielding a more precise picture of the structural composition of the analyzed national innovation systems. A new linkage between the two approaches of national and sectoral innovation systems is also created.

    Keywords: national innovation systems, comparative study, classification
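    The abstract does not specify which multivariate techniques were applied, so the snippet below is only a generic illustration of how countries might be grouped into clusters from standardized innovation indicators using Ward's hierarchical clustering; the country labels and indicator values are invented purely for the example.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# Hypothetical indicator matrix: rows are countries, columns are innovation
# indicators (e.g. R&D intensity, patents per capita, high-tech export share).
countries = ["A", "B", "C", "D", "E", "F"]
indicators = np.array([
    [3.1, 250, 0.22],
    [2.9, 230, 0.20],
    [1.2,  60, 0.08],
    [1.0,  55, 0.09],
    [2.0, 140, 0.15],
    [2.2, 150, 0.14],
])

# Standardize each indicator, apply Ward's hierarchical clustering, and cut
# the dendrogram into a chosen number of clusters.
Z = linkage(zscore(indicators, axis=0), method="ward")
labels = fcluster(Z, t=3, criterion="maxclust")
for country, label in zip(countries, labels):
    print(country, label)
```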

    The artificial retina processor for track reconstruction at the LHC crossing rate

    We present results of an R&D study for a specialized processor capable of precisely reconstructing, in pixel detectors, hundreds of charged-particle tracks from high-energy collisions at a 40 MHz rate. We apply a highly parallel pattern-recognition algorithm, inspired by studies of how the brain processes visual images, and describe in detail an efficient hardware implementation in high-speed, high-bandwidth FPGA devices. This is the first detailed demonstration of reconstruction of offline-quality tracks at 40 MHz, and it makes the device suitable for processing Large Hadron Collider events at the full crossing frequency.
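    As a much-simplified illustration of the retina idea (not the paper's FPGA design), the sketch below maps each cell of a discretized track-parameter grid to a straight-line track hypothesis, accumulates Gaussian weights of the hit residuals as the cell's response, and reads off track candidates as maxima of the response map. The 2D straight-track model, grid ranges, and sigma are assumptions made for the example.

```python
import numpy as np

def retina_response(hits, m_grid, q_grid, sigma=0.5):
    """Artificial-retina-style response map for straight 2D tracks x = m*z + q.

    'hits' is an array of (z, x) points; each grid cell (m, q) is a track
    hypothesis whose response is the sum of Gaussian weights of the residuals
    between that hypothesis and the hits.
    """
    z, x = hits[:, 0], hits[:, 1]
    M, Q = np.meshgrid(m_grid, q_grid, indexing="ij")
    # Residual of every hit w.r.t. every (m, q) hypothesis: shape (Nm, Nq, Nhits).
    resid = x[None, None, :] - (M[..., None] * z[None, None, :] + Q[..., None])
    return np.exp(-resid ** 2 / (2 * sigma ** 2)).sum(axis=-1)

# Toy event: two straight tracks plus a few noise hits, layers at z = 1..6.
rng = np.random.default_rng(1)
z_layers = np.arange(1, 7, dtype=float)
track1 = np.column_stack([z_layers, 0.8 * z_layers + 1.0 + rng.normal(0, 0.1, 6)])
track2 = np.column_stack([z_layers, -0.3 * z_layers + 4.0 + rng.normal(0, 0.1, 6)])
noise = np.column_stack([rng.uniform(1, 6, 4), rng.uniform(-2, 8, 4)])
hits = np.vstack([track1, track2, noise])

m_grid = np.linspace(-1.5, 1.5, 61)
q_grid = np.linspace(-2.0, 8.0, 101)
R = retina_response(hits, m_grid, q_grid)

# Track candidates are local maxima of the response map; here we just report
# the strongest cell as a sanity check.
i, j = np.unravel_index(R.argmax(), R.shape)
print(f"strongest cell: m ~ {m_grid[i]:.2f}, q ~ {q_grid[j]:.2f}")
```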