    Army Information Technology Procurement: a Business Process Analysis

    This thesis presents a business process analysis of the Army's ICT procurement system. The research identifies several inefficiencies and proposes potential solutions. The contributions of this research include a unified taxonomy, a method to prioritize requests, and system architecture products for the development of an automated and sustainable collaboration interface for the CIO/G6 to streamline their IT acquisition process. Development of a centralized system would reduce waste in the request process from submission to formal accounting, hasten the movement of requests between stakeholders, maintain a digital signature authorization for each approval authority, provide a reporting database to recognize reprogramming thresholds, and deliver relevant metrics and analysis to help inform the Army's IT resourcing decisions.
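
    The abstract gives no implementation details; purely as an illustrative aside (not taken from the thesis), the Python sketch below shows one way a centralized request record with digital-signature approvals and a simple prioritization score could be modelled. All field names, the status workflow, and the scoring weights are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum


class Status(Enum):
    SUBMITTED = "submitted"
    UNDER_REVIEW = "under_review"
    APPROVED = "approved"
    ACCOUNTED = "accounted"


@dataclass
class ITRequest:
    request_id: str
    cost_estimate: float          # requested amount in dollars
    mission_impact: int           # 1 (low) to 5 (high), hypothetical scale
    urgency: int                  # 1 (low) to 5 (high), hypothetical scale
    status: Status = Status.SUBMITTED
    approvals: list[str] = field(default_factory=list)   # digital-signature IDs
    submitted_at: datetime = field(default_factory=datetime.utcnow)

    def priority_score(self) -> float:
        """Toy weighted score for ordering requests; the weights are illustrative."""
        return 0.6 * self.mission_impact + 0.4 * self.urgency

    def record_approval(self, signature_id: str) -> None:
        """Append an approval authority's signature and advance the workflow."""
        self.approvals.append(signature_id)
        self.status = Status.APPROVED
```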

    The RAGE Game Software Components Repository for Supporting Applied Game Development

    This paper presents the architecture of the RAGE repository, a unique and dedicated infrastructure that provides access to a wide variety of advanced technology components for applied game development. The RAGE project, the principal Horizon 2020 research and innovation project on applied gaming, is developing up to three dozen software components (RAGE software assets) that are reusable across a wide diversity of game engines, game platforms, and programming languages. The RAGE repository provides storage space for assets and their artefacts and is designed as an asset life-cycle management system that supports defining, publishing, updating, searching, and packaging these assets for distribution. It will be embedded in a social platform for asset developers and other users. A dedicated Asset Repository Manager provides the main functionality of the repository and its integration with other systems. Tools supporting the Asset Manager are presented and discussed. When the RAGE repository is in full operation, applied game developers will be able to easily enhance the quality of their games by including selected advanced game software assets. Making the RAGE repository system and its variety of software assets available aims to enhance the coherence and decisiveness of the applied game industry.
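
    As an illustrative aside, and not the RAGE project's actual API, the sketch below mimics the asset life-cycle operations the abstract lists (define, publish, update, search, package for distribution) over a toy in-memory store. All class, method, and field names are hypothetical.

```python
from dataclasses import dataclass, field


@dataclass
class GameAsset:
    """Minimal metadata record for a reusable game software component."""
    asset_id: str
    title: str
    version: str
    language: str                                         # e.g. "C#", "Java"
    artefacts: list[str] = field(default_factory=list)    # binaries, docs, source
    published: bool = False


class AssetRepository:
    """Toy in-memory stand-in for the repository's life-cycle operations."""

    def __init__(self) -> None:
        self._assets: dict[str, GameAsset] = {}

    def define(self, asset: GameAsset) -> None:
        self._assets[asset.asset_id] = asset

    def publish(self, asset_id: str) -> None:
        self._assets[asset_id].published = True

    def update(self, asset_id: str, version: str, artefacts: list[str]) -> None:
        asset = self._assets[asset_id]
        asset.version, asset.artefacts = version, artefacts

    def search(self, keyword: str) -> list[GameAsset]:
        return [a for a in self._assets.values()
                if a.published and keyword.lower() in a.title.lower()]

    def package(self, asset_id: str) -> dict:
        """Bundle metadata and artefact list for distribution (e.g. as JSON)."""
        a = self._assets[asset_id]
        return {"id": a.asset_id, "version": a.version, "artefacts": a.artefacts}
```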

    Managing large amounts of data generated by a Smart City internet of things deployment

    The Smart City concept is being developed along many different axes, encompassing multiple areas of the social and technical sciences. Common to all these approaches, however, is the central role played by the capacity to share information. Hence, Information and Communication Technologies (ICT) are seen as key enablers for the transformation of urban regions into Smart Cities. Two of these technologies, namely the Internet of Things and Big Data, occupy a predominant position among them. The capacity to "sense the city", access all this information, and provide added-value services based on the knowledge derived from it is critical to achieving the Smart City vision. This paper reports on the specification and implementation of a software platform enabling the management and exposure of the large amount of information that is continuously generated by the IoT deployment in the city of Santander. This work has been partially funded by the research project SmartSantander, under FP7-ICT-2009-5 of the 7th Framework Programme of the European Community. The authors would also like to express their gratitude to the Spanish government for funding the project "Connectivity as a Service: Access for the Internet of the Future", COSAIF (TEC2012-38574-C02-01).
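
    The abstract does not describe the platform's interfaces; the following minimal sketch, assuming a simple observation model (sensor id, phenomenon, value, timestamp), only illustrates the kind of ingest, expose, and aggregate operations such a platform provides. The names and structures are hypothetical, and a real deployment would sit on a scalable database rather than an in-memory store.

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime
from statistics import mean


@dataclass
class Observation:
    sensor_id: str
    phenomenon: str      # e.g. "temperature", "noise", "parking_occupancy"
    value: float
    timestamp: datetime


class ObservationStore:
    """Toy in-memory store illustrating ingest/expose/aggregate operations."""

    def __init__(self) -> None:
        self._by_phenomenon: dict[str, list[Observation]] = defaultdict(list)

    def ingest(self, obs: Observation) -> None:
        self._by_phenomenon[obs.phenomenon].append(obs)

    def latest(self, phenomenon: str, limit: int = 10) -> list[Observation]:
        """Expose the most recent readings, e.g. behind a REST endpoint."""
        readings = sorted(self._by_phenomenon[phenomenon],
                          key=lambda o: o.timestamp, reverse=True)
        return readings[:limit]

    def average(self, phenomenon: str) -> float:
        """Simple added-value service: aggregate over all stored readings."""
        readings = self._by_phenomenon[phenomenon]
        return mean(o.value for o in readings) if readings else float("nan")
```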

    Knowledge management initiative in Universiti Putra Malaysia (UPM)

    UPM realizes that its organizational knowledge, which resides in individuals' minds or is stored in organizational processes, products, facilities, systems, and documentation, is quickly becoming a sustainable competitive advantage. This growing attention has led to the idea that these resources must be protected, cultivated, and shared among its members. The Knowledge Management Centre (KMC) in UPM was established in 2002. Based on the vision and mission of UPM and KMC, five critical management areas have been identified: Infrastructure, Knowledge Repository, Marketing and Customer Service, Intellectual Property Rights (IPR), and Knowledge Management Research and Development. This paper discusses the knowledge framework adapted by KMC. These key elements represent the building blocks in implementing the Knowledge Management System (KMS) for sustaining and extending the knowledge-sharing culture in UPM. An overview of the technologies used in KMS components is also discussed, covering their actual and potential contribution to the KMS process in UPM.

    Exploring Design Requirements of Fleet Telematics Systems Supporting Road Freight Transportation: A Digital Service Side Perspective

    Road freight operators (RFOs) optimize their fleet management processes using fleet telematics systems (FTSs). The selection of FTSs by RFOs is driven by transport specifications from the customer side, leading to substantial search costs. Moreover, FTSs vary significantly in the design requirements they address to assist road freight operations. We therefore analyze 74 web pages from FTSs of existing telematics vendors to elicit 31 design requirements (DRs), which we aggregate into nine requirement sets (RSs). Subsequently, 42 practitioners from five digital road freight service enterprises experienced in using FTSs validate the DRs and evaluate their importance within the RSs following the Analytical Hierarchy Process (AHP) method. The results reveal that DRs and RSs promoting driver monitoring and IT integration are perceived as more important than items promoting fleet and logistics support. Our contribution sheds light on an emerging topic in logistics and establishes a knowledge base that guides the design of future FTSs.
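
    To make the AHP step concrete, the worked sketch below computes priority weights and a consistency ratio for a hypothetical 3x3 pairwise comparison matrix over three requirement sets. The comparison values are invented for illustration only and are not the paper's data; the geometric-mean (row) method is a standard approximation of the AHP priority vector.

```python
from math import prod

# Hypothetical pairwise comparison matrix for three requirement sets
# (driver monitoring, IT integration, fleet/logistics support) on Saaty's 1-9 scale.
A = [
    [1.0, 2.0, 5.0],
    [1 / 2, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]

n = len(A)

# Approximate the AHP priority vector with the geometric-mean (row) method.
geo_means = [prod(row) ** (1.0 / n) for row in A]
weights = [g / sum(geo_means) for g in geo_means]

# Consistency check: lambda_max from A.w, then CI and CR (random index RI = 0.58 for n = 3).
Aw = [sum(A[i][j] * weights[j] for j in range(n)) for i in range(n)]
lambda_max = sum(Aw[i] / weights[i] for i in range(n)) / n
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.58

print("priorities:", [round(w, 3) for w in weights])
print("consistency ratio:", round(CR, 3))   # CR below 0.1 is conventionally acceptable
```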

    Keeping Research Data Safe 2: Final Report

    The first Keeping Research Data Safe study, funded by JISC, made a major contribution to the understanding of long-term preservation costs for research data by developing a cost model and identifying cost variables for preserving research data in UK universities (Beagrie et al, 2008). However, it was completed over a very constrained timescale of four months, with little opportunity to follow up other major issues or sources of preservation cost information it identified. It noted that digital preservation costs are notoriously difficult to address, in part because of the absence of good case studies and longitudinal information for digital preservation costs or cost variables. In January 2009 JISC issued an ITT for a study on the identification of long-lived digital datasets for the purposes of cost analysis. The aim of this work was to provide a larger body of material and evidence against which existing and future data preservation cost modelling exercises could be tested and validated. The proposal for the KRDS2 study was submitted in response by a consortium consisting of four partners involved in the original Keeping Research Data Safe study (the Universities of Cambridge and Southampton, Charles Beagrie Ltd, and OCLC Research) and four new partners with significant data collections and interests in preservation costs (the Archaeology Data Service, University of London Computer Centre, University of Oxford, and the UK Data Archive). A range of supplementary materials in support of this main report has been made available on the KRDS2 project website at http://www.beagrie.com/jisc.php. That website will be maintained and continuously updated with future work as a resource for KRDS users.

    Integration of Configurable Dynamic Notification System with CSIBER Website

    In this digital era, every academic institution and commercial organization invests enormously in hosting and maintaining a website, which plays a critical role in the success of the organization by making it reachable across a wide geographical area at any time. A carefully designed website reflects an institute's best assets and delivers a wealth of first-hand information to any user, at any time, irrespective of his/her geographical location. To stay relevant, there is a constant requirement to change the look and feel and the content of the website and to incorporate dynamism into it. Because the website is accessible to the public, it must be kept constantly updated. As new website data pertaining to event information, notifications, etc. is constantly generated and old data soon becomes obsolete, continuous manual effort is required to keep this dynamically changing data current and up-to-date. Automating such a task can save a tremendous amount of human effort and time, and enables meaningful data to be displayed on the website with very little human intervention. New technologies such as jQuery, JSON, and AngularJS, to name a few, are continuously emerging to facilitate this. In the current paper, the author proposes an algorithm for the integration of a dynamic notification system with the existing website of CSIBER. The algorithm is implemented in PHP and MySQL and hosted on a web server using the web hosting service availed by the organization. The dynamic module is scheduled to be executed periodically, on a daily basis, by the Cron utility, and a server-side include is dynamically created and embedded in the home page. Each month's events can be scheduled and stored in the backend database, which is parsed by the dynamic module to generate the required data. As a measure towards efficiency improvement, the tool is executed once per day instead of being executed for every user request. Two options are proposed for integration, one on the client side and one on the server side. The dialog displaying the notification data is rendered mobile-friendly and is validated against Google's mobile-friendly test.
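
    The paper's implementation is in PHP and MySQL; the Python sketch below is only a hedged illustration of the daily cron-driven step, reading event rows (stand-in data here instead of a MySQL query) and writing an HTML fragment that the home page can embed as a server-side include. The file name and the event schema are hypothetical.

```python
from datetime import date

# Stand-in for rows fetched from the events table in the backend database;
# a production version would query MySQL instead of using a hard-coded list.
events = [
    {"title": "Admission deadline", "starts": date(2024, 6, 1), "ends": date(2024, 6, 30)},
    {"title": "Annual tech fest",   "starts": date(2024, 6, 15), "ends": date(2024, 6, 17)},
]


def build_notification_fragment(today: date) -> str:
    """Render only the events that are current today as an HTML list fragment."""
    current = [e for e in events if e["starts"] <= today <= e["ends"]]
    items = "\n".join(f"  <li>{e['title']}</li>" for e in current)
    return f'<ul class="notifications">\n{items}\n</ul>\n'


if __name__ == "__main__":
    # Run once per day (e.g. scheduled by cron); the output file is embedded in
    # the home page as a server-side include. The path is hypothetical.
    with open("notifications.inc.html", "w", encoding="utf-8") as fragment:
        fragment.write(build_notification_fragment(date.today()))
```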