4,974 research outputs found

    Knowledge based economy –technological perspective: implications and solutions for agility improvement and innovation achievement in higher education

    Nowadays, universities, as driving forces of an innovative economy and as components of a modern society based on knowledge and collaboration, face a number of challenges and difficulties. To overcome them and to establish the foundations of eScience education and research activities, universities have to change culturally, strategically, and operationally. The paper highlights the need for ICT (Information and Communications Technology) use and its implications for higher education. In addition, the study places the theoretical aspects in a specific context, combining technologies through interfunctionality to ensure agility and innovation in academic education. This involves the use of knowledge and process management, service-oriented architectures, and Cloud solutions, exemplified by the case of the Academy of Economic Studies (AES), Bucharest. The integrated approach is extended using the SharePoint 2010 platform to improve academic management and to harmonize teaching, research, and development content and methods with European Union standards. The platform has been implemented and tested within two AES departments and the Master's Degree Studies in Computer Economics. The results have encouraged the integration of the proposed solution within the institution. The study was based on the authors' competences in the areas addressed, joined with a rigorous analysis of technology trends and of the outputs of universities in various EU countries (Italy, Germany, France, Belgium, the Netherlands, etc.) regarding the implications of the knowledge economy for higher education studies in economics.
    Keywords: knowledge-based economy, information and communications technology, university studies in economics, university management, agility, innovation, SharePoint 2010

    Semantic discovery and reuse of business process patterns

    Patterns currently play an important role in modern information systems (IS) development, and their use has mainly been restricted to the design and implementation phases of the development lifecycle. Given the increasing significance of business modelling in IS development, patterns have the potential to provide a viable solution for promoting the reusability of recurrent generalized models in the very early stages of development. As a statement of research in progress, this paper focuses on business process patterns and proposes an initial methodological framework for the discovery and reuse of business process patterns within the IS development lifecycle. The framework borrows ideas from the domain engineering literature and proposes the use of semantics to drive both the discovery of patterns and their reuse.

    ERP implementation methodologies and frameworks: a literature review

    Enterprise Resource Planning (ERP) implementation is a complex and dynamic process, one that involves a combination of technological and organizational interactions. Often an ERP implementation project is the single largest IT project that an organization has ever launched, and it requires a mutual fit of system and organization. Moreover, the concept of an ERP implementation supporting business processes across many different departments is not a generic, rigid, and uniform one; it depends on a variety of factors. As a result, the issues surrounding the ERP implementation process have been among the major concerns in industry, and ERP implementation receives attention from both practitioners and scholars; business as well as academic literature is abundant, and not always conclusive or coherent. However, research on ERP systems has so far focused mainly on diffusion, use, and impact issues. Less attention has been given to the methods used during the configuration and implementation of ERP systems: even though they are commonly used in practice, they remain largely unexplored and undocumented in Information Systems research. The academic relevance of this research is thus its contribution to the existing body of scientific knowledge. An annotated brief literature review is conducted to evaluate the current state of the academic literature. The purpose is to present a systematic overview of relevant ERP implementation methodologies and frameworks, with the aim of achieving a better taxonomy of ERP implementation methodologies. This paper is useful to researchers interested in ERP implementation methodologies and frameworks; the results will serve as input for a classification of the existing methodologies and frameworks. The paper also addresses the professional ERP community involved in the process of ERP implementation, promoting a better understanding of ERP implementation methodologies and frameworks, their variety, and their history.

    Cloud based testing of business applications and web services

    This paper deals with the testing of applications based on the principles of cloud computing. It aims to describe the options for testing business software in clouds (cloud testing). It identifies the needs for cloud testing tools, including multi-layer testing, service level agreement (SLA) based testing, large-scale simulation, and on-demand test environments. In a cloud-based model, ICT services are distributed and accessed over networks such as an intranet or the internet; large data centers deliver resources on demand as a service, eliminating the need for investments in specific hardware, software, or data center infrastructure. Businesses can apply these new technologies in the context of intellectual capital management to lower costs and increase competitiveness and earnings. Based on a comparison of testing tools and techniques, the paper further investigates future trends in the research and development of cloud-based testing tools. It is also worth noting that such a comparison and classification of testing tools covers a new area and has not been done before.
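Of the tool needs listed in the abstract, SLA-based testing is the most directly mechanizable. A minimal sketch in Python, assuming a latency-only SLA; the threshold, run count, and function name are illustrative assumptions, not details from the paper:

```python
import time

def check_sla(call, max_latency_s=0.5, runs=10):
    """Invoke a service `runs` times and verify every response
    arrives within the agreed latency bound (a latency-only SLA).
    Returns (passed, worst_observed_latency_in_seconds)."""
    latencies = []
    for _ in range(runs):
        start = time.perf_counter()
        call()                      # the service operation under test
        latencies.append(time.perf_counter() - start)
    worst = max(latencies)
    return worst <= max_latency_s, worst

# Example: a stubbed service call that always meets the SLA.
ok, worst = check_sla(lambda: None, max_latency_s=0.5)
```

In a real cloud testing tool the stub would be replaced by a call to the deployed service endpoint, and the bound would come from the SLA document rather than a hard-coded default.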

    A Governance Reference Model for Service-Oriented Architecture-Based Common Data Initialization: A Case Study of Military Simulation Federation Systems

    Military simulation and command and control federations have become large, complex distributed systems that integrate with a variety of legacy and current simulations, and with real command and control systems, both locally and globally. As these systems become increasingly complex, so does the data that initializes them. This increased complexity has introduced a major problem in data initialization coordination, which has been handled by many organizations in various ways. Service-oriented architecture (SOA) solutions have been introduced to promote easier data interoperability through the use of standards-based reusable services and common infrastructure. However, current SOA-based solutions do not incorporate formal governance techniques to drive the architecture in providing reliable, consistent, and timely information exchange. This dissertation identifies the need to establish governance for the oversight of common data initialization service development, presents current research and applicable solutions that address some aspects of SOA-based federation data service governance, and proposes a governance reference model for the development of SOA-based common data initialization services in military simulation and command and control federations.

    Surface Warfare Center Contributions for Addressing Warfare System Development Challenges and Goals

    Proceedings Paper (for Acquisition Research Program). The size, interdependencies, and complexity of Navy software-intensive warfare systems continue to increase rapidly. Numerous studies and reports indicate that the majority of DoD/Navy warfare system development efforts fail to consistently deliver high-quality software systems on schedule and within budget. This paper provides several examples of successful development efforts that utilized Naval Surface Warfare Center (NSWC) in-house expertise to deliver open architecture (OA)-based, multi-system and multi-platform capable software systems with reusable components. This paper also provides insight into how government in-house software expertise can be utilized to mitigate many of the documented software system acquisition challenges that prevent the successful development and delivery of high-quality software systems on schedule and within budget. Acquisition Research Program. Approved for public release; distribution is unlimited

    AEGIS Platforms: The Potential Impact of Open Architecture in Sustaining Engineering

    Sponsored Report (for Acquisition Research Program). This proof-of-concept case study analyzes the potential benefits of open architecture (OA) in the AEGIS software maintenance and upgrade process. In a multi-phased approach, the Knowledge Value Added/Real Options (KVA+RO) framework was applied to sustaining engineering on specific AEGIS software processes. Naval Postgraduate School Acquisition Research Program. Approved for public release; distribution is unlimited

    Dynamic Vision Sensor integration on FPGA-based CNN accelerators for high-speed visual classification

    Deep learning is a cutting-edge approach that is being applied in many fields. For vision applications, Convolutional Neural Networks (CNNs) achieve remarkable accuracy on classification tasks. Numerous hardware accelerators have appeared in recent years to improve on CPU- or GPU-based solutions. This technology is commonly prototyped and tested on FPGAs before being considered for ASIC fabrication for mass production. The use of typical commercial cameras (30 fps) limits the capabilities of these systems for high-speed applications. Dynamic vision sensors (DVS), which emulate the behavior of a biological retina, are gaining importance for such applications because of their nature: the information is represented by a continuous stream of spikes, and the frames to be processed by the CNN are constructed by collecting a fixed number of these spikes (called events). The faster an object moves, the more events the DVS produces, and so the higher the equivalent frame rate. Therefore, using a DVS allows frames to be computed at the maximum speed a CNN accelerator can offer. In this paper we present a VHDL/HLS description of a pipelined FPGA design able to collect events from an Address-Event Representation (AER) DVS retina and to obtain a normalized histogram to be used by a particular CNN accelerator, called NullHop. VHDL is used to describe the circuit, and HLS for the computation blocks that perform the normalization of a frame needed by the CNN. The results outperform previous implementations of frame collection and normalization using ARM processors running at 800 MHz on a Zynq-7100, in both latency and power consumption. A measured 67% speedup factor is presented for a Roshambo CNN real-time experiment running at a 160 fps peak rate.
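The event-to-frame construction the abstract describes (collect a fixed number of AER events, bin them per pixel, normalize the histogram) can be sketched in software. The sensor resolution, event count per frame, and function name below are illustrative assumptions, not parameters of the paper's VHDL/HLS pipeline:

```python
import numpy as np

def events_to_frame(events, width=64, height=64, n_events=2048):
    """Collect up to a fixed number of AER events (x, y pixel
    addresses) into a 2D histogram and normalize it to [0, 1]
    for use as a CNN input frame."""
    frame = np.zeros((height, width), dtype=np.float32)
    for i, (x, y) in enumerate(events):
        if i >= n_events:          # a frame closes after a fixed event count
            break
        frame[y, x] += 1.0         # each event increments its pixel bin
    peak = frame.max()
    if peak > 0:
        frame /= peak              # peak normalization: busiest pixel -> 1.0
    return frame
```

In the hardware design this accumulation and normalization run in the FPGA fabric at event rate; the sketch only mirrors the data flow: fixed-count event collection, per-pixel binning, and normalization.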