
    Exploratory study to explore the role of ICT in the process of knowledge management in an Indian business environment

    In the 21st century, with the emergence of a digital economy, knowledge and the knowledge-based economy are growing rapidly. Understanding the processes involved in creating, managing and sharing knowledge in the business environment is critical to the success of an organization. This study builds on the authors' previous research on the enablers of knowledge management by identifying the relationship between those enablers and the role played by information and communication technologies (ICT) and ICT infrastructure in a business setting. This paper presents the findings of a survey conducted in four major Indian cities (Chennai, Coimbatore, Madurai and Villupuram) regarding views and opinions about the enablers of knowledge management in a business setting. A total of 80 organizations participated in the study, with 100 participants in each city. The results show that ICT and ICT infrastructure can play a critical role in creating, managing and sharing knowledge in an Indian business environment.

    The enablers and implementation model for mobile KMS in Australian healthcare

    In this research project, the enablers of implementing mobile KMS in Australian regional healthcare will be investigated, and a validated framework and guidelines to assist healthcare organizations in implementing mobile KMS will be proposed, using both qualitative and quantitative approaches. The outcomes of this study are expected to improve understanding of the enabling factors in implementing mobile KMS in Australian healthcare, as well as provide better guidelines for this process.

    An intrusion detection system on network security for web application

    Over the last 15 years, significant resources have been invested in enhancing security at the system and network levels with technologies such as firewalls, IDS and anti-virus software, and IT infrastructure has become more secure than ever before. As an ever-increasing number of businesses move to take advantage of the Internet, web applications are becoming more prevalent and increasingly sophisticated, and as such they are critical to almost all major online businesses. The very nature of web applications, their ability to collect, process and disseminate information over the Internet, exposes them to malicious hackers. However, traditional security solutions such as firewalls and network and host IDS do not provide comprehensive protection against the attacks common in web applications. This thesis concentrates on the research of an advanced intrusion detection framework. An intrusion detection framework was designed that works alongside any custom web application to collect and analyze HTTP traffic with various advanced algorithms. Two intrusion detection algorithms are tested and adopted in the framework. Pattern matching is the most popular intrusion detection technology, adopted by most commercial intrusion detection systems. Behavior modeling is a newer technology that can dynamically adapt the detection algorithms in accordance with application behavior. The combination of the two detection technologies dramatically reduced both false positive and false negative alarms. Moreover, a Servlet filter-based Web Agent is used to capture HTTP requests, an isolated Response Module executes pre-defined actions according to the analysis results, and a database provides persistence support for the framework. Several simulation experiments were also developed to evaluate the framework's detection capability.
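    The abstract names the two detection techniques but gives no implementation detail. As a minimal sketch of how signature-based pattern matching and a learned behavior model might be combined to vet HTTP request parameters, consider the Python fragment below; the signature list, the length-based behavior statistics, and the combination policy are all illustrative assumptions, not the thesis's actual algorithms.

```python
import re
from collections import defaultdict

# Hypothetical signature set: regexes for common web attacks (illustrative only).
SIGNATURES = [
    re.compile(r"(?i)<script\b"),           # reflected XSS attempt
    re.compile(r"(?i)\bunion\s+select\b"),  # SQL injection attempt
    re.compile(r"\.\./"),                   # path traversal attempt
]

class BehaviorModel:
    """Learns per-parameter value-length statistics from clean traffic,
    then flags requests whose parameter lengths deviate sharply."""

    def __init__(self):
        self.stats = defaultdict(lambda: {"n": 0, "mean": 0.0, "m2": 0.0})

    def train(self, params):
        for name, value in params.items():
            s = self.stats[name]
            s["n"] += 1
            delta = len(value) - s["mean"]
            s["mean"] += delta / s["n"]  # Welford's online mean/variance
            s["m2"] += delta * (len(value) - s["mean"])

    def is_anomalous(self, params, z_threshold=4.0):
        for name, value in params.items():
            s = self.stats.get(name)
            if not s or s["n"] < 2:
                continue  # unseen parameter: defer to signatures
            std = (s["m2"] / (s["n"] - 1)) ** 0.5 or 1.0
            if abs(len(value) - s["mean"]) / std > z_threshold:
                return True
        return False

def inspect(params, model):
    """One simple combination policy: report a request only if it is
    suspicious on both axes, trading false positives against negatives."""
    raw = "&".join(f"{k}={v}" for k, v in params.items())
    sig_hit = any(p.search(raw) for p in SIGNATURES)
    return sig_hit and model.is_anomalous(params)
```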

    ANCHOR: logically-centralized security for Software-Defined Networks

    While the centralization of SDN brought advantages such as a faster pace of innovation, it also disrupted some of the natural defenses of traditional architectures against different threats. The literature on SDN has mostly been concerned with the functional side, despite some specific works on non-functional properties like 'security' or 'dependability'. Though addressing the latter in an ad-hoc, piecemeal way may work, it will most likely lead to efficiency and effectiveness problems. We claim that the enforcement of non-functional properties as a pillar of SDN robustness calls for a systemic approach. As a general concept, we propose ANCHOR, a subsystem architecture that promotes the logical centralization of non-functional properties. To show the effectiveness of the concept, we focus on 'security' in this paper: we identify the current security gaps in SDNs and we populate the architecture middleware with the appropriate security mechanisms, in a global and consistent manner. Essential security mechanisms provided by ANCHOR include reliable entropy and resilient pseudo-random generators, and protocols for secure registration and association of SDN devices. We claim and justify in the paper that centralizing such mechanisms is key to their effectiveness, by allowing us to: define and enforce global policies for those properties; reduce the complexity of controllers and forwarding devices; ensure higher levels of robustness for critical services; foster interoperability of the non-functional property enforcement mechanisms; and promote the security and resilience of the architecture itself. We discuss design and implementation aspects, and we prove and evaluate our algorithms and mechanisms, including the formalisation of the main protocols and the verification of their core security properties using the Tamarin prover.
    Comment: 42 pages, 4 figures, 3 tables, 5 algorithms, 139 references
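    The abstract only names the registration and association protocols. The Python sketch below shows, at a very high level, what a logically centralized anchor handing out per-device and pairwise keys could look like; all names, message formats, and key-derivation choices are invented for illustration, and the paper's actual protocols are richer and formally verified with Tamarin.

```python
import hmac, hashlib, secrets

class Anchor:
    """Hypothetical sketch of a logically centralized root of trust,
    not the paper's protocol: it registers devices with fresh secrets
    and derives pairwise association keys bound to both identities."""

    def __init__(self):
        self.device_keys = {}  # device_id -> per-device registration key

    def register(self, device_id: str) -> bytes:
        # Mint a fresh per-device secret at registration time; in practice
        # this would be delivered over a protected bootstrap channel.
        key = secrets.token_bytes(32)
        self.device_keys[device_id] = key
        return key

    def associate(self, dev_a: str, dev_b: str) -> bytes:
        # Derive a pairwise association key from a fresh seed, bound to
        # both device identities so it cannot be confused across pairs.
        seed = secrets.token_bytes(32)
        k_ab = hmac.new(seed, f"{dev_a}|{dev_b}".encode(), hashlib.sha256).digest()
        # Each device would receive k_ab confidentially under its own
        # registration key; here we only authenticate the hand-off.
        for dev in (dev_a, dev_b):
            tag = hmac.new(self.device_keys[dev], k_ab, hashlib.sha256).hexdigest()
            print(f"anchor -> {dev}: k_ab (over protected channel), tag {tag[:16]}...")
        return k_ab

anchor = Anchor()
anchor.register("controller-1")
anchor.register("switch-7")
anchor.associate("controller-1", "switch-7")
```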

    A Covert Channel Using Named Resources

    A network covert channel is created that uses resource names such as addresses to convey information, and that approximates typical user behavior in order to blend in with its environment. The channel correlates available resource names with a user-defined code-space, and transmits its covert message by selectively accessing resources associated with the message codes. In this paper we focus on an implementation of the channel using the Hypertext Transfer Protocol (HTTP) with Uniform Resource Locators (URLs) as the message names, though the system can be used in conjunction with a variety of protocols. The covert channel does not modify expected protocol structure as might be detected by simple inspection, and our HTTP implementation emulates transaction-level web user behavior in order to avoid detection by statistical or behavioral analysis.
    Comment: 9 pages
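    To make the encoding idea concrete, here is a loose Python sketch of mapping a shared URL list onto a code-space and transmitting a bit string by selecting which URLs to access. The round-robin codebook, the binary alphabet, and the example.com URLs are placeholders assumed for illustration; the paper's implementation additionally shapes request timing to mimic real users.

```python
def build_codebook(urls, symbols="01"):
    """Deterministically assign each shared URL to a code symbol so that
    sender and observer derive the same mapping from the same URL list."""
    book = {s: [] for s in symbols}
    for i, url in enumerate(sorted(urls)):
        book[symbols[i % len(symbols)]].append(url)
    return book

def encode(message_bits, book):
    """Pick, for each bit, a URL assigned to that bit; a real sender would
    then fetch these URLs with human-like pacing to blend in."""
    picks = []
    for i, bit in enumerate(message_bits):
        bucket = book[bit]
        picks.append(bucket[i % len(bucket)])  # rotate to avoid reuse patterns
    return picks

def decode(observed_urls, book):
    """The observer inverts the mapping from the URLs it sees accessed."""
    inverse = {u: s for s, urls in book.items() for u in urls}
    return "".join(inverse[u] for u in observed_urls)

# Placeholder resource names; any mutually known URL set would do.
urls = [f"https://example.com/page{i}" for i in range(16)]
book = build_codebook(urls)
sent = encode("1011", book)
assert decode(sent, book) == "1011"
```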

    Full Reference Objective Quality Assessment for Reconstructed Background Images

    With increased interest in applications that require a clean background image, such as video surveillance, object tracking, street-view imaging and location-based services on web-based maps, multiple algorithms have been developed to reconstruct a background image from cluttered scenes. Traditionally, statistical measures and existing image quality techniques have been applied to evaluate the quality of the reconstructed background images. Though these quality assessment methods have been widely used in the past, their performance in evaluating the perceived quality of the reconstructed background image has not been verified. In this work, we discuss the shortcomings of existing metrics and propose a full-reference Reconstructed Background image Quality Index (RBQI) that combines color and structural information at multiple scales using a probability summation model to predict the perceived quality of a reconstructed background image given a reference image. To compare the performance of the proposed quality index with existing image quality assessment measures, we construct two datasets consisting of reconstructed background images and corresponding subjective scores. The quality assessment measures are evaluated by correlating their objective scores with human subjective ratings. The correlation results show that the proposed RBQI outperforms all existing approaches. Additionally, the constructed datasets and the corresponding subjective scores provide a benchmark for evaluating future metrics developed to assess the perceived quality of reconstructed background images.
    Comment: Associated source code: https://github.com/ashrotre/RBQI; associated database: https://drive.google.com/drive/folders/1bg8YRPIBcxpKIF9BIPisULPBPcA5x-Bk?usp=sharing (email ashrotre@asu.edu for permissions)
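    The abstract describes the pooling strategy only at a high level. As a loose sketch under stated assumptions, the Python fragment below pools per-pixel differences across scales with a Minkowski-style probability summation; the grayscale difference term, the number of scales, and the exponent beta are placeholders, not the actual color and structure features RBQI uses (the authors' real index is in the linked repository).

```python
import numpy as np

def downsample(img):
    """Halve resolution by 2x2 averaging (a stand-in for a proper pyramid)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w]
    return 0.25 * (img[0::2, 0::2] + img[1::2, 0::2]
                   + img[0::2, 1::2] + img[1::2, 1::2])

def probability_summation_index(ref, rec, scales=3, beta=3.0):
    """Pool per-pixel differences across scales with the probability
    summation P = 1 - exp(-sum |d|^beta); beta and scales are illustrative."""
    ref = np.asarray(ref, dtype=float) / 255.0
    rec = np.asarray(rec, dtype=float) / 255.0
    total = 0.0
    for _ in range(scales):
        total += np.mean(np.abs(ref - rec) ** beta)
        ref, rec = downsample(ref), downsample(rec)
    p_detect = 1.0 - np.exp(-total)  # probability a difference is visible
    return 1.0 - p_detect            # higher index = better reconstruction
```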