
    Security Analysis and Improvement Model for Web-based Applications

    Today the web has become a major conduit for information. As the World Wide Web's popularity continues to increase, information security on the web has become an increasing concern. Web information security concerns availability, confidentiality, and data integrity. According to reports from http://www.securityfocus.com in May 2006, operating systems account for 9% of vulnerabilities, web-based software systems for 61%, and other applications for 30%. In this dissertation, I present a security analysis model based on the Markov Process Model; risk analysis is conducted using fuzzy logic and information entropy theory. In a web-based application system, security risk depends mostly on the current states of the software and hardware systems and is independent of the system's past states, so web-based applications can be approximately modeled by a Markov Process Model. A web-based application can be conceptually expressed with the discrete state space (web_client_good; web_server_good, web_server_vulnerable, web_server_attacked, web_server_security_failed; database_server_good, database_server_vulnerable, database_server_attacked, database_server_security_failed) of a Markov chain. The dissertation analyzes vulnerable behavior and system response in web-based applications, focusing on availability-related measures: the probability of reaching a particular security-failed state and the mean time to security failure of a system. A vulnerability risk index classifies security into three levels (low, high, and failed). An illustrative application example is provided. As the second objective of this dissertation, I propose a security improvement model for web-based applications using GeoIP services and formal methods.
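The two availability measures named above (probability of reaching a failed state, mean time to security failure) are standard absorbing-Markov-chain quantities. The sketch below is illustrative only, with made-up transition probabilities rather than the dissertation's: the transient states good/vulnerable/attacked for one server, security_failed as the absorbing state, and the mean time to failure computed from the fundamental matrix.

```python
import numpy as np

# Hypothetical one-step transition probabilities over the transient
# states good, vulnerable, attacked (rows/columns 0-2); the probability
# mass missing from a row leaks into the absorbing security_failed state.
Q = np.array([
    [0.90, 0.08, 0.02],   # good -> good / vulnerable / attacked
    [0.30, 0.55, 0.15],   # vulnerable -> ...
    [0.10, 0.20, 0.60],   # attacked -> ... (remaining 0.10 -> failed)
])

# Fundamental matrix N = (I - Q)^-1 gives the expected number of visits
# to each transient state before absorption.
N = np.linalg.inv(np.eye(3) - Q)

# Mean time (in steps) to reach security_failed from each starting state.
mean_time_to_failure = N.sum(axis=1)
print(mean_time_to_failure)
```

With these example numbers a system starting in the good state survives longest, and one already under attack fails soonest, which is the qualitative behavior the risk index is meant to capture.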
In the security improvement model, web access is authenticated through role-based access control, using user logins, remote IP addresses, and physical locations as subject credentials combined with the requested objects and privilege modes. Access control algorithms are developed for subjects, objects, and access privileges, and a secure implementation architecture is presented. In summary, the dissertation develops a security analysis and improvement model for web-based applications. Future work will address validation of the Markov Process Model once security data collection becomes easier, and the security improvement model will be evaluated with respect to performance.
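The subject-credential check (role, remote IP, physical location) can be illustrated with a minimal sketch. The roles, networks, allowed locations, and function names below are assumptions for illustration, not the dissertation's actual access control algorithm.

```python
import ipaddress

# Illustrative role -> privilege mapping (assumed, not from the source).
ROLE_PERMISSIONS = {
    "admin":  {"read", "write", "delete"},
    "editor": {"read", "write"},
    "viewer": {"read"},
}

ALLOWED_NETWORKS = [ipaddress.ip_network("10.0.0.0/8"),
                    ipaddress.ip_network("192.168.0.0/16")]
ALLOWED_LOCATIONS = {"US", "CA"}  # e.g. resolved via a GeoIP lookup

def authorize(role, remote_ip, location, privilege):
    """Grant access only if all three subject credentials check out."""
    ip = ipaddress.ip_address(remote_ip)
    if not any(ip in net for net in ALLOWED_NETWORKS):
        return False          # request from outside the trusted networks
    if location not in ALLOWED_LOCATIONS:
        return False          # GeoIP-derived location not permitted
    return privilege in ROLE_PERMISSIONS.get(role, set())

print(authorize("editor", "10.1.2.3", "US", "write"))   # True
```

The point of combining credentials this way is that a stolen login alone is insufficient: the request must also originate from an approved network and location.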

    An Implementation of Internet-connected Indoor Dust Bin Waste Level Monitoring System for Office Use

    The paper discusses an implementation of an internet-connected monitoring system to optimize waste collection in an office environment. The implementation consists of four parts: the physical dustbin module, the network, the middleware, and the user interface. A NodeMCU board with an embedded microcontroller senses and transmits waste-level data to the middleware over an 802.11 wireless network, while the middleware aggregates data from all registered dustbins within the office area. The aggregated data is then stored in a database server for further analysis. A web-based user interface is used to monitor the dustbins' waste levels as well as historical data reports. This information is vital for waste-management planning and prediction. Additionally, a Technology Acceptance Model survey showed that the monitoring system is perceived as easy to use and useful for office management in monitoring and managing their waste.
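The middleware's aggregation step might look like the following minimal sketch: keep the latest reading per registered bin and flag bins due for collection. The identifiers and the 80% collection threshold are illustrative assumptions, not the paper's implementation.

```python
from datetime import datetime, timezone

readings = {}  # dustbin_id -> (level_percent, timestamp)

def record_reading(dustbin_id, level_percent):
    """Store the latest waste level reported by a NodeMCU node."""
    readings[dustbin_id] = (level_percent, datetime.now(timezone.utc))

def bins_needing_collection(threshold=80):
    """Return IDs of bins at or above the collection threshold."""
    return sorted(bid for bid, (level, _) in readings.items()
                  if level >= threshold)

record_reading("bin-01", 35)
record_reading("bin-02", 85)
record_reading("bin-02", 90)          # newer reading replaces the old one
print(bins_needing_collection())      # ['bin-02']
```

In the real system the readings would be persisted to the database server for the historical reports the abstract mentions; the in-memory dictionary here stands in for that store.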

    Atomistic modelling of scattering data in the Collaborative Computational Project for Small Angle Scattering (CCP-SAS)

    The capabilities of current computer simulations provide a unique opportunity to model small-angle scattering (SAS) data at the atomistic level, and to include other structural constraints ranging from molecular and atomistic energetics to crystallography, electron microscopy and NMR. This extends the capabilities of solution scattering and provides deeper insights into the physics and chemistry of the systems studied. Realizing this potential, however, requires integrating the experimental data with a new generation of modelling software. To achieve this, the CCP-SAS collaboration (http://www.ccpsas.org/) is developing open-source, high-throughput and user-friendly software for the atomistic and coarse-grained molecular modelling of scattering data. Robust state-of-the-art molecular simulation engines and molecular dynamics and Monte Carlo force fields provide constraints to the solution structure inferred from the small-angle scattering data, which incorporates the known physical chemistry of the system. The implementation of this software suite involves a tiered approach in which GenApp provides the deployment infrastructure for running applications on both standard and high-performance computing hardware, and SASSIE provides a workflow framework into which modules can be plugged to prepare structures, carry out simulations, calculate theoretical scattering data and compare results with experimental data. GenApp produces the accessible web-based front end termed SASSIE-web, and GenApp and SASSIE also make community SAS codes available. Applications are illustrated by case studies: (i) inter-domain flexibility in two- to six-domain proteins as exemplified by HIV-1 Gag, MASP and ubiquitin; (ii) the hinge conformation in human IgG2 and IgA1 antibodies; (iii) the complex formed between a hexameric protein Hfq and mRNA; and (iv) synthetic 'bottlebrush' polymers
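A core step in workflows of this kind is scoring a model's theoretical scattering curve against the experimental data, conventionally with a reduced chi-squared. The toy example below (not CCP-SAS code; the curves are synthetic) shows that calculation.

```python
import numpy as np

def reduced_chi_squared(i_exp, i_calc, sigma, n_params=1):
    """Reduced chi-squared between experimental and calculated I(q)."""
    resid = (i_exp - i_calc) / sigma
    dof = len(i_exp) - n_params          # degrees of freedom
    return float(np.sum(resid ** 2) / dof)

q = np.linspace(0.01, 0.3, 50)           # momentum transfer, 1/angstrom
i_exp = np.exp(-(q * 20) ** 2 / 3)       # toy Guinier-like intensity
sigma = np.full_like(q, 0.01)            # assumed uniform errors
i_good = i_exp.copy()                    # model that fits perfectly
i_bad = np.exp(-(q * 30) ** 2 / 3)       # model with the wrong size

print(reduced_chi_squared(i_exp, i_good, sigma))
print(reduced_chi_squared(i_exp, i_bad, sigma))
```

A value near 1 indicates the model agrees with the data to within the experimental errors; values much larger than 1 reject the model, which is how candidate conformations from the simulation engines are filtered.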

    Semantic Gateway as a Service architecture for IoT Interoperability

    The Internet of Things (IoT) is set to occupy a substantial component of the future Internet. The IoT connects sensors and devices that record physical observations to applications and services of the Internet. As a successor to technologies such as RFID and Wireless Sensor Networks (WSN), the IoT has stumbled into vertical silos of proprietary systems that provide little or no interoperability with similar systems. Since the IoT represents the future state of the Internet, an intelligent and scalable architecture is required to connect these silos, enabling discovery of physical sensors and interpretation of messages between things. This paper proposes a gateway- and Semantic Web-enabled IoT architecture to provide interoperability between systems using established communication and data standards. The Semantic Gateway as Service (SGS) allows translation between messaging protocols such as XMPP, CoAP, and MQTT via a multi-protocol proxy architecture. Semantic annotation of sensor data with broadly accepted specifications such as the W3C Semantic Sensor Network (SSN) ontology provides semantic interoperability between messages and supports semantic reasoning to obtain higher-level actionable knowledge from low-level sensor data.
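The multi-protocol proxy idea can be sketched by normalizing each protocol's addressing into one internal form and re-emitting it in the target protocol. The internal field names below are assumptions for illustration, not the SGS specification.

```python
# Normalize an MQTT topic or a CoAP URI path into a common internal
# message form, then translate back out to the target protocol.

def from_mqtt(topic, payload):
    return {"resource": topic.replace("/", "."), "data": payload}

def from_coap(path, payload):
    return {"resource": path.strip("/").replace("/", "."), "data": payload}

def to_mqtt(message):
    return message["resource"].replace(".", "/"), message["data"]

def to_coap(message):
    return "/" + message["resource"].replace(".", "/"), message["data"]

# An MQTT sensor reading forwarded to a CoAP consumer:
msg = from_mqtt("building1/floor2/temp", "22.5")
print(to_coap(msg))   # ('/building1/floor2/temp', '22.5')
```

A real SGS gateway would also carry SSN ontology annotations in the internal form so that the payload, not just the address, is interpretable across systems; this sketch covers only the addressing translation.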

    Two Case Studies of Subsystem Design for General-Purpose CSCW Software Architectures

    This paper discusses subsystem design guidelines for the software architecture of general-purpose computer-supported cooperative work (CSCW) systems, i.e., systems designed to be applicable in various application areas requiring explicit collaboration support. In our opinion, guidelines for subsystem-level design are rarely given; most current guidelines apply to the programming-language level. We extract guidelines from a case study of the redesign and extension of an advanced commercial workflow management system and place them in the context of existing software engineering research. The guidelines are then validated against the design decisions made in the construction of a widely used web-based groupware system. Our approach is based on the well-known distinction between essential (logical) and physical architectures. We show how essential architecture design can be based on a direct mapping of the abstract functional concepts found in general-purpose systems to modules in the essential architecture. The essential architecture is then mapped to a physical architecture by applying software clustering and replication to achieve the required distribution and performance characteristics.

    High-Performance Cloud Computing: A View of Scientific Applications

    Scientific computing often requires a massive number of computers for performing large-scale experiments. Traditionally, these needs have been addressed with high-performance computing solutions and installed facilities such as clusters and supercomputers, which are difficult to set up, maintain, and operate. Cloud computing provides scientists with a completely new model for utilizing computing infrastructure: compute resources, storage resources, and applications can be dynamically provisioned (and integrated within the existing infrastructure) on a pay-per-use basis, then released when they are no longer needed. Such services are often offered within the context of a Service Level Agreement (SLA), which ensures the desired Quality of Service (QoS). Aneka, an enterprise Cloud computing solution, harnesses the power of compute resources in private and public Clouds and delivers the desired QoS to users. Its flexible, service-based infrastructure supports multiple programming paradigms, allowing Aneka to address a variety of scenarios, from finance applications to computational science. As examples of scientific computing in the Cloud, we present a preliminary case study on using Aneka for the classification of gene expression data and the execution of an fMRI brain imaging workflow.
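The pay-per-use provision-and-release pattern described above can be sketched as follows. This illustrates the billing model only; it is not the Aneka API, and the per-node rate is an assumed figure.

```python
class CloudPool:
    """Toy model of pay-per-use provisioning: charges accrue only
    while nodes are held, and stop once they are released."""

    RATE_PER_NODE_HOUR = 0.10   # assumed price, for illustration

    def __init__(self):
        self.active_nodes = 0
        self.cost = 0.0

    def provision(self, nodes):
        self.active_nodes += nodes

    def run_hours(self, hours):
        self.cost += self.active_nodes * hours * self.RATE_PER_NODE_HOUR

    def release(self, nodes):
        self.active_nodes = max(0, self.active_nodes - nodes)

pool = CloudPool()
pool.provision(20)       # scale out for a large experiment
pool.run_hours(3)        # 20 nodes * 3 h * $0.10 = $6.00
pool.release(20)         # release: no further charges accrue
pool.run_hours(5)        # idle hours cost nothing
print(round(pool.cost, 2))   # 6.0
```

The contrast with an installed cluster is exactly the last two lines: idle capacity that has been released costs nothing, whereas a purchased facility incurs its costs whether used or not.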

    A maturity model for implementation and application of Enterprise Resource Planning systems and ERP utilization to Industry 4.0

    This study analyzes the evolution of ERP systems, current trends in their implementation and application, and organizations' readiness for further digitalization. The study identifies many critical factors that affect the process of implementing and applying ERP systems, and various studies and reports confirm that organizations struggle with this process. Researchers have identified and proposed different stages of ERP implementation and application. Moreover, three existing maturity models that aim to measure the ERP maturity level of organizations are identified, but they lack a complete, defined process that would let organizations check the maturity level of their ERP systems themselves. Meanwhile, Industry 4.0, as a new technological concept, aims to support organizations in fully digitalizing and automating their processes and functions, particularly in the manufacturing industry. Based on this study, a new ERP Maturity Model (ERPMM) is developed to measure the maturity of ERP system implementation and application in organizations, and a quantitative methodology was applied to check the reliability and validity of the model. The proposed ERPMM will help organizations generate a clear picture of their status regarding ERP implementation and application; in this way, they can evaluate the benefits of an ERP system and whether they should change the way they apply it. The study also analyzes the impact of different factors on successful ERP system implementation and application.
In addition, this study investigates whether strategic use of IT positively affects ERP selection, implementation, and application; whether appropriate ERP selection has a positive effect on implementation and application; the role of ERP implementation in the application; and the impact of the ERP application on business performance. The study shows that all stages of ERP implementation and application are related, starting from the organization's IT strategy through to the ERP application, and that the ERP application has a positive impact on business performance. The study also presents an analysis, based on secondary data, of the integration of ERP and Industry 4.0. Industry 4.0 is seen as the phase in which computers and automation become connected, and as an opportunity to increase efficiency and effectiveness in the manufacturing industry through real-time data and information, integrating physical machinery and devices with networked sensors and software to predict, control, and reduce costs over the long term. Primary data was used to analyze whether the ERP application can be used to predict organizations' readiness for Industry 4.0; the statistical analysis shows that it can, in part. The findings also reveal many challenges in integrating Industry 4.0 with current ERP systems, especially machine-to-machine and machine-to-ERP communication and data security. Based on the proposed ERP Maturity Model (ERPMM), a prototype is developed that lets organizations evaluate the status of their ERP system implementation and application themselves. The prototype is a web-based application developed in PHP with a MySQL database.
The main contributions of this study are: identifying and presenting the current status of ERP system implementation and application and of ERP maturity models; analyzing the role of strategic use of IT in ERP selection, implementation, and application; analyzing the impact of ERP selection on implementation and application, and the effect of ERP implementation on the application; identifying the effect of the ERP application on business performance, and the ability of organizations to evaluate their readiness for further digitalization based on the ERP application; developing a new ERP Maturity Model (ERPMM) to support organizations in evaluating ERP implementation and application; and developing a prototype that applies ERPMM to support organizations in assessing their ERP maturity level.
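A self-assessment prototype of this kind typically maps questionnaire answers to a maturity level. The sketch below is a hypothetical illustration: the dimensions, Likert scale, and level cut-offs are assumptions, not the actual ERPMM.

```python
# Level cut-offs on the overall 1-5 score (assumed, illustrative).
LEVELS = [(4.5, "optimized"), (3.5, "managed"),
          (2.5, "defined"), (1.5, "initial"), (0.0, "ad hoc")]

def maturity(responses):
    """responses: dict of dimension -> list of 1-5 Likert answers.
    Returns the overall mean score and the matching maturity level."""
    dim_scores = {d: sum(v) / len(v) for d, v in responses.items()}
    overall = sum(dim_scores.values()) / len(dim_scores)
    level = next(name for cutoff, name in LEVELS if overall >= cutoff)
    return overall, level

answers = {
    "selection":      [4, 5, 4],
    "implementation": [3, 4, 4],
    "application":    [4, 4, 5],
}
print(maturity(answers))
```

A web front end like the PHP/MySQL prototype described above would collect the answers via a questionnaire and present the resulting level, letting an organization run the assessment without outside consultants.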