19 research outputs found

    An adaptive trust based service quality monitoring mechanism for cloud computing

    Cloud computing is the newest distributed computing paradigm, delivering computing resources over the Internet as services. Its attractiveness has flooded the market with service providers, so customers must identify the one that meets their requirements in terms of service quality. Existing monitoring of service quality in cloud computing has been limited to quantification, while the continuous improvement and distribution of service quality scores have been implemented in other distributed computing paradigms but not specifically for clouds. This research investigates existing methods and proposes mechanisms for quantifying and ranking the service quality of service providers. The solution proposed in this thesis consists of three mechanisms: a service quality modeling mechanism, an adaptive trust computing mechanism, and a trust distribution mechanism for cloud computing. The Design Research Methodology (DRM) was modified by adding phases, means and methods, and probable outcomes, and this modified DRM is used throughout the study. The mechanisms were developed and tested incrementally until the expected outcome was achieved, and a comprehensive set of experiments was carried out in a simulated environment to validate their effectiveness. The evaluation compared their performance against the combined trust model and the QoS trust model for cloud computing, along with an adapted fuzzy theory based trust computing mechanism and a super-agent based trust distribution mechanism developed for other distributed systems. The results show that the proposed mechanisms reach their final trust scores faster and more stably than the existing solutions on all three parameters tested. These results are significant for making cloud computing acceptable to users by letting them verify the performance of service providers before making a selection.
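
    The abstract does not spell out the update formulas, so the following is only a minimal sketch of the kind of adaptive trust computation and provider ranking it describes, assuming a simple exponentially weighted blend of per-provider QoS observations; the function names, the 0-1 rating scale, and the smoothing parameter are hypothetical and not the thesis's actual mechanism.

    ```python
    # Minimal illustrative sketch (not the thesis's mechanism): recent QoS
    # observations are blended into a running per-provider trust score, and
    # providers are ranked by that score.

    def update_trust(current_trust: float, qos_rating: float, alpha: float = 0.3) -> float:
        """Blend a new QoS observation (0..1) into the running trust score."""
        return (1 - alpha) * current_trust + alpha * qos_rating

    def rank_providers(trust_scores: dict[str, float]) -> list[str]:
        """Rank service providers by current trust score, highest first."""
        return sorted(trust_scores, key=trust_scores.get, reverse=True)

    # Example: three hypothetical providers, one new observation each.
    trust = {"provider_a": 0.70, "provider_b": 0.55, "provider_c": 0.80}
    observations = {"provider_a": 0.90, "provider_b": 0.40, "provider_c": 0.75}
    trust = {p: update_trust(trust[p], observations[p]) for p in trust}
    print(rank_providers(trust))
    ```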

    A consumer perspective e-commerce websites evaluation model

    Existing website evaluation methods have weaknesses such as neglecting consumer criteria, being unable to deal with qualitative criteria, and involving complex weight and score calculations. This research aims to develop a hybrid consumer-oriented e-commerce website evaluation model based on the Fuzzy Analytical Hierarchy Process (FAHP) and the Hadamard Method (HM). Four phases were involved in developing the model: requirements identification, empirical study, model construction, and model confirmation. The requirements identification and empirical study identified critical web-design criteria and gathered online consumers' preferences. Data collected from 152 Malaysian consumers through online questionnaires were used to identify critical e-commerce website features and their scale of importance. The new evaluation model comprises three components: first, the consumer evaluation criteria, consisting of the principles consumers consider important; second, the evaluation mechanisms, which integrate FAHP and HM through mathematical expressions that handle subjective judgments and new formulas to calculate the weight and score for each criterion; and third, the evaluation procedures, covering goal establishment, document preparation, and identification of website performance. The model was examined by six experts and applied to four case studies. The results show that the new model is practical, appropriate for evaluating e-commerce websites from the consumers' perspective, and able to calculate weights and scores for qualitative criteria in a simple way. In addition, it can assist decision-makers in making decisions in a measured, objective way. The model also contributes new knowledge to the software evaluation field.
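
    The thesis contains the actual FAHP/HM formulas; the sketch below only illustrates the general shape of such a calculation, assuming triangular fuzzy pairwise judgments aggregated by a geometric mean of their middle values and a weighted-sum site score. The criteria, judgments, and consumer scores are invented for illustration.

    ```python
    # Illustrative sketch only: derive criterion weights from a triangular fuzzy
    # pairwise comparison matrix and score a website as a weighted sum. The
    # geometric-mean aggregation is a common FAHP simplification, assumed here,
    # not the thesis's exact procedure.
    import math

    # Pairwise comparisons as triangular fuzzy numbers (low, mid, high).
    # Hypothetical criteria: design, security, usability.
    fuzzy_matrix = [
        [(1, 1, 1),         (2, 3, 4),       (1, 2, 3)],
        [(1/4, 1/3, 1/2),   (1, 1, 1),       (1, 2, 3)],
        [(1/3, 1/2, 1),     (1/3, 1/2, 1),   (1, 1, 1)],
    ]

    # Geometric mean of the middle (most likely) values per row, then normalize.
    row_means = [math.prod(m for _, m, _ in row) ** (1 / len(row)) for row in fuzzy_matrix]
    weights = [r / sum(row_means) for r in row_means]

    # Consumer ratings for one website on each criterion (1-5 scale, hypothetical).
    scores = [4, 3, 5]
    overall = sum(w * s for w, s in zip(weights, scores))
    print([round(w, 3) for w in weights], round(overall, 2))
    ```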

    Enhanced artificial bee colony-least squares support vector machines algorithm for time series prediction

    Over the past decades, Least Squares Support Vector Machines (LSSVM) have been widely used for prediction tasks in various application domains. Nevertheless, the existing literature shows that the capability of LSSVM depends strongly on the values of its hyper-parameters, namely the regularization parameter and the kernel parameter, which greatly affect its generalization in prediction tasks. This study proposes a hybrid approach based on the Artificial Bee Colony (ABC) and LSSVM that consists of three algorithms: ABC-LSSVM, lvABC-LSSVM, and cmABC-LSSVM. The lvABC algorithm addresses the local optima problem by enriching the search behaviour with Levy mutation, while the cmABC algorithm, which incorporates conventional mutation, addresses the overfitting and underfitting problems. The combination of the lvABC and cmABC algorithms, introduced as the Enhanced Artificial Bee Colony–Least Squares Support Vector Machine (eABC-LSSVM), is applied to the prediction of non-renewable natural resource commodity prices. Upon completion of data collection and preprocessing, the eABC-LSSVM algorithm was designed and developed. Its predictive ability is measured with five statistical metrics: Mean Absolute Percentage Error (MAPE), prediction accuracy, symmetric MAPE (sMAPE), Root Mean Square Percentage Error (RMSPE), and Theil's U. The results show that eABC-LSSVM has a lower prediction error rate than eight hybridizations of LSSVM with Evolutionary Computation (EC) algorithms. In addition, the proposed algorithm is compared with single prediction techniques, namely Support Vector Machines (SVM) and Back Propagation Neural Networks (BPNN). In general, eABC-LSSVM produced more than 90% prediction accuracy, indicating that it is capable of solving the optimization problem embedded in the prediction task. The eABC-LSSVM is expected to be useful to investors and commodity traders in planning their investments and projecting their profits.
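
    As a rough illustration of what the colony is optimizing, the sketch below trains a standard RBF-kernel LSSVM by solving its linear system and scores candidate (regularization, kernel-width) pairs by validation MAPE. The toy series and the exhaustive candidate loop are stand-ins for the eABC search, not the thesis's algorithm.

    ```python
    # Illustrative sketch only: an RBF-kernel LSSVM and the MAPE fitness a
    # bee-colony-style search would use to score hyper-parameter candidates.
    import numpy as np

    def rbf_kernel(A, B, sigma):
        """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))

    def lssvm_fit(X, y, gamma, sigma):
        """Train an LSSVM by solving its (n+1) x (n+1) linear system."""
        n = len(y)
        K = rbf_kernel(X, X, sigma)
        A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                      [np.ones((n, 1)), K + np.eye(n) / gamma]])
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        return sol[0], sol[1:]                     # bias b, dual weights alpha

    def lssvm_predict(X_train, alpha, b, X_new, sigma):
        return rbf_kernel(X_new, X_train, sigma) @ alpha + b

    def mape(y_true, y_pred):
        """Mean Absolute Percentage Error, one of the five metrics in the abstract."""
        return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)

    # Toy price-like series (hypothetical): predict x_t from x_{t-1}.
    rng = np.random.default_rng(0)
    series = 50 + np.cumsum(rng.normal(0, 1, 60))
    X, y = series[:-1, None], series[1:]
    X_tr, y_tr, X_te, y_te = X[:40], y[:40], X[40:], y[40:]

    def fitness(gamma, sigma):
        """Validation MAPE for one candidate, as a colony member would score it."""
        b, alpha = lssvm_fit(X_tr, y_tr, gamma, sigma)
        return mape(y_te, lssvm_predict(X_tr, alpha, b, X_te, sigma))

    # Crude stand-in for the bee colony: exhaustively score a few candidates.
    candidates = [(g, s) for g in (1.0, 10.0, 100.0) for s in (0.5, 2.0, 5.0)]
    best = min(candidates, key=lambda c: fitness(*c))
    print("best (gamma, sigma):", best, "validation MAPE:", round(fitness(*best), 2))
    ```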

    A Protected Single Sign-On Technique Using 2D Password in Distributed Computer Networks

    Single Sign-On (SSO) is an authentication mechanism that enables a legal user with a single credential to be authenticated by multiple service providers in a distributed computer network. Recently, a new SSO scheme with a well-organized security argument was shown to fail to provide credential privacy and soundness of authentication. The main goal of this project is to provide an SSO scheme that meets at least three basic security requirements: unforgeability, credential privacy, and soundness. User identification is an important access control mechanism for client–server networking architectures, and the Single Sign-On concept allows legal users to use a unitary token to access different service providers in distributed computer networks. To overcome drawbacks such as the loss of user anonymity under possible attacks and the extensive overhead of time-synchronized mechanisms, we propose a secure Single Sign-On mechanism that is efficient, secure, and suitable for mobile devices in distributed computer networks. In a real-life application, a mobile user can use a mobile device such as a cell phone with the unitary token to access multiple services, for example downloading music or receiving and replying to electronic mail. Our scheme is based on one-way hash functions and random nonces to address the weaknesses described above and to reduce system overhead. The proposed scheme combines two types of passwords, a text password and a graphical password, referred to as a 2D password, yielding a more secure and efficient system that consumes less energy in distributed computer networks. The proposed system has less communication overhead, eliminates the need for time synchronization, and removes the need to hold multiple passwords for different services.
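
    The paper's exact protocol is not reproduced here; the sketch below only illustrates the two primitives the abstract relies on, a one-way hash and a random nonce, in a minimal challenge-response exchange. The token derivation from the text and graphical passwords and the message format are assumptions made for illustration.

    ```python
    # Illustrative sketch only (not the paper's 2D-password protocol): a
    # challenge-response check built from a one-way hash and a random nonce.
    import hashlib
    import secrets

    def h(*parts: bytes) -> bytes:
        """One-way hash over the concatenated message parts."""
        return hashlib.sha256(b"|".join(parts)).digest()

    # The unitary token is assumed to be derived from both password factors
    # (text password plus the user's graphical selection encoded as bytes).
    text_password = b"correct horse"
    graphical_password = b"cell(2,3)->cell(4,1)->cell(0,0)"
    token = h(text_password, graphical_password)

    # The service provider issues a fresh random nonce instead of a timestamp,
    # so no clock synchronization between the parties is needed.
    nonce = secrets.token_bytes(16)

    # The user proves knowledge of the token without transmitting it.
    proof = h(token, nonce)

    # The provider, holding the same registered token, recomputes and compares.
    assert secrets.compare_digest(proof, h(token, nonce))
    print("authenticated without sending the token or synchronizing clocks")
    ```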

    Intention to use new mobile payment systems: a comparative analysis of SMS and NFC payments

    The rapid growth of mobile technology among the world's population has led many companies to exploit mobile devices as an additional sales tool. The main objective of our study is to compare the factors that determine consumer acceptance of the SMS (Short Message Service) and NFC (Near Field Communication) mobile payment systems as examples of future means of payment. The model used in our research applies the classic variables of the Technology Acceptance Model, together with Perceived Security, and derives from a review of the major relevant recent literature. The results of this study demonstrate that the two systems differ both in the factors that determine their acceptance and in the level of Intention to Use. Finally, we highlight the main implications for management and cite some strategies to reinforce this new business in the context of new technical developments.

    Trust management in cloud computing: A critical review

    Cloud computing has been attracting the attention of researchers in both academia and industry, as it offers organizations a range of computing services and many opportunities. For cloud computing to become widely adopted by enterprises and individuals, several issues have to be solved. A key issue that needs special attention is the security of clouds, and trust management is an important component of cloud security. In this paper, the authors examine what trust is and how trust has been applied in distributed computing, and then summarize the trust models proposed for various distributed systems. The trust management systems proposed for cloud computing are investigated with special emphasis on their capability, their applicability in practical heterogeneous cloud environments, and their implementability. Finally, the proposed models and systems are compared with each other in a table, based on a selected set of cloud computing parameters.

    Security Engineering of Patient-Centered Health Care Information Systems in Peer-to-Peer Environments: Systematic Review

    Background: Patient-centered health care information systems (PHSs) enable patients to take control and become knowledgeable about their own health, preferably in a secure environment. Current and emerging PHSs use either a centralized database, peer-to-peer (P2P) technology, or distributed ledger technology for PHS deployment. The evolving COVID-19 decentralized Bluetooth-based tracing systems are examples of disease-centric P2P PHSs. Although using P2P technology for the provision of PHSs can be flexible, scalable, resilient to a single point of failure, and inexpensive for patients, the use of health information on P2P networks poses major security issues as users must manage information security largely by themselves. Objective: This study aims to identify the inherent security issues for PHS deployment in P2P networks and how they can be overcome. In addition, this study reviews different P2P architectures and proposes a suitable architecture for P2P PHS deployment. Methods: A systematic literature review was conducted following PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) reporting guidelines. Thematic analysis was used for data analysis. We searched the following databases: IEEE Digital Library, PubMed, Science Direct, ACM Digital Library, Scopus, and Semantic Scholar. The search was conducted on articles published between 2008 and 2020. The Common Vulnerability Scoring System was used as a guide for rating security issues. Results: Our findings are consolidated into 8 key security issues associated with PHS implementation and deployment on P2P networks and 7 factors promoting them. Moreover, we propose a suitable architecture for P2P PHSs and guidelines for the provision of PHSs while maintaining information security. Conclusions: Despite the clear advantages of P2P PHSs, the absence of centralized controls and inconsistent views of the network on some P2P systems have profound adverse impacts in terms of security. The security issues identified in this study need to be addressed to increase patients' intention to use PHSs on P2P networks by making them safe to use.

    South African Generation Y students’ behavioral intentions to use university websites

    University websites are increasingly crucial in meeting the evolving digital demands of students. To manage university websites effectively, it is necessary first to determine students' behavioral intentions to use them and the factors that influence those intentions, which forms the purpose of this study. Data were collected at a single point in time and describe the characteristics of the sample. This study, involving 319 Generation Y students registered at two South African university campuses (one traditional university and one university of technology), uses structural equation modeling to explore the predictive relationships among information quality, system quality, playfulness, ease of use, trust, attitude, satisfaction, and behavioral intentions related to university website use. The study underscores the pivotal role of the university's website in shaping student satisfaction, with information quality standing out as a significant positive influence. Playfulness also significantly affects both satisfaction and overall attitudes toward university websites. The system quality of the university website is likewise noteworthy, showing a statistically significant positive effect on ease of use and fostering trust among students. Furthermore, satisfaction is predicted by ease of use, creating a cascade in which satisfaction predicts trust and trust predicts attitudes. Ultimately, students' attitudes emerge as a critical predictor of their behavioral intentions to use university websites. The model exhibits acceptable fit indices and substantial explanatory power (SRMR = 0.1, RMSEA = 0.06, IFI = 0.94, TLI = 0.93, CFI = 0.94). These findings offer insights for university management and web designers seeking to enhance online platforms and foster student satisfaction, trust, and usage.

    Research on performance enhancement for electromagnetic analysis and power analysis in cryptographic LSI

    Degree system: new; report number: Kou 3785; degree type: Doctor of Engineering; date conferred: 2012/11/19; Waseda University diploma number: Shin 6161. Waseda University.

    A Novel Methodology for Calculating Large Numbers of Symmetrical Matrices on a Graphics Processing Unit: Towards Efficient, Real-Time Hyperspectral Image Processing

    Hyperspectral imagery (HSI) is often processed to identify targets of interest. Many of the quantitative analysis techniques developed for this purpose mathematically manipulate the data to derive information about the target of interest based on local spectral covariance matrices. Calculating a local spectral covariance matrix for every pixel in a hyperspectral data scene is so computationally intensive that real-time processing with these algorithms is not feasible with today's general purpose processing solutions, and specialized solutions are cost prohibitive, inflexible, inaccessible, or not feasible for on-board applications. Advances in graphics processing unit (GPU) capabilities and programmability offer an opportunity for general purpose computing with access to hundreds of processing cores in a system that is affordable and accessible, and the GPU offers a flexibility, accessibility, and feasibility that other specialized solutions do not. The architecture of the NVIDIA GPU used in this research is significantly different from the architecture of other parallel computing solutions, and with such a substantial change in architecture the paradigm for programming graphics hardware differs significantly from traditional serial and parallel software development paradigms. In this research, a methodology is developed for mapping an HSI target detection algorithm to the NVIDIA GPU hardware and the Compute Unified Device Architecture (CUDA) Application Programming Interface (API). The RX algorithm is chosen as a representative stochastic HSI algorithm that requires the calculation of a spectral covariance matrix, and the developed methodology calculates a local covariance matrix for every pixel in the input HSI data scene. A characterization of the limitations imposed by the chosen GPU is given, and a path forward toward optimization of a GPU-based method for real-time HSI data processing is defined.
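
    As a point of reference for the per-pixel workload the GPU method parallelizes, the sketch below computes a local spectral covariance matrix around each pixel and the resulting RX (Mahalanobis) score in plain NumPy. The window size, the regularization term, and the toy data cube are assumptions, and the thesis maps this computation to CUDA rather than NumPy.

    ```python
    # Illustrative CPU reference sketch of the per-pixel RX computation; the
    # actual work described in the abstract is mapped to the GPU via CUDA.
    import numpy as np

    def rx_local(cube: np.ndarray, half_win: int = 5, eps: float = 1e-3) -> np.ndarray:
        """cube: (rows, cols, bands) hyperspectral scene; returns an RX score per pixel."""
        rows, cols, bands = cube.shape
        scores = np.zeros((rows, cols))
        for r in range(rows):
            for c in range(cols):
                r0, r1 = max(0, r - half_win), min(rows, r + half_win + 1)
                c0, c1 = max(0, c - half_win), min(cols, c + half_win + 1)
                window = cube[r0:r1, c0:c1].reshape(-1, bands)
                mu = window.mean(axis=0)
                cov = np.cov(window, rowvar=False) + eps * np.eye(bands)  # regularized local covariance
                d = cube[r, c] - mu
                scores[r, c] = d @ np.linalg.solve(cov, d)                # Mahalanobis distance
        return scores

    # Toy 32x32 scene with 8 bands and one implanted anomalous pixel.
    rng = np.random.default_rng(1)
    scene = rng.normal(0, 1, (32, 32, 8))
    scene[16, 16] += 6.0
    print("most anomalous pixel:", np.unravel_index(rx_local(scene).argmax(), (32, 32)))
    ```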