
    Risk Assessment of Nautical Navigational Environment Based on Grey Fixed Weight Cluster

    To build a mathematical model suited to nautical navigational environment risk evaluation and to characterize the navigational environment risk of the Qiongzhou Strait quantitatively and systematically, a risk assessment model and its application steps are set up based on grey fixed weight clustering (GFWC). The evaluation index system is structured through both literature review and expert investigation. The relative weight of each index is obtained via the fuzzy analytic hierarchy process (FAHP), and the membership degree of each index in every grey class is obtained by fuzzy statistics (FS), avoiding the difficulty of constructing whitenization weight functions. Using the model, the nautical navigational environment risk of the Qiongzhou Strait is rated "moderate" according to the principle of maximum membership degree. This comprehensive risk evaluation of the Qiongzhou Strait navigational environment can provide a theoretical reference for implementing targeted risk control measures. The results show that the constructed GFWC risk assessment model and the presented steps are workable when information is incomplete. The proposed strategy mines the collected expert knowledge mathematically, quantifies the weight of each index and the risk level, and yields a comprehensive risk evaluation result. Moreover, the adoption of probability and statistics theory and fuzzy theory to resolve the bottlenecks caused by uncertainty gives the model better adaptability and executability.
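
    The grey fixed weight clustering computation itself is compact: each grey class receives a cluster coefficient equal to the weighted sum of the index membership degrees in that class, and the class with the largest coefficient wins. A minimal sketch in Python, with illustrative weights and membership degrees rather than the values derived for the Qiongzhou Strait:

        # A minimal sketch of grey fixed weight clustering (GFWC).
        # The index weights and membership degrees below are illustrative
        # placeholders, not the paper's values.
        import numpy as np

        def gfwc_classify(weights, membership):
            """weights: (n_indices,) FAHP-derived index weights summing to 1.
            membership: (n_indices, n_classes) membership degree of each index
            in each grey class, obtained here by fuzzy statistics rather than
            whitenization weight functions.
            Returns the cluster coefficient per grey class and the chosen class."""
            weights = np.asarray(weights, dtype=float)
            membership = np.asarray(membership, dtype=float)
            sigma = weights @ membership          # sigma_k = sum_i w_i * m_ik
            return sigma, int(np.argmax(sigma))   # maximum membership principle

        # Hypothetical example: 4 indices, 3 grey classes (low/moderate/high).
        w = [0.35, 0.25, 0.25, 0.15]
        m = [[0.2, 0.6, 0.2],
             [0.1, 0.7, 0.2],
             [0.4, 0.4, 0.2],
             [0.3, 0.5, 0.2]]
        sigma, k = gfwc_classify(w, m)
        print(sigma, ["low", "moderate", "high"][k])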

    Enhanced artificial bee colony-least squares support vector machines algorithm for time series prediction

    Over the past decades, Least Squares Support Vector Machines (LSSVM) have been widely utilized in prediction tasks across various application domains. Nevertheless, the existing literature shows that the capability of LSSVM depends strongly on the values of its hyper-parameters, namely the regularization parameter and the kernel parameter, which greatly affect the generalization of LSSVM in prediction tasks. This study proposes a hybrid approach, based on the Artificial Bee Colony (ABC) algorithm and LSSVM, that consists of three algorithms: ABC-LSSVM, lvABC-LSSVM and cmABC-LSSVM. The lvABC algorithm addresses the local optima problem by enriching the search behaviour with Levy mutation, while the cmABC algorithm, which incorporates conventional mutation, addresses the overfitting and underfitting problems. The combination of the lvABC and cmABC algorithms, introduced as the Enhanced Artificial Bee Colony-Least Squares Support Vector Machine (eABC-LSSVM), is applied to the prediction of non-renewable natural resource commodity prices. After data collection and pre-processing, the eABC-LSSVM algorithm is designed and developed. Its predictive performance is measured with five statistical metrics: Mean Absolute Percentage Error (MAPE), prediction accuracy, symmetric MAPE (sMAPE), Root Mean Square Percentage Error (RMSPE) and Theil's U. Results show that eABC-LSSVM has a lower prediction error rate than eight hybrid models of LSSVM and Evolutionary Computation (EC) algorithms. In addition, the proposed algorithm is compared with single prediction techniques, namely Support Vector Machines (SVM) and Back Propagation Neural Network (BPNN). Overall, eABC-LSSVM produces more than 90% prediction accuracy, indicating that it is capable of solving the optimization problem embedded in the prediction task. The eABC-LSSVM is intended to help investors and commodity traders plan their investments and project their profits.
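
    For readers unfamiliar with the hybrid's structure, the sketch below pairs a standard LSSVM regression solve (RBF kernel) with a simplified employed-bee search over the two hyper-parameters. The Levy and conventional mutation operators of lvABC/cmABC and the onlooker/scout phases are omitted, and the data are synthetic, so this only illustrates the overall idea, not the paper's algorithm:

        # Simplified ABC-style tuning of LSSVM hyper-parameters (gamma, sigma2).
        import numpy as np

        def lssvm_fit(X, y, gamma, sigma2):
            # Solve the LSSVM regression linear system for bias b and alphas.
            n = len(y)
            K = np.exp(-((X[:, None] - X[None, :]) ** 2) / sigma2)
            A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                          [np.ones((n, 1)), K + np.eye(n) / gamma]])
            sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
            return sol[0], sol[1:]

        def lssvm_predict(X, Xtr, b, alpha, sigma2):
            K = np.exp(-((X[:, None] - Xtr[None, :]) ** 2) / sigma2)
            return K @ alpha + b

        def mape(y, yhat):
            return np.mean(np.abs((y - yhat) / y)) * 100

        rng = np.random.default_rng(0)
        X = np.linspace(0, 10, 60)
        y = np.sin(X) + 0.1 * rng.standard_normal(60)
        idx = rng.permutation(60)
        tr, te = idx[:40], idx[40:]

        def fitness(p):
            b, a = lssvm_fit(X[tr], y[tr], *p)
            return mape(y[te], lssvm_predict(X[te], X[tr], b, a, p[1]))

        # Employed-bee phase only: perturb one dimension toward/away from a
        # random neighbour and keep the trial if it lowers the MAPE.
        food = rng.uniform([1, 0.1], [100, 5], size=(10, 2))
        for _ in range(30):
            for i in range(len(food)):
                j, d = rng.integers(len(food)), rng.integers(2)
                trial = food[i].copy()
                trial[d] += rng.uniform(-1, 1) * (food[i][d] - food[j][d])
                trial = np.clip(trial, [1, 0.1], [100, 5])
                if fitness(trial) < fitness(food[i]):
                    food[i] = trial
        best = min(food, key=fitness)
        print("gamma=%.2f sigma2=%.2f MAPE=%.2f%%" % (best[0], best[1], fitness(best)))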

    NM-LEACH: A Novel Modified LEACH Protocol to Improve Performance in WSN

    Saving energy and improving the lifetime of wireless sensor networks (WSNs) has long been a key research challenge. Low-energy adaptive clustering hierarchy (LEACH), a classical protocol, was originally designed to reduce and balance the network's energy consumption. However, because the distances between the cluster head (CH) and the member nodes are not taken into consideration, it produces unevenly distributed clusters and uneven energy consumption across the network. Selecting CHs indiscriminately is an issue as well. Building on the original algorithm, a novel modified LEACH (NM-LEACH) is proposed that addresses these critical problems. The NM-LEACH protocol determines a reasonable number of CHs in each round and takes residual energy into account as a weighting factor when selecting CHs. The proposed protocol enhances performance by extending the WSN life cycle, balancing energy consumption across the network, and improving network efficiency.
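
    The abstract does not give NM-LEACH's exact threshold formula. A common way to weight LEACH's election by residual energy, shown here purely for illustration, is to scale the classic threshold by the ratio of a node's remaining energy to the network average:

        # Illustrative energy-weighted LEACH-style cluster-head election;
        # not the exact NM-LEACH weighting, which the abstract does not specify.
        import random

        P = 0.1  # desired cluster-head fraction per round

        def threshold(round_no, residual, avg_residual):
            base = P / (1 - P * (round_no % int(1 / P)))   # classic LEACH T(n)
            return base * (residual / avg_residual)         # energy weighting

        def elect_cluster_heads(energies, round_no, eligible):
            avg = sum(energies) / len(energies)
            heads = []
            for node, e in enumerate(energies):
                if node in eligible and random.random() < threshold(round_no, e, avg):
                    heads.append(node)
            return heads

        energies = [random.uniform(0.2, 1.0) for _ in range(100)]
        eligible = set(range(100))  # nodes that have not yet served as CH
        print(elect_cluster_heads(energies, 3, eligible))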

    An adaptive trust based service quality monitoring mechanism for cloud computing

    Cloud computing is the newest paradigm in distributed computing, delivering computing resources over the Internet as services. Because of the attractiveness of cloud computing, the market is currently flooded with service providers, so customers must identify the one that meets their service quality requirements. Existing service quality monitoring in cloud computing has been limited to quantification alone; the continuous improvement and distribution of service quality scores have been implemented in other distributed computing paradigms but not specifically for cloud computing. This research investigates the relevant methods and proposes mechanisms for quantifying and ranking the service quality of providers. The solution proposed in this thesis consists of three mechanisms: a service quality modeling mechanism, an adaptive trust computing mechanism, and a trust distribution mechanism for cloud computing. The Design Research Methodology (DRM) was modified by adding phases, means and methods, and probable outcomes, and this modified DRM is used throughout the study. The mechanisms were developed and tested incrementally until the expected outcome was achieved, and a comprehensive set of experiments was carried out in a simulated environment to validate their effectiveness. The evaluation compares their performance against the combined trust model and the QoS trust model for cloud computing, along with an adapted fuzzy-theory-based trust computing mechanism and a super-agent-based trust distribution mechanism developed for other distributed systems. The results show that the proposed mechanisms reach final trust scores faster and more stably than the existing solutions on all three parameters tested. These results are significant for making cloud computing acceptable to users by allowing them to verify the performance of service providers before making a selection.
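
    As a rough illustration of adaptive trust computing (the thesis's actual formulas are not given in the abstract), the sketch below folds each monitored QoS observation into a running trust score with an asymmetric learning rate, a common design choice that makes trust slow to build and quick to fall:

        # Generic adaptive trust update; a stand-in for the thesis's mechanism,
        # shown only to illustrate how monitored service quality can be folded
        # into a trust score over time.
        def update_trust(trust, observation, alpha=0.2, penalty=2.0):
            """trust and observation are normalized to [0, 1]. Observations
            below the current trust are weighted more heavily."""
            rate = alpha * penalty if observation < trust else alpha
            return (1 - rate) * trust + rate * observation

        trust = 0.5
        for qos in [0.9, 0.85, 0.95, 0.3, 0.9]:   # normalized QoS measurements
            trust = update_trust(trust, qos)
            print(round(trust, 3))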

    A Protected Single Sign-On Technique Using 2D Password in Distributed Computer Networks

    Single Sign-On (SSO) is an authentication mechanism that enables a legal user with a single credential to be authenticated by multiple service providers in a distributed computer network. Recently, a new SSO scheme with a well-organized security argument was shown to fail to meet credential privacy and soundness of authentication. The main goal of this project is to provide a Single Sign-On scheme that meets at least three basic security requirements: unforgeability, credential privacy, and soundness. User identification is an important access control mechanism for client-server networking architectures, and Single Sign-On allows legal users to employ a unitary token to access different service providers in distributed computer networks. To overcome drawbacks such as the loss of user anonymity under possible attacks and the extensive overhead of time-synchronized mechanisms, we propose a secure Single Sign-On mechanism that is efficient, secure, and suitable for mobile devices in distributed computer networks. In a real-life application, a mobile user can use a mobile device, e.g., a cell phone, with the unitary token to access multiple services, such as downloading music or sending and receiving e-mail. Our scheme is based on one-way hash functions and random nonces to remedy the weaknesses described above and to reduce system overhead. The proposed scheme is made more secure with two types of password, namely a text password and a graphical password, together referred to as a 2D password, yielding a more efficient system that consumes less energy. The proposed system has lower communication overhead, eliminates the need for time synchronization, and removes the need to hold multiple passwords for different services.
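
    The abstract names the building blocks, one-way hash functions and random nonces, without specifying the protocol. The sketch below is an illustrative challenge-response built on those primitives, with a hypothetical way of combining the text and graphical factors into a single credential; it is not the paper's actual message flow:

        # Illustrative hash-plus-nonce authentication; the real protocol,
        # token format, and 2D-password combination are not in the abstract.
        import hashlib, hmac, os

        def derive_credential(text_pw: str, graphical_pw: bytes) -> bytes:
            # Combine both password factors into one secret (assumed scheme).
            return hashlib.sha256(text_pw.encode() + graphical_pw).digest()

        # A fresh random nonce per login prevents replay and removes the
        # need for clock synchronization between client and provider.
        secret = derive_credential("s3cret", b"\x04\x01\x07\x02")  # e.g. grid clicks
        nonce = os.urandom(16)

        # Client proves knowledge of the credential without transmitting it.
        proof = hmac.new(secret, nonce + b"user@sp1", hashlib.sha256).digest()

        # Service provider recomputes and compares in constant time.
        expected = hmac.new(secret, nonce + b"user@sp1", hashlib.sha256).digest()
        print("authenticated:", hmac.compare_digest(proof, expected))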

    RISK MANAGEMENT DECISION MAKING IN SERVICE DESIGN

    This paper explores the use of risk management techniques to promote the design of resilient services. Success in achieving any benefit from a new service is directly affected by the resiliency of the supporting service architecture, including its technical and non-technical domains. The concept of resiliency in services and enterprises is examined. We present a framework for analyzing risks and threats to service resiliency, and offer specific guidance to support the development of resilient services and service architectures. The risk assessment framework was created by combining a model of service provider gaps, which represent dimensions of service quality, with a risk analysis model; it includes the identification of threats and inhibitors to closing the provider gaps. We maintain that risk in services will remain as long as provider gaps are not closed. Service-based business models and economies will succeed only if service resiliency is treated as a strategic imperative. Effective service design techniques should therefore be adopted that include identifying and mapping the provider gaps and creating appropriate mitigation strategies. This is accomplished by applying service blueprinting techniques and then analyzing the visible risks. The model we present facilitates the identification of weaknesses or vulnerabilities in services, as well as their impact.
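
    As a toy illustration of the framework's core idea, the sketch below scores residual risk per provider gap as likelihood times impact, with risk remaining wherever a gap stays open. The gap names follow the classic service quality gaps model, and all numbers are invented:

        # Illustrative gap-based residual risk scoring; scales and entries
        # are invented, not taken from the paper's framework.
        gaps = {
            "knowledge gap":     {"likelihood": 4, "impact": 3, "closed": False},
            "standards gap":     {"likelihood": 2, "impact": 4, "closed": True},
            "delivery gap":      {"likelihood": 3, "impact": 5, "closed": False},
            "communication gap": {"likelihood": 2, "impact": 2, "closed": True},
            "service gap":       {"likelihood": 3, "impact": 4, "closed": False},
        }

        # Risk remains wherever a provider gap is open: score = likelihood x impact.
        for name, g in sorted(gaps.items(),
                              key=lambda kv: kv[1]["likelihood"] * kv[1]["impact"],
                              reverse=True):
            residual = 0 if g["closed"] else g["likelihood"] * g["impact"]
            print(f"{name:18s} residual risk = {residual}")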

    Risk Management Decision Making in ICT for Development

    This paper explores the concept of enterprise resiliency in Information and Communications Technologies (ICT) for development initiatives. ICT are necessary to improve access to vital services and, ultimately, to support efforts to improve economic conditions in developing regions. Access to information resources provides substantial benefits in the public and private sectors of regions with low standards of living. Success in achieving any benefit from ICT investment in a development enterprise is directly affected by the resiliency of the ICT systems and services, including technical and non-technical domains. We explore a framework for analyzing risks and threats to enterprise resiliency, and present guidance to support the development of resilient ICT for development.

    Fuzzy Weight Cluster-Based Routing Algorithm for Wireless Sensor Networks

    Cluster-based protocols are an important class of routing protocols in wireless sensor networks. However, because classical clustering algorithms distribute cluster heads unevenly, some nodes may run out of energy too early, which makes such algorithms unsuitable for large-scale wireless sensor networks. In this paper, a distributed clustering algorithm based on fuzzy weighted attributes is put forward to ensure both energy efficiency and extensibility. After all attributes have been considered comprehensively, the corresponding weight of each parameter is assigned using the direct method of fuzzy engineering theory. Each node then computes its property value; these property values are mapped onto the time axis, and a timer triggers each node to broadcast its cluster-head announcement. At the same time, a radio coverage method is adopted to avoid collisions and to ensure a symmetrical distribution of cluster heads. The aggregated data are forwarded to the sink node via multihop transmission. Simulation results demonstrate that the clustering algorithm based on fuzzy weighted attributes achieves a longer network lifetime and better extensibility than LEACH-like algorithms.
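
    A minimal sketch of the timer-based election follows, assuming placeholder attribute weights in place of those derived with the direct method of fuzzy engineering theory, and a simple linear mapping of property values onto the time axis:

        # Illustrative fuzzy-weighted property value and timer-based election;
        # weights, attributes, and the delay mapping are assumptions.
        import random

        W = {"energy": 0.5, "degree": 0.3, "proximity": 0.2}  # placeholder weights

        def property_value(node):
            # Weighted sum of normalized attributes; higher = better candidate.
            return sum(W[k] * node[k] for k in W)

        def broadcast_delay(node, t_max=1.0):
            # Map the property value onto the time axis: better candidates fire first.
            return t_max * (1.0 - property_value(node))

        nodes = [{"energy": random.random(), "degree": random.random(),
                  "proximity": random.random()} for _ in range(8)]

        # Nodes whose timers fire first announce themselves as cluster heads;
        # in the full algorithm, a node cancels its timer on hearing an
        # announcement within radio coverage (suppression omitted here).
        for n in sorted(nodes, key=broadcast_delay)[:2]:
            print(f"CH announcement fires at t={broadcast_delay(n):.3f}")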

    A consumer perspective e-commerce websites evaluation model

    Existing website evaluation methods have weaknesses such as neglecting consumer criteria, being unable to deal with qualitative criteria, and requiring complex weight and score calculations. This research aims to develop a hybrid consumer-oriented e-commerce website evaluation model based on the Fuzzy Analytic Hierarchy Process (FAHP) and the Hadamard Method (HM). Four phases were involved in developing the model: requirements identification, empirical study, model construction, and model confirmation. The requirements identification and empirical study served to identify critical web design criteria and to gather online consumers' preferences. Data collected from 152 Malaysian consumers through online questionnaires were used to identify critical e-commerce website features and their scale of importance. The new evaluation model comprises three components: first, the consumer evaluation criteria, consisting of the principles consumers consider important; second, the evaluation mechanisms, which integrate FAHP and HM through mathematical expressions that handle subjective judgments and new formulas for calculating the weight and score of each criterion; and third, the evaluation procedures, covering goal establishment, document preparation, and identification of website performance. The model was examined by six experts and applied to four case studies. The results show that the new model is practical and appropriate for evaluating e-commerce websites from the consumer's perspective, and that it can calculate weights and scores for qualitative criteria in a simple way. In addition, it helps decision-makers make decisions in a measured, objective way. The model also contributes new knowledge to the software evaluation field.
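
    As one concrete (assumed) realization of the weighting step, the sketch below derives crisp criterion weights from a triangular-fuzzy pairwise comparison matrix using Buckley's geometric-mean FAHP and applies them to a site's criterion scores. The matrix, criteria, and scores are invented, and the thesis's own FAHP/HM formulas are not reproduced:

        # Buckley-style FAHP weighting sketch with a hypothetical 3-criterion
        # comparison matrix; illustrative only.
        import numpy as np

        # Triangular fuzzy comparisons (l, m, u) for 3 criteria, e.g.
        # usability, trust, content quality (hypothetical judgments).
        M = [[(1, 1, 1), (2, 3, 4), (4, 5, 6)],
             [(1/4, 1/3, 1/2), (1, 1, 1), (1, 2, 3)],
             [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)]]

        n = len(M)
        # Fuzzy geometric mean per row, component-wise over (l, m, u).
        g = np.array([[np.prod([M[i][j][c] for j in range(n)]) ** (1 / n)
                       for c in range(3)] for i in range(n)])
        total = g.sum(axis=0)
        fuzzy_w = g / total[::-1]      # divide (l,m,u) by (sum_u, sum_m, sum_l)
        w = fuzzy_w.mean(axis=1)       # defuzzify by simple average
        w = w / w.sum()                # normalize crisp weights

        scores = np.array([0.8, 0.6, 0.9])  # site's rating per criterion in [0, 1]
        print("weights:", w.round(3), "site score:", float(w @ scores))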