48,906 research outputs found

    Extending twin support vector machine classifier for multi-category classification problems

    © 2013 IOS Press and the authors. All rights reserved. The twin support vector machine classifier (TWSVM), proposed by Jayadeva et al., was designed for binary classification problems. TWSVM not only overcomes the difficulty of handling exemplar imbalance in binary classification, but also trains a classifier about four times faster than the classical support vector machine. This paper proposes one-versus-all twin support vector machine classifiers (OVA-TWSVM) for multi-category classification problems by exploiting the strengths of TWSVM. OVA-TWSVM extends TWSVM to k-category classification by training k TWSVMs: the ith TWSVM solves the quadratic programming problems (QPPs) for the ith class and yields the ith nonparallel hyperplane corresponding to the ith class data. OVA-TWSVM thus uses the well-known one-versus-all (OVA) approach to construct the corresponding twin support vector machine classifier. We analyze the efficiency of OVA-TWSVM theoretically and test it experimentally on both synthetic data sets and several benchmark data sets from the UCI machine learning repository. Both the theoretical analysis and the experimental results demonstrate that OVA-TWSVM can outperform the traditional OVA-SVM classifier. Further experimental comparisons with other multiclass classifiers show that comparable performance can be achieved. This work is supported in part by the Fundamental Research Funds for the Central Universities of GK201102007 in PR China, by the Natural Science Basic Research Plan in Shaanxi Province of China (Program No. 2010JM3004), by the Chinese Academy of Sciences under the Innovative Group Overseas Partnership Grant, and by the Natural Science Foundation of China Major International Joint Research Project (No. 71110107026).
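    The one-versus-all decision step described above can be illustrated with a minimal sketch. It assumes each of the k trained TWSVMs has already produced a nonparallel hyperplane (w_i, b_i) for its own class and that a test point is assigned to the class whose hyperplane lies nearest; the training step (solving the two QPPs per TWSVM) is omitted, and the function name ova_twsvm_predict is a hypothetical illustration, not the authors' code.

```python
import numpy as np

# Hedged sketch of the OVA-TWSVM decision rule: each class i has its own
# nonparallel hyperplane w_i^T x + b_i = 0 obtained from its TWSVM; a sample
# is assigned to the class whose hyperplane it is closest to.

def ova_twsvm_predict(X, hyperplanes):
    """X: array of shape (n_samples, n_features);
    hyperplanes: list of k (w, b) pairs, one per class."""
    dists = []
    for w, b in hyperplanes:
        # perpendicular distance of every sample to this class's hyperplane
        dists.append(np.abs(X @ w + b) / np.linalg.norm(w))
    # the nearest hyperplane wins
    return np.argmin(np.column_stack(dists), axis=1)


# Toy usage with two hand-picked hyperplanes in 2-D (illustrative only)
X = np.array([[1.0, 0.1], [0.1, 1.0]])
planes = [(np.array([1.0, 0.0]), -1.0), (np.array([0.0, 1.0]), -1.0)]
print(ova_twsvm_predict(X, planes))  # -> [0 1]
```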

    Research on trust model in container-based cloud service

    Container virtualization aims to provide application isolation and resource sharing, and containers enable flexible cloud services. Compared with containers, traditional virtual machines are more demanding in both resources and cost, whereas container technology offers smaller images, faster migration, lower resource overhead, and higher utilization. In a container-based cloud environment, a service can be deployed across multiple target nodes. This paper reports research that improves the traditional trust model by taking cooperation effects into account. Cooperation trust means that, in a container-based cloud environment, a service can be split into multiple containers placed on different container nodes; when multiple target nodes work on one service at the same time, these nodes are in a cooperation state. When the target nodes cooperate to complete the service, they evaluate each other, and the resulting cooperation trust evaluation is used to update the degree of comprehensive trust. Experimental simulation results show that cooperation trust evaluation helps solve the trust problem in the container-based cloud environment and improves the success rate of subsequent cooperation.
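    The abstract does not give the paper's actual update formula, so the sketch below only illustrates the idea of cooperation trust: evaluations collected from nodes that cooperated on the same service are blended into a node's comprehensive trust. The blending weight alpha and the simple averaging of peer scores are assumptions made for illustration.

```python
import numpy as np

# Illustrative (assumed) update of comprehensive trust from cooperation trust.
# direct_trust: the node's prior trust value in [0, 1].
# peer_evaluations: ratings in [0, 1] given by the nodes that cooperated with
#                   it on the same service.
# alpha: assumed blending weight; the paper's real formula is not in the abstract.

def update_comprehensive_trust(direct_trust, peer_evaluations, alpha=0.6):
    if peer_evaluations:
        cooperation_trust = float(np.mean(peer_evaluations))
    else:
        cooperation_trust = direct_trust  # no cooperation data yet
    # blend the existing direct trust with the newly gathered cooperation trust
    return alpha * direct_trust + (1 - alpha) * cooperation_trust


# Example: a node with prior trust 0.7, rated [0.9, 0.8, 0.85] by its co-workers
print(update_comprehensive_trust(0.7, [0.9, 0.8, 0.85]))  # -> 0.76
```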

    Spin-current diode with a ferromagnetic semiconductor

    A diode is a key device in electronics: charge current flows through the device under a forward bias, while almost no current flows under a reverse bias. Here we propose the corresponding device in spintronics, the spin-current diode, in which the forward spin current is large while the reverse spin current is negligible. We show that a lead/ferromagnetic-quantum-dot/lead system and a lead/ferromagnetic-semiconductor/lead junction can work as spin-current diodes. The spin-current diode, a low-dissipation device, may find important applications in spintronics, just as the conventional charge-current diode does in electronics. Comment: 5 pages, 3 figures
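    For readers unfamiliar with the terminology, the spin current can be written (up to units) in terms of the spin-resolved channel currents, and the diode behaviour described above is an asymmetry of that quantity under bias reversal. The notation below is the standard textbook convention, not taken from the paper itself.

```latex
% Charge and spin currents via the spin-up and spin-down channel currents
% (standard definitions, up to unit conventions):
\[
  I_c = I_\uparrow + I_\downarrow, \qquad
  I_s = I_\uparrow - I_\downarrow, \qquad
  \lvert I_s(+V)\rvert \gg \lvert I_s(-V)\rvert ,
\]
% where the last inequality expresses the spin-current diode behaviour:
% a large forward spin current and a negligible reverse spin current.
```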