
    Existence and Exponential Stability of Periodic Solution for a Class of Generalized Neural Networks with Arbitrary Delays

    By the continuation theorem of coincidence degree and M-matrix theory, we obtain some sufficient conditions for the existence and exponential stability of periodic solutions for a class of generalized neural networks with arbitrary delays. These conditions are less restrictive than previously known criteria. Moreover, our results generalize and improve many existing ones.
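The abstract does not reproduce the M-matrix conditions themselves; as a minimal sketch (the matrix `A` below is an illustrative example, not taken from the paper), the nonsingular M-matrix property that such stability criteria typically rest on can be checked numerically:

```python
import numpy as np

def is_nonsingular_m_matrix(A, tol=1e-9):
    """A is a nonsingular M-matrix iff it is a Z-matrix (all off-diagonal
    entries <= 0) and every eigenvalue has positive real part."""
    off_diagonal = A - np.diag(np.diag(A))
    if np.any(off_diagonal > tol):
        return False  # not a Z-matrix
    return bool(np.all(np.linalg.eigvals(A).real > tol))

# Illustrative diagonally dominant Z-matrix (not from the paper).
A = np.array([[3.0, -1.0],
              [-1.0, 2.0]])
print(is_nonsingular_m_matrix(A))  # True
```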

    An LMI approach to global asymptotic stability of the delayed Cohen-Grossberg neural network via nonsmooth analysis

    In this paper, a linear matrix inequality (LMI) approach to the global asymptotic stability of the delayed Cohen-Grossberg neural network is investigated by means of nonsmooth analysis. Several new sufficient conditions are presented to ascertain the uniqueness of the equilibrium point and the global asymptotic stability of the neural network. Notably, the results herein require neither the smoothness of the behaved function or the activation function nor the boundedness of the activation function. In addition, theoretical analysis shows that the condition ensuring the global asymptotic stability of the network also implies the uniqueness of its equilibrium. The obtained results improve many earlier ones and are easy to apply. Simulation results are given to substantiate the theoretical findings.
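The paper's specific LMI is not given in the abstract; the following is a hedged sketch of how an LMI-type stability condition of the generic Lyapunov form (P positive definite, A'P + PA + Q negative definite) can be verified numerically. The matrices `A`, `P`, and `Q` are illustrative placeholders, not taken from the paper:

```python
import numpy as np

def is_positive_definite(M, tol=1e-9):
    """Check a symmetric matrix is positive definite via its eigenvalues."""
    return bool(np.min(np.linalg.eigvalsh((M + M.T) / 2)) > tol)

# Hypothetical stable system matrix and candidate Lyapunov matrices.
A = np.array([[-2.0, 0.5],
              [0.3, -1.5]])
P = np.eye(2)          # candidate P > 0
Q = 0.1 * np.eye(2)    # margin term Q > 0

# Generic LMI-type condition: P > 0 and A^T P + P A + Q < 0.
lmi = A.T @ P + P @ A + Q
feasible = is_positive_definite(P) and is_positive_definite(-lmi)
print(feasible)  # True for these illustrative values
```

In practice such conditions are solved for P with a semidefinite-programming solver rather than checked against a fixed guess; the eigenvalue test above only verifies a given candidate.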

    Existence and Global Exponential Stability of Periodic Solution to Cohen-Grossberg BAM Neural Networks with Time-Varying Delays

    We first investigate the existence of periodic solutions in general Cohen-Grossberg BAM neural networks with multiple time-varying delays by means of degree theory. Then, using this existence result and constructing a Lyapunov functional, we establish the global exponential stability of the periodic solution for these networks. Our stability result differs from existing ones: the monotonicity inequality conditions on the behaved functions in the works of Xia (2010) and Chen and Cao (2007) are removed, as is the boundedness assumption in the works of Zhang et al. (2011) and Li et al. (2009). We require only that the behaved functions satisfy sign conditions and that the activation functions are globally Lipschitz continuous.
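The two hypotheses kept by this result, a sign condition on the behaved function and global Lipschitz continuity of the activation, can be illustrated with a small numerical check. The functions and sample grid below are hypothetical examples, not the networks studied in the paper:

```python
import numpy as np

def satisfies_sign_condition(d, samples):
    """Sign condition: u * d(u) > 0 for all sampled u != 0."""
    u = samples[samples != 0.0]
    return bool(np.all(u * d(u) > 0))

def empirical_lipschitz(f, samples):
    """Largest observed slope |f(a)-f(b)| / |a-b| over sampled pairs;
    a lower bound on the true global Lipschitz constant."""
    a, b = np.meshgrid(samples, samples)
    mask = a != b
    slopes = np.abs(f(a[mask]) - f(b[mask])) / np.abs(a[mask] - b[mask])
    return float(np.max(slopes))

u = np.linspace(-5.0, 5.0, 201)
print(satisfies_sign_condition(lambda x: x, u))  # identity: True
print(empirical_lipschitz(np.tanh, u))           # close to 1, tanh's Lipschitz constant
```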

    Exploring the Learnability of Numeric Datasets

    When doing classification, it has often been observed that datasets exhibit different levels of difficulty with respect to how accurately they can be classified. That is, some datasets can be classified very accurately by many classification algorithms, while others cannot be classified with high accuracy by any classifier. Based on this observation, we address the following problems: a) what factors make a dataset easy or difficult to classify accurately? b) how can such factors be used to predict the difficulty of unclassified datasets? and c) how can such factors be used to improve classification? It turns out that the monotonic features of a dataset, along with some closely related structural properties, play an important role in determining how accurately it can be classified. More importantly, datasets comprised of highly monotonic data can usually be classified more accurately than datasets with weakly monotonic data. By further exploring these monotonicity-based properties, we observed that datasets can always be decomposed into a family of subsets, each of which is locally highly monotonic. Moreover, this dissertation proposes a methodology that uses the classification models inferred from the smaller but highly monotonic subsets to construct a highly accurate classification model for the original dataset. Two groups of experiments were conducted. The first group was performed to discover the relationships between data difficulty and monotonic properties and to represent these relationships in regression models, which were later used to predict the classification difficulty of unclassified datasets. In more than 95% of the predictions, the deviation between the predicted value and the real difficulty is smaller than 2.4%.
    The second group of experiments focused on the performance of the proposed meta-learning approach. According to the experimental results, the proposed approach consistently achieves significant improvements.
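The abstract does not define its monotonicity measure precisely; one common formalization, given here as an illustrative sketch (function and variable names are assumptions, not the dissertation's notation), counts the fraction of comparable example pairs whose labels respect the componentwise order:

```python
import numpy as np

def monotonicity_degree(X, y):
    """Fraction of comparable pairs (X[i] <= X[j] componentwise) whose
    labels agree with that order (y[i] <= y[j]). Illustrative measure:
    1.0 means fully monotonic data, lower values mean less monotonic."""
    n = len(X)
    comparable = consistent = 0
    for i in range(n):
        for j in range(n):
            if i != j and np.all(X[i] <= X[j]):
                comparable += 1
                if y[i] <= y[j]:
                    consistent += 1
    return consistent / comparable if comparable else 1.0

# Toy data: labels never decrease as features grow -> fully monotonic.
X = np.array([[1, 1], [2, 2], [3, 3]])
y = np.array([0, 1, 1])
print(monotonicity_degree(X, y))  # 1.0
```

Under the decomposition idea described above, a dataset scoring low on such a measure would be split into subsets that each score high, with one classifier trained per subset.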