    A model of wave propagation in an excitable medium based on the information transmission concept

    A new information-transmission-concept-based model of excitable media is proposed, with continuous outputs of the model's cells and variable excitation time. The continuous character of the outputs instigates infinitesimal inaccuracies in calculations, generating a countless number of cell excitation variants at the wave front even in a homogeneous and isotropic grid. The new approach reproduces many wave propagation patterns observed in real-world experiments and known simulation studies. The model suggests a new spiral breakup mechanism based on tensions and gradually deepening clefts that appear in front of the wave, caused by the uneven propagation speed of curved and planar segments of the wave. The analysis hints that wave breakdown and daughter-wavelet bursting behaviour are possibly inherent peculiarities of excitable media with weak ties between the cells, a short refractory period and a granular structure. The suggested model sits between cellular automata with discrete outputs and differential-equation-based models, and gives a new tool to simulate wave propagation patterns in applied disciplines. It is also a new line of attack aimed at understanding wave bursting, propagation and annihilation processes in isotropic homogeneous media.
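    The mechanism described lends itself to a compact simulation. Below is a minimal sketch, not the authors' model: a grid of cells with continuous (rather than binary) outputs, a short refractory period and neighbour-driven excitation. All constants (threshold, decay, refractory length) are illustrative assumptions.

```python
# Minimal sketch of an excitable-medium grid with continuous cell outputs.
# All constants are illustrative assumptions, not the authors' parameters.
import numpy as np

N = 100                      # grid size
THRESHOLD = 0.2              # excitation threshold (assumed)
REFRACTORY = 3               # refractory period in steps (assumed, short)
DECAY = 0.6                  # continuous output decay per step (assumed)

output = np.zeros((N, N))    # continuous cell outputs
timer = np.zeros((N, N), dtype=int)  # refractory countdown

# seed a planar wave front on the left edge
output[:, 0] = 1.0
timer[output > 0] = REFRACTORY

for step in range(200):
    # average of the four nearest neighbours drives excitation
    # (np.roll gives periodic boundaries, acceptable for a sketch)
    drive = 0.25 * (np.roll(output, 1, 0) + np.roll(output, -1, 0)
                    + np.roll(output, 1, 1) + np.roll(output, -1, 1))
    excitable = timer == 0
    fire = excitable & (drive > THRESHOLD)
    output = np.where(fire, 1.0, output * DECAY)   # continuous, not binary
    timer = np.where(fire, REFRACTORY, np.maximum(timer - 1, 0))
```

    Because the outputs decay continuously instead of switching off, tiny numerical differences at the wave front can accumulate from step to step, which is the effect the abstract attributes to the countless excitation variants.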

    Intrinsic dimensionality and small sample properties of classifiers

    Small learning-set properties of the Euclidean distance, the Parzen window, the minimum empirical error and the nonlinear single-layer perceptron classifiers depend on an "intrinsic dimensionality" of the data; the Fisher linear discriminant function, however, is sensitive to all dimensions. There is no unique definition of "intrinsic dimensionality". The dimensionality of the subspace where the data points are situated is not a sufficient definition. An exact definition depends both on the true distribution of the pattern classes and on the type of classifier used.
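    Since the abstract stresses that "intrinsic dimensionality" has no unique definition, the following sketch shows only one common proxy: the number of principal components needed to explain most of the variance of data lying near a low-dimensional subspace. The synthetic data and the 95% cutoff are illustrative assumptions.

```python
# One common proxy for "intrinsic dimensionality": the number of principal
# components needed to explain most of the variance. Only an illustration;
# the abstract notes the notion is not uniquely defined.
import numpy as np

rng = np.random.default_rng(0)
# 200 points living near a 3-dimensional subspace embedded in 20 dimensions
latent = rng.normal(size=(200, 3))
embed = rng.normal(size=(3, 20))
X = latent @ embed + 0.01 * rng.normal(size=(200, 20))  # small noise

cov = np.cov(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]        # descending
explained = np.cumsum(eigvals) / eigvals.sum()
# index of the first cumulative ratio >= 0.95, plus one = component count
intrinsic_dim = int(np.searchsorted(explained, 0.95)) + 1
print(intrinsic_dim)   # ~3 for this synthetic data
```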

    Sustainable economy inspired large-scale feed-forward portfolio construction

    To understand large-scale portfolio construction tasks, we analyse sustainable economy problems by splitting large tasks into smaller ones and offer an evolutionary feed-forward system-based approach. The theoretical justification for our solution is based on multivariate statistical analysis of multidimensional investment tasks, particularly on the relations between data size, algorithm complexity and portfolio efficacy. To reduce the dimensionality/sample size problem, a larger task is broken down into smaller parts by means of item similarity (clustering). Similar problems are given to smaller groups to solve; the groups, however, vary in many aspects. Pseudo-randomly formed groups compose a large number of modules of feed-forward decision-making systems. An evolution mechanism forms collections of the best modules for each short time period. Final solutions are carried forward to the global scale, where a collection of the best modules is chosen using a multiclass cost-sensitive perceptron. The collected modules are combined into a final solution with equal weights (a 1/N portfolio). The efficacy of the novel decision-making approach was demonstrated on a financial portfolio optimization problem, which supplied adequate amounts of real-world data. For portfolio construction, we used 11,730 simulated trading robot performances. The dataset covered the period from 2003 to 2012, when environmental changes were frequent and largely unpredictable. Walk-forward and out-of-sample experiments show that an approach based on sustainable economy principles outperforms benchmark methods, and that a shorter agent training history gives better results in periods of a changing environment.
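    A heavily simplified sketch of the divide-and-combine idea follows: group similar assets, keep the best performer of each group on a training window, and hold the winners equally weighted (1/N) out of sample. The grouping shortcut, window sizes and synthetic data are illustrative assumptions, not the paper's exact procedure (which uses evolutionary module selection and a cost-sensitive perceptron).

```python
# Sketch of divide-and-combine portfolio construction: group similar assets,
# pick the best of each group in-sample, combine the winners 1/N out of
# sample. All constants and the grouping shortcut are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_assets, n_days = 60, 500
returns = rng.normal(0.0002, 0.01, size=(n_days, n_assets))

train, test = returns[:400], returns[400:]

# crude similarity grouping: sort assets by their correlation to asset 0
# and split into contiguous clusters (a stand-in for real clustering)
corr = np.corrcoef(train, rowvar=False)
n_clusters = 6
order = np.argsort(corr[0])
clusters = np.array_split(order, n_clusters)

# within each cluster, keep the asset with the best training Sharpe ratio
winners = []
for members in clusters:
    sharpe = train[:, members].mean(0) / train[:, members].std(0)
    winners.append(members[np.argmax(sharpe)])

# 1/N combination of the selected modules, evaluated out of sample
portfolio = test[:, winners].mean(axis=1)
print(portfolio.mean() / portfolio.std())
```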

    Immunology-based sustainable portfolio management

    Immunological principles can be used to build a sustainable investment portfolio. The theory of immunology states that information about recognized pathogens is stored in the memory of the immune system; information about previous illnesses can be helpful when the pathogen re-enters the body. Real-time analysis of 11 automated financial trading datasets confirmed an analogous phenomenon in financial time series. Therefore, to increase the sustainability of the portfolio, we propose to train the portfolio on the most similar segments of historical data. The segment size and offset may vary depending on the dataset and time.
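    The retraining idea can be illustrated by a small search for the historical segment most similar to recent data, echoing the immune-memory analogy. The Euclidean distance, segment size and stride below are assumptions; as the abstract notes, segment size and offset may vary.

```python
# Sketch of training on the most similar historical segment: slide a window
# over the past and pick the segment closest (in Euclidean distance) to the
# most recent data. Window length and stride are assumptions.
import numpy as np

rng = np.random.default_rng(2)
series = rng.normal(size=2000)          # e.g. daily returns of one strategy
recent = series[-60:]                   # the current market "pathogen"

best_start, best_dist = 0, np.inf
# stop early enough that candidate segments never overlap the recent window
for start in range(0, len(series) - 120, 5):
    segment = series[start:start + 60]
    dist = np.linalg.norm(segment - recent)
    if dist < best_dist:
        best_start, best_dist = start, dist

training_data = series[best_start:best_start + 60]  # retrain on this segment
```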

    A neural network based investigation of high frequency components of the ECG

    A new information retrieval method is applied to detect low-amplitude, high-frequency components of the electrocardiogram (ECG). A special neural network using similarities to prototype features is suggested. Prognosis error is chosen as the measure of a signal's similarity to a prototype; this measure is preferable in the case of a poor signal-to-noise ratio. The new technique was successfully applied to classify ECG recordings of myocardial infarction (MI) patients with the complication of ventricular fibrillation (VF) versus MI patients who have not had VF, a problem where standard methods failed to provide satisfactory separation of the pattern classes.
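    A sketch of using prognosis (prediction) error as a similarity measure: fit a simple autoregressive predictor on a prototype signal, then score a new signal by how poorly that predictor forecasts it. The AR model order and least-squares fit are illustrative assumptions, not necessarily the network the paper uses.

```python
# Sketch of prognosis error as a similarity measure: a predictor fitted on
# a prototype forecasts similar signals well (low error) and dissimilar
# signals poorly (high error). AR order and fit method are assumptions.
import numpy as np

def fit_ar(signal, order=4):
    """Least-squares AR(order) coefficients for a 1-D signal."""
    X = np.column_stack([signal[i:len(signal) - order + i]
                         for i in range(order)])
    y = signal[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def prognosis_error(signal, coef):
    """Mean squared one-step prediction error of `coef` on `signal`."""
    order = len(coef)
    X = np.column_stack([signal[i:len(signal) - order + i]
                         for i in range(order)])
    return np.mean((signal[order:] - X @ coef) ** 2)

rng = np.random.default_rng(3)
prototype = np.sin(np.linspace(0, 20, 400)) + 0.05 * rng.normal(size=400)
coef = fit_ar(prototype)
# a signal resembling the prototype scores a low error, pure noise a high one
print(prognosis_error(prototype, coef),
      prognosis_error(rng.normal(size=400), coef))
```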

    Biologically inspired architecture of feedforward networks for signal classification

    The hypothesis is that in the lowest hidden layers of biological systems, "local subnetworks" smooth an input signal, and the smoothing accuracy may serve as a feature to feed the subsequent layers of the pattern classification network. The present paper suggests a multistage supervised and "unsupervised" training approach for the design and training of multilayer feed-forward networks. Following the methodology used in statistical pattern recognition systems, we functionally split the decision-making process into two stages. In the initial stage, we smooth the input signal in a number of different ways; in the second stage, we use the smoothing accuracy as a new feature to perform the final classification.
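    The two-stage idea can be sketched as follows: smooth the input in several different ways and use each smoother's residual error as a feature for a later classifier. The moving-average smoothers and their widths are illustrative assumptions.

```python
# Sketch of the two-stage idea: smooth the input several ways, then use
# each smoother's residual error as a feature for a later classifier.
# The moving-average widths are illustrative assumptions.
import numpy as np

def smoothing_errors(signal, widths=(3, 7, 15)):
    """Return one feature per smoother: its mean squared residual."""
    feats = []
    for w in widths:
        kernel = np.ones(w) / w
        smoothed = np.convolve(signal, kernel, mode="same")
        feats.append(np.mean((signal - smoothed) ** 2))
    return np.array(feats)

rng = np.random.default_rng(4)
slow = np.sin(np.linspace(0, 6, 300))     # smooth class: small residuals
fast = np.sin(np.linspace(0, 60, 300))    # oscillatory class: large residuals
print(smoothing_errors(slow + 0.1 * rng.normal(size=300)))
print(smoothing_errors(fast + 0.1 * rng.normal(size=300)))
```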

    Multiple classifiers system for reducing influences of atypical observations

    Atypical observations, called outliers, are one of the difficulties in applying standard Gaussian-density-based pattern classification methods. A large number of outliers makes the distribution densities of input features multimodal, and the problem becomes especially challenging in a high-dimensional feature space. To tackle atypical observations, we propose multiple classifier systems (MCSs) whose base classifiers work on different transformed representations of the original features; this makes it possible to deal with outliers in different ways. As the base classifier, we employ an integrated statistical and neural network approach consisting of data whitening followed by training of a single layer perceptron (SLP). Data whitening makes marginal distributions close to unimodal, and the SLP is robust to outliers. Various combination strategies for the base classifiers reduced the generalization error in comparison with the benchmark method, regularized discriminant analysis (RDA).
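    A sketch of the base classifier the abstract describes, under assumed training constants: whiten the pooled data so its covariance becomes the identity, then train a single layer perceptron by gradient descent.

```python
# Sketch of the base classifier: whitening followed by a single layer
# perceptron (logistic unit). Learning rate and epoch count are assumptions.
import numpy as np

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(-1, 1, size=(100, 5)),
               rng.normal(+1, 1, size=(100, 5))])
y = np.repeat([0, 1], 100)

# whitening: rotate and rescale so the covariance becomes the identity
Xc = X - X.mean(0)
eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
Xw = Xc @ eigvec / np.sqrt(eigval)

# single layer perceptron trained by gradient descent on the logistic loss
w, b = np.zeros(Xw.shape[1]), 0.0
for _ in range(200):
    p = 1 / (1 + np.exp(-(Xw @ w + b)))
    grad = Xw.T @ (p - y) / len(y)
    w -= 0.5 * grad
    b -= 0.5 * np.mean(p - y)
print(np.mean((p > 0.5) == y))   # training accuracy
```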

    Structures of Covariance Matrix in Handwritten Character Recognition

    The integrated approach is a classifier built on a statistical estimator and an artificial neural network. It consists of a preliminary data whitening transformation, which provides a good starting weight vector, and fast training of a single layer perceptron (SLP). If the sample size is extremely small in comparison with the dimensionality, this approach can be ineffective. In the present paper, we consider the joint utilization of structured sample covariance matrices and conventional regularization techniques in order to improve recognition performance in the very difficult case where dimensionality and sample size do not differ essentially. The techniques considered reduce the number of parameters estimated from the training set. We applied our methodology to handwritten Japanese character recognition and found that the combination of the integrated approach, conventional regularization and various structurization methods for the covariance matrix outperforms other methods, including optimized regularized discriminant analysis (RDA).
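    Conventional covariance regularization can be sketched as shrinkage toward a structured target; here the sample covariance is shrunk toward its own diagonal, which reduces the number of effectively free parameters when the sample size is close to the dimensionality. The shrinkage weight is an illustrative assumption.

```python
# Sketch of covariance regularization by shrinkage toward a structured
# (diagonal) target. The shrinkage weight alpha is an assumption.
import numpy as np

def shrink_covariance(X, alpha=0.3):
    """Convex combination of the sample covariance and its diagonal."""
    S = np.cov(X, rowvar=False)
    target = np.diag(np.diag(S))          # structured target: diagonal
    return (1 - alpha) * S + alpha * target

rng = np.random.default_rng(6)
X = rng.normal(size=(30, 25))             # sample size close to dimension
S_reg = shrink_covariance(X)
# the shrunk estimate is far better conditioned than the raw sample covariance
print(np.linalg.cond(np.cov(X, rowvar=False)), np.linalg.cond(S_reg))
```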

    Semiempirical robust algorithm for investment portfolio formation

    When analyzing stock market data, it is common to encounter observations that differ from the overall pattern; handling them is known as the problem of robustness. The presence of outlying observations in different datasets may strongly influence the results of classical (mean- and standard-deviation-based) analysis methods, or of models built on such data. The problem of outliers can be handled by using robust estimators, making aberrations less influential or ignoring them completely. An example of applying such procedures to outlier elimination in a stock trading system optimization process is presented.
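    A sketch of one such robust procedure, assuming the common median/MAD convention rather than the paper's specific estimator: flag observations whose distance from the median exceeds three scaled median absolute deviations, then estimate on the remaining data.

```python
# Sketch of robust outlier handling with the median/MAD rule. The 3-MAD
# cutoff is a common convention, not necessarily the paper's choice.
import numpy as np

rng = np.random.default_rng(7)
returns = rng.normal(0, 0.01, size=500)
returns[[10, 200, 321]] = [0.4, -0.5, 0.35]       # injected outliers

med = np.median(returns)
mad = 1.4826 * np.median(np.abs(returns - med))   # scaled to match sigma
keep = np.abs(returns - med) < 3 * mad            # robust inlier mask

print(returns.mean(), returns[keep].mean())       # classical vs cleaned mean
```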