    Effective Feature Selection for Classification of Promoter Sequences

    Exploring novel computational methods for making sense of biological data has been not only a necessity but also productive. Part of this trend is the search for more efficient in silico methods and tools for the analysis of promoters, the parts of DNA sequences involved in regulating the expression of genes into other functional molecules. Promoter regions vary greatly in function depending on their nucleotide sequence and the arrangement of short protein-binding regions called motifs. In fact, the regulatory nature of promoters appears to be driven largely by the selective presence and/or arrangement of these motifs. Here, we explore computational classification of promoter sequences based on the pattern of motif distributions, as such classification can pave a new way toward functional analysis of promoters and the discovery of functionally crucial motifs. We use Position Specific Motif Matrix (PSMM) features to explore how accurately promoter sequences can be classified with several popular classification techniques. Classification results on the complete feature set are poor, perhaps because of the huge number of features, so we propose two ways of reducing the features. Our test results show improved classification output after feature reduction. The results also show that decision trees outperform SVM (Support Vector Machine), KNN (K Nearest Neighbor) and the ensemble classifier LibD3C, particularly with reduced features. The proposed feature selection methods outperform popular feature transformation methods such as PCA and SVD, and they are as accurate as the MRMR feature selection method but much faster. Such methods could be useful for categorizing new promoters and exploring the regulatory mechanisms of gene expression in complex eukaryotic species.
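
    The abstract describes a pipeline of variance-based feature reduction over PSMM features followed by a comparison of classifiers. The following is only a minimal scikit-learn sketch of that kind of pipeline, not the paper's implementation: `psmm_features` and `labels` are hypothetical stand-ins for the PSMM feature matrix and promoter class labels, the variance cutoff is arbitrary, and LibD3C is omitted because it is a separate Weka-based tool.

```python
import numpy as np
from sklearn.feature_selection import VarianceThreshold
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Hypothetical stand-in for a PSMM feature matrix:
# rows = promoter sequences, columns = position-specific motif features.
psmm_features = rng.random((200, 5000))
labels = rng.integers(0, 2, size=200)          # two promoter classes

# Variance-based feature reduction: drop near-constant motif columns.
reducer = VarianceThreshold(threshold=0.05)     # cutoff chosen arbitrarily
reduced = reducer.fit_transform(psmm_features)
print(f"features kept: {reduced.shape[1]} of {psmm_features.shape[1]}")

# Compare the classifiers mentioned in the abstract.
classifiers = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "SVM (linear)": make_pipeline(StandardScaler(), SVC(kernel="linear")),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, reduced, labels, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```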

    SVM Classification Results for five different kernels for Test v/s Background1 (Variance Reduced).

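    The figure above compares SVM accuracy across five kernels for the variance-reduced features. The exact kernels are not listed in this excerpt; the sketch below assumes a common choice (linear, polynomial of degrees 2 and 3, RBF, sigmoid) and uses synthetic data as a stand-in for the test-vs-background set.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical stand-in for the variance-reduced test-vs-background data.
X_reduced, y = make_classification(n_samples=200, n_features=100, random_state=0)

# Five assumed kernel configurations; the paper's exact choices may differ.
kernels = [
    ("linear", SVC(kernel="linear")),
    ("poly (degree 2)", SVC(kernel="poly", degree=2)),
    ("poly (degree 3)", SVC(kernel="poly", degree=3)),
    ("rbf", SVC(kernel="rbf")),
    ("sigmoid", SVC(kernel="sigmoid")),
]
for name, svc in kernels:
    model = make_pipeline(StandardScaler(), svc)
    scores = cross_val_score(model, X_reduced, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```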

    Feature reduction (Variance) pattern for 3 files of dataset 2.


    Analysis of classification accuracies on dataset 2. Fig 10(a): Decision Trees; Fig 10(b): different classifiers; Fig 10(c): different feature selections/transformations.
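
    Panel 10(c) contrasts feature selection with feature transformation front-ends. As an illustrative sketch only, the code below feeds a fixed decision tree from several front-ends; the ANOVA F-test stands in for the paper's P-value-based selection, the data and component counts are arbitrary, and MRMR is omitted because it is an external tool rather than part of scikit-learn.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA, TruncatedSVD
from sklearn.feature_selection import SelectKBest, VarianceThreshold, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Hypothetical stand-in for dataset 2's PSMM feature matrix and labels.
X, y = make_classification(n_samples=300, n_features=500, n_informative=20,
                           random_state=0)

# Feature selection/transformation front-ends compared with a fixed decision tree.
frontends = {
    "variance threshold": VarianceThreshold(threshold=0.01),
    "P value proxy (ANOVA F-test, top 50)": SelectKBest(f_classif, k=50),
    "PCA (50 components)": PCA(n_components=50, random_state=0),
    "SVD (50 components)": TruncatedSVD(n_components=50, random_state=0),
}
for name, step in frontends.items():
    model = make_pipeline(step, DecisionTreeClassifier(random_state=0))
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```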

    LibD3C classification accuracies for MRMR and P value reduced features on dataset 2.


    SVM Classification Results for Linear Kernel for test v/s all five backgrounds (Variance Reduced).


    Decision Tree Classification Results for test v/s all five backgrounds (Variance Reduced).


    Hypothetical feature matrix of PSMMs of 4 promoters from two classes and their P values.

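    The table above shows per-feature P values for a hypothetical PSMM feature matrix of four promoters drawn from two classes. The sketch below reconstructs that kind of computation with made-up numbers and a two-sample t-test; the actual values and statistical test used in the paper are not reproduced here.

```python
import numpy as np
from scipy import stats

# Hypothetical PSMM feature matrix: 4 promoters (rows) x 5 motif features
# (columns); the first two promoters belong to class A, the last two to class B.
features = np.array([
    [0.9, 0.1, 0.4, 0.0, 0.7],
    [0.8, 0.2, 0.5, 0.1, 0.6],
    [0.1, 0.9, 0.4, 0.0, 0.2],
    [0.2, 0.8, 0.5, 0.1, 0.3],
])
class_a, class_b = features[:2], features[2:]

# Per-feature two-sample t-test P values (one possible choice of test).
_, p_values = stats.ttest_ind(class_a, class_b, axis=0)
print("P values:", np.round(p_values, 4))

# Keep features whose P value falls below an arbitrary cutoff.
keep = p_values < 0.05
print("selected feature indices:", np.flatnonzero(keep))
```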