9 research outputs found

    Classification performances of an information flow on two scales.

    No full text
<p>95% box-plots for the three-cluster scale are shown in yellow with observed entropies marked in blue, while those for the nine-cluster scale are in black with observed entropies marked in red. The clusters from left to right are arranged to correspond exactly to the clusters from bottom to top in <a href="http://www.plosone.org/article/info:doi/10.1371/journal.pone.0198253#pone.0198253.g009" target="_blank">Fig 9(B)</a>. Each box is built from 1000 simulated entropy values obtained via simple random sampling without replacement.</p>
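The box-plot construction described here — comparing each cluster's observed entropy against entropies of 1000 same-size simple random samples drawn without replacement from the pooled labels — can be sketched as follows. This is a minimal illustration; the helper names and toy labels are assumptions, not from the paper.

```python
import math
import random
from collections import Counter

def entropy(labels):
    """Shannon entropy (in nats) of the empirical label distribution."""
    n = len(labels)
    return -sum((c / n) * math.log(c / n) for c in Counter(labels).values())

def null_entropies(all_labels, cluster_size, n_sim=1000, seed=0):
    """Entropies of n_sim simple random samples (without replacement)
    of cluster_size labels drawn from the pooled label list."""
    rng = random.Random(seed)
    return [entropy(rng.sample(all_labels, cluster_size)) for _ in range(n_sim)]

# Toy example: a cluster purer than chance.
pooled = ["F"] * 50 + ["M"] * 50      # pooled gender labels (illustrative)
cluster = ["F"] * 18 + ["M"] * 2      # one observed cluster of size 20
null = null_entropies(pooled, len(cluster))
observed = entropy(cluster)
# An observed entropy sitting below the bulk of the null box-plot
# indicates a significantly homogeneous cluster.
print(observed, sum(e > observed for e in null) / len(null))
```

A box-plot of `null` with `observed` overlaid reproduces the yellow-vs-blue (or black-vs-red) comparison in the figure.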

    An expandable logistic regression setup and possible heterogeneity.

    <p>(A) Binary horizontal layout with respect to the MLE; (B) Histogram with calculated entropies for each cluster. The high degree of overlap between the two horizontal layouts in (A) indicates the inefficiency of logistic regression. In contrast, the heterogeneity within each gender category in (B) gives rise to precise results in the clusters with low entropies.</p>
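The contrast drawn in this caption — a logistic MLE defeated by overlapping layouts while clusters retain low entropy — can be illustrated with a toy sketch. The gradient-ascent fit, the symmetric toy data, and the three-way clustering are all assumptions for illustration, not the paper's computation.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(x, y, lr=0.1, steps=2000):
    """MLE of a 1-D logistic regression P(y=1|x) = sigmoid(a + b*x)
    via plain gradient ascent on the log-likelihood."""
    a = b = 0.0
    n = len(x)
    for _ in range(steps):
        ga = gb = 0.0
        for xi, yi in zip(x, y):
            err = yi - sigmoid(a + b * xi)
            ga += err
            gb += err * xi
        a += lr * ga / n
        b += lr * gb / n
    return a, b

def entropy2(p):
    """Binary Shannon entropy (nats) of a proportion p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log(p) + (1 - p) * math.log(1 - p))

# Overlapping layouts: label 1 occupies the middle of the x-range,
# so no monotone logistic curve can separate the labels ...
x = [-2.1, -2.0, -1.9, -0.1, 0.0, 0.1, 1.9, 2.0, 2.1]
y = [0, 0, 0, 1, 1, 1, 0, 0, 0]
a, b = fit_logistic(x, y)          # fitted slope b stays at 0: uninformative
# ... yet clustering x into low/middle/high gives pure clusters.
clusters = [y[:3], y[3:6], y[6:]]
print(b, [entropy2(sum(c) / len(c)) for c in clusters])
```

The slope converges to 0 (the regression predicts the base rate everywhere), while every cluster has entropy 0 — the same "precise results in clusters with low entropies" the caption describes.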

    Heatmaps via DM on heart disease data.

    <p>(a) Mutual entropy matrix of all features with two synergistic groups; (b) Coupling geometries of all features. Red denotes patients, black denotes healthy subjects.</p>
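The caption does not define the pairwise quantity behind such a matrix, so the following is only one plausible sketch: a symmetrized, normalized conditional entropy between two categorical (or discretized) features, where 0 means the features determine each other and values near 1 mean near-independence. Function names are illustrative.

```python
import math
from collections import Counter

def H(xs):
    """Shannon entropy (nats) of a categorical sequence."""
    n = len(xs)
    return -sum((c / n) * math.log(c / n) for c in Counter(xs).values())

def H_cond(xs, ys):
    """Conditional entropy H(X | Y) for paired categorical sequences."""
    n = len(xs)
    by_y = {}
    for xi, yi in zip(xs, ys):
        by_y.setdefault(yi, []).append(xi)
    return sum(len(g) / n * H(g) for g in by_y.values())

def mutual_cond_entropy(xs, ys):
    """Average of the two normalized conditional entropies:
    0 = perfectly coupled features, ~1 = independent features."""
    hx, hy = H(xs), H(ys)
    cx = H_cond(xs, ys) / hx if hx else 0.0
    cy = H_cond(ys, xs) / hy if hy else 0.0
    return 0.5 * (cx + cy)

# Perfectly coupled pair -> 0; independent pair -> 1.
print(mutual_cond_entropy([0, 0, 1, 1], [1, 1, 0, 0]))
print(mutual_cond_entropy([0, 1, 0, 1], [0, 0, 1, 1]))
```

Evaluating this measure over all feature pairs and hierarchically clustering the resulting matrix is what would surface "synergistic groups" as blocks in the heatmap.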

    Information flow of height data.

    <p>(A) and (B): mutual conditional-entropy matrices for the response and covariate features, respectively; (C) Information flow to all covariate features; (D) Information flow to the #1 feature-group and then to the #2 feature-group.</p>
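One way to quantify such an "information flow" between a response clustering and a covariate clustering — a guess at the underlying computation, with illustrative names and toy labels — is the entropy of covariate-cluster memberships within each response cluster:

```python
import math
from collections import Counter

def label_entropy(labels):
    """Shannon entropy (nats) of the empirical label distribution."""
    n = len(labels)
    return -sum((c / n) * math.log(c / n) for c in Counter(labels).values())

def information_flow(resp_clusters, cov_clusters):
    """Per response-cluster entropy of covariate-cluster memberships.
    Low entropy = the response cluster maps cleanly onto covariate clusters."""
    flow = {}
    for r, c in zip(resp_clusters, cov_clusters):
        flow.setdefault(r, []).append(c)
    return {r: label_entropy(cs) for r, cs in flow.items()}

# Toy example: response cluster "tall" lands entirely in covariate cluster 2,
# while "short" is split across covariate clusters 1 and 2.
resp = ["tall", "tall", "short", "short", "short"]
cov = [2, 2, 1, 1, 2]
print(information_flow(resp, cov))
```

A heatmap of the covariate features with rows reordered by response cluster, annotated with these entropies, would resemble the flows shown in panels (C) and (D).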

    Two features’ hierarchical clustering trees and corresponding empirical distributions and possibly-gapped histograms.

    <p>(A) Brain weight’s hierarchical clustering tree marked with 7 clusters; (B) Head size’s hierarchical clustering tree marked with 8 clusters; (C) The empirical distribution of head size superimposed with an 8-piece linear approximation showing possible gaps; (D) The possibly-gapped histogram with 8 bins colored by gender proportions; (E) The empirical distribution of brain weight superimposed with a 7-piece linear approximation showing possible gaps; (F) The possibly-gapped histogram with 7 bins colored by gender proportions. Note that both histograms in (D) and (F) have two visible gaps separating the far-left and far-right bins. This is strong evidence of dependency between these two features. Red denotes female and blue male.</p>
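The paper's possibly-gapped histograms come from piecewise-linear approximation of the empirical distribution; as a rough illustration of the gap idea only, here is a simpler spacing-based rule (the threshold and function name are assumptions): a "possible gap" is any spacing between consecutive sorted values that greatly exceeds the typical spacing, and bins are the runs between gaps.

```python
def gapped_bins(values, gap_factor=3.0):
    """Split sorted data into bins wherever a spacing exceeds
    gap_factor times the median spacing -- a 'possible gap'."""
    xs = sorted(values)
    diffs = sorted(b - a for a, b in zip(xs, xs[1:]))
    median_gap = diffs[len(diffs) // 2]
    bins, current = [], [xs[0]]
    for a, b in zip(xs, xs[1:]):
        if b - a > gap_factor * median_gap:
            bins.append(current)   # close the bin at the gap
            current = []
        current.append(b)
    bins.append(current)
    return bins

# Toy data with two visible gaps separating far-left and far-right bins,
# mimicking the structure described for panels (D) and (F).
data = [1.0, 1.1, 1.2, 5.0, 5.1, 5.2, 5.3, 9.0, 9.1]
print([len(b) for b in gapped_bins(data)])   # -> [3, 4, 2]
```

Coloring each resulting bin by its gender proportions would then reproduce the red/blue stacked bars of the figure.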

    Information flows from response’s fine-scale perspective.

    <p>(A) Mutual conditional-entropy matrix superimposed with the DCG tree, showing 4 synergistic feature-groups; the information flows from the response to (B) the #2 synergistic feature-group and (C) and (D) the #1 synergistic feature-group. The misclassified subjects’ ID numbers are attached to the right side of each heatmap.</p>

    Mutual conditional entropy matrices and two information flows on bird data.

    <p>(A) Mutual conditional-entropy matrix of 3 response features divided into two synergistic groups; (B) 7 × 7 mutual conditional-entropy matrix of covariate features with two color-coded synergistic groups; (C) Information flow from the response heatmap to the covariate heatmap, showing heterogeneity; (D) Information flow from the two response features V2 and V3 to the two covariate heatmaps pertaining to the two synergistic groups.</p>

    Information flows from response’s coarse-scale perspective.

    <p>The information flows from the response to (A) the #1 synergistic feature-group; (B) serially to the #2, #3, #1, and then #4 synergistic feature-groups.</p>

    Information flows: (A) for binary gender; (B) for binary age.

    <p>The information flow in (A) shows rather evident associative patterns from the gender tree, with its male- and female-specific clusters, to the DCG tree based on head size and brain weight with 6 clusters. Except for one, all clusters have extremely or relatively low entropies. This result shows the effectiveness of information flow over classic logistic regression. The information flows in (A) and (B) share a cluster with extremely low values of the two features.</p>