Non-Negative Discriminative Data Analytics
Due to advancements in data acquisition techniques, collecting datasets that represent samples from multiple views has become increasingly common (Jia et al. 2019). For instance, in genomics, a lymphoma patient's dataset may include gene expression, single nucleotide polymorphism (SNP), and array comparative genomic hybridization (aCGH) measurements. Learning from multiple views of the same object generally yields a better understanding of the hidden patterns in the data than learning from a single view. Most existing multi-view learning techniques, such as canonical correlation analysis (Hotelling 1936), multi-view support vector machines (Farquhar et al. 2006), and multiple kernel learning (Zhang et al. 2016), focus on extracting the information shared among multiple datasets.
However, in some real-world applications it is appealing to extract the discriminative knowledge of multiple datasets instead, namely discriminative data analytics. For example, consider one dataset of gene-expression measurements of cancer patients and another of gene-expression levels of healthy volunteers, where the goal is to cluster the cancer patients according to molecular subtypes. Performing a single-view analysis such as principal component analysis (PCA) on either dataset alone yields information dominated by the knowledge common to the two datasets (Garte et al. 1996). To address this challenge, contrastive PCA (Abid et al. 2017) and discriminative (d) PCA (Jia et al. 2019) have been proposed to extract dataset-specific information that is often missed by PCA.
Inspired by dPCA, we propose a novel discriminative multi-view learning algorithm, namely Non-negative Discriminative Analysis (DNA), to extract the unique information of one dataset (a.k.a. view) with respect to the other dataset. This boils down to solving a non-negative matrix factorization problem. Furthermore, we apply the proposed DNA framework to various real-world downstream machine learning applications such as feature selection, dimensionality reduction, classification, and clustering.
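As a point of reference, the sketch below shows plain non-negative matrix factorization with the classical Lee-Seung multiplicative updates, i.e., the kind of subproblem the abstract refers to. It is not the authors' DNA formulation; the matrix shapes, rank k, and iteration count are illustrative assumptions.

```python
# Minimal NMF sketch (Lee-Seung multiplicative updates), X ~ W @ H with W, H >= 0.
# Illustrative only; not the DNA algorithm proposed in the paper.
import numpy as np

def nmf(X, k, n_iter=200, eps=1e-10, seed=0):
    """Factor a non-negative (m x n) matrix X into W (m x k) and H (k x n)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        # Multiplicative updates that monotonically decrease ||X - WH||_F^2.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factor a random non-negative "expression-like" matrix and report the fit.
X = np.abs(np.random.default_rng(1).random((100, 40)))
W, H = nmf(X, k=5)
print(np.linalg.norm(X - W @ H) / np.linalg.norm(X))  # relative reconstruction error
```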
An entire function sharing fixed points with its linear differential polynomial
We study the uniqueness of entire functions when they share a linear polynomial, in particular fixed points, with their linear differential polynomials.
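For context, the standard Nevanlinna-theoretic notion of sharing used in such uniqueness results can be sketched as follows; the precise CM/IM hypotheses of the paper are not reproduced here.

```latex
% One common convention: for an entire function f and a linear differential
% polynomial L[f] = a_1 f' + a_2 f'' + \dots + a_n f^{(n)}, we say f and L[f]
% share the polynomial a(z) (ignoring multiplicities) if their a-points coincide:
f(z) - a(z) = 0 \iff L[f](z) - a(z) = 0 .
% With a(z) = z, the shared points are exactly the common fixed points of f and L[f].
```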
Uniqueness of an entire function sharing a polynomial with its linear differential polynomial
In this paper we consider an entire function that shares a polynomial with its linear differential polynomial. Our result is an improvement of a result of P. Li.
Uniqueness of an entire function sharing fixed points with its derivatives
The uniqueness problems of entire functions that share a nonzero finite value with their derivatives have been studied, and many results on this topic have been obtained. In this paper we prove a uniqueness theorem for an entire function that shares a linear polynomial, in particular fixed points, with its higher-order derivatives.
An entire function sharing a polynomial with its linear differential polynomial
We study the uniqueness of entire functions which share a polynomial with their linear differential polynomials.
COVID-19 or Flu? Discriminative Knowledge Discovery of COVID-19 Symptoms from Google Trends Data
Google Trends data analytics has been gaining attention in the past few years, and most state-of-the-art algorithms focus on forecasting. How to extract knowledge about the symptoms most related to COVID-19 by contrasting periods with and without the spread of COVID-19 in Google Trends data has not been investigated. To this end, we propose a novel non-negative discriminative analysis (DNA) to extract the unique information of one dataset relative to another dataset. Numerical tests corroborate the efficacy of the proposed approaches in discovering the three COVID-19 symptoms unique with respect to flu, namely ageusia, shortness of breath, and anosmia, which prior art fails to identify.
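The abstract does not specify the data-collection tooling; as one plausible way to obtain such contrasting search-interest series, the sketch below uses the third-party pytrends package, with the symptom keywords and the two time windows (a period without and a period with COVID-19 spread) chosen purely for illustration.

```python
# Hedged sketch: pull symptom search-interest series for two contrasting periods
# from Google Trends via pytrends. Keywords and timeframes are illustrative choices,
# not the dataset used in the paper.
from pytrends.request import TrendReq

symptoms = ["ageusia", "shortness of breath", "anosmia", "fever", "cough"]
pytrends = TrendReq(hl="en-US", tz=360)

frames = {}
for timeframe in ["2019-01-01 2019-06-30",   # period without COVID-19 spread
                  "2020-01-01 2020-06-30"]:  # period with COVID-19 spread
    pytrends.build_payload(kw_list=symptoms, timeframe=timeframe, geo="US")
    frames[timeframe] = pytrends.interest_over_time().drop(columns="isPartial")

# The two DataFrames play the role of the "background" and "target" views
# that a discriminative analysis would contrast.
for timeframe, df in frames.items():
    print(timeframe, df.mean().round(1).to_dict())
```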