Model Selection for Nonnegative Matrix Factorization by Support Union Recovery
Nonnegative matrix factorization (NMF) has been widely used in machine
learning and signal processing because of its non-subtractive, part-based
property which enhances interpretability. It is often assumed that the latent
dimensionality (or the number of components) is given. Despite the large amount
of algorithms designed for NMF, there is little literature about automatic
model selection for NMF with theoretical guarantees. In this paper, we propose
an algorithm that first calculates an empirical second-order moment from the
empirical fourth-order cumulant tensor, and then estimates the latent
dimensionality by recovering the support union (the index set of non-zero rows)
of a matrix related to the empirical second-order moment. By assuming a
generative model of the data with additional mild conditions, our algorithm
provably detects the true latent dimensionality. We show on synthetic examples
that our proposed algorithm is able to find an approximately correct number of
components.
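The support-union idea above can be sketched in a few lines: once a matrix related to the empirical second-order moment is available, the latent dimensionality is read off as the number of its non-zero rows. The function below is a toy illustration under that assumption (the matrix `B` and threshold `tau` are stand-ins, not the paper's exact construction).

```python
import numpy as np

def estimate_latent_dim(B, tau):
    """Estimate the latent dimensionality as the size of the support union:
    the number of rows of B whose l2 norm exceeds the threshold tau.
    B stands in for the matrix derived from the empirical second-order
    moment; tau is a user-chosen noise threshold."""
    row_norms = np.linalg.norm(B, axis=1)
    return int(np.sum(row_norms > tau))

# Toy example: a matrix with 3 "active" rows and small noise elsewhere.
rng = np.random.default_rng(0)
B = rng.normal(scale=0.01, size=(10, 5))
B[:3] += 1.0                              # three non-zero rows -> latent dim 3
print(estimate_latent_dim(B, tau=0.5))    # -> 3
```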
Solving Quadratic Systems with Full-Rank Matrices Using Sparse or Generative Priors
The problem of recovering a signal from a quadratic system
$\{y_i=\boldsymbol{x}^\top\boldsymbol{A}_i\boldsymbol{x},\
i=1,\ldots,m\}$ with full-rank matrices $\boldsymbol{A}_i$ is considered in the regime $m\ll n$, where structural priors on $\boldsymbol{x}$ make recovery possible. For a $k$-sparse $\boldsymbol{x}$, recovery is guaranteed with $m=O(k^2\log n)$ measurements, improvable to $m=O(k\log n)$; for $\boldsymbol{x}$ in the range of an $L$-Lipschitz generative model with $k$-dimensional input and $\ell_2$-ball range of radius $r$, the $\ell_2$ recovery error is $O\big(\sqrt{\frac{k \log L}{m}}\big)$ with $m=O(k\log(Lnr))$ measurements, tightening to an $\ell_2$ error of $O(\delta)$ when $m=O(k\log\frac{Lrn}{\delta^2})$. Experimental results corroborate our
theoretical findings and show that: (i) our approach for the sparse case
notably outperforms the existing provable algorithm sparse power factorization;
(ii) leveraging the generative prior allows for precise image recovery in the
MNIST dataset from a small number of quadratic measurements.
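The quadratic measurement model can be simulated directly. The sketch below generates $y_i=\boldsymbol{x}^\top\boldsymbol{A}_i\boldsymbol{x}$ for a $k$-sparse signal; the Gaussian choice of the $\boldsymbol{A}_i$ and the dimensions are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 20, 50, 3

# k-sparse signal: k random positions carry Gaussian values.
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)

# Measurement matrices A_i (Gaussian, hence full-rank almost surely).
A = rng.normal(size=(m, n, n))

# y_i = x^T A_i x for every i, vectorized via einsum.
y = np.einsum('i,mij,j->m', x, A, x)
print(y.shape)  # (50,)
```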
Enabling Quality Control for Entity Resolution: A Human and Machine Cooperation Framework
Even though many machine-based algorithms have been proposed for entity resolution,
it remains very challenging to find a solution with quality guarantees. In this
paper, we propose a novel HUman and Machine cOoperation (HUMO) framework for
entity resolution (ER), which divides an ER workload between the machine and
the human. HUMO enables a mechanism for quality control that can flexibly
enforce both precision and recall levels. We introduce the optimization problem
of HUMO, minimizing human cost given a quality requirement, and then present
three optimization approaches: a conservative baseline purely based on the
monotonicity assumption of precision, a more aggressive one based on sampling,
and a hybrid one that combines the strengths of both previous
approaches. Finally, we demonstrate by extensive experiments on real and
synthetic datasets that HUMO can achieve high-quality results with reasonable
return on investment (ROI) in terms of human cost, and it performs considerably
better than the state-of-the-art alternatives in quality control.
Comment: 12 pages, 11 figures. Camera-ready version of the paper submitted to
ICDE 2018, in Proceedings of the 34th IEEE International Conference on Data
Engineering (ICDE 2018).
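The machine/human division described above can be sketched by routing record pairs according to their match probability: confident matches and non-matches are decided by the machine, and only the uncertain band goes to humans. The thresholds and names below are illustrative assumptions; HUMO itself chooses the human band to meet explicit precision and recall targets at minimum human cost.

```python
def divide_workload(pairs, lo=0.2, hi=0.8):
    """Split (pair_id, match_probability) tuples into three buckets."""
    machine_match, human_review, machine_unmatch = [], [], []
    for pair_id, prob in pairs:
        if prob >= hi:
            machine_match.append(pair_id)      # confident match
        elif prob <= lo:
            machine_unmatch.append(pair_id)    # confident non-match
        else:
            human_review.append(pair_id)       # uncertain: route to a human
    return machine_match, human_review, machine_unmatch

pairs = [("p1", 0.95), ("p2", 0.55), ("p3", 0.10), ("p4", 0.80)]
print(divide_workload(pairs))
# -> (['p1', 'p4'], ['p2'], ['p3'])
```

Widening the `(lo, hi)` band raises quality at the price of more human labels, which is exactly the cost/quality trade-off the optimization problem formalizes.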
DiffFit: Unlocking Transferability of Large Diffusion Models via Simple Parameter-Efficient Fine-Tuning
Diffusion models have proven to be highly effective in generating
high-quality images. However, adapting large pre-trained diffusion models to
new domains remains an open challenge, which is critical for real-world
applications. This paper proposes DiffFit, a parameter-efficient strategy to
fine-tune large pre-trained diffusion models that enable fast adaptation to new
domains. DiffFit is embarrassingly simple: it only fine-tunes the bias terms
and newly added scaling factors in specific layers, yet it yields
significant training speed-ups and reduced model storage costs. Compared with
full fine-tuning, DiffFit achieves a 2$\times$ training speed-up and only needs
to store approximately 0.12\% of the total model parameters. Intuitive
theoretical analysis has been provided to justify the efficacy of scaling
factors on fast adaptation. On 8 downstream datasets, DiffFit achieves superior
or competitive performances compared to the full fine-tuning while being more
efficient. Remarkably, we show that DiffFit can adapt a pre-trained
low-resolution generative model to a high-resolution one by adding minimal
cost. Among diffusion-based methods, DiffFit sets a new state-of-the-art FID of
3.02 on the ImageNet $512\times512$ benchmark by fine-tuning for only 25 epochs from a
public pre-trained ImageNet $256\times256$ checkpoint, while being $30\times$
more training-efficient than the closest competitor.
Comment: Tech Report
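The parameter selection at the heart of DiffFit can be sketched by filtering parameter names: keep biases and the newly added per-layer scaling factors trainable, freeze everything else. The name `gamma` for the scaling factor and the parameter names below are assumptions for illustration; with a real model this would toggle `requires_grad` on the corresponding tensors.

```python
def select_trainable(param_names):
    """Return the names of parameters DiffFit-style tuning would update:
    bias terms plus newly added scaling factors (assumed to be named 'gamma')."""
    return [name for name in param_names
            if name.endswith(".bias") or ".gamma" in name]

params = [
    "blocks.0.attn.qkv.weight",
    "blocks.0.attn.qkv.bias",
    "blocks.0.gamma",           # newly added scaling factor
    "blocks.0.mlp.fc1.weight",
    "blocks.0.mlp.fc1.bias",
]
print(select_trainable(params))
# -> ['blocks.0.attn.qkv.bias', 'blocks.0.gamma', 'blocks.0.mlp.fc1.bias']
```

Since weight matrices dominate the parameter count, storing only these selected tensors per downstream task is what keeps the storage cost near 0.12% of the full model.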
Threshold Recognition Based on Non-stationarity of Extreme Rainfall in the Middle and Lower Reaches of the Yangtze River Basin
Analyzing hydrological sequences through their non-stationary characteristics helps us better understand how extreme rainfall responds to climate change. Taking the plain area in the middle and lower reaches of the Yangtze River basin (MLRYRB) as the study area, this study adopted a set of extreme rainfall indices and used the Bernaola-Galvan Segmentation Algorithm (BGSA) to test the non-stationarity of extreme rainfall events. The Generalized Pareto Distribution (GPD) was used to fit extreme rainfall and to select its optimal threshold. In addition, the cross-wavelet technique was used to explore the correlations of extreme rainfall with El Niño-Southern Oscillation (ENSO) and Western Pacific Subtropical High (WPSH) events. The results showed that: (1) extreme rainfall under different thresholds had different non-stationary characteristics; (2) the GPD fitted extreme rainfall in the MLRYRB well, and 40–60 mm was identified as the optimal threshold range by comparing the uncertainty of the return period; and (3) ENSO and WPSH had significant periodic effects on extreme rainfall in the MLRYRB. These findings highlight the significance of non-stationary assumptions in hydrological frequency analysis and are of great importance for hydrological forecasting and water conservancy project management.
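The peaks-over-threshold step behind the GPD fit can be sketched as follows: for each candidate threshold, collect the excesses and summarize them. In practice the GPD is fitted to these excesses and the threshold is chosen where the return-period uncertainty is acceptable; the 40–60 mm candidates mirror the range reported above, but the synthetic rainfall series here is purely illustrative.

```python
import numpy as np

def mean_excess(rainfall, thresholds):
    """For each candidate threshold u, return (number of exceedances,
    mean excess over u) -- the basic diagnostics of a peaks-over-threshold
    analysis before fitting a Generalized Pareto Distribution."""
    out = {}
    for u in thresholds:
        excess = rainfall[rainfall > u] - u
        out[u] = (len(excess), float(excess.mean()) if len(excess) else None)
    return out

rng = np.random.default_rng(2)
rainfall = rng.exponential(scale=25.0, size=2000)  # synthetic daily rainfall (mm)
print(mean_excess(rainfall, thresholds=[40.0, 50.0, 60.0]))
```

A roughly constant or linear mean-excess curve over a threshold range is the usual sign that a GPD fit is adequate there.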
- …