Efficient Online Minimization for Low-Rank Subspace Clustering

28 Mar 2015 · Jie Shen, Ping Li, Huan Xu

Low-rank representation (LRR) has become a prominent method for segmenting data drawn from a union of subspaces. It is known, however, that solving the LRR program is challenging in both time complexity and memory footprint, because the nuclear-norm-regularized matrix is $n$-by-$n$ (where $n$ is the number of samples). In this paper, we develop a fast online implementation of LRR that reduces the memory cost from $O(n^2)$ to $O(pd)$, with $p$ being the ambient dimension and $d$ being an estimated rank ($d < p \ll n$). The key to this reduction is a non-convex reformulation of the LRR program that pursues the basis dictionary generating the (uncorrupted) observations. We establish a theoretical guarantee that the sequence of solutions produced by our algorithm converges asymptotically to a stationary point of both the empirical and the expected loss functions. Extensive experiments on synthetic and real-world datasets further substantiate that our algorithm is fast, robust, and memory efficient.
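The abstract's central idea, replacing the $n$-by-$n$ nuclear-norm program with a factored model whose dictionary lives in $\mathbb{R}^{p \times d}$ and updating that dictionary one sample at a time, can be illustrated with a short sketch. The code below is not the paper's algorithm; it is a minimal, hypothetical Python rendering of the generic online dictionary-pursuit recipe (keep $d \times d$ and $p \times d$ sufficient statistics, alternate a ridge coefficient step with a quadratic dictionary step). The function name `online_lrr`, the regularizers `lam1`/`lam2`, and the omission of an explicit corruption/error term are all assumptions for illustration.

```python
import numpy as np

def online_lrr(Z, d, lam1=1.0, lam2=1.0):
    """Hypothetical sketch of one online pass for low-rank subspace clustering.

    Z : (p, n) data matrix, columns are samples.
    d : estimated rank of the union of subspaces (d < p << n).
    Only the dictionary D and small sufficient statistics are kept,
    so memory is O(pd + d^2); no n-by-n matrix is ever formed.
    """
    p, n = Z.shape
    rng = np.random.default_rng(0)
    D = rng.standard_normal((p, d)) / np.sqrt(p)  # initial basis dictionary
    A = np.zeros((d, d))   # accumulates v v^T
    B = np.zeros((p, d))   # accumulates z v^T
    V = np.zeros((d, n))   # coefficients, retained only for clustering

    for t in range(n):
        z = Z[:, t]
        # Coefficient step: ridge regression of z onto the current D.
        # (A robust l1 error term for corruptions is omitted for brevity.)
        v = np.linalg.solve(D.T @ D + lam1 * np.eye(d), D.T @ z)
        V[:, t] = v
        # Update sufficient statistics.
        A += np.outer(v, v)
        B += np.outer(z, v)
        # Dictionary step: closed-form minimizer of the accumulated
        # quadratic surrogate sum_t ||z_t - D v_t||^2 + lam2 ||D||_F^2.
        D = B @ np.linalg.inv(A + lam2 * np.eye(d))
    return D, V
```

Under these assumptions, a segmentation could then be obtained by spectral clustering on an affinity such as $|V^\top V|$; the working memory of `D`, `A`, and `B` is $O(pd + d^2)$, consistent with the $O(pd)$ regime claimed in the abstract since $d < p$.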
