Image classification by sparse coding

Feature learning problem. Given a 14×14 image patch x, we can represent it using 196 real numbers (its raw pixel values). Problem: can we learn a better representation for it?

Unsupervised feature learning: given a set of images, learn a better way to represent images than raw pixels.


First stage of visual processing in the brain: V1. Schematic of a simple cell vs. an actual simple cell; green: responds to a white dot, red: responds to a black dot. [Images from DeAngelis, Ohzawa & Freeman, 1995] The receptive fields look like "Gabor functions," which are also used in image compression and denoising. In short, the first stage of visual processing in the brain (V1) does "edge detection."

Learning an image representation: sparse coding (Olshausen & Field, 1996). Input: images x(1), x(2), …, x(m) (each in R^{n×n}). Learn: a dictionary of bases f1, f2, …, fk (also in R^{n×n}), so that each input x can be approximately decomposed as x ≈ Σj aj fj, s.t. the aj's are mostly zero ("sparse"). Use this to represent a 14×14 image patch succinctly, e.g. as [a7 = 0.8, a36 = 0.3, a41 = 0.5]; i.e., the code indicates which "basic edges" make up the image. [NIPS 2006, 2007]

Sparse coding illustration. From natural images, the learned bases (f1, …, f64) are "edges." Test example: x ≈ 0.8 f36 + 0.3 f42 + 0.5 f63, giving the feature representation [0, 0, …, 0, 0.8, 0, …, 0, 0.3, 0, …, 0, 0.5, …] = [a1, …, a64]. Compact and easily interpretable.
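To make the decomposition concrete, here is a minimal NumPy sketch (the dictionary and the active indices are hypothetical, mirroring the x ≈ 0.8 f36 + 0.3 f42 + 0.5 f63 example) of representing a 14×14 patch as a sparse combination of bases:

```python
import numpy as np

# Hypothetical dictionary: k = 64 bases, each a 14x14 patch flattened
# to a 196-dimensional column of F.
rng = np.random.default_rng(0)
F = rng.standard_normal((196, 64))
F /= np.linalg.norm(F, axis=0)           # unit-norm bases (a common convention)

# A sparse code: only 3 of the 64 coefficients are nonzero,
# mirroring x ~ 0.8*f36 + 0.3*f42 + 0.5*f63 (0-indexed below).
a = np.zeros(64)
a[35], a[41], a[62] = 0.8, 0.3, 0.5

# Reconstruction: the patch is the weighted sum of the active bases.
x_hat = (F @ a).reshape(14, 14)
print(np.count_nonzero(a), "active bases out of", a.size)
```

The 64 numbers in a (mostly zeros) replace the 196 raw pixel values as the feature representation.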

More examples. Represent as: [0, 0, …, 0, 0.6, 0, …, 0, 0.8, 0, …, 0, 0.4, …]. Represent as: [0, 0, …, 0, 1.3, 0, …, 0, 0.9, 0, …, 0, 0.3, …]. The method hypothesizes that edge-like patches are the most "basic" elements of a scene, and represents an image in terms of the edges that appear in it. Use this to obtain a more compact, higher-level representation of the scene than pixels.

Digression: sparse coding applied to audio. [Evan Smith & Mike Lewicki, 2006]

Sparse coding details. Input: images x(1), x(2), …, x(m) (each in R^{n×n}). Minimize over the bases fj and the activations a(i):

min_{f,a} Σi ‖ x(i) − Σj aj(i) fj ‖² + λ Σi Σj |aj(i)|

where the second term is the L1 sparsity penalty (it causes most of the aj's to be 0). Alternating minimization: alternately minimize with respect to the fj's (easy) and the a's (harder).

Solving for the bases. Early versions of sparse coding could learn only about 32 bases. How do we scale this algorithm up?
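A minimal sketch of the alternating minimization under the L1 objective above, assuming unit-norm bases; the harder a-step is delegated to scikit-learn's Lasso (one of several possible L1 solvers, not the lecture's own feature-sign search), and the easy f-step is ordinary least squares followed by renormalization:

```python
import numpy as np
from sklearn.linear_model import Lasso

def sparse_coding(X, k=64, lam=0.1, iters=20, seed=0):
    """X: (m, n) data, one example per row. Returns (F, A) with X ~= A @ F."""
    rng = np.random.default_rng(seed)
    F = rng.standard_normal((k, X.shape[1]))
    F /= np.linalg.norm(F, axis=1, keepdims=True)    # unit-norm bases
    lasso = Lasso(alpha=lam, fit_intercept=False, max_iter=5000)
    for _ in range(iters):
        # a-step (harder): L1-regularized regression per example, F held fixed.
        lasso.fit(F.T, X.T)              # solves all m coding problems at once
        A = lasso.coef_                  # activations, shape (m, k)
        # f-step (easy): least squares in F, activations held fixed.
        F = np.linalg.lstsq(A, X, rcond=None)[0]
        F /= np.linalg.norm(F, axis=1, keepdims=True) + 1e-12
    return F, A
```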

Goal: minimize the objective with respect to the ai's. Simplified example: suppose I tell you the sign (+, −, or 0) of each ai. Then the L1 term becomes linear, and the problem simplifies to a quadratic function of the ai's, which can be solved efficiently in closed form. Algorithm: repeatedly guess the sign (+, −, or 0) of each of the ai's; solve for the ai's in closed form; refine the guess for the signs. (A sketch of the closed-form step appears after the visualization below.)

Feature-sign search (solving for the ai's). The feature-sign search algorithm: visualization. Current guess: starting from zero (the default). Step 1: activate a2 with "+" sign; active set = {a2}.

The feature-sign search algorithm: visualization (continued). Step 2: update a2 (closed form). Step 3: activate a1 with "+" sign; active set = {a1, a2}.

The feature-sign search algorithm: visualization (continued). Step 4: update a1 and a2 (closed form).

[Figure: the 32 bases learned before feature-sign search vs. the bases learned with feature-sign search.]
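As a sketch of the closed-form step referenced above (assuming the objective ‖x − F a‖² + λ‖a‖₁; F_active and theta are hypothetical names): once the signs θ of the active coefficients are fixed, the L1 term equals λ θᵀa, and setting the gradient to zero gives an analytic solution:

```python
import numpy as np

def closed_form_step(x, F_active, theta, lam):
    """Minimize ||x - F_active @ a||^2 + lam * (theta @ a) over the active set,
    where theta holds the guessed sign (+1 or -1) of each active coefficient."""
    G = F_active.T @ F_active                  # Gram matrix of the active bases
    rhs = F_active.T @ x - 0.5 * lam * theta   # from 2*G @ a = 2*F'x - lam*theta
    return np.linalg.solve(G, rhs)
```

Feature-sign search then checks whether this solution is consistent with the guessed signs and refines the active set and signs accordingly.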

Recap of sparse coding for feature learning. (Relate this to the histograms view, and hence to sparse coding on top of SIFT features.) Training time: input images x(1), x(2), …, x(m) (each in R^{n×n}); learn a dictionary of bases f1, f2, …, fk (also in R^{n×n}).

Sparse coding recap: x ≈ 0.8 f36 + 0.3 f42 + 0.5 f63, i.e. [0, 0, …, 0, 0.8, 0, …, 0, 0.3, 0, …, 0, 0.5, …]. Much better than the pixel representation, but still not competitive with SIFT, etc. Three ways to make it competitive: (1) combine it with SIFT; (2) advanced versions of sparse coding (e.g., local coordinate coding, LCC); (3) deep learning.

Combining sparse coding with SIFT. Training time: instead of raw patches, the inputs are SIFT descriptors x(1), x(2), …, x(m) (each in R^128), and the learned bases f1, f2, …, fk also live in R^128. Test time: given a novel SIFT descriptor x (in R^128), represent it as its sparse code a.
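At test time, encoding a descriptor is just an L1-regularized regression against the fixed dictionary. A minimal sketch, assuming a learned dictionary F of shape (k, 128) with unit-norm rows and using scikit-learn's SparseCoder as the encoder (one possible choice; the lecture's own solver is feature-sign search):

```python
import numpy as np
from sklearn.decomposition import SparseCoder

k = 512                                        # hypothetical dictionary size
rng = np.random.default_rng(0)
F = rng.standard_normal((k, 128))              # stand-in for a learned dictionary
F /= np.linalg.norm(F, axis=1, keepdims=True)  # SparseCoder expects unit-norm atoms

coder = SparseCoder(dictionary=F, transform_algorithm="lasso_lars",
                    transform_alpha=0.1)       # alpha plays the role of lambda

x = rng.standard_normal((1, 128))              # a novel SIFT descriptor
a = coder.transform(x)                         # its sparse code, shape (1, k)
print(np.count_nonzero(a), "nonzero coefficients out of", k)
```

The sparse codes a (often pooled over image regions) then replace the raw descriptors as the feature representation.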


Putting it together. (Again, this relates to the histograms view and to sparse coding on top of SIFT features.) Feature representation and learning algorithm: each input x(1), x(2), x(3), … is mapped to its sparse code a(1), a(2), a(3), …, which is then fed to the learning algorithm. Suppose you've already learned bases f1, f2, …, fk; this is how you represent an image. E.g., 73-75% accuracy on Caltech 101 (Yang et al., 2009; Boureau et al., 2009).

K-means vs. sparse coding. K-means: given centroids 1, 2, and 3, represent each input by its single nearest centroid. Sparse coding: given bases f1, f2, f3, represent each input as a sparse combination of several bases. Intuition: sparse coding is a "soft" version of k-means (membership in multiple clusters).
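A small sketch of the contrast on hypothetical 2-D data (k-means via scikit-learn; sparse coding via the same SparseCoder idea as above, reusing the centroids as the dictionary): k-means yields a one-hot code, sparse coding a few graded coefficients:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import SparseCoder

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))

# K-means: each point belongs to exactly one centroid (a one-hot code).
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
onehot = np.eye(3)[km.labels_]                 # hard membership, shape (200, 3)

# Sparse coding with the (normalized) centroids as the dictionary:
# graded membership in several "clusters" at once -- the "soft" k-means.
D = km.cluster_centers_
D = D / np.linalg.norm(D, axis=1, keepdims=True)
codes = SparseCoder(dictionary=D, transform_algorithm="lasso_lars",
                    transform_alpha=0.05).transform(X)
print("k-means nonzeros per point:", onehot.sum(1).mean())       # always 1.0
print("sparse-code nonzeros per point:", (codes != 0).sum(1).mean())
```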

K-means vs. sparse coding. Rule of thumb: whenever you use k-means to build a dictionary, replacing it with sparse coding will often work better.
