
## Newton Method for the ICA Mixture Model

Jason A. Palmer¹, Scott Makeig¹, Ken Kreutz-Delgado², Bhaskar D. Rao²

¹ Swartz Center for Computational Neuroscience, ² Dept. of Electrical and Computer Engineering, University of California San Diego, La Jolla, CA

### Introduction

We want to model sensor array data with multiple independent sources (ICA) and non-stationary source activity (mixture model), and we want the adaptation to be computationally efficient (a Newton method for the ICA mixture model).

### Outline

- Basic Newton method
- Positive definiteness of the Hessian when the model source densities are the true source densities
- Newton method for the ICA mixture model
- Example applications to analysis of EEG


## ICA Mixture Model: Toy Example

Three models in two dimensions, 500 points per model. The Newton method converges in under 200 iterations; natural gradient fails to converge, and has difficulty on poorly conditioned models.

## ICA Mixture Model

We want to model observations x(t), t = 1, …, N, with different models active at different times. This is a Bayesian linear mixture model over models h = 1, …, M: the data are conditionally linear given the model, and samples are modeled as independent in time.

### Source Density Mixture Model

Each source density mixture component has an unknown location, scale, and shape. This generalizes the Gaussian mixture model to densities that are more peaked, with heavier tails.

### Invariances

The complete set of parameters to be estimated ranges over h = 1, …, M, i = 1, …, n, j = 1, …, m. The model has two invariances: the W row norms trade off against the source density scales, and the model centers trade off against the source density locations.

## Basic ICA Newton Method

The idea is to transform the gradient (first derivative) of the cost function using the inverse Hessian (second derivative). The cost function is the data log likelihood; from it we obtain the gradient and the natural gradient (a positive definite transform of the gradient).

## Newton Method Hessian

Taking the derivative of the (i,j)th element of the gradient with respect to the (k,l)th element of W defines a linear transform on matrices, which can be written in matrix form. To invert it, we rewrite the Hessian transformation in terms of the source estimates and solve the resulting linear equation.

Using source independence and the zero-mean property of the sources, the Hessian transformation simplifies to a 2×2 block diagonal form. Inverting this transformation and evaluating it at the gradient yields a small set of equations for the Newton direction.

## Positive Definiteness of the Hessian

Three conditions guarantee positive definiteness, and all of them hold when the model source densities match the true source densities.

## Newton for the ICA Mixture Model

A similar derivation applies to the ICA mixture model.

## Convergence Rates

Convergence is much faster than natural gradient, and the method works with step size 1. It does, however, require a correct source density model.

[Figures: log likelihood vs. iteration for the Newton method and natural gradient.]

## Applications to EEG

Segmentation of EEG experiment trials, with 3-model and 4-model fits.

[Figures: log likelihood vs. iteration; model activity across trials and time.]

### Epilepsy

[Figures: log likelihood over time for 1-model and 5-model fits; log likelihood difference from the single model.]

## Conclusion

- We applied the method of Amari, Cardoso, and Laheld to formulate a Newton method for the ICA mixture model.
- Arbitrary source densities are modeled with a non-Gaussian source mixture model.
- Non-stationarity is modeled with the ICA mixture model (multiple mixing matrices learned).
- It works! The Newton method is substantially faster (superlinear), and it can converge when natural gradient fails.

## Code

Matlab code is available! It generates toy mixture model data for testing and implements the full method: mixture sources, mixture ICA, and the Newton updates. An extended version of the paper is in preparation, with the derivation of the mixture model Newton updates. Download from: http://sccn.ucsd.edu/~jason

## Acknowledgements

Thanks to Scott Makeig, Howard Poizner, Julie Onton, Ruey-Song Hwang, Rey Ramirez, Diane Whitmer, and Allen Gruber for collecting and consulting on EEG data. Thanks to Jerry Swartz for founding and providing ongoing support to the Swartz Center for Computational Neuroscience.

Thanks for your attention!
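As a supplement to the Matlab pointer above, the toy example (three 2-D models, 500 points per model) is easy to reproduce. A minimal NumPy sketch, assuming illustrative Laplacian sources and random mixing matrices and centers (none of these values come from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_ica_mixture(n_models=3, n_dim=2, n_per_model=500):
    """Draw points from several linear ICA models, one model at a time.

    Each model h has its own mixing matrix A_h and center c_h; Laplacian
    sources give the peaked, heavy-tailed densities the talk describes.
    All parameter values here are hypothetical.
    """
    X, labels = [], []
    for h in range(n_models):
        A = rng.normal(size=(n_dim, n_dim))         # model mixing matrix
        c = rng.normal(scale=3.0, size=n_dim)       # model center
        S = rng.laplace(size=(n_dim, n_per_model))  # super-Gaussian sources
        X.append(A @ S + c[:, None])
        labels += [h] * n_per_model
    return np.hstack(X), np.array(labels)

X, labels = toy_ica_mixture()
print(X.shape)  # (2, 1500)
```

Concatenating the per-model blocks mimics the non-stationarity the mixture model is meant to capture: different mixing matrices are active over different stretches of the data.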
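For reference, the natural gradient update that the talk compares against has a compact form. A sketch, assuming a generic tanh score function in place of the talk's source density mixture model:

```python
import numpy as np

def natural_gradient_step(W, X, mu=0.1):
    """One natural gradient ICA step: W <- W + mu * (I - E[f(y) y^T]) W.

    f(y) = tanh(y) is a stand-in score function for super-Gaussian
    sources (an assumption; the talk instead adapts a mixture density
    per source).
    """
    Y = W @ X                                        # source estimates
    G = np.eye(W.shape[0]) - (np.tanh(Y) @ Y.T) / X.shape[1]
    return W + mu * G @ W

def loglik(W, X):
    """Data log likelihood under the 1/(pi cosh y) density matching tanh."""
    Y = W @ X
    _, logdet = np.linalg.slogdet(W)
    return logdet - np.mean(np.sum(np.log(np.cosh(Y)) + np.log(np.pi), axis=0))
```

Small steps increase the log likelihood, but convergence is only linear, which is what motivates the Newton direction.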
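The 2×2 block diagonal Newton solve can be sketched concretely. This reconstruction follows the standard relative-gradient second-order expansion (as in Amari, Cardoso, and Laheld), again with a tanh score standing in for the source mixture densities, so the coefficients here are an assumption rather than the paper's exact updates:

```python
import numpy as np

def newton_direction(W, X):
    """Newton direction B for the relative update W <- (I + B) W.

    Source independence and zero mean decouple the Hessian into 2x2
    blocks pairing (i, j) and (j, i); for i != j the Newton equations are
        h_ij B_ij + B_ji = G_ij,   h_ji B_ji + B_ij = G_ji,
    with h_ij = E[f'(y_i)] E[y_j^2] and relative gradient
    G = I - E[f(y) y^T].  Here f(y) = tanh(y) (an assumption).
    """
    Y = W @ X
    n, N = Y.shape
    F = np.tanh(Y)
    G = np.eye(n) - (F @ Y.T) / N                 # relative gradient
    fprime = (1.0 - F**2).mean(axis=1)            # E[f'(y_i)]
    sigma2 = (Y**2).mean(axis=1)                  # E[y_j^2]
    h = np.outer(fprime, sigma2)                  # h[i, j]
    B = np.zeros((n, n))
    for i in range(n):
        # diagonal term: (E[f'(y_i) y_i^2] + 1) B_ii = G_ii
        B[i, i] = G[i, i] / (((1.0 - F[i] ** 2) * Y[i] ** 2).mean() + 1.0)
        for j in range(i + 1, n):
            det = h[i, j] * h[j, i] - 1.0         # 2x2 block determinant
            B[i, j] = (h[j, i] * G[i, j] - G[j, i]) / det
            B[j, i] = (h[i, j] * G[j, i] - G[i, j]) / det
    return B
```

When the Hessian is positive definite, the update `W = (np.eye(n) + newton_direction(W, X)) @ W` can be taken with step size 1, consistent with the convergence behavior reported in the talk.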
