This thoroughly revised second edition provides an updated treatment of numerical linear algebra techniques for solving problems in data mining and pattern recognition. Adopting an application-oriented approach, the author introduces matrix theory and decompositions, describes how modern matrix methods can be applied in real-life scenarios, and provides a set of tools that students can modify for a particular application.
Building on material from the first edition, the author discusses basic graph concepts and their matrix counterparts. He introduces the graph Laplacian and the properties of its eigenvectors needed for spectral partitioning, and describes how spectral graph partitioning is applied to social networks and text classification. Examples are included to help readers visualize the results. This new edition also presents matrix-based methods that underlie many of the algorithms used for big data.
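To give a flavor of the kind of computation involved, the following is a minimal NumPy sketch (not taken from the book) of two-way spectral partitioning with the unnormalized graph Laplacian; the six-vertex example graph is made up purely for illustration.

```python
# A minimal sketch of spectral partitioning with the graph Laplacian.
# The small example graph (two triangles joined by one edge) is hypothetical.
import numpy as np

# Adjacency matrix of a small undirected graph.
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))   # degree matrix
L = D - A                    # unnormalized graph Laplacian

# The eigenvector of the second-smallest eigenvalue of L (the Fiedler
# vector) yields a two-way partition of the vertices by its signs.
eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues in ascending order
fiedler = eigvecs[:, 1]
partition = (fiedler > 0).astype(int)
print("partition:", partition)   # separates {0,1,2} from {3,4,5}
```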
The book provides a solid foundation to further explore related topics and presents applications such as classification of handwritten digits, text mining, text summarization, PageRank computations related to the Google search engine, and facial recognition. Exercises and computer assignments are available on a Web page that supplements the book.
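As one example of these applications, the PageRank computation mentioned above amounts to a power iteration with a damped, column-stochastic link matrix. Below is a minimal NumPy sketch of that iteration (not the book's code); the four-page link matrix and the damping factor 0.85 are illustrative assumptions.

```python
# A minimal sketch of PageRank via the power method on a tiny, hypothetical web.
import numpy as np

# Column-stochastic link matrix: entry (i, j) is the probability of moving
# from page j to page i by following an outlink.
P = np.array([
    [0.0, 0.5, 0.5, 0.0],
    [1/3, 0.0, 0.0, 0.5],
    [1/3, 0.0, 0.0, 0.5],
    [1/3, 0.5, 0.5, 0.0],
])

n = P.shape[0]
alpha = 0.85            # damping factor (illustrative choice)
v = np.ones(n) / n      # uniform teleportation vector

# Power iteration on the Google matrix alpha*P + (1 - alpha)*v*e^T,
# applied without forming the matrix explicitly.
x = np.ones(n) / n
for _ in range(100):
    x_new = alpha * (P @ x) + (1 - alpha) * v
    if np.linalg.norm(x_new - x, 1) < 1e-10:
        break
    x = x_new

print("PageRank vector:", x_new)
```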
Matrix Methods in Data Mining and Pattern Recognition, Second Edition is intended primarily for undergraduate students who have previously taken an introductory course in scientific computing or numerical analysis, and for graduate students in data mining and pattern recognition who need an introduction to linear algebra techniques.