I published this about a year ago and maybe had 10 people read it? Part of the challenge of writing is marketing your ideas to a receptive audience. If you can program and do advanced math, there are plenty of jobs to go around.

NO BS Guide to Linear Algebra Book Review

I will preface this book review by briefly stating my background. I started studying data science in late March 2019 after a podcast inspired me to research the field. As of this writing I have spent 50+ hours reading and listening to material. Linear algebra is an essential subject you have to understand to succeed in data science. I ordered this book because Imperial College London's Mathematics for Machine Learning Specialization on Coursera utterly stumped me. When I was searching for linear algebra help I found Jason Brownlee's Machine Learning Mastery site, and I heeded Dr Brownlee's advice to order the book.

The author, Ivan Savov, has an academic background in electrical engineering, physics, and computer science, with a master's and a PhD. He has "more than 15 years of teaching experience as a private tutor" and prides himself on helping his "students overcome their fear of math and ace their exams" (minireference).

While I studied somewhat advanced math classes in high school and college, it has been a while, hence the book order. Machine Learning Mastery and Dr Savov are absolutely correct that this book cuts straight to the chase and explains concepts in a way anyone of average intelligence will be able to understand. The book doesn't assume you have a math background.

Machine Learning Mastery recommends reading these parts:

Concept Maps. Page v. A collection of mind-map type diagrams are provided directly after the table of contents that show how the concepts in the book, and, in fact, the concepts in the field of linear algebra, relate. If you are a visual thinker, these may help fit the pieces together.

Section 1.15, Vectors. Page 69. Provides a terse introduction to vectors, prior to any vector algebra. Useful background.

Chapter 2, Intro to Linear Algebra. Pages 101-130. Read this whole chapter. It covers:

- Definitions of terms in linear algebra.
- Vector operations such as arithmetic and vector norm.
- Matrix operations such as arithmetic and dot product.
- Linearity and what exactly this key concept means in linear algebra.
- Overview of how the different aspects of linear algebra (geometric, theory, etc.) relate.
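If you already know a little Python, here is a quick NumPy sketch of those Chapter 2 basics — vector arithmetic, the vector norm, and the dot product. This is my own example, not from the book:

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([1.0, 2.0])

# Vector arithmetic: add component by component.
print(u + v)              # -> [4. 6.]

# Euclidean (length) norm: sqrt(3^2 + 4^2).
print(np.linalg.norm(u))  # -> 5.0

# Dot product: 3*1 + 4*2.
print(np.dot(u, v))       # -> 11.0
```
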

Section 3.2 Matrix Equations. Page 147. Includes explanations and clear diagrams for calculating matrix operations, not least the must-know matrix multiplication.
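The row-by-column picture the book draws for matrix multiplication is easy to check in NumPy (again, my own illustration, not the book's):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Entry (i, j) of C is the dot product of row i of A with column j of B.
C = A @ B
print(C)  # [[19 22]
          #  [43 50]]
```
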

Section 6.1 Eigenvalues and eigenvectors. Page 262. Provides an introduction to the eigendecomposition that is used as a key operation in methods such as principal component analysis.
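To make that concrete, here is a minimal sketch (my example, not the book's) of the defining equation Av = λv for a small symmetric matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigenvalues[i] and eigenvectors[:, i] form an eigenpair of A.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify A v = lambda v for the first pair.
v = eigenvectors[:, 0]
assert np.allclose(A @ v, eigenvalues[0] * v)
print(sorted(eigenvalues))  # eigenvalues of this matrix are 1 and 3
```
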

Section 6.2 Special types of matrices. Page 275. Provides an introduction to various different types of matrices such as diagonal, symmetric, orthogonal, and more.
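Those special matrix types have quick numerical checks. A small sketch of my own, using a rotation matrix as the orthogonal example:

```python
import numpy as np

# Diagonal matrix: nonzero entries only on the main diagonal.
D = np.diag([1.0, 2.0, 3.0])

# Symmetric matrix: equal to its own transpose.
S = np.array([[1.0, 2.0],
              [2.0, 5.0]])
assert np.allclose(S, S.T)

# Orthogonal matrix: Q^T Q = I. A 2D rotation is a classic example.
theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(Q.T @ Q, np.eye(2))
```
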

Section 6.6 Matrix Decompositions. Page 295. An introduction to matrix factorization methods, re-covering the eigendecomposition, but also covering the LU, QR, and singular-value decompositions.
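NumPy has two of those factorizations built in, so you can sketch them in a few lines (my own example; the assertions just confirm each factorization reproduces the original matrix):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# QR decomposition: A = Q R, with orthonormal columns in Q and
# upper-triangular R.
Q, R = np.linalg.qr(A)
assert np.allclose(Q @ R, A)

# Singular-value decomposition: A = U diag(s) V^T.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, A)
```
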

Section 7.7 Least squares approximate solutions. Page 241. An introduction to the matrix formulation of least squares called linear least squares.
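The matrix formulation of least squares — find x minimizing ||Ax − b|| — is one line in NumPy. A small sketch of my own, fitting a line y = c0 + c1·t:

```python
import numpy as np

# Design matrix: a column of ones (intercept) and the t values.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
# These b values lie exactly on the line y = 1 + 2t.
b = np.array([1.0, 3.0, 5.0])

# Solve the linear least squares problem min ||A x - b||.
x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(x)  # -> approximately [1. 2.]
```
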

Appendix B, Notation. A summary of math and linear algebra notation.

One slight quibble I want to address is that Dr Brownlee's page numbers don't match up with my copy, but with slightly more effort you will be able to locate the information. That said, this book helped me better understand vectors, eigenvalues, and more.

In Conclusion

Dr Savov shines at explaining the linear algebra topics relevant to machine learning, and the book was worth every penny. I highly recommend you pick it up!

Work Cited

https://minireference.com/