Machine Learning Intuition: Understanding Taylor Series Approximation

We have talked before about the intuition behind cost function optimization in machine learning. We took a look at where cost functions come from and what they look like. We also talked about how one might get to the bottom of one with gradient descent.

When you open a machine learning textbook, you’ll see much more math than we used in that introduction. But the math is, in fact, important: it gives us tools that we can use to quickly find the minima of our cost functions.

We talked about derivatives in the last post, and we’re talking about Taylor series in this post. Both of these tools come from calculus and help us identify where to find the minima on our cost functions.
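As a rough sketch of the idea (not code from the post itself), here is how a second-order Taylor series approximates a function near a point: you only need the function's value, slope, and curvature at that point, which is exactly the local information an optimizer has in hand. The example function and its derivatives are my own illustration.

```python
def taylor_quadratic(f, df, d2f, a):
    """Second-order Taylor approximation of f around the point a."""
    def approx(x):
        return f(a) + df(a) * (x - a) + 0.5 * d2f(a) * (x - a) ** 2
    return approx

# Example: approximate f(x) = x^4 near a = 1.
f = lambda x: x ** 4
df = lambda x: 4 * x ** 3      # first derivative
d2f = lambda x: 12 * x ** 2    # second derivative

q = taylor_quadratic(f, df, d2f, a=1.0)
print(q(1.1), f(1.1))  # the approximation is close near a; the gap grows farther away
```

The payoff for optimization: near a minimum, this quadratic stand-in is easy to minimize exactly, which is the intuition behind Newton-style methods.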

Continue reading “Machine Learning Intuition: Understanding Taylor Series Approximation”

Machine Learning Intuition: Using Derivatives to Minimize the Cost Function

We have talked before about the intuition behind cost function optimization in machine learning. We took a look at where cost functions come from and what they look like. We also talked about how one might get to the bottom of one with gradient descent.

When you open a machine learning textbook, you’ll see much more math than we used in that introduction. But the math is, in fact, important: it gives us tools that we can use to quickly find the minima of our cost functions.

We’re going to talk about two of those tools: derivatives (in this post) and Taylor series (in the next post). Both of these tools come from calculus and help us identify where to find the minima on our cost functions.
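To make the derivative connection concrete (this is my own minimal sketch, not code from the post), gradient descent in one dimension just steps opposite the derivative until the slope flattens out. The toy cost function below is an assumption for illustration.

```python
def gradient_descent(df, x0, lr=0.1, steps=100):
    """Repeatedly step downhill: the derivative points uphill, so we subtract it."""
    x = x0
    for _ in range(steps):
        x -= lr * df(x)
    return x

# Toy cost J(w) = (w - 3)^2 has its minimum at w = 3; its derivative is 2(w - 3).
minimum = gradient_descent(lambda w: 2 * (w - 3), x0=0.0)
print(minimum)  # converges close to 3
```

Notice that the stopping condition is implicit: as the derivative shrinks toward zero near the minimum, the steps shrink too.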

Continue reading “Machine Learning Intuition: Using Derivatives to Minimize the Cost Function”

One-Page Notes: On Intelligence, by Hawkins and Blakeslee

I was researching Numenta, a company with a mission to reverse-engineer the human neocortex in software. One of the engineers gave a talk that introduced Numenta’s software at OSCON 2013. He offered a couple of resources for audience members who wanted to learn more about the science: a white paper on hierarchical temporal memory and a book called On Intelligence: How a New Understanding of the Brain Will Lead to the Creation of Truly Intelligent Machines. In the book, Jeff Hawkins and Sandra Blakeslee summarize a hypothesis on the structure and function of the neocortex. As I read through the book, I decided to take notes along the way and synthesize what I was learning in pictures. The result fit onto one page:

Continue reading “One-Page Notes: On Intelligence, by Hawkins and Blakeslee”

Machine Learning, Part 2: Classification

I’m working my way through a Coursera specialization on Machine Learning. The specialization includes several courses, the first of which provides a high-level overview of ML. I finished that one before I began talking about the coursework on my blog because I didn’t want to identify myself as a student of machine learning until I had actually gone through with something.

Going forward, I’ll share a post on each of the in-depth classes in the specialization. The first in-depth class was called Regression, and you can find my post about that course right here. The second class is called Classification, and this post will cover key concepts from the course as well as my observations and any supplementary readings that help me understand the material.

**Publishing unfinished. Will update as course continues.**

Continue reading “Machine Learning, Part 2: Classification”

Machine Learning, Part 1: Regression

I’m working my way through a Coursera specialization on Machine Learning. The specialization includes several courses, the first of which provides a high-level overview of ML. I finished that one before I began talking about the coursework on my blog because I didn’t want to identify myself as a student of machine learning until I had actually gone through with something.

Going forward, I’ll share a post on each of the in-depth classes in the specialization. The first in-depth class is called Regression, and it includes six modules. Below I will share a little information about each module, my thoughts on some topics, and links to supplementary reading that I used to deepen my understanding of the concepts in the course.

Continue reading “Machine Learning, Part 1: Regression”
