A Survival Guide for Female Employees in Male-Dominated Companies

*This post originally appeared on the blog of The Digital Dames under one of my pseudonyms.*

No. Way.

You just got an offer from that amahhhhhzing company with the $70M venture round and the [insert tech buzzword here].

Maybe the business is super-secretive, or maybe all their Glassdoor reviews rave about how fun it is to work there. Beer! And StarCraft!

You show up on your first day, eager to meet all the badass women in leadership.

All zero of them.

Continue reading “A Survival Guide for Female Employees in Male-Dominated Companies”

Machine Learning Intuition: Understanding Taylor Series Approximation

We have talked before about the intuition behind cost function optimization in machine learning. We took a look at where cost functions come from and what they look like. We also talked about how one might get to the bottom of one with gradient descent.

When you open a machine learning textbook, you’ll see much more math than we used in that introduction. But the math is, in fact, important; the math gives us tools that we can use to quickly find the minima of our cost functions.

We talked about derivatives in the last post, and we’re talking about Taylor series in this post. Both of these tools come from calculus, and both help us find the minima of our cost functions.
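
As a rough sketch of the kind of tool we mean (using placeholder notation of my own, with J for the cost function and w₀ for the current parameter value, rather than anything from the posts themselves):

```latex
% Second-order Taylor approximation of a cost function J around the point w_0
J(w) \approx J(w_0) + J'(w_0)\,(w - w_0) + \tfrac{1}{2}\, J''(w_0)\,(w - w_0)^2

% Minimizing the quadratic approximation (set its derivative to zero)
% suggests a Newton-style step toward the minimum:
w = w_0 - \frac{J'(w_0)}{J''(w_0)}
```

Gradient descent keeps only the first-order term; the second-order term is what lets Newton-style methods take better-informed steps.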

Continue reading “Machine Learning Intuition: Understanding Taylor Series Approximation”

Machine Learning Intuition: Using Derivatives to Minimize the Cost Function

We have talked before about the intuition behind cost function optimization in machine learning. We took a look at where cost functions come from and what they look like. We also talked about how one might get to the bottom of one with gradient descent.

When you open a machine learning textbook, you’ll see much more math than we used in that introduction. But the math is, in fact, important; the math gives us tools that we can use to quickly find the minima of our cost functions.

We’re going to talk about two of those tools: derivatives (in this post) and Taylor series (in the next post). Both of these tools come from calculus, and both help us find the minima of our cost functions.
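
As a quick sketch of where the derivative fits in (again with placeholder notation of my own: J for the cost, w for a single parameter, w* for a minimum, and α for the step size):

```latex
% At a minimum of a smooth cost function, the slope is flat:
\frac{dJ}{dw}\bigg|_{w = w^*} = 0

% Gradient descent walks downhill in steps proportional to the slope:
w_{t+1} = w_t - \alpha \, \frac{dJ}{dw}\bigg|_{w = w_t}
```

The first line says the slope is zero at a minimum; the second is the gradient descent update that walks downhill toward it.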

Continue reading “Machine Learning Intuition: Using Derivatives to Minimize the Cost Function”

One-Page Notes: Weapons of Math Destruction, by Cathy O’Neil

Folks ask me about the dangers of trusting computer-generated algorithms and artificial intelligence. The conversation usually brings up a future scenario in which the machines outsmart humans.

But there’s a more immediate problem: we trust machines to build algorithms from the incomplete or biased data we feed them, and those algorithms perpetuate poor, unfounded decisions under the guise of ‘scientificness’ because a computer made the decision.

Continue reading “One-Page Notes: Weapons of Math Destruction, by Cathy O’Neil”

Test-Driven iOS: Mocking Dependencies

Once upon a time we talked about how to initialize and launch view controllers manually. We did that so we could inject our dependencies via the initializer, then unit test our view controllers independently of those components.

Then we talked about how you can inject dependencies while loading your view controllers from the storyboard, instead of manually instantiating them.
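
To make that concrete, here is a minimal sketch of both flavors of injection. The names (ProfileViewController, ProfileService, the "Profile" storyboard identifier) are placeholders of mine, not the examples from those posts, and the storyboard route assumes iOS 13’s instantiateViewController(identifier:creator:):

```swift
import UIKit

// A hypothetical dependency the view controller needs.
protocol ProfileService {
    func loadProfileName() -> String
}

final class ProfileViewController: UIViewController {
    private let service: ProfileService

    // Manual instantiation: the dependency comes in through the initializer.
    init(service: ProfileService) {
        self.service = service
        super.init(nibName: nil, bundle: nil)
    }

    // Storyboard instantiation: the coder comes from the storyboard,
    // but the dependency is still supplied by the caller.
    init?(coder: NSCoder, service: ProfileService) {
        self.service = service
        super.init(coder: coder)
    }

    required init?(coder: NSCoder) {
        fatalError("Inject a ProfileService via init(service:) or init(coder:service:)")
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        title = service.loadProfileName()
    }
}

// Loading from the storyboard while still injecting the dependency (iOS 13+):
// let vc = UIStoryboard(name: "Main", bundle: nil)
//     .instantiateViewController(identifier: "Profile") { coder in
//         ProfileViewController(coder: coder, service: someRealService) // placeholder service
//     }
```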

We isolate the system under test from its dependencies by using faked versions of those dependencies.

Today, we’ll take a closer look at what one of those fakes looks like when you’re unit testing your iOS app.
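
And here is a minimal sketch of what such a fake can look like in a test. The names are again placeholders, and a small presenter stands in for the view controller just to keep the example self-contained:

```swift
import XCTest

// The hypothetical dependency from the injection example above.
protocol ProfileService {
    func loadProfileName() -> String
}

// A hand-rolled fake: it returns canned data and records that it was called,
// so the test never touches a real network or database.
final class FakeProfileService: ProfileService {
    var stubbedName = "Ada"
    private(set) var loadCallCount = 0

    func loadProfileName() -> String {
        loadCallCount += 1
        return stubbedName
    }
}

// A tiny consumer standing in for the view controller under test.
final class ProfilePresenter {
    private let service: ProfileService

    init(service: ProfileService) {
        self.service = service
    }

    func title() -> String {
        "Hello, \(service.loadProfileName())"
    }
}

final class ProfilePresenterTests: XCTestCase {
    func testTitleUsesTheInjectedFake() {
        let fake = FakeProfileService()
        fake.stubbedName = "Grace"

        let presenter = ProfilePresenter(service: fake)

        XCTAssertEqual(presenter.title(), "Hello, Grace")
        XCTAssertEqual(fake.loadCallCount, 1)
    }
}
```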

Continue reading “Test-Driven iOS: Mocking Dependencies”

One-Page Notes: On Intelligence, by Hawkins and Blakeslee

I was researching Numenta, a company with a mission to reverse-engineer the human neocortex in software. One of the engineers gave a talk that introduced Numenta’s software at OSCON 2013. He offered a couple of resources for audience members who wanted to learn more about the science: a white paper on hierarchical temporal memory and a book called On Intelligence: How a New Understanding of the Brain Will Lead to the Creation of Truly Intelligent Machines. In the book, Jeff Hawkins and Sandra Blakeslee summarize a hypothesis on the structure and function of the neocortex. As I read through the book, I decided to take notes along the way and synthesize what I was learning in pictures. The result fit onto one page:

Continue reading “One-Page Notes: On Intelligence, by Hawkins and Blakeslee”
