Month: July 2020


Activation Functions

By Bryon Moyer | July 31, 2020 | 0 Comments

[From the last episode: We looked at the notion of sparsity and how it helps with the math.] We saw before that there are three main elements in a CNN: the convolution, the pooling, and the activation. Today we focus on activation. I’ll start by saying that the […]

Read More
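The excerpt cuts off before the details, but as a minimal illustration of what an activation function does, here is ReLU, the most common choice in CNNs, sketched in NumPy. This is my own sketch, not code from the article:

```python
import numpy as np

def relu(x):
    # ReLU (rectified linear unit): pass positive values through
    # unchanged and clamp negative values to zero. This is the
    # nonlinearity applied after convolution/pooling stages.
    return np.maximum(x, 0)

# Negative inputs are zeroed; positive inputs survive unchanged.
relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0]))  # [0.0, 0.0, 0.0, 1.5, 3.0]
```

Other activations (sigmoid, tanh) follow the same pattern: a simple elementwise nonlinear function applied to each neuron's output.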

The Benefits of Sparsity

By Bryon Moyer | July 24, 2020 | 0 Comments

[From the last episode: We looked at the reason for doing all the multiply-accumulate math with machine learning.] Before we leave this matrix multiplication thing, there’s one other notion floating about a lot these days. Anyone listening to conversations about how to make machine-learning (ML) algorithms faster will eventually hear […]

Read More
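To make the benefit of sparsity concrete: if a weight is zero, multiplying by it contributes nothing to the accumulated sum, so the work can simply be skipped. A toy sketch of that idea (my illustration, not the article's code), using a plain dot product:

```python
def sparse_dot(weights, inputs):
    # Accumulate weight * input terms, skipping any term whose
    # weight is zero -- those multiplies contribute nothing to
    # the sum, so a sparsity-aware engine never performs them.
    total = 0.0
    for w, x in zip(weights, inputs):
        if w != 0.0:
            total += w * x
    return total

# Half the weights are zero, so half the multiplies are skipped:
sparse_dot([0.0, 2.0, 0.0, -1.0], [5.0, 3.0, 7.0, 4.0])  # 2*3 + (-1)*4 = 2.0
```

Real hardware exploits this with compressed weight formats rather than an `if` test, but the arithmetic saving is the same.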

Why All the Multiplication for Machine Learning?

By Bryon Moyer | July 17, 2020 | 0 Comments

[From the last episode: We looked at the reasons for all of the multiply-accumulate hardware in machine-learning engines.] OK, we’ve spent a couple of weeks deep in the math used for machine learning (ML). Now let’s back up a second to look at what this all means in the bigger […]

Read More
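The bigger picture the post alludes to is that a layer of neurons — each output a weighted sum of all inputs plus a bias — is exactly a matrix-vector multiplication. A hedged NumPy sketch of that correspondence (the weights and inputs here are made-up illustration values):

```python
import numpy as np

# Two neurons, each taking three inputs: each row of `weights`
# holds one neuron's weights, so the whole layer is one
# matrix-vector product plus a bias vector.
weights = np.array([[0.5, -1.0, 2.0],
                    [1.0,  0.0, 0.5]])
bias = np.array([0.1, -0.2])
x = np.array([1.0, 2.0, 3.0])

outputs = weights @ x + bias
# neuron 1: 0.5*1 - 1.0*2 + 2.0*3 + 0.1 = 4.6
# neuron 2: 1.0*1 + 0.0*2 + 0.5*3 - 0.2 = 2.3
```

This is why ML engines are dominated by multiplication hardware: almost everything a network does reduces to products like this.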

Where Do the MACs Come From?

By Bryon Moyer | July 10, 2020 | 0 Comments

[From the last episode: We looked at the convolution that defines the CNNs that are so popular for machine vision applications.] This week we’re going to do some more math, although, in this case, it won’t be as obscure and bizarre as convolution – and yet we will find some […]

Read More
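Where the MACs come from is easiest to see by writing matrix multiplication out longhand: every output element is built by repeatedly multiplying a pair of numbers and accumulating the result. A minimal sketch (mine, not the article's):

```python
def matmul(A, B):
    # Naive matrix multiply: the inner loop is nothing but
    # multiply-accumulate (MAC) operations, which is why
    # ML hardware is packed with MAC units.
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            acc = 0.0
            for k in range(m):
                acc += A[i][k] * B[k][j]   # one MAC
            C[i][j] = acc
    return C

matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])  # [[19.0, 22.0], [43.0, 50.0]]
```

Multiplying an n-by-m matrix by an m-by-p matrix takes n*m*p MACs, which is where the enormous operation counts of large networks come from.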

What the Heck Is Convolution?

By Bryon Moyer | July 3, 2020 | 0 Comments

[From the last episode: We looked at CNNs for vision as well as other neural networks for other applications.] We’re going to take a quick detour into math today. For those of you who have done advanced math, this may be a review, or it might even seem to be […]

Read More
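For readers who want the gist before the math detour: a 2D convolution slides a small kernel across an image and, at each position, sums the elementwise products over the window. A bare-bones sketch of that operation (my illustration; note that what CNNs call "convolution" is strictly cross-correlation, since the kernel is not flipped, but the arithmetic is the same multiply-and-sum):

```python
def convolve2d(image, kernel):
    # Slide the kernel over the image ("valid" mode, no padding);
    # each output pixel is the sum of elementwise products over
    # the window under the kernel.
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            s = 0.0
            for di in range(kh):
                for dj in range(kw):
                    s += image[i + di][j + dj] * kernel[di][dj]
            out[i][j] = s
    return out

# A simple [1, -1] kernel responds to horizontal intensity changes:
convolve2d([[1, 2, 4],
            [1, 2, 4]], [[1, -1]])  # [[-1.0, -2.0], [-1.0, -2.0]]
```

Different kernels pick out different local features (edges, corners, textures), which is what makes convolution so effective for machine vision.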