tomaugspurger.net
teddykoker.com
A few posts back I wrote about a common parameter optimization method known as Gradient Ascent. In this post we will see how a similar method can be used to create a model that can classify data. This time, instead of using gradient ascent to maximize a reward function, we will use gradient descent to minimize a cost function. Let's start by importing all the libraries we need:
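The excerpt cuts off right before the post's code. As a rough sketch of the idea it describes (not the author's actual code), the lines below run gradient descent on a logistic cross-entropy cost for a toy binary classifier; the NumPy setup, the synthetic data, and the learning rate are all assumptions of this sketch.

import numpy as np

# Toy two-feature dataset: class 0 clustered around -2, class 1 around +2 (assumed data).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Gradient descent on the logistic (cross-entropy) cost.
w, b, lr = np.zeros(X.shape[1]), 0.0, 0.1
for _ in range(1000):
    p = sigmoid(X @ w + b)            # predicted probability of class 1
    grad_w = X.T @ (p - y) / len(y)   # gradient of the cost w.r.t. the weights
    grad_b = np.mean(p - y)           # gradient w.r.t. the bias
    w -= lr * grad_w                  # step against the gradient to reduce the cost
    b -= lr * grad_b

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy: {accuracy:.2f}")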
matbesancon.xyz
Learning by doing: detecting fraud on bank notes using Python in 3 steps.
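This entry is only the post's teaser, but as a hedged illustration of what a three-step banknote-fraud workflow can look like (assuming the UCI banknote authentication data in a local file and scikit-learn; the file name, column names, and model choice are assumptions, not necessarily the post's approach):

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

# Step 1: load the banknote measurements (assumed local file and column layout).
cols = ["variance", "skewness", "kurtosis", "entropy", "label"]
df = pd.read_csv("data_banknote_authentication.txt", names=cols)

# Step 2: split the data and fit a classifier.
X_train, X_test, y_train, y_test = train_test_split(
    df[cols[:-1]], df["label"], test_size=0.3, random_state=0
)
clf = LogisticRegression().fit(X_train, y_train)

# Step 3: evaluate on the held-out notes.
print("held-out accuracy:", clf.score(X_test, y_test))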
www.ethanrosenthal.com
Having built machine learning products at two different companies with very different engineering cultures, I've made all of the following wrong assumptions: all other data orgs do things like my company, so we're doing fine; my org is way behind other orgs; my org is uniquely advanced, so we can rest on our laurels. In order to escape the small bubble of my existence, I posted a survey in May 2019 to a private Slack group and on Twitter.
blog.ml.cmu.edu
The latest news and publications regarding machine learning, artificial intelligence, and related fields, brought to you by the Machine Learning Blog, a spinoff of the Machine Learning Department at Carnegie Mellon University.