www.machinedlearnings.com
www.paepper.com
Recent advances in training deep neural networks have led to a whole range of impressive machine learning models that can tackle very diverse tasks. When you are developing such a model, one notable downside is that it is a "black-box" approach: the model learns from the data you feed it, but you don't really know what is going on inside it.
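As an illustration of what "looking inside" such a black box can mean (a sketch, not code from the post itself), here is a minimal example using PyTorch forward hooks to capture a hidden layer's activations; the toy two-layer model and the `save_activation` helper are hypothetical stand-ins:

```python
import torch
import torch.nn as nn

# Hypothetical toy model; the hook pattern works for any nn.Module.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

activations = {}

def save_activation(name):
    # Forward hooks receive (module, inputs, output) on every forward pass.
    def hook(module, inputs, output):
        activations[name] = output.detach()
    return hook

# Capture what the hidden ReLU layer computes for a given input.
model[1].register_forward_hook(save_activation("relu"))

x = torch.randn(1, 4)
model(x)
print(activations["relu"].shape)  # torch.Size([1, 8])
```

Inspecting intermediate activations like this is one of the simplest ways to get a first glimpse of what a trained network is doing internally.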
windowsontheory.org
Previous post: ML theory with bad drawings. Next post: What do neural networks learn and when do they learn it; see also all seminar posts and the course webpage. Lecture video (starts at slide 2 since I hit the record button 30 seconds too late - sorry!) - slides (pdf) - slides (PowerPoint with ink and animation)...
yasha.solutions
A loss function, also known as a cost function or objective function, is a critical component in training machine learning models, particularly in neural networks and deep learning...
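As a concrete illustration of the idea (a sketch, not taken from the post itself), here is mean squared error, one of the most common loss functions, in plain NumPy:

```python
import numpy as np

def mse_loss(y_true, y_pred):
    # Mean squared error: average squared gap between targets and predictions.
    return np.mean((y_true - y_pred) ** 2)

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.9, 3.2])
print(mse_loss(y_true, y_pred))  # ~0.02, small because predictions are close
```

Training then amounts to adjusting the model's parameters to drive this number down.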
ayvlasov.wordpress.com
Recent debates on the possibility of quantum computers provoked a specific prize. Among other things, Scott Aaronson wrote: [...] whether scalable quantum computing is possible is a question about the laws of physics. It's perfectly conceivable that future developments in physics would conflict with scalable quantum computing, in the same way that relativity conflicts with faster-than-light communication,...