profmattstrassler.com
scottaaronson.blog
So I've written an article about the above question for PBS's website---a sort of tl;dr version of my 2005 survey paper NP-Complete Problems and Physical Reality, but updated with new material about the simulation of quantum field theories and about AdS/CFT. Go over there, read the article (it's free), then come back here to talk about it if...
Susskind's Blog (physicsforeveryone.blogspot.com)
From Leonard Susskind to Everyone: A number of years ago I became aware of the large number of physics enthusiasts out there who have no ven...
www.bretthall.org
This is a paper originally submitted to Swinburne University as part of a postgraduate project. An Anthropic Universe? Imagine a puddle waking up one morning and thinking, "This is an...
programmathically.com
In this post, we develop an understanding of why gradients can vanish or explode when training deep neural networks. Furthermore, we look at some strategies for avoiding exploding and vanishing gradients. The vanishing gradient problem describes a situation encountered in the training of neural networks where the gradients used to update the weights [...]
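The effect the excerpt describes can be seen in a few lines of NumPy. This is a minimal sketch, not code from the linked post: backpropagation through a deep chain of layers multiplies one Jacobian factor per layer (roughly weight times activation derivative), so the gradient magnitude scales exponentially with depth. The function name and the toy chain x -> sigmoid(w*x) are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad_norm_through_chain(w, depth, z=0.0):
    """Magnitude of d(output)/d(input) for a depth-layer chain x -> sigmoid(w*x).

    Each layer contributes one factor w * sigmoid'(z) to the chain rule,
    where sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z)) <= 0.25.
    Fixing z = 0 uses the largest possible factor, 0.25.
    """
    g = 1.0
    for _ in range(depth):
        g *= w * sigmoid(z) * (1.0 - sigmoid(z))
    return abs(g)

# With moderate weights the per-layer factor is at most 0.25,
# so the gradient shrinks like (0.25)^depth: it vanishes.
print(grad_norm_through_chain(w=1.0, depth=50))

# With large weights the per-layer factor exceeds 1
# (here 8 * 0.25 = 2), so the gradient grows like 2^depth: it explodes.
print(grad_norm_through_chain(w=8.0, depth=50))
```

This is also why the mitigation strategies the post mentions target the per-layer factor: careful weight initialization and activations like ReLU keep it near 1, while gradient clipping caps the exploding case directly.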