Explore >> Select a destination


You are here

chalkdustmagazine.com
| | corbettmaths.com
20.4 parsecs away

Travel
| | The Ultimate GCSE Higher Maths Revision Video and Booklet - Edexcel AQA OCR - Corbettmaths
| | aimatters.wordpress.com
16.3 parsecs away

Travel
| | Introduction This post demonstrates the calculations behind the evaluation of the Softmax Derivative using Python. It is based on the excellent article by Eli Bendersky which can be found here. The Softmax Function The softmax function simply takes a vector of N dimensions and returns a probability distribution also of N dimensions. Each element of...
| | randorithms.com
20.5 parsecs away

Travel
| | The Taylor series is a widely-used method to approximate a function, with many applications. Given a function \(y = f(x)\), we can express \(f(x)\) in terms ...
| | fa.bianp.net
105.3 parsecs away

Travel
| Backtracking step-size strategies (also known as adaptive step-size or approximate line-search) that set the step-size based on a sufficient decrease condition are the standard way to set the step-size on gradient descent and quasi-Newton methods. However, these techniques are typically not used for Frank-Wolfe-like algorithms. In this blog post I discuss a backtracking line-search for the Frank-Wolfe algorithm.