Explore >> Select a destination

You are here: analytixon.com

teddykoker.com (10.1 parsecs away)
Gradient-descent-based optimizers have long been the optimization algorithm of choice for deep learning models. Over the years, various modifications to basic mini-batch gradient descent have been proposed, such as adding momentum or Nesterov's Accelerated Gradient (Sutskever et al., 2013), as well as the popular Adam optimizer (Kingma & Ba, 2014). The paper Learning to Learn by Gradient Descent by Gradient Descent (Andrychowicz et al., 2016) demonstrates how the optimizer itself can be replaced...
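
The snippet name-checks momentum, Nesterov's Accelerated Gradient, and Adam without showing an update rule. As a quick reference, here is a minimal NumPy sketch of a single Adam step per Kingma & Ba (2014); the function name and hyperparameter defaults are illustrative, not taken from the linked post:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update (Kingma & Ba, 2014). t is the 1-based step count."""
    m = beta1 * m + (1 - beta1) * grad      # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad**2   # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1**t)              # bias correction: m, v start at zero
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```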

pyimagesearch.com (9.7 parsecs away)
In this tutorial, you will learn what gradient descent is, how it enables us to train neural networks, the main variations of gradient descent, including Stochastic Gradient Descent (SGD), and how SGD can be improved using momentum and Nesterov acceleration.
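
As a rough illustration of the two momentum variants the tutorial covers, here is a minimal sketch of one SGD step with classical momentum or a Nesterov lookahead; `grad_fn` and the default hyperparameters are assumptions for the example, not the tutorial's own code:

```python
import numpy as np

def sgd_momentum_step(theta, grad_fn, v, lr=0.01, mu=0.9, nesterov=False):
    """One SGD step; grad_fn(theta) returns the mini-batch gradient at theta."""
    if nesterov:
        # Nesterov: evaluate the gradient at the looked-ahead point theta + mu*v
        g = grad_fn(theta + mu * v)
    else:
        g = grad_fn(theta)
    v = mu * v - lr * g    # velocity accumulates a decaying sum of past gradients
    theta = theta + v
    return theta, v
```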

ssc.io (13.9 parsecs away)
When a machine learning (ML) model exhibits poor quality (e.g., poor accuracy or fairness), the problem can often be traced back to errors in the training data. Being able to discover the data examples that are the most likely culprits is a fundamental concern that has received a lot of attention recently. One prominent way to measure "data importance" with respect to model quality is the Shapley value. Unfortunately, existing methods only focus on the ML model in isolation, without considering the broader ML pipeline for data preparation and feature extraction, which appears in the majority of real-world ML code. This presents a major limitation to applying existing methods in practical settings. In this paper, we propose Canonpipe, a method for efficiently computing Shapley-based data importance over ML pipelines. We introduce several approximations that lead to dramatic improvements in computational speed. Finally, our experimental evaluation demonstrates that our methods discover data errors as effectively as existing Monte Carlo baselines, and in some cases even outperform them.
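
The Monte Carlo baselines the abstract compares against typically estimate each training example's Shapley value by averaging its marginal contribution over random permutations of the training set. Below is a minimal sketch of that baseline, not of Canonpipe itself; `utility` is a hypothetical caller-supplied function that trains and scores a model on a given subset:

```python
import random

def mc_shapley_importance(train_data, utility, n_permutations=100):
    """Monte Carlo permutation estimate of per-example Shapley values.

    utility(subset) trains/evaluates a model on `subset` and returns a
    quality score (e.g., validation accuracy).
    """
    n = len(train_data)
    phi = [0.0] * n
    for _ in range(n_permutations):
        perm = random.sample(range(n), n)   # a random permutation of example indices
        prev = utility([])                  # score with an empty training set
        subset = []
        for i in perm:
            subset.append(train_data[i])
            score = utility(subset)
            phi[i] += (score - prev) / n_permutations  # marginal contribution of example i
            prev = score
    return phi
```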

gist.github.com (52.6 parsecs away)
GitHub is where people build software. More than 150 million people use GitHub to discover, fork, and contribute to over 420 million projects.