Explore >> Select a destination


You are here: www.markhw.com
yasha.solutions
8.1 parsecs away

A loss function, also known as a cost function or objective function, is a critical component in training machine learning models, particularly in neural networks and deep learning...
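As a minimal illustration of the concept this snippet describes (a sketch, not code from the linked site): mean squared error is one common loss function, and training amounts to adjusting model parameters to push this one number down.

```python
# Minimal sketch of a loss function: mean squared error (MSE).
# The model's quality on a batch is summarized by a single number.

def mse(y_true, y_pred):
    """Mean squared error over paired true values and predictions."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# A perfect model has zero loss; errors are penalized quadratically.
print(mse([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 0.0
print(mse([1.0, 2.0, 3.0], [1.5, 2.0, 2.5]))  # (0.25 + 0 + 0.25) / 3 ≈ 0.167
```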
algobeans.com
13.9 parsecs away

Outliers can be detected with predictive algorithms. To illustrate, we use the k-nearest neighbors (kNN) algorithm.
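A hedged sketch of the idea behind kNN-based outlier detection (the data, choice of k, and scoring rule here are illustrative assumptions, not taken from the linked article): a point whose k nearest neighbors are far away gets a high outlier score.

```python
# kNN outlier scoring on 1-D data, pure Python for brevity.
# Score = mean distance to the k nearest other points.

def knn_outlier_scores(points, k=2):
    scores = []
    for i, p in enumerate(points):
        dists = sorted(abs(p - q) for j, q in enumerate(points) if j != i)
        scores.append(sum(dists[:k]) / k)  # mean distance to k nearest neighbors
    return scores

data = [1.0, 1.1, 0.9, 1.2, 8.0]  # 8.0 sits far from the cluster
scores = knn_outlier_scores(data, k=2)
print(max(range(len(data)), key=lambda i: scores[i]))  # 4, the index of 8.0
```

In practice one would use a library implementation (e.g. nearest-neighbor search over multi-dimensional features) and flag points whose score exceeds a threshold.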
ssc.io
14.7 parsecs away

When a machine learning (ML) model exhibits poor quality (e.g., poor accuracy or fairness), the problem can often be traced back to errors in the training data. Being able to discover the data examples that are the most likely culprits is a fundamental concern that has received a lot of attention recently. One prominent way to measure 'data importance' with respect to model quality is the Shapley value. Unfortunately, existing methods focus only on the ML model in isolation, without considering the broader ML pipeline for data preparation and feature extraction that appears in the majority of real-world ML code. This is a major obstacle to applying existing methods in practical settings. In this paper, we propose Canonpipe, a method for efficiently computing Shapley-based data importance over ML pipelines. We introduce several approximations that lead to dramatic improvements in computational speed. Finally, our experimental evaluation demonstrates that our methods discover data errors as effectively as existing Monte Carlo baselines, and in some cases even outperform them.
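To make "Shapley-based data importance" concrete, here is a hedged sketch of the Monte Carlo permutation-sampling baseline the abstract mentions (not the Canonpipe approximations themselves). Each training example's Shapley value is its average marginal contribution to a model-quality "utility"; the toy utility and labels below are illustrative assumptions.

```python
import random

# Monte Carlo Shapley values over training examples.
# utility(subset) returns a model-quality score for a model
# trained on that subset; an example's Shapley value is its
# average marginal contribution across random orderings.

def shapley_values(examples, utility, n_permutations=200, seed=0):
    rng = random.Random(seed)
    n = len(examples)
    values = [0.0] * n
    for _ in range(n_permutations):
        order = rng.sample(range(n), n)  # random permutation of examples
        subset = []
        prev = utility([examples[j] for j in subset])
        for i in order:
            subset.append(i)
            cur = utility([examples[j] for j in subset])
            values[i] += cur - prev  # marginal contribution of example i
            prev = cur
    return [v / n_permutations for v in values]

# Toy utility: fraction of a validation set whose label matches the
# majority label of the training subset (a stand-in for accuracy).
val_labels = [1, 1, 1, 0]

def utility(subset_labels):
    if not subset_labels:
        return 0.0
    majority = 1 if sum(subset_labels) * 2 >= len(subset_labels) else 0
    return sum(1 for y in val_labels if y == majority) / len(val_labels)

train_labels = [1, 1, 0]  # the odd-one-out "0" hurts validation accuracy
vals = shapley_values(train_labels, utility)
print(min(range(3), key=lambda i: vals[i]))  # 2: the likely-erroneous example
```

Examples with the lowest Shapley values are the most likely data-error culprits; the abstract's contribution is computing such scores efficiently when a data-preparation pipeline sits between the raw data and the model.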
blog.google
82.2 parsecs away

Neural networks can train computers to learn in a way similar to humans. Googler Maithra Raghu explains how they work.