wandb.ai

distill.pub
How to tune hyperparameters for your machine learning model using Bayesian optimization.

blog.fastforwardlabs.com
From random forests to neural networks, many modern machine learning algorithms involve a number of parameters that must be fixed before training the algorithm. These parameters, in contrast to the ones learned by the algorithm during training, are called hyperparameters. The performance of a model on a task given data depends on the specific values of these hyperparameters. Hyperparameter tuning is the process of determining the hyperparameter values that maximize model performance on a task given data.
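The tuning process described here can be sketched as a simple exhaustive grid search; the score function and parameter names below are hypothetical stand-ins for training a model and evaluating it on held-out data:

```python
from itertools import product

# Hypothetical validation score for a (learning_rate, depth) setting;
# in practice this would train the model and evaluate it on a validation set.
def validation_score(learning_rate, depth):
    return -(learning_rate - 0.1) ** 2 - 0.05 * (depth - 4) ** 2

learning_rates = [0.01, 0.1, 1.0]
depths = [2, 4, 8]

# Grid search: evaluate every combination of hyperparameter values,
# then keep the one with the best validation score.
best_score, best_params = max(
    (validation_score(lr, d), (lr, d))
    for lr, d in product(learning_rates, depths)
)
print(best_params)  # (0.1, 4)
```

Grid search scales poorly with the number of hyperparameters, which is why methods such as the Bayesian optimization mentioned above are often preferred in practice.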

michael-lewis.com
This is a short summary of some of the terminology used in machine learning, with an emphasis on neural networks. I've put it together primarily to help my own understanding, phrasing it largely in non-mathematical terms. As such, it may be of use to others who come from more of a programming than a mathematical background.
| | | | |
www.kdnuggets.com
|
|
| | | This blog post provides a tutorial on constructing a convolutional neural network for image classification in PyTorch, leveraging convolutional and pooling layers for feature extraction as well as fully-connected layers for prediction. | ||
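The architecture described (convolution and pooling for feature extraction, fully-connected layers for prediction) can be sketched in PyTorch as follows; the layer sizes and the 28x28 single-channel input are illustrative assumptions, not taken from the linked post:

```python
import torch
import torch.nn as nn

class SimpleCNN(nn.Module):
    """Minimal CNN sketch for 28x28 grayscale images and 10 classes."""

    def __init__(self, num_classes=10):
        super().__init__()
        # Convolution + pooling layers extract spatial features.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x28x28 -> 16x28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 16x14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x14x14
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32x7x7
        )
        # A fully-connected layer maps the flattened features to class scores.
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = SimpleCNN()
out = model(torch.randn(4, 1, 28, 28))  # a batch of 4 fake images
print(out.shape)  # torch.Size([4, 10])
```

The comments track how each layer changes the tensor shape; the flattened size fed to the linear layer (32 * 7 * 7) must match the output of the last pooling layer.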