Explore >> Select a destination


You are here

nanonets.com
michael-lewis.com
2.9 parsecs away

This is a short summary of some of the terminology used in machine learning, with an emphasis on neural networks. I put it together primarily to aid my own understanding, phrasing it largely in non-mathematical terms, so it may be useful to others who come from more of a programming background than a mathematical one.
graphneural.network
3.6 parsecs away

[AI summary] The text is a comprehensive list of the convolutional layers available in the Spektral library. These layers are designed for graph neural networks and include architectures such as GIN, GraphSAGE, TAG, and XENet, among others. Each layer has its own set of parameters and input/output requirements, making it suitable for different kinds of graph-based machine learning tasks.
distill.pub
2.5 parsecs away

What components are needed for building learning algorithms that leverage the structure and properties of graphs?
comsci.blog
14.7 parsecs away

In this blog post, we will learn about vision transformers (ViT) and implement an MNIST classifier with one. We will go step by step and understand every part of the vision transformer clearly, and you will see the motivations of the original paper's authors behind several parts of the architecture.