You are here: research.google
hpc-ai.com (3.5 parsecs away)
We are delighted to announce a comprehensive upgrade to the ColossalAI-MoE module, designed specifically for Mixture-of-Experts (MoE) models. This upgrade helps users train and deploy expert models efficiently and reliably.
pytorch.org (3.8 parsecs away)
Recent studies have shown that training larger models improves model quality. Over the last three years, model size grew 10,000-fold, from BERT with 110M parameters to Megatron-2 with one trillion. However, training large AI models is not easy: aside from the need for large amounts of computing resources, the software engineering complexity is also challenging. PyTorch has been building tools and infrastructure to make it easier.
www.anyscale.com (3.9 parsecs away)
ByteDance, the company behind TikTok, leverages multi-modal models to power many applications, such as text-based image retrieval and object detection.
bdtechtalks.com (26.0 parsecs away)
The transformer model has become one of the main highlights of advances in deep learning and deep neural networks.