Explore >> Select a destination


You are here: algobeans.com
sportscidata.com (3.6 parsecs away)
Recently Dan Weaving and the research group at Leeds Beckett University put out a paper outlining how to perform a type of dimension reduction on training load data: principal component analysis (PCA). The benefit of such an analysis is that it can reduce a large number of metrics into a more manageable dataset. This may uncover...
(A rough PCA sketch illustrating this idea follows the list below.)
liorpachter.wordpress.com (3.0 parsecs away)
In the Jeopardy! game show, contestants are presented with questions formulated as answers that require answers in the form of questions. For example, if a contestant selects "Normality for $200"...
moyhu.blogspot.com (8.5 parsecs away)
I had been quietly plodding through Wahl and Ammann 2007. This is the paper that re-did the MBH98/99 proxy calculations using conventionall...
vxlabs.com (15.8 parsecs away)
I have recently become fascinated with (Variational) Autoencoders and with PyTorch. Kevin Frans has a beautiful blog post online explaining variational autoencoders, with examples in TensorFlow and, importantly, with cat pictures. Jaan Altosaar's blog post takes an even deeper look at VAEs from both the deep learning perspective and the perspective of graphical models. Both of these posts, as well as Diederik Kingma's original 2014 paper Auto-Encoding Variational Bayes, are more than worth your time.
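The sportscidata.com excerpt above describes PCA collapsing a large set of training-load metrics into a few components. As a rough illustration of that general idea only (the metric names and data below are invented placeholders, not taken from the Dan Weaving / Leeds Beckett paper), a minimal scikit-learn sketch might look like this:

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical weekly training-load metrics for 50 athletes; the names and
# values are placeholders, not data from the paper mentioned above.
metrics = ["total_distance", "high_speed_running", "accelerations",
           "collisions", "srpe", "heart_rate_load"]
X = rng.normal(size=(50, len(metrics)))

# Standardise first: PCA is driven by variance, so unscaled metrics with
# large units would dominate the components.
X_scaled = StandardScaler().fit_transform(X)

# Keep as many components as needed to explain roughly 90% of the variance.
pca = PCA(n_components=0.9, svd_solver="full")
scores = pca.fit_transform(X_scaled)

print("components kept:", pca.n_components_)
print("variance explained:", pca.explained_variance_ratio_.round(2))

With random placeholder data the reduction is modest, but on real, highly correlated training-load metrics a few components typically capture most of the variance, which is the "more manageable dataset" the excerpt refers to.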