sciruby.com
aosmith.rbind.io
I walk through an example of simulating data from a binomial generalized linear mixed model with a logit link and then exploring estimates of over/underdispersion.
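A minimal sketch of that kind of simulation, written in Python rather than the post's R (the group count, variance components, and the crude Pearson dispersion check are illustrative assumptions, not taken from the post):

```python
import numpy as np

rng = np.random.default_rng(42)

n_groups, n_per_group, trials = 20, 10, 50

# Random intercept per group (the variance component of the GLMM)
group_effects = rng.normal(0.0, 0.5, size=n_groups)
group = np.repeat(np.arange(n_groups), n_per_group)

# Linear predictor on the logit scale: fixed intercept + random effect
eta = -0.5 + group_effects[group]
p = 1.0 / (1.0 + np.exp(-eta))   # inverse-logit link
y = rng.binomial(trials, p)      # binomial response per observation

# Crude dispersion check: Pearson statistic over residual degrees of freedom.
# Ratios well above 1 suggest overdispersion relative to a plain binomial.
expected = trials * p
pearson = np.sum((y - expected) ** 2 / (trials * p * (1 - p)))
dispersion = pearson / (len(y) - 1)
```

Because the check here conditions on the true group effects, it should hover near 1; refitting a model that ignores the grouping would instead show the marginal overdispersion the post explores.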
matbesancon.xyz
Learning by doing: predicting the outcome.
www.fromthebottomoftheheap.net
[AI summary] The text discusses the use of generalized additive models (GAMs) to represent random effects as smooths, enabling the testing of random effects against a null of zero variance. It compares this approach with traditional mixed-effects models (e.g., lmer) and highlights the advantages and limitations of each. Key points include: (1) Representing random effects as smooths in GAMs allows for efficient testing of variance components and compatibility with complex distributional models. (2) While GAMs can fit such models, they are computationally slower for large datasets with many random effects due to the lack of sparse matrix optimization. (3) The AIC values for models with and without random effects are similar, suggesting that the simpler model i...
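The smooth-as-random-effect equivalence that summary refers to comes down to ridge penalization: penalizing per-group intercepts with penalty λ = σ²_ε / σ²_b yields the same shrunken estimates (BLUPs) as a random intercept. A hypothetical numpy illustration (all sample sizes and variances here are made up for the demo, not drawn from the post):

```python
import numpy as np

rng = np.random.default_rng(1)
n_groups, n_per = 8, 5
g = np.repeat(np.arange(n_groups), n_per)

b_true = rng.normal(0.0, 1.0, n_groups)              # true random intercepts
y = b_true[g] + rng.normal(0.0, 0.5, n_groups * n_per)

Z = np.eye(n_groups)[g]          # indicator (dummy) matrix mapping obs -> group
lam = 0.5 ** 2 / 1.0 ** 2        # penalty = sigma_eps^2 / sigma_b^2

# Ridge solution (Z'Z + lam*I)^{-1} Z'y is the BLUP of the random intercepts
b_hat = np.linalg.solve(Z.T @ Z + lam * np.eye(n_groups), Z.T @ y)

# Unpenalized (fixed-effect) estimates are just the raw group means
raw_means = np.array([y[g == j].mean() for j in range(n_groups)])
```

With a balanced design, `b_hat` equals the raw group means multiplied by n/(n + λ), so every penalized estimate is shrunk toward zero; that shrinkage is exactly what `bs = "re"` smooths in mgcv encode via the penalty matrix.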
iclr-blogposts.github.io
Diffusion Models, a new generative model family, have taken the world by storm after the seminal paper by Ho et al. [2020]. While diffusion models are often described as probabilistic Markov chains, their underlying principle is based on the decades-old theory of Stochastic Differential Equations (SDEs), as later shown by Song et al. [2021]. In this article, we will go back and revisit the 'fundamental ingredients' behind the SDE formulation and show how the idea can be 'shaped' to get to the modern form of score-based Diffusion Models. We'll start from the very definition of the 'score', how it was used in the context of generative modeling, how we achieve the necessary theoretical guarantees, and how the critical design choices were made to finally arri...
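For readers new to the term: the 'score' in score-based models is the gradient of the log-density with respect to the data, ∇ₓ log p(x), not the classical statistical score with respect to parameters. A quick numerical sanity check for a 1-D Gaussian, where the score has the closed form (μ − x)/σ² (the values below are chosen arbitrarily):

```python
import numpy as np

mu, sigma = 1.0, 2.0

def log_p(x):
    # Log-density of N(mu, sigma^2); the constant drops out of the gradient
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

def score_analytic(x):
    # Score = d/dx log p(x) = (mu - x) / sigma^2 for a Gaussian
    return (mu - x) / sigma ** 2

# Central finite difference of log p should match the analytic score
x, h = 0.3, 1e-5
score_numeric = (log_p(x + h) - log_p(x - h)) / (2 * h)
```

Score-based diffusion models learn an approximation of this gradient field for the data distribution at many noise levels, then run the reverse-time SDE along it to generate samples.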