beltoforion.de
ataspinar.com
In this blog post we will have a look at how differential equations (DEs) can be solved numerically via the finite differences method. By solving differential equations we can run simula...
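The core idea of the finite differences method can be sketched in a few lines (an assumed example, not taken from the linked post): replace the derivative in an ODE with a difference quotient and step forward in time.

```python
# Minimal sketch (assumed example): solve dy/dt = -2*y, y(0) = 1 by
# replacing the derivative with a forward difference
# (y[n+1] - y[n]) / dt, i.e. the explicit Euler scheme.
import math

def forward_euler(f, y0, t0, t1, n):
    """Integrate y' = f(t, y) with n forward-difference steps."""
    dt = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y = y + dt * f(t, y)  # y[n+1] = y[n] + dt * f(t[n], y[n])
        t += dt
    return y

y_num = forward_euler(lambda t, y: -2.0 * y, 1.0, 0.0, 1.0, 1000)
y_exact = math.exp(-2.0)      # analytic solution y(t) = exp(-2 t)
print(abs(y_num - y_exact))   # small discretization error, about 3e-4
```

Halving `dt` roughly halves the error, reflecting the first-order accuracy of the forward difference.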
www.stochasticlifestyle.com
I found these notes from August 2018 and thought they might be useful, so I am posting them verbatim. A stiff ordinary differential equation is a difficult problem to integrate. However, many of the ODE solver suites offer quite a few different choices for this kind of problem; DifferentialEquations.jl, for example, offers almost 200 different choices. In this article we will dig into what the differences between these integrators really are, so that you can more easily find which one will be most efficient for your problem.

Quick Overview (tl;dr)

- BDF, Rosenbrock, and ESDIRK methods are standard
- For small equations, Rosenbrock methods have performance advantages
- For very stiff systems, Rosenbrock and Rosenbrock-W methods do not require convergence of Newton's m...
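A quick way to see why stiff solvers matter (a minimal sketch, using SciPy rather than DifferentialEquations.jl, with an assumed test problem): on a stiff linear ODE, an explicit method like RK45 is forced to take tiny steps for stability, while an implicit BDF method is not.

```python
# Assumed example: y' = -lam * (y - cos(t)) has a fast transient onto a
# slowly varying solution. The large rate lam makes it stiff, so the
# explicit RK45 is step-size limited by stability rather than accuracy,
# while the implicit BDF method takes far fewer steps.
import numpy as np
from scipy.integrate import solve_ivp

lam = 1e4  # large decay rate makes the problem stiff

def f(t, y):
    return -lam * (y - np.cos(t))

rk45 = solve_ivp(f, (0.0, 2.0), [0.0], method="RK45")
bdf = solve_ivp(f, (0.0, 2.0), [0.0], method="BDF")

print("RK45 steps:", rk45.t.size)  # far more steps than BDF
print("BDF steps:", bdf.t.size)
```

Both solvers reach the same solution; the difference is the cost per unit of simulated time, which is the trade-off the article digs into.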
ggcarvalho.dev
Using the power of randomness to answer scientific questions.
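The classic illustration of this idea (an assumed example; the linked post's actual problems may differ) is a Monte Carlo estimate of pi: sample random points in the unit square and count how many land inside the quarter circle.

```python
# Monte Carlo sketch (assumed example): the fraction of uniform random
# points with x^2 + y^2 <= 1 estimates the area ratio
# (quarter circle) / (unit square) = pi / 4.
import random

def estimate_pi(n_samples: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    inside = sum(
        1
        for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples

print(estimate_pi(1_000_000))  # typically within ~0.01 of 3.14159
```

The error shrinks like 1/sqrt(n), so each extra digit of accuracy costs roughly 100x more samples.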
blog.otoro.net
[AI summary] This article describes a project that combines genetic algorithms, NEAT (NeuroEvolution of Augmenting Topologies), and backpropagation to evolve neural networks for classification tasks. The key components include: 1) using NEAT to evolve neural networks with various activation functions, 2) applying backpropagation to optimize the weights of these networks, and 3) visualizing the results of the evolved networks on different datasets (e.g., XOR, two circles, spiral). The project also includes a web-based demo where users can interact with the system, adjust parameters, and observe the evolution process. The author explores how the genetic algorithm can discover useful features (like squaring inputs) without human intervention, and discusses the ...
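A heavily simplified sketch of the neuroevolution idea (an assumed example: NEAT in the article also evolves the network *topology*, and the author additionally fine-tunes weights with backpropagation, neither of which is shown here): a plain genetic algorithm evolving the weights of a fixed 2-2-1 network on the XOR task mentioned above.

```python
# Assumed toy example: evolve the 9 weights of a fixed 2-2-1 network
# (tanh hidden units, sigmoid output) to fit XOR, using mutation plus
# elitism. NEAT would also grow/prune nodes and connections.
import math
import random

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
N_WEIGHTS = 9  # 4 hidden weights + 2 hidden biases + 2 output weights + 1 bias

def sigmoid(z):
    z = max(-60.0, min(60.0, z))  # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, x):
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return sigmoid(w[6] * h0 + w[7] * h1 + w[8])

def mse(w):
    return sum((forward(w, x) - y) ** 2 for x, y in XOR) / len(XOR)

def evolve(pop_size=60, generations=300, sigma=0.5, seed=1):
    rng = random.Random(seed)
    pop = [[rng.gauss(0, 1) for _ in range(N_WEIGHTS)] for _ in range(pop_size)]
    err0 = min(map(mse, pop))  # best error before any evolution
    for _ in range(generations):
        # Elitism: keep the best genome unchanged, fill the rest with
        # mutated copies of genomes sampled from the top half.
        pop.sort(key=mse)
        parents = pop[: pop_size // 2]
        pop = [pop[0]] + [
            [wi + rng.gauss(0, sigma) for wi in rng.choice(parents)]
            for _ in range(pop_size - 1)
        ]
    return err0, min(pop, key=mse)

err0, w = evolve()
print(f"best error: {err0:.3f} -> {mse(w):.3f}")  # elitism: never worse
```

Because the elite genome is carried over each generation, the best error is non-increasing; topology search and gradient fine-tuning, as in the article, are what make this scale beyond toy tasks.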