using ConformalPrediction
using Distributions
using MLJ
using Plots
# Inputs:
N = 600
xmax = 3.0
d = Uniform(-xmax, xmax)
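The snippet above only defines the inputs; a minimal sketch of using them, assuming the training data are drawn i.i.d. from `d` (the sampling call itself is not shown in the source):

```julia
using Distributions

N = 600
xmax = 3.0
d = Uniform(-xmax, xmax)

# Draw N input points from the uniform distribution on [-xmax, xmax]
X = rand(d, N)
```
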
The goal is to differentiate a log-likelihood function, the workhorse of probability theory, mathematical statistics, and machine learning.
Here, it is the log-likelihood of a Gaussian mixture model:
normal_pdf(x::Real, mean::Real, var::Real) =
    exp(-(x - mean)^2 / (2var)) / sqrt(2π * var)
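A hedged sketch of the log-likelihood described above, reusing `normal_pdf`; the `gmm_loglik` helper and the mixture parameters are illustrative, not from the source:

```julia
# Gaussian density, as defined above
normal_pdf(x::Real, mean::Real, var::Real) =
    exp(-(x - mean)^2 / (2var)) / sqrt(2π * var)

# Log-likelihood of data x under a Gaussian mixture: for each point,
# sum the weighted component densities, take the log, and sum over points.
gmm_loglik(x, means, vars, weights) =
    sum(log(sum(w * normal_pdf(xi, m, v)
                for (w, m, v) in zip(weights, means, vars)))
        for xi in x)

data = [0.1, -0.5, 1.2]
ll = gmm_loglik(data, [0.0, 1.0], [1.0, 0.5], [0.5, 0.5])
```

Because `gmm_loglik` is a plain scalar-valued Julia function of its parameters, it is the kind of function one differentiates with an AD tool.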
# Requires installation of [GLMakie](https://github.com/JuliaPlots/Makie.jl)
# Include this file as include("eigshow.jl"), then run eigshow()
using GLMakie, LinearAlgebra, Printf
# Toby Driscoll ([email protected]), October 2021. Released under Creative Commons CC BY-NC 3.0 license.
# This function is inspired by EIGSHOW.M, which is held in copyright by The MathWorks, Inc and found at:
# Cleve Moler (2021). Cleve_Lab (https://www.mathworks.com/matlabcentral/fileexchange/59085-cleve_lab), MATLAB Central File Exchange. Retrieved October 25, 2021.
"""
The Jax developers optimized a differential equation benchmark in this issue, using DiffEqFlux.jl as the performance baseline. The Julia code from there was updated to include some standard performance tricks and serves as the benchmark code here. Thus both codes have been optimized by their respective library developers.
## Source for math political compass klein bottle meme by @KimPLab on Twitter
## Tweet: https://twitter.com/KimPLab/status/1381621398636949511
## Inspired by the math political compass torus meme by @jessebett
## https://twitter.com/jessebett/status/1379162611414138885
## @jessebett source code notes:
## upcycled from torus knot fibration visualization:
## http://www.jessebett.com/TorusKnotFibration/torusknot.html
## Note:
The spiral neural ODE was used as the training benchmark for both torchdiffeq (Python) and DiffEqFlux (Julia). Both used the same architecture and 500 steps of ADAM, and both achieved similar objective values at the end. Results:
- DiffEqFlux defaults: 7.4 seconds
- DiffEqFlux optimized: 2.7 seconds
- torchdiffeq: 288.97 seconds
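For context, the slowdown factors implied by these timings can be computed directly (a small sketch; the `Dict` layout is ours, the numbers come from the list above):

```julia
# Wall-clock times (seconds) from the benchmark results above
times = Dict("DiffEqFlux defaults"  => 7.4,
             "DiffEqFlux optimized" => 2.7,
             "torchdiffeq"          => 288.97)

fastest = minimum(values(times))

# Print each entry's slowdown relative to the fastest run
for (name, t) in sort(collect(times); by = last)
    println(name, ": ", round(t / fastest; digits = 1), "x")
end
```

This makes the headline claim concrete: torchdiffeq is roughly two orders of magnitude slower than the optimized DiffEqFlux run on this benchmark.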
Only non-stiff ODE solvers are tested, since torchdiffeq does not provide methods for stiff ODEs. The ODEs are chosen to be representative of models seen in physics and in model-informed drug development (MIDD) studies (quantitative systems pharmacology), in order to capture performance on realistic scenarios.
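As a sketch of what such a non-stiff solve looks like on the Julia side, here is a Lotka-Volterra system solved with `Tsit5`, a standard non-stiff Runge-Kutta method in OrdinaryDiffEq.jl. The specific system here stands in for the benchmark ODEs, which are not shown in this excerpt:

```julia
using OrdinaryDiffEq

# Lotka-Volterra: a small non-stiff system representative of physics/MIDD-style models
function lotka!(du, u, p, t)
    du[1] =  p[1] * u[1] - p[2] * u[1] * u[2]   # prey
    du[2] = -p[3] * u[2] + p[4] * u[1] * u[2]   # predator
end

u0 = [1.0, 1.0]
tspan = (0.0, 10.0)
p = [1.5, 1.0, 3.0, 1.0]

prob = ODEProblem(lotka!, u0, tspan, p)
sol = solve(prob, Tsit5())   # Tsit5: non-stiff explicit RK method
```

A stiff problem would instead call for an implicit method such as `Rodas5`, which is exactly the solver class torchdiffeq lacks.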
Below are the timings relative to the fastest method (lower is better). For approximately 1 million ODEs and fewer, torchdiffeq was more than an order of magnitude slower than DifferentialEquations.jl.