README.md (+3 −1)

@@ -46,4 +46,6 @@ x = zeros(LogDensityProblems.dimension(ℓ)) # ℓ is your log density
 
 5. [FiniteDifferences.jl](https://github.com/JuliaDiff/FiniteDifferences.jl) Finite differences are very robust, with small numerical error, but usually not fast enough to replace AD in practice on nontrivial problems. The backend in this package is mainly intended for checking and debugging results from other backends; note that in most cases ForwardDiff is both faster and more accurate.
 
-PRs for other AD frameworks are welcome, even if they are WIP.
+Other AD frameworks are supported thanks to [ADTypes.jl](https://github.com/SciML/ADTypes.jl) and [DifferentiationInterface.jl](https://github.com/gdalle/DifferentiationInterface.jl).
+
+PRs for remaining AD frameworks are welcome, even if they are WIP.
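
To illustrate the checking role that item 5 above assigns to FiniteDifferences, here is a minimal sketch. The toy problem `ToyNormal` is a hypothetical stand-in for your log density, and the symbol-based constructors `ADgradient(:ForwardDiff, ℓ)` and `ADgradient(:FiniteDifferences, ℓ)` are assumed to be available as for the other listed backends; the two gradients should agree up to finite-difference error.

```julia
using LogDensityProblems, LogDensityProblemsAD
import ForwardDiff, FiniteDifferences

# Hypothetical toy log density (unnormalized standard normal) implementing
# the LogDensityProblems interface; also reused by the sketches further below.
struct ToyNormal end
LogDensityProblems.logdensity(::ToyNormal, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(::ToyNormal) = 2
LogDensityProblems.capabilities(::Type{ToyNormal}) =
    LogDensityProblems.LogDensityOrder{0}()

ℓ = ToyNormal()
x = randn(LogDensityProblems.dimension(ℓ))

∇ℓ_fwd = ADgradient(:ForwardDiff, ℓ)        # fast and accurate AD
∇ℓ_fin = ADgradient(:FiniteDifferences, ℓ)  # slow but robust check

_, g_fwd = LogDensityProblems.logdensity_and_gradient(∇ℓ_fwd, x)
_, g_fin = LogDensityProblems.logdensity_and_gradient(∇ℓ_fin, x)
maximum(abs, g_fwd - g_fin)  # small: agreement up to finite-difference error
```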
Wrap log density `ℓ` using automatic differentiation (AD) of type `ad` to obtain a gradient.
@@ -19,12 +19,19 @@ Currently,
 - `ad::ADTypes.AutoReverseDiff`
 - `ad::ADTypes.AutoTracker`
 - `ad::ADTypes.AutoZygote`
-are supported.
-The AD configuration specified by `ad` is forwarded to the corresponding calls of `ADgradient(Val(...), ℓ)`.
+are supported with custom implementations.
+The AD configuration specified by `ad` is forwarded to the corresponding calls of `ADgradient(Val(...), ℓ)`.
+
+Passing `x` as a keyword argument means that the gradient operator will be "prepared" for the specific type and size of the array `x`. This can speed up further evaluations on similar inputs, but will likely cause errors if the new inputs have a different type or size. With `AutoReverseDiff`, it can also yield incorrect results if the logdensity contains value-dependent control flow.
+
+If you want to use another backend from [ADTypes.jl](https://github.com/SciML/ADTypes.jl) which is not in the list above, you need to load [DifferentiationInterface.jl](https://github.com/gdalle/DifferentiationInterface.jl) first.
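
As a concrete illustration of the preparation keyword documented above, here is a short sketch reusing `ℓ` and `x` from the FiniteDifferences example earlier; `AutoForwardDiff` is chosen only as a familiar backend.

```julia
using ADTypes

# Unprepared gradient operator.
∇ℓ = ADgradient(ADTypes.AutoForwardDiff(), ℓ)
LogDensityProblems.logdensity_and_gradient(∇ℓ, x)

# Prepared for the specific type and size of `x`; reuse it only on
# similar inputs, per the caveats above.
∇ℓ_prepared = ADgradient(ADTypes.AutoForwardDiff(), ℓ; x = x)
LogDensityProblems.logdensity_and_gradient(∇ℓ_prepared, x)
```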
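
And a hedged sketch of the fallback path for backends without a custom implementation; `AutoFiniteDiff` is only an assumed example of such a backend, and the corresponding AD package must be available as usual.

```julia
import DifferentiationInterface  # must be loaded first for non-custom backends
import FiniteDiff                # the assumed example backend package

# Forwarded through DifferentiationInterface rather than a custom implementation.
∇ℓ_di = ADgradient(ADTypes.AutoFiniteDiff(), ℓ)
LogDensityProblems.logdensity_and_gradient(∇ℓ_di, x)
```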
+Gradient wrapper which uses [DifferentiationInterface.jl](https://github.com/gdalle/DifferentiationInterface.jl).
+
+# Fields
+
+- `backend::AbstractADType`: one of the autodiff backend types defined in [ADTypes.jl](https://github.com/SciML/ADTypes.jl), for example `ADTypes.AutoForwardDiff()`
+- `prep`: either `nothing` or the output of `DifferentiationInterface.prepare_gradient` applied to the logdensity and the provided input
+- `ℓ`: logdensity function, amenable to `LogDensityProblemsAD.logdensity(ℓ, x)`
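
To make the `backend` and `prep` fields concrete, here is a minimal standalone sketch of what they hold; the stand-in function `f` and the choice of `AutoForwardDiff` are assumptions for illustration, and the prepared-call signature may vary across DifferentiationInterface versions.

```julia
import DifferentiationInterface as DI
import ForwardDiff
using ADTypes: AutoForwardDiff

f = x -> -sum(abs2, x) / 2  # hypothetical stand-in for the wrapped logdensity
backend = AutoForwardDiff() # what the `backend` field holds
x = zeros(2)

# What the `prep` field stores when an input was provided at construction;
# otherwise it is `nothing`.
prep = DI.prepare_gradient(f, backend, x)

# An ordinary (unprepared) gradient call through the same backend.
DI.gradient(f, backend, x)  # ≈ zeros(2)
```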