Test: Add Mooncake AD testing to conv layer test infrastructure #641
CarloLucibello merged 2 commits into JuliaGraphs:master from
Conversation
@CarloLucibello Is this looking okay? I can proceed to check other layers too.
```julia
if test_mooncake
    # Mooncake gradient with respect to input, compared against Zygote.
    loss_mc_x = (xs...) -> loss(f, graph, xs...)
    _cache_x = Base.invokelatest(Mooncake.prepare_gradient_cache, loss_mc_x, xs...)
```
It is related to the world age: `import Mooncake` and TestItemRunner's `eval` run in different world ages, which throws an error. Wrapping the call in `Base.invokelatest` prevents this.
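A minimal, self-contained illustration of the world-age problem (hypothetical function names, not from this PR): a method defined at runtime is "too new" for code compiled earlier, but `Base.invokelatest` always dispatches in the latest world.

```julia
# Sketch of the world-age issue that invokelatest works around.
function demo()
    @eval newfn() = 42              # define a method at runtime (newer world age)
    # newfn()                       # a direct call here throws MethodError: too new
    return Base.invokelatest(newfn) # invokelatest dispatches in the latest world
end

demo()  # returns 42
```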
Tests related to this PR are failing (GNN julia 1). On Julia < 1.12, Mooncake testing should be skipped. You can define a global flag in test/runtests.jl: `const TEST_MOONCAKE = VERSION >= v"1.12"`.
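The suggested guard might look like this (the flag name comes from the review comment; the `test_gradients` call is a hypothetical usage, not code from this PR):

```julia
# In test/runtests.jl: skip Mooncake AD tests on Julia versions before 1.12.
const TEST_MOONCAKE = VERSION >= v"1.12"

# In an individual test file, the flag could then be threaded through, e.g.:
# test_gradients(layer, graph, x; test_mooncake = TEST_MOONCAKE)
```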
Also, instead of calling Mooncake directly, we can call the testing function used in Flux: https://github.com/FluxML/Flux.jl/blob/master/test/test_utils.jl#L82
@CarloLucibello, thanks for the detailed review! With `friendly_tangents=true`, Mooncake tries to deep-copy the closure's captured variables (including the `GNNGraph`), which fails because `DataStore` contains a `Dict{Symbol,Any}`. As a workaround, I'm calling Mooncake's API directly with the default config. I will include the version check for Julia 1.12.
If you could provide an example, this should be reported to the Mooncake repo. OK to have the workaround for the time being.
There are a couple of layers that fail tests. We should:
Signed-off-by: Parvm1102 <parvmittal31757@gmail.com>
@CarloLucibello I have updated my PR and the checks are passing now. I had to disable Mooncake testing for four layers:
I have also made an MWE:

```julia
using Mooncake

# _copy_output fails on any Dict
Mooncake._copy_output(Dict(:x => 1))
# TypeError: in new, expected DataType, got Type{Symbol}
# This breaks prepare_gradient_cache when friendly_tangents=true

struct DataStore
    _n::Int
    _data::Dict{Symbol, Any}
end

f(x, ds) = sum(x) * Float32(ds._n)
x = randn(Float32, 3, 3)
ds = DataStore(3, Dict{Symbol,Any}(:x => x))

# FAILS: friendly_tangents=true triggers _copy_output on all args
Mooncake.prepare_gradient_cache(f, x, ds; config=Mooncake.Config(friendly_tangents=true))

# WORKS: default (friendly_tangents=false) is fine
cache = Mooncake.prepare_gradient_cache(f, x, ds)
Mooncake.value_and_gradient!!(cache, f, x, ds)  # works
```
Nice! Could you open an issue in Mooncake.jl with it?
I will open it. Should I also check Mooncake compatibility for layers other than Conv?
Yes!
This is related to issue #640.
I have implemented Mooncake AD testing for the Conv layers (conv.jl).
Updated the `test_gradients` function in the `GraphNeuralNetworks/test/test_module.jl` file to accommodate Mooncake AD if the `test_mooncake` flag is `true`.
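As a rough sketch (not the actual implementation in `test_module.jl`), the Mooncake branch inside `test_gradients` could look like the following; the `loss` helper, argument order, and the gradient unpacking are assumptions based on the diff shown above:

```julia
using Test, Zygote, Mooncake

# Hypothetical sketch of the test_mooncake branch; the real helper lives in
# GraphNeuralNetworks/test/test_module.jl and may differ in detail.
function test_gradients_sketch(loss, f, graph, xs...; test_mooncake::Bool = false)
    # Reference value and gradients from Zygote.
    y, grads_zyg = Zygote.withgradient((xs...) -> loss(f, graph, xs...), xs...)
    if test_mooncake
        loss_mc = (xs...) -> loss(f, graph, xs...)
        # invokelatest avoids the world-age mismatch with TestItemRunner's eval.
        cache = Base.invokelatest(Mooncake.prepare_gradient_cache, loss_mc, xs...)
        y_mc, grads_mc = Mooncake.value_and_gradient!!(cache, loss_mc, xs...)
        @test y_mc ≈ y
        # grads_mc[1] is the gradient w.r.t. the closure itself; skip it and
        # compare the per-argument gradients against Zygote's.
        for (g_mc, g_zyg) in zip(grads_mc[2:end], grads_zyg)
            @test g_mc ≈ g_zyg
        end
    end
    return grads_zyg
end
```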