
Add promote_eltype for CuArray / ROCArray / MtlArray #483

Open
ChrisRackauckas-Claude wants to merge 1 commit into JuliaArrays:master from ChrisRackauckas-Claude:add-promote-eltype-gpu

Conversation

@ChrisRackauckas-Claude

Summary

  • Adds ArrayInterface.promote_eltype methods for the three GPU array types, defined in their respective package extensions (ArrayInterfaceCUDAExt, ArrayInterfaceAMDGPUExt, ArrayInterfaceMetalExt).
  • Each implementation swaps the element type via promote_type(T, T2) while preserving the array's other type parameters (memory kind M for CuArray, buffer type B for ROCArray, storage mode S for MtlArray).
  • Bumps patch version 7.23.0 → 7.24.0.

Why

ArrayInterface.promote_eltype was only defined for Array{T, N} (its docstring explicitly notes that no generic fallback is given), so downstream packages that pass GPU array types through promote_eltype hit a MethodError. Example from SciML/NonlinearSolve.jl#910 (test/cuda_tests.jl:33, "GeneralizedFirstOrderAlgorithm"), when deriving a Dual-eltype wrapper type for a CuArray{Float32} state:

    MethodError: no method matching promote_eltype(
        ::Type{CuArray{Float32, 1, CUDA.DeviceMemory}},
        ::Type{ForwardDiff.Dual{Tag{NonlinearSolveBase.NonlinearSolveTag, Float32}, Float32, 1}})

Adding the obvious eltype-swapping method in each GPU extension makes promote_eltype usable for GPU-array downstream code paths without forcing callers back onto the allocating typeof(similar(a, T2)) pattern.
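For context, the allocating pattern being avoided reduces to a one-liner; a minimal standalone sketch (the local `promote_eltype` below mirrors the existing `Array` method and is defined here only so the snippet runs without ArrayInterface):

```julia
# Local stand-in mirroring ArrayInterface's Array{T, N} method:
# swap the element type, keep the dimensionality.
promote_eltype(::Type{Array{T, N}}, ::Type{T2}) where {T, N, T2} =
    Array{promote_type(T, T2), N}

a = zeros(Float32, 4)

# Old workaround: allocate a throwaway array just to read off its type.
T_alloc = typeof(similar(a, Float64))          # Array{Float64, 1}

# Non-allocating: compute the promoted array type directly.
T_direct = promote_eltype(typeof(a), Float64)  # Array{Float64, 1}

T_alloc === T_direct  # true
```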

Test plan

  • CI passes (no GPU runners configured in this repo, so methods are verified by dispatch signature only; downstream NonlinearSolve.jl CUDA tests confirm the CuArray variant resolves correctly).

Notes

  • No changes to the generic promote_eltype contract — the "no generic fallback" docstring note still holds; this PR just adds three more concrete methods.
  • Method signatures use `<:CUDA.CuArray{T, N, M}` (etc.) rather than the exact type to cover subtypes (e.g. `DenseCuArray`).

🤖 Generated with Claude Code

`ArrayInterface.promote_eltype` only had a method for plain `Array{T, N}`
(with an explicit "no generic fallback is given" note in the docstring).
Downstream packages that pass GPU array types through
`promote_eltype` therefore hit a `MethodError` — for example,
SciML/NonlinearSolve.jl#910 tripped this on
`test/cuda_tests.jl:33 "GeneralizedFirstOrderAlgorithm"` when deriving
a Dual-eltype wrapper-signature array type for `CuArray{Float32}`:

    MethodError: no method matching promote_eltype(
        ::Type{CuArray{Float32, 1, CUDA.DeviceMemory}},
        ::Type{ForwardDiff.Dual{Tag{NonlinearSolveBase.NonlinearSolveTag, Float32}, Float32, 1}})

Adds the obvious eltype-swapping method in each GPU extension,
preserving the non-eltype type parameters (`M` for `CuArray` memory
kind, `B` for `ROCArray` buffer type, `S` for `MtlArray` storage mode):

    ArrayInterface.promote_eltype(
        ::Type{<:CuArray{T, N, M}}, ::Type{T2}
    ) where {T, N, M, T2} = CuArray{promote_type(T, T2), N, M}

    ArrayInterface.promote_eltype(
        ::Type{<:ROCArray{T, N, B}}, ::Type{T2}
    ) where {T, N, B, T2} = ROCArray{promote_type(T, T2), N, B}

    ArrayInterface.promote_eltype(
        ::Type{<:MtlArray{T, N, S}}, ::Type{T2}
    ) where {T, N, S, T2} = MtlArray{promote_type(T, T2), N, S}

Bumps patch version 7.23.0 → 7.24.0 so downstream packages can
compat-bound the new method.
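A standalone sketch of the eltype-swap pattern above, using a toy type so it runs without a GPU (`FakeGPUArray` is hypothetical; the real methods target `CuArray`, `ROCArray`, and `MtlArray` in their extensions):

```julia
# Toy stand-in with the same parameter layout as CuArray{T, N, M}.
struct FakeGPUArray{T, N, M} end

# Same shape as the added methods: swap T, preserve N and the third parameter.
promote_eltype(::Type{<:FakeGPUArray{T, N, M}}, ::Type{T2}) where {T, N, M, T2} =
    FakeGPUArray{promote_type(T, T2), N, M}

promote_eltype(FakeGPUArray{Float32, 1, :device}, Float64)
# FakeGPUArray{Float64, 1, :device} — memory-kind parameter preserved
```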

Co-Authored-By: Chris Rackauckas <accounts@chrisrackauckas.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

codecov bot commented Apr 15, 2026

Codecov Report

❌ Patch coverage is 0% with 6 lines in your changes missing coverage. Please review.
✅ Project coverage is 60.20%. Comparing base (192a325) to head (ab3b340).

Files with missing lines          Patch %   Lines
ext/ArrayInterfaceAMDGPUExt.jl    0.00%     2 Missing ⚠️
ext/ArrayInterfaceCUDAExt.jl      0.00%     2 Missing ⚠️
ext/ArrayInterfaceMetalExt.jl     0.00%     2 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #483      +/-   ##
==========================================
- Coverage   60.82%   60.20%   -0.63%     
==========================================
  Files          14       14              
  Lines         582      588       +6     
==========================================
  Hits          354      354              
- Misses        228      234       +6     
