Question about how to use on gpu #195

Open · yolhan83 opened this issue Nov 16, 2024 · 2 comments
yolhan83 commented Nov 16, 2024

Hello, I wonder why this is not working on GPU:

julia> using CUDA,FiniteDiff
julia> f(x) = sum(abs2,x)
f (generic function with 1 method)

julia> x = rand(100,100) |> CuArray;

julia> f(x)
3298.8592709432687

julia> FiniteDiff.finite_difference_gradient(f,x);
ERROR: MethodError: no method matching compute_epsilon(::Val{:central}, ::CuArray{Float64, 2, CUDA.DeviceMemory}, ::Float64, ::Float64, ::Bool)
The function `compute_epsilon` exists, but no method is defined for this combination of argument types.

julia> FiniteDiff.finite_difference_gradient(f,collect(x)); # works
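
For completeness, here is a minimal sketch of the same workaround that round-trips through the CPU and copies the gradient back to the device. It only uses the Array/CuArray conversions from CUDA.jl, not any FiniteDiff GPU code path (assumption: the CPU detour is acceptable for your problem size):

using CUDA, FiniteDiff

f(x) = sum(abs2, x)
x = rand(100, 100) |> CuArray

# Differentiate a CPU copy, then move the gradient back to the GPU.
g = CuArray(FiniteDiff.finite_difference_gradient(f, Array(x)))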
@ChrisRackauckas
Member

I don't currently have a GPU to debug this: can you share the full stack trace so I can see if I can eyeball it?


yolhan83 commented Nov 17, 2024

Yes, here it is:

MethodError: no method matching compute_epsilon(::Val{:central}, ::CuArray{Float64, 2, CUDA.DeviceMemory}, ::Float64, ::Float64, ::Bool)
The function `compute_epsilon` exists, but no method is defined for this combination of argument types.

Closest candidates are:
  compute_epsilon(::Val{:central}, ::T, ::Real, ::Real, ::Any) where T<:Number
   @ FiniteDiff C:\Users\yolha\.julia\packages\FiniteDiff\vund7\src\epsilons.jl:13
  compute_epsilon(::Val{:hcentral}, ::T, ::Real, ::Real, ::Any) where T<:Number
   @ FiniteDiff C:\Users\yolha\.julia\packages\FiniteDiff\vund7\src\epsilons.jl:17
  compute_epsilon(::Val{:central}, ::T, ::Real, ::Real) where T<:Number
   @ FiniteDiff C:\Users\yolha\.julia\packages\FiniteDiff\vund7\src\epsilons.jl:13
  ...


Stacktrace:
 [1] #finite_difference_gradient!#9
   @ C:\Users\yolha\.julia\packages\FiniteDiff\vund7\src\gradients.jl:280 [inlined]
 [2] finite_difference_gradient!
   @ C:\Users\yolha\.julia\packages\FiniteDiff\vund7\src\gradients.jl:228 [inlined]
 [3] finite_difference_gradient(f::typeof(f), x::CuArray{Float64, 2, CUDA.DeviceMemory}, fdtype::Val{:central}, returntype::Type, inplace::Val{true}, fx::Nothing, c1::Nothing, c2::Nothing; relstep::Float64, absstep::Float64, dir::Bool)
   @ FiniteDiff C:\Users\yolha\.julia\packages\FiniteDiff\vund7\src\gradients.jl:163
 [4] finite_difference_gradient(f::Function, x::CuArray{Float64, 2, CUDA.DeviceMemory}, fdtype::Val{:central}, returntype::Type, inplace::Val{true}, fx::Nothing, c1::Nothing, c2::Nothing)
   @ FiniteDiff C:\Users\yolha\.julia\packages\FiniteDiff\vund7\src\gradients.jl:129
 [5] top-level scope
   @ In[12]:3

This means it enters this branch of the code:

function finite_difference_gradient!(
    df,
    f,
    x,
    cache::GradientCache{T1,T2,T3,T4,fdtype,returntype,inplace};
    relstep=default_relstep(fdtype, eltype(x)),
    absstep=relstep,
    dir=true) where {T1,T2,T3,T4,fdtype,returntype,inplace}

    # NOTE: in this case epsilon is a vector, we need two arrays for epsilon and x1
    # c1 denotes x1, c2 is epsilon
    fx, c1, c2, c3 = cache.fx, cache.c1, cache.c2, cache.c3
    if fdtype != Val(:complex) && ArrayInterface.fast_scalar_indexing(c2)
        @. c2 = compute_epsilon(fdtype, x, relstep, absstep, dir)
        copyto!(c1, x)
    end
    copyto!(c3, x)
    if fdtype == Val(:forward)
        # ... (forward branch omitted) ...
    elseif fdtype == Val(:central)
        @inbounds for i ∈ eachindex(x)
            if ArrayInterface.fast_scalar_indexing(c2)
                epsilon = ArrayInterface.allowed_getindex(c2, i) * dir
            else
                epsilon = compute_epsilon(fdtype, x, relstep, absstep, dir) * dir # <-- x is the whole array here, not a scalar x[i]
            end

and it breaks because compute_epsilon expects x to be a number there, but the whole array is passed in.
The same thing happens for :forward.
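
For what it's worth, broadcasting the scalar method over the array looks like it would stay on the device (Val(:central) is broadcast as a scalar), so the non-fast-scalar-indexing branch could presumably build an epsilon array instead of calling compute_epsilon with the whole array; the elementwise loop that follows would likely still need a GPU-friendly rewrite too. A rough sketch, not a tested patch:

using CUDA, FiniteDiff

x = rand(100, 100) |> CuArray
relstep = cbrt(eps(Float64))   # a typical relative step for central differences
absstep = relstep

# One broadcast kernel over the CuArray instead of a scalar call per element.
eps_arr = FiniteDiff.compute_epsilon.(Val(:central), x, relstep, absstep, true)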
