Hi PyJulia team,
I'm just getting started with porting a Julia package to a Python module. On the Python side, all interaction will be through PyTorch tensors.
I noticed that NumPy arrays and lists passed to Julia are converted to the built-in Array type. As far as I know, this is not possible with PyTorch tensors. A simple workaround would be to convert the tensors from torch to NumPy (and back). However, I am curious whether there is a more elegant way. How can I control this mapping?
Best regards,
Fabio
I believe any Python array that supports the buffer protocol should be wrappable by Julia without copying. From the Julia side, we only need the dimensions, (d)type, and a pointer to the data; we could then use Base.unsafe_wrap or an equivalent.
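To make the idea concrete, here is a minimal sketch (stdlib only) of pulling the dimensions, element type, and pointer out of a buffer-protocol object on the Python side — exactly the triple mentioned above. `buffer_triple` is a hypothetical helper, not part of PyJulia. Note that PyTorch tensors do not, to my knowledge, implement the buffer protocol directly, but for a CPU tensor `tensor.numpy()` returns a zero-copy NumPy view that does, so the same approach would apply to it.

```python
import array
import ctypes

def buffer_triple(obj):
    # Hypothetical helper: extract the (shape, format, pointer) triple
    # that Julia's Base.unsafe_wrap would need, from any object that
    # supports the buffer protocol.
    mv = memoryview(obj)
    # Address of the first byte of the underlying buffer (requires a
    # writable buffer, which a wrappable array is anyway).
    ptr = ctypes.addressof(ctypes.c_char.from_buffer(mv))
    return mv.shape, mv.format, ptr

# Works on anything buffer-protocol-compatible, e.g. a stdlib array;
# a NumPy array, or `tensor.numpy()` for a CPU torch tensor, would
# behave the same way.
a = array.array('d', [1.0, 2.0, 3.0])
shape, fmt, ptr = buffer_triple(a)
# shape == (3,), fmt == 'd' (C double), ptr is the data address
```

On the Julia side, these three values would let one call something like `unsafe_wrap(Array, Ptr{Float64}(ptr), shape)` to view the same memory without a copy (keeping the Python object alive for the lifetime of the wrapper).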