This package seems to go pretty far in unpacking objects that don't necessarily have an obvious JSON representation. This can be a pretty cool feature, but it can also lead to some weird, unexpected behavior. The example that burned me is sketched below.
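Roughly the following (a minimal sketch from memory; the exact output may differ by JSON.jl version, but the shape is the point):

```julia
using JSON

v = [1, missing, 2]

JSON.json(v)               # "[1,null,2]" -- the missing serializes as null, as expected
JSON.json(skipmissing(v))  # something like "{\"x\":[1,null,2]}" -- the iterator is
                           # unpacked as a struct, exposing its parent-array field `x`,
                           # rather than "[1,2]" as I expected
```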
This of course happens because `skipmissing` returns an `AbstractVector` which refers to the original object and simply skips the `missing`s during iteration, and the field in the `skipmissing` iterator that refers to the parent array is called `x`. I think it's safe to say that most people would expect any `AbstractVector` to just convert into a JS list corresponding to the `AbstractVector` itself, not the underlying data.
So, my proposal is that, at least for `AbstractArray` and `AbstractDict` types, the generated JSON should represent the object as an actual JS list or dict, rather than exposing the underlying data.
Thoughts?
This seems very reasonable, and is indeed already the behavior, as far as I know. But unfortunately the return value of `skipmissing` is not an `AbstractArray`. We could try to turn all iterables into JS lists instead of unpacking them as structs?
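(As an aside, a user-level workaround under the current behavior is to lower the iterator explicitly before serializing. A sketch only, and note it adds a method on a type we don't own:)

```julia
using JSON

# Lower the skipmissing iterator to a plain vector before serialization.
# Base.SkipMissing is the type returned by skipmissing; defining a method on it
# here is mild type piracy, so this is pragmatic rather than pretty.
JSON.lower(s::Base.SkipMissing) = collect(s)

JSON.json(skipmissing([1, missing, 2]))  # now "[1,2]"
```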
I had not realized when I opened this issue that `skipmissing` is not an `AbstractArray`. Now that I'm thinking about it, I suppose the reason for this is that it does not have a known length.
I'm no longer sure what I think the right solution is. Converting all iterables into JS lists seems a little aggressive, and the best way to determine whether something is iterable is with `hasmethod(iterate, Tuple{T})` (which is a bit slow, but perhaps not prohibitively so).
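To make the trade-off concrete, the "treat every iterable as a list" rule would look something like this (`lower_if_iterable` is a hypothetical helper, not anything JSON.jl currently defines):

```julia
# Hypothetical sketch of "lower every iterable to a JS list".
# Caveats that make this feel aggressive: strings iterate over characters,
# numbers iterate over themselves, and the hasmethod lookup costs a little per value.
lower_if_iterable(x) = hasmethod(iterate, Tuple{typeof(x)}) ? collect(x) : x

lower_if_iterable(skipmissing([1, missing, 2]))  # [1, 2]
lower_if_iterable("abc")                         # ['a', 'b', 'c'] -- probably not what anyone wants
```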
I'm wondering if it's perhaps better to throw an error when encountering unknown structs? That seems far safer than trying to unpack them. As long as JSON.jl converts `AbstractVector` and `AbstractDict` correctly, this should only happen for types that don't have an immediately obvious JS interpretation.
When I ran into this I was sort of carelessly calling `JSON.json` on stuff that was ultimately going to be read by a REST API. Since that's a pretty common use case, it might be a little better to err on the side of safety for unknown types. Something like the sketch below is what I have in mind.
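(A sketch of that idea; `strict_json` is a hypothetical helper at the call site, not a proposed JSON.jl API, and the type check here is top-level only — a real version would need to recurse into containers:)

```julia
using JSON

# Types with an obvious JSON representation; anything else errors instead of
# having its fields unpacked silently.
const JSONish = Union{AbstractArray, AbstractDict, AbstractString, Real, Nothing, Missing}

strict_json(x) = x isa JSONish ? JSON.json(x) :
    error("no obvious JSON representation for $(typeof(x)); define one explicitly")

strict_json([1, missing, 2])       # "[1,null,2]"
strict_json(skipmissing([1, 2]))   # errors instead of producing a surprising {"x": ...}
```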