InvalidArgumentError: Number of ways to split should evenly divide the split dimension, but got split_dim 0 (size = 29) and num_split 4
[[node sequential_1/masksembles2d/split (defined at /usr/local/lib/python3.7/dist-packages/tensorflow_core/python/framework/ops.py:1751) ]] [Op:__inference_distributed_function_14069]
The current implementation assumes that the number of samples in every batch is divisible by the number of masks. With 1170 samples and a 0.1 validation split you are left with 1053 training samples, so if you are using the default batch size of 32, the last batch contains 1053 mod 32 = 29 samples, which is not divisible by 4 and is most likely the size 29 reported in the error.
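The traceback points at the `split` op inside the layer. Here is a minimal reproduction of the same error, independent of the rest of the model (the tensor shape is only an illustration built around the 29-sample batch from the traceback):

```python
import tensorflow as tf

x = tf.random.normal((29, 8, 8, 16))  # a batch of 29 samples, like the one in the traceback

# Masksembles splits the batch into one group per mask; 29 is not divisible by 4,
# so this raises the same InvalidArgumentError reported above.
tf.split(x, num_or_size_splits=4, axis=0)
```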
A possible workaround is to build a tf.data dataset with the drop_remainder option turned on, or simply make sure that the number of training (and validation) samples is divisible by 4.
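For the first option, a minimal sketch with tf.data (the batch size of 32 and the input shapes are assumptions; substitute your own arrays and model):

```python
import numpy as np
import tensorflow as tf

# Dummy stand-ins for your training data; shapes are illustrative only.
x_train = np.random.rand(1053, 28, 28, 1).astype("float32")
y_train = np.random.randint(0, 10, size=(1053,)).astype("int32")

BATCH_SIZE = 32  # itself a multiple of the 4 masks

train_ds = (
    tf.data.Dataset.from_tensor_slices((x_train, y_train))
    .shuffle(buffer_size=len(x_train))
    # drop_remainder=True discards the final partial batch (29 samples here),
    # so every batch reaching the Masksembles layer splits evenly into 4 groups.
    .batch(BATCH_SIZE, drop_remainder=True)
)

# model.fit(train_ds, epochs=...)  # build the validation dataset the same way
```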
I think I can patch the current implementation so that it works in your case as well, but I need some time for it.
Hello,
I'm trying to use the layer, and I'm running into the error shown above.
My network looks like this:
My data consists of 1170 training records, with 0.1 held out for validation.
What should I be aware of before using this model?
Thanks.