ValueError: An operation has None for gradient. #3

Open
KihaRaito opened this issue May 18, 2019 · 0 comments

@KihaRaito

Hi!

I set Params.max_passage_count to 1, wrote a method that looks up the word embeddings for passage_ids and question_ids in the shape the model's input expects, and then called model.fit, but I ran into the following gradient error.

src/model.py

model.fit(get_batch("dev"), batch_size=Params.batch_size, epochs=Params.num_epochs, verbose=1)

src/data_load.py

class Embedding:
    ...
    def get_word_emb(self, ids):
        # look up the stored embedding vector for each word id
        return [getattr(self, "_word_emb")[i] for i in ids]

...

def get_batch(mode="train"):
    data, shapes = load_data("../dev.json")
    # slice the loaded examples into a queue and batch them with dynamic padding
    input_queue = tf.train.slice_input_producer(data, shuffle=False)
    batch = tf.train.batch(input_queue, num_threads=2,
                           batch_size=Params.batch_size, capacity=Params.batch_size * 32, dynamic_pad=True)
    return batch
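
For context, get_word_emb above does the id-to-vector lookup in plain Python, outside the TensorFlow graph. The equivalent in-graph lookup would go through tf.nn.embedding_lookup; below is a minimal sketch of that pattern, assuming an embedding matrix of shape [vocab_size, emb_dim] (the names and shapes are illustrative, not taken from this repo).

import numpy as np
import tensorflow as tf

# illustrative embedding table: vocab_size x emb_dim
word_emb = np.random.rand(100, 8).astype(np.float32)

ids = tf.placeholder(tf.int32, shape=[None])      # a batch of word ids
emb_table = tf.constant(word_emb)                  # tf.Variable(...) if the table should be trainable
vectors = tf.nn.embedding_lookup(emb_table, ids)   # shape: [batch, emb_dim]

with tf.Session() as sess:
    print(sess.run(vectors, feed_dict={ids: [3, 7, 42]}).shape)  # (3, 8)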

Error

Traceback (most recent call last):
  File "model.py", line 72, in <module>
    Vnet()
  File "model.py", line 65, in __init__
    model.fit(get_batch("dev"), batch_size=Params.batch_size, epochs=Params.num_epochs, verbose=1)
  File "/anaconda3/lib/python3.7/site-packages/keras/engine/training.py", line 1010, in fit
    self._make_train_function()
  File "/anaconda3/lib/python3.7/site-packages/keras/engine/training.py", line 509, in _make_train_function
    loss=self.total_loss)
  File "/anaconda3/lib/python3.7/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "/anaconda3/lib/python3.7/site-packages/keras/optimizers.py", line 475, in get_updates
    grads = self.get_gradients(loss, params)
  File "/anaconda3/lib/python3.7/site-packages/keras/optimizers.py", line 91, in get_gradients
    raise ValueError('An operation has `None` for gradient. '
ValueError: An operation has `None` for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval.
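
For reference, the same ValueError can be reproduced in isolation when a custom loss routes everything through a non-differentiable op such as K.argmax. A minimal standalone sketch (Keras 2.x with the TensorFlow backend; unrelated to this repo's code):

import numpy as np
from keras import backend as K
from keras.layers import Dense, Input
from keras.models import Model

inp = Input(shape=(4,))
out = Dense(3, activation="softmax")(inp)
model = Model(inp, out)

def non_differentiable_loss(y_true, y_pred):
    # K.argmax has no gradient, so no gradient reaches the Dense weights
    return K.mean(K.cast(K.argmax(y_pred, axis=-1), "float32"))

model.compile(optimizer="adam", loss=non_differentiable_loss)
model.fit(np.zeros((8, 4)), np.zeros((8, 3)), epochs=1)  # raises the same ValueError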

It looks like verify_loss is still being implemented. Is that the case?
src/loss_functions.py

def verify_loss(y_true, y_pred):
    batch_losses = K.sum(y_true * K.log(y_pred), axis=-1)
    # batch_losses = K.map_fn(loss_per_batch, (y_true, y_pred), dtype="float32")
    return -K.mean(batch_losses)
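
One way to check whether the loss itself is what blocks the gradients is to ask the backend for the gradients of the compiled model's total loss with respect to its trainable weights, which is essentially what Keras's get_gradients does before raising this error. A diagnostic sketch, assuming model is the compiled Keras model from model.py:

from keras import backend as K

# run this after model.compile(...) but before model.fit(...)
grads = K.gradients(model.total_loss, model.trainable_weights)
blocked = [w.name for w, g in zip(model.trainable_weights, grads) if g is None]
print(blocked)  # any weight listed here sits behind a non-differentiable op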