LBNLayer not working when called on multiple tensors

The LBNLayer class does not work properly when called on multiple different tensors. The problem can be demonstrated with this minimal example (using tensorflow-gpu==1.13.1):

import tensorflow as tf
from lbn import LBNLayer

sseq = tf.keras.Sequential()
sseq.add(LBNLayer(n_particles=8))

x_0 = tf.placeholder(tf.float32, (100, 8, 4))
x_1 = tf.placeholder(tf.float32, (100, 8, 4))
y_0 = sseq(x_0)
y_1 = sseq(x_1)

tf.gradients(y_0, x_0)
# [<tf.Tensor 'gradients/sequential/LBN/inputs/split_grad/concat:0' shape=(100, 8, 4) dtype=float32>]

tf.gradients(y_1, x_1)
# [None]
# tf.gradients returns None because there is no connection in the graph from y_1 to x_1

tf.gradients(y_1, x_0)
# [<tf.Tensor 'gradients_2/sequential/LBN/inputs/split_grad/concat:0' shape=(100, 8, 4) dtype=float32>]
# This connection exists: y_1 is actually connected to x_0

Calling the sequential model a second time on the tensor x_1 does not connect x_1 to the graph. This is not the desired behaviour of a Keras layer.
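As a workaround (only a sketch, not a fix of the layer itself), one can create a fresh LBNLayer instance per input tensor, so that each call registers its own tensors. Note that the two instances then have independent weights, so this is not equivalent to reusing one layer:

import tensorflow as tf
from lbn import LBNLayer

x_0 = tf.placeholder(tf.float32, (100, 8, 4))
x_1 = tf.placeholder(tf.float32, (100, 8, 4))

# one layer instance per tensor, each with its own (independent) weights
y_0 = LBNLayer(n_particles=8)(x_0)
y_1 = LBNLayer(n_particles=8)(x_1)

tf.gradients(y_1, x_1)
# [<tf.Tensor ...>] -- connected, unlike the single-instance case above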

I suspect the reason behind this is that the call function of LBNLayer (which is the call function of lbn) registers the given input tensors as attributes of the LBN object. The layer is therefore stuck with the initially registered tensors and cannot feed other tensors through.
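The suspected behaviour can be reproduced with a hypothetical custom layer (CachingLayer is my own name, not part of lbn) that stores the first input tensor as an attribute and reuses it in every later call:

import tensorflow as tf

class CachingLayer(tf.keras.layers.Layer):
    # mimics the suspected LBN behaviour: the first input tensor is
    # stored as an attribute and reused in all subsequent calls
    def call(self, inputs):
        if not hasattr(self, "_cached_inputs"):
            self._cached_inputs = inputs
        # output is always built from the cached tensor, not from `inputs`
        return self._cached_inputs * 2.0

layer = CachingLayer()
a = tf.placeholder(tf.float32, (100, 4))
b = tf.placeholder(tf.float32, (100, 4))
out_a = layer(a)
out_b = layer(b)

tf.gradients(out_a, a)  # connected
tf.gradients(out_b, b)  # [None] -- out_b was built from a, not from b

This produces exactly the same [None] gradient as the LBN example above, which is consistent with the caching explanation.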

Edit: To make the code work on TF 1.13.1, one has to change weight_shape = (n_in, m) to weight_shape = (n_in.value, m) in lbn.py, line 512.
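The reason is presumably that in TF 1.13, entries of a static shape are tf.Dimension objects rather than plain ints, so the weight shape tuple ends up containing a Dimension. A short illustration:

import tensorflow as tf  # tensorflow-gpu==1.13.1

x = tf.placeholder(tf.float32, (100, 8, 4))
n_in = x.shape[-1]
print(type(n_in))   # tensorflow.python.framework.tensor_shape.Dimension
print(n_in.value)   # 4 (a plain int)

In later TF versions, indexing a shape yields plain ints (or None), which would explain why the released code uses n_in directly.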
