Commit cb1528bd authored by Lucas Miranda

Instantiated a mean and a variance for each component, and the categorical prior as a Dense layer with a softmax activation
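A minimal sketch of what the commit message describes, assuming a mixture prior with `K_COMP` components over a latent space of dimension `LATENT_DIM` (all names and sizes here are hypothetical, shown in plain NumPy rather than the project's Keras code):

```python
import numpy as np

rng = np.random.default_rng(0)
K_COMP, LATENT_DIM, HIDDEN = 4, 2, 8  # hypothetical sizes

# One mean vector and one log-variance vector per mixture component.
means = rng.normal(size=(K_COMP, LATENT_DIM))
log_vars = np.zeros((K_COMP, LATENT_DIM))

# Categorical prior as a dense (affine) layer followed by a softmax,
# mirroring "a Dense layer with a softmax activation".
W = rng.normal(size=(HIDDEN, K_COMP))
b = np.zeros(K_COMP)

def categorical_prior(h):
    logits = h @ W + b
    # Numerically stable softmax over the component axis.
    exp = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)

h = rng.normal(size=(3, HIDDEN))  # a batch of hidden activations
pi = categorical_prior(h)         # mixture weights; each row sums to 1
```

The softmax guarantees each row of `pi` is a valid categorical distribution over the `K_COMP` components, which is the role the Dense + softmax layer plays in the commit.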
parent 7b86681d
@@ -130,7 +130,11 @@ class KLDivergenceLayer(Layer):
     def call(self, inputs, **kwargs):
         mu, log_var = inputs
-        KL_batch = -0.5 * self.beta * K.sum(1 + log_var - K.square(mu) - K.exp(log_var), axis=-1)
+        KL_batch = (
+            -0.5
+            * self.beta
+            * K.sum(1 + log_var - K.square(mu) - K.exp(log_var), axis=-1)
+        )
         self.add_loss(K.mean(KL_batch), inputs=inputs)
         self.add_metric(KL_batch, aggregation="mean", name="kl_divergence")
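The expression in the diff is the closed-form KL divergence between a diagonal Gaussian N(mu, exp(log_var)) and a standard normal, scaled by the layer's `beta` weight. A self-contained NumPy sketch of the same formula (outside Keras; the function name is hypothetical):

```python
import numpy as np

def kl_to_standard_normal(mu, log_var, beta=1.0):
    # Same term as the diff:
    # -0.5 * beta * sum(1 + log_var - mu^2 - exp(log_var)) over the last axis.
    return -0.5 * beta * np.sum(
        1.0 + log_var - np.square(mu) - np.exp(log_var), axis=-1
    )

# The KL is zero exactly when mu = 0 and log_var = 0 (unit variance),
# i.e. when the posterior already equals the standard-normal prior.
mu = np.zeros((2, 3))
log_var = np.zeros((2, 3))
print(kl_to_standard_normal(mu, log_var))  # → [0. 0.]
```

In the Keras layer, `add_loss` contributes the batch mean of this quantity to the training objective, while `add_metric` exposes it for monitoring under the name `kl_divergence`.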