From b5f0dfe5ea105c4f01e2748eb508f37db113c701 Mon Sep 17 00:00:00 2001
From: Andreas Leitherer <leitherer@fhi-berlin.mpg.de>
Date: Thu, 17 Dec 2020 10:52:52 +0100
Subject: [PATCH] Fix picture not appearing due to changed asset folder path

---
 nn_regression.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/nn_regression.ipynb b/nn_regression.ipynb
index 241a967..65fe3da 100644
--- a/nn_regression.ipynb
+++ b/nn_regression.ipynb
@@ -172,7 +172,7 @@
     "\n",
     "The ReLU activation function is most frequently used. Non-linear functions are essential to increase the space of possible (complex) functions that the model can learn. If  no activation function would be used, i.e., the identity - also called *linear activation function*- the class of possible functions that the model can represent would be drastically reduced.\n",
     "\n",
-    "![activation_functions.png](./assets/Neural_network_regression/activation_functions.png)"
+    "![activation_functions.png](./assets/nn_regression/activation_functions.png)"
    ]
   },
   {
-- 
GitLab
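
The markdown cell touched by this hunk contrasts the ReLU with the identity (linear) activation. As a minimal sketch of the two functions mentioned there (the function names and values below are illustrative, not taken from the notebook):

    import numpy as np

    def relu(x):
        # ReLU: element-wise max(0, x); supplies the non-linearity the cell describes.
        return np.maximum(0.0, x)

    def linear(x):
        # Identity ("linear activation"): stacking layers with this activation
        # collapses to a single linear map, hence the reduced class of functions.
        return x

    x = np.linspace(-3.0, 3.0, 7)
    print(relu(x))    # [0. 0. 0. 0. 1. 2. 3.]
    print(linear(x))  # [-3. -2. -1.  0.  1.  2.  3.]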