diff --git a/assets/nn_regression/mlp_more_layers_example.png b/assets/nn_regression/mlp_more_layers_example.png
new file mode 100644
index 0000000000000000000000000000000000000000..d19aee457f3629094645877dff84248f01464fe7
Binary files /dev/null and b/assets/nn_regression/mlp_more_layers_example.png differ
diff --git a/nn_regression.ipynb b/nn_regression.ipynb
index 65fe3da5a49a5fc1ab11bc4a799af1375167e2fa..838dc288e102e928baec9a486aa8f8b9641dac62 100644
--- a/nn_regression.ipynb
+++ b/nn_regression.ipynb
@@ -227,6 +227,10 @@
     "\\mathbf{o} = f^{N} ( W^N f^{N-1} ( W^{N-1} ... f^1 ( W^1 \\mathbf{x}) )\n",
     "\\end{equation*}$\n",
     "\n",
+    "An example for the case of N=3 is shown in the following figure:\n",
+    "\n",
+    "<img src=\"./assets/nn_regression/mlp_more_layers_example.png\" width=\"200\">\n",
+    "\n",
     "\n",
     "Coming now to the choice of activation function in the final layer, in case of classification,\n",
     "the softmax activation function is the usual choice, yielding the following expression for the $j$th component \n",