From 4ec819d18d609d3dd4e298d0375098bf30e7cef6 Mon Sep 17 00:00:00 2001
From: Andreas Leitherer <leitherer@fhi-berlin.mpg.de>
Date: Thu, 17 Dec 2020 11:14:38 +0100
Subject: [PATCH] Correct year of Rosenblatt's perceptron report (1958 -> 1957)

---
 nn_regression.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/nn_regression.ipynb b/nn_regression.ipynb
index 838dc28..0e34b1b 100644
--- a/nn_regression.ipynb
+++ b/nn_regression.ipynb
@@ -119,7 +119,7 @@
    "source": [
     "### 1.1 Biological motivation, the perceptron, and typical activation functions\n",
     "\n",
-    "The origin of *artifical neural networks* (ANNs) dates back to the early 1940's. The most simple form of an ANN is the *perceptron*, which was developed by Frank Rosenblatt in 1958 (the interested reader can find the original report  [here](https://blogs.umass.edu/brain-wars/files/2016/03/rosenblatt-1957.pdf)) and is biologically motivated (see the simplifying sketch of a biological neuron below). \n",
+    "The origin of *artificial neural networks* (ANNs) dates back to the early 1940s. The simplest form of an ANN is the *perceptron*, which was developed by Frank Rosenblatt in 1957 (the interested reader can find the original report [here](https://blogs.umass.edu/brain-wars/files/2016/03/rosenblatt-1957.pdf)) and is biologically motivated (see the simplifying sketch of a biological neuron below). \n",
     "In a perceptron, the output y is computed by taking the linear combination of the input with weights $w_1, w_2$ and *bias* b, yielding the value z (see right part of below figure) to which a non-linear function f (the *activation function*) is applied at the end.\n",
     "\n",
     "\n",
-- 
GitLab
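
The perceptron computation described in the patched cell (a linear combination of the inputs with weights $w_1, w_2$ and bias $b$, followed by a non-linear activation $f$) can be sketched as follows. The weight and input values and the choice of `np.tanh` as activation are illustrative assumptions, not taken from the notebook:

```python
import numpy as np

def perceptron(x, w, b, f=np.tanh):
    """Single perceptron: weighted sum of inputs plus bias, then activation f."""
    z = np.dot(w, x) + b  # linear combination z = w1*x1 + w2*x2 + b
    return f(z)           # non-linear activation applied to z

# Two-input example matching the sketch referred to in the notebook cell
x = np.array([0.5, -1.0])  # inputs (illustrative values)
w = np.array([0.8, 0.2])   # weights w1, w2 (illustrative values)
b = 0.1                    # bias
y = perceptron(x, w, b)    # z = 0.4 - 0.2 + 0.1 = 0.3, y = tanh(0.3)
```

Rosenblatt's original perceptron used a step function as $f$; smooth activations such as `tanh` are the common modern choice because they are differentiable and hence trainable by gradient descent.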