diff --git a/nn_regression.ipynb b/nn_regression.ipynb
index 838dc288e102e928baec9a486aa8f8b9641dac62..0e34b1b8da4c8ffc5a219315b6f64895c09d0341 100644
--- a/nn_regression.ipynb
+++ b/nn_regression.ipynb
@@ -119,7 +119,7 @@
    "source": [
     "### 1.1 Biological motivation, the perceptron, and typical activation functions\n",
     "\n",
-    "The origin of *artifical neural networks* (ANNs) dates back to the early 1940's. The most simple form of an ANN is the *perceptron*, which was developed by Frank Rosenblatt in 1958 (the interested reader can find the original report [here](https://blogs.umass.edu/brain-wars/files/2016/03/rosenblatt-1957.pdf)) and is biologically motivated (see the simplifying sketch of a biological neuron below). \n",
+    "The origin of *artificial neural networks* (ANNs) dates back to the early 1940s. The simplest form of an ANN is the *perceptron*, which was developed by Frank Rosenblatt in 1957 (the interested reader can find the original report [here](https://blogs.umass.edu/brain-wars/files/2016/03/rosenblatt-1957.pdf)) and is biologically motivated (see the simplifying sketch of a biological neuron below). \n",
     "In a perceptron, the output y is computed by taking the linear combination of the input with weights $w_1, w_2$ and *bias* b, yielding the value z (see right part of below figure) to which a non-linear function f (the *activation function*) is applied at the end.\n",
     "\n",
     "\n",
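The edited cell describes the perceptron's forward pass: a linear combination $z = w_1 x_1 + w_2 x_2 + b$ followed by a non-linear activation $f$. A minimal NumPy sketch of that computation (function and variable names here are illustrative, not taken from the notebook):

```python
import numpy as np

def perceptron(x, w, b, f=np.tanh):
    """Forward pass of a single perceptron.

    z is the linear combination of inputs and weights plus the bias;
    f is the activation function (tanh chosen here as an example).
    """
    z = np.dot(w, x) + b
    return f(z)

# Example: two inputs, two weights, one bias
y = perceptron(x=np.array([0.5, -1.0]), w=np.array([0.3, 0.8]), b=0.1)
```

With the identity as activation the function reduces to the plain linear combination $z$; swapping in a step function recovers Rosenblatt's original binary-output perceptron.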