diff --git a/notebook/compressed_sensing.ipynb b/notebook/compressed_sensing.ipynb
index 677a146756ffdcf929ba744e72eeb3339e60502b..d97936314644e7df1388b0e2a64028d8011e8a5d 100644
--- a/notebook/compressed_sensing.ipynb
+++ b/notebook/compressed_sensing.ipynb
@@ -4,9 +4,9 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# Symbolic regression via compressed sensing: a tutorial\n",
+    "# Symbolic regression: A tutorial\n",
     "\n",
-    "___Emre Ahmetcik,___ _Angelo Ziletti, Runhai Ouyang, Sbailò Luigi, Matthias Scheffler, Luca M. Ghiringhelli \\<luca@fhi-berlin.mpg.de\\>_\n",
+    "__Tom Purcell__, ___Emre Ahmetcik,___ _Angelo Ziletti, Runhai Ouyang, Sbailò Luigi, Matthias Scheffler, Luca M. Ghiringhelli \\<luca@fhi-berlin.mpg.de\\>_\n",
     "\n",
     "<img src=\"assets/logo_MPG.png\" width=\"80px\">\n",
     "<img src=\"assets/logo_NOMAD.png\" width=\"80px\">\n",
@@ -1027,7 +1027,11 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# PySR"
+    "# Symbolic Regression Through Genetic Programming\n",
+    "\n",
+    "It is useful to compare the results of the previous workflows against other methods for performing symbolic regression, in this case genetic programming. Historically, genetic programming has been the most popular tool for symbolic regression; it applies genetic operations to the binary expression trees of candidate formulas to generate new expressions. PySR is a recent package implementing this approach, using a Python frontend and a Julia backend to find expressions efficiently. For more details of the approach see: https://github.com/MilesCranmer/PySR\n",
+    "\n",
+    "In this example we run a \"deterministic\" version of the code by turning off all parallelism and setting the seed via the `procs`, `multithreading`, `random_state`, and `deterministic` keywords. While this slows down the run time, it makes the results reproducible.\n"
    ]
   },
   {
@@ -1118,7 +1122,9 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# FFX"
+    "# FFX: Fast Function Extraction\n",
+    "\n",
+    "FFX was the first widely popularized attempt to remove genetic programming from symbolic regression. The approach builds up a set of basis functions by raising each input variable to a group of powers and applying a predetermined set of unary and binary operators to them. It then uses an `ElasticNet` to find the best solution for a given number of basis functions.\n"
    ]
   },
   {
@@ -1177,13 +1183,6 @@
     "plt.ylabel(\"Count\")\n",
     "plt.show()"
    ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": []
   }
  ],
  "metadata": {
@@ -1202,7 +1201,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.9.16"
+   "version": "3.9.12"
   }
  },
 "nbformat": 4,
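
For reference, a minimal sketch of the deterministic PySR setup described in the new markdown cell. The data, `niterations`, and operator lists below are illustrative assumptions, not taken from the notebook; only the `procs`, `multithreading`, `random_state`, and `deterministic` keywords come from the cell itself.

```python
import numpy as np
from pysr import PySRRegressor

# Illustrative toy data; the notebook's actual feature matrix and target differ.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(100, 2))
y = X[:, 0] ** 2 - np.abs(X[:, 1])

# Deterministic configuration: no parallelism and a fixed seed,
# as described in the markdown cell.
model = PySRRegressor(
    niterations=20,                     # illustrative value
    binary_operators=["+", "-", "*"],   # illustrative operator set
    unary_operators=["abs", "square"],  # illustrative operator set
    procs=0,                 # no worker processes
    multithreading=False,    # single-threaded backend
    random_state=0,          # fixed seed
    deterministic=True,      # reproducible search
)
model.fit(X, y)
print(model.get_best())
```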
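
The FFX cell is easier to follow with the basis-expansion idea spelled out. The sketch below is not the FFX package's API; it only illustrates, with scikit-learn's `ElasticNetCV`, the kind of power/product basis library plus sparse linear fit that the new markdown cell describes. Feature names, powers, and the toy target are assumptions.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV

# Illustrative toy data with named features (assumed, not from the notebook).
rng = np.random.default_rng(0)
x1, x2 = rng.uniform(0.5, 2.0, size=(2, 200))
y = 1.5 * x1 ** 2 / x2 + 0.3

# Build a small library of basis functions: powers of each variable and
# pairwise products of those powers, in the spirit of FFX's basis construction.
powers = [-1.0, 0.5, 1.0, 2.0]
basis, names = [], []
for name, x in (("x1", x1), ("x2", x2)):
    for p in powers:
        basis.append(x ** p)
        names.append(f"{name}^{p}")
n_uni = len(basis)  # snapshot before adding product terms
for i in range(n_uni):
    for j in range(i + 1, n_uni):
        basis.append(basis[i] * basis[j])
        names.append(f"({names[i]})*({names[j]})")
A = np.column_stack(basis)

# Sparse linear fit over the basis library; nonzero coefficients select terms.
model = ElasticNetCV(l1_ratio=0.9, cv=5).fit(A, y)
for coef, name in zip(model.coef_, names):
    if abs(coef) > 1e-3:
        print(f"{coef:+.3f} * {name}")
print(f"intercept: {model.intercept_:+.3f}")
```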