diff --git a/.gitlab-ci.yml b/.gitlab-ci.yml
index deac156f7e0e578edd688a132fc5d25165341876..a728988ed5ff55fa67feaa7fe0fdb57daf315f1c 100644
--- a/.gitlab-ci.yml
+++ b/.gitlab-ci.yml
@@ -351,7 +351,9 @@ build-gnu-gcov:
 pages:
   stage: doc_builds
   script:
-    - source cpp_sisso_gnu_py_env/bin/activate
+    - source cpp_sisso_gnu_param_py_env/bin/activate
+    - export LD_LIBRARY_PATH=$HOME/intel/oneapi/intelpython/latest/lib/:$HOME/intel/oneapi/intelpython/latest/lib/python3.7:$LD_LIBRARY_PATH
+    - export PYTHONPATH=$HOME/intel/oneapi/intelpython/latest/lib/python3.7/site-packages/:cpp_sisso_gnu_param_py_env/lib/python3.7/site-packages/
     - cd docs/
     - make html
     - mv _build/html/ ../public
diff --git a/docs/development/Credits.md b/docs/development/Credits.md
index 5c5626c328ca7ab784d21796897bb85ca596f8ab..97c6cc934d3339e40c43c2d0e80ec153cb79a01d 100644
--- a/docs/development/Credits.md
+++ b/docs/development/Credits.md
@@ -1,5 +1,4 @@
-Acknowledgements
----
+# Acknowledgements
 
 `SISSO++` would not be possible without the following packages:
 
@@ -9,7 +8,7 @@ Acknowledgements
 - [googletest](https://github.com/google/googletest)
 - [NLopt](http://github.com/stevengj/nlopt)
 
-# How to cite these packages:
+## How to cite these packages:
 
 Please make sure to give credit to the right people when using `SISSO++`:
 For classification problems cite:
diff --git a/docs/index.rst b/docs/index.rst
index bcbfd93baf9268ebd6179831da14a5b9d94bd988..bfab53a7ae3ee17b6a65e8bc01f43407dc875ae2 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -12,19 +12,12 @@ SISSO++
 This package provides a C++ implementation of SISSO with built-in Python bindings for an efficient python interface.
 Future work will expand the python interface to include more postprocessing analysis tools.
 
-Indices
-=======
-
-* :ref:`genindex`
-* :ref:`search`
-
 Table of Contents
 ^^^^^^^^^^^^^^^^^
 
 .. toctree::
    :maxdepth: 2
 
-   self
    quick_start/QuickStart
    tutorial/tutorial
    cpp_api/cpp_api
diff --git a/docs/quick_start/QuickStart.rst b/docs/quick_start/QuickStart.rst
index 393a3952edb07f2b6d8fc363e2a3d50ed3636744..0eadf44141e79089e32488cdfb154692c5d09c11 100644
--- a/docs/quick_start/QuickStart.rst
+++ b/docs/quick_start/QuickStart.rst
@@ -1,6 +1,6 @@
 .. _quick_start:
 
-Quick Start Guide
+Quick-Start Guide
 =================
 .. toctree::
     :maxdepth: 2
diff --git a/docs/quick_start/code_ref.md b/docs/quick_start/code_ref.md
index cde3e05cbdf4ab8a0eb0b1f6f4f9bc298b2319ed..50b60f332bb1a9a2718e6da729f9c27f8856891c 100644
--- a/docs/quick_start/code_ref.md
+++ b/docs/quick_start/code_ref.md
@@ -31,7 +31,7 @@ A list containing the set of all operators that will be used during the feature
 
 #### `param_opset`
 
-A list containing the set of all operators, for which the non-linear scale and bias terms will be optimized, that will be used during the feature creation step of SISSO. (If empty none of the available features)
+A list containing the set of all operators, for which the non-linear scale and bias terms will be optimized, that will be used during the feature creation step of SISSO. (If empty, none of the operators are parameterized.)
 
 #### `calc_type`
 
@@ -39,15 +39,15 @@ The type of calculation to run either regression, log regression, or classificat
 
 #### `desc_dim`
 
-The maximum dimension of the model to be created
+The maximum dimension of the model to be created (no default value)
 
 #### `n_sis_select`
 
-The number of features that SIS selects over each iteration
+The number of features that SIS selects over each iteration (no default value)
 
 #### `max_rung`
 
-The maximum rung of the feature (height of the tallest possible binary expression tree - 1)
+The maximum rung of the feature (height of the tallest possible binary expression tree - 1) (no default value)
 
 #### `n_residual`
 
diff --git a/docs/tutorial/0_intro.md b/docs/tutorial/0_intro.md
index 852b98a8b3a890470f6da2a395b3b0e2a89f6ae7..ee76cd06d75fdb21b9751352b0707c1ba0e5c6d9 100644
--- a/docs/tutorial/0_intro.md
+++ b/docs/tutorial/0_intro.md
@@ -2,17 +2,17 @@
 
 This tutorial is based on the [Predicting energy differences between crystal structures: (Meta-)stability of octet-binary compounds](https://analytics-toolkit.nomad-coe.eu/public/user-redirect/notebooks/tutorials/descriptor_role.ipynb) tutorial created by Mohammad-Yasin Arif, Luigi Sbailò, Thomas A. R. Purcell, Luca M. Ghiringhelli, and Matthias Scheffler.
 The goal of the tutorial is to teach a user how to use `SISSO++` to find and analyze quantitative models for materials properties.
-In particular we will use SISSO to predict the crystal structure (rock salt or zincblende) of a series of octet binaries.
+In particular we will use SISSO to predict the crystal structure (rock-salt or zinc-blende) of a series of octet binaries.
 The tutorial will be split into three parts: 1) how to use the executable to perform the calculations and the python utilities to analyze the results, 2) how to use only python to run, analyze, and present the results, and 3) how to perform classification with SISSO.
 
 ## Outline
 The following tutorials are available:
 
-- [Combined Binary and Python](1_combined.md)
-- [Python only](2_python.md)
+- [Using the Command Line Interface](1_command_line.md)
+- [Using the Python Interface](2_python.md)
 - [Classification](3_classification.md)
 
-All tutorials use the octet binary dataset first described in [PRL-2015](http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.114.10550) with the goal of predicting whether a material will crystallize in a rock salt or zincblende phase.
+All tutorials use the octet binary dataset first described in [PRL-2015](http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.114.10550) with the goal of predicting whether a material will crystallize in a rock-salt or zinc-blende phase.
 For all applications of SISSO a data set has to be passed via a standard `csv` file, where the first row contains the feature and property labels and the first column contains the index label for each sample, for example
 ```
 Material, energy_diff (eV), rs_A (AA), rs_B (AA), E_HOMO_A (eV), E_HOMO_B (eV),....
diff --git a/docs/tutorial/1_command_line.md b/docs/tutorial/1_command_line.md
index c82de07775fba6145c5dd12cd7e466b1acccbc2e..8e0890723e9dcaa77789399bec4d07fb49b8d5d9 100644
--- a/docs/tutorial/1_command_line.md
+++ b/docs/tutorial/1_command_line.md
@@ -64,12 +64,11 @@ The standard output provides information about what step the calculation just fi
 When all calculations are complete the code prints out a summary of the best 1D, 2D, ..., {desc_dim}D models with their training/testing RMSE (training only if no test set is provided).
 Additionally, two output files are stored in `feature_space/`: `SIS_summary.txt` and `selected_features.txt`.
 These files are a human-readable (`SIS_summary.txt`) and a computer-readable (`selected_features.txt`) summary of the selected feature space from SIS.
-Below are reconstructions of both files for this calculation
+Below are reconstructions of both files for this calculation (to see a file, click the triangle).
 
 <details>
     <summary>feature_space/SIS_summary.txt</summary>
 
-    ```text
     # FEAT_ID     Score                   Feature Expression
     0             0.920868624862486329    ((E_HOMO_B / r_p_A) / (r_sigma + r_p_B))
     1             0.919657911026942054    ((|r_pi - r_s_A|) / (r_s_A^3))
@@ -115,12 +114,10 @@ Below are reconstructions of both files for this calculation
     38            0.262777418218664849    ((E_LUMO_A^6) / (r_p_B^3))
     39            0.253659279222423484    ((E_LUMO_A / r_p_B) * (E_LUMO_B * E_LUMO_A))
     #-----------------------------------------------------------------------
-    ```
 </details>
 <details>
     <summary>feature_space/selected_features.txt</summary>
 
-    ```text
     # FEAT_ID     Feature Postfix Expression (RPN)
     0             9|14|div|18|15|add|div
     1             19|12|abd|12|cb|div
@@ -166,7 +163,7 @@ Below are reconstructions of both files for this calculation
     38            10|sp|15|cb|div
     39            10|15|div|11|10|mult|mult
     #-----------------------------------------------------------------------
-    ```
+
 </details>
 In both files a change in rung is indicated by the commented-out dashed (--) line.
 
@@ -183,7 +180,6 @@ An example of these files is provided here:
 <details>
     <summary>models/train_dim_2_model_0.dat</summary>
 
-    ```csv
     # c0 + a0 * ((EA_B - IP_A) * (|r_sigma - r_s_B|)) + a1 * ((E_HOMO_B / r_p_A) / (r_sigma + r_p_B))
     # Property Label: $E_{RS} - E_{ZB}$; Unit of the Property: eV
     # RMSE: 0.0931540779192557; Max AE: 0.356632500670745
@@ -281,19 +277,19 @@ An example of these files is provided here:
     SeZn        ,  2.631368992806530e-01,  2.463580576975095e-01,  7.384497385908948e-01, -2.320488278555971e+00
     TeZn        ,  2.450012951740060e-01,  1.776248032825628e-01,  2.763715059556858e+00, -2.304848319397327e+00
 
-    ```
 </details>
 
+
+
 ## Determining the Ideal Model Complexity with Cross-Validation
 While the training error always decreases with descriptor dimensionality for a given application, over-fitting can reduce the general applicability of the models outside of the training set.
 In order to determine the optimal dimensionality of a model and optimize the hyperparameters associated with SISSO, we need to perform cross-validation.
-As an example we will discuss how to perform leave-out 10% using the command line
+As an example, we will discuss how to perform leave-out-10% cross-validation using the command line.
 To do this we have to modify the `sisso.json` file to automatically leave out a random sample of the training data and use that as a test set by changing `"leave_out_frac": 0.0,` to `"leave_out_frac": 0.10,`.
 
 <details>
     <summary> updated sisso.json file</summary>
 
-    ```json
     {
         "data_file": "data.csv",
         "property_key": "E_RS - E_ZB",
@@ -309,7 +305,7 @@ To do this we have to modify the `sisso.json` file to automatically leave out a
         "leave_out_inds": [],
         "opset": ["add", "sub", "abs_diff", "mult", "div", "inv", "abs", "exp", "log", "sin", "cos", "sq", "cb", "six_pow", "sqrt", "cbrt", "neg_exp"]
     }
-    ```
+
 </details>
 
 Now let's make ten cross-validation directories in the working directory, copy `data.csv` and `sisso.json` into each of them, and run a separate calculation in each, as sketched below.
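+
+A minimal sketch of this setup in python is below; it assumes the `sisso++` executable is on your `PATH` and that `data.csv` and `sisso.json` are in the current directory:
+
+```python
+import shutil
+import subprocess
+from pathlib import Path
+
+# Create ten cross-validation directories, copy the inputs into each, and run sisso++
+for cv in range(10):
+    work_dir = Path(f"cv_{cv}")
+    work_dir.mkdir(exist_ok=True)
+    shutil.copy("data.csv", work_dir)
+    shutil.copy("sisso.json", work_dir)
+    subprocess.run(["sisso++", "sisso.json"], cwd=work_dir, check=True)
+```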
@@ -337,7 +333,6 @@ A full example of the testing set output file is reproduced below:
 <details>
     <summary>The test data file cv_0/models/test_dim_2_model_0.dat</summary>
 
-    ```csv
     # c0 + a0 * ((E_HOMO_B / r_p_A) / (r_sigma + r_p_B))
     # Property Label: $E_{RS} - E_{ZB}$; Unit of the Property: eV
     # RMSE: 0.212994478440008; Max AE: 0.442277221520276
@@ -361,7 +356,6 @@ A full example of the testing set output file is reproduced below:
     BrNa        , -1.264287278827400e-01, -1.888626375989341e-01, -8.644123624074346e-01
     CSi         ,  6.690237272359810e-01,  3.948280949265375e-01, -3.351692484156472e+00
 
-    ```
 </details>
 
 ## Analyzing the Results with python
@@ -386,7 +380,9 @@ To visualize these results we will also use `plot_validation_rmse` at the end, a
 Here is an example of the `plot_validation_rmse` output:
 <details>
     <summary> Cross-Validation results </summary>
-![image](command_line/cv/cv_10_error.png)
+
+![image](./command_line/cv/cv_10_error.png)
+
 </details>
 
 These initial results, particularly the high standard error of the mean for the 1D and 3D models, indicate that more cross-validation samples are needed (note: your values will differ because the random samples differ), so let's increase the total number of samples to 100 and redo the analysis, aggregating the errors as sketched below.
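+
+As an illustration of what this analysis aggregates (a sketch, not the `plot_validation_rmse` implementation itself), the snippet below collects the test RMSE of the 2D models from every cross-validation directory using the `# RMSE:` header line shown in the test file reproduced above:
+
+```python
+import glob
+import numpy as np
+
+# Gather the validation RMSE of the 2D model from each cv_* directory
+rmse = []
+for fn in glob.glob("cv_*/models/test_dim_2_model_0.dat"):
+    with open(fn) as f:
+        for line in f:
+            if line.startswith("# RMSE:"):
+                # Header format: "# RMSE: <value>; Max AE: <value>"
+                rmse.append(float(line.split(";")[0].split(":")[1]))
+                break
+
+print(f"2D validation RMSE: {np.mean(rmse):.4f} +/- {np.std(rmse) / np.sqrt(len(rmse)):.4f} (SEM)")
+```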
@@ -415,7 +411,8 @@ As can be seen from the standard error measurements the results are now reasonab
 <details>
     <summary> Converged cross-validation results </summary>
 
-![image](combined/cv/cv_100_error.png)
+![image](./command_line/cv/cv_100_error.png)
+
 </details>
 
 Because the validation errors for the three- and four-dimensional models are within each other's error bars, and the standard error increases when going to the fourth dimension, we conclude that the three-dimensional model has the ideal complexity.
@@ -431,11 +428,14 @@ To see the distributions for this system we run
 <details>
 <summary> Distribution of Errors </summary>
 
-![image](./combined/error_cv.png)
+![image](./command_line/cv/error_cv_dist.png)
+
 </details>
+
 One thing that stands out in the plot is the large error of a single point in both the one- and two-dimensional models.
-By looking at the validation errors, we find that the point with the largest error is diamond for all model dimensions, which is by far the most stable zincblende structure in the data set.
+By looking at the validation errors, we find that the point with the largest error is diamond for all model dimensions, which is by far the most stable zinc-blende structure in the data set.
 Note that for this setup there is a 0.22\% chance that a given sample never appears in the validation set, so if `max_error_ind != 21`, check whether that sample is in any of the validation sets.
+
 ```python
 >>> import numpy as np
 >>> import pandas as pd
@@ -585,12 +585,66 @@ To get the final models we will perform the same calculation we started off the
 From here we can use `models/train_dim_3_model_0.dat` for all of the analysis.
 In order to generate a machine learning plot for this model in matplotlib, run the following in python
 ```python
->>> from sissopp.postprocess.plot.parity_plot import plot_model_ml_plot_from_file
->>> plot_model_ml_plot_from_file("models/train_dim_3_model_0.dat", filename="3d_model.pdf").show()
+>>> from sissopp.postprocess.plot.parity_plot import plot_model_parity_plot
+>>> plot_model_parity_plot("models/train_dim_3_model_0.dat", filename="3d_model.pdf").show()
 ```
 The result of which is shown below:
 <details>
 <summary> Final 3D model </summary>
 
-![image](./combined/3d_model.png)
+![image](./command_line/cv/3d_model.png)
+</details>
+
+Additionally, you can output the model as a Matlab function or a LaTeX string using the following commands.
+```python
+>>> from sissopp.postprocess.load_models import load_model
+>>> model = load_model("models/train_dim_3_model_0.dat")
+>>> print(model.latex_str)
+
+>>> model.write_matlab_fxn("matlab_fxn/model.m")
+```
+
+A copy of the generated Matlab function is shown below.
+<details>
+<summary> Matlab function of the Final 3D model </summary>
+
+
+    function P = model(X)
+    % Returns the value of E_{RS} - E_{ZB} = c0 + a0 * ((r_d_B / r_d_A) * (r_p_B * E_HOMO_A)) + a1 * ((IP_A^3) * (|r_sigma - r_s_B|)) + a2 * ((IP_A / r_p_A) / (r_p_B + r_p_A))
+    %
+    % X = [
+    %     r_d_B,
+    %     r_d_A,
+    %     r_p_B,
+    %     E_HOMO_A,
+    %     IP_A,
+    %     r_sigma,
+    %     r_s_B,
+    %     r_p_A,
+    % ]
+
+    if(size(X, 2) ~= 8)
+        error("ERROR: X must have a size of 8 in the second dimension.")
+    end
+    r_d_B    = reshape(X(:, 1), 1, []);
+    r_d_A    = reshape(X(:, 2), 1, []);
+    r_p_B    = reshape(X(:, 3), 1, []);
+    E_HOMO_A = reshape(X(:, 4), 1, []);
+    IP_A     = reshape(X(:, 5), 1, []);
+    r_sigma  = reshape(X(:, 6), 1, []);
+    r_s_B    = reshape(X(:, 7), 1, []);
+    r_p_A    = reshape(X(:, 8), 1, []);
+
+    f0 = ((r_d_B ./ r_d_A) .* (r_p_B .* E_HOMO_A));
+    f1 = ((IP_A).^3 .* abs(r_sigma - r_s_B));
+    f2 = ((IP_A ./ r_p_A) ./ (r_p_B + r_p_A));
+
+    c0 = -1.3509197357e-01;
+    a0 = 2.8311062079e-02;
+    a1 = 3.7282871777e-04;
+    a2 = -2.3703222974e-01;
+
+    P = reshape(c0 + a0 * f0 + a1 * f1 + a2 * f2, [], 1);
+    end
+
 </details>
diff --git a/docs/tutorial/3_classification.md b/docs/tutorial/3_classification.md
index 1ff724f57d7dfb3e5e7cc4710f0c4b31ebd8ce57..cdfe924041825bfed776211273d881a257c1379e 100644
--- a/docs/tutorial/3_classification.md
+++ b/docs/tutorial/3_classification.md
@@ -1,15 +1,15 @@
-Performing Classification with SISSO++
----
-Finally `SISSO++` can be used to solve classification problems as well as regression problems.
-As an example of this we will adapt the previous example by replacing the property with the identifier of if the material favors the rock-salt or zincblende structure, and change the calculation type to be `classification`.
+# Performing Classification with SISSO++
+
+Finally, besides regression problems, `SISSO++` can be used to solve classification problems.
+As an example we will adapt the previous example by replacing the property with an indicator of whether the material favors the rock-salt or zinc-blende structure, and changing the calculation type to `classification`.
 It is important to note that while this problem only has two classes, multi-class classification is also possible.
 
 ## The Data File
-Here is the updated data file, with the property `E_RS - E_ZB (eV)` replaced with a `Class` column where any negative `E_RS - E_ZB (eV)` is replaced with 0 and any positive value replaced with 1.
+Here is the updated data file, with the property `E_RS - E_ZB (eV)` replaced with a `Class` column where any negative `E_RS - E_ZB (eV)` is replaced with 0 and any positive value replaced with 1. While this example has only one task and two classes, the method works for an arbitrary number of classes and tasks.
+
 <details>
     <summary>Here is the full data_class.csv file for the calculation</summary>
 
-    ```
     # Material,Class,Z_A (nuc_charge) ,Z_B (nuc_charge) ,period_A,period_B,IP_A (eV_IP) ,IP_B (eV_IP) ,EA_A (eV_IP),EA_B (eV_IP) ,E_HOMO_A (eV) ,E_HOMO_B (eV) ,E_LUMO_A (eV),E_LUMO_B (eV) ,r_s_A ,r_s_B ,r_p_A ,r_p_B ,r_d_A ,r_d_B,r_sigma ,r_pi
     AgBr,0,47,35,5,4,-8.0580997467,-12.649600029,-1.66659998894,-3.73930001259,-4.71000003815,-8.00100040436,-0.479000002146,0.708000004292,1.32000005245,0.75,1.87999999523,0.879999995232,2.97000002861,1.87000000477,1.570000052448,0.689999938012
     AgCl,0,47,17,5,3,-8.0580997467,-13.9018001556,-1.66659998894,-3.97079992294,-4.71000003815,-8.69999980927,-0.479000002146,0.574000000954,1.32000005245,0.680000007153,1.87999999523,0.759999990463,2.97000002861,1.66999995708,1.760000050064,0.63999992609
@@ -93,7 +93,7 @@ Here is the updated data file, with the property `E_RS - E_ZB (eV)` replaced wit
     SZn,1,30,16,4,3,-10.1354999542,-11.7951002121,1.08070003986,-2.84489989281,-6.21700000763,-7.10599994659,-1.19400000572,0.64200001955,1.10000002384,0.740000009537,1.54999995232,0.850000023842,2.25,2.36999988556,1.059999942781,0.559999942785
     SeZn,1,30,34,4,4,-10.1354999542,-10.9460000992,1.08070003986,-2.75099992752,-6.21700000763,-6.65399980545,-1.19400000572,1.31599998474,1.10000002384,0.800000011921,1.54999995232,0.949999988079,2.25,2.18000006676,0.89999997616,0.599999904638
     TeZn,1,30,52,4,5,-10.1354999542,-9.86670017242,1.08070003986,-2.66599988937,-6.21700000763,-6.10900020599,-1.19400000572,0.0989999994636,1.10000002384,0.939999997616,1.54999995232,1.13999998569,2.25,1.83000004292,0.569999992854,0.649999916554
-    ```
+
 </details>
 
 ## Running `SISSO++` for Classification problems
@@ -147,7 +147,6 @@ The two output files stored in `feature_space/` are also very similar, with the
 <details>
     <summary>feature_space/SIS_summary.txt</summary>
 
-    ```text
     # FEAT_ID     Score                   Feature Expression
     0             2.00218777423865069     (r_sigma + r_p_B)
     1             2.0108802733799549      (r_pi - r_p_A)
@@ -191,7 +190,8 @@ The two output files stored in `feature_space/` are also very similar, with the
     38            -0.999999633027590651   (period_A * Z_B)
     39            -0.999999625788926316   (Z_B / EA_A)
     #-----------------------------------------------------------------------
-    ```
+
+</details>
 
 Additionally the model files change to better represent the classifier.
 The largest changes are in the header, where the coefficients now represent the linear decision boundaries calculated using support-vector machines (SVM).
@@ -199,7 +199,6 @@ The estimated property vector in this case refers to the predicted class from SV
 <details>
     <summary>models/train_dim_2_model_0.dat</summary>
 
-    ```csv
     # [(EA_B * Z_A), (r_sigma + r_p_B)]
     # Property Label: $$Class$$; Unit of the Property: Unitless
     # # Samples in Convex Hull Overlap Region: 0;# Samples SVM Misclassified: 0
@@ -297,10 +296,9 @@ The estimated property vector in this case refers to the predicted class from SV
     SeZn        ,  1.000000000000000e+00,  1.000000000000000e+00, -8.252999782560001e+01,  1.849999964239000e+00
     TeZn        ,  1.000000000000000e+00,  1.000000000000000e+00, -7.997999668110000e+01,  1.709999978544000e+00
 
-    ```
 </details>
 
-## Updating the SVM Model the Python Interface
+## Updating the SVM Model Using `sklearn`
 Because the classification algorithm is based on the overlap region of the convex hull, the `c` value for the SVM model is set to a fairly high value of 1000.0.
 This prioritizes reducing the number of misclassified points, but makes the model more susceptible to over-fitting.
 To account for this, the python interface has the ability to refit the linear SVM using the `svm` module of `sklearn`, as sketched below.
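+
+As a sketch of what such a refit looks like with `sklearn` directly (the feature matrix `X` and class labels `y` here are assumed to have been evaluated and saved separately; the file names are hypothetical):
+
+```python
+import numpy as np
+from sklearn.svm import LinearSVC
+
+# Hypothetical inputs: X holds the values of the two selected features for each
+# material, y holds the class labels (0 or 1) from the Class column of data_class.csv
+X = np.loadtxt("feature_values.csv", delimiter=",")
+y = np.loadtxt("class_labels.csv", delimiter=",")
+
+# Refit with a smaller C than the 1000.0 used internally by SISSO++ to reduce over-fitting
+clf = LinearSVC(C=1.0)
+clf.fit(X, y)
+print(clf.coef_, clf.intercept_)
+```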
@@ -331,11 +329,14 @@ These changes are a result of different SVM libraries leading to slightly differ
 <summary> `SISSO++` Classification </summary>
 
 ![image](./classification/sissopp.png)
+
 </details>
+
 <details>
 <summary> sklearn SVM </summary>
 
 ![image](./classification/c_1000.png)
+
 </details>
 However, as we decrease the value of `c`, an increasing number of points becomes misclassified, suggesting the model is potentially over-fitting the data.
 
diff --git a/docs/tutorial/command_line/cv/3d_model.png b/docs/tutorial/command_line/cv/3d_model.png
new file mode 100644
index 0000000000000000000000000000000000000000..c9eb59412557ebad45d65d0e4b8ae438fdd3b4c1
Binary files /dev/null and b/docs/tutorial/command_line/cv/3d_model.png differ
diff --git a/docs/tutorial/command_line/cv/cv_100_error.png b/docs/tutorial/command_line/cv/cv_100_error.png
new file mode 100644
index 0000000000000000000000000000000000000000..2e7271503dadf66ccd096c55697ab812e097175d
Binary files /dev/null and b/docs/tutorial/command_line/cv/cv_100_error.png differ
diff --git a/docs/tutorial/command_line/cv/cv_10_error.png b/docs/tutorial/command_line/cv/cv_10_error.png
new file mode 100644
index 0000000000000000000000000000000000000000..90379f77e0075197d889253f86402484e5f8249f
Binary files /dev/null and b/docs/tutorial/command_line/cv/cv_10_error.png differ
diff --git a/docs/tutorial/command_line/cv/error_cv_dist.png b/docs/tutorial/command_line/cv/error_cv_dist.png
new file mode 100644
index 0000000000000000000000000000000000000000..a93be2fccfdc369706b18f4bb445f8883e3d585a
Binary files /dev/null and b/docs/tutorial/command_line/cv/error_cv_dist.png differ
diff --git a/src/CMakeLists.txt b/src/CMakeLists.txt
index 793780f2b261d527d4b32590ad2c321c84b57069..fe0a33642609e3da3af624f391af0e63ea92516f 100644
--- a/src/CMakeLists.txt
+++ b/src/CMakeLists.txt
@@ -84,8 +84,16 @@ add_test(
     COMMAND ${MPIEXEC_EXECUTABLE} ${MPIEXEC_NUMPROC_FLAG} 2 ${MPIEXEC_PREFLAGS} "${CMAKE_BINARY_DIR}/bin/sisso++" "${CMAKE_SOURCE_DIR}/tests/exec_test/log_reg/sisso.json" ${MPIEXEC_POSTFLAGS}
 )
 add_test(
-    NAME Train_Only
-    COMMAND ${MPIEXEC_EXECUTABLE} ${MPIEXEC_NUMPROC_FLAG} 2 ${MPIEXEC_PREFLAGS} "${CMAKE_BINARY_DIR}/bin/sisso++" "${CMAKE_SOURCE_DIR}/tests/exec_test/no_test_data/sisso.json" ${MPIEXEC_POSTFLAGS}
+    NAME Log_Regression_Generate_Project
+    COMMAND ${MPIEXEC_EXECUTABLE} ${MPIEXEC_NUMPROC_FLAG} 2 ${MPIEXEC_PREFLAGS} "${CMAKE_BINARY_DIR}/bin/sisso++" "${CMAKE_SOURCE_DIR}/tests/exec_test/log_reg_gen_proj/sisso.json" ${MPIEXEC_POSTFLAGS}
+)
+add_test(
+    NAME Log_Regression_Max_Correlation_NE_One
+    COMMAND ${MPIEXEC_EXECUTABLE} ${MPIEXEC_NUMPROC_FLAG} 2 ${MPIEXEC_PREFLAGS} "${CMAKE_BINARY_DIR}/bin/sisso++" "${CMAKE_SOURCE_DIR}/tests/exec_test/log_reg_max_corr/sisso.json" ${MPIEXEC_POSTFLAGS}
+)
+add_test(
+    NAME Log_Regression_Max_Correlation_NE_One_Generate_Project
+    COMMAND ${MPIEXEC_EXECUTABLE} ${MPIEXEC_NUMPROC_FLAG} 2 ${MPIEXEC_PREFLAGS} "${CMAKE_BINARY_DIR}/bin/sisso++" "${CMAKE_SOURCE_DIR}/tests/exec_test/log_reg_max_corr_gen_proj/sisso.json" ${MPIEXEC_POSTFLAGS}
 )
 if(BUILD_PARAMS)
     add_test(
diff --git a/src/classification/ConvexHull1D.cpp b/src/classification/ConvexHull1D.cpp
index 49c8c1a64989bd7c3c5cb9efee3803915ebb13c5..67623e00dc6467fcf2c6f13d4a68813e2cce5746 100644
--- a/src/classification/ConvexHull1D.cpp
+++ b/src/classification/ConvexHull1D.cpp
@@ -32,30 +32,27 @@ ConvexHull1D::ConvexHull1D() :
     _n_class(0)
 {}
 
-ConvexHull1D::ConvexHull1D(std::vector<int> sizes, const double* prop) :
-    _sorted_value(std::accumulate(sizes.begin(), sizes.end(), 0), 0.0),
+ConvexHull1D::ConvexHull1D(std::vector<int> task_sizes, const double* prop) :
+    _sorted_value(std::accumulate(task_sizes.begin(), task_sizes.end(), 0), 0.0),
     _cls_max(),
     _cls_min(),
-    _sorted_prop_inds(std::accumulate(sizes.begin(), sizes.end(), 0), 0),
+    _sorted_prop_inds(std::accumulate(task_sizes.begin(), task_sizes.end(), 0), 0),
     _cls_start(),
     _cls_sz(),
-    _n_task(sizes.size()),
+    _n_task(task_sizes.size()),
     _n_class(0)
 {
-    initialize_prop(sizes, prop);
+    initialize_prop(task_sizes, prop);
 }
 
-void ConvexHull1D::initialize_prop(std::vector<int> sizes, const double* prop)
+void ConvexHull1D::initialize_prop(std::vector<int> task_sizes, const double* prop)
 {
     // Set the number of tasks and samples
-    _n_task = sizes.size();
+    _n_task = task_sizes.size();
     _task_scores.resize(_n_task, 0.0);
-    int n_samp = std::accumulate(sizes.begin(), sizes.end(), 0);
+    int n_samp = std::accumulate(task_sizes.begin(), task_sizes.end(), 0);
 
     // Set number of classes
-
-
-
     std::vector<double> unique_prop_vals;
     for(int pp = 0; pp < n_samp; ++pp)
     {
@@ -74,6 +71,9 @@ void ConvexHull1D::initialize_prop(std::vector<int> sizes, const double* prop)
     _cls_sz.resize(_n_class * _n_task, 0);
     _cls_start.resize(_n_class * _n_task, 0);
 
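+    // Explicitly zero the class-size and class-start counters in case initialize_prop is called again on an existing object (resize alone leaves old entries untouched)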
+    std::fill_n(_cls_sz.data(), _cls_sz.size(), 0);
+    std::fill_n(_cls_start.data(), _cls_start.size(), 0);
+
     // Set the values of the cls vectors and sorted inds
     _sorted_value.resize(n_samp, 0.0);
     _sorted_prop_inds.resize(n_samp, 0);
@@ -87,15 +87,15 @@ void ConvexHull1D::initialize_prop(std::vector<int> sizes, const double* prop)
 
     int start = 0;
 
-    for(int tt = 0; tt < sizes.size(); ++tt)
+    for(int tt = 0; tt < task_sizes.size(); ++tt)
     {
         util_funcs::argsort<double>(
             _sorted_prop_inds.data() + start,
-            _sorted_prop_inds.data() + start + sizes[tt],
+            _sorted_prop_inds.data() + start + task_sizes[tt],
             prop
         );
 
-        for(int ind = start; ind < start + sizes[tt]; ++ind)
+        for(int ind = start; ind < start + task_sizes[tt]; ++ind)
         {
             ++_cls_sz[tt * _n_class + cl_ind[prop[ind]]];
         }
@@ -143,7 +143,9 @@ double ConvexHull1D::overlap_1d(double* value, double width)
         for(int c1 = 0; c1 < _n_class; ++c1)
         {
             if(_cls_sz[tt * _n_class + c1] == 0)
+            {
                 continue;
+            }
 
             double min = _cls_min[tt * _n_class + c1];
             double max = _cls_max[tt * _n_class + c1];
diff --git a/src/classification/ConvexHull1D.hpp b/src/classification/ConvexHull1D.hpp
index d72b4f37606e704024d021b47aed3c936c3f28bc..7069dce107bf1dfdd490c0b27977e30b88af8064 100644
--- a/src/classification/ConvexHull1D.hpp
+++ b/src/classification/ConvexHull1D.hpp
@@ -51,10 +51,10 @@ public:
     /**
      * @brief Constructor
      *
-     * @param sizes The size of each task
+     * @param task_sizes The size of each task
      * @param prop The pointer to the property vector
      */
-    ConvexHull1D(std::vector<int> sizes, const double* prop);
+    ConvexHull1D(std::vector<int> task_sizes, const double* prop);
 
     /**
      * @brief Default constructor
@@ -64,10 +64,10 @@ public:
     /**
      * @brief Initialize the projection objects
      *
-     * @param sizes The size of each task
+     * @param task_sizes The size of each task
      * @param prop The pointer to the property vector
      */
-    void initialize_prop(std::vector<int> sizes, const double* prop);
+    void initialize_prop(std::vector<int> task_sizes, const double* prop);
 
     /**
      * @brief Calculate the projection scores of a set of features to a vector via Pearson correlation
diff --git a/src/classification/LPWrapper.cpp b/src/classification/LPWrapper.cpp
index e850c1d0d9191560a7f32f45d5dc273674b5199a..b4465857caa943aad6ff704a81ac95da7eec517f 100644
--- a/src/classification/LPWrapper.cpp
+++ b/src/classification/LPWrapper.cpp
@@ -206,7 +206,7 @@ void LPWrapper::copy_data(const int cls, const std::vector<double*> val_ptrs, co
     {
         throw std::logic_error("Size of the val_ptrs vector is larger than _n_dim");
     }
-    if(val_ptrs.size() > _n_dim)
+    if(test_val_ptrs.size() > _n_dim)
     {
         throw std::logic_error("Size of the test_val_ptrs vector is larger than _n_dim");
     }
diff --git a/src/classification/SVMWrapper.cpp b/src/classification/SVMWrapper.cpp
index f947de1c646b6c8f486f83cb1d9769b209322ee0..7a0c98a86ca49bfcd2ebcfc9180218130ea2785d 100644
--- a/src/classification/SVMWrapper.cpp
+++ b/src/classification/SVMWrapper.cpp
@@ -22,27 +22,8 @@
 #include "classification/SVMWrapper.hpp"
 
 SVMWrapper::SVMWrapper(const int n_class, const int n_dim, const int n_samp, const double* prop) :
-    _model(nullptr),
-    _y(prop, prop + n_samp),
-    _y_est(n_samp),
-    _x_space(n_samp * (n_dim + 1)),
-    _x(n_samp),
-    _coefs(n_class * (n_class - 1) / 2, std::vector<double>(n_dim, 0.0)),
-    _C(1000.0),
-    _intercept(n_class * (n_class - 1) / 2, 0.0),
-    _w_remap(n_dim, 1.0),
-    _b_remap(n_dim, 0.0),
-    _n_dim(n_dim),
-    _n_samp(n_samp),
-    _n_class(n_class)
-{
-    setup_parameter_obj(_C);
-    setup_x_space();
-
-    _prob.l = _n_samp;
-    _prob.y = _y.data();
-    _prob.x = _x.data();
-}
+    SVMWrapper(1000.0, n_class, n_dim, n_samp, prop)
+{}
 
 SVMWrapper::SVMWrapper(const int n_class, const int n_dim, const std::vector<double> prop) :
     SVMWrapper(n_class, n_dim, prop.size(), prop.data())
@@ -63,6 +44,14 @@ SVMWrapper::SVMWrapper(const double C, const int n_class, const int n_dim, const
     _n_samp(n_samp),
     _n_class(n_class)
 {
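+    // libsvm expects contiguous class labels: map the raw property values onto 0, 1, ..., n_class - 1 and keep the inverse map so predictions can be reported with the original labels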
+    std::vector<double> unique_prop_vals = vector_utils::unique(_y);
+    std::map<double, double> rep_map;
+    for(int cc = 0; cc < _n_class; ++cc)
+    {
+        _map_prop_vals[static_cast<double>(cc)] = unique_prop_vals[cc];
+        rep_map[unique_prop_vals[cc]] = static_cast<double>(cc);
+    }
+    std::transform(_y.begin(), _y.end(), _y.begin(), [&](double val){return rep_map[val];});
     setup_parameter_obj(_C);
     setup_x_space();
 
@@ -121,6 +110,11 @@ void SVMWrapper::setup_x_space()
 
 void SVMWrapper::copy_data(const std::vector<int> inds, const int task)
 {
+    if((task < 0) || (task >= node_value_arrs::TASK_SZ_TRAIN.size()))
+    {
+        throw std::logic_error("The requested task is invalid.");
+    }
+
     if(inds.size() > _n_dim)
     {
         throw std::logic_error("Size of the inds vector is larger than _n_dim");
@@ -210,7 +204,10 @@ void SVMWrapper::train(const bool remap_coefs)
     std::transform(_x.begin(), _x.end(), _y_est.begin(), [this](svm_node* sn){return svm_predict(_model, sn);});
     _n_misclassified = 0;
     for(int ss = 0; ss < _n_samp; ++ss)
+    {
         _n_misclassified += _y[ss] != _y_est[ss];
+    }
+    std::transform(_y_est.begin(), _y_est.end(), _y_est.begin(), [this](double val){return _map_prop_vals[val];});
 }
 
 void SVMWrapper::train(const std::vector<int> inds, const int task, const bool remap_coefs)
@@ -251,6 +248,6 @@ std::vector<double> SVMWrapper::predict(const int n_samp_test, const std::vector
         x_test[ss] = &x_space_test[ss * (val_ptrs.size() + 1)];
     }
     std::transform(x_test.begin(), x_test.end(), y_est_test.begin(), [this](svm_node* sn){return svm_predict(_model, sn);});
-
+    std::transform(y_est_test.begin(), y_est_test.end(), y_est_test.begin(), [this](double val){return _map_prop_vals[val];});
     return y_est_test;
 }
diff --git a/src/classification/SVMWrapper.hpp b/src/classification/SVMWrapper.hpp
index b616a44876ec3856852573c59091129b62ba3804..8f412797b51625c2d954182fe192e0025162c150 100644
--- a/src/classification/SVMWrapper.hpp
+++ b/src/classification/SVMWrapper.hpp
@@ -22,11 +22,13 @@
 #ifndef LIBSVM_WRAPPER
 #define LIBSVM_WRAPPER
 
+#include <map>
 #include <algorithm>
 #include <iostream>
 
 #include "external/libsvm/svm.h"
 #include "feature_creation/node/value_storage/nodes_value_containers.hpp"
+#include "utils/vector_utils.hpp"
 
 // DocString: cls_svm_wrapper
 /**
@@ -50,6 +52,7 @@ protected:
     std::vector<double> _w_remap; //!< Prefactors to convert the data to/from the preferred SVM range (0 to 1)
     std::vector<double> _b_remap;  //!< Prefactors to map the minimum of the features to 0.0
 
+    std::map<double, double> _map_prop_vals; //!< Map of the property values to the values used for SVM
     const double _C; //!< The C parameter for the SVM calculation
 
     const int _n_dim; //!< The number of dimensions for the SVM problem
diff --git a/src/descriptor_identifier/model/Model.cpp b/src/descriptor_identifier/model/Model.cpp
index cce0459e260e6dd452c6aa1ca37218f7b41d30de..697f96f95cfb33c5ee6661e85a7664e8eb3c372b 100644
--- a/src/descriptor_identifier/model/Model.cpp
+++ b/src/descriptor_identifier/model/Model.cpp
@@ -86,10 +86,6 @@ double Model::eval(std::map<std::string, double> x_in_dct) const
             {
                 throw std::logic_error("The value of " + in_expr + " is not in x_in_dct.");
             }
-            else if(x_in_dct.count(in_expr) > 1)
-            {
-                throw std::logic_error("Multiple values of " + in_expr + " defined in x_in_dct.");
-            }
 
             x_in.push_back(x_in_dct[in_expr]);
         }
@@ -132,10 +128,6 @@ std::vector<double> Model::eval(std::map<std::string, std::vector<double>> x_in_
             {
                 throw std::logic_error("The value of " + in_expr + " is not in x_in_dct.");
             }
-            else if(x_in_dct.count(in_expr) > 1)
-            {
-                throw std::logic_error("Multiple values of " + in_expr + " defined in x_in_dct.");
-            }
 
             x_in.push_back(x_in_dct[in_expr]);
         }
@@ -276,6 +268,8 @@ void Model::write_matlab_fxn(std::string fxn_filename)
     }
 
     boost::filesystem::path p(fxn_filename.c_str());
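+    // Take just the file name, stripped of its ".m" extension, as the Matlab function name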
+    std::string fxn_name = p.filename().string();
+    fxn_name = fxn_name.substr(0, fxn_name.size() - 2);
     boost::filesystem::path parent = p.remove_filename();
     if(parent.string().size() > 0)
     {
@@ -296,7 +290,7 @@ void Model::write_matlab_fxn(std::string fxn_filename)
     std::transform(leaves.begin(), leaves.end(), leaves.begin(), [](std::string s){return str_utils::matlabify(s);});
 
     // Write the header of the function
-    out_file_stream << "function P = " << fxn_filename.substr(0, fxn_filename.size() - 2) << "(X)\n";
+    out_file_stream << "function P = " << fxn_name << "(X)\n";
     out_file_stream << "% Returns the value of " << _prop_label << " = " << toString() << "\n%\n";
     out_file_stream << "% X = [\n";
     for(auto & leaf : leaves)
@@ -373,6 +367,8 @@ void Model::populate_model(const std::string train_filename, const std::string t
     std::string test_error_line;
     std::getline(train_file_stream, prop_desc_line);
     int n_line = 5;
+
+    // Legacy code path so that model files written by older versions can still be read in
     if(!is_error_line(prop_desc_line))
     {
         split_line = str_utils::split_string_trim(prop_desc_line);
diff --git a/src/descriptor_identifier/model/ModelClassifier.cpp b/src/descriptor_identifier/model/ModelClassifier.cpp
index 2b6a1459a7a53b0817174ad866d62741bf47c2ca..ec9fd8251f30f16204bb8adda7fbc9147728f50e 100644
--- a/src/descriptor_identifier/model/ModelClassifier.cpp
+++ b/src/descriptor_identifier/model/ModelClassifier.cpp
@@ -59,7 +59,9 @@ ModelClassifier::ModelClassifier(
     _test_n_svm_misclassified = std::accumulate(test_misclassified.begin(), test_misclassified.end(), 0);
 }
 
-ModelClassifier::ModelClassifier(const std::string train_file)
+ModelClassifier::ModelClassifier(const std::string train_file) :
+    _train_n_convex_overlap(0),
+    _test_n_convex_overlap(0)
 {
     populate_model(train_file);
     _n_class = _loss->n_class();
@@ -79,7 +81,9 @@ ModelClassifier::ModelClassifier(const std::string train_file)
         );
     }
 }
-ModelClassifier::ModelClassifier(const std::string train_file, std::string test_file)
+ModelClassifier::ModelClassifier(const std::string train_file, std::string test_file) :
+    _train_n_convex_overlap(0),
+    _test_n_convex_overlap(0)
 {
     populate_model(train_file, test_file);
     _n_class = _loss->n_class();
@@ -220,8 +224,8 @@ std::string ModelClassifier::write_coefs() const
 {
     std::stringstream coef_head_stream;
     coef_head_stream << "# Decision Boundaries" << std::endl;
-    int n_db = _n_class * (_n_class - 1) / 2;
-    int task_header_w = 1 + static_cast<int>(std::floor(std::log10(n_db))) + std::max(
+    int n_db = _loss->n_class() * (_loss->n_class() - 1) / 2;
+    int task_header_w = 2 + static_cast<int>(std::floor(std::log10(_n_class))) * 3 + std::max(
         6,
         static_cast<int>(std::max_element(_task_names.begin(), _task_names.end(), [](std::string s1, std::string s2){return s1.size() <= s2.size();})->size())
     );
@@ -233,16 +237,19 @@ std::string ModelClassifier::write_coefs() const
     }
     coef_head_stream << " b" << std::endl;
 
+    int start_coefs = 0;
     for(int tt = 0; tt < _task_names.size(); ++tt)
     {
+        n_db = _loss->n_class(tt) * (_loss->n_class(tt) - 1) / 2;
         for(int db = 0; db < n_db; ++db)
         {
-            coef_head_stream << std::setw(task_header_w) << std::left << "# " + _task_names[tt] + "_" + std::to_string(db) << std::setw(2) << ", ";
-            for(auto& coeff : _coefs[tt * n_db + db])
+            coef_head_stream << std::setw(task_header_w) << std::left << "# " + _task_names[tt] + "_" + _loss->coef_labels()[start_coefs] << std::setw(2) << ", ";
+            for(auto& coeff : _coefs[start_coefs])
             {
                 coef_head_stream << std::setprecision(15) << std::scientific << std::right << std::setw(22) << coeff << std::setw(2) << ", ";
             }
             coef_head_stream << "\n";
+            ++start_coefs;
         }
     }
     return coef_head_stream.str();
diff --git a/src/descriptor_identifier/model/ModelClassifier.hpp b/src/descriptor_identifier/model/ModelClassifier.hpp
index 6126910ccda7c56d42766ae925f4eb8f86d56442..cf6163af5f9a7071f259e799b54c9b97295338fc 100644
--- a/src/descriptor_identifier/model/ModelClassifier.hpp
+++ b/src/descriptor_identifier/model/ModelClassifier.hpp
@@ -43,10 +43,9 @@ class ModelClassifier : public Model
 
     int _train_n_svm_misclassified; //!< The number of points misclassified by SVM (training set)
     int _test_n_svm_misclassified; //!< The number of points misclassified by SVM (test set)
-protected:
-    using Model::eval;
 
 public:
+    using Model::eval;
 
     /**
      * @brief Construct a ModelClassifier using a loss function and a set of features
diff --git a/src/descriptor_identifier/model/ModelLogRegressor.cpp b/src/descriptor_identifier/model/ModelLogRegressor.cpp
index a7519adca0b6d4e67f16854e9cddab98641f618d..caf1f0d3603fbf5e556f948eff2964474b9ab34f 100644
--- a/src/descriptor_identifier/model/ModelLogRegressor.cpp
+++ b/src/descriptor_identifier/model/ModelLogRegressor.cpp
@@ -126,7 +126,7 @@ std::string ModelLogRegressor::toLatexString() const
     std::stringstream model_rep;
     if(_fix_intercept)
     {
-        model_rep << "$\\left(" << _feats[0]->get_latex_expr() << "\\right)^{a_0}" << std::endl;
+        model_rep << "$\\left(" << _feats[0]->get_latex_expr() << "\\right)^{a_0}";
         for(int ff = 1; ff < _feats.size(); ++ff)
         {
             model_rep << "\\left(" << _feats[ff]->get_latex_expr() << "\\right)^{a_" << ff << "}";
diff --git a/src/descriptor_identifier/solver/SISSOClassifier.cpp b/src/descriptor_identifier/solver/SISSOClassifier.cpp
index 589d068f4cb90b781842b82b8afe6bd6f20f063a..2c17f4d015ef94fb5578a247cf8969c18744f579 100644
--- a/src/descriptor_identifier/solver/SISSOClassifier.cpp
+++ b/src/descriptor_identifier/solver/SISSOClassifier.cpp
@@ -45,7 +45,7 @@ void SISSOClassifier::setup_d_mat_transfer()
         std::vector<int> inds(_task_sizes_train[tt]);
         std::iota(inds.begin(), inds.end(), task_start);
 
-        util_funcs::argsort<double>(inds.data(), inds.data() + inds.size(), _loss->prop_pointer() + task_start);
+        util_funcs::argsort<double>(inds.data(), inds.data() + inds.size(), _loss->prop_pointer());
         _sample_inds_to_sorted_dmat_inds[inds[0]] = task_start;
 
         int cls_start = 0;
@@ -70,7 +70,7 @@ std::array<double, 2> SISSOClassifier::svm_error(std::vector<SVMWrapper>& svm, c
             dist_error += 1.0 / util_funcs::norm(coefs.data(), feat_inds.size());
         }
     }
-    dist_error /= static_cast<double>(_n_task * svm[0].n_class());
+    dist_error /= static_cast<double>(_n_task * _n_class);
     return {error, dist_error};
 }
 
@@ -85,8 +85,7 @@ int SISSOClassifier::get_max_error_ind(
         scores,
         [this](int n_overlap, double score){return static_cast<double>(n_overlap * _n_samp * _n_class) + score;}
     );
-
-    double max_dist = *std::max_element(svm_margin, svm_margin + n_models) + 0.01;
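+    // Use the largest margin magnitude (plus a small offset) as the normalization constant for the margins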
+    double max_dist = std::abs(*std::max_element(svm_margin, svm_margin + n_models, [](double v1, double v2){return std::abs(v1) < std::abs(v2);})) + 0.01;
     std::transform(
         svm_margin,
         svm_margin + n_models,
@@ -121,8 +120,8 @@ void SISSOClassifier::l0_regularization(const int n_dim)
     std::vector<int> inds(n_dim, 0);
 
     std::vector<int> min_inds(n_get_models * n_dim, -1);
-    std::vector<int> min_n_convex_overlap(n_get_models, std::numeric_limits<int>::max());
-    std::vector<double> min_svm_score(n_get_models, std::numeric_limits<double>::max());
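+    // Seed the running minima with a finite worst-case bound (every sample overlapping) rather than numeric_limits<>::max()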
+    std::vector<int> min_n_convex_overlap(n_get_models, _n_samp * _n_class * _n_class);
+    std::vector<double> min_svm_score(n_get_models, _n_samp * _n_class * _n_class);
     std::vector<double> min_svm_margin(n_get_models, -1.0);
 
     unsigned long long int n_interactions = 1;
@@ -161,17 +160,17 @@ void SISSOClassifier::l0_regularization(const int n_dim)
             int start = 0;
             for(int tt = 0; tt < _n_task; ++tt)
             {
-                svm_vec.push_back(SVMWrapper(_c, _n_class, _n_dim, _task_sizes_train[tt], loss_copy->prop_pointer() + start));
+                svm_vec.push_back(SVMWrapper(_c, _loss->n_class(tt), _n_dim, _task_sizes_train[tt], loss_copy->prop_pointer() + start));
                 start += _task_sizes_train[tt];
             }
 
             unsigned long long int ii_prev = 0;
 
-#ifdef OMP45
+            #ifdef OMP45
             #pragma omp for schedule(monotonic: dynamic)
-#else
+            #else
             #pragma omp for schedule(dynamic)
-#endif
+            #endif
             for(unsigned long long int ii = _mpi_comm->rank(); ii < n_interactions; ii += static_cast<unsigned long long int>(_mpi_comm->size()))
             {
                 util_funcs::iterate(inds, inds.size(), ii - ii_prev);
@@ -301,8 +300,13 @@ void SISSOClassifier::l0_regularization(const int n_dim)
 
 void SISSOClassifier::fit()
 {
-    for(int dd = 1; dd <= _n_dim; ++dd)
+    int dd = 1;
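+    // Keep adding dimensions until the requested dimension is reached or every sample is already separated (all projection scores zero)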
+    while(
+        (dd <= _n_dim) &&
+        (*std::max_element(_loss->prop_project_pointer(), _loss->prop_project_pointer() + _loss->n_prop_project() * _n_samp) > 0.0)
+    )
     {
+
         double start = omp_get_wtime();
         _feat_space->sis(_loss);
 
@@ -330,6 +334,11 @@ void SISSOClassifier::fit()
                 }
             }
         }
+        ++dd;
+    }
+    if(dd <= _n_dim)
+    {
+        std::cerr << "WARNING: All points separated before reaching the requested dimension." << std::endl;
     }
 }
 
diff --git a/src/descriptor_identifier/solver/SISSOSolver.hpp b/src/descriptor_identifier/solver/SISSOSolver.hpp
index 44955e67619b708cf37a39f9de17812186aea850..7c4bddd09e3a407d7c9b3035ba15b06113136168 100644
--- a/src/descriptor_identifier/solver/SISSOSolver.hpp
+++ b/src/descriptor_identifier/solver/SISSOSolver.hpp
@@ -128,6 +128,11 @@ public:
      */
     inline int n_models_store() const {return _n_models_store;}
 
+    /**
+     * @brief If true the bias term is fixed at 0
+     */
+    inline bool fix_intercept() const {return _fix_intercept;}
+
     // Python interface functions
     #ifdef PY_BINDINGS
 
diff --git a/src/feature_creation/feature_space/FeatureSpace.hpp b/src/feature_creation/feature_space/FeatureSpace.hpp
index e99e136c18adf7254786a058391d19a61bcad728..4f3833f073ec91c5b79181e534840657f4c3b4bc 100644
--- a/src/feature_creation/feature_space/FeatureSpace.hpp
+++ b/src/feature_creation/feature_space/FeatureSpace.hpp
@@ -322,6 +322,19 @@ public:
      */
     void remove_feature(const int ind);
 
+    #ifdef PARAMETERIZE
+    // DocString: feat_space_param_feats_allowed
+    /**
+     * @brief True if built with -DBUILD_PARAMS (used for python tests)
+     */
+    bool parameterized_feats_allowed() const {return true;}
+    #else
+    // DocString: feat_space_param_feats_allowed
+    /**
+     * @brief True if built with -DBUILD_PARAMS (used for python tests)
+     */
+    bool parameterized_feats_allowed() const {return false;}
+    #endif
     // Python Interface Functions
     #ifdef PY_BINDINGS
 
diff --git a/src/feature_creation/node/FeatureNode.hpp b/src/feature_creation/node/FeatureNode.hpp
index 36e03de3b08fd9acc913e35456566e68e9919f1e..371204bb1f692a07c280d3bd6cff5251ee42e541 100644
--- a/src/feature_creation/node/FeatureNode.hpp
+++ b/src/feature_creation/node/FeatureNode.hpp
@@ -500,7 +500,7 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    void param_derivative(const double* params, double* dfdp) const {}
+    void param_derivative(const double* params, double* dfdp, const int depth = 1) const {}
 
     /**
      * @brief Get the parameter gradient for non-linear optimization
@@ -519,8 +519,9 @@ public:
      * @param grad pointer to the gradient storage
      * @param dfdp pointer to where the feature derivative pointers are located
      * @param params A pointer to the bias and scale terms for this Node and its children
+     * @param depth The current depth in the binary expression tree
      */
-    inline void gradient(double* grad, double* dfdp, const double* params) const {};
+    inline void gradient(double* grad, double* dfdp, const double* params, const int depth = 1) const {};
     #endif
 };
 
diff --git a/src/feature_creation/node/ModelNode.cpp b/src/feature_creation/node/ModelNode.cpp
index 48eec97ffb28a695b133162fb4904762b78d3e83..59be042b20bcfc365270294301a7590a4436add8 100644
--- a/src/feature_creation/node/ModelNode.cpp
+++ b/src/feature_creation/node/ModelNode.cpp
@@ -900,7 +900,9 @@ double ModelNode::eval(double* x_in)
         }
     }
     if(stack.size() != 1)
+    {
         throw std::logic_error("The final stack size is not one, something wrong happened during the calculation.");
+    }
 
     return *stack[0];
 }
@@ -923,11 +925,6 @@ double ModelNode::eval(std::map<std::string, double> x_in_dct)
         {
             throw std::logic_error("The value of " + in_expr + " is not in x_in_dct.");
         }
-        else if(x_in_dct.count(in_expr) > 1)
-        {
-            throw std::logic_error("Multiple values of " + in_expr + " defined in x_in_dct.");
-        }
-
         x_in.push_back(x_in_dct[in_expr]);
     }
     return eval(x_in.data());
@@ -995,10 +992,6 @@ std::vector<double> ModelNode::eval(std::map<std::string, std::vector<double>> x
         {
             throw std::logic_error("The value of " + in_expr + " is not in x_in_dct.");
         }
-        else if(x_in_dct.count(in_expr) > 1)
-        {
-            throw std::logic_error("Multiple values of " + in_expr + " defined in x_in_dct.");
-        }
 
         x_in.push_back(x_in_dct[in_expr]);
     }
diff --git a/src/feature_creation/node/ModelNode.hpp b/src/feature_creation/node/ModelNode.hpp
index 831cbc81ace9ded744bad6850c2f3784c6d791fd..555fe6bd2bd6143d5e8fc49939c6ca2193a0ee27 100644
--- a/src/feature_creation/node/ModelNode.hpp
+++ b/src/feature_creation/node/ModelNode.hpp
@@ -317,7 +317,6 @@ public:
     inline double* value_ptr(int offset=-1, const bool for_comp=false) const
     {
         throw std::logic_error("const version of value_ptr for ModelNode is impossible.");
-        return nullptr;
     }
 
     /**
@@ -329,7 +328,6 @@ public:
     inline double* test_value_ptr(int offset=-1, const bool for_comp=false) const
     {
         throw std::logic_error("const version of test_value_ptr for ModelNode is impossible.");
-        return nullptr;
     }
 
     // DocString: model_node_rung
diff --git a/src/feature_creation/node/Node.hpp b/src/feature_creation/node/Node.hpp
index 06f385ce107bfc737c26fe86e398dd60e62c24b2..6b752818b9d02c00c1c4a4372f00b3e1a4cabbc4 100644
--- a/src/feature_creation/node/Node.hpp
+++ b/src/feature_creation/node/Node.hpp
@@ -418,7 +418,6 @@ public:
     virtual inline const double* param_pointer() const
     {
         throw std::logic_error("Trying to access the parameter pointer to a node with no parameters.");
-        return nullptr;
     }
 
     /**
@@ -554,7 +553,7 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      * @param depth The current depth in the binary expression tree
      */
-    virtual void param_derivative(const double* params, double* dfdp) const = 0;
+    virtual void param_derivative(const double* params, double* dfdp, const int depth = 1) const = 0;
 
     /**
      * @brief Get the parameter gradient for non-linear optimization
@@ -570,8 +569,9 @@ public:
      * @param grad pointer to the gradient storage
      * @param dfdp pointer to where the feature derivative pointers are located
      * @param params A pointer to the bias and scale terms for this Node and its children
+     * @param depth The current depth in the binary expression tree
      */
-    virtual void gradient(double* grad, double* dfdp, const double* params) const = 0;
+    virtual void gradient(double* grad, double* dfdp, const double* params, const int depth = 1) const = 0;
     #endif
 
     // DocString: node_n_feats
diff --git a/src/feature_creation/node/operator_nodes/OperatorNode.hpp b/src/feature_creation/node/operator_nodes/OperatorNode.hpp
index fd742686f608d9bb72d1f4775693f633cc82b10f..858130a2e907287833f9bf16efe6392e4b24a641 100644
--- a/src/feature_creation/node/operator_nodes/OperatorNode.hpp
+++ b/src/feature_creation/node/operator_nodes/OperatorNode.hpp
@@ -579,7 +579,7 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    virtual void param_derivative(const double* params, double* dfdp) const = 0;
+    virtual void param_derivative(const double* params, double* dfdp, const int depth = 1) const = 0;
 
     /**
      * @brief Get the parameter gradient for non-linear optimization
@@ -589,7 +589,7 @@ public:
      */
     void gradient(double* grad, double* dfdp) const
     {
-        if(n_params_possible() == 0)
+        if(n_params() == 0)
         {
             throw std::logic_error("Asking for the gradient of non-parameterized feature");
         }
@@ -603,13 +603,23 @@ public:
      * @param grad pointer to the gradient storage
      * @param dfdp pointer to where the feature derivative are located
      * @param params A pointer to the bias and scale terms for this Node and its children
+     * @param depth The current depth in the binary expression tree
      */
-    void gradient(double* grad, double* dfdp, const double* params) const
+    void gradient(double* grad, double* dfdp, const double* params, const int depth = 1) const
     {
         int np = n_params_possible();
         // Calculate f' and x
-        param_derivative(params, dfdp);
-        double* val_ptr = _feats[N - 1]->value_ptr(params + 2);
+        param_derivative(params, dfdp, depth);
+        double* val_ptr;
+
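+        // Children are evaluated with their parameters only while depth < MAX_PARAM_DEPTH; past the limit their unparameterized values are used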
+        if(depth < nlopt_wrapper::MAX_PARAM_DEPTH)
+        {
+            val_ptr = _feats[N - 1]->value_ptr(params + 2, -1, true, depth + 1);
+        }
+        else
+        {
+            val_ptr = _feats[N - 1]->value_ptr(-1, true);
+        }
 
         // df / d\alpha = x f'
         std::transform(dfdp, dfdp + _n_samp, grad, grad, std::multiplies<double>());
@@ -627,10 +637,13 @@ public:
 
         // Go down the chain rule
         int start = 2;
-        for(int ff = N - 1; ff >=0; --ff)
+        if(depth < nlopt_wrapper::MAX_PARAM_DEPTH)
         {
-            _feats[ff]->gradient(grad + start * _n_samp, dfdp, params + start);
-            start += _feats[ff]->n_params_possible();
+            for(int ff = N - 1; ff >=0; --ff)
+            {
+                _feats[ff]->gradient(grad + start * _n_samp, dfdp, params + start, depth + 1);
+                start += _feats[ff]->n_params_possible();
+            }
         }
     }
 
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/abs/absolute_value.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/abs/absolute_value.hpp
index 80dc6ccf662fb890fcbdf798573304d46146c4d7..d3e90a7fde842e96a09d11cec3096873478d298f 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/abs/absolute_value.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/abs/absolute_value.hpp
@@ -298,9 +298,9 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr = _feats[0]->value_ptr(params);
+        double* val_ptr = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2, -1, true, depth + 1) : _feats[0]->value_ptr(-1, true);
         std::transform(val_ptr, val_ptr + _n_samp, dfdp, [params](double vp){return util_funcs::sign(params[0] * vp + params[1]);});
     }
     #endif
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/abs_diff/absolute_difference.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/abs_diff/absolute_difference.hpp
index 5e880c1a9b02b67c5539c53b23e21387f42dea73..a6f9d6d809301e517f0853693a3f3da183bc2a9a 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/abs_diff/absolute_difference.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/abs_diff/absolute_difference.hpp
@@ -314,10 +314,11 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr_1 = _feats[0]->value_ptr(params, 2);
-        double* val_ptr_2 = _feats[1]->value_ptr(params, 1);
+        double* val_ptr_1 = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2 + _feats[1]->n_params(), 0, true, depth + 1) : _feats[0]->value_ptr(0, true);
+        double* val_ptr_2 = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[1]->value_ptr(params + 2, 1, true, depth + 1) : _feats[1]->value_ptr(1, true);
+
         std::transform(
             val_ptr_1,
             val_ptr_1 + _n_samp,
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/add/add.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/add/add.hpp
index bffbece28b1fe75ee62d1bc0273086eedf2fda68..12848c2fa8a929f222ce4cd35bf7b5cf02743334 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/add/add.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/add/add.hpp
@@ -307,7 +307,7 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const {std::fill_n(dfdp,  _n_samp, 1.0);}
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const {std::fill_n(dfdp,  _n_samp, 1.0);}
     #endif
 };
 
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/cb/cube.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/cb/cube.hpp
index d74dd28ae9ef3cf64168aca2e6c73481f781aad4..677d64c24769b36bb10cfd5c58c18c934ca81d08 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/cb/cube.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/cb/cube.hpp
@@ -295,9 +295,9 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr = _feats[0]->value_ptr(params);
+        double* val_ptr = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2, -1, true, depth + 1) : _feats[0]->value_ptr(-1, true);
         std::transform(val_ptr, val_ptr + _n_samp, dfdp, [params](double vp){return 3.0 * std::pow(params[0] * vp + params[1], 2.0);});
     }
     #endif
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/cbrt/cube_root.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/cbrt/cube_root.hpp
index c12dd29900543910bbe9e52141b6018c580ec34f..6bb0003f0e1fcdff7e58221fb1abbff8fbad89e6 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/cbrt/cube_root.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/cbrt/cube_root.hpp
@@ -294,9 +294,9 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr = _feats[0]->value_ptr(params);
+        double* val_ptr = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2, -1, true, depth + 1) : _feats[0]->value_ptr(-1, true);
         std::transform(val_ptr, val_ptr + _n_samp, dfdp, [params](double vp){return 1.0 / 3.0 * std::pow(params[0] * vp + params[1], -2.0 / 3.0);});
     }
     #endif
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/cos/cos.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/cos/cos.hpp
index 0c57eb6fb9d5d9a821d5e8e82e63db033ca02d62..f165a642b533e060449557c996b2a94ceff15b5d 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/cos/cos.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/cos/cos.hpp
@@ -297,9 +297,9 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr = _feats[0]->value_ptr(params);
+        double* val_ptr = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2, -1, true, depth + 1) : _feats[0]->value_ptr(-1, true);
         std::transform(val_ptr, val_ptr + _n_samp, dfdp, [params](double vp){return -1.0 * std::sin(params[0] * vp + params[1]);});
     }
     #endif
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/div/divide.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/div/divide.hpp
index 01f3c4606a0b5c7d661269a66212dffd54d07333..58d4f765b981f072a703b624d3cd0625a91c27bb 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/div/divide.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/div/divide.hpp
@@ -310,10 +310,11 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr_1 = _feats[0]->value_ptr(params, 2);
-        double* val_ptr_2 = _feats[1]->value_ptr(params, 1);
+        double* val_ptr_1 = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2 + _feats[1]->n_params(), 0, true, depth + 1) : _feats[0]->value_ptr(0, true);
+        double* val_ptr_2 = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[1]->value_ptr(params + 2, 1, true, depth + 1) : _feats[1]->value_ptr(1, true);
+
         std::transform(
             val_ptr_1,
             val_ptr_1 + _n_samp,
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/exp/exponential.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/exp/exponential.hpp
index e03871655229025c5ab0ffb11e2261ca88dd3795..aabfa7b1f6b15b524ad27e62232ced8a978ce8ff 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/exp/exponential.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/exp/exponential.hpp
@@ -294,9 +294,9 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr = _feats[0]->value_ptr(params);
+        double* val_ptr = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2, -1, true, depth + 1) : _feats[0]->value_ptr(-1, true);
         std::transform(val_ptr, val_ptr + _n_samp, dfdp, [params](double vp){return std::exp(params[0] * vp + params[1]);});
     }
 
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/inv/inverse.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/inv/inverse.hpp
index 973fa42a673933a2aa4f6d03b80d6d23ffcc6ce3..c5d10b14e7f7ac93af6f798105df8e0d2750b34a 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/inv/inverse.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/inv/inverse.hpp
@@ -291,9 +291,9 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr = _feats[0]->value_ptr(params);
+        double* val_ptr = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2, -1, true, depth + 1) : _feats[0]->value_ptr(-1, true);
         std::transform(val_ptr, val_ptr + _n_samp, dfdp, [params](double vp){return -1.0 / std::pow(params[0] * vp + params[1], 2.0);});
     }
     #endif
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/log/log.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/log/log.hpp
index 246d754fc50b27979b1140bd955c60c23ecb4888..fdbfd1001a3aa8c72aa168a70d147a0914700911 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/log/log.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/log/log.hpp
@@ -297,9 +297,9 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr = _feats[0]->value_ptr(params);
+        double* val_ptr = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2, -1, true, depth + 1) : _feats[0]->value_ptr(-1, true);
         std::transform(val_ptr, val_ptr + _n_samp, dfdp, [params](double vp){return 1.0 / (params[0] * vp + params[1]);});
     }
     #endif
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/mult/multiply.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/mult/multiply.hpp
index a43fdb8fb03e6748c22e33651a3c2a0739464bf6..f44867481c98e2368b39ee03d46578991390b6b6 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/mult/multiply.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/mult/multiply.hpp
@@ -310,7 +310,11 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const {std::copy_n(_feats[0]->value_ptr(params, -1, true),  _n_samp, dfdp);}
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
+    {
+        double* val_ptr_1 = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2 + _feats[1]->n_params(), 0, true, depth + 1) : _feats[0]->value_ptr(0, true);
+        std::copy_n(val_ptr_1, _n_samp, dfdp);
+    }
     #endif
 };
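The pointer arithmetic above implies the binary-node convention that this node's bias/scale pair is applied to the second child (its parameters start at `params + 2`), i.e. the multiplication node evaluates f = f₁(αf₂ + β). Under that reading (an inference from the diff, not a statement from the upstream docs), the derivative with respect to the parameterized argument is just the first child's value, which is why `param_derivative` now simply copies it:

```latex
f = f_1 \, (\alpha f_2 + \beta)
\quad\Longrightarrow\quad
\frac{\partial f}{\partial (\alpha f_2 + \beta)} = f_1,
\qquad
\frac{\partial f}{\partial \alpha} = f_1 f_2,
\qquad
\frac{\partial f}{\partial \beta} = f_1
```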
 
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/neg_exp/negative_exponential.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/neg_exp/negative_exponential.hpp
index 97d4ab99b3d2872be58ee0fb650fd5f7230aa655..3ccd18a7cd7d4f2014fa3a26d0060ce6e56e1444 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/neg_exp/negative_exponential.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/neg_exp/negative_exponential.hpp
@@ -295,9 +295,9 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr = _feats[0]->value_ptr(params);
+        double* val_ptr = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2, -1, true, depth + 1) : _feats[0]->value_ptr(-1, true);
         std::transform(val_ptr, val_ptr + _n_samp, dfdp, [params](double vp){return -1.0 * std::exp(-1.0 * (params[0] * vp + params[1]));});
     }
     #endif
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sin/sin.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sin/sin.hpp
index a254182bbfce996a83efcd631a95c10fe3c180d5..07297f1bd469d68bedec2ae29f82418a69069649 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sin/sin.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sin/sin.hpp
@@ -298,9 +298,9 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr = _feats[0]->value_ptr(params);
+        double* val_ptr = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2, -1, true, depth + 1) : _feats[0]->value_ptr(-1, true);
         std::transform(val_ptr, val_ptr + _n_samp, dfdp, [params](double vp){return std::cos(params[0] * vp + params[1]);});
     }
     #endif
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/six_pow/sixth_power.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/six_pow/sixth_power.hpp
index 00baa8f20052c2b7b37a3ae5debb5f1fbbf980c4..08ae74d9473c5755dbba026c4147d6f2b7da0d2e 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/six_pow/sixth_power.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/six_pow/sixth_power.hpp
@@ -296,9 +296,9 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr = _feats[0]->value_ptr(params);
+        double* val_ptr = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2, -1, true, depth + 1) : _feats[0]->value_ptr(-1, true);
         std::transform(val_ptr, val_ptr + _n_samp, dfdp, [params](double vp){return 6.0 * std::pow(params[0] * vp + params[1], 5.0);});
     }
     #endif
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sq/square.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sq/square.hpp
index 416736bb0068d3e4fb0a22ed8e92b6255a302259..4dc09452047640542fbd68612ba0c5a73de33875 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sq/square.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sq/square.hpp
@@ -295,9 +295,9 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr = _feats[0]->value_ptr(params);
+        double* val_ptr = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2, -1, true, depth + 1) : _feats[0]->value_ptr(-1, true);
         std::transform(val_ptr, val_ptr + _n_samp, dfdp, [params](double vp){return 2.0 * (params[0] * vp + params[1]);});
     }
     #endif
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sqrt/square_root.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sqrt/square_root.hpp
index fc96cfe7f212889ba5a0dcd2fc5b6005eb777982..c4aa453f2f3d8738d0d79452e208436152817d41 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sqrt/square_root.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sqrt/square_root.hpp
@@ -298,9 +298,9 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const
     {
-        double* val_ptr = _feats[0]->value_ptr(params);
+        double* val_ptr = (depth < nlopt_wrapper::MAX_PARAM_DEPTH) ? _feats[0]->value_ptr(params + 2, -1, true, depth + 1) : _feats[0]->value_ptr(-1, true);
         std::transform(val_ptr, val_ptr + _n_samp, dfdp, [params](double vp){return 0.5 * std::pow(params[0] * vp + params[1], -0.5);});
     }
     #endif
diff --git a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sub/subtract.hpp b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sub/subtract.hpp
index 12e967c07fcb77551a0ba93498d0e4f59150fb87..bcf74bf6e1258ece7bf30e0d95d78daacd6d8e96 100644
--- a/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sub/subtract.hpp
+++ b/src/feature_creation/node/operator_nodes/allowed_operator_nodes/sub/subtract.hpp
@@ -311,7 +311,7 @@ public:
      * @param params A pointer to the bias and scale terms for this Node and its children
      * @param dfdp pointer to where the feature derivative pointers are located
      */
-    inline void param_derivative(const double* params, double* dfdp) const {std::fill_n(dfdp, _n_samp, -1.0);}
+    inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const {std::fill_n(dfdp, _n_samp, -1.0);}
     #endif
 };
 
diff --git a/src/feature_creation/units/Unit.cpp b/src/feature_creation/units/Unit.cpp
index 91d8fd207ed4e65b9a562702bf52b284fa886c6d..9d380367afd0e753c009b53c3b747235f7124b02 100644
--- a/src/feature_creation/units/Unit.cpp
+++ b/src/feature_creation/units/Unit.cpp
@@ -243,9 +243,9 @@ Unit& Unit::operator/=(const Unit unit_2)
 Unit Unit::operator^(const double power) const
 {
     std::map<std::string, double> to_out = dct();
-    if(power == 0.0)
+    if(std::abs(power) < 1e-10)
     {
-        return Unit(to_out);
+        return Unit();
     }
 
     for(auto& el : to_out)
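Two behavioral fixes land here: the zero-exponent test becomes a tolerance instead of an exact comparison, and the zero-power case now returns a fresh dimensionless `Unit()` rather than a copy of the original dimensions, matching u⁰ = 1 for any unit u. A small standalone illustration of why the exact test is fragile:

```cpp
#include <cmath>
#include <cstdio>

// Exponents often come out of floating-point arithmetic, so a value that is
// analytically zero may not compare equal to 0.0 bit-for-bit.
int main()
{
    double power = 0.1 + 0.2 - 0.3;  // analytically zero, actually ~5.6e-17
    std::printf("exact test:     %s\n", power == 0.0 ? "zero" : "nonzero");             // nonzero
    std::printf("tolerance test: %s\n", std::abs(power) < 1e-10 ? "zero" : "nonzero");  // zero
}
```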
diff --git a/src/loss_function/LossFunction.hpp b/src/loss_function/LossFunction.hpp
index 198acd015401a2398f331e62cf2854eacc4e22dc..dcee589bc5854bd37af14b5242557236103367b2 100644
--- a/src/loss_function/LossFunction.hpp
+++ b/src/loss_function/LossFunction.hpp
@@ -309,6 +309,11 @@ public:
      */
     inline int n_task() const {return _n_task;}
 
+    /**
+     * @brief Number of properties to project over
+     */
+    inline int n_prop_project() const {return _n_project_prop;}
+
     /**
      * @brief Set the number of features and linear model constants to those for the new n_feat
      *
@@ -317,9 +322,26 @@ public:
     virtual inline void set_nfeat(int n_feat){_n_feat = n_feat; _n_dim = n_feat + (!_fix_intercept);}
 
     /**
-     * @brief The number of classes in the calculation
+     * @brief The number of classes in the property for each task
+     */
+    virtual inline std::vector<int> n_class_per_task() const {return {};}
+
+    /**
+     * @brief The number of classes in the calculation for a given task
+     *
+     * @param task_num The task to get the number of classes for
+     */
+    virtual inline int n_class(int task_num) const {return 0;}
+
+    /**
+     * @brief The maximum number of classes in the calculation
+     */
+    virtual inline int n_class() const {return 0;}
+
+    /**
+     * @brief The labels for each set of coefficients
      */
-    virtual inline int n_class(){return 0;}
+    virtual inline std::vector<std::string> coef_labels() const {return {};}
 };
 
 #endif
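The base class keeps returning neutral values (zero classes, empty vectors) so that any `LossFunction` can be queried for classification metadata without a downcast; only the convex-hull loss overrides these accessors. A minimal sketch of the pattern with hypothetical types:

```cpp
#include <cstdio>
#include <memory>
#include <vector>

// Base loss reports "no classes"; the classification loss overrides it.
struct Loss
{
    virtual ~Loss() = default;
    virtual int n_class() const { return 0; }
    virtual std::vector<int> n_class_per_task() const { return {}; }
};

struct ConvexHullLoss : Loss
{
    std::vector<int> _n_class_per_task{3, 2};
    int n_class() const override { return 3; }
    std::vector<int> n_class_per_task() const override { return _n_class_per_task; }
};

int main()
{
    std::shared_ptr<Loss> loss = std::make_shared<ConvexHullLoss>();
    std::printf("max classes: %d over %zu tasks\n", loss->n_class(), loss->n_class_per_task().size());
}
```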
diff --git a/src/loss_function/LossFunctionConvexHull.cpp b/src/loss_function/LossFunctionConvexHull.cpp
index 9ffd4addc994a9404f079c834078ba9d9970a3ae..e87f928c0df97991c5ff9e6ab7f89a078124d672 100644
--- a/src/loss_function/LossFunctionConvexHull.cpp
+++ b/src/loss_function/LossFunctionConvexHull.cpp
@@ -30,6 +30,7 @@ LossFunctionConvexHull::LossFunctionConvexHull(
 ) :
     LossFunction(prop_train, prop_test, task_sizes_train, task_sizes_test, false, n_feat),
     _width(1e-5),
+    _n_class_per_task(task_sizes_train.size(), 0),
     _n_class(0)
 {
     for(auto& pt : prop_test)
@@ -40,7 +41,23 @@ LossFunctionConvexHull::LossFunctionConvexHull(
         }
     }
 
-    std::vector<double> unique_classes = vector_utils::unique<double>(prop_train);
+    int start = 0;
+    std::vector<double> unique_classes;
+    for(int tt = 0; tt < _task_sizes_train.size(); ++tt)
+    {
+        unique_classes = vector_utils::unique<double>(prop_train.data() + start, _task_sizes_train[tt]);
+        _n_class_per_task[tt] = unique_classes.size();
+        for(int c1 = 0; c1 < unique_classes.size(); ++c1)
+        {
+            for(int c2 = c1 + 1; c2 < unique_classes.size(); ++c2)
+            {
+                _coef_labels.push_back(fmt::format("_{:.1f}_{:.1f}", unique_classes[c1], unique_classes[c2]));
+            }
+        }
+        start += _task_sizes_train[tt];
+    }
+
+    unique_classes = vector_utils::unique<double>(prop_train);
     std::map<double, double> class_map;
     _n_class = unique_classes.size();
     for(int cc = 0; cc < _n_class; ++cc)
@@ -95,6 +112,8 @@ LossFunctionConvexHull::LossFunctionConvexHull(
 LossFunctionConvexHull::LossFunctionConvexHull(std::shared_ptr<LossFunction> o) :
     LossFunction(o),
     _width(1e-5),
+    _coef_labels(o->coef_labels()),
+    _n_class_per_task(o->n_class_per_task()),
     _n_class(o->n_class())
 {
     set_nfeat(_n_feat);
@@ -106,7 +125,8 @@ void LossFunctionConvexHull::set_nfeat(int n_feat, bool initialize_sorted_d_mat)
     _n_feat = n_feat;
     _n_dim = n_feat + 1;
     setup_lp(initialize_sorted_d_mat);
-    _coefs.resize(_n_class * (_n_class - 1) / 2 * _n_dim * _task_sizes_train.size(), 0.0);
+    int n_class_combos = std::accumulate(_n_class_per_task.begin(), _n_class_per_task.end(), 0, [](int tot, int nc){return tot + nc * (nc - 1) / 2;});
+    _coefs.resize(n_class_combos * _n_dim, 0.0);
 }
 
 void LossFunctionConvexHull::prepare_project()
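The new sizing follows from one-vs-one classification: a task with nc classes is separated by nc * (nc - 1) / 2 binary classifiers, each carrying `_n_dim` coefficients, and the pair counts are now summed per task instead of assuming every task has `_n_class` classes. A self-contained worked example of the arithmetic:

```cpp
#include <cstdio>
#include <numeric>
#include <vector>

int main()
{
    std::vector<int> n_class_per_task = {3, 2};  // e.g. 3 classes in task 0, 2 in task 1
    int n_dim = 3;                               // n_feat + 1 for the intercept

    // Sum of one-vs-one pairs over the tasks: 3 + 1 = 4
    int n_class_combos = std::accumulate(
        n_class_per_task.begin(), n_class_per_task.end(), 0,
        [](int tot, int nc){return tot + nc * (nc - 1) / 2;}
    );
    std::printf("%d pair classifiers -> %d coefficients\n", n_class_combos, n_class_combos * n_dim);  // 4 -> 12
}
```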
@@ -157,60 +177,44 @@ void LossFunctionConvexHull::setup_lp(bool initialize_sorted_d_mat)
     int task_start_test = 0;
     _lp = {};
 
-    std::vector<int> n_samp_per_class;
-    for(int tt = 0; tt < _n_task; ++tt)
+    std::map<double, int> rep_map;
+    std::vector<double> unique_classes = vector_utils::unique(_prop_train);
+    std::sort(unique_classes.begin(), unique_classes.end());
+
+    for(int uc = 0; uc < unique_classes.size(); ++uc)
     {
-        std::map<double, int> rep_class;
+        rep_map[unique_classes[uc]] = uc;
+    }
 
+    std::vector<int> n_samp_per_class(_n_class * _n_task);
+    for(int tt = 0; tt < _n_task; ++tt)
+    {
         std::vector<int> inds(_task_sizes_train[tt]);
         std::iota(inds.begin(), inds.end(), task_start);
         util_funcs::argsort<double>(inds.data(), inds.data() + inds.size(), _prop_train.data());
 
-        int cls_start = 0;
-        _sample_inds_to_sorted_dmat_inds[inds[0]] = task_start;
-        rep_class[_prop_train[inds[0]]] = 0;
-        for(int ii = 1; ii < inds.size(); ++ii)
+        for(int ii = 0; ii < inds.size(); ++ii)
         {
             _sample_inds_to_sorted_dmat_inds[inds[ii]] = ii + task_start;
-            if(_prop_train[inds[ii]] != _prop_train[inds[ii - 1]])
-            {
-                n_samp_per_class.push_back(ii - cls_start);
-                rep_class[_prop_train[inds[ii]]] = n_samp_per_class.size();
-                cls_start = ii;
-            }
-        }
-        n_samp_per_class.push_back(inds.size() - cls_start);
-        if(n_samp_per_class.size() != (tt + 1) * _n_class)
-        {
-            throw std::logic_error("A class is not represented in task " + std::to_string(tt) + ".");
+            ++n_samp_per_class[tt * _n_class + rep_map[_prop_train[inds[ii]]]];
         }
 
         std::vector<int> samp_per_class(_n_class);
         std::copy_n(n_samp_per_class.begin() + tt * _n_class, _n_class, samp_per_class.begin());
         task_start += _task_sizes_train[tt];
 
-        std::vector<int> n_samp_test_per_class(n_samp_per_class.size(), 0);
+        std::vector<int> samp_test_per_class(samp_per_class.size(), 0);
         if(_task_sizes_test[tt] > 0)
         {
             inds.resize(_task_sizes_test[tt]);
             std::iota(inds.begin(), inds.end(), task_start_test);
-            util_funcs::argsort<double>(inds.data(), inds.data() + inds.size(), &_prop_test[task_start_test]);
+            util_funcs::argsort<double>(inds.data(), inds.data() + inds.size(), _prop_test.data());
 
-            cls_start = 0;
-            _test_sample_inds_to_sorted_dmat_inds[inds[0]] = task_start_test;
-            for(int ii = 1; ii < inds.size(); ++ii)
+            for(int ii = 0; ii < inds.size(); ++ii)
             {
                 _test_sample_inds_to_sorted_dmat_inds[inds[ii]] = ii + task_start_test;
-                if(_prop_test[inds[ii]] != _prop_test[inds[ii - 1]])
-                {
-                    n_samp_test_per_class[
-                        rep_class[_prop_test[inds[ii - 1]]]
-                    ] = ii - cls_start;
-                    cls_start = ii;
-                }
+                ++samp_test_per_class[rep_map[_prop_test[inds[ii]]]];
             }
-            n_samp_test_per_class[rep_class[_prop_test[inds.back()]]] = inds.size() - cls_start;
-
             task_start_test += _task_sizes_test[tt];
         }
         LPWrapper lp(
@@ -220,8 +224,8 @@ void LossFunctionConvexHull::setup_lp(bool initialize_sorted_d_mat)
             _n_feat,
             std::accumulate(n_samp_per_class.begin() + tt * _n_class, n_samp_per_class.end(), 0),
             _width,
-            n_samp_test_per_class,
-            std::accumulate(n_samp_test_per_class.begin(), n_samp_test_per_class.end(), 0)
+            samp_test_per_class,
+            std::accumulate(samp_test_per_class.begin(), samp_test_per_class.end(), 0)
         );
         _lp.push_back(lp);
     }
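The per-task `rep_class` maps give way to a single `rep_map` built from the sorted unique property values, so a class label resolves to the same index in every task, and a class that is absent from some task simply keeps a count of zero (the old code threw a `logic_error` in that situation). A compact sketch of the construction:

```cpp
#include <algorithm>
#include <cstdio>
#include <map>
#include <vector>

int main()
{
    std::vector<double> prop = {1.0, 0.0, 1.0, 2.0, 0.0};  // class labels per sample

    // Sorted unique labels become stable class indices
    std::vector<double> unique_classes = prop;
    std::sort(unique_classes.begin(), unique_classes.end());
    unique_classes.erase(std::unique(unique_classes.begin(), unique_classes.end()), unique_classes.end());

    std::map<double, int> rep_map;
    for(int uc = 0; uc < static_cast<int>(unique_classes.size()); ++uc)
    {
        rep_map[unique_classes[uc]] = uc;
    }

    // Count samples per class through the map
    std::vector<int> n_samp_per_class(unique_classes.size(), 0);
    for(double label : prop)
    {
        ++n_samp_per_class[rep_map[label]];
    }
    std::printf("%d %d %d\n", n_samp_per_class[0], n_samp_per_class[1], n_samp_per_class[2]);  // 2 2 1
}
```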
@@ -342,9 +346,10 @@ double LossFunctionConvexHull::operator()(const std::vector<model_node_ptr>& fea
     // Perform SVM to set estimated prop
     start = 0;
     start_test = 0;
+    int start_coefs = 0;
     for(int tt = 0; tt < _task_sizes_train.size(); ++tt)
     {
-        SVMWrapper svm(_n_class, _n_feat, _task_sizes_train[tt], &_prop_train[start]);
+        SVMWrapper svm(_n_class_per_task[tt], _n_feat, _task_sizes_train[tt], &_prop_train[start]);
 
         std::vector<double*> node_val_ptrs(_n_feat);
         std::vector<double*> node_test_val_ptrs(_n_feat);
@@ -352,19 +357,19 @@ double LossFunctionConvexHull::operator()(const std::vector<model_node_ptr>& fea
         for(int dd = 0; dd < _n_feat; ++dd)
         {
             node_val_ptrs[dd] = feats[dd]->value_ptr() + start;
-            node_test_val_ptrs[dd] = feats[dd]->test_value_ptr() + start;
+            node_test_val_ptrs[dd] = feats[dd]->test_value_ptr() + start_test;
         }
 
         svm.train(node_val_ptrs);
         std::vector<std::vector<double>> coefs = svm.coefs();
         for(int cc = 0; cc < coefs.size(); ++cc)
         {
-            std::copy_n(coefs[cc].data(), coefs[cc].size(), &_coefs[tt * _n_class * (_n_class - 1) / 2 * _n_dim + cc * _n_dim]);
-            _coefs[tt * _n_class * (_n_class - 1) / 2 * _n_dim + cc * _n_dim + _n_feat] = svm.intercept()[cc];
+            std::copy_n(coefs[cc].data(), coefs[cc].size(), &_coefs[start_coefs * _n_dim]);
+            _coefs[start_coefs * _n_dim + _n_feat] = svm.intercept()[cc];
+            ++start_coefs;
         }
-
-        std::copy_n(svm.y_estimate().begin() + start, _task_sizes_train[tt], _prop_train_est.begin() + start);
-        std::copy_n(svm.predict(_task_sizes_test[tt], node_test_val_ptrs).begin() + start_test, _task_sizes_test[tt], _prop_test_est.begin() + start_test);
+        std::copy_n(svm.y_estimate().begin(), _task_sizes_train[tt], _prop_train_est.begin() + start);
+        std::copy_n(svm.predict(_task_sizes_test[tt], node_test_val_ptrs).begin(), _task_sizes_test[tt], _prop_test_est.begin() + start_test);
 
         start += _task_sizes_train[tt];
         start_test += _task_sizes_test[tt];
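With per-task class counts allowed to differ, the fixed stride `tt * _n_class * (_n_class - 1) / 2 * _n_dim` no longer addresses the coefficient blocks correctly, so a running `start_coefs` cursor appends each task's classifiers instead; the `y_estimate`/`predict` copies also drop their source-side offsets, since the SVM results are already task-local. A sketch of the resulting flat layout:

```cpp
#include <cstdio>
#include <vector>

int main()
{
    std::vector<int> pairs_per_task = {3, 1};  // one-vs-one pairs: 3 classes, then 2 classes
    int n_dim = 3;

    int start_coefs = 0;
    for(int tt = 0; tt < static_cast<int>(pairs_per_task.size()); ++tt)
    {
        for(int cc = 0; cc < pairs_per_task[tt]; ++cc)
        {
            std::printf("task %d pair %d -> coefs [%d, %d)\n",
                        tt, cc, start_coefs * n_dim, (start_coefs + 1) * n_dim);
            ++start_coefs;  // advance by one classifier, whatever the task
        }
    }
}
```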
diff --git a/src/loss_function/LossFunctionConvexHull.hpp b/src/loss_function/LossFunctionConvexHull.hpp
index 0a0f39446463bc732c5315dc25f520ad59b7a0b5..9178e4c09f1d79c66c313743e8551b893351a6ed 100644
--- a/src/loss_function/LossFunctionConvexHull.hpp
+++ b/src/loss_function/LossFunctionConvexHull.hpp
@@ -22,6 +22,8 @@
 #ifndef LOSS_FUNCTION_CONVEX_HULL
 #define LOSS_FUNCTION_CONVEX_HULL
 
+#include <fmt/core.h>
+
 #include "classification/ConvexHull1D.hpp"
 #include "classification/LPWrapper.hpp"
 #include "classification/SVMWrapper.hpp"
@@ -43,9 +45,11 @@ protected:
     std::vector<double> _scores; //!< The scores for each of the projection properties
     std::map<int, int> _sample_inds_to_sorted_dmat_inds; //!< map from input sample inds to the SORTED_D_MATRIX_INDS
     std::map<int, int> _test_sample_inds_to_sorted_dmat_inds; //!< map from input sample inds to the SORTED_D_MATRIX_INDS
+    std::vector<std::string> _coef_labels; //!< The labels for each set of coefficients
+    std::vector<int> _n_class_per_task; //!< The number of classes in the property for each task
 
-    const double _width; //!< The width used as the tolerance for the LP optimization
     int _n_class; //!< Number of classes in the property
+    const double _width; //!< The width used as the tolerance for the LP optimization
 
 public:
 
@@ -159,9 +163,26 @@ public:
     void set_nfeat(int n_feat, bool initialize_sorted_d_mat=false);
 
     /**
-     * @brief The number of classes in the calculation
+     * @brief The number of classes in the calculation for a given task
+     *
+     * @param task_num The task to get the number of classes for
+     */
+    inline int n_class(int task_num) const {return _n_class_per_task[task_num];}
+
+    /**
+     * @brief The maximum number of classes in the calculation
+     */
+    inline int n_class() const {return _n_class;}
+
+    /**
+     * @brief The labels for each set of coefficients
+     */
+    inline std::vector<std::string> coef_labels() const {return _coef_labels;}
+
+    /**
+     * @brief The number of classes in the property for each task
      */
-    inline int n_class(){return _n_class;}
+    inline std::vector<int> n_class_per_task() const {return _n_class_per_task;}
 };
 
 #endif
diff --git a/src/python/postprocess/plot/parity_plot.py b/src/python/postprocess/plot/parity_plot.py
index d653c67c80654f27c2d86e88c56d301597af8e1f..714c1e7fbc70ab25bb1cd4242bc5c100cc8ac164 100644
--- a/src/python/postprocess/plot/parity_plot.py
+++ b/src/python/postprocess/plot/parity_plot.py
@@ -20,9 +20,9 @@ plot_model_parity_plot: Wrapper to plot_model for a set of training and testing
 import numpy as np
 import toml
 from sissopp.postprocess.check_cv_convergence import jackknife_cv_conv_est
-
 from sissopp.postprocess.load_models import load_model
-from sissopp.postprocess.plot.utils import setup_plot_ax
+from sissopp.postprocess.plot.utils import setup_plot_ax, latexify
+from sissopp import ModelClassifier
 
 
 def plot_model_parity_plot(model, filename=None, fig_settings=None):
@@ -52,9 +52,12 @@ def plot_model_parity_plot(model, filename=None, fig_settings=None):
 
     fig_config, fig, ax = setup_plot_ax(fig_settings)
 
-    ax.set_xlabel(model.prop_label + " (" + model.prop_unit.latex_str + ")")
+    ax.set_xlabel(latexify(model.prop_label) + " (" + model.prop_unit.latex_str + ")")
     ax.set_ylabel(
-        f"Estimated {model.prop_label}" + " (" + model.prop_unit.latex_str + ")"
+        f"Estimated {latexify(model.prop_label)}"
+        + " ("
+        + model.prop_unit.latex_str
+        + ")"
     )
     if len(model.prop_test) > 0:
         lims = [
diff --git a/src/python/postprocess/plot/utils.py b/src/python/postprocess/plot/utils.py
index eafde92cc894973407ea3a6562b6a1fd18b0db79..6b768bcd8d0c6786b863ec8050e0d8c7228a5e26 100644
--- a/src/python/postprocess/plot/utils.py
+++ b/src/python/postprocess/plot/utils.py
@@ -91,19 +91,26 @@ def adjust_box_widths(ax, fac):
 def latexify(s):
     """Convert a string s into a latex string"""
     power_split = s.split("^")
-
-    print(power_split)
-
     if len(power_split) == 1:
-        return s
-
-    power_split[0] += "$"
-    for pp in range(1, len(power_split)):
-        unit_end = power_split[pp].split(" ")
-        unit_end[0] = "{" + unit_end[0] + "}$"
+        temp_s = s
+    else:
+        power_split[0] += "$"
+        for pp in range(1, len(power_split)):
+            unit_end = power_split[pp].split(" ")
+            unit_end[0] = "{" + unit_end[0] + "}$"
+            unit_end[-1] += "$"
+            power_split[pp] = " ".join(unit_end)
+        temp_s = "^".join(power_split)[:-1]
+
+    subscript_split = temp_s.split("_")
+    if len(subscript_split) == 1:
+        return temp_s
+
+    subscript_split[0] += "$"
+    for pp in range(1, len(subscript_split)):
+        unit_end = subscript_split[pp].split(" ")
+        unit_end[0] = "\\mathrm{" + unit_end[0] + "}$"
         unit_end[-1] += "$"
-        power_split[pp] = " ".join(unit_end)
-
-    print("^".join(power_split)[:-1])
+        subscript_split[pp] = " ".join(unit_end)
 
-    return "^".join(power_split)[:-1]
+    return "_".join(subscript_split)[:-1]
diff --git a/src/python/py_binding_cpp_def/bindings_docstring_keyed.cpp b/src/python/py_binding_cpp_def/bindings_docstring_keyed.cpp
index 524c2926287e5e2e5917cf0c3c01fb53dc6f1918..d3c83e45cc238b49a1e51724e99a665fd9b8a820 100644
--- a/src/python/py_binding_cpp_def/bindings_docstring_keyed.cpp
+++ b/src/python/py_binding_cpp_def/bindings_docstring_keyed.cpp
@@ -121,6 +121,12 @@ void sisso::register_all()
         "@DocString_str_utils_matlabify@"
     );
 
+    def(
+        "latexify",
+        &str_utils::latexify,
+        (arg("str")),
+        "@DocString_str_utils_latexify@"
+    );
     #ifdef PARAMETERIZE
         sisso::feature_creation::node::registerAddParamNode();
         sisso::feature_creation::node::registerSubParamNode();
@@ -296,6 +302,7 @@ void sisso::feature_creation::registerFeatureSpace()
         .add_property("n_feat", &FeatureSpace::n_feat, "@DocString_feat_space_n_feat@")
         .add_property("n_rung_store", &FeatureSpace::n_rung_store, "@DocString_feat_space_n_rung_store@")
         .add_property("n_rung_generate", &FeatureSpace::n_rung_generate, "@DocString_feat_space_n_rung_generate@")
+        .add_property("parameterized_feats_allowed", &FeatureSpace::parameterized_feats_allowed, "@DocString_feat_space_param_feats_allowed@")
     ;
 }
 
diff --git a/src/python/py_binding_cpp_def/bindings_docstring_keyed.hpp b/src/python/py_binding_cpp_def/bindings_docstring_keyed.hpp
index ce4de017d2921a4e11c8bd34f3307d3124d8d97f..a258b20f75db7a7934511989a0a8c242746475ec 100644
--- a/src/python/py_binding_cpp_def/bindings_docstring_keyed.hpp
+++ b/src/python/py_binding_cpp_def/bindings_docstring_keyed.hpp
@@ -115,7 +115,7 @@ namespace sisso
                 inline void initialize_params(double* params, int depth=1) const {this->get_override("set_bounds")();}
                 inline int n_feats() const {return this->get_override("n_feats")();}
                 inline std::shared_ptr<Node> feat(const int ind) const {return this->get_override("feat")();}
-                inline void param_derivative(const double* params, double* dfdp) const {this->get_override("param_derivative");}
+                inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const {this->get_override("param_derivative");}
                 inline void gradient(double* grad, double* dfdp) const {this->get_override("gradient");}
                 inline void gradient(double* grad, double* dfdp, const double* params) const {this->get_override("gradient");}
                 inline std::vector<std::string> get_x_in_expr_list() const {return this->get_override("get_x_in_expr_list")();}
@@ -151,7 +151,7 @@ namespace sisso
                 inline std::vector<double> parameters() const {return this->get_override("parameters")();}
                 inline void set_bounds(double* lb, double* ub, const int depth=1) const {this->get_override("set_bounds")();}
                 inline void initialize_params(double* params, int depth=1) const {this->get_override("set_bounds")();}
-                inline void param_derivative(const double* params, double* dfdp) const {this->get_override("param_derivative");}
+                inline void param_derivative(const double* params, double* dfdp, const int depth = 1) const {this->get_override("param_derivative");}
             };
             #else
             /**
diff --git a/src/python/py_binding_cpp_def/feature_creation/FeatureNode.cpp b/src/python/py_binding_cpp_def/feature_creation/FeatureNode.cpp
index eff20559676fb04fb88f44299dc3699252c7df3b..cddba9e59693ee78b55e7016f1ca8dfd09fc824a 100644
--- a/src/python/py_binding_cpp_def/feature_creation/FeatureNode.cpp
+++ b/src/python/py_binding_cpp_def/feature_creation/FeatureNode.cpp
@@ -22,70 +22,9 @@
 #include "feature_creation/node/FeatureNode.hpp"
 
 FeatureNode::FeatureNode(unsigned long int feat_ind, std::string expr, np::ndarray value, np::ndarray test_value, Unit unit) :
-    Node(feat_ind, value.shape(0), test_value.shape(0)),
-    _value(python_conv_utils::from_ndarray<double>(value)),
-    _test_value(python_conv_utils::from_ndarray<double>(test_value)),
-    _unit(unit),
-    _expr(expr)
-{
-    // Automatically resize the storage arrays
-    if(node_value_arrs::N_STORE_FEATURES == 0)
-    {
-        node_value_arrs::initialize_values_arr(_n_samp, _n_samp_test, 1);
-    }
-    else if((_n_samp != node_value_arrs::N_SAMPLES) || (_n_samp_test != node_value_arrs::N_SAMPLES_TEST))
-    {
-        throw std::logic_error(
-            "Number of samples in current feature is not the same as the others, (" +
-            std::to_string(_n_samp) +
-            " and " + std::to_string(_n_samp_test) +
-            " vs. "  +
-            std::to_string(node_value_arrs::N_SAMPLES) +
-            " and " +
-            std::to_string(node_value_arrs::N_SAMPLES_TEST) +
-            ")"
-        );
-    }
-    else if(feat_ind >= node_value_arrs::N_STORE_FEATURES)
-    {
-        node_value_arrs::resize_values_arr(0, node_value_arrs::N_STORE_FEATURES + 1);
-    }
-
-    set_value();
-    set_test_value();
-}
+    FeatureNode(feat_ind, expr, python_conv_utils::from_ndarray<double>(value), python_conv_utils::from_ndarray<double>(test_value), unit)
+{}
 
 FeatureNode::FeatureNode(unsigned long int feat_ind, std::string expr, py::list value, py::list test_value, Unit unit) :
-    Node(feat_ind, py::len(value), py::len(test_value)),
-    _value(python_conv_utils::from_list<double>(value)),
-    _test_value(python_conv_utils::from_list<double>(test_value)),
-    _unit(unit),
-    _expr(expr)
-{
-    // Automatically resize the storage arrays
-    if(node_value_arrs::N_STORE_FEATURES == 0)
-    {
-        node_value_arrs::initialize_values_arr(_n_samp, _n_samp_test, 1);
-    }
-    else if((_n_samp != node_value_arrs::N_SAMPLES) || (_n_samp_test != node_value_arrs::N_SAMPLES_TEST))
-    {
-        throw std::logic_error(
-            "Number of samples in current feature is not the same as the others, (" +
-            std::to_string(_n_samp) +
-            " and " +
-            std::to_string(_n_samp_test) +
-            " vs. "  +
-            std::to_string(node_value_arrs::N_SAMPLES) +
-            " and " +
-            std::to_string(node_value_arrs::N_SAMPLES_TEST) +
-            ")"
-        );
-    }
-    else if(feat_ind >= node_value_arrs::N_STORE_FEATURES)
-    {
-        node_value_arrs::resize_values_arr(0, node_value_arrs::N_STORE_FEATURES + 1);
-    }
-
-    set_value();
-    set_test_value();
-}
+    FeatureNode(feat_ind, expr, python_conv_utils::from_list<double>(value), python_conv_utils::from_list<double>(test_value), unit)
+{}
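Both Python-facing constructors now forward to the core `std::vector<double>` constructor (C++11 delegating constructors), so the storage-array resizing and sample-count checks live in exactly one place. A generic sketch of the pattern with a hypothetical `Thing` type:

```cpp
#include <string>
#include <vector>

// Conversion overloads delegate to one "real" constructor so shared
// validation and setup code is written once.
struct Thing
{
    std::vector<double> _value;
    std::string _name;

    // The constructor that does the actual work
    Thing(std::string name, std::vector<double> value) :
        _value(std::move(value)),
        _name(std::move(name))
    {
        // ... shared validation / storage bookkeeping would live here ...
    }

    // Conversion overload: convert the input, then delegate
    Thing(std::string name, const double* data, int n) :
        Thing(std::move(name), std::vector<double>(data, data + n))
    {}
};

int main()
{
    double raw[] = {1.0, 2.0, 3.0};
    Thing t("feat_a", raw, 3);
    return static_cast<int>(t._value.size()) - 3;  // 0 on success
}
```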
diff --git a/src/utils/string_utils.cpp b/src/utils/string_utils.cpp
index 8989dc2d76c2bc70df83ca6606c328f671255a39..0558e6409874d058b885bd682a894b29d62c859d 100644
--- a/src/utils/string_utils.cpp
+++ b/src/utils/string_utils.cpp
@@ -75,7 +75,7 @@ std::string str_utils::matlabify(const std::string str)
     std::string copy_str = str;
     std::replace(copy_str.begin(), copy_str.end(), ' ', '_');
 
-    std::vector<std::string> split_str = split_string_trim(str, "\\");
+    std::vector<std::string> split_str = split_string_trim(str, "\\}{");
     for(auto& term_str : split_str)
     {
         std::string add_str = term_str;
diff --git a/src/utils/string_utils.hpp b/src/utils/string_utils.hpp
index d022a951fc320e34c5819ba67b4174d80e36563f..ce628972c4908ddbf08d640ffeb8443148e3141d 100644
--- a/src/utils/string_utils.hpp
+++ b/src/utils/string_utils.hpp
@@ -42,6 +42,7 @@ namespace str_utils
      */
     std::vector<std::string> split_string_trim(const std::string str, const std::string split_tokens = ",;:");
 
+    // DocString: str_utils_latexify
     /**
      * @brief Convert a string into a latex string
      *
diff --git a/src/utils/vector_utils.hpp b/src/utils/vector_utils.hpp
index 33a9dd8d6a83d2656e932dbf4e784fc072bcae04..3008d4bfd6042269e826c6673b6cbbec2c482cac 100644
--- a/src/utils/vector_utils.hpp
+++ b/src/utils/vector_utils.hpp
@@ -48,6 +48,28 @@ std::vector<T> unique(const std::vector<T> in_vec)
 
     return out_vec;
 }
+
+/**
+ * @brief Return a vector of all unique elements of in_vec
+ *
+ * @param in_vec Pointer to the start of the input data
+ * @param sz The number of elements to check for uniqueness
+ * @tparam T The type of the elements of in_vec
+ * @return The vector of unique elements of in_vec
+ */
+template<typename T>
+std::vector<T> unique(const T* in_vec, int sz)
+{
+    std::vector<T> out_vec;
+    for(int ii = 0; ii < sz; ++ii)
+    {
+        if(std::find(out_vec.begin(), out_vec.end(), in_vec[ii]) == out_vec.end())
+        {
+            out_vec.push_back(in_vec[ii]);
+        }
+    }
+    return out_vec;
+}
 }
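The new overload mirrors `unique(std::vector<T>)` but operates on a raw pointer plus length, which is what lets `LossFunctionConvexHull` extract the unique classes of one task's slice of the flat property vector without copying the slice first. A self-contained usage sketch (the template body is copied verbatim from the header):

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

namespace vector_utils
{
template<typename T>
std::vector<T> unique(const T* in_vec, int sz)
{
    std::vector<T> out_vec;
    for(int ii = 0; ii < sz; ++ii)
    {
        if(std::find(out_vec.begin(), out_vec.end(), in_vec[ii]) == out_vec.end())
        {
            out_vec.push_back(in_vec[ii]);
        }
    }
    return out_vec;
}
}

int main()
{
    std::vector<double> prop = {1.0, 1.0, 0.0, 2.0, 2.0};  // task 0: first 3 samples, task 1: last 2
    std::vector<double> task0 = vector_utils::unique(prop.data(), 3);      // {1.0, 0.0}
    std::vector<double> task1 = vector_utils::unique(prop.data() + 3, 2);  // {2.0}
    std::printf("%zu %zu\n", task0.size(), task1.size());  // prints "2 1"
}
```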
 
 
diff --git a/tests/exec_test/classification/data.csv b/tests/exec_test/classification/data.csv
index 3fa9f64bd0b9d2133040ea4849a71789ac8f078a..3de3e29a3d3a8510a21ace99268501120ce46af4 100644
--- a/tests/exec_test/classification/data.csv
+++ b/tests/exec_test/classification/data.csv
@@ -1,101 +1,101 @@
-index,prop,A,B,C,D,E,F,G,H,I,J
-0,1,0.1,-0.3,10,10,-0.492825179098274,0.173025977694162,0.598942935224295,-0.298754475196825,-0.581254909010269,-0.110656494210556
-1,1,-1.89442810374214,-1.31996134398007,0.955713896876243,0.478117201427488,0.777586191100329,0.655369716778557,0.174914171427966,-0.288498877530604,-0.045316536149489,-0.606586193752411
-2,1,-1.47460150711424,-1.22614964523433,0.330140292484796,-0.389505498689912,0.138935265824808,-0.871869282167555,0.37472462048701,0.16418591189513,0.293560701443717,0.285608940220021
-3,1,-1.30213414336735,-1.82621262418812,-0.71381302228685,0.968769585007681,0.683148179202864,0.799125092538796,0.309479173526504,0.728052031003468,0.725495580994886,-0.676576302804248
-4,1,-1.73938632269334,-1.58349866505488,-0.067783417095816,0.229988549891323,0.582427598044647,0.261947149184825,-0.31573435079735,0.61178122144268,-0.187058216967238,0.871764347690334
-5,1,-1.56660896632398,-1.05861814902183,-0.212800982302764,0.915459776146607,-0.263465552591813,-0.666126495988014,-0.195028996490007,-0.237170057680116,-0.933358858596883,-0.19726273171241
-6,1,-1.55340876153895,-1.25209231285838,0.972585787901787,-0.872502887185675,0.729110910814452,0.265244787210866,-0.726900973624432,0.248626170107939,0.809004396880265,-0.278494064780479
-7,1,-1.54625325136447,-1.81238888450819,0.431645876221063,-0.595223273279383,0.953557069758112,-0.222915219121563,0.257670939076174,0.728927879098318,-0.579783055417687,-0.179960319428095
-8,1,-1.12735554524035,-1.69261497444728,0.827091199052693,0.208781482910759,-0.573317187361529,-0.00488758921352,0.341918716034638,-0.292901671356202,-0.34531700628784,0.766920547630073
-9,1,-1.35367834815884,-1.38141056472962,0.023965203621816,-0.805412569418484,0.806810139718495,-0.806576263127819,-0.39279977856172,-0.463819363774079,0.16095248005356,-0.553419747131608
-10,1,-1.17853151888796,-1.27705829298504,0.916015229666356,0.0275946645157,-0.829507007977635,-0.700063689327201,-0.715601456588714,-0.439799165143527,-0.487241220494887,0.245279267056121
-11,1,-1.17547049766875,-1.05613281246665,0.404557718897757,0.175997361062361,0.227383730822478,0.774254652577977,-0.616080996917636,-0.948639007451084,0.954076433375225,-0.497102001172339
-12,1,-1.67277915033943,-1.86190239883588,0.906708844886064,-0.592737030373698,0.885229803890949,0.822069297241907,0.204367485562992,0.24610324883505,-0.079476866422163,-0.244006995583434
-13,1,-1.96326165438884,-1.31680458089693,-0.284570394188414,-0.198686061574238,-0.168897609541112,-0.92939259112691,0.265899059671913,-0.828727642584781,-0.427453097474168,-0.738368091608883
-14,1,-1.79497769808481,-1.13948217357082,0.011556817105957,0.499408314757229,0.253163424774478,0.017645446880421,0.401735167095264,-0.650287617298501,-0.262217482830833,-0.510102120130588
-15,1,-1.07957120536262,-1.93245955077991,-0.520161174281201,-0.392478459347396,0.147495839565868,0.780879606474075,-0.281044687799173,-0.148036908135786,-0.208810472224093,0.278961929718128
-16,1,-1.52555145037507,-1.72455209196736,0.562803219191695,0.323676061636996,0.490737136410372,-0.95476192699496,0.028603504036769,-0.246295219650507,-0.297736293597739,-0.632473830957653
-17,1,-1.29142309507315,-1.9506961526212,0.859669461054104,0.485772819254089,0.268883598825009,0.253553321699552,-0.045743087569395,0.66793403278249,0.308591963919947,0.771084301464027
-18,1,-1.23404787121001,-1.68173519287847,-0.118871100462413,0.159000937768132,0.2985428841756,-0.203829205332538,-0.637945695251352,0.658049690810909,0.949529589134008,-0.577812553880056
-19,1,-1.13513050029551,-1.3119036669604,0.74037411093045,0.558782660077505,-0.096052126354583,0.529119817360537,0.372190604770144,0.688656466253683,-0.819433165315505,-0.12814415930811
-20,1,-0.2,0.132,-10,-10,0.785237349732891,-0.387217730495401,-0.942409218899448,0.160806577297675,-0.723969983661972,-0.452650134415823
-21,1,1.1507658618081,1.7260505392724,-0.92290734251321,0.465751384219632,-0.81727500527083,-0.182472640926628,0.887526070620356,0.111592883978488,0.386435078880162,-0.440017211221272
-22,1,1.90389768224701,1.71880759316591,0.28033979546451,0.379365407838544,0.634843008192624,0.371753918780839,-0.611653305369863,0.732567927874185,0.85803611350317,-0.577973441708411
-23,1,1.77751976452381,1.28697050370578,0.222850898945077,-0.424057088828287,-0.27619426781836,0.616941667680694,-0.696779972923147,0.23612770730498,0.760705889780527,0.34004139732033
-24,1,1.65314327493874,1.16282810211312,-0.339501197382944,0.898529591365812,-0.881538228231582,0.090728826664301,-0.858395870780934,0.907272331515896,0.160531735619067,0.678911811768841
-25,1,1.30955180558204,1.36827755737648,-0.444576754578563,-0.78871174512572,0.246625773070183,-0.663474018818313,-0.446355552060464,0.750312773134634,-0.98959522970136,-0.150120109840706
-26,1,1.44924431171893,1.40328864581169,-0.388679577334402,-0.708193450791952,0.850310084800308,-0.767256338531612,0.370509317329194,0.387354921015751,0.533160321164986,0.149390212455131
-27,1,1.61362501391753,1.05448314414567,-0.359644680155969,0.682555404147728,-0.53195400936544,0.934101689590862,-0.73600284018832,-0.29960291454053,0.351125596355818,-0.187842884669279
-28,1,1.0243392720598,1.91059602121133,-0.249409157470717,-0.137070024386644,-0.707128531134104,0.944932049234295,0.431233366052987,0.449543990959262,0.912901932280027,0.77394610963827
-29,1,1.99444678594607,1.67204984441306,0.935302642480463,0.833720966523807,0.254167956717343,-0.007922712021391,-0.114776295376767,-0.276042896002242,-0.813098403125419,0.341922052212475
-30,1,1.40110330287926,1.109011516196,0.425716772255087,0.544174803732763,0.492863854358204,-0.047589791717166,-0.743840790633672,0.525289489060411,0.829611715544936,0.015193221845522
-31,1,1.94995090625471,1.05727410799969,0.12665368551441,0.469705238170149,0.874436248273008,-0.759571175468135,0.310230735619265,-0.80342084374485,-0.462431082486477,-0.407165886759129
-32,1,1.47264625042994,1.18913643279065,-0.731393018031039,0.461102224603009,-0.884528391885322,-0.419893944840798,0.647518214389067,0.688126326408485,0.754656371691603,0.116881923067816
-33,1,1.45901677207507,1.17024364037294,-0.527433424947131,-0.598774697808646,0.113197791601676,-0.50528865259863,0.117572114288939,0.590400320594326,-0.155159386769126,0.354827977413197
-34,1,1.32042744243041,1.19801952930384,-0.818626622405165,-0.029008564510599,0.952315968378468,0.817495784213924,0.182224554845043,-0.01377304364653,-0.26273195293588,-0.859530562808673
-35,1,1.88138237976289,1.03670081839679,0.305218688016626,-0.060885647660027,-0.387903446605514,-0.108064042735465,-0.962980405009682,-0.424289604203511,-0.253442293077285,0.309637368411297
-36,1,1.9986688782461,1.36909257128618,0.54312844740039,0.591372473040837,-0.835367086693457,0.198315253422507,-0.181434739783802,0.636429105754948,0.420628646992331,0.990122364664621
-37,1,1.50455818499044,1.19094974349673,-0.653263607332762,0.359359450868376,0.30131719114182,0.649581794356589,0.942268955633086,0.884659894489377,-0.473171239344398,0.039635066570717
-38,1,1.00833361547154,1.98150630000827,-0.812352457176761,0.219766101590983,-0.65021067790289,0.423621690291556,-0.58865099275791,0.061487886019891,-0.237737474016087,0.641284347380825
-39,1,1.60179185724619,1.12508599627141,-0.829819386940741,-0.345104687573802,0.485166070545119,-0.258839727448056,-0.920615208326881,0.275498215871427,-0.629075534110342,-0.642527887960687
-40,0,0.2,0.58,10,-10,0.041519856511361,0.23303461629095,-0.497233246191187,-0.023544587617095,-0.418540837770003,-0.550233932792512
-41,0,-1.09147574370355,1.70418701701285,-0.480316474702795,-0.753784710340632,-0.613234235616998,0.167955573662474,0.455636631315042,-0.380227635953206,0.48021383007369,-0.453674929885108
-42,0,-1.9425392252915,1.59311394144654,0.310098050913387,-0.835007082906627,0.407580140850853,0.556924247596553,-0.388616604639346,0.60215104751412,-0.984322198098753,-0.996332888983337
-43,0,-1.40302421044915,1.05041379743038,-0.898456453446964,-0.797029924245349,0.47491891024478,0.480193220538417,-0.750856163558686,-0.043960372032018,-0.242651391805662,-0.109239061054006
-44,0,-1.45810616907354,1.08468326497063,0.571329522934018,-0.677379826379623,0.098396984688832,-0.961599170104035,-0.753922591922157,0.361435891257559,-0.638030455493982,0.404349024843908
-45,0,-1.60421432901638,1.57730973247518,0.402433205555268,-0.06801187450078,-0.373089661152032,0.23970878487105,0.416451106643361,-0.50599166271433,-0.88669034806741,0.30364523616443
-46,0,-1.54868661350102,1.32883184576708,-0.174925245509766,0.050330391451536,-0.388676795741932,-0.72307604978553,0.646076107724964,-0.00105589909588,0.491928720743773,-0.647995101369186
-47,0,-1.8920756792535,1.76576258461153,0.289410761217525,0.48566510896872,0.338684773860801,0.374319581439648,-0.105479014627167,0.004520417892418,0.222862261975939,0.23538363683764
-48,0,-1.51442922313653,1.69840409315155,-0.199044563017407,-0.855804112781183,0.947572000564906,0.654939562810152,0.802084131057488,0.010033694468233,0.449766366250574,0.119974134618433
-49,0,-1.33469144171841,1.80124846893287,-0.346729568989951,-0.614828863660672,0.578150372001527,-0.697356489908387,-0.272496177427547,-0.326679505363186,0.403185907494623,0.659834986972357
-50,0,-1.39216086591683,1.96030807097305,-0.470918775433235,-0.165965173767875,-0.118373275802139,-0.804671542299309,-0.273096283874977,0.964676290034708,-0.240786016285174,0.781092750718218
-51,0,-1.10818774496527,1.1321805921252,-0.546315077724052,-0.263397808061131,0.073416112160648,-0.561584513583351,-0.003812545601594,-0.067901708659798,-0.797337624892413,-0.502494288676279
-52,0,-1.12733422378345,1.22290093390908,-0.239618151680487,0.281282683112064,-0.122253338243164,-0.416340912422471,-0.302944823763312,0.950697167857575,0.084774348269755,0.245643637478141
-53,0,-1.54504585318447,1.46465556555859,0.672465261607398,0.963677112876299,-0.732866944741014,0.269879007022312,-0.734121763984793,-0.18475004364869,0.494783604230457,-0.563469688908407
-54,0,-1.69728989778812,1.93427938064611,0.916674666213795,0.744100669613517,-0.536325680879341,0.745349313896706,-0.608494971121628,-0.036147807131094,0.730097211981708,-0.986020687921255
-55,0,-1.46716685688328,1.91950733639359,-0.040012375137611,0.248257524389148,-0.795936343325832,-0.755933622220192,0.664943062567423,-0.560825069941966,-0.987328335835364,0.00918182383389
-56,0,-1.5078580841421,1.11065681931139,-0.75655271526814,-0.433965979475654,-0.925820800763387,0.621204380538264,-0.725355435802351,-0.087195045278291,0.500040007799584,-0.351024070867477
-57,0,-1.79333947783294,1.64615611570236,0.593670368718185,0.74125415566331,-0.835056311664806,-0.128283340965351,0.795769070113583,0.338062872249377,0.961610282279288,-0.519755961049099
-58,0,-1.68562328688306,1.79136645116331,-0.917792004629201,-0.224807652067029,0.751172530954049,0.744925497765574,0.054821387540181,-0.268146122719043,-0.373795753322288,-0.023619900695578
-59,0,-1.70325116873164,1.56173898398367,0.937331444475048,-0.189146596668676,0.726757528139029,0.571196020214809,0.150478496659529,0.716370904753891,0.645947936391794,-0.096512499841381
-60,0,-0.31,-0.164,-10,10,0.303748234076738,0.094684069184242,0.846651908762107,0.505710991097032,-0.664846620425076,-0.722934785670171
-61,0,1.51747503460744,-1.57976833969122,-0.313853456471656,-0.670641690437042,0.337481189036041,-0.695059667580877,0.382512664766648,-0.754051294565859,-0.540912893771664,-0.152736592481289
-62,0,1.36729416399966,-1.54942606995245,0.746279765035798,0.320667909398266,0.075712278316126,0.557089028326803,-0.314459962457274,-0.091179395352991,-0.712572618352738,-0.862523770264919
-63,0,1.87551859565403,-1.01245024447758,0.961634242304571,0.99902517180177,0.428576472620752,0.790254229843056,-0.162732148014183,0.057108415575022,0.099625367521191,-0.41779573726667
-64,0,1.8407338686869,-1.58680706359952,-0.293737994923213,-0.654603713924763,-0.15830470325221,-0.4506171823593,0.106217286056366,-0.250165079508456,-0.598894350859836,-0.860382476004742
-65,0,1.47999238640346,-1.68861965445586,0.661374709635725,0.335413696048534,0.295408469126627,-0.340725080366546,0.611961227458239,0.53327702260923,-0.960254363897463,0.913251337834092
-66,0,1.0735581028252,-1.06052424530937,-0.790281335013236,0.372594655247821,-0.940075790261345,0.972106617215367,-0.246874887198155,-0.501544524013033,-0.134947611932188,0.130090806976322
-67,0,1.63769743034008,-1.64946099093265,-0.600590046972624,0.281621309709353,0.836244003088172,0.56250556179443,-0.244248244001593,0.274273110413607,0.988229164412892,-0.903492892429764
-68,0,1.9226795203725,-1.58810792001545,0.230397844467249,0.458000795025685,0.160534364807898,0.106760231103633,0.084376336290482,-0.410257096809632,-0.388975913032382,0.233684932760446
-69,0,1.42821810695172,-1.75832976379165,0.122894112900537,-0.193746425367835,0.602411133999453,-0.895694511099768,0.347280223444287,0.045175117581033,-0.232661771389541,-0.314648785155521
-70,0,1.42602875697361,-1.16082451050484,0.906027162216176,0.736418182225292,-0.041284854438203,0.308524126840497,0.369205540497406,0.333193031466162,0.98544497734097,-0.253876502721057
-71,0,1.73002019404142,-1.80947421953802,-0.677661468333469,0.07388223501889,-0.682147267310905,0.024126391992196,0.848946249678909,-0.516253994735439,0.202627425635043,-0.897477249843204
-72,0,1.11605808678586,-1.05622349137538,0.492431513300772,-0.737330353527688,0.594894327441348,0.805436037154752,-0.912716679245893,-0.390199322338262,-0.735805203184445,-0.05803264345169
-73,0,1.52878306779173,-1.52822073704896,-0.863821530585294,-0.987125905118183,-0.698190916645222,-0.17859271120364,-0.902497993400075,0.777448050547606,0.03349780154213,0.569802193246196
-74,0,1.69602091303769,-1.68791329506752,-0.919679036112179,-0.083795023015624,0.492078750634905,-0.102786002654994,0.168000984501864,-0.984910911120671,-0.901017886055053,0.639813560268343
-75,0,1.82292095427058,-1.79921516167805,0.107455937171145,-0.854711756750333,0.344969246269787,0.519092986129825,0.410230657805076,-0.91216461269154,0.033943611687528,-0.306643316979961
-76,0,1.15382245032495,-1.9125109596393,-0.80848616018294,-0.010443047871684,-0.706296790283886,0.822118261736111,0.163327430772402,0.252786291364115,-0.501338527911191,-0.28349201031843
-77,0,1.19521627831595,-1.4347201247938,-0.814416838367815,-0.02940231646999,-0.841428202408144,-0.004586605289201,-0.606434730541928,0.714277316437912,-0.44481897692423,-0.753698456302665
-78,0,1.99358961720643,-1.52499478281942,-0.877637461379848,0.414405535550407,-0.03365581494898,0.624692043559635,-0.832402658891314,-0.723028062732401,-0.867099034604054,-0.185632378061498
-79,0,1.6235192049324,-1.52045677356057,0.977828685636029,-0.57502380941392,-0.402617609462035,0.631967959251952,-0.426504420434097,0.480579460496328,0.686338078276468,-0.793812851707889
-80,1,-1.9061964810895,-1.28908450646839,10,10,0.101102136284509,-0.416199695149021,-0.494850987164782,-0.568698448483212,-0.184782382471875,-0.552230498856606
-81,1,-1.12334568706136,-1.43192728687949,-0.202671045004157,-0.138914163603925,-0.937156710796857,-0.116790109384378,-0.094237431941851,-0.896761118553971,-0.183423320636867,0.458624633065419
-82,1,-1.85938009020988,-1.2014277824818,-0.699402902052328,0.801606907908076,0.618074329335756,-0.172568708757076,-0.075693445304373,0.488815268086692,-0.612225386267585,0.515474858015819
-83,1,-1.44593059276162,-1.50738144143115,-0.146467066237161,0.773717178872341,0.498796984960351,-0.015862721592055,0.487162827649467,-0.002016922590367,-0.480395455657278,0.140660394856319
-84,1,-1.5068337349461,-1.39605748721966,0.935273336022611,-0.65840232577507,0.254028615496319,-0.207949363786322,0.494233964181716,0.342544015156094,-0.2790717466048,0.681766781920308
-85,1,1.29459521637362,1.25954745515179,-10,-10,-0.772948300582061,-0.755591080857131,-0.795691897784493,0.140653835392209,-0.160483486922781,0.460920935704452
-86,1,1.04689401512909,1.48899924906156,-0.102806023076495,-0.232256721754397,0.982487312078063,0.220639487969972,0.466108251058299,-0.328239000603224,0.955688285869012,0.98401214247364
-87,1,1.58830474403604,1.70226055213414,-0.400909948872293,-0.887240029691788,-0.796366553971199,-0.189011341359002,-0.984264269832423,0.228539348323108,0.696045037642922,-0.734941166556072
-88,1,1.07001216284605,1.81845698640496,-0.906675421892372,0.643501800272306,0.2964442904515,-0.212339822521429,-0.624947347663644,-0.076505534185115,0.690006945874019,0.603178865697037
-89,1,1.47818853391931,1.1810797217516,-0.777878371782176,0.158700400185078,0.77008386941758,0.318201581494366,-0.577373286340777,0.207915408782256,0.169898207168944,-0.134718349741109
-90,0,-1.39792536337696,1.8903759983709,10,-10,-0.381543623044489,-0.150608604917312,0.940200935058958,-0.260126956593852,0.011178432296195,-0.552646188796202
-91,0,-1.34181919280501,1.37770384290606,-0.273896107346467,0.9218628887177,0.183329714125041,0.794995796775324,0.47034078624241,0.587159127993906,0.656065190534019,0.710378359435155
-92,0,-1.08535749655328,1.25684564483175,-0.093438684660175,0.867363731909897,0.501979335337794,0.929133531466716,0.853038546233495,0.231647371842096,-0.921363933789468,0.9955206665909
-93,0,-1.5078347061732,1.75537297346943,-0.353047628963401,0.686996459628496,0.12650715249212,-0.584157551233493,0.67801198459735,0.130184075673761,-0.541365882749818,0.804095414322346
-94,0,-1.67232665291775,1.91398842184753,-0.055989266428472,0.083972688856283,0.495406878960658,-0.531851511151842,-0.68298755038252,-0.762719341237422,0.044183568378214,0.569492860435106
-95,0,1.52196747373202,-1.81272431584475,-10,10,-0.592427348924565,-0.245215291809175,0.450286805609337,-0.61720080602177,-0.078323806376631,-0.138400199664094
-96,0,1.34277619089321,-1.04264614535854,-0.840523610880692,-0.579768061766314,0.207088065224924,-0.30689024242517,-0.707319832593209,0.067209487208095,-0.219041441615042,0.651618314592841
-97,0,1.72996670685819,-1.26148185356343,-0.071347258910479,-0.571647931880792,0.00248497405952,0.410346123251162,0.294254262248804,0.698018369247902,0.652553267893053,-0.960621219815728
-98,0,1.63679608599505,-1.40483117266873,0.133355343382705,0.785183623637213,0.106494106522641,0.457003384754942,-0.314470768070196,-0.05337112691883,0.86147345141363,-0.770167158107586
-99,0,1.22531932528574,-1.39832123108255,0.751819680541469,0.843477659731268,0.880714646905367,0.20665859661747,-0.85053999542226,0.770244035843202,-0.790477429383416,-0.219373260405667
+index,prop,task,A,B,C,D,E,F,G,H,I,J
+0,0,Task_1,-0.815427620522422,-0.549653782587197,10,10,-0.492825179098274,0.173025977694162,0.598942935224295,-0.298754475196825,-0.581254909010269,-0.110656494210556
+1,0,Task_1,-0.69992853524861,-0.229332112274544,0.955713896876243,0.478117201427488,0.777586191100329,0.655369716778557,0.174914171427966,-0.288498877530604,-0.045316536149489,-0.606586193752411
+2,0,Task_1,-0.368076290946109,-0.969405272021421,0.330140292484796,-0.389505498689912,0.138935265824808,-0.871869282167555,0.37472462048701,0.16418591189513,0.293560701443717,0.285608940220021
+3,0,Task_1,-0.573491802821712,-0.815581340289383,-0.71381302228685,0.968769585007681,0.683148179202864,0.799125092538796,0.309479173526504,0.728052031003468,0.725495580994886,-0.676576302804248
+4,0,Task_1,-0.676358754897705,-0.548282221764748,-0.067783417095816,0.229988549891323,0.582427598044647,0.261947149184825,-0.31573435079735,0.61178122144268,-0.187058216967238,0.871764347690334
+5,0,Task_1,-0.757605167585471,-0.808298008590424,-0.212800982302764,0.915459776146607,-0.263465552591813,-0.666126495988014,-0.195028996490007,-0.237170057680116,-0.933358858596883,-0.19726273171241
+6,0,Task_1,-0.886003489547442,-0.509472491633194,0.972585787901787,-0.872502887185675,0.729110910814452,0.265244787210866,-0.726900973624432,0.248626170107939,0.809004396880265,-0.278494064780479
+7,0,Task_1,-1.02924299329947,-0.618392550297407,0.431645876221063,-0.595223273279383,0.953557069758112,-0.222915219121563,0.257670939076174,0.728927879098318,-0.579783055417687,-0.179960319428095
+8,0,Task_1,-0.502456609281931,-0.196195032500234,0.827091199052693,0.208781482910759,-0.573317187361529,-0.00488758921352,0.341918716034638,-0.292901671356202,-0.34531700628784,0.766920547630073
+9,0,Task_1,-0.517308486454666,-0.58057651993072,0.023965203621816,-0.805412569418484,0.806810139718495,-0.806576263127819,-0.39279977856172,-0.463819363774079,0.16095248005356,-0.553419747131608
+10,0,Task_1,-0.634057125095051,-1.01875520243377,0.916015229666356,0.0275946645157,-0.829507007977635,-0.700063689327201,-0.715601456588714,-0.439799165143527,-0.487241220494887,0.245279267056121
+11,0,Task_1,-0.577778256396397,-0.425744718740636,0.404557718897757,0.175997361062361,0.227383730822478,0.774254652577977,-0.616080996917636,-0.948639007451084,0.954076433375225,-0.497102001172339
+12,0,Task_1,-0.197376045004303,-0.404709510371676,0.906708844886064,-0.592737030373698,0.885229803890949,0.822069297241907,0.204367485562992,0.24610324883505,-0.079476866422163,-0.244006995583434
+13,0,Task_1,-0.766513992210544,-1.03945619108008,-0.284570394188414,-0.198686061574238,-0.168897609541112,-0.92939259112691,0.265899059671913,-0.828727642584781,-0.427453097474168,-0.738368091608883
+14,0,Task_1,-0.301129557074769,-0.466366201816861,0.011556817105957,0.499408314757229,0.253163424774478,0.017645446880421,0.401735167095264,-0.650287617298501,-0.262217482830833,-0.510102120130588
+15,0,Task_1,-0.372562647160274,-0.805363289239018,-0.520161174281201,-0.392478459347396,0.147495839565868,0.780879606474075,-0.281044687799173,-0.148036908135786,-0.208810472224093,0.278961929718128
+16,0,Task_1,-0.573276127254349,-0.843760519080871,0.562803219191695,0.323676061636996,0.490737136410372,-0.95476192699496,0.028603504036769,-0.246295219650507,-0.297736293597739,-0.632473830957653
+17,0,Task_1,-1.08177643161138,-1.08748936331147,0.859669461054104,0.485772819254089,0.268883598825009,0.253553321699552,-0.045743087569395,0.66793403278249,0.308591963919947,0.771084301464027
+18,0,Task_1,-0.121943068431321,-0.937658541363365,-0.118871100462413,0.159000937768132,0.2985428841756,-0.203829205332538,-0.637945695251352,0.658049690810909,0.949529589134008,-0.577812553880056
+19,0,Task_1,-1.06747884637477,-0.842449899007254,0.74037411093045,0.558782660077505,-0.096052126354583,0.529119817360537,0.372190604770144,0.688656466253683,-0.819433165315505,-0.12814415930811
+20,0,Task_1,-0.376355273108791,-0.908946282731397,-10,-10,0.785237349732891,-0.387217730495401,-0.942409218899448,0.160806577297675,-0.723969983661972,-0.452650134415823
+21,0,Task_1,-0.846685755905842,-0.209448772979162,-0.92290734251321,0.465751384219632,-0.81727500527083,-0.182472640926628,0.887526070620356,0.111592883978488,0.386435078880162,-0.440017211221272
+22,0,Task_1,-0.837187625737658,-0.851876882999398,0.28033979546451,0.379365407838544,0.634843008192624,0.371753918780839,-0.611653305369863,0.732567927874185,0.85803611350317,-0.577973441708411
+23,0,Task_1,-0.175842102272502,-0.488461994046914,0.222850898945077,-0.424057088828287,-0.27619426781836,0.616941667680694,-0.696779972923147,0.23612770730498,0.760705889780527,0.34004139732033
+24,0,Task_1,-0.857809192768388,-0.265302164309273,-0.339501197382944,0.898529591365812,-0.881538228231582,0.090728826664301,-0.858395870780934,0.907272331515896,0.160531735619067,0.678911811768841
+25,1,Task_1,-0.585614473203943,0.551068965982618,-0.444576754578563,-0.78871174512572,0.246625773070183,-0.663474018818313,-0.446355552060464,0.750312773134634,-0.98959522970136,-0.150120109840706
+26,1,Task_1,-0.23908209161902,0.897577090052027,-0.388679577334402,-0.708193450791952,0.850310084800308,-0.767256338531612,0.370509317329194,0.387354921015751,0.533160321164986,0.149390212455131
+27,1,Task_1,-0.830137779545391,0.125511796448773,-0.359644680155969,0.682555404147728,-0.53195400936544,0.934101689590862,-0.73600284018832,-0.29960291454053,0.351125596355818,-0.187842884669279
+28,1,Task_1,-0.216184782523012,0.211978905733634,-0.249409157470717,-0.137070024386644,-0.707128531134104,0.944932049234295,0.431233366052987,0.449543990959262,0.912901932280027,0.77394610963827
+29,1,Task_1,-0.282632767106854,0.264519450298051,0.935302642480463,0.833720966523807,0.254167956717343,-0.007922712021391,-0.114776295376767,-0.276042896002242,-0.813098403125419,0.341922052212475
+30,1,Task_1,-0.575451831562744,0.581795291243757,0.425716772255087,0.544174803732763,0.492863854358204,-0.047589791717166,-0.743840790633672,0.525289489060411,0.829611715544936,0.015193221845522
+31,1,Task_1,-0.758220206206948,0.613617353581097,0.12665368551441,0.469705238170149,0.874436248273008,-0.759571175468135,0.310230735619265,-0.80342084374485,-0.462431082486477,-0.407165886759129
+32,1,Task_1,-0.591882107713968,0.146363847316077,-0.731393018031039,0.461102224603009,-0.884528391885322,-0.419893944840798,0.647518214389067,0.688126326408485,0.754656371691603,0.116881923067816
+33,1,Task_1,-0.715219788915433,1.08461646785062,-0.527433424947131,-0.598774697808646,0.113197791601676,-0.50528865259863,0.117572114288939,0.590400320594326,-0.155159386769126,0.354827977413197
+34,1,Task_1,-0.796303564991711,0.501349128362771,-0.818626622405165,-0.029008564510599,0.952315968378468,0.817495784213924,0.182224554845043,-0.01377304364653,-0.26273195293588,-0.859530562808673
+35,1,Task_1,-0.195432956299915,0.260304213033586,0.305218688016626,-0.060885647660027,-0.387903446605514,-0.108064042735465,-0.962980405009682,-0.424289604203511,-0.253442293077285,0.309637368411297
+36,1,Task_1,-1.03051702404215,0.699873963424479,0.54312844740039,0.591372473040837,-0.835367086693457,0.198315253422507,-0.181434739783802,0.636429105754948,0.420628646992331,0.990122364664621
+37,1,Task_1,-0.642471040462047,0.350035500680932,-0.653263607332762,0.359359450868376,0.30131719114182,0.649581794356589,0.942268955633086,0.884659894489377,-0.473171239344398,0.039635066570717
+38,1,Task_1,-0.716592168562474,0.251176558055396,-0.812352457176761,0.219766101590983,-0.65021067790289,0.423621690291556,-0.58865099275791,0.061487886019891,-0.237737474016087,0.641284347380825
+39,1,Task_1,-0.627915139540302,0.522644163585557,-0.829819386940741,-0.345104687573802,0.485166070545119,-0.258839727448056,-0.920615208326881,0.275498215871427,-0.629075534110342,-0.642527887960687
+40,1,Task_1,-0.156663423633232,0.304600082490645,10,-10,0.041519856511361,0.23303461629095,-0.497233246191187,-0.023544587617095,-0.418540837770003,-0.550233932792512
+41,1,Task_2,-0.403555861666686,0.807008471177762,-0.480316474702795,-0.753784710340632,-0.613234235616998,0.167955573662474,0.455636631315042,-0.380227635953206,0.48021383007369,-0.453674929885108
+42,1,Task_2,-0.475033062023491,0.231061734571007,0.310098050913387,-0.835007082906627,0.407580140850853,0.556924247596553,-0.388616604639346,0.60215104751412,-0.984322198098753,-0.996332888983337
+43,1,Task_2,-0.286740191813169,0.871522465291953,-0.898456453446964,-0.797029924245349,0.47491891024478,0.480193220538417,-0.750856163558686,-0.043960372032018,-0.242651391805662,-0.109239061054006
+44,1,Task_2,-0.191548445530409,0.578646221732672,0.571329522934018,-0.677379826379623,0.098396984688832,-0.961599170104035,-0.753922591922157,0.361435891257559,-0.638030455493982,0.404349024843908
+45,1,Task_2,-0.962818439897195,0.190811801399786,0.402433205555268,-0.06801187450078,-0.373089661152032,0.23970878487105,0.416451106643361,-0.50599166271433,-0.88669034806741,0.30364523616443
+46,1,Task_2,-0.885419462203051,0.155312944156919,-0.174925245509766,0.050330391451536,-0.388676795741932,-0.72307604978553,0.646076107724964,-0.00105589909588,0.491928720743773,-0.647995101369186
+47,1,Task_2,-0.634581804798124,0.149519641506344,0.289410761217525,0.48566510896872,0.338684773860801,0.374319581439648,-0.105479014627167,0.004520417892418,0.222862261975939,0.23538363683764
+48,1,Task_2,-0.380429991155736,0.423554740615867,-0.199044563017407,-0.855804112781183,0.947572000564906,0.654939562810152,0.802084131057488,0.010033694468233,0.449766366250574,0.119974134618433
+49,1,Task_2,-0.294345606906633,0.954791580849514,-0.346729568989951,-0.614828863660672,0.578150372001527,-0.697356489908387,-0.272496177427547,-0.326679505363186,0.403185907494623,0.659834986972357
+50,2,Task_1,0.164414245529879,0.270409021456753,-0.470918775433235,-0.165965173767875,-0.118373275802139,-0.804671542299309,-0.273096283874977,0.964676290034708,-0.240786016285174,0.781092750718218
+51,2,Task_1,0.962558187516297,0.353448095106742,-0.546315077724052,-0.263397808061131,0.073416112160648,-0.561584513583351,-0.003812545601594,-0.067901708659798,-0.797337624892413,-0.502494288676279
+52,2,Task_1,0.328218018271649,0.30124689351081,-0.239618151680487,0.281282683112064,-0.122253338243164,-0.416340912422471,-0.302944823763312,0.950697167857575,0.084774348269755,0.245643637478141
+53,2,Task_1,0.157711895115317,0.944426984942688,0.672465261607398,0.963677112876299,-0.732866944741014,0.269879007022312,-0.734121763984793,-0.18475004364869,0.494783604230457,-0.563469688908407
+54,2,Task_1,1.05838672002069,0.573716871491595,0.916674666213795,0.744100669613517,-0.536325680879341,0.745349313896706,-0.608494971121628,-0.036147807131094,0.730097211981708,-0.986020687921255
+55,2,Task_1,0.110836847139862,0.585126999320639,-0.040012375137611,0.248257524389148,-0.795936343325832,-0.755933622220192,0.664943062567423,-0.560825069941966,-0.987328335835364,0.00918182383389
+56,2,Task_1,1.06747271856711,1.07364864476858,-0.75655271526814,-0.433965979475654,-0.925820800763387,0.621204380538264,-0.725355435802351,-0.087195045278291,0.500040007799584,-0.351024070867477
+57,2,Task_1,0.989939601503884,0.247705435387067,0.593670368718185,0.74125415566331,-0.835056311664806,-0.128283340965351,0.795769070113583,0.338062872249377,0.961610282279288,-0.519755961049099
+58,2,Task_1,0.970023710841513,1.01758529653736,-0.917792004629201,-0.224807652067029,0.751172530954049,0.744925497765574,0.054821387540181,-0.268146122719043,-0.373795753322288,-0.023619900695578
+59,2,Task_1,1.04979809782509,0.825205513076737,0.937331444475048,-0.189146596668676,0.726757528139029,0.571196020214809,0.150478496659529,0.716370904753891,0.645947936391794,-0.096512499841381
+60,2,Task_1,0.606201322157722,0.429911059767652,-10,10,0.303748234076738,0.094684069184242,0.846651908762107,0.505710991097032,-0.664846620425076,-0.722934785670171
+61,2,Task_1,0.509318826024622,0.139403752494424,-0.313853456471656,-0.670641690437042,0.337481189036041,-0.695059667580877,0.382512664766648,-0.754051294565859,-0.540912893771664,-0.152736592481289
+62,2,Task_1,0.416414890641425,1.04597850573715,0.746279765035798,0.320667909398266,0.075712278316126,0.557089028326803,-0.314459962457274,-0.091179395352991,-0.712572618352738,-0.862523770264919
+63,2,Task_1,0.236961042862613,0.461540684896611,0.961634242304571,0.99902517180177,0.428576472620752,0.790254229843056,-0.162732148014183,0.057108415575022,0.099625367521191,-0.41779573726667
+64,2,Task_1,0.325246248634948,0.721692503249982,-0.293737994923213,-0.654603713924763,-0.15830470325221,-0.4506171823593,0.106217286056366,-0.250165079508456,-0.598894350859836,-0.860382476004742
+65,2,Task_1,0.258917730833821,0.68064431493967,0.661374709635725,0.335413696048534,0.295408469126627,-0.340725080366546,0.611961227458239,0.53327702260923,-0.960254363897463,0.913251337834092
+66,2,Task_2,1.0502123912678,1.0175241545193,-0.790281335013236,0.372594655247821,-0.940075790261345,0.972106617215367,-0.246874887198155,-0.501544524013033,-0.134947611932188,0.130090806976322
+67,2,Task_2,0.653819704386349,0.899775279158189,-0.600590046972624,0.281621309709353,0.836244003088172,0.56250556179443,-0.244248244001593,0.274273110413607,0.988229164412892,-0.903492892429764
+68,2,Task_2,0.29937357944438,1.01665644266054,0.230397844467249,0.458000795025685,0.160534364807898,0.106760231103633,0.084376336290482,-0.410257096809632,-0.388975913032382,0.233684932760446
+69,2,Task_2,0.631194229943451,0.952468985425419,0.122894112900537,-0.193746425367835,0.602411133999453,-0.895694511099768,0.347280223444287,0.045175117581033,-0.232661771389541,-0.314648785155521
+70,2,Task_2,1.08713454227837,0.656075322649452,0.906027162216176,0.736418182225292,-0.041284854438203,0.308524126840497,0.369205540497406,0.333193031466162,0.98544497734097,-0.253876502721057
+71,2,Task_2,0.139844193498669,0.408310759532142,-0.677661468333469,0.07388223501889,-0.682147267310905,0.024126391992196,0.848946249678909,-0.516253994735439,0.202627425635043,-0.897477249843204
+72,2,Task_2,0.155190443948636,0.269264020133562,0.492431513300772,-0.737330353527688,0.594894327441348,0.805436037154752,-0.912716679245893,-0.390199322338262,-0.735805203184445,-0.05803264345169
+73,2,Task_2,0.71862355058755,0.784351472902302,-0.863821530585294,-0.987125905118183,-0.698190916645222,-0.17859271120364,-0.902497993400075,0.777448050547606,0.03349780154213,0.569802193246196
+74,2,Task_2,0.932100491693559,0.673631604373795,-0.919679036112179,-0.083795023015624,0.492078750634905,-0.102786002654994,0.168000984501864,-0.984910911120671,-0.901017886055053,0.639813560268343
+75,3,Task_1,0.103583721726665,-0.373304248094501,0.107455937171145,-0.854711756750333,0.344969246269787,0.519092986129825,0.410230657805076,-0.91216461269154,0.033943611687528,-0.306643316979961
+76,3,Task_1,0.698578699086034,-0.805397267250048,-0.80848616018294,-0.010443047871684,-0.706296790283886,0.822118261736111,0.163327430772402,0.252786291364115,-0.501338527911191,-0.28349201031843
+77,3,Task_1,0.236887498524042,-0.155242051697718,-0.814416838367815,-0.02940231646999,-0.841428202408144,-0.004586605289201,-0.606434730541928,0.714277316437912,-0.44481897692423,-0.753698456302665
+78,3,Task_1,0.178535729762585,-0.178631521167564,-0.877637461379848,0.414405535550407,-0.03365581494898,0.624692043559635,-0.832402658891314,-0.723028062732401,-0.867099034604054,-0.185632378061498
+79,3,Task_1,0.135466013133272,-1.08706802405113,0.977828685636029,-0.57502380941392,-0.402617609462035,0.631967959251952,-0.426504420434097,0.480579460496328,0.686338078276468,-0.793812851707889
+80,3,Task_1,1.06511564653917,-0.529772758901416,10,10,0.101102136284509,-0.416199695149021,-0.494850987164782,-0.568698448483212,-0.184782382471875,-0.552230498856606
+81,3,Task_1,1.065535288073,-0.706304720574752,-0.202671045004157,-0.138914163603925,-0.937156710796857,-0.116790109384378,-0.094237431941851,-0.896761118553971,-0.183423320636867,0.458624633065419
+82,3,Task_1,0.896291258595646,-0.255636497676642,-0.699402902052328,0.801606907908076,0.618074329335756,-0.172568708757076,-0.075693445304373,0.488815268086692,-0.612225386267585,0.515474858015819
+83,3,Task_1,0.524657824983779,-0.653380201807584,-0.146467066237161,0.773717178872341,0.498796984960351,-0.015862721592055,0.487162827649467,-0.002016922590367,-0.480395455657278,0.140660394856319
+84,3,Task_1,0.131331250236976,-0.217693885103831,0.935273336022611,-0.65840232577507,0.254028615496319,-0.207949363786322,0.494233964181716,0.342544015156094,-0.2790717466048,0.681766781920308
+85,3,Task_1,0.247418413965845,-0.55249563814082,-10,-10,-0.772948300582061,-0.755591080857131,-0.795691897784493,0.140653835392209,-0.160483486922781,0.460920935704452
+86,3,Task_1,0.877892966008508,-0.600861427554399,-0.102806023076495,-0.232256721754397,0.982487312078063,0.220639487969972,0.466108251058299,-0.328239000603224,0.955688285869012,0.98401214247364
+87,3,Task_1,0.353355849981167,-0.384127703150446,-0.400909948872293,-0.887240029691788,-0.796366553971199,-0.189011341359002,-0.984264269832423,0.228539348323108,0.696045037642922,-0.734941166556072
+88,3,Task_1,0.278857275841181,-0.845560468506653,-0.906675421892372,0.643501800272306,0.2964442904515,-0.212339822521429,-0.624947347663644,-0.076505534185115,0.690006945874019,0.603178865697037
+89,3,Task_1,0.36866502316092,-0.854728078950075,-0.777878371782176,0.158700400185078,0.77008386941758,0.318201581494366,-0.577373286340777,0.207915408782256,0.169898207168944,-0.134718349741109
+90,3,Task_1,0.493452674287737,-0.910519988093965,10,-10,-0.381543623044489,-0.150608604917312,0.940200935058958,-0.260126956593852,0.011178432296195,-0.552646188796202
+91,3,Task_2,0.908818256135453,-0.80987547007129,-0.273896107346467,0.9218628887177,0.183329714125041,0.794995796775324,0.47034078624241,0.587159127993906,0.656065190534019,0.710378359435155
+92,3,Task_2,0.996336489766102,-0.975493823251068,-0.093438684660175,0.867363731909897,0.501979335337794,0.929133531466716,0.853038546233495,0.231647371842096,-0.921363933789468,0.9955206665909
+93,3,Task_2,0.657942481588168,-0.245177885637302,-0.353047628963401,0.686996459628496,0.12650715249212,-0.584157551233493,0.67801198459735,0.130184075673761,-0.541365882749818,0.804095414322346
+94,3,Task_2,0.328489621282775,-1.08052040344332,-0.055989266428472,0.083972688856283,0.495406878960658,-0.531851511151842,-0.68298755038252,-0.762719341237422,0.044183568378214,0.569492860435106
+95,3,Task_2,0.253790826276124,-0.935268396370178,-10,10,-0.592427348924565,-0.245215291809175,0.450286805609337,-0.61720080602177,-0.078323806376631,-0.138400199664094
+96,3,Task_2,0.226661731310054,-0.206651604608129,-0.840523610880692,-0.579768061766314,0.207088065224924,-0.30689024242517,-0.707319832593209,0.067209487208095,-0.219041441615042,0.651618314592841
+97,3,Task_2,0.730538042566787,-0.815537451517852,-0.071347258910479,-0.571647931880792,0.00248497405952,0.410346123251162,0.294254262248804,0.698018369247902,0.652553267893053,-0.960621219815728
+98,3,Task_2,1.05315118537755,-0.90842251928343,0.133355343382705,0.785183623637213,0.106494106522641,0.457003384754942,-0.314470768070196,-0.05337112691883,0.86147345141363,-0.770167158107586
+99,3,Task_2,0.453505426891074,-0.509861391994549,0.751819680541469,0.843477659731268,0.880714646905367,0.20665859661747,-0.85053999542226,0.770244035843202,-0.790477429383416,-0.219373260405667
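For reference, the regenerated `data.csv` above now carries an explicit `index` column, a four-class `prop` column (labels 0-3), and a `task` column that splits the 100 rows between `Task_1` and `Task_2`; the planted ±10 extremes in columns `C` and `D` are kept. A minimal sketch for inspecting the new layout, assuming `pandas` is available (it is used here only for illustration, not by the test itself):

```python
# Sketch only: summarize the regenerated test data per task and class.
import pandas as pd

df = pd.read_csv("tests/exec_test/classification/data.csv", index_col="index")

# "prop" now holds four class labels (0-3) and "task" assigns each row
# to Task_1 or Task_2 for the multi-task classification test.
print(df.groupby(["task", "prop"]).size())
```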
diff --git a/tests/exec_test/classification/sisso.json b/tests/exec_test/classification/sisso.json
index 196884b5f3c963281aa753c06501c6fb2bcd8f39..9613bef7bdf1e740abe91993da1d83a91a50209d 100644
--- a/tests/exec_test/classification/sisso.json
+++ b/tests/exec_test/classification/sisso.json
@@ -1,15 +1,16 @@
 {
     "desc_dim": 2,
-    "n_sis_select": 5,
+    "n_sis_select": 10,
     "max_rung": 1,
-    "n_residual": 1,
+    "n_residual": 5,
     "data_file": "data.csv",
     "data_file_relatice_to_json": true,
     "property_key": "prop",
+    "task_key": "task",
     "leave_out_frac": 0.2,
-    "n_models_store": 1,
+    "leave_out_inds": [ 2, 3, 4, 6, 21, 23, 30, 38, 39, 52, 53, 61, 76, 82, 83, 45, 47, 48, 49, 66 ],
+    "n_models_store": 5,
     "calc_type": "classification",
-    "leave_out_inds": [80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95 ,96 ,97, 98 , 99],
     "opset": ["add", "sub", "abs_diff", "mult", "div", "inv", "abs", "exp", "log", "sin", "cos", "sq", "cb", "six_pow", "sqrt", "cbrt", "neg_exp"],
     "param_opset" : [],
     "fix_intercept": false
diff --git a/tests/exec_test/classification/test b/tests/exec_test/classification/test
new file mode 100644
index 0000000000000000000000000000000000000000..8a6704af53aa403acc12f2e239bbd0fcedb003c9
--- /dev/null
+++ b/tests/exec_test/classification/test
@@ -0,0 +1,6473 @@
+time input_parsing: 0.001436 s
+time to generate feat space: 0.00326109 s
+Projection time: 0.000658989 s
+Time to get best features on rank : 0.000178099 s
+Complete final combination/selection from all ranks: 0.000165939 s
+Time for SIS: 0.001127 s
+nover:: 13	19	13	13	13
+nover:: 29	19	13	13	13
+nover:: 41	19	13	13	13
+nover:: 53	19	13	13	13
+53
+nover:: 0	0	5	8	9
+class1	-2.10512	1
+class1	-3.48748	1
+class1	-5.22061	1
+class1	-1.12941	1
+class1	-2.47797	1
+nover:: 0	0	5	8	9
+class2	1.39155	1
+class2	1.07285	1
+class2	6.4437	1
+class2	7.15082	1
+class2	0.919849	1
+class2	1.5843	1
+class2	3.34031	1
+class2	1.52947	1
+nover:: 9	0	5	8	9
+class3	1.10033	1
+class3	1.00368	1
+class3	1.51989	1
+class3	3.04424	1
+class3	3.94025	1
+class3	4.41186	1
+class3	1.36885	1
+class3	0.949531	1
+class3	2.20505	1
+nover:: 14	0	5	8	9
+67
+9	67	31	24.5452
+nover:: 13	19	13	13	13
+nover:: 29	19	13	13	13
+nover:: 41	19	13	13	13
+nover:: 53	19	13	13	13
+53
+nover:: 0	0	5	8	9
+class1	-0.780263	1
+class1	-0.659421	1
+class1	-0.576447	1
+class1	-0.960247	1
+class1	-0.738983	1
+nover:: 0	0	5	8	9
+class2	0.895709	1
+class2	0.976834	1
+class2	0.537388	1
+class2	0.519057	1
+class2	1.02824	1
+class2	0.857803	1
+class2	0.668967	1
+class2	0.867933	1
+nover:: 9	0	5	8	9
+class3	0.968632	1
+class3	0.998777	1
+class3	0.869753	1
+class3	0.689986	1
+class3	0.633129	1
+class3	0.609714	1
+class3	0.900632	1
+class3	1.01741	1
+class3	0.768294	1
+nover:: 14	0	5	8	9
+67
+8	67	31	24.5845
+nover:: 12	19	13	13	13
+nover:: 24	19	13	13	13
+nover:: 42	19	13	13	13
+nover:: 53	19	13	13	13
+53
+nover:: 0	0	5	8	9
+class1	-0.486412	1
+class1	-3.03942	1
+class1	-3.02089	1
+class1	-0.175412	1
+class1	-1.99974	1
+nover:: 7	0	5	8	9
+class2	1.09146	1
+class2	0.722703	1
+class2	1.73506	1
+class2	2.91975	1
+class2	0.60349	1
+class2	1.509	1
+class2	3.39595	1
+class2	1.37618	1
+nover:: 7	0	5	8	9
+class3	-0.89113	1
+class3	-0.979081	1
+class3	-0.372643	1
+class3	-3.28936	1
+class3	-3.68519	1
+class3	-0.911718	1
+class3	-1.11635	1
+class3	-0.862576	1
+class3	-1.12427	1
+nover:: 11	0	5	8	9
+64
+7	64	27	0.0565852
+nover:: 12	19	13	13	13
+nover:: 24	19	13	13	13
+nover:: 42	19	13	13	13
+nover:: 53	19	13	13	13
+53
+nover:: 0	0	5	8	9
+class1	-2.05587	1
+class1	-0.329011	1
+class1	-0.331029	1
+class1	-5.70087	1
+class1	-0.500064	1
+nover:: 7	0	5	8	9
+class2	0.916201	1
+class2	1.38369	1
+class2	0.57635	1
+class2	0.342495	1
+class2	1.65703	1
+class2	0.662693	1
+class2	0.294469	1
+class2	0.726648	1
+nover:: 7	0	5	8	9
+class3	-1.12217	1
+class3	-1.02137	1
+class3	-2.68353	1
+class3	-0.304011	1
+class3	-0.271356	1
+class3	-1.09683	1
+class3	-0.895775	1
+class3	-1.15932	1
+class3	-0.889468	1
+nover:: 11	0	5	8	9
+64
+6	64	28	0.0296788
+nover:: 22	19	13	13	13
+nover:: 26	19	13	13	13
+nover:: 53	19	13	13	13
+nover:: 60	19	13	13	13
+60
+nover:: 0	0	5	8	9
+class1	0.706095	1
+class1	1.15826	1
+class1	0.770195	1
+class1	1.04073	1
+class1	1.21056	1
+nover:: 1	0	5	8	9
+class2	0.0657279	1
+class2	-0.258469	1
+class2	0.114074	1
+class2	0.268467	1
+class2	-0.431059	1
+class2	0.321275	1
+class2	0.717283	1
+class2	0.245956	1
+nover:: 2	0	5	8	9
+class3	-1.71869	1
+class3	-1.97183	1
+class3	-0.90312	1
+class3	-1.40901	1
+class3	-1.18906	1
+class3	-0.433313	1
+class3	-1.54608	1
+class3	-1.96157	1
+class3	-0.963367	1
+nover:: 2	0	5	8	9
+62
+5	62	24	0.0357165
+nover:: 11	19	13	13	13
+nover:: 24	19	13	13	13
+nover:: 33	19	13	13	13
+nover:: 51	19	13	13	13
+51
+nover:: 0	0	5	8	9
+class1	0.229011	1
+class1	0.76531	1
+class1	0.546891	1
+class1	0.154689	1
+class1	0.722221	1
+nover:: 5	0	5	8	9
+class2	0.706366	1
+class2	0.623828	1
+class2	0.266022	1
+class2	0.39706	1
+class2	0.610012	1
+class2	0.814849	1
+class2	0.850353	1
+class2	0.783187	1
+nover:: 8	0	5	8	9
+class3	-0.724201	1
+class3	-0.827979	1
+class3	-0.242729	1
+class3	-0.882203	1
+class3	-0.804758	1
+class3	-0.205184	1
+class3	-0.728094	1
+class3	-0.788535	1
+class3	-0.488056	1
+nover:: 8	0	5	8	9
+59
+4	59	27	0.10739
+nover:: 11	19	13	13	13
+nover:: 24	19	13	13	13
+nover:: 33	19	13	13	13
+nover:: 51	19	13	13	13
+51
+nover:: 0	0	5	8	9
+class1	4.32785	1
+class1	1.14742	1
+class1	1.72817	1
+class1	6.43861	1
+class1	1.23914	1
+nover:: 5	0	5	8	9
+class2	1.27494	1
+class2	1.48449	1
+class2	3.71383	1
+class2	2.44911	1
+class2	1.52422	1
+class2	1.0499	1
+class2	0.983616	1
+class2	1.11139	1
+nover:: 8	0	5	8	9
+class3	-1.23476	1
+class3	-1.02512	1
+class3	-4.07867	1
+class3	-0.92548	1
+class3	-1.06921	1
+class3	-4.83906	1
+class3	-1.22619	1
+class3	-1.10081	1
+class3	-1.96132	1
+nover:: 8	0	5	8	9
+59
+3	59	23	0.0407657
+nover:: 11	19	13	13	13
+nover:: 24	19	13	13	13
+nover:: 33	19	13	13	13
+nover:: 51	19	13	13	13
+51
+nover:: 0	0	5	8	9
+class1	0.613634	1
+class1	0.955197	1
+class1	0.833306	1
+class1	0.53753	1
+class1	0.931021	1
+nover:: 5	0	5	8	9
+class2	0.922225	1
+class2	0.876612	1
+class2	0.645743	1
+class2	0.741874	1
+class2	0.86893	1
+class2	0.983898	1
+class2	1.00552	1
+class2	0.965409	1
+nover:: 8	0	5	8	9
+class3	-0.932122	1
+class3	-0.991764	1
+class3	-0.625884	1
+class3	-1.02615	1
+class3	-0.97794	1
+class3	-0.591216	1
+class3	-0.934289	1
+class3	-0.968492	1
+class3	-0.798885	1
+nover:: 8	0	5	8	9
+59
+2	59	27	0.0328996
+nover:: 11	19	13	13	13
+nover:: 20	19	13	13	13
+nover:: 38	19	13	13	13
+nover:: 50	19	13	13	13
+50
+nover:: 0	0	5	8	9
+class1	-0.109762	1
+class1	-0.249901	1
+class1	-0.110839	1
+class1	-0.137517	1
+class1	-0.325673	1
+nover:: 3	0	5	8	9
+class2	0.563653	1
+class2	0.627892	1
+class2	0.0417872	1
+class2	0.0570999	1
+class2	0.713242	1
+class2	0.601193	1
+class2	0.30436	1
+class2	0.588291	1
+nover:: 3	0	5	8	9
+class3	-0.73603	1
+class3	-0.97192	1
+class3	-0.161313	1
+class3	-0.35494	1
+class3	-0.237363	1
+class3	-0.04684	1
+class3	-0.595781	1
+class3	-0.956706	1
+class3	-0.231225	1
+nover:: 8	0	5	8	9
+58
+1	58	32	0.0254937
+nover:: 2	19	13	13	13
+nover:: 17	19	13	13	13
+nover:: 19	19	13	13	13
+nover:: 36	19	13	13	13
+36
+nover:: 0	0	5	8	9
+class1	-0.243971	1
+class1	0.584782	1
+class1	0.387098	1
+class1	-0.730107	1
+class1	0.403453	1
+nover:: 10	0	5	8	9
+class2	1.50298	1
+class2	1.60573	1
+class2	0.424454	1
+class2	0.548155	1
+class2	1.74321	1
+class2	1.58366	1
+class2	1.31603	1
+class2	1.55359	1
+nover:: 11	0	5	8	9
+class3	0.0989428	1
+class3	0.0208427	1
+class3	0.412765	1
+class3	-0.752031	1
+class3	-0.681478	1
+class3	0.0200101	1
+class3	-0.0849994	1
+class3	0.144729	1
+class3	-0.056356	1
+nover:: 15	0	5	8	9
+51
+0	51	21	0.0240705
+51
+Time for l0-norm: 0.195372 s
+Projection time: 0.00291204 s
+Time to get best features on rank : 0.000322819 s
+Complete final combination/selection from all ranks: 8.2016e-05 s
+Time for SIS: 0.00364423 s
+nover:: 17	19	13	13	13
+nover:: 33	19	13	13	13
+nover:: 42	19	13	13	13
+nover:: 55	19	13	13	13
+55
+nover:: 0	0	5	8	9
+class1	-1.45936	0.127118
+class1	-0.529392	-0.330701
+class1	-0.829579	0.169887
+class1	-0.393491	-0.886475
+class1	0.076658	-0.783783
+nover:: 1	0	5	8	9
+class2	0.752121	1.49607
+class2	0.0310826	-0.0528104
+class2	-0.580615	-0.235009
+class2	0.342472	-0.37641
+class2	2.07258	1.42033
+class2	0.398532	0.676369
+class2	-0.0896023	-0.110884
+class2	1.64205	0.928093
+nover:: 3	0	5	8	9
+class3	1.56488	1.49598
+class3	0.0749726	1.22798
+class3	0.116577	0.788127
+class3	0.372673	-0.43423
+class3	0.175467	-0.36341
+class3	0.00762029	0.293871
+class3	1.38309	1.42856
+class3	1.91462	0.99978
+class3	-0.336972	1.22375
+nover:: 6	0	5	8	9
+61
+19	18	61	32	0.169197
+nover:: 13	19	13	13	13
+nover:: 26	19	13	13	13
+nover:: 41	19	13	13	13
+nover:: 52	19	13	13	13
+52
+nover:: 0	0	5	8	9
+class1	-1.45936	0.371089
+class1	-0.529392	-0.915483
+class1	-0.829579	-0.21721
+class1	-0.393491	-0.156369
+class1	0.076658	-1.18724
+nover:: 1	0	5	8	9
+class2	0.752121	-0.00690342
+class2	0.0310826	-1.65854
+class2	-0.580615	-0.659463
+class2	0.342472	-0.924565
+class2	2.07258	-0.322882
+class2	0.398532	-0.907294
+class2	-0.0896023	-1.42691
+class2	1.64205	-0.625502
+nover:: 2	0	5	8	9
+class3	1.56488	1.39703
+class3	0.0749726	1.20714
+class3	0.116577	0.375362
+class3	0.372673	0.317801
+class3	0.175467	0.318068
+class3	0.00762029	0.273861
+class3	1.38309	1.51356
+class3	1.91462	0.855051
+class3	-0.336972	1.28011
+nover:: 2	0	5	8	9
+54
+19	17	54	26	0.121863
+nover:: 7	19	13	13	13
+nover:: 20	19	13	13	13
+nover:: 25	19	13	13	13
+nover:: 31	19	13	13	13
+31
+nover:: 0	0	5	8	9
+class1	-1.45936	-0.157555
+class1	-0.529392	0.120666
+class1	-0.829579	-0.175276
+class1	-0.393491	0.801389
+class1	0.076658	1.26265
+nover:: 1	0	5	8	9
+class2	0.752121	-0.118147
+class2	0.0310826	0.841633
+class2	-0.580615	-0.643453
+class2	0.342472	1.25726
+class2	2.07258	1.02528
+class2	0.398532	1.29975
+class2	-0.0896023	1.10103
+class2	1.64205	0.655527
+nover:: 3	0	5	8	9
+class3	1.56488	-0.339535
+class3	0.0749726	-0.122455
+class3	0.116577	0.432834
+class3	0.372673	-1.76351
+class3	0.175467	-0.484982
+class3	0.00762029	-0.913971
+class3	1.38309	-0.521283
+class3	1.91462	-1.22289
+class3	-0.336972	-1.3604
+nover:: 4	0	5	8	9
+35
+19	16	35	25	0.0614904
+nover:: 12	19	13	13	13
+nover:: 14	19	13	13	13
+nover:: 21	19	13	13	13
+nover:: 24	19	13	13	13
+24
+nover:: 0	0	5	8	9
+class1	-1.45936	-0.75326
+class1	-0.529392	0.628871
+class1	-0.829579	-0.0593842
+class1	-0.393491	0.647242
+class1	0.076658	1.28722
+nover:: 0	0	5	8	9
+class2	0.752121	0.817849
+class2	0.0310826	-0.227386
+class2	-0.580615	-0.466541
+class2	0.342472	0.610938
+class2	2.07258	1.64152
+class2	0.398532	0.719807
+class2	-0.0896023	0.627681
+class2	1.64205	1.888
+nover:: 0	0	5	8	9
+class3	1.56488	-0.15381
+class3	0.0749726	-1.89686
+class3	0.116577	-0.786544
+class3	0.372673	-1.03634
+class3	0.175467	-1.01359
+class3	0.00762029	-0.425693
+class3	1.38309	-0.162984
+class3	1.91462	-0.0469491
+class3	-0.336972	-1.30034
+nover:: 0	0	5	8	9
+24
+19	15	24	15	0.0312373
+nover:: 16	19	13	13	13
+nover:: 29	19	13	13	13
+nover:: 42	19	13	13	13
+nover:: 48	19	13	13	13
+48
+nover:: 0	0	5	8	9
+class1	-1.45936	0.787986
+class1	-0.529392	1.35172
+class1	-0.829579	-0.382953
+class1	-0.393491	-0.567763
+class1	0.076658	0.974964
+nover:: 2	0	5	8	9
+class2	0.752121	0.605759
+class2	0.0310826	0.570846
+class2	-0.580615	1.0747
+class2	0.342472	0.432437
+class2	2.07258	0.964599
+class2	0.398532	0.0567745
+class2	-0.0896023	1.12342
+class2	1.64205	1.46228
+nover:: 3	0	5	8	9
+class3	1.56488	-0.0148797
+class3	0.0749726	-0.0463603
+class3	0.116577	-0.829335
+class3	0.372673	-1.61237
+class3	0.175467	-1.18048
+class3	0.00762029	-0.513542
+class3	1.38309	-0.405191
+class3	1.91462	-0.451419
+class3	-0.336972	-0.303203
+nover:: 3	0	5	8	9
+51
+19	14	51	24	0.0243312
+nover:: 11	19	13	13	13
+nover:: 23	19	13	13	13
+nover:: 29	19	13	13	13
+nover:: 35	19	13	13	13
+35
+nover:: 0	0	5	8	9
+class1	-1.45936	0.833213
+class1	-0.529392	0.827562
+class1	-0.829579	0.940082
+class1	-0.393491	0.154257
+class1	0.076658	0.426781
+nover:: 1	0	5	8	9
+class2	0.752121	1.5618
+class2	0.0310826	-0.311279
+class2	-0.580615	-0.120935
+class2	0.342472	-0.107943
+class2	2.07258	0.989268
+class2	0.398532	0.997644
+class2	-0.0896023	0.606399
+class2	1.64205	1.17405
+nover:: 5	0	5	8	9
+class3	1.56488	-0.222716
+class3	0.0749726	-0.743846
+class3	0.116577	-0.114994
+class3	0.372673	-1.84324
+class3	0.175467	-1.55247
+class3	0.00762029	-0.139442
+class3	1.38309	-0.117519
+class3	1.91462	-0.961794
+class3	-0.336972	0.260383
+nover:: 7	0	5	8	9
+42
+19	13	42	24	0.0205797
+nover:: 19	19	13	13	13
+nover:: 31	19	13	13	13
+nover:: 46	19	13	13	13
+nover:: 55	19	13	13	13
+55
+nover:: 0	0	5	8	9
+class1	-1.45936	1.07718
+class1	-0.529392	0.24278
+class1	-0.829579	0.552984
+class1	-0.393491	0.884364
+class1	0.076658	0.0233282
+nover:: 1	0	5	8	9
+class2	0.752121	0.0588245
+class2	0.0310826	-1.91701
+class2	-0.580615	-0.54539
+class2	0.342472	-0.656098
+class2	2.07258	-0.753942
+class2	0.398532	-0.586019
+class2	-0.0896023	-0.709631
+class2	1.64205	-0.379547
+nover:: 5	0	5	8	9
+class3	1.56488	-0.321659
+class3	0.0749726	-0.764689
+class3	0.116577	-0.527758
+class3	0.372673	-1.09121
+class3	0.175467	-0.870992
+class3	0.00762029	-0.159452
+class3	1.38309	-0.0325197
+class3	1.91462	-1.10652
+class3	-0.336972	0.316739
+nover:: 9	0	5	8	9
+64
+nover:: 14	19	13	13	13
+nover:: 19	19	13	13	13
+nover:: 29	19	13	13	13
+nover:: 37	19	13	13	13
+37
+nover:: 0	0	5	8	9
+class1	-1.45936	0.706095
+class1	-0.529392	1.15826
+class1	-0.829579	0.770195
+class1	-0.393491	1.04073
+class1	0.076658	1.21056
+nover:: 0	0	5	8	9
+class2	0.752121	0.0657279
+class2	0.0310826	0.258469
+class2	-0.580615	0.114074
+class2	0.342472	0.268467
+class2	2.07258	0.431059
+class2	0.398532	0.321275
+class2	-0.0896023	0.717283
+class2	1.64205	0.245956
+nover:: 1	0	5	8	9
+class3	1.56488	1.71869
+class3	0.0749726	1.97183
+class3	0.116577	0.90312
+class3	0.372673	1.40901
+class3	0.175467	1.18906
+class3	0.00762029	0.433313
+class3	1.38309	1.54608
+class3	1.91462	1.96157
+class3	-0.336972	0.963367
+nover:: 3	0	5	8	9
+40
+19	11	40	22	0.0189919
+nover:: 8	19	13	13	13
+nover:: 19	19	13	13	13
+nover:: 24	19	13	13	13
+nover:: 29	19	13	13	13
+29
+nover:: 0	0	5	8	9
+class1	-1.45936	0.621864
+class1	-0.529392	0.750707
+class1	-0.829579	0.82568
+class1	-0.393491	0.412541
+class1	0.076658	0.667941
+nover:: 0	0	5	8	9
+class2	0.752121	2.05161
+class2	0.0310826	2.53984
+class2	-0.580615	1.16788
+class2	0.342472	1.15009
+class2	2.07258	2.96576
+class2	0.398532	1.87985
+class2	-0.0896023	1.34901
+class2	1.64205	1.92287
+nover:: 8	0	5	8	9
+class3	1.56488	2.48139
+class3	0.0749726	2.70834
+class3	0.116577	1.93082
+class3	0.372673	1.38887
+class3	0.175467	1.2889
+class3	0.00762029	1.25441
+class3	1.38309	2.0762
+class3	1.91462	2.86667
+class3	-0.336972	1.57382
+nover:: 12	0	5	8	9
+41
+19	10	41	27	0.0188834
+nover:: 8	19	13	13	13
+nover:: 19	19	13	13	13
+nover:: 27	19	13	13	13
+nover:: 29	19	13	13	13
+29
+nover:: 0	0	5	8	9
+class1	-1.45936	-2.10512
+class1	-0.529392	-3.48748
+class1	-0.829579	-5.22061
+class1	-0.393491	-1.12941
+class1	0.076658	-2.47797
+nover:: 0	0	5	8	9
+class2	0.752121	1.39155
+class2	0.0310826	1.07285
+class2	-0.580615	6.4437
+class2	0.342472	7.15082
+class2	2.07258	0.919849
+class2	0.398532	1.5843
+class2	-0.0896023	3.34031
+class2	1.64205	1.52947
+nover:: 7	0	5	8	9
+class3	1.56488	1.10033
+class3	0.0749726	1.00368
+class3	0.116577	1.51989
+class3	0.372673	3.04424
+class3	0.175467	3.94025
+class3	0.00762029	4.41186
+class3	1.38309	1.36885
+class3	1.91462	0.949531
+class3	-0.336972	2.20505
+nover:: 10	0	5	8	9
+39
+19	9	39	32	0.0175953
+nover:: 7	19	13	13	13
+nover:: 18	19	13	13	13
+nover:: 23	19	13	13	13
+nover:: 25	19	13	13	13
+25
+nover:: 0	0	5	8	9
+class1	-1.45936	-0.780263
+class1	-0.529392	-0.659421
+class1	-0.829579	-0.576447
+class1	-0.393491	-0.960247
+class1	0.076658	-0.738983
+nover:: 0	0	5	8	9
+class2	0.752121	0.895709
+class2	0.0310826	0.976834
+class2	-0.580615	0.537388
+class2	0.342472	0.519057
+class2	2.07258	1.02824
+class2	0.398532	0.857803
+class2	-0.0896023	0.668967
+class2	1.64205	0.867933
+nover:: 7	0	5	8	9
+class3	1.56488	0.968632
+class3	0.0749726	0.998777
+class3	0.116577	0.869753
+class3	0.372673	0.689986
+class3	0.175467	0.633129
+class3	0.00762029	0.609714
+class3	1.38309	0.900632
+class3	1.91462	1.01741
+class3	-0.336972	0.768294
+nover:: 10	0	5	8	9
+35
+19	8	35	29	0.0173846
+nover:: 6	19	13	13	13
+nover:: 11	19	13	13	13
+nover:: 14	19	13	13	13
+nover:: 17	19	13	13	13
+17
+nover:: 0	0	5	8	9
+class1	-1.45936	-0.486412
+class1	-0.529392	-3.03942
+class1	-0.829579	-3.02089
+class1	-0.393491	-0.175412
+class1	0.076658	-1.99974
+nover:: 1	0	5	8	9
+class2	0.752121	1.09146
+class2	0.0310826	0.722703
+class2	-0.580615	1.73506
+class2	0.342472	2.91975
+class2	2.07258	0.60349
+class2	0.398532	1.509
+class2	-0.0896023	3.39595
+class2	1.64205	1.37618
+nover:: 1	0	5	8	9
+class3	1.56488	-0.89113
+class3	0.0749726	-0.979081
+class3	0.116577	-0.372643
+class3	0.372673	-3.28936
+class3	0.175467	-3.68519
+class3	0.00762029	-0.911718
+class3	1.38309	-1.11635
+class3	1.91462	-0.862576
+class3	-0.336972	-1.12427
+nover:: 2	0	5	8	9
+19
+19	7	19	13	0.0292701
+nover:: 6	19	13	13	13
+nover:: 10	19	13	13	13
+nover:: 16	19	13	13	13
+nover:: 18	19	13	13	13
+18
+nover:: 0	0	5	8	9
+class1	-1.45936	-2.05587
+class1	-0.529392	-0.329011
+class1	-0.829579	-0.331029
+class1	-0.393491	-5.70087
+class1	0.076658	-0.500064
+nover:: 2	0	5	8	9
+class2	0.752121	0.916201
+class2	0.0310826	1.38369
+class2	-0.580615	0.57635
+class2	0.342472	0.342495
+class2	2.07258	1.65703
+class2	0.398532	0.662693
+class2	-0.0896023	0.294469
+class2	1.64205	0.726648
+nover:: 2	0	5	8	9
+class3	1.56488	-1.12217
+class3	0.0749726	-1.02137
+class3	0.116577	-2.68353
+class3	0.372673	-0.304011
+class3	0.175467	-0.271356
+class3	0.00762029	-1.09683
+class3	1.38309	-0.895775
+class3	1.91462	-1.15932
+class3	-0.336972	-0.889468
+nover:: 3	0	5	8	9
+21
+19	6	21	13	0.0137628
+nover:: 12	19	13	13	13
+nover:: 14	19	13	13	13
+nover:: 21	19	13	13	13
+nover:: 24	19	13	13	13
+24
+nover:: 0	0	5	8	9
+class1	-1.45936	0.706095
+class1	-0.529392	1.15826
+class1	-0.829579	0.770195
+class1	-0.393491	1.04073
+class1	0.076658	1.21056
+nover:: 0	0	5	8	9
+class2	0.752121	0.0657279
+class2	0.0310826	-0.258469
+class2	-0.580615	0.114074
+class2	0.342472	0.268467
+class2	2.07258	-0.431059
+class2	0.398532	0.321275
+class2	-0.0896023	0.717283
+class2	1.64205	0.245956
+nover:: 0	0	5	8	9
+class3	1.56488	-1.71869
+class3	0.0749726	-1.97183
+class3	0.116577	-0.90312
+class3	0.372673	-1.40901
+class3	0.175467	-1.18906
+class3	0.00762029	-0.433313
+class3	1.38309	-1.54608
+class3	1.91462	-1.96157
+class3	-0.336972	-0.963367
+nover:: 0	0	5	8	9
+24
+19	5	24	15	0.0198173
+nover:: 3	19	13	13	13
+nover:: 10	19	13	13	13
+nover:: 12	19	13	13	13
+nover:: 14	19	13	13	13
+14
+nover:: 0	0	5	8	9
+class1	-1.45936	0.229011
+class1	-0.529392	0.76531
+class1	-0.829579	0.546891
+class1	-0.393491	0.154689
+class1	0.076658	0.722221
+nover:: 1	0	5	8	9
+class2	0.752121	0.706366
+class2	0.0310826	0.623828
+class2	-0.580615	0.266022
+class2	0.342472	0.39706
+class2	2.07258	0.610012
+class2	0.398532	0.814849
+class2	-0.0896023	0.850353
+class2	1.64205	0.783187
+nover:: 2	0	5	8	9
+class3	1.56488	-0.724201
+class3	0.0749726	-0.827979
+class3	0.116577	-0.242729
+class3	0.372673	-0.882203
+class3	0.175467	-0.804758
+class3	0.00762029	-0.205184
+class3	1.38309	-0.728094
+class3	1.91462	-0.788535
+class3	-0.336972	-0.488056
+nover:: 2	0	5	8	9
+16
+19	4	16	11	0.0437635
+nover:: 4	19	13	13	13
+nover:: 11	19	13	13	13
+nover:: 13	19	13	13	13
+nover:: 15	19	13	13	13
+15
+nover:: 0	0	5	8	9
+class1	-1.45936	4.32785
+class1	-0.529392	1.14742
+class1	-0.829579	1.72817
+class1	-0.393491	6.43861
+class1	0.076658	1.23914
+nover:: 2	0	5	8	9
+class2	0.752121	1.27494
+class2	0.0310826	1.48449
+class2	-0.580615	3.71383
+class2	0.342472	2.44911
+class2	2.07258	1.52422
+class2	0.398532	1.0499
+class2	-0.0896023	0.983616
+class2	1.64205	1.11139
+nover:: 3	0	5	8	9
+class3	1.56488	-1.23476
+class3	0.0749726	-1.02512
+class3	0.116577	-4.07867
+class3	0.372673	-0.92548
+class3	0.175467	-1.06921
+class3	0.00762029	-4.83906
+class3	1.38309	-1.22619
+class3	1.91462	-1.10081
+class3	-0.336972	-1.96132
+nover:: 3	0	5	8	9
+18
+19	3	18	11	0.0189515
+nover:: 4	19	13	13	13
+nover:: 11	19	13	13	13
+nover:: 13	19	13	13	13
+nover:: 15	19	13	13	13
+15
+nover:: 0	0	5	8	9
+class1	-1.45936	0.613634
+class1	-0.529392	0.955197
+class1	-0.829579	0.833306
+class1	-0.393491	0.53753
+class1	0.076658	0.931021
+nover:: 1	0	5	8	9
+class2	0.752121	0.922225
+class2	0.0310826	0.876612
+class2	-0.580615	0.645743
+class2	0.342472	0.741874
+class2	2.07258	0.86893
+class2	0.398532	0.983898
+class2	-0.0896023	1.00552
+class2	1.64205	0.965409
+nover:: 2	0	5	8	9
+class3	1.56488	-0.932122
+class3	0.0749726	-0.991764
+class3	0.116577	-0.625884
+class3	0.372673	-1.02615
+class3	0.175467	-0.97794
+class3	0.00762029	-0.591216
+class3	1.38309	-0.934289
+class3	1.91462	-0.968492
+class3	-0.336972	-0.798885
+nover:: 2	0	5	8	9
+17
+19	2	17	12	0.0162478
+nover:: 6	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 11	19	13	13	13
+nover:: 16	19	13	13	13
+16
+nover:: 0	0	5	8	9
+class1	-1.45936	-0.109762
+class1	-0.529392	-0.249901
+class1	-0.829579	-0.110839
+class1	-0.393491	-0.137517
+class1	0.076658	-0.325673
+nover:: 1	0	5	8	9
+class2	0.752121	0.563653
+class2	0.0310826	0.627892
+class2	-0.580615	0.0417872
+class2	0.342472	0.0570999
+class2	2.07258	0.713242
+class2	0.398532	0.601193
+class2	-0.0896023	0.30436
+class2	1.64205	0.588291
+nover:: 1	0	5	8	9
+class3	1.56488	-0.73603
+class3	0.0749726	-0.97192
+class3	0.116577	-0.161313
+class3	0.372673	-0.35494
+class3	0.175467	-0.237363
+class3	0.00762029	-0.04684
+class3	1.38309	-0.595781
+class3	1.91462	-0.956706
+class3	-0.336972	-0.231225
+nover:: 2	0	5	8	9
+18
+19	1	18	15	0.0159368
+nover:: 1	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 9	19	13	13	13
+nover:: 12	19	13	13	13
+12
+nover:: 0	0	5	8	9
+class1	-1.45936	-0.243971
+class1	-0.529392	0.584782
+class1	-0.829579	0.387098
+class1	-0.393491	-0.730107
+class1	0.076658	0.403453
+nover:: 2	0	5	8	9
+class2	0.752121	1.50298
+class2	0.0310826	1.60573
+class2	-0.580615	0.424454
+class2	0.342472	0.548155
+class2	2.07258	1.74321
+class2	0.398532	1.58366
+class2	-0.0896023	1.31603
+class2	1.64205	1.55359
+nover:: 2	0	5	8	9
+class3	1.56488	0.0989428
+class3	0.0749726	0.0208427
+class3	0.116577	0.412765
+class3	0.372673	-0.752031
+class3	0.175467	-0.681478
+class3	0.00762029	0.0200101
+class3	1.38309	-0.0849994
+class3	1.91462	0.144729
+class3	-0.336972	-0.056356
+nover:: 2	0	5	8	9
+14
+19	0	14	12	0.0215951
+nover:: 0	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 12	19	13	13	13
+12
+nover:: 0	0	5	8	9
+class1	0.127118	0.371089
+class1	-0.330701	-0.915483
+class1	0.169887	-0.21721
+class1	-0.886475	-0.156369
+class1	-0.783783	-1.18724
+nover:: 2	0	5	8	9
+class2	1.49607	-0.00690342
+class2	-0.0528104	-1.65854
+class2	-0.235009	-0.659463
+class2	-0.37641	-0.924565
+class2	1.42033	-0.322882
+class2	0.676369	-0.907294
+class2	-0.110884	-1.42691
+class2	0.928093	-0.625502
+nover:: 3	0	5	8	9
+class3	1.49598	1.39703
+class3	1.22798	1.20714
+class3	0.788127	0.375362
+class3	-0.43423	0.317801
+class3	-0.36341	0.318068
+class3	0.293871	0.273861
+class3	1.42856	1.51356
+class3	0.99978	0.855051
+class3	1.22375	1.28011
+nover:: 4	0	5	8	9
+16
+18	17	16	15	0.00964901
+nover:: 8	19	13	13	13
+nover:: 15	19	13	13	13
+nover:: 20	19	13	13	13
+nover:: 27	19	13	13	13
+27
+nover:: 0	0	5	8	9
+class1	0.127118	-0.157555
+class1	-0.330701	0.120666
+class1	0.169887	-0.175276
+class1	-0.886475	0.801389
+class1	-0.783783	1.26265
+nover:: 0	0	5	8	9
+class2	1.49607	-0.118147
+class2	-0.0528104	0.841633
+class2	-0.235009	-0.643453
+class2	-0.37641	1.25726
+class2	1.42033	1.02528
+class2	0.676369	1.29975
+class2	-0.110884	1.10103
+class2	0.928093	0.655527
+nover:: 4	0	5	8	9
+class3	1.49598	-0.339535
+class3	1.22798	-0.122455
+class3	0.788127	0.432834
+class3	-0.43423	-1.76351
+class3	-0.36341	-0.484982
+class3	0.293871	-0.913971
+class3	1.42856	-0.521283
+class3	0.99978	-1.22289
+class3	1.22375	-1.3604
+nover:: 7	0	5	8	9
+34
+nover:: 11	19	13	13	13
+nover:: 17	19	13	13	13
+nover:: 26	19	13	13	13
+nover:: 38	19	13	13	13
+38
+nover:: 0	0	5	8	9
+class1	0.127118	-0.75326
+class1	-0.330701	0.628871
+class1	0.169887	-0.0593842
+class1	-0.886475	0.647242
+class1	-0.783783	1.28722
+nover:: 2	0	5	8	9
+class2	1.49607	0.817849
+class2	-0.0528104	-0.227386
+class2	-0.235009	-0.466541
+class2	-0.37641	0.610938
+class2	1.42033	1.64152
+class2	0.676369	0.719807
+class2	-0.110884	0.627681
+class2	0.928093	1.888
+nover:: 4	0	5	8	9
+class3	1.49598	-0.15381
+class3	1.22798	-1.89686
+class3	0.788127	-0.786544
+class3	-0.43423	-1.03634
+class3	-0.36341	-1.01359
+class3	0.293871	-0.425693
+class3	1.42856	-0.162984
+class3	0.99978	-0.0469491
+class3	1.22375	-1.30034
+nover:: 5	0	5	8	9
+43
+nover:: 10	19	13	13	13
+nover:: 19	19	13	13	13
+nover:: 28	19	13	13	13
+nover:: 39	19	13	13	13
+39
+nover:: 0	0	5	8	9
+class1	0.127118	0.787986
+class1	-0.330701	1.35172
+class1	0.169887	-0.382953
+class1	-0.886475	-0.567763
+class1	-0.783783	0.974964
+nover:: 3	0	5	8	9
+class2	1.49607	0.605759
+class2	-0.0528104	0.570846
+class2	-0.235009	1.0747
+class2	-0.37641	0.432437
+class2	1.42033	0.964599
+class2	0.676369	0.0567745
+class2	-0.110884	1.12342
+class2	0.928093	1.46228
+nover:: 4	0	5	8	9
+class3	1.49598	-0.0148797
+class3	1.22798	-0.0463603
+class3	0.788127	-0.829335
+class3	-0.43423	-1.61237
+class3	-0.36341	-1.18048
+class3	0.293871	-0.513542
+class3	1.42856	-0.405191
+class3	0.99978	-0.451419
+class3	1.22375	-0.303203
+nover:: 4	0	5	8	9
+43
+nover:: 5	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 11	19	13	13	13
+nover:: 14	19	13	13	13
+14
+nover:: 0	0	5	8	9
+class1	0.127118	0.833213
+class1	-0.330701	0.827562
+class1	0.169887	0.940082
+class1	-0.886475	0.154257
+class1	-0.783783	0.426781
+nover:: 0	0	5	8	9
+class2	1.49607	1.5618
+class2	-0.0528104	-0.311279
+class2	-0.235009	-0.120935
+class2	-0.37641	-0.107943
+class2	1.42033	0.989268
+class2	0.676369	0.997644
+class2	-0.110884	0.606399
+class2	0.928093	1.17405
+nover:: 0	0	5	8	9
+class3	1.49598	-0.222716
+class3	1.22798	-0.743846
+class3	0.788127	-0.114994
+class3	-0.43423	-1.84324
+class3	-0.36341	-1.55247
+class3	0.293871	-0.139442
+class3	1.42856	-0.117519
+class3	0.99978	-0.961794
+class3	1.22375	0.260383
+nover:: 0	0	5	8	9
+14
+18	13	14	11	0.0112293
+nover:: 10	19	13	13	13
+nover:: 21	19	13	13	13
+nover:: 27	19	13	13	13
+nover:: 31	19	13	13	13
+31
+nover:: 0	0	5	8	9
+class1	0.127118	1.07718
+class1	-0.330701	0.24278
+class1	0.169887	0.552984
+class1	-0.886475	0.884364
+class1	-0.783783	0.0233282
+nover:: 0	0	5	8	9
+class2	1.49607	0.0588245
+class2	-0.0528104	-1.91701
+class2	-0.235009	-0.54539
+class2	-0.37641	-0.656098
+class2	1.42033	-0.753942
+class2	0.676369	-0.586019
+class2	-0.110884	-0.709631
+class2	0.928093	-0.379547
+nover:: 3	0	5	8	9
+class3	1.49598	-0.321659
+class3	1.22798	-0.764689
+class3	0.788127	-0.527758
+class3	-0.43423	-1.09121
+class3	-0.36341	-0.870992
+class3	0.293871	-0.159452
+class3	1.42856	-0.0325197
+class3	0.99978	-1.10652
+class3	1.22375	0.316739
+nover:: 6	0	5	8	9
+37
+nover:: 8	19	13	13	13
+nover:: 11	19	13	13	13
+nover:: 15	19	13	13	13
+nover:: 32	19	13	13	13
+32
+nover:: 0	0	5	8	9
+class1	0.127118	0.706095
+class1	-0.330701	1.15826
+class1	0.169887	0.770195
+class1	-0.886475	1.04073
+class1	-0.783783	1.21056
+nover:: 0	0	5	8	9
+class2	1.49607	0.0657279
+class2	-0.0528104	0.258469
+class2	-0.235009	0.114074
+class2	-0.37641	0.268467
+class2	1.42033	0.431059
+class2	0.676369	0.321275
+class2	-0.110884	0.717283
+class2	0.928093	0.245956
+nover:: 1	0	5	8	9
+class3	1.49598	1.71869
+class3	1.22798	1.97183
+class3	0.788127	0.90312
+class3	-0.43423	1.40901
+class3	-0.36341	1.18906
+class3	0.293871	0.433313
+class3	1.42856	1.54608
+class3	0.99978	1.96157
+class3	1.22375	0.963367
+nover:: 4	0	5	8	9
+36
+nover:: 10	19	13	13	13
+nover:: 20	19	13	13	13
+nover:: 27	19	13	13	13
+nover:: 31	19	13	13	13
+31
+nover:: 0	0	5	8	9
+class1	0.127118	0.621864
+class1	-0.330701	0.750707
+class1	0.169887	0.82568
+class1	-0.886475	0.412541
+class1	-0.783783	0.667941
+nover:: 0	0	5	8	9
+class2	1.49607	2.05161
+class2	-0.0528104	2.53984
+class2	-0.235009	1.16788
+class2	-0.37641	1.15009
+class2	1.42033	2.96576
+class2	0.676369	1.87985
+class2	-0.110884	1.34901
+class2	0.928093	1.92287
+nover:: 3	0	5	8	9
+class3	1.49598	2.48139
+class3	1.22798	2.70834
+class3	0.788127	1.93082
+class3	-0.43423	1.38887
+class3	-0.36341	1.2889
+class3	0.293871	1.25441
+class3	1.42856	2.0762
+class3	0.99978	2.86667
+class3	1.22375	1.57382
+nover:: 6	0	5	8	9
+37
+nover:: 8	19	13	13	13
+nover:: 19	19	13	13	13
+nover:: 24	19	13	13	13
+nover:: 29	19	13	13	13
+29
+nover:: 0	0	5	8	9
+class1	0.127118	-2.10512
+class1	-0.330701	-3.48748
+class1	0.169887	-5.22061
+class1	-0.886475	-1.12941
+class1	-0.783783	-2.47797
+nover:: 0	0	5	8	9
+class2	1.49607	1.39155
+class2	-0.0528104	1.07285
+class2	-0.235009	6.4437
+class2	-0.37641	7.15082
+class2	1.42033	0.919849
+class2	0.676369	1.5843
+class2	-0.110884	3.34031
+class2	0.928093	1.52947
+nover:: 5	0	5	8	9
+class3	1.49598	1.10033
+class3	1.22798	1.00368
+class3	0.788127	1.51989
+class3	-0.43423	3.04424
+class3	-0.36341	3.94025
+class3	0.293871	4.41186
+class3	1.42856	1.36885
+class3	0.99978	0.949531
+class3	1.22375	2.20505
+nover:: 8	0	5	8	9
+37
+nover:: 10	19	13	13	13
+nover:: 21	19	13	13	13
+nover:: 26	19	13	13	13
+nover:: 30	19	13	13	13
+30
+nover:: 0	0	5	8	9
+class1	0.127118	-0.780263
+class1	-0.330701	-0.659421
+class1	0.169887	-0.576447
+class1	-0.886475	-0.960247
+class1	-0.783783	-0.738983
+nover:: 0	0	5	8	9
+class2	1.49607	0.895709
+class2	-0.0528104	0.976834
+class2	-0.235009	0.537388
+class2	-0.37641	0.519057
+class2	1.42033	1.02824
+class2	0.676369	0.857803
+class2	-0.110884	0.668967
+class2	0.928093	0.867933
+nover:: 3	0	5	8	9
+class3	1.49598	0.968632
+class3	1.22798	0.998777
+class3	0.788127	0.869753
+class3	-0.43423	0.689986
+class3	-0.36341	0.633129
+class3	0.293871	0.609714
+class3	1.42856	0.900632
+class3	0.99978	1.01741
+class3	1.22375	0.768294
+nover:: 6	0	5	8	9
+36
+nover:: 1	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 8	19	13	13	13
+8
+nover:: 0	0	5	8	9
+class1	0.127118	-0.486412
+class1	-0.330701	-3.03942
+class1	0.169887	-3.02089
+class1	-0.886475	-0.175412
+class1	-0.783783	-1.99974
+nover:: 0	0	5	8	9
+class2	1.49607	1.09146
+class2	-0.0528104	0.722703
+class2	-0.235009	1.73506
+class2	-0.37641	2.91975
+class2	1.42033	0.60349
+class2	0.676369	1.509
+class2	-0.110884	3.39595
+class2	0.928093	1.37618
+nover:: 0	0	5	8	9
+class3	1.49598	-0.89113
+class3	1.22798	-0.979081
+class3	0.788127	-0.372643
+class3	-0.43423	-3.28936
+class3	-0.36341	-3.68519
+class3	0.293871	-0.911718
+class3	1.42856	-1.11635
+class3	0.99978	-0.862576
+class3	1.22375	-1.12427
+nover:: 1	0	5	8	9
+9
+18	7	9	11	0.00758703
+nover:: 3	19	13	13	13
+nover:: 5	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 13	19	13	13	13
+13
+nover:: 0	0	5	8	9
+class1	0.127118	-2.05587
+class1	-0.330701	-0.329011
+class1	0.169887	-0.331029
+class1	-0.886475	-5.70087
+class1	-0.783783	-0.500064
+nover:: 0	0	5	8	9
+class2	1.49607	0.916201
+class2	-0.0528104	1.38369
+class2	-0.235009	0.57635
+class2	-0.37641	0.342495
+class2	1.42033	1.65703
+class2	0.676369	0.662693
+class2	-0.110884	0.294469
+class2	0.928093	0.726648
+nover:: 0	0	5	8	9
+class3	1.49598	-1.12217
+class3	1.22798	-1.02137
+class3	0.788127	-2.68353
+class3	-0.43423	-0.304011
+class3	-0.36341	-0.271356
+class3	0.293871	-1.09683
+class3	1.42856	-0.895775
+class3	0.99978	-1.15932
+class3	1.22375	-0.889468
+nover:: 1	0	5	8	9
+14
+18	6	14	11	0.0065771
+nover:: 5	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 11	19	13	13	13
+nover:: 14	19	13	13	13
+14
+nover:: 0	0	5	8	9
+class1	0.127118	0.706095
+class1	-0.330701	1.15826
+class1	0.169887	0.770195
+class1	-0.886475	1.04073
+class1	-0.783783	1.21056
+nover:: 0	0	5	8	9
+class2	1.49607	0.0657279
+class2	-0.0528104	-0.258469
+class2	-0.235009	0.114074
+class2	-0.37641	0.268467
+class2	1.42033	-0.431059
+class2	0.676369	0.321275
+class2	-0.110884	0.717283
+class2	0.928093	0.245956
+nover:: 0	0	5	8	9
+class3	1.49598	-1.71869
+class3	1.22798	-1.97183
+class3	0.788127	-0.90312
+class3	-0.43423	-1.40901
+class3	-0.36341	-1.18906
+class3	0.293871	-0.433313
+class3	1.42856	-1.54608
+class3	0.99978	-1.96157
+class3	1.22375	-0.963367
+nover:: 0	0	5	8	9
+14
+18	5	14	11	0.00660076
+nover:: 4	19	13	13	13
+nover:: 6	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 11	19	13	13	13
+11
+nover:: 0	0	5	8	9
+class1	0.127118	0.229011
+class1	-0.330701	0.76531
+class1	0.169887	0.546891
+class1	-0.886475	0.154689
+class1	-0.783783	0.722221
+nover:: 3	0	5	8	9
+class2	1.49607	0.706366
+class2	-0.0528104	0.623828
+class2	-0.235009	0.266022
+class2	-0.37641	0.39706
+class2	1.42033	0.610012
+class2	0.676369	0.814849
+class2	-0.110884	0.850353
+class2	0.928093	0.783187
+nover:: 4	0	5	8	9
+class3	1.49598	-0.724201
+class3	1.22798	-0.827979
+class3	0.788127	-0.242729
+class3	-0.43423	-0.882203
+class3	-0.36341	-0.804758
+class3	0.293871	-0.205184
+class3	1.42856	-0.728094
+class3	0.99978	-0.788535
+class3	1.22375	-0.488056
+nover:: 4	0	5	8	9
+15
+nover:: 5	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 11	19	13	13	13
+11
+nover:: 0	0	5	8	9
+class1	0.127118	4.32785
+class1	-0.330701	1.14742
+class1	0.169887	1.72817
+class1	-0.886475	6.43861
+class1	-0.783783	1.23914
+nover:: 3	0	5	8	9
+class2	1.49607	1.27494
+class2	-0.0528104	1.48449
+class2	-0.235009	3.71383
+class2	-0.37641	2.44911
+class2	1.42033	1.52422
+class2	0.676369	1.0499
+class2	-0.110884	0.983616
+class2	0.928093	1.11139
+nover:: 4	0	5	8	9
+class3	1.49598	-1.23476
+class3	1.22798	-1.02512
+class3	0.788127	-4.07867
+class3	-0.43423	-0.92548
+class3	-0.36341	-1.06921
+class3	0.293871	-4.83906
+class3	1.42856	-1.22619
+class3	0.99978	-1.10081
+class3	1.22375	-1.96132
+nover:: 4	0	5	8	9
+15
+nover:: 4	19	13	13	13
+nover:: 6	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 11	19	13	13	13
+11
+nover:: 0	0	5	8	9
+class1	0.127118	0.613634
+class1	-0.330701	0.955197
+class1	0.169887	0.833306
+class1	-0.886475	0.53753
+class1	-0.783783	0.931021
+nover:: 3	0	5	8	9
+class2	1.49607	0.922225
+class2	-0.0528104	0.876612
+class2	-0.235009	0.645743
+class2	-0.37641	0.741874
+class2	1.42033	0.86893
+class2	0.676369	0.983898
+class2	-0.110884	1.00552
+class2	0.928093	0.965409
+nover:: 4	0	5	8	9
+class3	1.49598	-0.932122
+class3	1.22798	-0.991764
+class3	0.788127	-0.625884
+class3	-0.43423	-1.02615
+class3	-0.36341	-0.97794
+class3	0.293871	-0.591216
+class3	1.42856	-0.934289
+class3	0.99978	-0.968492
+class3	1.22375	-0.798885
+nover:: 4	0	5	8	9
+15
+nover:: 2	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 6	19	13	13	13
+nover:: 14	19	13	13	13
+14
+nover:: 0	0	5	8	9
+class1	0.127118	-0.109762
+class1	-0.330701	-0.249901
+class1	0.169887	-0.110839
+class1	-0.886475	-0.137517
+class1	-0.783783	-0.325673
+nover:: 1	0	5	8	9
+class2	1.49607	0.563653
+class2	-0.0528104	0.627892
+class2	-0.235009	0.0417872
+class2	-0.37641	0.0570999
+class2	1.42033	0.713242
+class2	0.676369	0.601193
+class2	-0.110884	0.30436
+class2	0.928093	0.588291
+nover:: 1	0	5	8	9
+class3	1.49598	-0.73603
+class3	1.22798	-0.97192
+class3	0.788127	-0.161313
+class3	-0.43423	-0.35494
+class3	-0.36341	-0.237363
+class3	0.293871	-0.04684
+class3	1.42856	-0.595781
+class3	0.99978	-0.956706
+class3	1.22375	-0.231225
+nover:: 4	0	5	8	9
+18
+nover:: 0	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 12	19	13	13	13
+12
+nover:: 0	0	5	8	9
+class1	0.127118	-0.243971
+class1	-0.330701	0.584782
+class1	0.169887	0.387098
+class1	-0.886475	-0.730107
+class1	-0.783783	0.403453
+nover:: 2	0	5	8	9
+class2	1.49607	1.50298
+class2	-0.0528104	1.60573
+class2	-0.235009	0.424454
+class2	-0.37641	0.548155
+class2	1.42033	1.74321
+class2	0.676369	1.58366
+class2	-0.110884	1.31603
+class2	0.928093	1.55359
+nover:: 3	0	5	8	9
+class3	1.49598	0.0989428
+class3	1.22798	0.0208427
+class3	0.788127	0.412765
+class3	-0.43423	-0.752031
+class3	-0.36341	-0.681478
+class3	0.293871	0.0200101
+class3	1.42856	-0.0849994
+class3	0.99978	0.144729
+class3	1.22375	-0.056356
+nover:: 4	0	5	8	9
+16
+nover:: 9	19	13	13	13
+nover:: 21	19	13	13	13
+nover:: 32	19	13	13	13
+nover:: 46	19	13	13	13
+46
+nover:: 0	0	5	8	9
+class1	0.371089	-0.157555
+class1	-0.915483	0.120666
+class1	-0.21721	-0.175276
+class1	-0.156369	0.801389
+class1	-1.18724	1.26265
+nover:: 2	0	5	8	9
+class2	-0.00690342	-0.118147
+class2	-1.65854	0.841633
+class2	-0.659463	-0.643453
+class2	-0.924565	1.25726
+class2	-0.322882	1.02528
+class2	-0.907294	1.29975
+class2	-1.42691	1.10103
+class2	-0.625502	0.655527
+nover:: 4	0	5	8	9
+class3	1.39703	-0.339535
+class3	1.20714	-0.122455
+class3	0.375362	0.432834
+class3	0.317801	-1.76351
+class3	0.318068	-0.484982
+class3	0.273861	-0.913971
+class3	1.51356	-0.521283
+class3	0.855051	-1.22289
+class3	1.28011	-1.3604
+nover:: 5	0	5	8	9
+51
+nover:: 15	19	13	13	13
+nover:: 26	19	13	13	13
+nover:: 37	19	13	13	13
+nover:: 48	19	13	13	13
+48
+nover:: 0	0	5	8	9
+class1	0.371089	-0.75326
+class1	-0.915483	0.628871
+class1	-0.21721	-0.0593842
+class1	-0.156369	0.647242
+class1	-1.18724	1.28722
+nover:: 1	0	5	8	9
+class2	-0.00690342	0.817849
+class2	-1.65854	-0.227386
+class2	-0.659463	-0.466541
+class2	-0.924565	0.610938
+class2	-0.322882	1.64152
+class2	-0.907294	0.719807
+class2	-1.42691	0.627681
+class2	-0.625502	1.888
+nover:: 3	0	5	8	9
+class3	1.39703	-0.15381
+class3	1.20714	-1.89686
+class3	0.375362	-0.786544
+class3	0.317801	-1.03634
+class3	0.318068	-1.01359
+class3	0.273861	-0.425693
+class3	1.51356	-0.162984
+class3	0.855051	-0.0469491
+class3	1.28011	-1.30034
+nover:: 4	0	5	8	9
+52
+nover:: 15	19	13	13	13
+nover:: 26	19	13	13	13
+nover:: 44	19	13	13	13
+nover:: 54	19	13	13	13
+54
+nover:: 0	0	5	8	9
+class1	0.371089	0.787986
+class1	-0.915483	1.35172
+class1	-0.21721	-0.382953
+class1	-0.156369	-0.567763
+class1	-1.18724	0.974964
+nover:: 3	0	5	8	9
+class2	-0.00690342	0.605759
+class2	-1.65854	0.570846
+class2	-0.659463	1.0747
+class2	-0.924565	0.432437
+class2	-0.322882	0.964599
+class2	-0.907294	0.0567745
+class2	-1.42691	1.12342
+class2	-0.625502	1.46228
+nover:: 4	0	5	8	9
+class3	1.39703	-0.0148797
+class3	1.20714	-0.0463603
+class3	0.375362	-0.829335
+class3	0.317801	-1.61237
+class3	0.318068	-1.18048
+class3	0.273861	-0.513542
+class3	1.51356	-0.405191
+class3	0.855051	-0.451419
+class3	1.28011	-0.303203
+nover:: 4	0	5	8	9
+58
+nover:: 7	19	13	13	13
+nover:: 15	19	13	13	13
+nover:: 20	19	13	13	13
+nover:: 30	19	13	13	13
+30
+nover:: 0	0	5	8	9
+class1	0.371089	0.833213
+class1	-0.915483	0.827562
+class1	-0.21721	0.940082
+class1	-0.156369	0.154257
+class1	-1.18724	0.426781
+nover:: 0	0	5	8	9
+class2	-0.00690342	1.5618
+class2	-1.65854	-0.311279
+class2	-0.659463	-0.120935
+class2	-0.924565	-0.107943
+class2	-0.322882	0.989268
+class2	-0.907294	0.997644
+class2	-1.42691	0.606399
+class2	-0.625502	1.17405
+nover:: 2	0	5	8	9
+class3	1.39703	-0.222716
+class3	1.20714	-0.743846
+class3	0.375362	-0.114994
+class3	0.317801	-1.84324
+class3	0.318068	-1.55247
+class3	0.273861	-0.139442
+class3	1.51356	-0.117519
+class3	0.855051	-0.961794
+class3	1.28011	0.260383
+nover:: 2	0	5	8	9
+32
+nover:: 6	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 17	19	13	13	13
+nover:: 21	19	13	13	13
+21
+nover:: 0	0	5	8	9
+class1	0.371089	1.07718
+class1	-0.915483	0.24278
+class1	-0.21721	0.552984
+class1	-0.156369	0.884364
+class1	-1.18724	0.0233282
+nover:: 0	0	5	8	9
+class2	-0.00690342	0.0588245
+class2	-1.65854	-1.91701
+class2	-0.659463	-0.54539
+class2	-0.924565	-0.656098
+class2	-0.322882	-0.753942
+class2	-0.907294	-0.586019
+class2	-1.42691	-0.709631
+class2	-0.625502	-0.379547
+nover:: 0	0	5	8	9
+class3	1.39703	-0.321659
+class3	1.20714	-0.764689
+class3	0.375362	-0.527758
+class3	0.317801	-1.09121
+class3	0.318068	-0.870992
+class3	0.273861	-0.159452
+class3	1.51356	-0.0325197
+class3	0.855051	-1.10652
+class3	1.28011	0.316739
+nover:: 0	0	5	8	9
+21
+nover:: 7	19	13	13	13
+nover:: 13	19	13	13	13
+nover:: 25	19	13	13	13
+nover:: 37	19	13	13	13
+37
+nover:: 0	0	5	8	9
+class1	0.371089	0.706095
+class1	-0.915483	1.15826
+class1	-0.21721	0.770195
+class1	-0.156369	1.04073
+class1	-1.18724	1.21056
+nover:: 0	0	5	8	9
+class2	-0.00690342	0.0657279
+class2	-1.65854	0.258469
+class2	-0.659463	0.114074
+class2	-0.924565	0.268467
+class2	-0.322882	0.431059
+class2	-0.907294	0.321275
+class2	-1.42691	0.717283
+class2	-0.625502	0.245956
+nover:: 0	0	5	8	9
+class3	1.39703	1.71869
+class3	1.20714	1.97183
+class3	0.375362	0.90312
+class3	0.317801	1.40901
+class3	0.318068	1.18906
+class3	0.273861	0.433313
+class3	1.51356	1.54608
+class3	0.855051	1.96157
+class3	1.28011	0.963367
+nover:: 1	0	5	8	9
+38
+nover:: 3	19	13	13	13
+nover:: 9	19	13	13	13
+nover:: 10	19	13	13	13
+nover:: 16	19	13	13	13
+16
+nover:: 0	0	5	8	9
+class1	0.371089	0.621864
+class1	-0.915483	0.750707
+class1	-0.21721	0.82568
+class1	-0.156369	0.412541
+class1	-1.18724	0.667941
+nover:: 0	0	5	8	9
+class2	-0.00690342	2.05161
+class2	-1.65854	2.53984
+class2	-0.659463	1.16788
+class2	-0.924565	1.15009
+class2	-0.322882	2.96576
+class2	-0.907294	1.87985
+class2	-1.42691	1.34901
+class2	-0.625502	1.92287
+nover:: 0	0	5	8	9
+class3	1.39703	2.48139
+class3	1.20714	2.70834
+class3	0.375362	1.93082
+class3	0.317801	1.38887
+class3	0.318068	1.2889
+class3	0.273861	1.25441
+class3	1.51356	2.0762
+class3	0.855051	2.86667
+class3	1.28011	1.57382
+nover:: 0	0	5	8	9
+16
+nover:: 3	19	13	13	13
+nover:: 9	19	13	13	13
+nover:: 10	19	13	13	13
+nover:: 15	19	13	13	13
+15
+nover:: 0	0	5	8	9
+class1	0.371089	-2.10512
+class1	-0.915483	-3.48748
+class1	-0.21721	-5.22061
+class1	-0.156369	-1.12941
+class1	-1.18724	-2.47797
+nover:: 0	0	5	8	9
+class2	-0.00690342	1.39155
+class2	-1.65854	1.07285
+class2	-0.659463	6.4437
+class2	-0.924565	7.15082
+class2	-0.322882	0.919849
+class2	-0.907294	1.5843
+class2	-1.42691	3.34031
+class2	-0.625502	1.52947
+nover:: 0	0	5	8	9
+class3	1.39703	1.10033
+class3	1.20714	1.00368
+class3	0.375362	1.51989
+class3	0.317801	3.04424
+class3	0.318068	3.94025
+class3	0.273861	4.41186
+class3	1.51356	1.36885
+class3	0.855051	0.949531
+class3	1.28011	2.20505
+nover:: 0	0	5	8	9
+15
+nover:: 3	19	13	13	13
+nover:: 9	19	13	13	13
+nover:: 10	19	13	13	13
+nover:: 15	19	13	13	13
+15
+nover:: 0	0	5	8	9
+class1	0.371089	-0.780263
+class1	-0.915483	-0.659421
+class1	-0.21721	-0.576447
+class1	-0.156369	-0.960247
+class1	-1.18724	-0.738983
+nover:: 0	0	5	8	9
+class2	-0.00690342	0.895709
+class2	-1.65854	0.976834
+class2	-0.659463	0.537388
+class2	-0.924565	0.519057
+class2	-0.322882	1.02824
+class2	-0.907294	0.857803
+class2	-1.42691	0.668967
+class2	-0.625502	0.867933
+nover:: 0	0	5	8	9
+class3	1.39703	0.968632
+class3	1.20714	0.998777
+class3	0.375362	0.869753
+class3	0.317801	0.689986
+class3	0.318068	0.633129
+class3	0.273861	0.609714
+class3	1.51356	0.900632
+class3	0.855051	1.01741
+class3	1.28011	0.768294
+nover:: 0	0	5	8	9
+15
+nover:: 3	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 17	19	13	13	13
+nover:: 22	19	13	13	13
+22
+nover:: 0	0	5	8	9
+class1	0.371089	-0.486412
+class1	-0.915483	-3.03942
+class1	-0.21721	-3.02089
+class1	-0.156369	-0.175412
+class1	-1.18724	-1.99974
+nover:: 0	0	5	8	9
+class2	-0.00690342	1.09146
+class2	-1.65854	0.722703
+class2	-0.659463	1.73506
+class2	-0.924565	2.91975
+class2	-0.322882	0.60349
+class2	-0.907294	1.509
+class2	-1.42691	3.39595
+class2	-0.625502	1.37618
+nover:: 0	0	5	8	9
+class3	1.39703	-0.89113
+class3	1.20714	-0.979081
+class3	0.375362	-0.372643
+class3	0.317801	-3.28936
+class3	0.318068	-3.68519
+class3	0.273861	-0.911718
+class3	1.51356	-1.11635
+class3	0.855051	-0.862576
+class3	1.28011	-1.12427
+nover:: 1	0	5	8	9
+23
+nover:: 3	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 17	19	13	13	13
+nover:: 22	19	13	13	13
+22
+nover:: 0	0	5	8	9
+class1	0.371089	-2.05587
+class1	-0.915483	-0.329011
+class1	-0.21721	-0.331029
+class1	-0.156369	-5.70087
+class1	-1.18724	-0.500064
+nover:: 0	0	5	8	9
+class2	-0.00690342	0.916201
+class2	-1.65854	1.38369
+class2	-0.659463	0.57635
+class2	-0.924565	0.342495
+class2	-0.322882	1.65703
+class2	-0.907294	0.662693
+class2	-1.42691	0.294469
+class2	-0.625502	0.726648
+nover:: 0	0	5	8	9
+class3	1.39703	-1.12217
+class3	1.20714	-1.02137
+class3	0.375362	-2.68353
+class3	0.317801	-0.304011
+class3	0.318068	-0.271356
+class3	0.273861	-1.09683
+class3	1.51356	-0.895775
+class3	0.855051	-1.15932
+class3	1.28011	-0.889468
+nover:: 1	0	5	8	9
+23
+nover:: 6	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 17	19	13	13	13
+nover:: 21	19	13	13	13
+21
+nover:: 0	0	5	8	9
+class1	0.371089	0.706095
+class1	-0.915483	1.15826
+class1	-0.21721	0.770195
+class1	-0.156369	1.04073
+class1	-1.18724	1.21056
+nover:: 0	0	5	8	9
+class2	-0.00690342	0.0657279
+class2	-1.65854	-0.258469
+class2	-0.659463	0.114074
+class2	-0.924565	0.268467
+class2	-0.322882	-0.431059
+class2	-0.907294	0.321275
+class2	-1.42691	0.717283
+class2	-0.625502	0.245956
+nover:: 0	0	5	8	9
+class3	1.39703	-1.71869
+class3	1.20714	-1.97183
+class3	0.375362	-0.90312
+class3	0.317801	-1.40901
+class3	0.318068	-1.18906
+class3	0.273861	-0.433313
+class3	1.51356	-1.54608
+class3	0.855051	-1.96157
+class3	1.28011	-0.963367
+nover:: 0	0	5	8	9
+21
+nover:: 7	19	13	13	13
+nover:: 15	19	13	13	13
+nover:: 20	19	13	13	13
+nover:: 30	19	13	13	13
+30
+nover:: 0	0	5	8	9
+class1	0.371089	0.229011
+class1	-0.915483	0.76531
+class1	-0.21721	0.546891
+class1	-0.156369	0.154689
+class1	-1.18724	0.722221
+nover:: 0	0	5	8	9
+class2	-0.00690342	0.706366
+class2	-1.65854	0.623828
+class2	-0.659463	0.266022
+class2	-0.924565	0.39706
+class2	-0.322882	0.610012
+class2	-0.907294	0.814849
+class2	-1.42691	0.850353
+class2	-0.625502	0.783187
+nover:: 2	0	5	8	9
+class3	1.39703	-0.724201
+class3	1.20714	-0.827979
+class3	0.375362	-0.242729
+class3	0.317801	-0.882203
+class3	0.318068	-0.804758
+class3	0.273861	-0.205184
+class3	1.51356	-0.728094
+class3	0.855051	-0.788535
+class3	1.28011	-0.488056
+nover:: 2	0	5	8	9
+32
+nover:: 7	19	13	13	13
+nover:: 15	19	13	13	13
+nover:: 18	19	13	13	13
+nover:: 31	19	13	13	13
+31
+nover:: 0	0	5	8	9
+class1	0.371089	4.32785
+class1	-0.915483	1.14742
+class1	-0.21721	1.72817
+class1	-0.156369	6.43861
+class1	-1.18724	1.23914
+nover:: 2	0	5	8	9
+class2	-0.00690342	1.27494
+class2	-1.65854	1.48449
+class2	-0.659463	3.71383
+class2	-0.924565	2.44911
+class2	-0.322882	1.52422
+class2	-0.907294	1.0499
+class2	-1.42691	0.983616
+class2	-0.625502	1.11139
+nover:: 5	0	5	8	9
+class3	1.39703	-1.23476
+class3	1.20714	-1.02512
+class3	0.375362	-4.07867
+class3	0.317801	-0.92548
+class3	0.318068	-1.06921
+class3	0.273861	-4.83906
+class3	1.51356	-1.22619
+class3	0.855051	-1.10081
+class3	1.28011	-1.96132
+nover:: 5	0	5	8	9
+36
+nover:: 7	19	13	13	13
+nover:: 15	19	13	13	13
+nover:: 19	19	13	13	13
+nover:: 29	19	13	13	13
+29
+nover:: 0	0	5	8	9
+class1	0.371089	0.613634
+class1	-0.915483	0.955197
+class1	-0.21721	0.833306
+class1	-0.156369	0.53753
+class1	-1.18724	0.931021
+nover:: 0	0	5	8	9
+class2	-0.00690342	0.922225
+class2	-1.65854	0.876612
+class2	-0.659463	0.645743
+class2	-0.924565	0.741874
+class2	-0.322882	0.86893
+class2	-0.907294	0.983898
+class2	-1.42691	1.00552
+class2	-0.625502	0.965409
+nover:: 3	0	5	8	9
+class3	1.39703	-0.932122
+class3	1.20714	-0.991764
+class3	0.375362	-0.625884
+class3	0.317801	-1.02615
+class3	0.318068	-0.97794
+class3	0.273861	-0.591216
+class3	1.51356	-0.934289
+class3	0.855051	-0.968492
+class3	1.28011	-0.798885
+nover:: 3	0	5	8	9
+32
+nover:: 3	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 11	19	13	13	13
+nover:: 17	19	13	13	13
+17
+nover:: 0	0	5	8	9
+class1	0.371089	-0.109762
+class1	-0.915483	-0.249901
+class1	-0.21721	-0.110839
+class1	-0.156369	-0.137517
+class1	-1.18724	-0.325673
+nover:: 0	0	5	8	9
+class2	-0.00690342	0.563653
+class2	-1.65854	0.627892
+class2	-0.659463	0.0417872
+class2	-0.924565	0.0570999
+class2	-0.322882	0.713242
+class2	-0.907294	0.601193
+class2	-1.42691	0.30436
+class2	-0.625502	0.588291
+nover:: 0	0	5	8	9
+class3	1.39703	-0.73603
+class3	1.20714	-0.97192
+class3	0.375362	-0.161313
+class3	0.317801	-0.35494
+class3	0.318068	-0.237363
+class3	0.273861	-0.04684
+class3	1.51356	-0.595781
+class3	0.855051	-0.956706
+class3	1.28011	-0.231225
+nover:: 1	0	5	8	9
+18
+nover:: 0	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 12	19	13	13	13
+12
+nover:: 0	0	5	8	9
+class1	0.371089	-0.243971
+class1	-0.915483	0.584782
+class1	-0.21721	0.387098
+class1	-0.156369	-0.730107
+class1	-1.18724	0.403453
+nover:: 2	0	5	8	9
+class2	-0.00690342	1.50298
+class2	-1.65854	1.60573
+class2	-0.659463	0.424454
+class2	-0.924565	0.548155
+class2	-0.322882	1.74321
+class2	-0.907294	1.58366
+class2	-1.42691	1.31603
+class2	-0.625502	1.55359
+nover:: 3	0	5	8	9
+class3	1.39703	0.0989428
+class3	1.20714	0.0208427
+class3	0.375362	0.412765
+class3	0.317801	-0.752031
+class3	0.318068	-0.681478
+class3	0.273861	0.0200101
+class3	1.51356	-0.0849994
+class3	0.855051	0.144729
+class3	1.28011	-0.056356
+nover:: 4	0	5	8	9
+16
+nover:: 10	19	13	13	13
+nover:: 22	19	13	13	13
+nover:: 27	19	13	13	13
+nover:: 39	19	13	13	13
+39
+nover:: 0	0	5	8	9
+class1	-0.157555	-0.75326
+class1	0.120666	0.628871
+class1	-0.175276	-0.0593842
+class1	0.801389	0.647242
+class1	1.26265	1.28722
+nover:: 0	0	5	8	9
+class2	-0.118147	0.817849
+class2	0.841633	-0.227386
+class2	-0.643453	-0.466541
+class2	1.25726	0.610938
+class2	1.02528	1.64152
+class2	1.29975	0.719807
+class2	1.10103	0.627681
+class2	0.655527	1.888
+nover:: 4	0	5	8	9
+class3	-0.339535	-0.15381
+class3	-0.122455	-1.89686
+class3	0.432834	-0.786544
+class3	-1.76351	-1.03634
+class3	-0.484982	-1.01359
+class3	-0.913971	-0.425693
+class3	-0.521283	-0.162984
+class3	-1.22289	-0.0469491
+class3	-1.3604	-1.30034
+nover:: 6	0	5	8	9
+45
+nover:: 10	19	13	13	13
+nover:: 16	19	13	13	13
+nover:: 29	19	13	13	13
+nover:: 42	19	13	13	13
+42
+nover:: 0	0	5	8	9
+class1	-0.157555	0.787986
+class1	0.120666	1.35172
+class1	-0.175276	-0.382953
+class1	0.801389	-0.567763
+class1	1.26265	0.974964
+nover:: 4	0	5	8	9
+class2	-0.118147	0.605759
+class2	0.841633	0.570846
+class2	-0.643453	1.0747
+class2	1.25726	0.432437
+class2	1.02528	0.964599
+class2	1.29975	0.0567745
+class2	1.10103	1.12342
+class2	0.655527	1.46228
+nover:: 5	0	5	8	9
+class3	-0.339535	-0.0148797
+class3	-0.122455	-0.0463603
+class3	0.432834	-0.829335
+class3	-1.76351	-1.61237
+class3	-0.484982	-1.18048
+class3	-0.913971	-0.513542
+class3	-0.521283	-0.405191
+class3	-1.22289	-0.451419
+class3	-1.3604	-0.303203
+nover:: 6	0	5	8	9
+48
+nover:: 11	19	13	13	13
+nover:: 24	19	13	13	13
+nover:: 28	19	13	13	13
+nover:: 44	19	13	13	13
+44
+nover:: 0	0	5	8	9
+class1	-0.157555	0.833213
+class1	0.120666	0.827562
+class1	-0.175276	0.940082
+class1	0.801389	0.154257
+class1	1.26265	0.426781
+nover:: 0	0	5	8	9
+class2	-0.118147	1.5618
+class2	0.841633	-0.311279
+class2	-0.643453	-0.120935
+class2	1.25726	-0.107943
+class2	1.02528	0.989268
+class2	1.29975	0.997644
+class2	1.10103	0.606399
+class2	0.655527	1.17405
+nover:: 7	0	5	8	9
+class3	-0.339535	-0.222716
+class3	-0.122455	-0.743846
+class3	0.432834	-0.114994
+class3	-1.76351	-1.84324
+class3	-0.484982	-1.55247
+class3	-0.913971	-0.139442
+class3	-0.521283	-0.117519
+class3	-1.22289	-0.961794
+class3	-1.3604	0.260383
+nover:: 8	0	5	8	9
+52
+nover:: 6	19	13	13	13
+nover:: 12	19	13	13	13
+nover:: 19	19	13	13	13
+nover:: 30	19	13	13	13
+30
+nover:: 0	0	5	8	9
+class1	-0.157555	1.07718
+class1	0.120666	0.24278
+class1	-0.175276	0.552984
+class1	0.801389	0.884364
+class1	1.26265	0.0233282
+nover:: 0	0	5	8	9
+class2	-0.118147	0.0588245
+class2	0.841633	-1.91701
+class2	-0.643453	-0.54539
+class2	1.25726	-0.656098
+class2	1.02528	-0.753942
+class2	1.29975	-0.586019
+class2	1.10103	-0.709631
+class2	0.655527	-0.379547
+nover:: 3	0	5	8	9
+class3	-0.339535	-0.321659
+class3	-0.122455	-0.764689
+class3	0.432834	-0.527758
+class3	-1.76351	-1.09121
+class3	-0.484982	-0.870992
+class3	-0.913971	-0.159452
+class3	-0.521283	-0.0325197
+class3	-1.22289	-1.10652
+class3	-1.3604	0.316739
+nover:: 4	0	5	8	9
+34
+nover:: 5	19	13	13	13
+nover:: 10	19	13	13	13
+nover:: 15	19	13	13	13
+nover:: 23	19	13	13	13
+23
+nover:: 0	0	5	8	9
+class1	-0.157555	0.706095
+class1	0.120666	1.15826
+class1	-0.175276	0.770195
+class1	0.801389	1.04073
+class1	1.26265	1.21056
+nover:: 0	0	5	8	9
+class2	-0.118147	0.0657279
+class2	0.841633	0.258469
+class2	-0.643453	0.114074
+class2	1.25726	0.268467
+class2	1.02528	0.431059
+class2	1.29975	0.321275
+class2	1.10103	0.717283
+class2	0.655527	0.245956
+nover:: 0	0	5	8	9
+class3	-0.339535	1.71869
+class3	-0.122455	1.97183
+class3	0.432834	0.90312
+class3	-1.76351	1.40901
+class3	-0.484982	1.18906
+class3	-0.913971	0.433313
+class3	-0.521283	1.54608
+class3	-1.22289	1.96157
+class3	-1.3604	0.963367
+nover:: 3	0	5	8	9
+26
+nover:: 3	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 9	19	13	13	13
+nover:: 10	19	13	13	13
+10
+nover:: 0	0	5	8	9
+class1	-0.157555	0.621864
+class1	0.120666	0.750707
+class1	-0.175276	0.82568
+class1	0.801389	0.412541
+class1	1.26265	0.667941
+nover:: 0	0	5	8	9
+class2	-0.118147	2.05161
+class2	0.841633	2.53984
+class2	-0.643453	1.16788
+class2	1.25726	1.15009
+class2	1.02528	2.96576
+class2	1.29975	1.87985
+class2	1.10103	1.34901
+class2	0.655527	1.92287
+nover:: 2	0	5	8	9
+class3	-0.339535	2.48139
+class3	-0.122455	2.70834
+class3	0.432834	1.93082
+class3	-1.76351	1.38887
+class3	-0.484982	1.2889
+class3	-0.913971	1.25441
+class3	-0.521283	2.0762
+class3	-1.22289	2.86667
+class3	-1.3604	1.57382
+nover:: 3	0	5	8	9
+13
+16	10	13	10	0.0066992
+nover:: 3	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 9	19	13	13	13
+nover:: 10	19	13	13	13
+10
+nover:: 0	0	5	8	9
+class1	-0.157555	-2.10512
+class1	0.120666	-3.48748
+class1	-0.175276	-5.22061
+class1	0.801389	-1.12941
+class1	1.26265	-2.47797
+nover:: 0	0	5	8	9
+class2	-0.118147	1.39155
+class2	0.841633	1.07285
+class2	-0.643453	6.4437
+class2	1.25726	7.15082
+class2	1.02528	0.919849
+class2	1.29975	1.5843
+class2	1.10103	3.34031
+class2	0.655527	1.52947
+nover:: 1	0	5	8	9
+class3	-0.339535	1.10033
+class3	-0.122455	1.00368
+class3	0.432834	1.51989
+class3	-1.76351	3.04424
+class3	-0.484982	3.94025
+class3	-0.913971	4.41186
+class3	-0.521283	1.36885
+class3	-1.22289	0.949531
+class3	-1.3604	2.20505
+nover:: 2	0	5	8	9
+12
+16	9	12	10	0.00657075
+nover:: 3	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 9	19	13	13	13
+nover:: 10	19	13	13	13
+10
+nover:: 0	0	5	8	9
+class1	-0.157555	-0.780263
+class1	0.120666	-0.659421
+class1	-0.175276	-0.576447
+class1	0.801389	-0.960247
+class1	1.26265	-0.738983
+nover:: 0	0	5	8	9
+class2	-0.118147	0.895709
+class2	0.841633	0.976834
+class2	-0.643453	0.537388
+class2	1.25726	0.519057
+class2	1.02528	1.02824
+class2	1.29975	0.857803
+class2	1.10103	0.668967
+class2	0.655527	0.867933
+nover:: 2	0	5	8	9
+class3	-0.339535	0.968632
+class3	-0.122455	0.998777
+class3	0.432834	0.869753
+class3	-1.76351	0.689986
+class3	-0.484982	0.633129
+class3	-0.913971	0.609714
+class3	-0.521283	0.900632
+class3	-1.22289	1.01741
+class3	-1.3604	0.768294
+nover:: 3	0	5	8	9
+13
+16	8	13	10	0.00663274
+nover:: 1	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 5	19	13	13	13
+nover:: 7	19	13	13	13
+7
+nover:: 0	0	5	8	9
+class1	-0.157555	-0.486412
+class1	0.120666	-3.03942
+class1	-0.175276	-3.02089
+class1	0.801389	-0.175412
+class1	1.26265	-1.99974
+nover:: 2	0	5	8	9
+class2	-0.118147	1.09146
+class2	0.841633	0.722703
+class2	-0.643453	1.73506
+class2	1.25726	2.91975
+class2	1.02528	0.60349
+class2	1.29975	1.509
+class2	1.10103	3.39595
+class2	0.655527	1.37618
+nover:: 2	0	5	8	9
+class3	-0.339535	-0.89113
+class3	-0.122455	-0.979081
+class3	0.432834	-0.372643
+class3	-1.76351	-3.28936
+class3	-0.484982	-3.68519
+class3	-0.913971	-0.911718
+class3	-0.521283	-1.11635
+class3	-1.22289	-0.862576
+class3	-1.3604	-1.12427
+nover:: 2	0	5	8	9
+9
+16	7	9	9	0.00599347
+nover:: 1	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 5	19	13	13	13
+nover:: 7	19	13	13	13
+7
+nover:: 0	0	5	8	9
+class1	-0.157555	-2.05587
+class1	0.120666	-0.329011
+class1	-0.175276	-0.331029
+class1	0.801389	-5.70087
+class1	1.26265	-0.500064
+nover:: 2	0	5	8	9
+class2	-0.118147	0.916201
+class2	0.841633	1.38369
+class2	-0.643453	0.57635
+class2	1.25726	0.342495
+class2	1.02528	1.65703
+class2	1.29975	0.662693
+class2	1.10103	0.294469
+class2	0.655527	0.726648
+nover:: 2	0	5	8	9
+class3	-0.339535	-1.12217
+class3	-0.122455	-1.02137
+class3	0.432834	-2.68353
+class3	-1.76351	-0.304011
+class3	-0.484982	-0.271356
+class3	-0.913971	-1.09683
+class3	-0.521283	-0.895775
+class3	-1.22289	-1.15932
+class3	-1.3604	-0.889468
+nover:: 3	0	5	8	9
+10
+16	6	10	8	0.0058347
+nover:: 5	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 10	19	13	13	13
+nover:: 11	19	13	13	13
+11
+nover:: 0	0	5	8	9
+class1	-0.157555	0.706095
+class1	0.120666	1.15826
+class1	-0.175276	0.770195
+class1	0.801389	1.04073
+class1	1.26265	1.21056
+nover:: 0	0	5	8	9
+class2	-0.118147	0.0657279
+class2	0.841633	-0.258469
+class2	-0.643453	0.114074
+class2	1.25726	0.268467
+class2	1.02528	-0.431059
+class2	1.29975	0.321275
+class2	1.10103	0.717283
+class2	0.655527	0.245956
+nover:: 0	0	5	8	9
+class3	-0.339535	-1.71869
+class3	-0.122455	-1.97183
+class3	0.432834	-0.90312
+class3	-1.76351	-1.40901
+class3	-0.484982	-1.18906
+class3	-0.913971	-0.433313
+class3	-0.521283	-1.54608
+class3	-1.22289	-1.96157
+class3	-1.3604	-0.963367
+nover:: 0	0	5	8	9
+11
+16	5	11	11	0.0059035
+nover:: 7	19	13	13	13
+nover:: 15	19	13	13	13
+nover:: 19	19	13	13	13
+nover:: 32	19	13	13	13
+32
+nover:: 0	0	5	8	9
+class1	-0.157555	0.229011
+class1	0.120666	0.76531
+class1	-0.175276	0.546891
+class1	0.801389	0.154689
+class1	1.26265	0.722221
+nover:: 2	0	5	8	9
+class2	-0.118147	0.706366
+class2	0.841633	0.623828
+class2	-0.643453	0.266022
+class2	1.25726	0.39706
+class2	1.02528	0.610012
+class2	1.29975	0.814849
+class2	1.10103	0.850353
+class2	0.655527	0.783187
+nover:: 4	0	5	8	9
+class3	-0.339535	-0.724201
+class3	-0.122455	-0.827979
+class3	0.432834	-0.242729
+class3	-1.76351	-0.882203
+class3	-0.484982	-0.804758
+class3	-0.913971	-0.205184
+class3	-0.521283	-0.728094
+class3	-1.22289	-0.788535
+class3	-1.3604	-0.488056
+nover:: 4	0	5	8	9
+36
+nover:: 8	19	13	13	13
+nover:: 16	19	13	13	13
+nover:: 20	19	13	13	13
+nover:: 33	19	13	13	13
+33
+nover:: 0	0	5	8	9
+class1	-0.157555	4.32785
+class1	0.120666	1.14742
+class1	-0.175276	1.72817
+class1	0.801389	6.43861
+class1	1.26265	1.23914
+nover:: 2	0	5	8	9
+class2	-0.118147	1.27494
+class2	0.841633	1.48449
+class2	-0.643453	3.71383
+class2	1.25726	2.44911
+class2	1.02528	1.52422
+class2	1.29975	1.0499
+class2	1.10103	0.983616
+class2	0.655527	1.11139
+nover:: 4	0	5	8	9
+class3	-0.339535	-1.23476
+class3	-0.122455	-1.02512
+class3	0.432834	-4.07867
+class3	-1.76351	-0.92548
+class3	-0.484982	-1.06921
+class3	-0.913971	-4.83906
+class3	-0.521283	-1.22619
+class3	-1.22289	-1.10081
+class3	-1.3604	-1.96132
+nover:: 4	0	5	8	9
+37
+nover:: 7	19	13	13	13
+nover:: 15	19	13	13	13
+nover:: 19	19	13	13	13
+nover:: 32	19	13	13	13
+32
+nover:: 0	0	5	8	9
+class1	-0.157555	0.613634
+class1	0.120666	0.955197
+class1	-0.175276	0.833306
+class1	0.801389	0.53753
+class1	1.26265	0.931021
+nover:: 2	0	5	8	9
+class2	-0.118147	0.922225
+class2	0.841633	0.876612
+class2	-0.643453	0.645743
+class2	1.25726	0.741874
+class2	1.02528	0.86893
+class2	1.29975	0.983898
+class2	1.10103	1.00552
+class2	0.655527	0.965409
+nover:: 4	0	5	8	9
+class3	-0.339535	-0.932122
+class3	-0.122455	-0.991764
+class3	0.432834	-0.625884
+class3	-1.76351	-1.02615
+class3	-0.484982	-0.97794
+class3	-0.913971	-0.591216
+class3	-0.521283	-0.934289
+class3	-1.22289	-0.968492
+class3	-1.3604	-0.798885
+nover:: 4	0	5	8	9
+36
+nover:: 0	19	13	13	13
+nover:: 1	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 7	19	13	13	13
+7
+nover:: 0	0	5	8	9
+class1	-0.157555	-0.109762
+class1	0.120666	-0.249901
+class1	-0.175276	-0.110839
+class1	0.801389	-0.137517
+class1	1.26265	-0.325673
+nover:: 1	0	5	8	9
+class2	-0.118147	0.563653
+class2	0.841633	0.627892
+class2	-0.643453	0.0417872
+class2	1.25726	0.0570999
+class2	1.02528	0.713242
+class2	1.29975	0.601193
+class2	1.10103	0.30436
+class2	0.655527	0.588291
+nover:: 1	0	5	8	9
+class3	-0.339535	-0.73603
+class3	-0.122455	-0.97192
+class3	0.432834	-0.161313
+class3	-1.76351	-0.35494
+class3	-0.484982	-0.237363
+class3	-0.913971	-0.04684
+class3	-0.521283	-0.595781
+class3	-1.22289	-0.956706
+class3	-1.3604	-0.231225
+nover:: 3	0	5	8	9
+10
+16	1	10	10	0.00565436
+nover:: 0	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 8	19	13	13	13
+8
+nover:: 0	0	5	8	9
+class1	-0.157555	-0.243971
+class1	0.120666	0.584782
+class1	-0.175276	0.387098
+class1	0.801389	-0.730107
+class1	1.26265	0.403453
+nover:: 2	0	5	8	9
+class2	-0.118147	1.50298
+class2	0.841633	1.60573
+class2	-0.643453	0.424454
+class2	1.25726	0.548155
+class2	1.02528	1.74321
+class2	1.29975	1.58366
+class2	1.10103	1.31603
+class2	0.655527	1.55359
+nover:: 3	0	5	8	9
+class3	-0.339535	0.0989428
+class3	-0.122455	0.0208427
+class3	0.432834	0.412765
+class3	-1.76351	-0.752031
+class3	-0.484982	-0.681478
+class3	-0.913971	0.0200101
+class3	-0.521283	-0.0849994
+class3	-1.22289	0.144729
+class3	-1.3604	-0.056356
+nover:: 4	0	5	8	9
+12
+nover:: 19	19	13	13	13
+nover:: 27	19	13	13	13
+nover:: 39	19	13	13	13
+nover:: 48	19	13	13	13
+48
+nover:: 0	0	5	8	9
+class1	-0.75326	0.787986
+class1	0.628871	1.35172
+class1	-0.0593842	-0.382953
+class1	0.647242	-0.567763
+class1	1.28722	0.974964
+nover:: 6	0	5	8	9
+class2	0.817849	0.605759
+class2	-0.227386	0.570846
+class2	-0.466541	1.0747
+class2	0.610938	0.432437
+class2	1.64152	0.964599
+class2	0.719807	0.0567745
+class2	0.627681	1.12342
+class2	1.888	1.46228
+nover:: 7	0	5	8	9
+class3	-0.15381	-0.0148797
+class3	-1.89686	-0.0463603
+class3	-0.786544	-0.829335
+class3	-1.03634	-1.61237
+class3	-1.01359	-1.18048
+class3	-0.425693	-0.513542
+class3	-0.162984	-0.405191
+class3	-0.0469491	-0.451419
+class3	-1.30034	-0.303203
+nover:: 7	0	5	8	9
+55
+nover:: 17	19	13	13	13
+nover:: 30	19	13	13	13
+nover:: 38	19	13	13	13
+nover:: 52	19	13	13	13
+52
+nover:: 0	0	5	8	9
+class1	-0.75326	0.833213
+class1	0.628871	0.827562
+class1	-0.0593842	0.940082
+class1	0.647242	0.154257
+class1	1.28722	0.426781
+nover:: 1	0	5	8	9
+class2	0.817849	1.5618
+class2	-0.227386	-0.311279
+class2	-0.466541	-0.120935
+class2	0.610938	-0.107943
+class2	1.64152	0.989268
+class2	0.719807	0.997644
+class2	0.627681	0.606399
+class2	1.888	1.17405
+nover:: 6	0	5	8	9
+class3	-0.15381	-0.222716
+class3	-1.89686	-0.743846
+class3	-0.786544	-0.114994
+class3	-1.03634	-1.84324
+class3	-1.01359	-1.55247
+class3	-0.425693	-0.139442
+class3	-0.162984	-0.117519
+class3	-0.0469491	-0.961794
+class3	-1.30034	0.260383
+nover:: 8	0	5	8	9
+60
+nover:: 16	19	13	13	13
+nover:: 25	19	13	13	13
+nover:: 32	19	13	13	13
+nover:: 43	19	13	13	13
+43
+nover:: 0	0	5	8	9
+class1	-0.75326	1.07718
+class1	0.628871	0.24278
+class1	-0.0593842	0.552984
+class1	0.647242	0.884364
+class1	1.28722	0.0233282
+nover:: 0	0	5	8	9
+class2	0.817849	0.0588245
+class2	-0.227386	-1.91701
+class2	-0.466541	-0.54539
+class2	0.610938	-0.656098
+class2	1.64152	-0.753942
+class2	0.719807	-0.586019
+class2	0.627681	-0.709631
+class2	1.888	-0.379547
+nover:: 1	0	5	8	9
+class3	-0.15381	-0.321659
+class3	-1.89686	-0.764689
+class3	-0.786544	-0.527758
+class3	-1.03634	-1.09121
+class3	-1.01359	-0.870992
+class3	-0.425693	-0.159452
+class3	-0.162984	-0.0325197
+class3	-0.0469491	-1.10652
+class3	-1.30034	0.316739
+nover:: 2	0	5	8	9
+45
+nover:: 13	19	13	13	13
+nover:: 21	19	13	13	13
+nover:: 28	19	13	13	13
+nover:: 39	19	13	13	13
+39
+nover:: 0	0	5	8	9
+class1	-0.75326	0.706095
+class1	0.628871	1.15826
+class1	-0.0593842	0.770195
+class1	0.647242	1.04073
+class1	1.28722	1.21056
+nover:: 0	0	5	8	9
+class2	0.817849	0.0657279
+class2	-0.227386	0.258469
+class2	-0.466541	0.114074
+class2	0.610938	0.268467
+class2	1.64152	0.431059
+class2	0.719807	0.321275
+class2	0.627681	0.717283
+class2	1.888	0.245956
+nover:: 0	0	5	8	9
+class3	-0.15381	1.71869
+class3	-1.89686	1.97183
+class3	-0.786544	0.90312
+class3	-1.03634	1.40901
+class3	-1.01359	1.18906
+class3	-0.425693	0.433313
+class3	-0.162984	1.54608
+class3	-0.0469491	1.96157
+class3	-1.30034	0.963367
+nover:: 1	0	5	8	9
+40
+nover:: 5	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 11	19	13	13	13
+nover:: 16	19	13	13	13
+16
+nover:: 0	0	5	8	9
+class1	-0.75326	0.621864
+class1	0.628871	0.750707
+class1	-0.0593842	0.82568
+class1	0.647242	0.412541
+class1	1.28722	0.667941
+nover:: 0	0	5	8	9
+class2	0.817849	2.05161
+class2	-0.227386	2.53984
+class2	-0.466541	1.16788
+class2	0.610938	1.15009
+class2	1.64152	2.96576
+class2	0.719807	1.87985
+class2	0.627681	1.34901
+class2	1.888	1.92287
+nover:: 3	0	5	8	9
+class3	-0.15381	2.48139
+class3	-1.89686	2.70834
+class3	-0.786544	1.93082
+class3	-1.03634	1.38887
+class3	-1.01359	1.2889
+class3	-0.425693	1.25441
+class3	-0.162984	2.0762
+class3	-0.0469491	2.86667
+class3	-1.30034	1.57382
+nover:: 4	0	5	8	9
+20
+nover:: 5	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 11	19	13	13	13
+nover:: 16	19	13	13	13
+16
+nover:: 0	0	5	8	9
+class1	-0.75326	-2.10512
+class1	0.628871	-3.48748
+class1	-0.0593842	-5.22061
+class1	0.647242	-1.12941
+class1	1.28722	-2.47797
+nover:: 0	0	5	8	9
+class2	0.817849	1.39155
+class2	-0.227386	1.07285
+class2	-0.466541	6.4437
+class2	0.610938	7.15082
+class2	1.64152	0.919849
+class2	0.719807	1.5843
+class2	0.627681	3.34031
+class2	1.888	1.52947
+nover:: 2	0	5	8	9
+class3	-0.15381	1.10033
+class3	-1.89686	1.00368
+class3	-0.786544	1.51989
+class3	-1.03634	3.04424
+class3	-1.01359	3.94025
+class3	-0.425693	4.41186
+class3	-0.162984	1.36885
+class3	-0.0469491	0.949531
+class3	-1.30034	2.20505
+nover:: 3	0	5	8	9
+19
+nover:: 6	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 12	19	13	13	13
+nover:: 17	19	13	13	13
+17
+nover:: 0	0	5	8	9
+class1	-0.75326	-0.780263
+class1	0.628871	-0.659421
+class1	-0.0593842	-0.576447
+class1	0.647242	-0.960247
+class1	1.28722	-0.738983
+nover:: 0	0	5	8	9
+class2	0.817849	0.895709
+class2	-0.227386	0.976834
+class2	-0.466541	0.537388
+class2	0.610938	0.519057
+class2	1.64152	1.02824
+class2	0.719807	0.857803
+class2	0.627681	0.668967
+class2	1.888	0.867933
+nover:: 3	0	5	8	9
+class3	-0.15381	0.968632
+class3	-1.89686	0.998777
+class3	-0.786544	0.869753
+class3	-1.03634	0.689986
+class3	-1.01359	0.633129
+class3	-0.425693	0.609714
+class3	-0.162984	0.900632
+class3	-0.0469491	1.01741
+class3	-1.30034	0.768294
+nover:: 4	0	5	8	9
+21
+nover:: 6	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 11	19	13	13	13
+nover:: 16	19	13	13	13
+16
+nover:: 0	0	5	8	9
+class1	-0.75326	-0.486412
+class1	0.628871	-3.03942
+class1	-0.0593842	-3.02089
+class1	0.647242	-0.175412
+class1	1.28722	-1.99974
+nover:: 4	0	5	8	9
+class2	0.817849	1.09146
+class2	-0.227386	0.722703
+class2	-0.466541	1.73506
+class2	0.610938	2.91975
+class2	1.64152	0.60349
+class2	0.719807	1.509
+class2	0.627681	3.39595
+class2	1.888	1.37618
+nover:: 4	0	5	8	9
+class3	-0.15381	-0.89113
+class3	-1.89686	-0.979081
+class3	-0.786544	-0.372643
+class3	-1.03634	-3.28936
+class3	-1.01359	-3.68519
+class3	-0.425693	-0.911718
+class3	-0.162984	-1.11635
+class3	-0.0469491	-0.862576
+class3	-1.30034	-1.12427
+nover:: 5	0	5	8	9
+21
+nover:: 5	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 9	19	13	13	13
+nover:: 14	19	13	13	13
+14
+nover:: 0	0	5	8	9
+class1	-0.75326	-2.05587
+class1	0.628871	-0.329011
+class1	-0.0593842	-0.331029
+class1	0.647242	-5.70087
+class1	1.28722	-0.500064
+nover:: 3	0	5	8	9
+class2	0.817849	0.916201
+class2	-0.227386	1.38369
+class2	-0.466541	0.57635
+class2	0.610938	0.342495
+class2	1.64152	1.65703
+class2	0.719807	0.662693
+class2	0.627681	0.294469
+class2	1.888	0.726648
+nover:: 3	0	5	8	9
+class3	-0.15381	-1.12217
+class3	-1.89686	-1.02137
+class3	-0.786544	-2.68353
+class3	-1.03634	-0.304011
+class3	-1.01359	-0.271356
+class3	-0.425693	-1.09683
+class3	-0.162984	-0.895775
+class3	-0.0469491	-1.15932
+class3	-1.30034	-0.889468
+nover:: 4	0	5	8	9
+18
+nover:: 12	19	13	13	13
+nover:: 14	19	13	13	13
+nover:: 21	19	13	13	13
+nover:: 24	19	13	13	13
+24
+nover:: 0	0	5	8	9
+class1	-0.75326	0.706095
+class1	0.628871	1.15826
+class1	-0.0593842	0.770195
+class1	0.647242	1.04073
+class1	1.28722	1.21056
+nover:: 0	0	5	8	9
+class2	0.817849	0.0657279
+class2	-0.227386	-0.258469
+class2	-0.466541	0.114074
+class2	0.610938	0.268467
+class2	1.64152	-0.431059
+class2	0.719807	0.321275
+class2	0.627681	0.717283
+class2	1.888	0.245956
+nover:: 0	0	5	8	9
+class3	-0.15381	-1.71869
+class3	-1.89686	-1.97183
+class3	-0.786544	-0.90312
+class3	-1.03634	-1.40901
+class3	-1.01359	-1.18906
+class3	-0.425693	-0.433313
+class3	-0.162984	-1.54608
+class3	-0.0469491	-1.96157
+class3	-1.30034	-0.963367
+nover:: 0	0	5	8	9
+24
+nover:: 9	19	13	13	13
+nover:: 14	19	13	13	13
+nover:: 20	19	13	13	13
+nover:: 24	19	13	13	13
+24
+nover:: 0	0	5	8	9
+class1	-0.75326	0.229011
+class1	0.628871	0.76531
+class1	-0.0593842	0.546891
+class1	0.647242	0.154689
+class1	1.28722	0.722221
+nover:: 3	0	5	8	9
+class2	0.817849	0.706366
+class2	-0.227386	0.623828
+class2	-0.466541	0.266022
+class2	0.610938	0.39706
+class2	1.64152	0.610012
+class2	0.719807	0.814849
+class2	0.627681	0.850353
+class2	1.888	0.783187
+nover:: 6	0	5	8	9
+class3	-0.15381	-0.724201
+class3	-1.89686	-0.827979
+class3	-0.786544	-0.242729
+class3	-1.03634	-0.882203
+class3	-1.01359	-0.804758
+class3	-0.425693	-0.205184
+class3	-0.162984	-0.728094
+class3	-0.0469491	-0.788535
+class3	-1.30034	-0.488056
+nover:: 6	0	5	8	9
+30
+nover:: 9	19	13	13	13
+nover:: 14	19	13	13	13
+nover:: 20	19	13	13	13
+nover:: 26	19	13	13	13
+26
+nover:: 0	0	5	8	9
+class1	-0.75326	4.32785
+class1	0.628871	1.14742
+class1	-0.0593842	1.72817
+class1	0.647242	6.43861
+class1	1.28722	1.23914
+nover:: 3	0	5	8	9
+class2	0.817849	1.27494
+class2	-0.227386	1.48449
+class2	-0.466541	3.71383
+class2	0.610938	2.44911
+class2	1.64152	1.52422
+class2	0.719807	1.0499
+class2	0.627681	0.983616
+class2	1.888	1.11139
+nover:: 6	0	5	8	9
+class3	-0.15381	-1.23476
+class3	-1.89686	-1.02512
+class3	-0.786544	-4.07867
+class3	-1.03634	-0.92548
+class3	-1.01359	-1.06921
+class3	-0.425693	-4.83906
+class3	-0.162984	-1.22619
+class3	-0.0469491	-1.10081
+class3	-1.30034	-1.96132
+nover:: 6	0	5	8	9
+32
+nover:: 9	19	13	13	13
+nover:: 15	19	13	13	13
+nover:: 21	19	13	13	13
+nover:: 27	19	13	13	13
+27
+nover:: 0	0	5	8	9
+class1	-0.75326	0.613634
+class1	0.628871	0.955197
+class1	-0.0593842	0.833306
+class1	0.647242	0.53753
+class1	1.28722	0.931021
+nover:: 3	0	5	8	9
+class2	0.817849	0.922225
+class2	-0.227386	0.876612
+class2	-0.466541	0.645743
+class2	0.610938	0.741874
+class2	1.64152	0.86893
+class2	0.719807	0.983898
+class2	0.627681	1.00552
+class2	1.888	0.965409
+nover:: 6	0	5	8	9
+class3	-0.15381	-0.932122
+class3	-1.89686	-0.991764
+class3	-0.786544	-0.625884
+class3	-1.03634	-1.02615
+class3	-1.01359	-0.97794
+class3	-0.425693	-0.591216
+class3	-0.162984	-0.934289
+class3	-0.0469491	-0.968492
+class3	-1.30034	-0.798885
+nover:: 6	0	5	8	9
+33
+nover:: 4	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 13	19	13	13	13
+nover:: 18	19	13	13	13
+18
+nover:: 0	0	5	8	9
+class1	-0.75326	-0.109762
+class1	0.628871	-0.249901
+class1	-0.0593842	-0.110839
+class1	0.647242	-0.137517
+class1	1.28722	-0.325673
+nover:: 0	0	5	8	9
+class2	0.817849	0.563653
+class2	-0.227386	0.627892
+class2	-0.466541	0.0417872
+class2	0.610938	0.0570999
+class2	1.64152	0.713242
+class2	0.719807	0.601193
+class2	0.627681	0.30436
+class2	1.888	0.588291
+nover:: 0	0	5	8	9
+class3	-0.15381	-0.73603
+class3	-1.89686	-0.97192
+class3	-0.786544	-0.161313
+class3	-1.03634	-0.35494
+class3	-1.01359	-0.237363
+class3	-0.425693	-0.04684
+class3	-0.162984	-0.595781
+class3	-0.0469491	-0.956706
+class3	-1.30034	-0.231225
+nover:: 0	0	5	8	9
+18
+nover:: 1	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 11	19	13	13	13
+11
+nover:: 0	0	5	8	9
+class1	-0.75326	-0.243971
+class1	0.628871	0.584782
+class1	-0.0593842	0.387098
+class1	0.647242	-0.730107
+class1	1.28722	0.403453
+nover:: 5	0	5	8	9
+class2	0.817849	1.50298
+class2	-0.227386	1.60573
+class2	-0.466541	0.424454
+class2	0.610938	0.548155
+class2	1.64152	1.74321
+class2	0.719807	1.58366
+class2	0.627681	1.31603
+class2	1.888	1.55359
+nover:: 6	0	5	8	9
+class3	-0.15381	0.0989428
+class3	-1.89686	0.0208427
+class3	-0.786544	0.412765
+class3	-1.03634	-0.752031
+class3	-1.01359	-0.681478
+class3	-0.425693	0.0200101
+class3	-0.162984	-0.0849994
+class3	-0.0469491	0.144729
+class3	-1.30034	-0.056356
+nover:: 7	0	5	8	9
+18
+nover:: 16	19	13	13	13
+nover:: 25	19	13	13	13
+nover:: 34	19	13	13	13
+nover:: 45	19	13	13	13
+45
+nover:: 0	0	5	8	9
+class1	0.787986	0.833213
+class1	1.35172	0.827562
+class1	-0.382953	0.940082
+class1	-0.567763	0.154257
+class1	0.974964	0.426781
+nover:: 2	0	5	8	9
+class2	0.605759	1.5618
+class2	0.570846	-0.311279
+class2	1.0747	-0.120935
+class2	0.432437	-0.107943
+class2	0.964599	0.989268
+class2	0.0567745	0.997644
+class2	1.12342	0.606399
+class2	1.46228	1.17405
+nover:: 5	0	5	8	9
+class3	-0.0148797	-0.222716
+class3	-0.0463603	-0.743846
+class3	-0.829335	-0.114994
+class3	-1.61237	-1.84324
+class3	-1.18048	-1.55247
+class3	-0.513542	-0.139442
+class3	-0.405191	-0.117519
+class3	-0.451419	-0.961794
+class3	-0.303203	0.260383
+nover:: 5	0	5	8	9
+50
+nover:: 14	19	13	13	13
+nover:: 19	19	13	13	13
+nover:: 31	19	13	13	13
+nover:: 38	19	13	13	13
+38
+nover:: 0	0	5	8	9
+class1	0.787986	1.07718
+class1	1.35172	0.24278
+class1	-0.382953	0.552984
+class1	-0.567763	0.884364
+class1	0.974964	0.0233282
+nover:: 0	0	5	8	9
+class2	0.605759	0.0588245
+class2	0.570846	-1.91701
+class2	1.0747	-0.54539
+class2	0.432437	-0.656098
+class2	0.964599	-0.753942
+class2	0.0567745	-0.586019
+class2	1.12342	-0.709631
+class2	1.46228	-0.379547
+nover:: 0	0	5	8	9
+class3	-0.0148797	-0.321659
+class3	-0.0463603	-0.764689
+class3	-0.829335	-0.527758
+class3	-1.61237	-1.09121
+class3	-1.18048	-0.870992
+class3	-0.513542	-0.159452
+class3	-0.405191	-0.0325197
+class3	-0.451419	-1.10652
+class3	-0.303203	0.316739
+nover:: 0	0	5	8	9
+38
+nover:: 9	19	13	13	13
+nover:: 12	19	13	13	13
+nover:: 28	19	13	13	13
+nover:: 39	19	13	13	13
+39
+nover:: 0	0	5	8	9
+class1	0.787986	0.706095
+class1	1.35172	1.15826
+class1	-0.382953	0.770195
+class1	-0.567763	1.04073
+class1	0.974964	1.21056
+nover:: 1	0	5	8	9
+class2	0.605759	0.0657279
+class2	0.570846	0.258469
+class2	1.0747	0.114074
+class2	0.432437	0.268467
+class2	0.964599	0.431059
+class2	0.0567745	0.321275
+class2	1.12342	0.717283
+class2	1.46228	0.245956
+nover:: 1	0	5	8	9
+class3	-0.0148797	1.71869
+class3	-0.0463603	1.97183
+class3	-0.829335	0.90312
+class3	-1.61237	1.40901
+class3	-1.18048	1.18906
+class3	-0.513542	0.433313
+class3	-0.405191	1.54608
+class3	-0.451419	1.96157
+class3	-0.303203	0.963367
+nover:: 3	0	5	8	9
+42
+nover:: 6	19	13	13	13
+nover:: 10	19	13	13	13
+nover:: 12	19	13	13	13
+nover:: 15	19	13	13	13
+15
+nover:: 0	0	5	8	9
+class1	0.787986	0.621864
+class1	1.35172	0.750707
+class1	-0.382953	0.82568
+class1	-0.567763	0.412541
+class1	0.974964	0.667941
+nover:: 0	0	5	8	9
+class2	0.605759	2.05161
+class2	0.570846	2.53984
+class2	1.0747	1.16788
+class2	0.432437	1.15009
+class2	0.964599	2.96576
+class2	0.0567745	1.87985
+class2	1.12342	1.34901
+class2	1.46228	1.92287
+nover:: 0	0	5	8	9
+class3	-0.0148797	2.48139
+class3	-0.0463603	2.70834
+class3	-0.829335	1.93082
+class3	-1.61237	1.38887
+class3	-1.18048	1.2889
+class3	-0.513542	1.25441
+class3	-0.405191	2.0762
+class3	-0.451419	2.86667
+class3	-0.303203	1.57382
+nover:: 0	0	5	8	9
+15
+nover:: 6	19	13	13	13
+nover:: 11	19	13	13	13
+nover:: 13	19	13	13	13
+nover:: 14	19	13	13	13
+14
+nover:: 0	0	5	8	9
+class1	0.787986	-2.10512
+class1	1.35172	-3.48748
+class1	-0.382953	-5.22061
+class1	-0.567763	-1.12941
+class1	0.974964	-2.47797
+nover:: 0	0	5	8	9
+class2	0.605759	1.39155
+class2	0.570846	1.07285
+class2	1.0747	6.4437
+class2	0.432437	7.15082
+class2	0.964599	0.919849
+class2	0.0567745	1.5843
+class2	1.12342	3.34031
+class2	1.46228	1.52947
+nover:: 0	0	5	8	9
+class3	-0.0148797	1.10033
+class3	-0.0463603	1.00368
+class3	-0.829335	1.51989
+class3	-1.61237	3.04424
+class3	-1.18048	3.94025
+class3	-0.513542	4.41186
+class3	-0.405191	1.36885
+class3	-0.451419	0.949531
+class3	-0.303203	2.20505
+nover:: 0	0	5	8	9
+14
+nover:: 6	19	13	13	13
+nover:: 10	19	13	13	13
+nover:: 12	19	13	13	13
+nover:: 13	19	13	13	13
+13
+nover:: 0	0	5	8	9
+class1	0.787986	-0.780263
+class1	1.35172	-0.659421
+class1	-0.382953	-0.576447
+class1	-0.567763	-0.960247
+class1	0.974964	-0.738983
+nover:: 0	0	5	8	9
+class2	0.605759	0.895709
+class2	0.570846	0.976834
+class2	1.0747	0.537388
+class2	0.432437	0.519057
+class2	0.964599	1.02824
+class2	0.0567745	0.857803
+class2	1.12342	0.668967
+class2	1.46228	0.867933
+nover:: 0	0	5	8	9
+class3	-0.0148797	0.968632
+class3	-0.0463603	0.998777
+class3	-0.829335	0.869753
+class3	-1.61237	0.689986
+class3	-1.18048	0.633129
+class3	-0.513542	0.609714
+class3	-0.405191	0.900632
+class3	-0.451419	1.01741
+class3	-0.303203	0.768294
+nover:: 0	0	5	8	9
+13
+nover:: 3	19	13	13	13
+nover:: 6	19	13	13	13
+nover:: 13	19	13	13	13
+nover:: 17	19	13	13	13
+17
+nover:: 0	0	5	8	9
+class1	0.787986	-0.486412
+class1	1.35172	-3.03942
+class1	-0.382953	-3.02089
+class1	-0.567763	-0.175412
+class1	0.974964	-1.99974
+nover:: 6	0	5	8	9
+class2	0.605759	1.09146
+class2	0.570846	0.722703
+class2	1.0747	1.73506
+class2	0.432437	2.91975
+class2	0.964599	0.60349
+class2	0.0567745	1.509
+class2	1.12342	3.39595
+class2	1.46228	1.37618
+nover:: 6	0	5	8	9
+class3	-0.0148797	-0.89113
+class3	-0.0463603	-0.979081
+class3	-0.829335	-0.372643
+class3	-1.61237	-3.28936
+class3	-1.18048	-3.68519
+class3	-0.513542	-0.911718
+class3	-0.405191	-1.11635
+class3	-0.451419	-0.862576
+class3	-0.303203	-1.12427
+nover:: 6	0	5	8	9
+23
+nover:: 2	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 12	19	13	13	13
+nover:: 16	19	13	13	13
+16
+nover:: 0	0	5	8	9
+class1	0.787986	-2.05587
+class1	1.35172	-0.329011
+class1	-0.382953	-0.331029
+class1	-0.567763	-5.70087
+class1	0.974964	-0.500064
+nover:: 3	0	5	8	9
+class2	0.605759	0.916201
+class2	0.570846	1.38369
+class2	1.0747	0.57635
+class2	0.432437	0.342495
+class2	0.964599	1.65703
+class2	0.0567745	0.662693
+class2	1.12342	0.294469
+class2	1.46228	0.726648
+nover:: 3	0	5	8	9
+class3	-0.0148797	-1.12217
+class3	-0.0463603	-1.02137
+class3	-0.829335	-2.68353
+class3	-1.61237	-0.304011
+class3	-1.18048	-0.271356
+class3	-0.513542	-1.09683
+class3	-0.405191	-0.895775
+class3	-0.451419	-1.15932
+class3	-0.303203	-0.889468
+nover:: 3	0	5	8	9
+19
+nover:: 7	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 20	19	13	13	13
+nover:: 24	19	13	13	13
+24
+nover:: 0	0	5	8	9
+class1	0.787986	0.706095
+class1	1.35172	1.15826
+class1	-0.382953	0.770195
+class1	-0.567763	1.04073
+class1	0.974964	1.21056
+nover:: 0	0	5	8	9
+class2	0.605759	0.0657279
+class2	0.570846	-0.258469
+class2	1.0747	0.114074
+class2	0.432437	0.268467
+class2	0.964599	-0.431059
+class2	0.0567745	0.321275
+class2	1.12342	0.717283
+class2	1.46228	0.245956
+nover:: 0	0	5	8	9
+class3	-0.0148797	-1.71869
+class3	-0.0463603	-1.97183
+class3	-0.829335	-0.90312
+class3	-1.61237	-1.40901
+class3	-1.18048	-1.18906
+class3	-0.513542	-0.433313
+class3	-0.405191	-1.54608
+class3	-0.451419	-1.96157
+class3	-0.303203	-0.963367
+nover:: 0	0	5	8	9
+24
+nover:: 8	19	13	13	13
+nover:: 14	19	13	13	13
+nover:: 17	19	13	13	13
+nover:: 24	19	13	13	13
+24
+nover:: 0	0	5	8	9
+class1	0.787986	0.229011
+class1	1.35172	0.76531
+class1	-0.382953	0.546891
+class1	-0.567763	0.154689
+class1	0.974964	0.722221
+nover:: 3	0	5	8	9
+class2	0.605759	0.706366
+class2	0.570846	0.623828
+class2	1.0747	0.266022
+class2	0.432437	0.39706
+class2	0.964599	0.610012
+class2	0.0567745	0.814849
+class2	1.12342	0.850353
+class2	1.46228	0.783187
+nover:: 5	0	5	8	9
+class3	-0.0148797	-0.724201
+class3	-0.0463603	-0.827979
+class3	-0.829335	-0.242729
+class3	-1.61237	-0.882203
+class3	-1.18048	-0.804758
+class3	-0.513542	-0.205184
+class3	-0.405191	-0.728094
+class3	-0.451419	-0.788535
+class3	-0.303203	-0.488056
+nover:: 5	0	5	8	9
+29
+nover:: 9	19	13	13	13
+nover:: 14	19	13	13	13
+nover:: 20	19	13	13	13
+nover:: 26	19	13	13	13
+26
+nover:: 0	0	5	8	9
+class1	0.787986	4.32785
+class1	1.35172	1.14742
+class1	-0.382953	1.72817
+class1	-0.567763	6.43861
+class1	0.974964	1.23914
+nover:: 3	0	5	8	9
+class2	0.605759	1.27494
+class2	0.570846	1.48449
+class2	1.0747	3.71383
+class2	0.432437	2.44911
+class2	0.964599	1.52422
+class2	0.0567745	1.0499
+class2	1.12342	0.983616
+class2	1.46228	1.11139
+nover:: 5	0	5	8	9
+class3	-0.0148797	-1.23476
+class3	-0.0463603	-1.02512
+class3	-0.829335	-4.07867
+class3	-1.61237	-0.92548
+class3	-1.18048	-1.06921
+class3	-0.513542	-4.83906
+class3	-0.405191	-1.22619
+class3	-0.451419	-1.10081
+class3	-0.303203	-1.96132
+nover:: 5	0	5	8	9
+31
+nover:: 9	19	13	13	13
+nover:: 14	19	13	13	13
+nover:: 17	19	13	13	13
+nover:: 24	19	13	13	13
+24
+nover:: 0	0	5	8	9
+class1	0.787986	0.613634
+class1	1.35172	0.955197
+class1	-0.382953	0.833306
+class1	-0.567763	0.53753
+class1	0.974964	0.931021
+nover:: 3	0	5	8	9
+class2	0.605759	0.922225
+class2	0.570846	0.876612
+class2	1.0747	0.645743
+class2	0.432437	0.741874
+class2	0.964599	0.86893
+class2	0.0567745	0.983898
+class2	1.12342	1.00552
+class2	1.46228	0.965409
+nover:: 5	0	5	8	9
+class3	-0.0148797	-0.932122
+class3	-0.0463603	-0.991764
+class3	-0.829335	-0.625884
+class3	-1.61237	-1.02615
+class3	-1.18048	-0.97794
+class3	-0.513542	-0.591216
+class3	-0.405191	-0.934289
+class3	-0.451419	-0.968492
+class3	-0.303203	-0.798885
+nover:: 5	0	5	8	9
+29
+nover:: 4	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 10	19	13	13	13
+nover:: 15	19	13	13	13
+15
+nover:: 0	0	5	8	9
+class1	0.787986	-0.109762
+class1	1.35172	-0.249901
+class1	-0.382953	-0.110839
+class1	-0.567763	-0.137517
+class1	0.974964	-0.325673
+nover:: 0	0	5	8	9
+class2	0.605759	0.563653
+class2	0.570846	0.627892
+class2	1.0747	0.0417872
+class2	0.432437	0.0570999
+class2	0.964599	0.713242
+class2	0.0567745	0.601193
+class2	1.12342	0.30436
+class2	1.46228	0.588291
+nover:: 0	0	5	8	9
+class3	-0.0148797	-0.73603
+class3	-0.0463603	-0.97192
+class3	-0.829335	-0.161313
+class3	-1.61237	-0.35494
+class3	-1.18048	-0.237363
+class3	-0.513542	-0.04684
+class3	-0.405191	-0.595781
+class3	-0.451419	-0.956706
+class3	-0.303203	-0.231225
+nover:: 1	0	5	8	9
+16
+nover:: 1	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 10	19	13	13	13
+10
+nover:: 0	0	5	8	9
+class1	0.787986	-0.243971
+class1	1.35172	0.584782
+class1	-0.382953	0.387098
+class1	-0.567763	-0.730107
+class1	0.974964	0.403453
+nover:: 5	0	5	8	9
+class2	0.605759	1.50298
+class2	0.570846	1.60573
+class2	1.0747	0.424454
+class2	0.432437	0.548155
+class2	0.964599	1.74321
+class2	0.0567745	1.58366
+class2	1.12342	1.31603
+class2	1.46228	1.55359
+nover:: 5	0	5	8	9
+class3	-0.0148797	0.0989428
+class3	-0.0463603	0.0208427
+class3	-0.829335	0.412765
+class3	-1.61237	-0.752031
+class3	-1.18048	-0.681478
+class3	-0.513542	0.0200101
+class3	-0.405191	-0.0849994
+class3	-0.451419	0.144729
+class3	-0.303203	-0.056356
+nover:: 5	0	5	8	9
+15
+nover:: 2	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 12	19	13	13	13
+12
+nover:: 0	0	5	8	9
+class1	0.833213	1.07718
+class1	0.827562	0.24278
+class1	0.940082	0.552984
+class1	0.154257	0.884364
+class1	0.426781	0.0233282
+nover:: 0	0	5	8	9
+class2	1.5618	0.0588245
+class2	-0.311279	-1.91701
+class2	-0.120935	-0.54539
+class2	-0.107943	-0.656098
+class2	0.989268	-0.753942
+class2	0.997644	-0.586019
+class2	0.606399	-0.709631
+class2	1.17405	-0.379547
+nover:: 0	0	5	8	9
+class3	-0.222716	-0.321659
+class3	-0.743846	-0.764689
+class3	-0.114994	-0.527758
+class3	-1.84324	-1.09121
+class3	-1.55247	-0.870992
+class3	-0.139442	-0.159452
+class3	-0.117519	-0.0325197
+class3	-0.961794	-1.10652
+class3	0.260383	0.316739
+nover:: 0	0	5	8	9
+12
+nover:: 8	19	13	13	13
+nover:: 14	19	13	13	13
+nover:: 18	19	13	13	13
+nover:: 28	19	13	13	13
+28
+nover:: 0	0	5	8	9
+class1	0.833213	0.706095
+class1	0.827562	1.15826
+class1	0.940082	0.770195
+class1	0.154257	1.04073
+class1	0.426781	1.21056
+nover:: 0	0	5	8	9
+class2	1.5618	0.0657279
+class2	-0.311279	0.258469
+class2	-0.120935	0.114074
+class2	-0.107943	0.268467
+class2	0.989268	0.431059
+class2	0.997644	0.321275
+class2	0.606399	0.717283
+class2	1.17405	0.245956
+nover:: 0	0	5	8	9
+class3	-0.222716	1.71869
+class3	-0.743846	1.97183
+class3	-0.114994	0.90312
+class3	-1.84324	1.40901
+class3	-1.55247	1.18906
+class3	-0.139442	0.433313
+class3	-0.117519	1.54608
+class3	-0.961794	1.96157
+class3	0.260383	0.963367
+nover:: 1	0	5	8	9
+29
+nover:: 4	19	13	13	13
+nover:: 6	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 9	19	13	13	13
+9
+nover:: 0	0	5	8	9
+class1	0.833213	0.621864
+class1	0.827562	0.750707
+class1	0.940082	0.82568
+class1	0.154257	0.412541
+class1	0.426781	0.667941
+nover:: 0	0	5	8	9
+class2	1.5618	2.05161
+class2	-0.311279	2.53984
+class2	-0.120935	1.16788
+class2	-0.107943	1.15009
+class2	0.989268	2.96576
+class2	0.997644	1.87985
+class2	0.606399	1.34901
+class2	1.17405	1.92287
+nover:: 4	0	5	8	9
+class3	-0.222716	2.48139
+class3	-0.743846	2.70834
+class3	-0.114994	1.93082
+class3	-1.84324	1.38887
+class3	-1.55247	1.2889
+class3	-0.139442	1.25441
+class3	-0.117519	2.0762
+class3	-0.961794	2.86667
+class3	0.260383	1.57382
+nover:: 4	0	5	8	9
+13
+nover:: 5	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 9	19	13	13	13
+nover:: 10	19	13	13	13
+10
+nover:: 0	0	5	8	9
+class1	0.833213	-2.10512
+class1	0.827562	-3.48748
+class1	0.940082	-5.22061
+class1	0.154257	-1.12941
+class1	0.426781	-2.47797
+nover:: 0	0	5	8	9
+class2	1.5618	1.39155
+class2	-0.311279	1.07285
+class2	-0.120935	6.4437
+class2	-0.107943	7.15082
+class2	0.989268	0.919849
+class2	0.997644	1.5843
+class2	0.606399	3.34031
+class2	1.17405	1.52947
+nover:: 5	0	5	8	9
+class3	-0.222716	1.10033
+class3	-0.743846	1.00368
+class3	-0.114994	1.51989
+class3	-1.84324	3.04424
+class3	-1.55247	3.94025
+class3	-0.139442	4.41186
+class3	-0.117519	1.36885
+class3	-0.961794	0.949531
+class3	0.260383	2.20505
+nover:: 5	0	5	8	9
+15
+nover:: 5	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 9	19	13	13	13
+nover:: 10	19	13	13	13
+10
+nover:: 0	0	5	8	9
+class1	0.833213	-0.780263
+class1	0.827562	-0.659421
+class1	0.940082	-0.576447
+class1	0.154257	-0.960247
+class1	0.426781	-0.738983
+nover:: 0	0	5	8	9
+class2	1.5618	0.895709
+class2	-0.311279	0.976834
+class2	-0.120935	0.537388
+class2	-0.107943	0.519057
+class2	0.989268	1.02824
+class2	0.997644	0.857803
+class2	0.606399	0.668967
+class2	1.17405	0.867933
+nover:: 5	0	5	8	9
+class3	-0.222716	0.968632
+class3	-0.743846	0.998777
+class3	-0.114994	0.869753
+class3	-1.84324	0.689986
+class3	-1.55247	0.633129
+class3	-0.139442	0.609714
+class3	-0.117519	0.900632
+class3	-0.961794	1.01741
+class3	0.260383	0.768294
+nover:: 5	0	5	8	9
+15
+nover:: 2	19	13	13	13
+nover:: 5	19	13	13	13
+nover:: 6	19	13	13	13
+nover:: 10	19	13	13	13
+10
+nover:: 0	0	5	8	9
+class1	0.833213	-0.486412
+class1	0.827562	-3.03942
+class1	0.940082	-3.02089
+class1	0.154257	-0.175412
+class1	0.426781	-1.99974
+nover:: 0	0	5	8	9
+class2	1.5618	1.09146
+class2	-0.311279	0.722703
+class2	-0.120935	1.73506
+class2	-0.107943	2.91975
+class2	0.989268	0.60349
+class2	0.997644	1.509
+class2	0.606399	3.39595
+class2	1.17405	1.37618
+nover:: 0	0	5	8	9
+class3	-0.222716	-0.89113
+class3	-0.743846	-0.979081
+class3	-0.114994	-0.372643
+class3	-1.84324	-3.28936
+class3	-1.55247	-3.68519
+class3	-0.139442	-0.911718
+class3	-0.117519	-1.11635
+class3	-0.961794	-0.862576
+class3	0.260383	-1.12427
+nover:: 0	0	5	8	9
+10
+13	7	10	7	0.00556223
+nover:: 1	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 5	19	13	13	13
+nover:: 8	19	13	13	13
+8
+nover:: 0	0	5	8	9
+class1	0.833213	-2.05587
+class1	0.827562	-0.329011
+class1	0.940082	-0.331029
+class1	0.154257	-5.70087
+class1	0.426781	-0.500064
+nover:: 0	0	5	8	9
+class2	1.5618	0.916201
+class2	-0.311279	1.38369
+class2	-0.120935	0.57635
+class2	-0.107943	0.342495
+class2	0.989268	1.65703
+class2	0.997644	0.662693
+class2	0.606399	0.294469
+class2	1.17405	0.726648
+nover:: 0	0	5	8	9
+class3	-0.222716	-1.12217
+class3	-0.743846	-1.02137
+class3	-0.114994	-2.68353
+class3	-1.84324	-0.304011
+class3	-1.55247	-0.271356
+class3	-0.139442	-1.09683
+class3	-0.117519	-0.895775
+class3	-0.961794	-1.15932
+class3	0.260383	-0.889468
+nover:: 0	0	5	8	9
+8
+13	6	8	6	0.00640694
+nover:: 5	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 11	19	13	13	13
+nover:: 14	19	13	13	13
+14
+nover:: 0	0	5	8	9
+class1	0.833213	0.706095
+class1	0.827562	1.15826
+class1	0.940082	0.770195
+class1	0.154257	1.04073
+class1	0.426781	1.21056
+nover:: 0	0	5	8	9
+class2	1.5618	0.0657279
+class2	-0.311279	-0.258469
+class2	-0.120935	0.114074
+class2	-0.107943	0.268467
+class2	0.989268	-0.431059
+class2	0.997644	0.321275
+class2	0.606399	0.717283
+class2	1.17405	0.245956
+nover:: 0	0	5	8	9
+class3	-0.222716	-1.71869
+class3	-0.743846	-1.97183
+class3	-0.114994	-0.90312
+class3	-1.84324	-1.40901
+class3	-1.55247	-1.18906
+class3	-0.139442	-0.433313
+class3	-0.117519	-1.54608
+class3	-0.961794	-1.96157
+class3	0.260383	-0.963367
+nover:: 0	0	5	8	9
+14
+nover:: 8	19	13	13	13
+nover:: 17	19	13	13	13
+nover:: 22	19	13	13	13
+nover:: 32	19	13	13	13
+32
+nover:: 0	0	5	8	9
+class1	0.833213	0.229011
+class1	0.827562	0.76531
+class1	0.940082	0.546891
+class1	0.154257	0.154689
+class1	0.426781	0.722221
+nover:: 0	0	5	8	9
+class2	1.5618	0.706366
+class2	-0.311279	0.623828
+class2	-0.120935	0.266022
+class2	-0.107943	0.39706
+class2	0.989268	0.610012
+class2	0.997644	0.814849
+class2	0.606399	0.850353
+class2	1.17405	0.783187
+nover:: 3	0	5	8	9
+class3	-0.222716	-0.724201
+class3	-0.743846	-0.827979
+class3	-0.114994	-0.242729
+class3	-1.84324	-0.882203
+class3	-1.55247	-0.804758
+class3	-0.139442	-0.205184
+class3	-0.117519	-0.728094
+class3	-0.961794	-0.788535
+class3	0.260383	-0.488056
+nover:: 3	0	5	8	9
+35
+nover:: 9	19	13	13	13
+nover:: 19	19	13	13	13
+nover:: 24	19	13	13	13
+nover:: 34	19	13	13	13
+34
+nover:: 0	0	5	8	9
+class1	0.833213	4.32785
+class1	0.827562	1.14742
+class1	0.940082	1.72817
+class1	0.154257	6.43861
+class1	0.426781	1.23914
+nover:: 0	0	5	8	9
+class2	1.5618	1.27494
+class2	-0.311279	1.48449
+class2	-0.120935	3.71383
+class2	-0.107943	2.44911
+class2	0.989268	1.52422
+class2	0.997644	1.0499
+class2	0.606399	0.983616
+class2	1.17405	1.11139
+nover:: 3	0	5	8	9
+class3	-0.222716	-1.23476
+class3	-0.743846	-1.02512
+class3	-0.114994	-4.07867
+class3	-1.84324	-0.92548
+class3	-1.55247	-1.06921
+class3	-0.139442	-4.83906
+class3	-0.117519	-1.22619
+class3	-0.961794	-1.10081
+class3	0.260383	-1.96132
+nover:: 3	0	5	8	9
+37
+nover:: 8	19	13	13	13
+nover:: 17	19	13	13	13
+nover:: 22	19	13	13	13
+nover:: 32	19	13	13	13
+32
+nover:: 0	0	5	8	9
+class1	0.833213	0.613634
+class1	0.827562	0.955197
+class1	0.940082	0.833306
+class1	0.154257	0.53753
+class1	0.426781	0.931021
+nover:: 0	0	5	8	9
+class2	1.5618	0.922225
+class2	-0.311279	0.876612
+class2	-0.120935	0.645743
+class2	-0.107943	0.741874
+class2	0.989268	0.86893
+class2	0.997644	0.983898
+class2	0.606399	1.00552
+class2	1.17405	0.965409
+nover:: 3	0	5	8	9
+class3	-0.222716	-0.932122
+class3	-0.743846	-0.991764
+class3	-0.114994	-0.625884
+class3	-1.84324	-1.02615
+class3	-1.55247	-0.97794
+class3	-0.139442	-0.591216
+class3	-0.117519	-0.934289
+class3	-0.961794	-0.968492
+class3	0.260383	-0.798885
+nover:: 3	0	5	8	9
+35
+nover:: 3	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 5	19	13	13	13
+nover:: 9	19	13	13	13
+9
+nover:: 0	0	5	8	9
+class1	0.833213	-0.109762
+class1	0.827562	-0.249901
+class1	0.940082	-0.110839
+class1	0.154257	-0.137517
+class1	0.426781	-0.325673
+nover:: 0	0	5	8	9
+class2	1.5618	0.563653
+class2	-0.311279	0.627892
+class2	-0.120935	0.0417872
+class2	-0.107943	0.0570999
+class2	0.989268	0.713242
+class2	0.997644	0.601193
+class2	0.606399	0.30436
+class2	1.17405	0.588291
+nover:: 0	0	5	8	9
+class3	-0.222716	-0.73603
+class3	-0.743846	-0.97192
+class3	-0.114994	-0.161313
+class3	-1.84324	-0.35494
+class3	-1.55247	-0.237363
+class3	-0.139442	-0.04684
+class3	-0.117519	-0.595781
+class3	-0.961794	-0.956706
+class3	0.260383	-0.231225
+nover:: 0	0	5	8	9
+9
+13	1	9	7	0.00728175
+nover:: 2	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 12	19	13	13	13
+12
+nover:: 0	0	5	8	9
+class1	0.833213	-0.243971
+class1	0.827562	0.584782
+class1	0.940082	0.387098
+class1	0.154257	-0.730107
+class1	0.426781	0.403453
+nover:: 0	0	5	8	9
+class2	1.5618	1.50298
+class2	-0.311279	1.60573
+class2	-0.120935	0.424454
+class2	-0.107943	0.548155
+class2	0.989268	1.74321
+class2	0.997644	1.58366
+class2	0.606399	1.31603
+class2	1.17405	1.55359
+nover:: 0	0	5	8	9
+class3	-0.222716	0.0989428
+class3	-0.743846	0.0208427
+class3	-0.114994	0.412765
+class3	-1.84324	-0.752031
+class3	-1.55247	-0.681478
+class3	-0.139442	0.0200101
+class3	-0.117519	-0.0849994
+class3	-0.961794	0.144729
+class3	0.260383	-0.056356
+nover:: 0	0	5	8	9
+12
+nover:: 7	19	13	13	13
+nover:: 13	19	13	13	13
+nover:: 27	19	13	13	13
+nover:: 39	19	13	13	13
+39
+nover:: 0	0	5	8	9
+class1	1.07718	0.706095
+class1	0.24278	1.15826
+class1	0.552984	0.770195
+class1	0.884364	1.04073
+class1	0.0233282	1.21056
+nover:: 0	0	5	8	9
+class2	0.0588245	0.0657279
+class2	-1.91701	0.258469
+class2	-0.54539	0.114074
+class2	-0.656098	0.268467
+class2	-0.753942	0.431059
+class2	-0.586019	0.321275
+class2	-0.709631	0.717283
+class2	-0.379547	0.245956
+nover:: 0	0	5	8	9
+class3	-0.321659	1.71869
+class3	-0.764689	1.97183
+class3	-0.527758	0.90312
+class3	-1.09121	1.40901
+class3	-0.870992	1.18906
+class3	-0.159452	0.433313
+class3	-0.0325197	1.54608
+class3	-1.10652	1.96157
+class3	0.316739	0.963367
+nover:: 1	0	5	8	9
+40
+nover:: 11	19	13	13	13
+nover:: 22	19	13	13	13
+nover:: 28	19	13	13	13
+nover:: 32	19	13	13	13
+32
+nover:: 0	0	5	8	9
+class1	1.07718	0.621864
+class1	0.24278	0.750707
+class1	0.552984	0.82568
+class1	0.884364	0.412541
+class1	0.0233282	0.667941
+nover:: 0	0	5	8	9
+class2	0.0588245	2.05161
+class2	-1.91701	2.53984
+class2	-0.54539	1.16788
+class2	-0.656098	1.15009
+class2	-0.753942	2.96576
+class2	-0.586019	1.87985
+class2	-0.709631	1.34901
+class2	-0.379547	1.92287
+nover:: 3	0	5	8	9
+class3	-0.321659	2.48139
+class3	-0.764689	2.70834
+class3	-0.527758	1.93082
+class3	-1.09121	1.38887
+class3	-0.870992	1.2889
+class3	-0.159452	1.25441
+class3	-0.0325197	2.0762
+class3	-1.10652	2.86667
+class3	0.316739	1.57382
+nover:: 6	0	5	8	9
+38
+nover:: 11	19	13	13	13
+nover:: 22	19	13	13	13
+nover:: 29	19	13	13	13
+nover:: 34	19	13	13	13
+34
+nover:: 0	0	5	8	9
+class1	1.07718	-2.10512
+class1	0.24278	-3.48748
+class1	0.552984	-5.22061
+class1	0.884364	-1.12941
+class1	0.0233282	-2.47797
+nover:: 0	0	5	8	9
+class2	0.0588245	1.39155
+class2	-1.91701	1.07285
+class2	-0.54539	6.4437
+class2	-0.656098	7.15082
+class2	-0.753942	0.919849
+class2	-0.586019	1.5843
+class2	-0.709631	3.34031
+class2	-0.379547	1.52947
+nover:: 5	0	5	8	9
+class3	-0.321659	1.10033
+class3	-0.764689	1.00368
+class3	-0.527758	1.51989
+class3	-1.09121	3.04424
+class3	-0.870992	3.94025
+class3	-0.159452	4.41186
+class3	-0.0325197	1.36885
+class3	-1.10652	0.949531
+class3	0.316739	2.20505
+nover:: 8	0	5	8	9
+42
+nover:: 11	19	13	13	13
+nover:: 22	19	13	13	13
+nover:: 28	19	13	13	13
+nover:: 32	19	13	13	13
+32
+nover:: 0	0	5	8	9
+class1	1.07718	-0.780263
+class1	0.24278	-0.659421
+class1	0.552984	-0.576447
+class1	0.884364	-0.960247
+class1	0.0233282	-0.738983
+nover:: 0	0	5	8	9
+class2	0.0588245	0.895709
+class2	-1.91701	0.976834
+class2	-0.54539	0.537388
+class2	-0.656098	0.519057
+class2	-0.753942	1.02824
+class2	-0.586019	0.857803
+class2	-0.709631	0.668967
+class2	-0.379547	0.867933
+nover:: 5	0	5	8	9
+class3	-0.321659	0.968632
+class3	-0.764689	0.998777
+class3	-0.527758	0.869753
+class3	-1.09121	0.689986
+class3	-0.870992	0.633129
+class3	-0.159452	0.609714
+class3	-0.0325197	0.900632
+class3	-1.10652	1.01741
+class3	0.316739	0.768294
+nover:: 8	0	5	8	9
+40
+nover:: 5	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 16	19	13	13	13
+nover:: 20	19	13	13	13
+20
+nover:: 0	0	5	8	9
+class1	1.07718	-0.486412
+class1	0.24278	-3.03942
+class1	0.552984	-3.02089
+class1	0.884364	-0.175412
+class1	0.0233282	-1.99974
+nover:: 0	0	5	8	9
+class2	0.0588245	1.09146
+class2	-1.91701	0.722703
+class2	-0.54539	1.73506
+class2	-0.656098	2.91975
+class2	-0.753942	0.60349
+class2	-0.586019	1.509
+class2	-0.709631	3.39595
+class2	-0.379547	1.37618
+nover:: 0	0	5	8	9
+class3	-0.321659	-0.89113
+class3	-0.764689	-0.979081
+class3	-0.527758	-0.372643
+class3	-1.09121	-3.28936
+class3	-0.870992	-3.68519
+class3	-0.159452	-0.911718
+class3	-0.0325197	-1.11635
+class3	-1.10652	-0.862576
+class3	0.316739	-1.12427
+nover:: 0	0	5	8	9
+20
+nover:: 4	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 16	19	13	13	13
+nover:: 20	19	13	13	13
+20
+nover:: 0	0	5	8	9
+class1	1.07718	-2.05587
+class1	0.24278	-0.329011
+class1	0.552984	-0.331029
+class1	0.884364	-5.70087
+class1	0.0233282	-0.500064
+nover:: 1	0	5	8	9
+class2	0.0588245	0.916201
+class2	-1.91701	1.38369
+class2	-0.54539	0.57635
+class2	-0.656098	0.342495
+class2	-0.753942	1.65703
+class2	-0.586019	0.662693
+class2	-0.709631	0.294469
+class2	-0.379547	0.726648
+nover:: 1	0	5	8	9
+class3	-0.321659	-1.12217
+class3	-0.764689	-1.02137
+class3	-0.527758	-2.68353
+class3	-1.09121	-0.304011
+class3	-0.870992	-0.271356
+class3	-0.159452	-1.09683
+class3	-0.0325197	-0.895775
+class3	-1.10652	-1.15932
+class3	0.316739	-0.889468
+nover:: 1	0	5	8	9
+21
+nover:: 6	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 17	19	13	13	13
+nover:: 21	19	13	13	13
+21
+nover:: 0	0	5	8	9
+class1	1.07718	0.706095
+class1	0.24278	1.15826
+class1	0.552984	0.770195
+class1	0.884364	1.04073
+class1	0.0233282	1.21056
+nover:: 0	0	5	8	9
+class2	0.0588245	0.0657279
+class2	-1.91701	-0.258469
+class2	-0.54539	0.114074
+class2	-0.656098	0.268467
+class2	-0.753942	-0.431059
+class2	-0.586019	0.321275
+class2	-0.709631	0.717283
+class2	-0.379547	0.245956
+nover:: 0	0	5	8	9
+class3	-0.321659	-1.71869
+class3	-0.764689	-1.97183
+class3	-0.527758	-0.90312
+class3	-1.09121	-1.40901
+class3	-0.870992	-1.18906
+class3	-0.159452	-0.433313
+class3	-0.0325197	-1.54608
+class3	-1.10652	-1.96157
+class3	0.316739	-0.963367
+nover:: 0	0	5	8	9
+21
+nover:: 3	19	13	13	13
+nover:: 6	19	13	13	13
+nover:: 10	19	13	13	13
+nover:: 18	19	13	13	13
+18
+nover:: 0	0	5	8	9
+class1	1.07718	0.229011
+class1	0.24278	0.76531
+class1	0.552984	0.546891
+class1	0.884364	0.154689
+class1	0.0233282	0.722221
+nover:: 1	0	5	8	9
+class2	0.0588245	0.706366
+class2	-1.91701	0.623828
+class2	-0.54539	0.266022
+class2	-0.656098	0.39706
+class2	-0.753942	0.610012
+class2	-0.586019	0.814849
+class2	-0.709631	0.850353
+class2	-0.379547	0.783187
+nover:: 1	0	5	8	9
+class3	-0.321659	-0.724201
+class3	-0.764689	-0.827979
+class3	-0.527758	-0.242729
+class3	-1.09121	-0.882203
+class3	-0.870992	-0.804758
+class3	-0.159452	-0.205184
+class3	-0.0325197	-0.728094
+class3	-1.10652	-0.788535
+class3	0.316739	-0.488056
+nover:: 1	0	5	8	9
+19
+nover:: 3	19	13	13	13
+nover:: 6	19	13	13	13
+nover:: 10	19	13	13	13
+nover:: 18	19	13	13	13
+18
+nover:: 0	0	5	8	9
+class1	1.07718	4.32785
+class1	0.24278	1.14742
+class1	0.552984	1.72817
+class1	0.884364	6.43861
+class1	0.0233282	1.23914
+nover:: 1	0	5	8	9
+class2	0.0588245	1.27494
+class2	-1.91701	1.48449
+class2	-0.54539	3.71383
+class2	-0.656098	2.44911
+class2	-0.753942	1.52422
+class2	-0.586019	1.0499
+class2	-0.709631	0.983616
+class2	-0.379547	1.11139
+nover:: 1	0	5	8	9
+class3	-0.321659	-1.23476
+class3	-0.764689	-1.02512
+class3	-0.527758	-4.07867
+class3	-1.09121	-0.92548
+class3	-0.870992	-1.06921
+class3	-0.159452	-4.83906
+class3	-0.0325197	-1.22619
+class3	-1.10652	-1.10081
+class3	0.316739	-1.96132
+nover:: 1	0	5	8	9
+19
+nover:: 3	19	13	13	13
+nover:: 6	19	13	13	13
+nover:: 10	19	13	13	13
+nover:: 18	19	13	13	13
+18
+nover:: 0	0	5	8	9
+class1	1.07718	0.613634
+class1	0.24278	0.955197
+class1	0.552984	0.833306
+class1	0.884364	0.53753
+class1	0.0233282	0.931021
+nover:: 1	0	5	8	9
+class2	0.0588245	0.922225
+class2	-1.91701	0.876612
+class2	-0.54539	0.645743
+class2	-0.656098	0.741874
+class2	-0.753942	0.86893
+class2	-0.586019	0.983898
+class2	-0.709631	1.00552
+class2	-0.379547	0.965409
+nover:: 1	0	5	8	9
+class3	-0.321659	-0.932122
+class3	-0.764689	-0.991764
+class3	-0.527758	-0.625884
+class3	-1.09121	-1.02615
+class3	-0.870992	-0.97794
+class3	-0.159452	-0.591216
+class3	-0.0325197	-0.934289
+class3	-1.10652	-0.968492
+class3	0.316739	-0.798885
+nover:: 1	0	5	8	9
+19
+nover:: 2	19	13	13	13
+nover:: 5	19	13	13	13
+nover:: 12	19	13	13	13
+nover:: 15	19	13	13	13
+15
+nover:: 0	0	5	8	9
+class1	1.07718	-0.109762
+class1	0.24278	-0.249901
+class1	0.552984	-0.110839
+class1	0.884364	-0.137517
+class1	0.0233282	-0.325673
+nover:: 1	0	5	8	9
+class2	0.0588245	0.563653
+class2	-1.91701	0.627892
+class2	-0.54539	0.0417872
+class2	-0.656098	0.0570999
+class2	-0.753942	0.713242
+class2	-0.586019	0.601193
+class2	-0.709631	0.30436
+class2	-0.379547	0.588291
+nover:: 1	0	5	8	9
+class3	-0.321659	-0.73603
+class3	-0.764689	-0.97192
+class3	-0.527758	-0.161313
+class3	-1.09121	-0.35494
+class3	-0.870992	-0.237363
+class3	-0.159452	-0.04684
+class3	-0.0325197	-0.595781
+class3	-1.10652	-0.956706
+class3	0.316739	-0.231225
+nover:: 3	0	5	8	9
+18
+nover:: 2	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 12	19	13	13	13
+12
+nover:: 0	0	5	8	9
+class1	1.07718	-0.243971
+class1	0.24278	0.584782
+class1	0.552984	0.387098
+class1	0.884364	-0.730107
+class1	0.0233282	0.403453
+nover:: 0	0	5	8	9
+class2	0.0588245	1.50298
+class2	-1.91701	1.60573
+class2	-0.54539	0.424454
+class2	-0.656098	0.548155
+class2	-0.753942	1.74321
+class2	-0.586019	1.58366
+class2	-0.709631	1.31603
+class2	-0.379547	1.55359
+nover:: 0	0	5	8	9
+class3	-0.321659	0.0989428
+class3	-0.764689	0.0208427
+class3	-0.527758	0.412765
+class3	-1.09121	-0.752031
+class3	-0.870992	-0.681478
+class3	-0.159452	0.0200101
+class3	-0.0325197	-0.0849994
+class3	-1.10652	0.144729
+class3	0.316739	-0.056356
+nover:: 0	0	5	8	9
+12
+nover:: 3	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 6	19	13	13	13
+nover:: 9	19	13	13	13
+9
+nover:: 0	0	5	8	9
+class1	0.706095	0.621864
+class1	1.15826	0.750707
+class1	0.770195	0.82568
+class1	1.04073	0.412541
+class1	1.21056	0.667941
+nover:: 0	0	5	8	9
+class2	0.0657279	2.05161
+class2	0.258469	2.53984
+class2	0.114074	1.16788
+class2	0.268467	1.15009
+class2	0.431059	2.96576
+class2	0.321275	1.87985
+class2	0.717283	1.34901
+class2	0.245956	1.92287
+nover:: 1	0	5	8	9
+class3	1.71869	2.48139
+class3	1.97183	2.70834
+class3	0.90312	1.93082
+class3	1.40901	1.38887
+class3	1.18906	1.2889
+class3	0.433313	1.25441
+class3	1.54608	2.0762
+class3	1.96157	2.86667
+class3	0.963367	1.57382
+nover:: 2	0	5	8	9
+11
+nover:: 3	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 6	19	13	13	13
+nover:: 7	19	13	13	13
+7
+nover:: 0	0	5	8	9
+class1	0.706095	-2.10512
+class1	1.15826	-3.48748
+class1	0.770195	-5.22061
+class1	1.04073	-1.12941
+class1	1.21056	-2.47797
+nover:: 0	0	5	8	9
+class2	0.0657279	1.39155
+class2	0.258469	1.07285
+class2	0.114074	6.4437
+class2	0.268467	7.15082
+class2	0.431059	0.919849
+class2	0.321275	1.5843
+class2	0.717283	3.34031
+class2	0.245956	1.52947
+nover:: 1	0	5	8	9
+class3	1.71869	1.10033
+class3	1.97183	1.00368
+class3	0.90312	1.51989
+class3	1.40901	3.04424
+class3	1.18906	3.94025
+class3	0.433313	4.41186
+class3	1.54608	1.36885
+class3	1.96157	0.949531
+class3	0.963367	2.20505
+nover:: 2	0	5	8	9
+9
+11	9	9	7	0.00757173
+nover:: 3	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 6	19	13	13	13
+nover:: 7	19	13	13	13
+7
+nover:: 0	0	5	8	9
+class1	0.706095	-0.780263
+class1	1.15826	-0.659421
+class1	0.770195	-0.576447
+class1	1.04073	-0.960247
+class1	1.21056	-0.738983
+nover:: 0	0	5	8	9
+class2	0.0657279	0.895709
+class2	0.258469	0.976834
+class2	0.114074	0.537388
+class2	0.268467	0.519057
+class2	0.431059	1.02824
+class2	0.321275	0.857803
+class2	0.717283	0.668967
+class2	0.245956	0.867933
+nover:: 1	0	5	8	9
+class3	1.71869	0.968632
+class3	1.97183	0.998777
+class3	0.90312	0.869753
+class3	1.40901	0.689986
+class3	1.18906	0.633129
+class3	0.433313	0.609714
+class3	1.54608	0.900632
+class3	1.96157	1.01741
+class3	0.963367	0.768294
+nover:: 2	0	5	8	9
+9
+11	8	9	6	0.00764478
+nover:: 10	19	13	13	13
+nover:: 16	19	13	13	13
+nover:: 29	19	13	13	13
+nover:: 38	19	13	13	13
+38
+nover:: 0	0	5	8	9
+class1	0.706095	-0.486412
+class1	1.15826	-3.03942
+class1	0.770195	-3.02089
+class1	1.04073	-0.175412
+class1	1.21056	-1.99974
+nover:: 2	0	5	8	9
+class2	0.0657279	1.09146
+class2	0.258469	0.722703
+class2	0.114074	1.73506
+class2	0.268467	2.91975
+class2	0.431059	0.60349
+class2	0.321275	1.509
+class2	0.717283	3.39595
+class2	0.245956	1.37618
+nover:: 2	0	5	8	9
+class3	1.71869	-0.89113
+class3	1.97183	-0.979081
+class3	0.90312	-0.372643
+class3	1.40901	-3.28936
+class3	1.18906	-3.68519
+class3	0.433313	-0.911718
+class3	1.54608	-1.11635
+class3	1.96157	-0.862576
+class3	0.963367	-1.12427
+nover:: 4	0	5	8	9
+42
+nover:: 10	19	13	13	13
+nover:: 17	19	13	13	13
+nover:: 31	19	13	13	13
+nover:: 40	19	13	13	13
+40
+nover:: 0	0	5	8	9
+class1	0.706095	-2.05587
+class1	1.15826	-0.329011
+class1	0.770195	-0.331029
+class1	1.04073	-5.70087
+class1	1.21056	-0.500064
+nover:: 2	0	5	8	9
+class2	0.0657279	0.916201
+class2	0.258469	1.38369
+class2	0.114074	0.57635
+class2	0.268467	0.342495
+class2	0.431059	1.65703
+class2	0.321275	0.662693
+class2	0.717283	0.294469
+class2	0.245956	0.726648
+nover:: 2	0	5	8	9
+class3	1.71869	-1.12217
+class3	1.97183	-1.02137
+class3	0.90312	-2.68353
+class3	1.40901	-0.304011
+class3	1.18906	-0.271356
+class3	0.433313	-1.09683
+class3	1.54608	-0.895775
+class3	1.96157	-1.15932
+class3	0.963367	-0.889468
+nover:: 4	0	5	8	9
+44
+nover:: 19	19	13	13	13
+nover:: 23	19	13	13	13
+nover:: 45	19	13	13	13
+nover:: 52	19	13	13	13
+52
+nover:: 0	0	5	8	9
+class1	0.706095	0.706095
+class1	1.15826	1.15826
+class1	0.770195	0.770195
+class1	1.04073	1.04073
+class1	1.21056	1.21056
+nover:: 1	0	5	8	9
+class2	0.0657279	0.0657279
+class2	0.258469	-0.258469
+class2	0.114074	0.114074
+class2	0.268467	0.268467
+class2	0.431059	-0.431059
+class2	0.321275	0.321275
+class2	0.717283	0.717283
+class2	0.245956	0.245956
+nover:: 2	0	5	8	9
+class3	1.71869	-1.71869
+class3	1.97183	-1.97183
+class3	0.90312	-0.90312
+class3	1.40901	-1.40901
+class3	1.18906	-1.18906
+class3	0.433313	-0.433313
+class3	1.54608	-1.54608
+class3	1.96157	-1.96157
+class3	0.963367	-0.963367
+nover:: 2	0	5	8	9
+54
+nover:: 2	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 9	19	13	13	13
+9
+nover:: 0	0	5	8	9
+class1	0.706095	0.229011
+class1	1.15826	0.76531
+class1	0.770195	0.546891
+class1	1.04073	0.154689
+class1	1.21056	0.722221
+nover:: 0	0	5	8	9
+class2	0.0657279	0.706366
+class2	0.258469	0.623828
+class2	0.114074	0.266022
+class2	0.268467	0.39706
+class2	0.431059	0.610012
+class2	0.321275	0.814849
+class2	0.717283	0.850353
+class2	0.245956	0.783187
+nover:: 0	0	5	8	9
+class3	1.71869	-0.724201
+class3	1.97183	-0.827979
+class3	0.90312	-0.242729
+class3	1.40901	-0.882203
+class3	1.18906	-0.804758
+class3	0.433313	-0.205184
+class3	1.54608	-0.728094
+class3	1.96157	-0.788535
+class3	0.963367	-0.488056
+nover:: 0	0	5	8	9
+9
+11	4	9	5	0.00765516
+nover:: 2	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 9	19	13	13	13
+9
+nover:: 0	0	5	8	9
+class1	0.706095	4.32785
+class1	1.15826	1.14742
+class1	0.770195	1.72817
+class1	1.04073	6.43861
+class1	1.21056	1.23914
+nover:: 0	0	5	8	9
+class2	0.0657279	1.27494
+class2	0.258469	1.48449
+class2	0.114074	3.71383
+class2	0.268467	2.44911
+class2	0.431059	1.52422
+class2	0.321275	1.0499
+class2	0.717283	0.983616
+class2	0.245956	1.11139
+nover:: 0	0	5	8	9
+class3	1.71869	-1.23476
+class3	1.97183	-1.02512
+class3	0.90312	-4.07867
+class3	1.40901	-0.92548
+class3	1.18906	-1.06921
+class3	0.433313	-4.83906
+class3	1.54608	-1.22619
+class3	1.96157	-1.10081
+class3	0.963367	-1.96132
+nover:: 0	0	5	8	9
+9
+nover:: 2	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 9	19	13	13	13
+9
+nover:: 0	0	5	8	9
+class1	0.706095	0.613634
+class1	1.15826	0.955197
+class1	0.770195	0.833306
+class1	1.04073	0.53753
+class1	1.21056	0.931021
+nover:: 0	0	5	8	9
+class2	0.0657279	0.922225
+class2	0.258469	0.876612
+class2	0.114074	0.645743
+class2	0.268467	0.741874
+class2	0.431059	0.86893
+class2	0.321275	0.983898
+class2	0.717283	1.00552
+class2	0.245956	0.965409
+nover:: 0	0	5	8	9
+class3	1.71869	-0.932122
+class3	1.97183	-0.991764
+class3	0.90312	-0.625884
+class3	1.40901	-1.02615
+class3	1.18906	-0.97794
+class3	0.433313	-0.591216
+class3	1.54608	-0.934289
+class3	1.96157	-0.968492
+class3	0.963367	-0.798885
+nover:: 0	0	5	8	9
+9
+11	2	9	5	0.00708467
+nover:: 7	19	13	13	13
+nover:: 14	19	13	13	13
+nover:: 28	19	13	13	13
+nover:: 39	19	13	13	13
+39
+nover:: 0	0	5	8	9
+class1	0.706095	-0.109762
+class1	1.15826	-0.249901
+class1	0.770195	-0.110839
+class1	1.04073	-0.137517
+class1	1.21056	-0.325673
+nover:: 1	0	5	8	9
+class2	0.0657279	0.563653
+class2	0.258469	0.627892
+class2	0.114074	0.0417872
+class2	0.268467	0.0570999
+class2	0.431059	0.713242
+class2	0.321275	0.601193
+class2	0.717283	0.30436
+class2	0.245956	0.588291
+nover:: 1	0	5	8	9
+class3	1.71869	-0.73603
+class3	1.97183	-0.97192
+class3	0.90312	-0.161313
+class3	1.40901	-0.35494
+class3	1.18906	-0.237363
+class3	0.433313	-0.04684
+class3	1.54608	-0.595781
+class3	1.96157	-0.956706
+class3	0.963367	-0.231225
+nover:: 3	0	5	8	9
+42
+nover:: 0	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 16	19	13	13	13
+16
+nover:: 0	0	5	8	9
+class1	0.706095	-0.243971
+class1	1.15826	0.584782
+class1	0.770195	0.387098
+class1	1.04073	-0.730107
+class1	1.21056	0.403453
+nover:: 2	0	5	8	9
+class2	0.0657279	1.50298
+class2	0.258469	1.60573
+class2	0.114074	0.424454
+class2	0.268467	0.548155
+class2	0.431059	1.74321
+class2	0.321275	1.58366
+class2	0.717283	1.31603
+class2	0.245956	1.55359
+nover:: 2	0	5	8	9
+class3	1.71869	0.0989428
+class3	1.97183	0.0208427
+class3	0.90312	0.412765
+class3	1.40901	-0.752031
+class3	1.18906	-0.681478
+class3	0.433313	0.0200101
+class3	1.54608	-0.0849994
+class3	1.96157	0.144729
+class3	0.963367	-0.056356
+nover:: 2	0	5	8	9
+18
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 1	19	13	13	13
+nover:: 2	19	13	13	13
+2
+nover:: 0	0	5	8	9
+class1	0.621864	-2.10512
+class1	0.750707	-3.48748
+class1	0.82568	-5.22061
+class1	0.412541	-1.12941
+class1	0.667941	-2.47797
+nover:: 0	0	5	8	9
+class2	2.05161	1.39155
+class2	2.53984	1.07285
+class2	1.16788	6.4437
+class2	1.15009	7.15082
+class2	2.96576	0.919849
+class2	1.87985	1.5843
+class2	1.34901	3.34031
+class2	1.92287	1.52947
+nover:: 0	0	5	8	9
+class3	2.48139	1.10033
+class3	2.70834	1.00368
+class3	1.93082	1.51989
+class3	1.38887	3.04424
+class3	1.2889	3.94025
+class3	1.25441	4.41186
+class3	2.0762	1.36885
+class3	2.86667	0.949531
+class3	1.57382	2.20505
+nover:: 0	0	5	8	9
+2
+10	9	2	28	0.00696602
+nover:: 8	19	13	13	13
+nover:: 16	19	13	13	13
+nover:: 18	19	13	13	13
+nover:: 19	19	13	13	13
+19
+nover:: 0	0	5	8	9
+class1	0.621864	-0.780263
+class1	0.750707	-0.659421
+class1	0.82568	-0.576447
+class1	0.412541	-0.960247
+class1	0.667941	-0.738983
+nover:: 0	0	5	8	9
+class2	2.05161	0.895709
+class2	2.53984	0.976834
+class2	1.16788	0.537388
+class2	1.15009	0.519057
+class2	2.96576	1.02824
+class2	1.87985	0.857803
+class2	1.34901	0.668967
+class2	1.92287	0.867933
+nover:: 0	0	5	8	9
+class3	2.48139	0.968632
+class3	2.70834	0.998777
+class3	1.93082	0.869753
+class3	1.38887	0.689986
+class3	1.2889	0.633129
+class3	1.25441	0.609714
+class3	2.0762	0.900632
+class3	2.86667	1.01741
+class3	1.57382	0.768294
+nover:: 0	0	5	8	9
+19
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.621864	-0.486412
+class1	0.750707	-3.03942
+class1	0.82568	-3.02089
+class1	0.412541	-0.175412
+class1	0.667941	-1.99974
+nover:: 0	0	5	8	9
+class2	2.05161	1.09146
+class2	2.53984	0.722703
+class2	1.16788	1.73506
+class2	1.15009	2.91975
+class2	2.96576	0.60349
+class2	1.87985	1.509
+class2	1.34901	3.39595
+class2	1.92287	1.37618
+nover:: 0	0	5	8	9
+class3	2.48139	-0.89113
+class3	2.70834	-0.979081
+class3	1.93082	-0.372643
+class3	1.38887	-3.28936
+class3	1.2889	-3.68519
+class3	1.25441	-0.911718
+class3	2.0762	-1.11635
+class3	2.86667	-0.862576
+class3	1.57382	-1.12427
+nover:: 0	0	5	8	9
+0
+10	7	0	0	0.00602858
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.621864	-2.05587
+class1	0.750707	-0.329011
+class1	0.82568	-0.331029
+class1	0.412541	-5.70087
+class1	0.667941	-0.500064
+nover:: 0	0	5	8	9
+class2	2.05161	0.916201
+class2	2.53984	1.38369
+class2	1.16788	0.57635
+class2	1.15009	0.342495
+class2	2.96576	1.65703
+class2	1.87985	0.662693
+class2	1.34901	0.294469
+class2	1.92287	0.726648
+nover:: 0	0	5	8	9
+class3	2.48139	-1.12217
+class3	2.70834	-1.02137
+class3	1.93082	-2.68353
+class3	1.38887	-0.304011
+class3	1.2889	-0.271356
+class3	1.25441	-1.09683
+class3	2.0762	-0.895775
+class3	2.86667	-1.15932
+class3	1.57382	-0.889468
+nover:: 0	0	5	8	9
+0
+10	6	0	0	0.00531742
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.621864	0.706095
+class1	0.750707	1.15826
+class1	0.82568	0.770195
+class1	0.412541	1.04073
+class1	0.667941	1.21056
+nover:: 0	0	5	8	9
+class2	2.05161	0.0657279
+class2	2.53984	-0.258469
+class2	1.16788	0.114074
+class2	1.15009	0.268467
+class2	2.96576	-0.431059
+class2	1.87985	0.321275
+class2	1.34901	0.717283
+class2	1.92287	0.245956
+nover:: 0	0	5	8	9
+class3	2.48139	-1.71869
+class3	2.70834	-1.97183
+class3	1.93082	-0.90312
+class3	1.38887	-1.40901
+class3	1.2889	-1.18906
+class3	1.25441	-0.433313
+class3	2.0762	-1.54608
+class3	2.86667	-1.96157
+class3	1.57382	-0.963367
+nover:: 0	0	5	8	9
+0
+10	5	0	0	0.00514348
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.621864	0.229011
+class1	0.750707	0.76531
+class1	0.82568	0.546891
+class1	0.412541	0.154689
+class1	0.667941	0.722221
+nover:: 0	0	5	8	9
+class2	2.05161	0.706366
+class2	2.53984	0.623828
+class2	1.16788	0.266022
+class2	1.15009	0.39706
+class2	2.96576	0.610012
+class2	1.87985	0.814849
+class2	1.34901	0.850353
+class2	1.92287	0.783187
+nover:: 0	0	5	8	9
+class3	2.48139	-0.724201
+class3	2.70834	-0.827979
+class3	1.93082	-0.242729
+class3	1.38887	-0.882203
+class3	1.2889	-0.804758
+class3	1.25441	-0.205184
+class3	2.0762	-0.728094
+class3	2.86667	-0.788535
+class3	1.57382	-0.488056
+nover:: 0	0	5	8	9
+0
+10	4	0	0	0.00493589
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.621864	4.32785
+class1	0.750707	1.14742
+class1	0.82568	1.72817
+class1	0.412541	6.43861
+class1	0.667941	1.23914
+nover:: 0	0	5	8	9
+class2	2.05161	1.27494
+class2	2.53984	1.48449
+class2	1.16788	3.71383
+class2	1.15009	2.44911
+class2	2.96576	1.52422
+class2	1.87985	1.0499
+class2	1.34901	0.983616
+class2	1.92287	1.11139
+nover:: 0	0	5	8	9
+class3	2.48139	-1.23476
+class3	2.70834	-1.02512
+class3	1.93082	-4.07867
+class3	1.38887	-0.92548
+class3	1.2889	-1.06921
+class3	1.25441	-4.83906
+class3	2.0762	-1.22619
+class3	2.86667	-1.10081
+class3	1.57382	-1.96132
+nover:: 0	0	5	8	9
+0
+10	3	0	0	0.00469009
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.621864	0.613634
+class1	0.750707	0.955197
+class1	0.82568	0.833306
+class1	0.412541	0.53753
+class1	0.667941	0.931021
+nover:: 0	0	5	8	9
+class2	2.05161	0.922225
+class2	2.53984	0.876612
+class2	1.16788	0.645743
+class2	1.15009	0.741874
+class2	2.96576	0.86893
+class2	1.87985	0.983898
+class2	1.34901	1.00552
+class2	1.92287	0.965409
+nover:: 0	0	5	8	9
+class3	2.48139	-0.932122
+class3	2.70834	-0.991764
+class3	1.93082	-0.625884
+class3	1.38887	-1.02615
+class3	1.2889	-0.97794
+class3	1.25441	-0.591216
+class3	2.0762	-0.934289
+class3	2.86667	-0.968492
+class3	1.57382	-0.798885
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.621864	-0.109762
+class1	0.750707	-0.249901
+class1	0.82568	-0.110839
+class1	0.412541	-0.137517
+class1	0.667941	-0.325673
+nover:: 0	0	5	8	9
+class2	2.05161	0.563653
+class2	2.53984	0.627892
+class2	1.16788	0.0417872
+class2	1.15009	0.0570999
+class2	2.96576	0.713242
+class2	1.87985	0.601193
+class2	1.34901	0.30436
+class2	1.92287	0.588291
+nover:: 0	0	5	8	9
+class3	2.48139	-0.73603
+class3	2.70834	-0.97192
+class3	1.93082	-0.161313
+class3	1.38887	-0.35494
+class3	1.2889	-0.237363
+class3	1.25441	-0.04684
+class3	2.0762	-0.595781
+class3	2.86667	-0.956706
+class3	1.57382	-0.231225
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.621864	-0.243971
+class1	0.750707	0.584782
+class1	0.82568	0.387098
+class1	0.412541	-0.730107
+class1	0.667941	0.403453
+nover:: 0	0	5	8	9
+class2	2.05161	1.50298
+class2	2.53984	1.60573
+class2	1.16788	0.424454
+class2	1.15009	0.548155
+class2	2.96576	1.74321
+class2	1.87985	1.58366
+class2	1.34901	1.31603
+class2	1.92287	1.55359
+nover:: 0	0	5	8	9
+class3	2.48139	0.0989428
+class3	2.70834	0.0208427
+class3	1.93082	0.412765
+class3	1.38887	-0.752031
+class3	1.2889	-0.681478
+class3	1.25441	0.0200101
+class3	2.0762	-0.0849994
+class3	2.86667	0.144729
+class3	1.57382	-0.056356
+nover:: 0	0	5	8	9
+0
+nover:: 2	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 5	19	13	13	13
+nover:: 6	19	13	13	13
+6
+nover:: 0	0	5	8	9
+class1	-2.10512	-0.780263
+class1	-3.48748	-0.659421
+class1	-5.22061	-0.576447
+class1	-1.12941	-0.960247
+class1	-2.47797	-0.738983
+nover:: 0	0	5	8	9
+class2	1.39155	0.895709
+class2	1.07285	0.976834
+class2	6.4437	0.537388
+class2	7.15082	0.519057
+class2	0.919849	1.02824
+class2	1.5843	0.857803
+class2	3.34031	0.668967
+class2	1.52947	0.867933
+nover:: 0	0	5	8	9
+class3	1.10033	0.968632
+class3	1.00368	0.998777
+class3	1.51989	0.869753
+class3	3.04424	0.689986
+class3	3.94025	0.633129
+class3	4.41186	0.609714
+class3	1.36885	0.900632
+class3	0.949531	1.01741
+class3	2.20505	0.768294
+nover:: 0	0	5	8	9
+6
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-2.10512	-0.486412
+class1	-3.48748	-3.03942
+class1	-5.22061	-3.02089
+class1	-1.12941	-0.175412
+class1	-2.47797	-1.99974
+nover:: 0	0	5	8	9
+class2	1.39155	1.09146
+class2	1.07285	0.722703
+class2	6.4437	1.73506
+class2	7.15082	2.91975
+class2	0.919849	0.60349
+class2	1.5843	1.509
+class2	3.34031	3.39595
+class2	1.52947	1.37618
+nover:: 0	0	5	8	9
+class3	1.10033	-0.89113
+class3	1.00368	-0.979081
+class3	1.51989	-0.372643
+class3	3.04424	-3.28936
+class3	3.94025	-3.68519
+class3	4.41186	-0.911718
+class3	1.36885	-1.11635
+class3	0.949531	-0.862576
+class3	2.20505	-1.12427
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-2.10512	-2.05587
+class1	-3.48748	-0.329011
+class1	-5.22061	-0.331029
+class1	-1.12941	-5.70087
+class1	-2.47797	-0.500064
+nover:: 0	0	5	8	9
+class2	1.39155	0.916201
+class2	1.07285	1.38369
+class2	6.4437	0.57635
+class2	7.15082	0.342495
+class2	0.919849	1.65703
+class2	1.5843	0.662693
+class2	3.34031	0.294469
+class2	1.52947	0.726648
+nover:: 0	0	5	8	9
+class3	1.10033	-1.12217
+class3	1.00368	-1.02137
+class3	1.51989	-2.68353
+class3	3.04424	-0.304011
+class3	3.94025	-0.271356
+class3	4.41186	-1.09683
+class3	1.36885	-0.895775
+class3	0.949531	-1.15932
+class3	2.20505	-0.889468
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-2.10512	0.706095
+class1	-3.48748	1.15826
+class1	-5.22061	0.770195
+class1	-1.12941	1.04073
+class1	-2.47797	1.21056
+nover:: 0	0	5	8	9
+class2	1.39155	0.0657279
+class2	1.07285	-0.258469
+class2	6.4437	0.114074
+class2	7.15082	0.268467
+class2	0.919849	-0.431059
+class2	1.5843	0.321275
+class2	3.34031	0.717283
+class2	1.52947	0.245956
+nover:: 0	0	5	8	9
+class3	1.10033	-1.71869
+class3	1.00368	-1.97183
+class3	1.51989	-0.90312
+class3	3.04424	-1.40901
+class3	3.94025	-1.18906
+class3	4.41186	-0.433313
+class3	1.36885	-1.54608
+class3	0.949531	-1.96157
+class3	2.20505	-0.963367
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-2.10512	0.229011
+class1	-3.48748	0.76531
+class1	-5.22061	0.546891
+class1	-1.12941	0.154689
+class1	-2.47797	0.722221
+nover:: 0	0	5	8	9
+class2	1.39155	0.706366
+class2	1.07285	0.623828
+class2	6.4437	0.266022
+class2	7.15082	0.39706
+class2	0.919849	0.610012
+class2	1.5843	0.814849
+class2	3.34031	0.850353
+class2	1.52947	0.783187
+nover:: 0	0	5	8	9
+class3	1.10033	-0.724201
+class3	1.00368	-0.827979
+class3	1.51989	-0.242729
+class3	3.04424	-0.882203
+class3	3.94025	-0.804758
+class3	4.41186	-0.205184
+class3	1.36885	-0.728094
+class3	0.949531	-0.788535
+class3	2.20505	-0.488056
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-2.10512	4.32785
+class1	-3.48748	1.14742
+class1	-5.22061	1.72817
+class1	-1.12941	6.43861
+class1	-2.47797	1.23914
+nover:: 0	0	5	8	9
+class2	1.39155	1.27494
+class2	1.07285	1.48449
+class2	6.4437	3.71383
+class2	7.15082	2.44911
+class2	0.919849	1.52422
+class2	1.5843	1.0499
+class2	3.34031	0.983616
+class2	1.52947	1.11139
+nover:: 0	0	5	8	9
+class3	1.10033	-1.23476
+class3	1.00368	-1.02512
+class3	1.51989	-4.07867
+class3	3.04424	-0.92548
+class3	3.94025	-1.06921
+class3	4.41186	-4.83906
+class3	1.36885	-1.22619
+class3	0.949531	-1.10081
+class3	2.20505	-1.96132
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-2.10512	0.613634
+class1	-3.48748	0.955197
+class1	-5.22061	0.833306
+class1	-1.12941	0.53753
+class1	-2.47797	0.931021
+nover:: 0	0	5	8	9
+class2	1.39155	0.922225
+class2	1.07285	0.876612
+class2	6.4437	0.645743
+class2	7.15082	0.741874
+class2	0.919849	0.86893
+class2	1.5843	0.983898
+class2	3.34031	1.00552
+class2	1.52947	0.965409
+nover:: 0	0	5	8	9
+class3	1.10033	-0.932122
+class3	1.00368	-0.991764
+class3	1.51989	-0.625884
+class3	3.04424	-1.02615
+class3	3.94025	-0.97794
+class3	4.41186	-0.591216
+class3	1.36885	-0.934289
+class3	0.949531	-0.968492
+class3	2.20505	-0.798885
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-2.10512	-0.109762
+class1	-3.48748	-0.249901
+class1	-5.22061	-0.110839
+class1	-1.12941	-0.137517
+class1	-2.47797	-0.325673
+nover:: 0	0	5	8	9
+class2	1.39155	0.563653
+class2	1.07285	0.627892
+class2	6.4437	0.0417872
+class2	7.15082	0.0570999
+class2	0.919849	0.713242
+class2	1.5843	0.601193
+class2	3.34031	0.30436
+class2	1.52947	0.588291
+nover:: 0	0	5	8	9
+class3	1.10033	-0.73603
+class3	1.00368	-0.97192
+class3	1.51989	-0.161313
+class3	3.04424	-0.35494
+class3	3.94025	-0.237363
+class3	4.41186	-0.04684
+class3	1.36885	-0.595781
+class3	0.949531	-0.956706
+class3	2.20505	-0.231225
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-2.10512	-0.243971
+class1	-3.48748	0.584782
+class1	-5.22061	0.387098
+class1	-1.12941	-0.730107
+class1	-2.47797	0.403453
+nover:: 0	0	5	8	9
+class2	1.39155	1.50298
+class2	1.07285	1.60573
+class2	6.4437	0.424454
+class2	7.15082	0.548155
+class2	0.919849	1.74321
+class2	1.5843	1.58366
+class2	3.34031	1.31603
+class2	1.52947	1.55359
+nover:: 0	0	5	8	9
+class3	1.10033	0.0989428
+class3	1.00368	0.0208427
+class3	1.51989	0.412765
+class3	3.04424	-0.752031
+class3	3.94025	-0.681478
+class3	4.41186	0.0200101
+class3	1.36885	-0.0849994
+class3	0.949531	0.144729
+class3	2.20505	-0.056356
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-0.780263	-0.486412
+class1	-0.659421	-3.03942
+class1	-0.576447	-3.02089
+class1	-0.960247	-0.175412
+class1	-0.738983	-1.99974
+nover:: 0	0	5	8	9
+class2	0.895709	1.09146
+class2	0.976834	0.722703
+class2	0.537388	1.73506
+class2	0.519057	2.91975
+class2	1.02824	0.60349
+class2	0.857803	1.509
+class2	0.668967	3.39595
+class2	0.867933	1.37618
+nover:: 0	0	5	8	9
+class3	0.968632	-0.89113
+class3	0.998777	-0.979081
+class3	0.869753	-0.372643
+class3	0.689986	-3.28936
+class3	0.633129	-3.68519
+class3	0.609714	-0.911718
+class3	0.900632	-1.11635
+class3	1.01741	-0.862576
+class3	0.768294	-1.12427
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-0.780263	-2.05587
+class1	-0.659421	-0.329011
+class1	-0.576447	-0.331029
+class1	-0.960247	-5.70087
+class1	-0.738983	-0.500064
+nover:: 0	0	5	8	9
+class2	0.895709	0.916201
+class2	0.976834	1.38369
+class2	0.537388	0.57635
+class2	0.519057	0.342495
+class2	1.02824	1.65703
+class2	0.857803	0.662693
+class2	0.668967	0.294469
+class2	0.867933	0.726648
+nover:: 0	0	5	8	9
+class3	0.968632	-1.12217
+class3	0.998777	-1.02137
+class3	0.869753	-2.68353
+class3	0.689986	-0.304011
+class3	0.633129	-0.271356
+class3	0.609714	-1.09683
+class3	0.900632	-0.895775
+class3	1.01741	-1.15932
+class3	0.768294	-0.889468
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-0.780263	0.706095
+class1	-0.659421	1.15826
+class1	-0.576447	0.770195
+class1	-0.960247	1.04073
+class1	-0.738983	1.21056
+nover:: 0	0	5	8	9
+class2	0.895709	0.0657279
+class2	0.976834	-0.258469
+class2	0.537388	0.114074
+class2	0.519057	0.268467
+class2	1.02824	-0.431059
+class2	0.857803	0.321275
+class2	0.668967	0.717283
+class2	0.867933	0.245956
+nover:: 0	0	5	8	9
+class3	0.968632	-1.71869
+class3	0.998777	-1.97183
+class3	0.869753	-0.90312
+class3	0.689986	-1.40901
+class3	0.633129	-1.18906
+class3	0.609714	-0.433313
+class3	0.900632	-1.54608
+class3	1.01741	-1.96157
+class3	0.768294	-0.963367
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-0.780263	0.229011
+class1	-0.659421	0.76531
+class1	-0.576447	0.546891
+class1	-0.960247	0.154689
+class1	-0.738983	0.722221
+nover:: 0	0	5	8	9
+class2	0.895709	0.706366
+class2	0.976834	0.623828
+class2	0.537388	0.266022
+class2	0.519057	0.39706
+class2	1.02824	0.610012
+class2	0.857803	0.814849
+class2	0.668967	0.850353
+class2	0.867933	0.783187
+nover:: 0	0	5	8	9
+class3	0.968632	-0.724201
+class3	0.998777	-0.827979
+class3	0.869753	-0.242729
+class3	0.689986	-0.882203
+class3	0.633129	-0.804758
+class3	0.609714	-0.205184
+class3	0.900632	-0.728094
+class3	1.01741	-0.788535
+class3	0.768294	-0.488056
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-0.780263	4.32785
+class1	-0.659421	1.14742
+class1	-0.576447	1.72817
+class1	-0.960247	6.43861
+class1	-0.738983	1.23914
+nover:: 0	0	5	8	9
+class2	0.895709	1.27494
+class2	0.976834	1.48449
+class2	0.537388	3.71383
+class2	0.519057	2.44911
+class2	1.02824	1.52422
+class2	0.857803	1.0499
+class2	0.668967	0.983616
+class2	0.867933	1.11139
+nover:: 0	0	5	8	9
+class3	0.968632	-1.23476
+class3	0.998777	-1.02512
+class3	0.869753	-4.07867
+class3	0.689986	-0.92548
+class3	0.633129	-1.06921
+class3	0.609714	-4.83906
+class3	0.900632	-1.22619
+class3	1.01741	-1.10081
+class3	0.768294	-1.96132
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-0.780263	0.613634
+class1	-0.659421	0.955197
+class1	-0.576447	0.833306
+class1	-0.960247	0.53753
+class1	-0.738983	0.931021
+nover:: 0	0	5	8	9
+class2	0.895709	0.922225
+class2	0.976834	0.876612
+class2	0.537388	0.645743
+class2	0.519057	0.741874
+class2	1.02824	0.86893
+class2	0.857803	0.983898
+class2	0.668967	1.00552
+class2	0.867933	0.965409
+nover:: 0	0	5	8	9
+class3	0.968632	-0.932122
+class3	0.998777	-0.991764
+class3	0.869753	-0.625884
+class3	0.689986	-1.02615
+class3	0.633129	-0.97794
+class3	0.609714	-0.591216
+class3	0.900632	-0.934289
+class3	1.01741	-0.968492
+class3	0.768294	-0.798885
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-0.780263	-0.109762
+class1	-0.659421	-0.249901
+class1	-0.576447	-0.110839
+class1	-0.960247	-0.137517
+class1	-0.738983	-0.325673
+nover:: 0	0	5	8	9
+class2	0.895709	0.563653
+class2	0.976834	0.627892
+class2	0.537388	0.0417872
+class2	0.519057	0.0570999
+class2	1.02824	0.713242
+class2	0.857803	0.601193
+class2	0.668967	0.30436
+class2	0.867933	0.588291
+nover:: 0	0	5	8	9
+class3	0.968632	-0.73603
+class3	0.998777	-0.97192
+class3	0.869753	-0.161313
+class3	0.689986	-0.35494
+class3	0.633129	-0.237363
+class3	0.609714	-0.04684
+class3	0.900632	-0.595781
+class3	1.01741	-0.956706
+class3	0.768294	-0.231225
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-0.780263	-0.243971
+class1	-0.659421	0.584782
+class1	-0.576447	0.387098
+class1	-0.960247	-0.730107
+class1	-0.738983	0.403453
+nover:: 0	0	5	8	9
+class2	0.895709	1.50298
+class2	0.976834	1.60573
+class2	0.537388	0.424454
+class2	0.519057	0.548155
+class2	1.02824	1.74321
+class2	0.857803	1.58366
+class2	0.668967	1.31603
+class2	0.867933	1.55359
+nover:: 0	0	5	8	9
+class3	0.968632	0.0989428
+class3	0.998777	0.0208427
+class3	0.869753	0.412765
+class3	0.689986	-0.752031
+class3	0.633129	-0.681478
+class3	0.609714	0.0200101
+class3	0.900632	-0.0849994
+class3	1.01741	0.144729
+class3	0.768294	-0.056356
+nover:: 0	0	5	8	9
+0
+nover:: 1	19	13	13	13
+nover:: 1	19	13	13	13
+nover:: 1	19	13	13	13
+nover:: 1	19	13	13	13
+1
+nover:: 0	0	5	8	9
+class1	-0.486412	-2.05587
+class1	-3.03942	-0.329011
+class1	-3.02089	-0.331029
+class1	-0.175412	-5.70087
+class1	-1.99974	-0.500064
+nover:: 0	0	5	8	9
+class2	1.09146	0.916201
+class2	0.722703	1.38369
+class2	1.73506	0.57635
+class2	2.91975	0.342495
+class2	0.60349	1.65703
+class2	1.509	0.662693
+class2	3.39595	0.294469
+class2	1.37618	0.726648
+nover:: 0	0	5	8	9
+class3	-0.89113	-1.12217
+class3	-0.979081	-1.02137
+class3	-0.372643	-2.68353
+class3	-3.28936	-0.304011
+class3	-3.68519	-0.271356
+class3	-0.911718	-1.09683
+class3	-1.11635	-0.895775
+class3	-0.862576	-1.15932
+class3	-1.12427	-0.889468
+nover:: 0	0	5	8	9
+1
+nover:: 4	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 13	19	13	13	13
+nover:: 13	19	13	13	13
+13
+nover:: 0	0	5	8	9
+class1	-0.486412	0.706095
+class1	-3.03942	1.15826
+class1	-3.02089	0.770195
+class1	-0.175412	1.04073
+class1	-1.99974	1.21056
+nover:: 0	0	5	8	9
+class2	1.09146	0.0657279
+class2	0.722703	-0.258469
+class2	1.73506	0.114074
+class2	2.91975	0.268467
+class2	0.60349	-0.431059
+class2	1.509	0.321275
+class2	3.39595	0.717283
+class2	1.37618	0.245956
+nover:: 0	0	5	8	9
+class3	-0.89113	-1.71869
+class3	-0.979081	-1.97183
+class3	-0.372643	-0.90312
+class3	-3.28936	-1.40901
+class3	-3.68519	-1.18906
+class3	-0.911718	-0.433313
+class3	-1.11635	-1.54608
+class3	-0.862576	-1.96157
+class3	-1.12427	-0.963367
+nover:: 0	0	5	8	9
+13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-0.486412	0.229011
+class1	-3.03942	0.76531
+class1	-3.02089	0.546891
+class1	-0.175412	0.154689
+class1	-1.99974	0.722221
+nover:: 0	0	5	8	9
+class2	1.09146	0.706366
+class2	0.722703	0.623828
+class2	1.73506	0.266022
+class2	2.91975	0.39706
+class2	0.60349	0.610012
+class2	1.509	0.814849
+class2	3.39595	0.850353
+class2	1.37618	0.783187
+nover:: 0	0	5	8	9
+class3	-0.89113	-0.724201
+class3	-0.979081	-0.827979
+class3	-0.372643	-0.242729
+class3	-3.28936	-0.882203
+class3	-3.68519	-0.804758
+class3	-0.911718	-0.205184
+class3	-1.11635	-0.728094
+class3	-0.862576	-0.788535
+class3	-1.12427	-0.488056
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-0.486412	4.32785
+class1	-3.03942	1.14742
+class1	-3.02089	1.72817
+class1	-0.175412	6.43861
+class1	-1.99974	1.23914
+nover:: 0	0	5	8	9
+class2	1.09146	1.27494
+class2	0.722703	1.48449
+class2	1.73506	3.71383
+class2	2.91975	2.44911
+class2	0.60349	1.52422
+class2	1.509	1.0499
+class2	3.39595	0.983616
+class2	1.37618	1.11139
+nover:: 0	0	5	8	9
+class3	-0.89113	-1.23476
+class3	-0.979081	-1.02512
+class3	-0.372643	-4.07867
+class3	-3.28936	-0.92548
+class3	-3.68519	-1.06921
+class3	-0.911718	-4.83906
+class3	-1.11635	-1.22619
+class3	-0.862576	-1.10081
+class3	-1.12427	-1.96132
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-0.486412	0.613634
+class1	-3.03942	0.955197
+class1	-3.02089	0.833306
+class1	-0.175412	0.53753
+class1	-1.99974	0.931021
+nover:: 0	0	5	8	9
+class2	1.09146	0.922225
+class2	0.722703	0.876612
+class2	1.73506	0.645743
+class2	2.91975	0.741874
+class2	0.60349	0.86893
+class2	1.509	0.983898
+class2	3.39595	1.00552
+class2	1.37618	0.965409
+nover:: 0	0	5	8	9
+class3	-0.89113	-0.932122
+class3	-0.979081	-0.991764
+class3	-0.372643	-0.625884
+class3	-3.28936	-1.02615
+class3	-3.68519	-0.97794
+class3	-0.911718	-0.591216
+class3	-1.11635	-0.934289
+class3	-0.862576	-0.968492
+class3	-1.12427	-0.798885
+nover:: 0	0	5	8	9
+0
+nover:: 8	19	13	13	13
+nover:: 14	19	13	13	13
+nover:: 27	19	13	13	13
+nover:: 36	19	13	13	13
+36
+nover:: 0	0	5	8	9
+class1	-0.486412	-0.109762
+class1	-3.03942	-0.249901
+class1	-3.02089	-0.110839
+class1	-0.175412	-0.137517
+class1	-1.99974	-0.325673
+nover:: 1	0	5	8	9
+class2	1.09146	0.563653
+class2	0.722703	0.627892
+class2	1.73506	0.0417872
+class2	2.91975	0.0570999
+class2	0.60349	0.713242
+class2	1.509	0.601193
+class2	3.39595	0.30436
+class2	1.37618	0.588291
+nover:: 1	0	5	8	9
+class3	-0.89113	-0.73603
+class3	-0.979081	-0.97192
+class3	-0.372643	-0.161313
+class3	-3.28936	-0.35494
+class3	-3.68519	-0.237363
+class3	-0.911718	-0.04684
+class3	-1.11635	-0.595781
+class3	-0.862576	-0.956706
+class3	-1.12427	-0.231225
+nover:: 3	0	5	8	9
+39
+nover:: 0	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 8	19	13	13	13
+8
+nover:: 0	0	5	8	9
+class1	-0.486412	-0.243971
+class1	-3.03942	0.584782
+class1	-3.02089	0.387098
+class1	-0.175412	-0.730107
+class1	-1.99974	0.403453
+nover:: 2	0	5	8	9
+class2	1.09146	1.50298
+class2	0.722703	1.60573
+class2	1.73506	0.424454
+class2	2.91975	0.548155
+class2	0.60349	1.74321
+class2	1.509	1.58366
+class2	3.39595	1.31603
+class2	1.37618	1.55359
+nover:: 2	0	5	8	9
+class3	-0.89113	0.0989428
+class3	-0.979081	0.0208427
+class3	-0.372643	0.412765
+class3	-3.28936	-0.752031
+class3	-3.68519	-0.681478
+class3	-0.911718	0.0200101
+class3	-1.11635	-0.0849994
+class3	-0.862576	0.144729
+class3	-1.12427	-0.056356
+nover:: 2	0	5	8	9
+10
+nover:: 4	19	13	13	13
+nover:: 4	19	13	13	13
+nover:: 9	19	13	13	13
+nover:: 9	19	13	13	13
+9
+nover:: 0	0	5	8	9
+class1	-2.05587	0.706095
+class1	-0.329011	1.15826
+class1	-0.331029	0.770195
+class1	-5.70087	1.04073
+class1	-0.500064	1.21056
+nover:: 0	0	5	8	9
+class2	0.916201	0.0657279
+class2	1.38369	-0.258469
+class2	0.57635	0.114074
+class2	0.342495	0.268467
+class2	1.65703	-0.431059
+class2	0.662693	0.321275
+class2	0.294469	0.717283
+class2	0.726648	0.245956
+nover:: 0	0	5	8	9
+class3	-1.12217	-1.71869
+class3	-1.02137	-1.97183
+class3	-2.68353	-0.90312
+class3	-0.304011	-1.40901
+class3	-0.271356	-1.18906
+class3	-1.09683	-0.433313
+class3	-0.895775	-1.54608
+class3	-1.15932	-1.96157
+class3	-0.889468	-0.963367
+nover:: 0	0	5	8	9
+9
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-2.05587	0.229011
+class1	-0.329011	0.76531
+class1	-0.331029	0.546891
+class1	-5.70087	0.154689
+class1	-0.500064	0.722221
+nover:: 0	0	5	8	9
+class2	0.916201	0.706366
+class2	1.38369	0.623828
+class2	0.57635	0.266022
+class2	0.342495	0.39706
+class2	1.65703	0.610012
+class2	0.662693	0.814849
+class2	0.294469	0.850353
+class2	0.726648	0.783187
+nover:: 0	0	5	8	9
+class3	-1.12217	-0.724201
+class3	-1.02137	-0.827979
+class3	-2.68353	-0.242729
+class3	-0.304011	-0.882203
+class3	-0.271356	-0.804758
+class3	-1.09683	-0.205184
+class3	-0.895775	-0.728094
+class3	-1.15932	-0.788535
+class3	-0.889468	-0.488056
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-2.05587	4.32785
+class1	-0.329011	1.14742
+class1	-0.331029	1.72817
+class1	-5.70087	6.43861
+class1	-0.500064	1.23914
+nover:: 0	0	5	8	9
+class2	0.916201	1.27494
+class2	1.38369	1.48449
+class2	0.57635	3.71383
+class2	0.342495	2.44911
+class2	1.65703	1.52422
+class2	0.662693	1.0499
+class2	0.294469	0.983616
+class2	0.726648	1.11139
+nover:: 0	0	5	8	9
+class3	-1.12217	-1.23476
+class3	-1.02137	-1.02512
+class3	-2.68353	-4.07867
+class3	-0.304011	-0.92548
+class3	-0.271356	-1.06921
+class3	-1.09683	-4.83906
+class3	-0.895775	-1.22619
+class3	-1.15932	-1.10081
+class3	-0.889468	-1.96132
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	-2.05587	0.613634
+class1	-0.329011	0.955197
+class1	-0.331029	0.833306
+class1	-5.70087	0.53753
+class1	-0.500064	0.931021
+nover:: 0	0	5	8	9
+class2	0.916201	0.922225
+class2	1.38369	0.876612
+class2	0.57635	0.645743
+class2	0.342495	0.741874
+class2	1.65703	0.86893
+class2	0.662693	0.983898
+class2	0.294469	1.00552
+class2	0.726648	0.965409
+nover:: 0	0	5	8	9
+class3	-1.12217	-0.932122
+class3	-1.02137	-0.991764
+class3	-2.68353	-0.625884
+class3	-0.304011	-1.02615
+class3	-0.271356	-0.97794
+class3	-1.09683	-0.591216
+class3	-0.895775	-0.934289
+class3	-1.15932	-0.968492
+class3	-0.889468	-0.798885
+nover:: 0	0	5	8	9
+0
+nover:: 10	19	13	13	13
+nover:: 17	19	13	13	13
+nover:: 32	19	13	13	13
+nover:: 41	19	13	13	13
+41
+nover:: 0	0	5	8	9
+class1	-2.05587	-0.109762
+class1	-0.329011	-0.249901
+class1	-0.331029	-0.110839
+class1	-5.70087	-0.137517
+class1	-0.500064	-0.325673
+nover:: 2	0	5	8	9
+class2	0.916201	0.563653
+class2	1.38369	0.627892
+class2	0.57635	0.0417872
+class2	0.342495	0.0570999
+class2	1.65703	0.713242
+class2	0.662693	0.601193
+class2	0.294469	0.30436
+class2	0.726648	0.588291
+nover:: 2	0	5	8	9
+class3	-1.12217	-0.73603
+class3	-1.02137	-0.97192
+class3	-2.68353	-0.161313
+class3	-0.304011	-0.35494
+class3	-0.271356	-0.237363
+class3	-1.09683	-0.04684
+class3	-0.895775	-0.595781
+class3	-1.15932	-0.956706
+class3	-0.889468	-0.231225
+nover:: 4	0	5	8	9
+45
+nover:: 0	19	13	13	13
+nover:: 5	19	13	13	13
+nover:: 5	19	13	13	13
+nover:: 8	19	13	13	13
+8
+nover:: 0	0	5	8	9
+class1	-2.05587	-0.243971
+class1	-0.329011	0.584782
+class1	-0.331029	0.387098
+class1	-5.70087	-0.730107
+class1	-0.500064	0.403453
+nover:: 2	0	5	8	9
+class2	0.916201	1.50298
+class2	1.38369	1.60573
+class2	0.57635	0.424454
+class2	0.342495	0.548155
+class2	1.65703	1.74321
+class2	0.662693	1.58366
+class2	0.294469	1.31603
+class2	0.726648	1.55359
+nover:: 2	0	5	8	9
+class3	-1.12217	0.0989428
+class3	-1.02137	0.0208427
+class3	-2.68353	0.412765
+class3	-0.304011	-0.752031
+class3	-0.271356	-0.681478
+class3	-1.09683	0.0200101
+class3	-0.895775	-0.0849994
+class3	-1.15932	0.144729
+class3	-0.889468	-0.056356
+nover:: 2	0	5	8	9
+10
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.706095	0.229011
+class1	1.15826	0.76531
+class1	0.770195	0.546891
+class1	1.04073	0.154689
+class1	1.21056	0.722221
+nover:: 0	0	5	8	9
+class2	0.0657279	0.706366
+class2	-0.258469	0.623828
+class2	0.114074	0.266022
+class2	0.268467	0.39706
+class2	-0.431059	0.610012
+class2	0.321275	0.814849
+class2	0.717283	0.850353
+class2	0.245956	0.783187
+nover:: 0	0	5	8	9
+class3	-1.71869	-0.724201
+class3	-1.97183	-0.827979
+class3	-0.90312	-0.242729
+class3	-1.40901	-0.882203
+class3	-1.18906	-0.804758
+class3	-0.433313	-0.205184
+class3	-1.54608	-0.728094
+class3	-1.96157	-0.788535
+class3	-0.963367	-0.488056
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.706095	4.32785
+class1	1.15826	1.14742
+class1	0.770195	1.72817
+class1	1.04073	6.43861
+class1	1.21056	1.23914
+nover:: 0	0	5	8	9
+class2	0.0657279	1.27494
+class2	-0.258469	1.48449
+class2	0.114074	3.71383
+class2	0.268467	2.44911
+class2	-0.431059	1.52422
+class2	0.321275	1.0499
+class2	0.717283	0.983616
+class2	0.245956	1.11139
+nover:: 0	0	5	8	9
+class3	-1.71869	-1.23476
+class3	-1.97183	-1.02512
+class3	-0.90312	-4.07867
+class3	-1.40901	-0.92548
+class3	-1.18906	-1.06921
+class3	-0.433313	-4.83906
+class3	-1.54608	-1.22619
+class3	-1.96157	-1.10081
+class3	-0.963367	-1.96132
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.706095	0.613634
+class1	1.15826	0.955197
+class1	0.770195	0.833306
+class1	1.04073	0.53753
+class1	1.21056	0.931021
+nover:: 0	0	5	8	9
+class2	0.0657279	0.922225
+class2	-0.258469	0.876612
+class2	0.114074	0.645743
+class2	0.268467	0.741874
+class2	-0.431059	0.86893
+class2	0.321275	0.983898
+class2	0.717283	1.00552
+class2	0.245956	0.965409
+nover:: 0	0	5	8	9
+class3	-1.71869	-0.932122
+class3	-1.97183	-0.991764
+class3	-0.90312	-0.625884
+class3	-1.40901	-1.02615
+class3	-1.18906	-0.97794
+class3	-0.433313	-0.591216
+class3	-1.54608	-0.934289
+class3	-1.96157	-0.968492
+class3	-0.963367	-0.798885
+nover:: 0	0	5	8	9
+0
+nover:: 8	19	13	13	13
+nover:: 8	19	13	13	13
+nover:: 22	19	13	13	13
+nover:: 22	19	13	13	13
+22
+nover:: 0	0	5	8	9
+class1	0.706095	-0.109762
+class1	1.15826	-0.249901
+class1	0.770195	-0.110839
+class1	1.04073	-0.137517
+class1	1.21056	-0.325673
+nover:: 0	0	5	8	9
+class2	0.0657279	0.563653
+class2	-0.258469	0.627892
+class2	0.114074	0.0417872
+class2	0.268467	0.0570999
+class2	-0.431059	0.713242
+class2	0.321275	0.601193
+class2	0.717283	0.30436
+class2	0.245956	0.588291
+nover:: 0	0	5	8	9
+class3	-1.71869	-0.73603
+class3	-1.97183	-0.97192
+class3	-0.90312	-0.161313
+class3	-1.40901	-0.35494
+class3	-1.18906	-0.237363
+class3	-0.433313	-0.04684
+class3	-1.54608	-0.595781
+class3	-1.96157	-0.956706
+class3	-0.963367	-0.231225
+nover:: 0	0	5	8	9
+22
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.706095	-0.243971
+class1	1.15826	0.584782
+class1	0.770195	0.387098
+class1	1.04073	-0.730107
+class1	1.21056	0.403453
+nover:: 0	0	5	8	9
+class2	0.0657279	1.50298
+class2	-0.258469	1.60573
+class2	0.114074	0.424454
+class2	0.268467	0.548155
+class2	-0.431059	1.74321
+class2	0.321275	1.58366
+class2	0.717283	1.31603
+class2	0.245956	1.55359
+nover:: 0	0	5	8	9
+class3	-1.71869	0.0989428
+class3	-1.97183	0.0208427
+class3	-0.90312	0.412765
+class3	-1.40901	-0.752031
+class3	-1.18906	-0.681478
+class3	-0.433313	0.0200101
+class3	-1.54608	-0.0849994
+class3	-1.96157	0.144729
+class3	-0.963367	-0.056356
+nover:: 0	0	5	8	9
+0
+nover:: 3	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 4	19	13	13	13
+4
+nover:: 0	0	5	8	9
+class1	0.229011	4.32785
+class1	0.76531	1.14742
+class1	0.546891	1.72817
+class1	0.154689	6.43861
+class1	0.722221	1.23914
+nover:: 0	0	5	8	9
+class2	0.706366	1.27494
+class2	0.623828	1.48449
+class2	0.266022	3.71383
+class2	0.39706	2.44911
+class2	0.610012	1.52422
+class2	0.814849	1.0499
+class2	0.850353	0.983616
+class2	0.783187	1.11139
+nover:: 0	0	5	8	9
+class3	-0.724201	-1.23476
+class3	-0.827979	-1.02512
+class3	-0.242729	-4.07867
+class3	-0.882203	-0.92548
+class3	-0.804758	-1.06921
+class3	-0.205184	-4.83906
+class3	-0.728094	-1.22619
+class3	-0.788535	-1.10081
+class3	-0.488056	-1.96132
+nover:: 0	0	5	8	9
+4
+nover:: 6	19	13	13	13
+nover:: 12	19	13	13	13
+nover:: 15	19	13	13	13
+nover:: 25	19	13	13	13
+25
+nover:: 0	0	5	8	9
+class1	0.229011	0.613634
+class1	0.76531	0.955197
+class1	0.546891	0.833306
+class1	0.154689	0.53753
+class1	0.722221	0.931021
+nover:: 1	0	5	8	9
+class2	0.706366	0.922225
+class2	0.623828	0.876612
+class2	0.266022	0.645743
+class2	0.39706	0.741874
+class2	0.610012	0.86893
+class2	0.814849	0.983898
+class2	0.850353	1.00552
+class2	0.783187	0.965409
+nover:: 3	0	5	8	9
+class3	-0.724201	-0.932122
+class3	-0.827979	-0.991764
+class3	-0.242729	-0.625884
+class3	-0.882203	-1.02615
+class3	-0.804758	-0.97794
+class3	-0.205184	-0.591216
+class3	-0.728094	-0.934289
+class3	-0.788535	-0.968492
+class3	-0.488056	-0.798885
+nover:: 3	0	5	8	9
+28
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.229011	-0.109762
+class1	0.76531	-0.249901
+class1	0.546891	-0.110839
+class1	0.154689	-0.137517
+class1	0.722221	-0.325673
+nover:: 0	0	5	8	9
+class2	0.706366	0.563653
+class2	0.623828	0.627892
+class2	0.266022	0.0417872
+class2	0.39706	0.0570999
+class2	0.610012	0.713242
+class2	0.814849	0.601193
+class2	0.850353	0.30436
+class2	0.783187	0.588291
+nover:: 0	0	5	8	9
+class3	-0.724201	-0.73603
+class3	-0.827979	-0.97192
+class3	-0.242729	-0.161313
+class3	-0.882203	-0.35494
+class3	-0.804758	-0.237363
+class3	-0.205184	-0.04684
+class3	-0.728094	-0.595781
+class3	-0.788535	-0.956706
+class3	-0.488056	-0.231225
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.229011	-0.243971
+class1	0.76531	0.584782
+class1	0.546891	0.387098
+class1	0.154689	-0.730107
+class1	0.722221	0.403453
+nover:: 0	0	5	8	9
+class2	0.706366	1.50298
+class2	0.623828	1.60573
+class2	0.266022	0.424454
+class2	0.39706	0.548155
+class2	0.610012	1.74321
+class2	0.814849	1.58366
+class2	0.850353	1.31603
+class2	0.783187	1.55359
+nover:: 0	0	5	8	9
+class3	-0.724201	0.0989428
+class3	-0.827979	0.0208427
+class3	-0.242729	0.412765
+class3	-0.882203	-0.752031
+class3	-0.804758	-0.681478
+class3	-0.205184	0.0200101
+class3	-0.728094	-0.0849994
+class3	-0.788535	0.144729
+class3	-0.488056	-0.056356
+nover:: 0	0	5	8	9
+0
+nover:: 3	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 3	19	13	13	13
+nover:: 4	19	13	13	13
+4
+nover:: 0	0	5	8	9
+class1	4.32785	0.613634
+class1	1.14742	0.955197
+class1	1.72817	0.833306
+class1	6.43861	0.53753
+class1	1.23914	0.931021
+nover:: 0	0	5	8	9
+class2	1.27494	0.922225
+class2	1.48449	0.876612
+class2	3.71383	0.645743
+class2	2.44911	0.741874
+class2	1.52422	0.86893
+class2	1.0499	0.983898
+class2	0.983616	1.00552
+class2	1.11139	0.965409
+nover:: 0	0	5	8	9
+class3	-1.23476	-0.932122
+class3	-1.02512	-0.991764
+class3	-4.07867	-0.625884
+class3	-0.92548	-1.02615
+class3	-1.06921	-0.97794
+class3	-4.83906	-0.591216
+class3	-1.22619	-0.934289
+class3	-1.10081	-0.968492
+class3	-1.96132	-0.798885
+nover:: 0	0	5	8	9
+4
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	4.32785	-0.109762
+class1	1.14742	-0.249901
+class1	1.72817	-0.110839
+class1	6.43861	-0.137517
+class1	1.23914	-0.325673
+nover:: 0	0	5	8	9
+class2	1.27494	0.563653
+class2	1.48449	0.627892
+class2	3.71383	0.0417872
+class2	2.44911	0.0570999
+class2	1.52422	0.713242
+class2	1.0499	0.601193
+class2	0.983616	0.30436
+class2	1.11139	0.588291
+nover:: 0	0	5	8	9
+class3	-1.23476	-0.73603
+class3	-1.02512	-0.97192
+class3	-4.07867	-0.161313
+class3	-0.92548	-0.35494
+class3	-1.06921	-0.237363
+class3	-4.83906	-0.04684
+class3	-1.22619	-0.595781
+class3	-1.10081	-0.956706
+class3	-1.96132	-0.231225
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 1	19	13	13	13
+1
+nover:: 0	0	5	8	9
+class1	4.32785	-0.243971
+class1	1.14742	0.584782
+class1	1.72817	0.387098
+class1	6.43861	-0.730107
+class1	1.23914	0.403453
+nover:: 0	0	5	8	9
+class2	1.27494	1.50298
+class2	1.48449	1.60573
+class2	3.71383	0.424454
+class2	2.44911	0.548155
+class2	1.52422	1.74321
+class2	1.0499	1.58366
+class2	0.983616	1.31603
+class2	1.11139	1.55359
+nover:: 0	0	5	8	9
+class3	-1.23476	0.0989428
+class3	-1.02512	0.0208427
+class3	-4.07867	0.412765
+class3	-0.92548	-0.752031
+class3	-1.06921	-0.681478
+class3	-4.83906	0.0200101
+class3	-1.22619	-0.0849994
+class3	-1.10081	0.144729
+class3	-1.96132	-0.056356
+nover:: 0	0	5	8	9
+1
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.613634	-0.109762
+class1	0.955197	-0.249901
+class1	0.833306	-0.110839
+class1	0.53753	-0.137517
+class1	0.931021	-0.325673
+nover:: 0	0	5	8	9
+class2	0.922225	0.563653
+class2	0.876612	0.627892
+class2	0.645743	0.0417872
+class2	0.741874	0.0570999
+class2	0.86893	0.713242
+class2	0.983898	0.601193
+class2	1.00552	0.30436
+class2	0.965409	0.588291
+nover:: 0	0	5	8	9
+class3	-0.932122	-0.73603
+class3	-0.991764	-0.97192
+class3	-0.625884	-0.161313
+class3	-1.02615	-0.35494
+class3	-0.97794	-0.237363
+class3	-0.591216	-0.04684
+class3	-0.934289	-0.595781
+class3	-0.968492	-0.956706
+class3	-0.798885	-0.231225
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+nover:: 0	19	13	13	13
+0
+nover:: 0	0	5	8	9
+class1	0.613634	-0.243971
+class1	0.955197	0.584782
+class1	0.833306	0.387098
+class1	0.53753	-0.730107
+class1	0.931021	0.403453
+nover:: 0	0	5	8	9
+class2	0.922225	1.50298
+class2	0.876612	1.60573
+class2	0.645743	0.424454
+class2	0.741874	0.548155
+class2	0.86893	1.74321
+class2	0.983898	1.58366
+class2	1.00552	1.31603
+class2	0.965409	1.55359
+nover:: 0	0	5	8	9
+class3	-0.932122	0.0989428
+class3	-0.991764	0.0208427
+class3	-0.625884	0.412765
+class3	-1.02615	-0.752031
+class3	-0.97794	-0.681478
+class3	-0.591216	0.0200101
+class3	-0.934289	-0.0849994
+class3	-0.968492	0.144729
+class3	-0.798885	-0.056356
+nover:: 0	0	5	8	9
+0
+nover:: 0	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 7	19	13	13	13
+nover:: 15	19	13	13	13
+15
+nover:: 0	0	5	8	9
+class1	-0.109762	-0.243971
+class1	-0.249901	0.584782
+class1	-0.110839	0.387098
+class1	-0.137517	-0.730107
+class1	-0.325673	0.403453
+nover:: 2	0	5	8	9
+class2	0.563653	1.50298
+class2	0.627892	1.60573
+class2	0.0417872	0.424454
+class2	0.0570999	0.548155
+class2	0.713242	1.74321
+class2	0.601193	1.58366
+class2	0.30436	1.31603
+class2	0.588291	1.55359
+nover:: 2	0	5	8	9
+class3	-0.73603	0.0989428
+class3	-0.97192	0.0208427
+class3	-0.161313	0.412765
+class3	-0.35494	-0.752031
+class3	-0.237363	-0.681478
+class3	-0.04684	0.0200101
+class3	-0.595781	-0.0849994
+class3	-0.956706	0.144729
+class3	-0.231225	-0.056356
+nover:: 2	0	5	8	9
+17
+0
+Time for l0-norm: 1.8686 s
+Percent of training data in the convex overlap region: 63.75%; Percent of test data in the convex overlap region: 70%
+[(B + A)]
+
+Percent of training data in the convex overlap region: 0%; Percent of test data in the convex overlap region: 0%
+[exp(A), (B / A)]
+
diff --git a/tests/exec_test/classification_gen_proj/check_model.py b/tests/exec_test/classification_gen_proj/check_model.py
index 5958c75a5d751910b4baa9cd22593a2de8289c5d..067d895e46bd7976c91576c4749a233bc59279df 100644
--- a/tests/exec_test/classification_gen_proj/check_model.py
+++ b/tests/exec_test/classification_gen_proj/check_model.py
@@ -4,7 +4,6 @@ from pathlib import Path
 import numpy as np
 
 model = ModelClassifier(
-    str("models/train_dim_2_model_0.dat"), str("models/test_dim_2_model_0.dat")
+    str("models/train_dim_2_model_0.dat")
 )
 assert model.percent_error < 1e-7
-assert model.percent_test_error < 1e-7
diff --git a/tests/exec_test/classification_gen_proj/data.csv b/tests/exec_test/classification_gen_proj/data.csv
index 3fa9f64bd0b9d2133040ea4849a71789ac8f078a..3de3e29a3d3a8510a21ace99268501120ce46af4 100644
--- a/tests/exec_test/classification_gen_proj/data.csv
+++ b/tests/exec_test/classification_gen_proj/data.csv
@@ -1,101 +1,101 @@
-index,prop,A,B,C,D,E,F,G,H,I,J
-0,1,0.1,-0.3,10,10,-0.492825179098274,0.173025977694162,0.598942935224295,-0.298754475196825,-0.581254909010269,-0.110656494210556
-1,1,-1.89442810374214,-1.31996134398007,0.955713896876243,0.478117201427488,0.777586191100329,0.655369716778557,0.174914171427966,-0.288498877530604,-0.045316536149489,-0.606586193752411
-2,1,-1.47460150711424,-1.22614964523433,0.330140292484796,-0.389505498689912,0.138935265824808,-0.871869282167555,0.37472462048701,0.16418591189513,0.293560701443717,0.285608940220021
-3,1,-1.30213414336735,-1.82621262418812,-0.71381302228685,0.968769585007681,0.683148179202864,0.799125092538796,0.309479173526504,0.728052031003468,0.725495580994886,-0.676576302804248
-4,1,-1.73938632269334,-1.58349866505488,-0.067783417095816,0.229988549891323,0.582427598044647,0.261947149184825,-0.31573435079735,0.61178122144268,-0.187058216967238,0.871764347690334
-5,1,-1.56660896632398,-1.05861814902183,-0.212800982302764,0.915459776146607,-0.263465552591813,-0.666126495988014,-0.195028996490007,-0.237170057680116,-0.933358858596883,-0.19726273171241
-6,1,-1.55340876153895,-1.25209231285838,0.972585787901787,-0.872502887185675,0.729110910814452,0.265244787210866,-0.726900973624432,0.248626170107939,0.809004396880265,-0.278494064780479
-7,1,-1.54625325136447,-1.81238888450819,0.431645876221063,-0.595223273279383,0.953557069758112,-0.222915219121563,0.257670939076174,0.728927879098318,-0.579783055417687,-0.179960319428095
-8,1,-1.12735554524035,-1.69261497444728,0.827091199052693,0.208781482910759,-0.573317187361529,-0.00488758921352,0.341918716034638,-0.292901671356202,-0.34531700628784,0.766920547630073
-9,1,-1.35367834815884,-1.38141056472962,0.023965203621816,-0.805412569418484,0.806810139718495,-0.806576263127819,-0.39279977856172,-0.463819363774079,0.16095248005356,-0.553419747131608
-10,1,-1.17853151888796,-1.27705829298504,0.916015229666356,0.0275946645157,-0.829507007977635,-0.700063689327201,-0.715601456588714,-0.439799165143527,-0.487241220494887,0.245279267056121
-11,1,-1.17547049766875,-1.05613281246665,0.404557718897757,0.175997361062361,0.227383730822478,0.774254652577977,-0.616080996917636,-0.948639007451084,0.954076433375225,-0.497102001172339
-12,1,-1.67277915033943,-1.86190239883588,0.906708844886064,-0.592737030373698,0.885229803890949,0.822069297241907,0.204367485562992,0.24610324883505,-0.079476866422163,-0.244006995583434
-13,1,-1.96326165438884,-1.31680458089693,-0.284570394188414,-0.198686061574238,-0.168897609541112,-0.92939259112691,0.265899059671913,-0.828727642584781,-0.427453097474168,-0.738368091608883
-14,1,-1.79497769808481,-1.13948217357082,0.011556817105957,0.499408314757229,0.253163424774478,0.017645446880421,0.401735167095264,-0.650287617298501,-0.262217482830833,-0.510102120130588
-15,1,-1.07957120536262,-1.93245955077991,-0.520161174281201,-0.392478459347396,0.147495839565868,0.780879606474075,-0.281044687799173,-0.148036908135786,-0.208810472224093,0.278961929718128
-16,1,-1.52555145037507,-1.72455209196736,0.562803219191695,0.323676061636996,0.490737136410372,-0.95476192699496,0.028603504036769,-0.246295219650507,-0.297736293597739,-0.632473830957653
-17,1,-1.29142309507315,-1.9506961526212,0.859669461054104,0.485772819254089,0.268883598825009,0.253553321699552,-0.045743087569395,0.66793403278249,0.308591963919947,0.771084301464027
-18,1,-1.23404787121001,-1.68173519287847,-0.118871100462413,0.159000937768132,0.2985428841756,-0.203829205332538,-0.637945695251352,0.658049690810909,0.949529589134008,-0.577812553880056
-19,1,-1.13513050029551,-1.3119036669604,0.74037411093045,0.558782660077505,-0.096052126354583,0.529119817360537,0.372190604770144,0.688656466253683,-0.819433165315505,-0.12814415930811
-20,1,-0.2,0.132,-10,-10,0.785237349732891,-0.387217730495401,-0.942409218899448,0.160806577297675,-0.723969983661972,-0.452650134415823
-21,1,1.1507658618081,1.7260505392724,-0.92290734251321,0.465751384219632,-0.81727500527083,-0.182472640926628,0.887526070620356,0.111592883978488,0.386435078880162,-0.440017211221272
-22,1,1.90389768224701,1.71880759316591,0.28033979546451,0.379365407838544,0.634843008192624,0.371753918780839,-0.611653305369863,0.732567927874185,0.85803611350317,-0.577973441708411
-23,1,1.77751976452381,1.28697050370578,0.222850898945077,-0.424057088828287,-0.27619426781836,0.616941667680694,-0.696779972923147,0.23612770730498,0.760705889780527,0.34004139732033
-24,1,1.65314327493874,1.16282810211312,-0.339501197382944,0.898529591365812,-0.881538228231582,0.090728826664301,-0.858395870780934,0.907272331515896,0.160531735619067,0.678911811768841
-25,1,1.30955180558204,1.36827755737648,-0.444576754578563,-0.78871174512572,0.246625773070183,-0.663474018818313,-0.446355552060464,0.750312773134634,-0.98959522970136,-0.150120109840706
-26,1,1.44924431171893,1.40328864581169,-0.388679577334402,-0.708193450791952,0.850310084800308,-0.767256338531612,0.370509317329194,0.387354921015751,0.533160321164986,0.149390212455131
-27,1,1.61362501391753,1.05448314414567,-0.359644680155969,0.682555404147728,-0.53195400936544,0.934101689590862,-0.73600284018832,-0.29960291454053,0.351125596355818,-0.187842884669279
-28,1,1.0243392720598,1.91059602121133,-0.249409157470717,-0.137070024386644,-0.707128531134104,0.944932049234295,0.431233366052987,0.449543990959262,0.912901932280027,0.77394610963827
-29,1,1.99444678594607,1.67204984441306,0.935302642480463,0.833720966523807,0.254167956717343,-0.007922712021391,-0.114776295376767,-0.276042896002242,-0.813098403125419,0.341922052212475
-30,1,1.40110330287926,1.109011516196,0.425716772255087,0.544174803732763,0.492863854358204,-0.047589791717166,-0.743840790633672,0.525289489060411,0.829611715544936,0.015193221845522
-31,1,1.94995090625471,1.05727410799969,0.12665368551441,0.469705238170149,0.874436248273008,-0.759571175468135,0.310230735619265,-0.80342084374485,-0.462431082486477,-0.407165886759129
-32,1,1.47264625042994,1.18913643279065,-0.731393018031039,0.461102224603009,-0.884528391885322,-0.419893944840798,0.647518214389067,0.688126326408485,0.754656371691603,0.116881923067816
-33,1,1.45901677207507,1.17024364037294,-0.527433424947131,-0.598774697808646,0.113197791601676,-0.50528865259863,0.117572114288939,0.590400320594326,-0.155159386769126,0.354827977413197
-34,1,1.32042744243041,1.19801952930384,-0.818626622405165,-0.029008564510599,0.952315968378468,0.817495784213924,0.182224554845043,-0.01377304364653,-0.26273195293588,-0.859530562808673
-35,1,1.88138237976289,1.03670081839679,0.305218688016626,-0.060885647660027,-0.387903446605514,-0.108064042735465,-0.962980405009682,-0.424289604203511,-0.253442293077285,0.309637368411297
-36,1,1.9986688782461,1.36909257128618,0.54312844740039,0.591372473040837,-0.835367086693457,0.198315253422507,-0.181434739783802,0.636429105754948,0.420628646992331,0.990122364664621
-37,1,1.50455818499044,1.19094974349673,-0.653263607332762,0.359359450868376,0.30131719114182,0.649581794356589,0.942268955633086,0.884659894489377,-0.473171239344398,0.039635066570717
-38,1,1.00833361547154,1.98150630000827,-0.812352457176761,0.219766101590983,-0.65021067790289,0.423621690291556,-0.58865099275791,0.061487886019891,-0.237737474016087,0.641284347380825
-39,1,1.60179185724619,1.12508599627141,-0.829819386940741,-0.345104687573802,0.485166070545119,-0.258839727448056,-0.920615208326881,0.275498215871427,-0.629075534110342,-0.642527887960687
-40,0,0.2,0.58,10,-10,0.041519856511361,0.23303461629095,-0.497233246191187,-0.023544587617095,-0.418540837770003,-0.550233932792512
-41,0,-1.09147574370355,1.70418701701285,-0.480316474702795,-0.753784710340632,-0.613234235616998,0.167955573662474,0.455636631315042,-0.380227635953206,0.48021383007369,-0.453674929885108
-42,0,-1.9425392252915,1.59311394144654,0.310098050913387,-0.835007082906627,0.407580140850853,0.556924247596553,-0.388616604639346,0.60215104751412,-0.984322198098753,-0.996332888983337
-43,0,-1.40302421044915,1.05041379743038,-0.898456453446964,-0.797029924245349,0.47491891024478,0.480193220538417,-0.750856163558686,-0.043960372032018,-0.242651391805662,-0.109239061054006
-44,0,-1.45810616907354,1.08468326497063,0.571329522934018,-0.677379826379623,0.098396984688832,-0.961599170104035,-0.753922591922157,0.361435891257559,-0.638030455493982,0.404349024843908
-45,0,-1.60421432901638,1.57730973247518,0.402433205555268,-0.06801187450078,-0.373089661152032,0.23970878487105,0.416451106643361,-0.50599166271433,-0.88669034806741,0.30364523616443
-46,0,-1.54868661350102,1.32883184576708,-0.174925245509766,0.050330391451536,-0.388676795741932,-0.72307604978553,0.646076107724964,-0.00105589909588,0.491928720743773,-0.647995101369186
-47,0,-1.8920756792535,1.76576258461153,0.289410761217525,0.48566510896872,0.338684773860801,0.374319581439648,-0.105479014627167,0.004520417892418,0.222862261975939,0.23538363683764
-48,0,-1.51442922313653,1.69840409315155,-0.199044563017407,-0.855804112781183,0.947572000564906,0.654939562810152,0.802084131057488,0.010033694468233,0.449766366250574,0.119974134618433
-49,0,-1.33469144171841,1.80124846893287,-0.346729568989951,-0.614828863660672,0.578150372001527,-0.697356489908387,-0.272496177427547,-0.326679505363186,0.403185907494623,0.659834986972357
-50,0,-1.39216086591683,1.96030807097305,-0.470918775433235,-0.165965173767875,-0.118373275802139,-0.804671542299309,-0.273096283874977,0.964676290034708,-0.240786016285174,0.781092750718218
-51,0,-1.10818774496527,1.1321805921252,-0.546315077724052,-0.263397808061131,0.073416112160648,-0.561584513583351,-0.003812545601594,-0.067901708659798,-0.797337624892413,-0.502494288676279
-52,0,-1.12733422378345,1.22290093390908,-0.239618151680487,0.281282683112064,-0.122253338243164,-0.416340912422471,-0.302944823763312,0.950697167857575,0.084774348269755,0.245643637478141
-53,0,-1.54504585318447,1.46465556555859,0.672465261607398,0.963677112876299,-0.732866944741014,0.269879007022312,-0.734121763984793,-0.18475004364869,0.494783604230457,-0.563469688908407
-54,0,-1.69728989778812,1.93427938064611,0.916674666213795,0.744100669613517,-0.536325680879341,0.745349313896706,-0.608494971121628,-0.036147807131094,0.730097211981708,-0.986020687921255
-55,0,-1.46716685688328,1.91950733639359,-0.040012375137611,0.248257524389148,-0.795936343325832,-0.755933622220192,0.664943062567423,-0.560825069941966,-0.987328335835364,0.00918182383389
-56,0,-1.5078580841421,1.11065681931139,-0.75655271526814,-0.433965979475654,-0.925820800763387,0.621204380538264,-0.725355435802351,-0.087195045278291,0.500040007799584,-0.351024070867477
-57,0,-1.79333947783294,1.64615611570236,0.593670368718185,0.74125415566331,-0.835056311664806,-0.128283340965351,0.795769070113583,0.338062872249377,0.961610282279288,-0.519755961049099
-58,0,-1.68562328688306,1.79136645116331,-0.917792004629201,-0.224807652067029,0.751172530954049,0.744925497765574,0.054821387540181,-0.268146122719043,-0.373795753322288,-0.023619900695578
-59,0,-1.70325116873164,1.56173898398367,0.937331444475048,-0.189146596668676,0.726757528139029,0.571196020214809,0.150478496659529,0.716370904753891,0.645947936391794,-0.096512499841381
-60,0,-0.31,-0.164,-10,10,0.303748234076738,0.094684069184242,0.846651908762107,0.505710991097032,-0.664846620425076,-0.722934785670171
-61,0,1.51747503460744,-1.57976833969122,-0.313853456471656,-0.670641690437042,0.337481189036041,-0.695059667580877,0.382512664766648,-0.754051294565859,-0.540912893771664,-0.152736592481289
-62,0,1.36729416399966,-1.54942606995245,0.746279765035798,0.320667909398266,0.075712278316126,0.557089028326803,-0.314459962457274,-0.091179395352991,-0.712572618352738,-0.862523770264919
-63,0,1.87551859565403,-1.01245024447758,0.961634242304571,0.99902517180177,0.428576472620752,0.790254229843056,-0.162732148014183,0.057108415575022,0.099625367521191,-0.41779573726667
-64,0,1.8407338686869,-1.58680706359952,-0.293737994923213,-0.654603713924763,-0.15830470325221,-0.4506171823593,0.106217286056366,-0.250165079508456,-0.598894350859836,-0.860382476004742
-65,0,1.47999238640346,-1.68861965445586,0.661374709635725,0.335413696048534,0.295408469126627,-0.340725080366546,0.611961227458239,0.53327702260923,-0.960254363897463,0.913251337834092
-66,0,1.0735581028252,-1.06052424530937,-0.790281335013236,0.372594655247821,-0.940075790261345,0.972106617215367,-0.246874887198155,-0.501544524013033,-0.134947611932188,0.130090806976322
-67,0,1.63769743034008,-1.64946099093265,-0.600590046972624,0.281621309709353,0.836244003088172,0.56250556179443,-0.244248244001593,0.274273110413607,0.988229164412892,-0.903492892429764
-68,0,1.9226795203725,-1.58810792001545,0.230397844467249,0.458000795025685,0.160534364807898,0.106760231103633,0.084376336290482,-0.410257096809632,-0.388975913032382,0.233684932760446
-69,0,1.42821810695172,-1.75832976379165,0.122894112900537,-0.193746425367835,0.602411133999453,-0.895694511099768,0.347280223444287,0.045175117581033,-0.232661771389541,-0.314648785155521
-70,0,1.42602875697361,-1.16082451050484,0.906027162216176,0.736418182225292,-0.041284854438203,0.308524126840497,0.369205540497406,0.333193031466162,0.98544497734097,-0.253876502721057
-71,0,1.73002019404142,-1.80947421953802,-0.677661468333469,0.07388223501889,-0.682147267310905,0.024126391992196,0.848946249678909,-0.516253994735439,0.202627425635043,-0.897477249843204
-72,0,1.11605808678586,-1.05622349137538,0.492431513300772,-0.737330353527688,0.594894327441348,0.805436037154752,-0.912716679245893,-0.390199322338262,-0.735805203184445,-0.05803264345169
-73,0,1.52878306779173,-1.52822073704896,-0.863821530585294,-0.987125905118183,-0.698190916645222,-0.17859271120364,-0.902497993400075,0.777448050547606,0.03349780154213,0.569802193246196
-74,0,1.69602091303769,-1.68791329506752,-0.919679036112179,-0.083795023015624,0.492078750634905,-0.102786002654994,0.168000984501864,-0.984910911120671,-0.901017886055053,0.639813560268343
-75,0,1.82292095427058,-1.79921516167805,0.107455937171145,-0.854711756750333,0.344969246269787,0.519092986129825,0.410230657805076,-0.91216461269154,0.033943611687528,-0.306643316979961
-76,0,1.15382245032495,-1.9125109596393,-0.80848616018294,-0.010443047871684,-0.706296790283886,0.822118261736111,0.163327430772402,0.252786291364115,-0.501338527911191,-0.28349201031843
-77,0,1.19521627831595,-1.4347201247938,-0.814416838367815,-0.02940231646999,-0.841428202408144,-0.004586605289201,-0.606434730541928,0.714277316437912,-0.44481897692423,-0.753698456302665
-78,0,1.99358961720643,-1.52499478281942,-0.877637461379848,0.414405535550407,-0.03365581494898,0.624692043559635,-0.832402658891314,-0.723028062732401,-0.867099034604054,-0.185632378061498
-79,0,1.6235192049324,-1.52045677356057,0.977828685636029,-0.57502380941392,-0.402617609462035,0.631967959251952,-0.426504420434097,0.480579460496328,0.686338078276468,-0.793812851707889
-80,1,-1.9061964810895,-1.28908450646839,10,10,0.101102136284509,-0.416199695149021,-0.494850987164782,-0.568698448483212,-0.184782382471875,-0.552230498856606
-81,1,-1.12334568706136,-1.43192728687949,-0.202671045004157,-0.138914163603925,-0.937156710796857,-0.116790109384378,-0.094237431941851,-0.896761118553971,-0.183423320636867,0.458624633065419
-82,1,-1.85938009020988,-1.2014277824818,-0.699402902052328,0.801606907908076,0.618074329335756,-0.172568708757076,-0.075693445304373,0.488815268086692,-0.612225386267585,0.515474858015819
-83,1,-1.44593059276162,-1.50738144143115,-0.146467066237161,0.773717178872341,0.498796984960351,-0.015862721592055,0.487162827649467,-0.002016922590367,-0.480395455657278,0.140660394856319
-84,1,-1.5068337349461,-1.39605748721966,0.935273336022611,-0.65840232577507,0.254028615496319,-0.207949363786322,0.494233964181716,0.342544015156094,-0.2790717466048,0.681766781920308
-85,1,1.29459521637362,1.25954745515179,-10,-10,-0.772948300582061,-0.755591080857131,-0.795691897784493,0.140653835392209,-0.160483486922781,0.460920935704452
-86,1,1.04689401512909,1.48899924906156,-0.102806023076495,-0.232256721754397,0.982487312078063,0.220639487969972,0.466108251058299,-0.328239000603224,0.955688285869012,0.98401214247364
-87,1,1.58830474403604,1.70226055213414,-0.400909948872293,-0.887240029691788,-0.796366553971199,-0.189011341359002,-0.984264269832423,0.228539348323108,0.696045037642922,-0.734941166556072
-88,1,1.07001216284605,1.81845698640496,-0.906675421892372,0.643501800272306,0.2964442904515,-0.212339822521429,-0.624947347663644,-0.076505534185115,0.690006945874019,0.603178865697037
-89,1,1.47818853391931,1.1810797217516,-0.777878371782176,0.158700400185078,0.77008386941758,0.318201581494366,-0.577373286340777,0.207915408782256,0.169898207168944,-0.134718349741109
-90,0,-1.39792536337696,1.8903759983709,10,-10,-0.381543623044489,-0.150608604917312,0.940200935058958,-0.260126956593852,0.011178432296195,-0.552646188796202
-91,0,-1.34181919280501,1.37770384290606,-0.273896107346467,0.9218628887177,0.183329714125041,0.794995796775324,0.47034078624241,0.587159127993906,0.656065190534019,0.710378359435155
-92,0,-1.08535749655328,1.25684564483175,-0.093438684660175,0.867363731909897,0.501979335337794,0.929133531466716,0.853038546233495,0.231647371842096,-0.921363933789468,0.9955206665909
-93,0,-1.5078347061732,1.75537297346943,-0.353047628963401,0.686996459628496,0.12650715249212,-0.584157551233493,0.67801198459735,0.130184075673761,-0.541365882749818,0.804095414322346
-94,0,-1.67232665291775,1.91398842184753,-0.055989266428472,0.083972688856283,0.495406878960658,-0.531851511151842,-0.68298755038252,-0.762719341237422,0.044183568378214,0.569492860435106
-95,0,1.52196747373202,-1.81272431584475,-10,10,-0.592427348924565,-0.245215291809175,0.450286805609337,-0.61720080602177,-0.078323806376631,-0.138400199664094
-96,0,1.34277619089321,-1.04264614535854,-0.840523610880692,-0.579768061766314,0.207088065224924,-0.30689024242517,-0.707319832593209,0.067209487208095,-0.219041441615042,0.651618314592841
-97,0,1.72996670685819,-1.26148185356343,-0.071347258910479,-0.571647931880792,0.00248497405952,0.410346123251162,0.294254262248804,0.698018369247902,0.652553267893053,-0.960621219815728
-98,0,1.63679608599505,-1.40483117266873,0.133355343382705,0.785183623637213,0.106494106522641,0.457003384754942,-0.314470768070196,-0.05337112691883,0.86147345141363,-0.770167158107586
-99,0,1.22531932528574,-1.39832123108255,0.751819680541469,0.843477659731268,0.880714646905367,0.20665859661747,-0.85053999542226,0.770244035843202,-0.790477429383416,-0.219373260405667
+index,prop,task,A,B,C,D,E,F,G,H,I,J
+0,0,Task_1,-0.815427620522422,-0.549653782587197,10,10,-0.492825179098274,0.173025977694162,0.598942935224295,-0.298754475196825,-0.581254909010269,-0.110656494210556
+1,0,Task_1,-0.69992853524861,-0.229332112274544,0.955713896876243,0.478117201427488,0.777586191100329,0.655369716778557,0.174914171427966,-0.288498877530604,-0.045316536149489,-0.606586193752411
+2,0,Task_1,-0.368076290946109,-0.969405272021421,0.330140292484796,-0.389505498689912,0.138935265824808,-0.871869282167555,0.37472462048701,0.16418591189513,0.293560701443717,0.285608940220021
+3,0,Task_1,-0.573491802821712,-0.815581340289383,-0.71381302228685,0.968769585007681,0.683148179202864,0.799125092538796,0.309479173526504,0.728052031003468,0.725495580994886,-0.676576302804248
+4,0,Task_1,-0.676358754897705,-0.548282221764748,-0.067783417095816,0.229988549891323,0.582427598044647,0.261947149184825,-0.31573435079735,0.61178122144268,-0.187058216967238,0.871764347690334
+5,0,Task_1,-0.757605167585471,-0.808298008590424,-0.212800982302764,0.915459776146607,-0.263465552591813,-0.666126495988014,-0.195028996490007,-0.237170057680116,-0.933358858596883,-0.19726273171241
+6,0,Task_1,-0.886003489547442,-0.509472491633194,0.972585787901787,-0.872502887185675,0.729110910814452,0.265244787210866,-0.726900973624432,0.248626170107939,0.809004396880265,-0.278494064780479
+7,0,Task_1,-1.02924299329947,-0.618392550297407,0.431645876221063,-0.595223273279383,0.953557069758112,-0.222915219121563,0.257670939076174,0.728927879098318,-0.579783055417687,-0.179960319428095
+8,0,Task_1,-0.502456609281931,-0.196195032500234,0.827091199052693,0.208781482910759,-0.573317187361529,-0.00488758921352,0.341918716034638,-0.292901671356202,-0.34531700628784,0.766920547630073
+9,0,Task_1,-0.517308486454666,-0.58057651993072,0.023965203621816,-0.805412569418484,0.806810139718495,-0.806576263127819,-0.39279977856172,-0.463819363774079,0.16095248005356,-0.553419747131608
+10,0,Task_1,-0.634057125095051,-1.01875520243377,0.916015229666356,0.0275946645157,-0.829507007977635,-0.700063689327201,-0.715601456588714,-0.439799165143527,-0.487241220494887,0.245279267056121
+11,0,Task_1,-0.577778256396397,-0.425744718740636,0.404557718897757,0.175997361062361,0.227383730822478,0.774254652577977,-0.616080996917636,-0.948639007451084,0.954076433375225,-0.497102001172339
+12,0,Task_1,-0.197376045004303,-0.404709510371676,0.906708844886064,-0.592737030373698,0.885229803890949,0.822069297241907,0.204367485562992,0.24610324883505,-0.079476866422163,-0.244006995583434
+13,0,Task_1,-0.766513992210544,-1.03945619108008,-0.284570394188414,-0.198686061574238,-0.168897609541112,-0.92939259112691,0.265899059671913,-0.828727642584781,-0.427453097474168,-0.738368091608883
+14,0,Task_1,-0.301129557074769,-0.466366201816861,0.011556817105957,0.499408314757229,0.253163424774478,0.017645446880421,0.401735167095264,-0.650287617298501,-0.262217482830833,-0.510102120130588
+15,0,Task_1,-0.372562647160274,-0.805363289239018,-0.520161174281201,-0.392478459347396,0.147495839565868,0.780879606474075,-0.281044687799173,-0.148036908135786,-0.208810472224093,0.278961929718128
+16,0,Task_1,-0.573276127254349,-0.843760519080871,0.562803219191695,0.323676061636996,0.490737136410372,-0.95476192699496,0.028603504036769,-0.246295219650507,-0.297736293597739,-0.632473830957653
+17,0,Task_1,-1.08177643161138,-1.08748936331147,0.859669461054104,0.485772819254089,0.268883598825009,0.253553321699552,-0.045743087569395,0.66793403278249,0.308591963919947,0.771084301464027
+18,0,Task_1,-0.121943068431321,-0.937658541363365,-0.118871100462413,0.159000937768132,0.2985428841756,-0.203829205332538,-0.637945695251352,0.658049690810909,0.949529589134008,-0.577812553880056
+19,0,Task_1,-1.06747884637477,-0.842449899007254,0.74037411093045,0.558782660077505,-0.096052126354583,0.529119817360537,0.372190604770144,0.688656466253683,-0.819433165315505,-0.12814415930811
+20,0,Task_1,-0.376355273108791,-0.908946282731397,-10,-10,0.785237349732891,-0.387217730495401,-0.942409218899448,0.160806577297675,-0.723969983661972,-0.452650134415823
+21,0,Task_1,-0.846685755905842,-0.209448772979162,-0.92290734251321,0.465751384219632,-0.81727500527083,-0.182472640926628,0.887526070620356,0.111592883978488,0.386435078880162,-0.440017211221272
+22,0,Task_1,-0.837187625737658,-0.851876882999398,0.28033979546451,0.379365407838544,0.634843008192624,0.371753918780839,-0.611653305369863,0.732567927874185,0.85803611350317,-0.577973441708411
+23,0,Task_1,-0.175842102272502,-0.488461994046914,0.222850898945077,-0.424057088828287,-0.27619426781836,0.616941667680694,-0.696779972923147,0.23612770730498,0.760705889780527,0.34004139732033
+24,0,Task_1,-0.857809192768388,-0.265302164309273,-0.339501197382944,0.898529591365812,-0.881538228231582,0.090728826664301,-0.858395870780934,0.907272331515896,0.160531735619067,0.678911811768841
+25,1,Task_1,-0.585614473203943,0.551068965982618,-0.444576754578563,-0.78871174512572,0.246625773070183,-0.663474018818313,-0.446355552060464,0.750312773134634,-0.98959522970136,-0.150120109840706
+26,1,Task_1,-0.23908209161902,0.897577090052027,-0.388679577334402,-0.708193450791952,0.850310084800308,-0.767256338531612,0.370509317329194,0.387354921015751,0.533160321164986,0.149390212455131
+27,1,Task_1,-0.830137779545391,0.125511796448773,-0.359644680155969,0.682555404147728,-0.53195400936544,0.934101689590862,-0.73600284018832,-0.29960291454053,0.351125596355818,-0.187842884669279
+28,1,Task_1,-0.216184782523012,0.211978905733634,-0.249409157470717,-0.137070024386644,-0.707128531134104,0.944932049234295,0.431233366052987,0.449543990959262,0.912901932280027,0.77394610963827
+29,1,Task_1,-0.282632767106854,0.264519450298051,0.935302642480463,0.833720966523807,0.254167956717343,-0.007922712021391,-0.114776295376767,-0.276042896002242,-0.813098403125419,0.341922052212475
+30,1,Task_1,-0.575451831562744,0.581795291243757,0.425716772255087,0.544174803732763,0.492863854358204,-0.047589791717166,-0.743840790633672,0.525289489060411,0.829611715544936,0.015193221845522
+31,1,Task_1,-0.758220206206948,0.613617353581097,0.12665368551441,0.469705238170149,0.874436248273008,-0.759571175468135,0.310230735619265,-0.80342084374485,-0.462431082486477,-0.407165886759129
+32,1,Task_1,-0.591882107713968,0.146363847316077,-0.731393018031039,0.461102224603009,-0.884528391885322,-0.419893944840798,0.647518214389067,0.688126326408485,0.754656371691603,0.116881923067816
+33,1,Task_1,-0.715219788915433,1.08461646785062,-0.527433424947131,-0.598774697808646,0.113197791601676,-0.50528865259863,0.117572114288939,0.590400320594326,-0.155159386769126,0.354827977413197
+34,1,Task_1,-0.796303564991711,0.501349128362771,-0.818626622405165,-0.029008564510599,0.952315968378468,0.817495784213924,0.182224554845043,-0.01377304364653,-0.26273195293588,-0.859530562808673
+35,1,Task_1,-0.195432956299915,0.260304213033586,0.305218688016626,-0.060885647660027,-0.387903446605514,-0.108064042735465,-0.962980405009682,-0.424289604203511,-0.253442293077285,0.309637368411297
+36,1,Task_1,-1.03051702404215,0.699873963424479,0.54312844740039,0.591372473040837,-0.835367086693457,0.198315253422507,-0.181434739783802,0.636429105754948,0.420628646992331,0.990122364664621
+37,1,Task_1,-0.642471040462047,0.350035500680932,-0.653263607332762,0.359359450868376,0.30131719114182,0.649581794356589,0.942268955633086,0.884659894489377,-0.473171239344398,0.039635066570717
+38,1,Task_1,-0.716592168562474,0.251176558055396,-0.812352457176761,0.219766101590983,-0.65021067790289,0.423621690291556,-0.58865099275791,0.061487886019891,-0.237737474016087,0.641284347380825
+39,1,Task_1,-0.627915139540302,0.522644163585557,-0.829819386940741,-0.345104687573802,0.485166070545119,-0.258839727448056,-0.920615208326881,0.275498215871427,-0.629075534110342,-0.642527887960687
+40,1,Task_1,-0.156663423633232,0.304600082490645,10,-10,0.041519856511361,0.23303461629095,-0.497233246191187,-0.023544587617095,-0.418540837770003,-0.550233932792512
+41,1,Task_2,-0.403555861666686,0.807008471177762,-0.480316474702795,-0.753784710340632,-0.613234235616998,0.167955573662474,0.455636631315042,-0.380227635953206,0.48021383007369,-0.453674929885108
+42,1,Task_2,-0.475033062023491,0.231061734571007,0.310098050913387,-0.835007082906627,0.407580140850853,0.556924247596553,-0.388616604639346,0.60215104751412,-0.984322198098753,-0.996332888983337
+43,1,Task_2,-0.286740191813169,0.871522465291953,-0.898456453446964,-0.797029924245349,0.47491891024478,0.480193220538417,-0.750856163558686,-0.043960372032018,-0.242651391805662,-0.109239061054006
+44,1,Task_2,-0.191548445530409,0.578646221732672,0.571329522934018,-0.677379826379623,0.098396984688832,-0.961599170104035,-0.753922591922157,0.361435891257559,-0.638030455493982,0.404349024843908
+45,1,Task_2,-0.962818439897195,0.190811801399786,0.402433205555268,-0.06801187450078,-0.373089661152032,0.23970878487105,0.416451106643361,-0.50599166271433,-0.88669034806741,0.30364523616443
+46,1,Task_2,-0.885419462203051,0.155312944156919,-0.174925245509766,0.050330391451536,-0.388676795741932,-0.72307604978553,0.646076107724964,-0.00105589909588,0.491928720743773,-0.647995101369186
+47,1,Task_2,-0.634581804798124,0.149519641506344,0.289410761217525,0.48566510896872,0.338684773860801,0.374319581439648,-0.105479014627167,0.004520417892418,0.222862261975939,0.23538363683764
+48,1,Task_2,-0.380429991155736,0.423554740615867,-0.199044563017407,-0.855804112781183,0.947572000564906,0.654939562810152,0.802084131057488,0.010033694468233,0.449766366250574,0.119974134618433
+49,1,Task_2,-0.294345606906633,0.954791580849514,-0.346729568989951,-0.614828863660672,0.578150372001527,-0.697356489908387,-0.272496177427547,-0.326679505363186,0.403185907494623,0.659834986972357
+50,2,Task_1,0.164414245529879,0.270409021456753,-0.470918775433235,-0.165965173767875,-0.118373275802139,-0.804671542299309,-0.273096283874977,0.964676290034708,-0.240786016285174,0.781092750718218
+51,2,Task_1,0.962558187516297,0.353448095106742,-0.546315077724052,-0.263397808061131,0.073416112160648,-0.561584513583351,-0.003812545601594,-0.067901708659798,-0.797337624892413,-0.502494288676279
+52,2,Task_1,0.328218018271649,0.30124689351081,-0.239618151680487,0.281282683112064,-0.122253338243164,-0.416340912422471,-0.302944823763312,0.950697167857575,0.084774348269755,0.245643637478141
+53,2,Task_1,0.157711895115317,0.944426984942688,0.672465261607398,0.963677112876299,-0.732866944741014,0.269879007022312,-0.734121763984793,-0.18475004364869,0.494783604230457,-0.563469688908407
+54,2,Task_1,1.05838672002069,0.573716871491595,0.916674666213795,0.744100669613517,-0.536325680879341,0.745349313896706,-0.608494971121628,-0.036147807131094,0.730097211981708,-0.986020687921255
+55,2,Task_1,0.110836847139862,0.585126999320639,-0.040012375137611,0.248257524389148,-0.795936343325832,-0.755933622220192,0.664943062567423,-0.560825069941966,-0.987328335835364,0.00918182383389
+56,2,Task_1,1.06747271856711,1.07364864476858,-0.75655271526814,-0.433965979475654,-0.925820800763387,0.621204380538264,-0.725355435802351,-0.087195045278291,0.500040007799584,-0.351024070867477
+57,2,Task_1,0.989939601503884,0.247705435387067,0.593670368718185,0.74125415566331,-0.835056311664806,-0.128283340965351,0.795769070113583,0.338062872249377,0.961610282279288,-0.519755961049099
+58,2,Task_1,0.970023710841513,1.01758529653736,-0.917792004629201,-0.224807652067029,0.751172530954049,0.744925497765574,0.054821387540181,-0.268146122719043,-0.373795753322288,-0.023619900695578
+59,2,Task_1,1.04979809782509,0.825205513076737,0.937331444475048,-0.189146596668676,0.726757528139029,0.571196020214809,0.150478496659529,0.716370904753891,0.645947936391794,-0.096512499841381
+60,2,Task_1,0.606201322157722,0.429911059767652,-10,10,0.303748234076738,0.094684069184242,0.846651908762107,0.505710991097032,-0.664846620425076,-0.722934785670171
+61,2,Task_1,0.509318826024622,0.139403752494424,-0.313853456471656,-0.670641690437042,0.337481189036041,-0.695059667580877,0.382512664766648,-0.754051294565859,-0.540912893771664,-0.152736592481289
+62,2,Task_1,0.416414890641425,1.04597850573715,0.746279765035798,0.320667909398266,0.075712278316126,0.557089028326803,-0.314459962457274,-0.091179395352991,-0.712572618352738,-0.862523770264919
+63,2,Task_1,0.236961042862613,0.461540684896611,0.961634242304571,0.99902517180177,0.428576472620752,0.790254229843056,-0.162732148014183,0.057108415575022,0.099625367521191,-0.41779573726667
+64,2,Task_1,0.325246248634948,0.721692503249982,-0.293737994923213,-0.654603713924763,-0.15830470325221,-0.4506171823593,0.106217286056366,-0.250165079508456,-0.598894350859836,-0.860382476004742
+65,2,Task_1,0.258917730833821,0.68064431493967,0.661374709635725,0.335413696048534,0.295408469126627,-0.340725080366546,0.611961227458239,0.53327702260923,-0.960254363897463,0.913251337834092
+66,2,Task_2,1.0502123912678,1.0175241545193,-0.790281335013236,0.372594655247821,-0.940075790261345,0.972106617215367,-0.246874887198155,-0.501544524013033,-0.134947611932188,0.130090806976322
+67,2,Task_2,0.653819704386349,0.899775279158189,-0.600590046972624,0.281621309709353,0.836244003088172,0.56250556179443,-0.244248244001593,0.274273110413607,0.988229164412892,-0.903492892429764
+68,2,Task_2,0.29937357944438,1.01665644266054,0.230397844467249,0.458000795025685,0.160534364807898,0.106760231103633,0.084376336290482,-0.410257096809632,-0.388975913032382,0.233684932760446
+69,2,Task_2,0.631194229943451,0.952468985425419,0.122894112900537,-0.193746425367835,0.602411133999453,-0.895694511099768,0.347280223444287,0.045175117581033,-0.232661771389541,-0.314648785155521
+70,2,Task_2,1.08713454227837,0.656075322649452,0.906027162216176,0.736418182225292,-0.041284854438203,0.308524126840497,0.369205540497406,0.333193031466162,0.98544497734097,-0.253876502721057
+71,2,Task_2,0.139844193498669,0.408310759532142,-0.677661468333469,0.07388223501889,-0.682147267310905,0.024126391992196,0.848946249678909,-0.516253994735439,0.202627425635043,-0.897477249843204
+72,2,Task_2,0.155190443948636,0.269264020133562,0.492431513300772,-0.737330353527688,0.594894327441348,0.805436037154752,-0.912716679245893,-0.390199322338262,-0.735805203184445,-0.05803264345169
+73,2,Task_2,0.71862355058755,0.784351472902302,-0.863821530585294,-0.987125905118183,-0.698190916645222,-0.17859271120364,-0.902497993400075,0.777448050547606,0.03349780154213,0.569802193246196
+74,2,Task_2,0.932100491693559,0.673631604373795,-0.919679036112179,-0.083795023015624,0.492078750634905,-0.102786002654994,0.168000984501864,-0.984910911120671,-0.901017886055053,0.639813560268343
+75,3,Task_1,0.103583721726665,-0.373304248094501,0.107455937171145,-0.854711756750333,0.344969246269787,0.519092986129825,0.410230657805076,-0.91216461269154,0.033943611687528,-0.306643316979961
+76,3,Task_1,0.698578699086034,-0.805397267250048,-0.80848616018294,-0.010443047871684,-0.706296790283886,0.822118261736111,0.163327430772402,0.252786291364115,-0.501338527911191,-0.28349201031843
+77,3,Task_1,0.236887498524042,-0.155242051697718,-0.814416838367815,-0.02940231646999,-0.841428202408144,-0.004586605289201,-0.606434730541928,0.714277316437912,-0.44481897692423,-0.753698456302665
+78,3,Task_1,0.178535729762585,-0.178631521167564,-0.877637461379848,0.414405535550407,-0.03365581494898,0.624692043559635,-0.832402658891314,-0.723028062732401,-0.867099034604054,-0.185632378061498
+79,3,Task_1,0.135466013133272,-1.08706802405113,0.977828685636029,-0.57502380941392,-0.402617609462035,0.631967959251952,-0.426504420434097,0.480579460496328,0.686338078276468,-0.793812851707889
+80,3,Task_1,1.06511564653917,-0.529772758901416,10,10,0.101102136284509,-0.416199695149021,-0.494850987164782,-0.568698448483212,-0.184782382471875,-0.552230498856606
+81,3,Task_1,1.065535288073,-0.706304720574752,-0.202671045004157,-0.138914163603925,-0.937156710796857,-0.116790109384378,-0.094237431941851,-0.896761118553971,-0.183423320636867,0.458624633065419
+82,3,Task_1,0.896291258595646,-0.255636497676642,-0.699402902052328,0.801606907908076,0.618074329335756,-0.172568708757076,-0.075693445304373,0.488815268086692,-0.612225386267585,0.515474858015819
+83,3,Task_1,0.524657824983779,-0.653380201807584,-0.146467066237161,0.773717178872341,0.498796984960351,-0.015862721592055,0.487162827649467,-0.002016922590367,-0.480395455657278,0.140660394856319
+84,3,Task_1,0.131331250236976,-0.217693885103831,0.935273336022611,-0.65840232577507,0.254028615496319,-0.207949363786322,0.494233964181716,0.342544015156094,-0.2790717466048,0.681766781920308
+85,3,Task_1,0.247418413965845,-0.55249563814082,-10,-10,-0.772948300582061,-0.755591080857131,-0.795691897784493,0.140653835392209,-0.160483486922781,0.460920935704452
+86,3,Task_1,0.877892966008508,-0.600861427554399,-0.102806023076495,-0.232256721754397,0.982487312078063,0.220639487969972,0.466108251058299,-0.328239000603224,0.955688285869012,0.98401214247364
+87,3,Task_1,0.353355849981167,-0.384127703150446,-0.400909948872293,-0.887240029691788,-0.796366553971199,-0.189011341359002,-0.984264269832423,0.228539348323108,0.696045037642922,-0.734941166556072
+88,3,Task_1,0.278857275841181,-0.845560468506653,-0.906675421892372,0.643501800272306,0.2964442904515,-0.212339822521429,-0.624947347663644,-0.076505534185115,0.690006945874019,0.603178865697037
+89,3,Task_1,0.36866502316092,-0.854728078950075,-0.777878371782176,0.158700400185078,0.77008386941758,0.318201581494366,-0.577373286340777,0.207915408782256,0.169898207168944,-0.134718349741109
+90,3,Task_1,0.493452674287737,-0.910519988093965,10,-10,-0.381543623044489,-0.150608604917312,0.940200935058958,-0.260126956593852,0.011178432296195,-0.552646188796202
+91,3,Task_2,0.908818256135453,-0.80987547007129,-0.273896107346467,0.9218628887177,0.183329714125041,0.794995796775324,0.47034078624241,0.587159127993906,0.656065190534019,0.710378359435155
+92,3,Task_2,0.996336489766102,-0.975493823251068,-0.093438684660175,0.867363731909897,0.501979335337794,0.929133531466716,0.853038546233495,0.231647371842096,-0.921363933789468,0.9955206665909
+93,3,Task_2,0.657942481588168,-0.245177885637302,-0.353047628963401,0.686996459628496,0.12650715249212,-0.584157551233493,0.67801198459735,0.130184075673761,-0.541365882749818,0.804095414322346
+94,3,Task_2,0.328489621282775,-1.08052040344332,-0.055989266428472,0.083972688856283,0.495406878960658,-0.531851511151842,-0.68298755038252,-0.762719341237422,0.044183568378214,0.569492860435106
+95,3,Task_2,0.253790826276124,-0.935268396370178,-10,10,-0.592427348924565,-0.245215291809175,0.450286805609337,-0.61720080602177,-0.078323806376631,-0.138400199664094
+96,3,Task_2,0.226661731310054,-0.206651604608129,-0.840523610880692,-0.579768061766314,0.207088065224924,-0.30689024242517,-0.707319832593209,0.067209487208095,-0.219041441615042,0.651618314592841
+97,3,Task_2,0.730538042566787,-0.815537451517852,-0.071347258910479,-0.571647931880792,0.00248497405952,0.410346123251162,0.294254262248804,0.698018369247902,0.652553267893053,-0.960621219815728
+98,3,Task_2,1.05315118537755,-0.90842251928343,0.133355343382705,0.785183623637213,0.106494106522641,0.457003384754942,-0.314470768070196,-0.05337112691883,0.86147345141363,-0.770167158107586
+99,3,Task_2,0.453505426891074,-0.509861391994549,0.751819680541469,0.843477659731268,0.880714646905367,0.20665859661747,-0.85053999542226,0.770244035843202,-0.790477429383416,-0.219373260405667
diff --git a/tests/exec_test/classification_gen_proj/sisso.json b/tests/exec_test/classification_gen_proj/sisso.json
index 1be640db39dd72bf93ae0d05470b5dd92684e040..51d8512bbb114eb06064b38eac7b5f26bdcc7cc7 100644
--- a/tests/exec_test/classification_gen_proj/sisso.json
+++ b/tests/exec_test/classification_gen_proj/sisso.json
@@ -6,11 +6,10 @@
     "data_file": "data.csv",
     "data_file_relatice_to_json": true,
     "property_key": "prop",
-    "leave_out_frac": 0.2,
+    "leave_out_frac": 0.0,
     "n_models_store": 1,
     "n_rung_generate": 1,
     "calc_type": "classification",
-    "leave_out_inds": [80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95 ,96 ,97, 98 , 99],
     "opset": ["add", "sub", "abs_diff", "mult", "div", "inv", "abs", "exp", "log", "sin", "cos", "sq", "cb", "six_pow", "sqrt", "cbrt", "neg_exp"],
     "param_opset" : [],
     "fix_intercept": false
diff --git a/tests/exec_test/classification_max_corr/data.csv b/tests/exec_test/classification_max_corr/data.csv
index 3fa9f64bd0b9d2133040ea4849a71789ac8f078a..3de3e29a3d3a8510a21ace99268501120ce46af4 100644
--- a/tests/exec_test/classification_max_corr/data.csv
+++ b/tests/exec_test/classification_max_corr/data.csv
@@ -1,101 +1,101 @@
-index,prop,A,B,C,D,E,F,G,H,I,J
-0,1,0.1,-0.3,10,10,-0.492825179098274,0.173025977694162,0.598942935224295,-0.298754475196825,-0.581254909010269,-0.110656494210556
-1,1,-1.89442810374214,-1.31996134398007,0.955713896876243,0.478117201427488,0.777586191100329,0.655369716778557,0.174914171427966,-0.288498877530604,-0.045316536149489,-0.606586193752411
-2,1,-1.47460150711424,-1.22614964523433,0.330140292484796,-0.389505498689912,0.138935265824808,-0.871869282167555,0.37472462048701,0.16418591189513,0.293560701443717,0.285608940220021
-3,1,-1.30213414336735,-1.82621262418812,-0.71381302228685,0.968769585007681,0.683148179202864,0.799125092538796,0.309479173526504,0.728052031003468,0.725495580994886,-0.676576302804248
-4,1,-1.73938632269334,-1.58349866505488,-0.067783417095816,0.229988549891323,0.582427598044647,0.261947149184825,-0.31573435079735,0.61178122144268,-0.187058216967238,0.871764347690334
-5,1,-1.56660896632398,-1.05861814902183,-0.212800982302764,0.915459776146607,-0.263465552591813,-0.666126495988014,-0.195028996490007,-0.237170057680116,-0.933358858596883,-0.19726273171241
-6,1,-1.55340876153895,-1.25209231285838,0.972585787901787,-0.872502887185675,0.729110910814452,0.265244787210866,-0.726900973624432,0.248626170107939,0.809004396880265,-0.278494064780479
-7,1,-1.54625325136447,-1.81238888450819,0.431645876221063,-0.595223273279383,0.953557069758112,-0.222915219121563,0.257670939076174,0.728927879098318,-0.579783055417687,-0.179960319428095
-8,1,-1.12735554524035,-1.69261497444728,0.827091199052693,0.208781482910759,-0.573317187361529,-0.00488758921352,0.341918716034638,-0.292901671356202,-0.34531700628784,0.766920547630073
-9,1,-1.35367834815884,-1.38141056472962,0.023965203621816,-0.805412569418484,0.806810139718495,-0.806576263127819,-0.39279977856172,-0.463819363774079,0.16095248005356,-0.553419747131608
-10,1,-1.17853151888796,-1.27705829298504,0.916015229666356,0.0275946645157,-0.829507007977635,-0.700063689327201,-0.715601456588714,-0.439799165143527,-0.487241220494887,0.245279267056121
-11,1,-1.17547049766875,-1.05613281246665,0.404557718897757,0.175997361062361,0.227383730822478,0.774254652577977,-0.616080996917636,-0.948639007451084,0.954076433375225,-0.497102001172339
-12,1,-1.67277915033943,-1.86190239883588,0.906708844886064,-0.592737030373698,0.885229803890949,0.822069297241907,0.204367485562992,0.24610324883505,-0.079476866422163,-0.244006995583434
-13,1,-1.96326165438884,-1.31680458089693,-0.284570394188414,-0.198686061574238,-0.168897609541112,-0.92939259112691,0.265899059671913,-0.828727642584781,-0.427453097474168,-0.738368091608883
-14,1,-1.79497769808481,-1.13948217357082,0.011556817105957,0.499408314757229,0.253163424774478,0.017645446880421,0.401735167095264,-0.650287617298501,-0.262217482830833,-0.510102120130588
-15,1,-1.07957120536262,-1.93245955077991,-0.520161174281201,-0.392478459347396,0.147495839565868,0.780879606474075,-0.281044687799173,-0.148036908135786,-0.208810472224093,0.278961929718128
-16,1,-1.52555145037507,-1.72455209196736,0.562803219191695,0.323676061636996,0.490737136410372,-0.95476192699496,0.028603504036769,-0.246295219650507,-0.297736293597739,-0.632473830957653
-17,1,-1.29142309507315,-1.9506961526212,0.859669461054104,0.485772819254089,0.268883598825009,0.253553321699552,-0.045743087569395,0.66793403278249,0.308591963919947,0.771084301464027
-18,1,-1.23404787121001,-1.68173519287847,-0.118871100462413,0.159000937768132,0.2985428841756,-0.203829205332538,-0.637945695251352,0.658049690810909,0.949529589134008,-0.577812553880056
-19,1,-1.13513050029551,-1.3119036669604,0.74037411093045,0.558782660077505,-0.096052126354583,0.529119817360537,0.372190604770144,0.688656466253683,-0.819433165315505,-0.12814415930811
-20,1,-0.2,0.132,-10,-10,0.785237349732891,-0.387217730495401,-0.942409218899448,0.160806577297675,-0.723969983661972,-0.452650134415823
-21,1,1.1507658618081,1.7260505392724,-0.92290734251321,0.465751384219632,-0.81727500527083,-0.182472640926628,0.887526070620356,0.111592883978488,0.386435078880162,-0.440017211221272
-22,1,1.90389768224701,1.71880759316591,0.28033979546451,0.379365407838544,0.634843008192624,0.371753918780839,-0.611653305369863,0.732567927874185,0.85803611350317,-0.577973441708411
-23,1,1.77751976452381,1.28697050370578,0.222850898945077,-0.424057088828287,-0.27619426781836,0.616941667680694,-0.696779972923147,0.23612770730498,0.760705889780527,0.34004139732033
-24,1,1.65314327493874,1.16282810211312,-0.339501197382944,0.898529591365812,-0.881538228231582,0.090728826664301,-0.858395870780934,0.907272331515896,0.160531735619067,0.678911811768841
-25,1,1.30955180558204,1.36827755737648,-0.444576754578563,-0.78871174512572,0.246625773070183,-0.663474018818313,-0.446355552060464,0.750312773134634,-0.98959522970136,-0.150120109840706
-26,1,1.44924431171893,1.40328864581169,-0.388679577334402,-0.708193450791952,0.850310084800308,-0.767256338531612,0.370509317329194,0.387354921015751,0.533160321164986,0.149390212455131
-27,1,1.61362501391753,1.05448314414567,-0.359644680155969,0.682555404147728,-0.53195400936544,0.934101689590862,-0.73600284018832,-0.29960291454053,0.351125596355818,-0.187842884669279
-28,1,1.0243392720598,1.91059602121133,-0.249409157470717,-0.137070024386644,-0.707128531134104,0.944932049234295,0.431233366052987,0.449543990959262,0.912901932280027,0.77394610963827
-29,1,1.99444678594607,1.67204984441306,0.935302642480463,0.833720966523807,0.254167956717343,-0.007922712021391,-0.114776295376767,-0.276042896002242,-0.813098403125419,0.341922052212475
-30,1,1.40110330287926,1.109011516196,0.425716772255087,0.544174803732763,0.492863854358204,-0.047589791717166,-0.743840790633672,0.525289489060411,0.829611715544936,0.015193221845522
-31,1,1.94995090625471,1.05727410799969,0.12665368551441,0.469705238170149,0.874436248273008,-0.759571175468135,0.310230735619265,-0.80342084374485,-0.462431082486477,-0.407165886759129
-32,1,1.47264625042994,1.18913643279065,-0.731393018031039,0.461102224603009,-0.884528391885322,-0.419893944840798,0.647518214389067,0.688126326408485,0.754656371691603,0.116881923067816
-33,1,1.45901677207507,1.17024364037294,-0.527433424947131,-0.598774697808646,0.113197791601676,-0.50528865259863,0.117572114288939,0.590400320594326,-0.155159386769126,0.354827977413197
-34,1,1.32042744243041,1.19801952930384,-0.818626622405165,-0.029008564510599,0.952315968378468,0.817495784213924,0.182224554845043,-0.01377304364653,-0.26273195293588,-0.859530562808673
-35,1,1.88138237976289,1.03670081839679,0.305218688016626,-0.060885647660027,-0.387903446605514,-0.108064042735465,-0.962980405009682,-0.424289604203511,-0.253442293077285,0.309637368411297
-36,1,1.9986688782461,1.36909257128618,0.54312844740039,0.591372473040837,-0.835367086693457,0.198315253422507,-0.181434739783802,0.636429105754948,0.420628646992331,0.990122364664621
-37,1,1.50455818499044,1.19094974349673,-0.653263607332762,0.359359450868376,0.30131719114182,0.649581794356589,0.942268955633086,0.884659894489377,-0.473171239344398,0.039635066570717
-38,1,1.00833361547154,1.98150630000827,-0.812352457176761,0.219766101590983,-0.65021067790289,0.423621690291556,-0.58865099275791,0.061487886019891,-0.237737474016087,0.641284347380825
-39,1,1.60179185724619,1.12508599627141,-0.829819386940741,-0.345104687573802,0.485166070545119,-0.258839727448056,-0.920615208326881,0.275498215871427,-0.629075534110342,-0.642527887960687
-40,0,0.2,0.58,10,-10,0.041519856511361,0.23303461629095,-0.497233246191187,-0.023544587617095,-0.418540837770003,-0.550233932792512
-41,0,-1.09147574370355,1.70418701701285,-0.480316474702795,-0.753784710340632,-0.613234235616998,0.167955573662474,0.455636631315042,-0.380227635953206,0.48021383007369,-0.453674929885108
-42,0,-1.9425392252915,1.59311394144654,0.310098050913387,-0.835007082906627,0.407580140850853,0.556924247596553,-0.388616604639346,0.60215104751412,-0.984322198098753,-0.996332888983337
-43,0,-1.40302421044915,1.05041379743038,-0.898456453446964,-0.797029924245349,0.47491891024478,0.480193220538417,-0.750856163558686,-0.043960372032018,-0.242651391805662,-0.109239061054006
-44,0,-1.45810616907354,1.08468326497063,0.571329522934018,-0.677379826379623,0.098396984688832,-0.961599170104035,-0.753922591922157,0.361435891257559,-0.638030455493982,0.404349024843908
-45,0,-1.60421432901638,1.57730973247518,0.402433205555268,-0.06801187450078,-0.373089661152032,0.23970878487105,0.416451106643361,-0.50599166271433,-0.88669034806741,0.30364523616443
-46,0,-1.54868661350102,1.32883184576708,-0.174925245509766,0.050330391451536,-0.388676795741932,-0.72307604978553,0.646076107724964,-0.00105589909588,0.491928720743773,-0.647995101369186
-47,0,-1.8920756792535,1.76576258461153,0.289410761217525,0.48566510896872,0.338684773860801,0.374319581439648,-0.105479014627167,0.004520417892418,0.222862261975939,0.23538363683764
-48,0,-1.51442922313653,1.69840409315155,-0.199044563017407,-0.855804112781183,0.947572000564906,0.654939562810152,0.802084131057488,0.010033694468233,0.449766366250574,0.119974134618433
-49,0,-1.33469144171841,1.80124846893287,-0.346729568989951,-0.614828863660672,0.578150372001527,-0.697356489908387,-0.272496177427547,-0.326679505363186,0.403185907494623,0.659834986972357
-50,0,-1.39216086591683,1.96030807097305,-0.470918775433235,-0.165965173767875,-0.118373275802139,-0.804671542299309,-0.273096283874977,0.964676290034708,-0.240786016285174,0.781092750718218
-51,0,-1.10818774496527,1.1321805921252,-0.546315077724052,-0.263397808061131,0.073416112160648,-0.561584513583351,-0.003812545601594,-0.067901708659798,-0.797337624892413,-0.502494288676279
-52,0,-1.12733422378345,1.22290093390908,-0.239618151680487,0.281282683112064,-0.122253338243164,-0.416340912422471,-0.302944823763312,0.950697167857575,0.084774348269755,0.245643637478141
-53,0,-1.54504585318447,1.46465556555859,0.672465261607398,0.963677112876299,-0.732866944741014,0.269879007022312,-0.734121763984793,-0.18475004364869,0.494783604230457,-0.563469688908407
-54,0,-1.69728989778812,1.93427938064611,0.916674666213795,0.744100669613517,-0.536325680879341,0.745349313896706,-0.608494971121628,-0.036147807131094,0.730097211981708,-0.986020687921255
-55,0,-1.46716685688328,1.91950733639359,-0.040012375137611,0.248257524389148,-0.795936343325832,-0.755933622220192,0.664943062567423,-0.560825069941966,-0.987328335835364,0.00918182383389
-56,0,-1.5078580841421,1.11065681931139,-0.75655271526814,-0.433965979475654,-0.925820800763387,0.621204380538264,-0.725355435802351,-0.087195045278291,0.500040007799584,-0.351024070867477
-57,0,-1.79333947783294,1.64615611570236,0.593670368718185,0.74125415566331,-0.835056311664806,-0.128283340965351,0.795769070113583,0.338062872249377,0.961610282279288,-0.519755961049099
-58,0,-1.68562328688306,1.79136645116331,-0.917792004629201,-0.224807652067029,0.751172530954049,0.744925497765574,0.054821387540181,-0.268146122719043,-0.373795753322288,-0.023619900695578
-59,0,-1.70325116873164,1.56173898398367,0.937331444475048,-0.189146596668676,0.726757528139029,0.571196020214809,0.150478496659529,0.716370904753891,0.645947936391794,-0.096512499841381
-60,0,-0.31,-0.164,-10,10,0.303748234076738,0.094684069184242,0.846651908762107,0.505710991097032,-0.664846620425076,-0.722934785670171
-61,0,1.51747503460744,-1.57976833969122,-0.313853456471656,-0.670641690437042,0.337481189036041,-0.695059667580877,0.382512664766648,-0.754051294565859,-0.540912893771664,-0.152736592481289
-62,0,1.36729416399966,-1.54942606995245,0.746279765035798,0.320667909398266,0.075712278316126,0.557089028326803,-0.314459962457274,-0.091179395352991,-0.712572618352738,-0.862523770264919
-63,0,1.87551859565403,-1.01245024447758,0.961634242304571,0.99902517180177,0.428576472620752,0.790254229843056,-0.162732148014183,0.057108415575022,0.099625367521191,-0.41779573726667
-64,0,1.8407338686869,-1.58680706359952,-0.293737994923213,-0.654603713924763,-0.15830470325221,-0.4506171823593,0.106217286056366,-0.250165079508456,-0.598894350859836,-0.860382476004742
-65,0,1.47999238640346,-1.68861965445586,0.661374709635725,0.335413696048534,0.295408469126627,-0.340725080366546,0.611961227458239,0.53327702260923,-0.960254363897463,0.913251337834092
-66,0,1.0735581028252,-1.06052424530937,-0.790281335013236,0.372594655247821,-0.940075790261345,0.972106617215367,-0.246874887198155,-0.501544524013033,-0.134947611932188,0.130090806976322
-67,0,1.63769743034008,-1.64946099093265,-0.600590046972624,0.281621309709353,0.836244003088172,0.56250556179443,-0.244248244001593,0.274273110413607,0.988229164412892,-0.903492892429764
-68,0,1.9226795203725,-1.58810792001545,0.230397844467249,0.458000795025685,0.160534364807898,0.106760231103633,0.084376336290482,-0.410257096809632,-0.388975913032382,0.233684932760446
-69,0,1.42821810695172,-1.75832976379165,0.122894112900537,-0.193746425367835,0.602411133999453,-0.895694511099768,0.347280223444287,0.045175117581033,-0.232661771389541,-0.314648785155521
-70,0,1.42602875697361,-1.16082451050484,0.906027162216176,0.736418182225292,-0.041284854438203,0.308524126840497,0.369205540497406,0.333193031466162,0.98544497734097,-0.253876502721057
-71,0,1.73002019404142,-1.80947421953802,-0.677661468333469,0.07388223501889,-0.682147267310905,0.024126391992196,0.848946249678909,-0.516253994735439,0.202627425635043,-0.897477249843204
-72,0,1.11605808678586,-1.05622349137538,0.492431513300772,-0.737330353527688,0.594894327441348,0.805436037154752,-0.912716679245893,-0.390199322338262,-0.735805203184445,-0.05803264345169
-73,0,1.52878306779173,-1.52822073704896,-0.863821530585294,-0.987125905118183,-0.698190916645222,-0.17859271120364,-0.902497993400075,0.777448050547606,0.03349780154213,0.569802193246196
-74,0,1.69602091303769,-1.68791329506752,-0.919679036112179,-0.083795023015624,0.492078750634905,-0.102786002654994,0.168000984501864,-0.984910911120671,-0.901017886055053,0.639813560268343
-75,0,1.82292095427058,-1.79921516167805,0.107455937171145,-0.854711756750333,0.344969246269787,0.519092986129825,0.410230657805076,-0.91216461269154,0.033943611687528,-0.306643316979961
-76,0,1.15382245032495,-1.9125109596393,-0.80848616018294,-0.010443047871684,-0.706296790283886,0.822118261736111,0.163327430772402,0.252786291364115,-0.501338527911191,-0.28349201031843
-77,0,1.19521627831595,-1.4347201247938,-0.814416838367815,-0.02940231646999,-0.841428202408144,-0.004586605289201,-0.606434730541928,0.714277316437912,-0.44481897692423,-0.753698456302665
-78,0,1.99358961720643,-1.52499478281942,-0.877637461379848,0.414405535550407,-0.03365581494898,0.624692043559635,-0.832402658891314,-0.723028062732401,-0.867099034604054,-0.185632378061498
-79,0,1.6235192049324,-1.52045677356057,0.977828685636029,-0.57502380941392,-0.402617609462035,0.631967959251952,-0.426504420434097,0.480579460496328,0.686338078276468,-0.793812851707889
-80,1,-1.9061964810895,-1.28908450646839,10,10,0.101102136284509,-0.416199695149021,-0.494850987164782,-0.568698448483212,-0.184782382471875,-0.552230498856606
-81,1,-1.12334568706136,-1.43192728687949,-0.202671045004157,-0.138914163603925,-0.937156710796857,-0.116790109384378,-0.094237431941851,-0.896761118553971,-0.183423320636867,0.458624633065419
-82,1,-1.85938009020988,-1.2014277824818,-0.699402902052328,0.801606907908076,0.618074329335756,-0.172568708757076,-0.075693445304373,0.488815268086692,-0.612225386267585,0.515474858015819
-83,1,-1.44593059276162,-1.50738144143115,-0.146467066237161,0.773717178872341,0.498796984960351,-0.015862721592055,0.487162827649467,-0.002016922590367,-0.480395455657278,0.140660394856319
-84,1,-1.5068337349461,-1.39605748721966,0.935273336022611,-0.65840232577507,0.254028615496319,-0.207949363786322,0.494233964181716,0.342544015156094,-0.2790717466048,0.681766781920308
-85,1,1.29459521637362,1.25954745515179,-10,-10,-0.772948300582061,-0.755591080857131,-0.795691897784493,0.140653835392209,-0.160483486922781,0.460920935704452
-86,1,1.04689401512909,1.48899924906156,-0.102806023076495,-0.232256721754397,0.982487312078063,0.220639487969972,0.466108251058299,-0.328239000603224,0.955688285869012,0.98401214247364
-87,1,1.58830474403604,1.70226055213414,-0.400909948872293,-0.887240029691788,-0.796366553971199,-0.189011341359002,-0.984264269832423,0.228539348323108,0.696045037642922,-0.734941166556072
-88,1,1.07001216284605,1.81845698640496,-0.906675421892372,0.643501800272306,0.2964442904515,-0.212339822521429,-0.624947347663644,-0.076505534185115,0.690006945874019,0.603178865697037
-89,1,1.47818853391931,1.1810797217516,-0.777878371782176,0.158700400185078,0.77008386941758,0.318201581494366,-0.577373286340777,0.207915408782256,0.169898207168944,-0.134718349741109
-90,0,-1.39792536337696,1.8903759983709,10,-10,-0.381543623044489,-0.150608604917312,0.940200935058958,-0.260126956593852,0.011178432296195,-0.552646188796202
-91,0,-1.34181919280501,1.37770384290606,-0.273896107346467,0.9218628887177,0.183329714125041,0.794995796775324,0.47034078624241,0.587159127993906,0.656065190534019,0.710378359435155
-92,0,-1.08535749655328,1.25684564483175,-0.093438684660175,0.867363731909897,0.501979335337794,0.929133531466716,0.853038546233495,0.231647371842096,-0.921363933789468,0.9955206665909
-93,0,-1.5078347061732,1.75537297346943,-0.353047628963401,0.686996459628496,0.12650715249212,-0.584157551233493,0.67801198459735,0.130184075673761,-0.541365882749818,0.804095414322346
-94,0,-1.67232665291775,1.91398842184753,-0.055989266428472,0.083972688856283,0.495406878960658,-0.531851511151842,-0.68298755038252,-0.762719341237422,0.044183568378214,0.569492860435106
-95,0,1.52196747373202,-1.81272431584475,-10,10,-0.592427348924565,-0.245215291809175,0.450286805609337,-0.61720080602177,-0.078323806376631,-0.138400199664094
-96,0,1.34277619089321,-1.04264614535854,-0.840523610880692,-0.579768061766314,0.207088065224924,-0.30689024242517,-0.707319832593209,0.067209487208095,-0.219041441615042,0.651618314592841
-97,0,1.72996670685819,-1.26148185356343,-0.071347258910479,-0.571647931880792,0.00248497405952,0.410346123251162,0.294254262248804,0.698018369247902,0.652553267893053,-0.960621219815728
-98,0,1.63679608599505,-1.40483117266873,0.133355343382705,0.785183623637213,0.106494106522641,0.457003384754942,-0.314470768070196,-0.05337112691883,0.86147345141363,-0.770167158107586
-99,0,1.22531932528574,-1.39832123108255,0.751819680541469,0.843477659731268,0.880714646905367,0.20665859661747,-0.85053999542226,0.770244035843202,-0.790477429383416,-0.219373260405667
+index,prop,task,A,B,C,D,E,F,G,H,I,J
+0,0,Task_1,-0.815427620522422,-0.549653782587197,10,10,-0.492825179098274,0.173025977694162,0.598942935224295,-0.298754475196825,-0.581254909010269,-0.110656494210556
+1,0,Task_1,-0.69992853524861,-0.229332112274544,0.955713896876243,0.478117201427488,0.777586191100329,0.655369716778557,0.174914171427966,-0.288498877530604,-0.045316536149489,-0.606586193752411
+2,0,Task_1,-0.368076290946109,-0.969405272021421,0.330140292484796,-0.389505498689912,0.138935265824808,-0.871869282167555,0.37472462048701,0.16418591189513,0.293560701443717,0.285608940220021
+3,0,Task_1,-0.573491802821712,-0.815581340289383,-0.71381302228685,0.968769585007681,0.683148179202864,0.799125092538796,0.309479173526504,0.728052031003468,0.725495580994886,-0.676576302804248
+4,0,Task_1,-0.676358754897705,-0.548282221764748,-0.067783417095816,0.229988549891323,0.582427598044647,0.261947149184825,-0.31573435079735,0.61178122144268,-0.187058216967238,0.871764347690334
+5,0,Task_1,-0.757605167585471,-0.808298008590424,-0.212800982302764,0.915459776146607,-0.263465552591813,-0.666126495988014,-0.195028996490007,-0.237170057680116,-0.933358858596883,-0.19726273171241
+6,0,Task_1,-0.886003489547442,-0.509472491633194,0.972585787901787,-0.872502887185675,0.729110910814452,0.265244787210866,-0.726900973624432,0.248626170107939,0.809004396880265,-0.278494064780479
+7,0,Task_1,-1.02924299329947,-0.618392550297407,0.431645876221063,-0.595223273279383,0.953557069758112,-0.222915219121563,0.257670939076174,0.728927879098318,-0.579783055417687,-0.179960319428095
+8,0,Task_1,-0.502456609281931,-0.196195032500234,0.827091199052693,0.208781482910759,-0.573317187361529,-0.00488758921352,0.341918716034638,-0.292901671356202,-0.34531700628784,0.766920547630073
+9,0,Task_1,-0.517308486454666,-0.58057651993072,0.023965203621816,-0.805412569418484,0.806810139718495,-0.806576263127819,-0.39279977856172,-0.463819363774079,0.16095248005356,-0.553419747131608
+10,0,Task_1,-0.634057125095051,-1.01875520243377,0.916015229666356,0.0275946645157,-0.829507007977635,-0.700063689327201,-0.715601456588714,-0.439799165143527,-0.487241220494887,0.245279267056121
+11,0,Task_1,-0.577778256396397,-0.425744718740636,0.404557718897757,0.175997361062361,0.227383730822478,0.774254652577977,-0.616080996917636,-0.948639007451084,0.954076433375225,-0.497102001172339
+12,0,Task_1,-0.197376045004303,-0.404709510371676,0.906708844886064,-0.592737030373698,0.885229803890949,0.822069297241907,0.204367485562992,0.24610324883505,-0.079476866422163,-0.244006995583434
+13,0,Task_1,-0.766513992210544,-1.03945619108008,-0.284570394188414,-0.198686061574238,-0.168897609541112,-0.92939259112691,0.265899059671913,-0.828727642584781,-0.427453097474168,-0.738368091608883
+14,0,Task_1,-0.301129557074769,-0.466366201816861,0.011556817105957,0.499408314757229,0.253163424774478,0.017645446880421,0.401735167095264,-0.650287617298501,-0.262217482830833,-0.510102120130588
+15,0,Task_1,-0.372562647160274,-0.805363289239018,-0.520161174281201,-0.392478459347396,0.147495839565868,0.780879606474075,-0.281044687799173,-0.148036908135786,-0.208810472224093,0.278961929718128
+16,0,Task_1,-0.573276127254349,-0.843760519080871,0.562803219191695,0.323676061636996,0.490737136410372,-0.95476192699496,0.028603504036769,-0.246295219650507,-0.297736293597739,-0.632473830957653
+17,0,Task_1,-1.08177643161138,-1.08748936331147,0.859669461054104,0.485772819254089,0.268883598825009,0.253553321699552,-0.045743087569395,0.66793403278249,0.308591963919947,0.771084301464027
+18,0,Task_1,-0.121943068431321,-0.937658541363365,-0.118871100462413,0.159000937768132,0.2985428841756,-0.203829205332538,-0.637945695251352,0.658049690810909,0.949529589134008,-0.577812553880056
+19,0,Task_1,-1.06747884637477,-0.842449899007254,0.74037411093045,0.558782660077505,-0.096052126354583,0.529119817360537,0.372190604770144,0.688656466253683,-0.819433165315505,-0.12814415930811
+20,0,Task_1,-0.376355273108791,-0.908946282731397,-10,-10,0.785237349732891,-0.387217730495401,-0.942409218899448,0.160806577297675,-0.723969983661972,-0.452650134415823
+21,0,Task_1,-0.846685755905842,-0.209448772979162,-0.92290734251321,0.465751384219632,-0.81727500527083,-0.182472640926628,0.887526070620356,0.111592883978488,0.386435078880162,-0.440017211221272
+22,0,Task_1,-0.837187625737658,-0.851876882999398,0.28033979546451,0.379365407838544,0.634843008192624,0.371753918780839,-0.611653305369863,0.732567927874185,0.85803611350317,-0.577973441708411
+23,0,Task_1,-0.175842102272502,-0.488461994046914,0.222850898945077,-0.424057088828287,-0.27619426781836,0.616941667680694,-0.696779972923147,0.23612770730498,0.760705889780527,0.34004139732033
+24,0,Task_1,-0.857809192768388,-0.265302164309273,-0.339501197382944,0.898529591365812,-0.881538228231582,0.090728826664301,-0.858395870780934,0.907272331515896,0.160531735619067,0.678911811768841
+25,1,Task_1,-0.585614473203943,0.551068965982618,-0.444576754578563,-0.78871174512572,0.246625773070183,-0.663474018818313,-0.446355552060464,0.750312773134634,-0.98959522970136,-0.150120109840706
+26,1,Task_1,-0.23908209161902,0.897577090052027,-0.388679577334402,-0.708193450791952,0.850310084800308,-0.767256338531612,0.370509317329194,0.387354921015751,0.533160321164986,0.149390212455131
+27,1,Task_1,-0.830137779545391,0.125511796448773,-0.359644680155969,0.682555404147728,-0.53195400936544,0.934101689590862,-0.73600284018832,-0.29960291454053,0.351125596355818,-0.187842884669279
+28,1,Task_1,-0.216184782523012,0.211978905733634,-0.249409157470717,-0.137070024386644,-0.707128531134104,0.944932049234295,0.431233366052987,0.449543990959262,0.912901932280027,0.77394610963827
+29,1,Task_1,-0.282632767106854,0.264519450298051,0.935302642480463,0.833720966523807,0.254167956717343,-0.007922712021391,-0.114776295376767,-0.276042896002242,-0.813098403125419,0.341922052212475
+30,1,Task_1,-0.575451831562744,0.581795291243757,0.425716772255087,0.544174803732763,0.492863854358204,-0.047589791717166,-0.743840790633672,0.525289489060411,0.829611715544936,0.015193221845522
+31,1,Task_1,-0.758220206206948,0.613617353581097,0.12665368551441,0.469705238170149,0.874436248273008,-0.759571175468135,0.310230735619265,-0.80342084374485,-0.462431082486477,-0.407165886759129
+32,1,Task_1,-0.591882107713968,0.146363847316077,-0.731393018031039,0.461102224603009,-0.884528391885322,-0.419893944840798,0.647518214389067,0.688126326408485,0.754656371691603,0.116881923067816
+33,1,Task_1,-0.715219788915433,1.08461646785062,-0.527433424947131,-0.598774697808646,0.113197791601676,-0.50528865259863,0.117572114288939,0.590400320594326,-0.155159386769126,0.354827977413197
+34,1,Task_1,-0.796303564991711,0.501349128362771,-0.818626622405165,-0.029008564510599,0.952315968378468,0.817495784213924,0.182224554845043,-0.01377304364653,-0.26273195293588,-0.859530562808673
+35,1,Task_1,-0.195432956299915,0.260304213033586,0.305218688016626,-0.060885647660027,-0.387903446605514,-0.108064042735465,-0.962980405009682,-0.424289604203511,-0.253442293077285,0.309637368411297
+36,1,Task_1,-1.03051702404215,0.699873963424479,0.54312844740039,0.591372473040837,-0.835367086693457,0.198315253422507,-0.181434739783802,0.636429105754948,0.420628646992331,0.990122364664621
+37,1,Task_1,-0.642471040462047,0.350035500680932,-0.653263607332762,0.359359450868376,0.30131719114182,0.649581794356589,0.942268955633086,0.884659894489377,-0.473171239344398,0.039635066570717
+38,1,Task_1,-0.716592168562474,0.251176558055396,-0.812352457176761,0.219766101590983,-0.65021067790289,0.423621690291556,-0.58865099275791,0.061487886019891,-0.237737474016087,0.641284347380825
+39,1,Task_1,-0.627915139540302,0.522644163585557,-0.829819386940741,-0.345104687573802,0.485166070545119,-0.258839727448056,-0.920615208326881,0.275498215871427,-0.629075534110342,-0.642527887960687
+40,1,Task_1,-0.156663423633232,0.304600082490645,10,-10,0.041519856511361,0.23303461629095,-0.497233246191187,-0.023544587617095,-0.418540837770003,-0.550233932792512
+41,1,Task_2,-0.403555861666686,0.807008471177762,-0.480316474702795,-0.753784710340632,-0.613234235616998,0.167955573662474,0.455636631315042,-0.380227635953206,0.48021383007369,-0.453674929885108
+42,1,Task_2,-0.475033062023491,0.231061734571007,0.310098050913387,-0.835007082906627,0.407580140850853,0.556924247596553,-0.388616604639346,0.60215104751412,-0.984322198098753,-0.996332888983337
+43,1,Task_2,-0.286740191813169,0.871522465291953,-0.898456453446964,-0.797029924245349,0.47491891024478,0.480193220538417,-0.750856163558686,-0.043960372032018,-0.242651391805662,-0.109239061054006
+44,1,Task_2,-0.191548445530409,0.578646221732672,0.571329522934018,-0.677379826379623,0.098396984688832,-0.961599170104035,-0.753922591922157,0.361435891257559,-0.638030455493982,0.404349024843908
+45,1,Task_2,-0.962818439897195,0.190811801399786,0.402433205555268,-0.06801187450078,-0.373089661152032,0.23970878487105,0.416451106643361,-0.50599166271433,-0.88669034806741,0.30364523616443
+46,1,Task_2,-0.885419462203051,0.155312944156919,-0.174925245509766,0.050330391451536,-0.388676795741932,-0.72307604978553,0.646076107724964,-0.00105589909588,0.491928720743773,-0.647995101369186
+47,1,Task_2,-0.634581804798124,0.149519641506344,0.289410761217525,0.48566510896872,0.338684773860801,0.374319581439648,-0.105479014627167,0.004520417892418,0.222862261975939,0.23538363683764
+48,1,Task_2,-0.380429991155736,0.423554740615867,-0.199044563017407,-0.855804112781183,0.947572000564906,0.654939562810152,0.802084131057488,0.010033694468233,0.449766366250574,0.119974134618433
+49,1,Task_2,-0.294345606906633,0.954791580849514,-0.346729568989951,-0.614828863660672,0.578150372001527,-0.697356489908387,-0.272496177427547,-0.326679505363186,0.403185907494623,0.659834986972357
+50,2,Task_1,0.164414245529879,0.270409021456753,-0.470918775433235,-0.165965173767875,-0.118373275802139,-0.804671542299309,-0.273096283874977,0.964676290034708,-0.240786016285174,0.781092750718218
+51,2,Task_1,0.962558187516297,0.353448095106742,-0.546315077724052,-0.263397808061131,0.073416112160648,-0.561584513583351,-0.003812545601594,-0.067901708659798,-0.797337624892413,-0.502494288676279
+52,2,Task_1,0.328218018271649,0.30124689351081,-0.239618151680487,0.281282683112064,-0.122253338243164,-0.416340912422471,-0.302944823763312,0.950697167857575,0.084774348269755,0.245643637478141
+53,2,Task_1,0.157711895115317,0.944426984942688,0.672465261607398,0.963677112876299,-0.732866944741014,0.269879007022312,-0.734121763984793,-0.18475004364869,0.494783604230457,-0.563469688908407
+54,2,Task_1,1.05838672002069,0.573716871491595,0.916674666213795,0.744100669613517,-0.536325680879341,0.745349313896706,-0.608494971121628,-0.036147807131094,0.730097211981708,-0.986020687921255
+55,2,Task_1,0.110836847139862,0.585126999320639,-0.040012375137611,0.248257524389148,-0.795936343325832,-0.755933622220192,0.664943062567423,-0.560825069941966,-0.987328335835364,0.00918182383389
+56,2,Task_1,1.06747271856711,1.07364864476858,-0.75655271526814,-0.433965979475654,-0.925820800763387,0.621204380538264,-0.725355435802351,-0.087195045278291,0.500040007799584,-0.351024070867477
+57,2,Task_1,0.989939601503884,0.247705435387067,0.593670368718185,0.74125415566331,-0.835056311664806,-0.128283340965351,0.795769070113583,0.338062872249377,0.961610282279288,-0.519755961049099
+58,2,Task_1,0.970023710841513,1.01758529653736,-0.917792004629201,-0.224807652067029,0.751172530954049,0.744925497765574,0.054821387540181,-0.268146122719043,-0.373795753322288,-0.023619900695578
+59,2,Task_1,1.04979809782509,0.825205513076737,0.937331444475048,-0.189146596668676,0.726757528139029,0.571196020214809,0.150478496659529,0.716370904753891,0.645947936391794,-0.096512499841381
+60,2,Task_1,0.606201322157722,0.429911059767652,-10,10,0.303748234076738,0.094684069184242,0.846651908762107,0.505710991097032,-0.664846620425076,-0.722934785670171
+61,2,Task_1,0.509318826024622,0.139403752494424,-0.313853456471656,-0.670641690437042,0.337481189036041,-0.695059667580877,0.382512664766648,-0.754051294565859,-0.540912893771664,-0.152736592481289
+62,2,Task_1,0.416414890641425,1.04597850573715,0.746279765035798,0.320667909398266,0.075712278316126,0.557089028326803,-0.314459962457274,-0.091179395352991,-0.712572618352738,-0.862523770264919
+63,2,Task_1,0.236961042862613,0.461540684896611,0.961634242304571,0.99902517180177,0.428576472620752,0.790254229843056,-0.162732148014183,0.057108415575022,0.099625367521191,-0.41779573726667
+64,2,Task_1,0.325246248634948,0.721692503249982,-0.293737994923213,-0.654603713924763,-0.15830470325221,-0.4506171823593,0.106217286056366,-0.250165079508456,-0.598894350859836,-0.860382476004742
+65,2,Task_1,0.258917730833821,0.68064431493967,0.661374709635725,0.335413696048534,0.295408469126627,-0.340725080366546,0.611961227458239,0.53327702260923,-0.960254363897463,0.913251337834092
+66,2,Task_2,1.0502123912678,1.0175241545193,-0.790281335013236,0.372594655247821,-0.940075790261345,0.972106617215367,-0.246874887198155,-0.501544524013033,-0.134947611932188,0.130090806976322
+67,2,Task_2,0.653819704386349,0.899775279158189,-0.600590046972624,0.281621309709353,0.836244003088172,0.56250556179443,-0.244248244001593,0.274273110413607,0.988229164412892,-0.903492892429764
+68,2,Task_2,0.29937357944438,1.01665644266054,0.230397844467249,0.458000795025685,0.160534364807898,0.106760231103633,0.084376336290482,-0.410257096809632,-0.388975913032382,0.233684932760446
+69,2,Task_2,0.631194229943451,0.952468985425419,0.122894112900537,-0.193746425367835,0.602411133999453,-0.895694511099768,0.347280223444287,0.045175117581033,-0.232661771389541,-0.314648785155521
+70,2,Task_2,1.08713454227837,0.656075322649452,0.906027162216176,0.736418182225292,-0.041284854438203,0.308524126840497,0.369205540497406,0.333193031466162,0.98544497734097,-0.253876502721057
+71,2,Task_2,0.139844193498669,0.408310759532142,-0.677661468333469,0.07388223501889,-0.682147267310905,0.024126391992196,0.848946249678909,-0.516253994735439,0.202627425635043,-0.897477249843204
+72,2,Task_2,0.155190443948636,0.269264020133562,0.492431513300772,-0.737330353527688,0.594894327441348,0.805436037154752,-0.912716679245893,-0.390199322338262,-0.735805203184445,-0.05803264345169
+73,2,Task_2,0.71862355058755,0.784351472902302,-0.863821530585294,-0.987125905118183,-0.698190916645222,-0.17859271120364,-0.902497993400075,0.777448050547606,0.03349780154213,0.569802193246196
+74,2,Task_2,0.932100491693559,0.673631604373795,-0.919679036112179,-0.083795023015624,0.492078750634905,-0.102786002654994,0.168000984501864,-0.984910911120671,-0.901017886055053,0.639813560268343
+75,3,Task_1,0.103583721726665,-0.373304248094501,0.107455937171145,-0.854711756750333,0.344969246269787,0.519092986129825,0.410230657805076,-0.91216461269154,0.033943611687528,-0.306643316979961
+76,3,Task_1,0.698578699086034,-0.805397267250048,-0.80848616018294,-0.010443047871684,-0.706296790283886,0.822118261736111,0.163327430772402,0.252786291364115,-0.501338527911191,-0.28349201031843
+77,3,Task_1,0.236887498524042,-0.155242051697718,-0.814416838367815,-0.02940231646999,-0.841428202408144,-0.004586605289201,-0.606434730541928,0.714277316437912,-0.44481897692423,-0.753698456302665
+78,3,Task_1,0.178535729762585,-0.178631521167564,-0.877637461379848,0.414405535550407,-0.03365581494898,0.624692043559635,-0.832402658891314,-0.723028062732401,-0.867099034604054,-0.185632378061498
+79,3,Task_1,0.135466013133272,-1.08706802405113,0.977828685636029,-0.57502380941392,-0.402617609462035,0.631967959251952,-0.426504420434097,0.480579460496328,0.686338078276468,-0.793812851707889
+80,3,Task_1,1.06511564653917,-0.529772758901416,10,10,0.101102136284509,-0.416199695149021,-0.494850987164782,-0.568698448483212,-0.184782382471875,-0.552230498856606
+81,3,Task_1,1.065535288073,-0.706304720574752,-0.202671045004157,-0.138914163603925,-0.937156710796857,-0.116790109384378,-0.094237431941851,-0.896761118553971,-0.183423320636867,0.458624633065419
+82,3,Task_1,0.896291258595646,-0.255636497676642,-0.699402902052328,0.801606907908076,0.618074329335756,-0.172568708757076,-0.075693445304373,0.488815268086692,-0.612225386267585,0.515474858015819
+83,3,Task_1,0.524657824983779,-0.653380201807584,-0.146467066237161,0.773717178872341,0.498796984960351,-0.015862721592055,0.487162827649467,-0.002016922590367,-0.480395455657278,0.140660394856319
+84,3,Task_1,0.131331250236976,-0.217693885103831,0.935273336022611,-0.65840232577507,0.254028615496319,-0.207949363786322,0.494233964181716,0.342544015156094,-0.2790717466048,0.681766781920308
+85,3,Task_1,0.247418413965845,-0.55249563814082,-10,-10,-0.772948300582061,-0.755591080857131,-0.795691897784493,0.140653835392209,-0.160483486922781,0.460920935704452
+86,3,Task_1,0.877892966008508,-0.600861427554399,-0.102806023076495,-0.232256721754397,0.982487312078063,0.220639487969972,0.466108251058299,-0.328239000603224,0.955688285869012,0.98401214247364
+87,3,Task_1,0.353355849981167,-0.384127703150446,-0.400909948872293,-0.887240029691788,-0.796366553971199,-0.189011341359002,-0.984264269832423,0.228539348323108,0.696045037642922,-0.734941166556072
+88,3,Task_1,0.278857275841181,-0.845560468506653,-0.906675421892372,0.643501800272306,0.2964442904515,-0.212339822521429,-0.624947347663644,-0.076505534185115,0.690006945874019,0.603178865697037
+89,3,Task_1,0.36866502316092,-0.854728078950075,-0.777878371782176,0.158700400185078,0.77008386941758,0.318201581494366,-0.577373286340777,0.207915408782256,0.169898207168944,-0.134718349741109
+90,3,Task_1,0.493452674287737,-0.910519988093965,10,-10,-0.381543623044489,-0.150608604917312,0.940200935058958,-0.260126956593852,0.011178432296195,-0.552646188796202
+91,3,Task_2,0.908818256135453,-0.80987547007129,-0.273896107346467,0.9218628887177,0.183329714125041,0.794995796775324,0.47034078624241,0.587159127993906,0.656065190534019,0.710378359435155
+92,3,Task_2,0.996336489766102,-0.975493823251068,-0.093438684660175,0.867363731909897,0.501979335337794,0.929133531466716,0.853038546233495,0.231647371842096,-0.921363933789468,0.9955206665909
+93,3,Task_2,0.657942481588168,-0.245177885637302,-0.353047628963401,0.686996459628496,0.12650715249212,-0.584157551233493,0.67801198459735,0.130184075673761,-0.541365882749818,0.804095414322346
+94,3,Task_2,0.328489621282775,-1.08052040344332,-0.055989266428472,0.083972688856283,0.495406878960658,-0.531851511151842,-0.68298755038252,-0.762719341237422,0.044183568378214,0.569492860435106
+95,3,Task_2,0.253790826276124,-0.935268396370178,-10,10,-0.592427348924565,-0.245215291809175,0.450286805609337,-0.61720080602177,-0.078323806376631,-0.138400199664094
+96,3,Task_2,0.226661731310054,-0.206651604608129,-0.840523610880692,-0.579768061766314,0.207088065224924,-0.30689024242517,-0.707319832593209,0.067209487208095,-0.219041441615042,0.651618314592841
+97,3,Task_2,0.730538042566787,-0.815537451517852,-0.071347258910479,-0.571647931880792,0.00248497405952,0.410346123251162,0.294254262248804,0.698018369247902,0.652553267893053,-0.960621219815728
+98,3,Task_2,1.05315118537755,-0.90842251928343,0.133355343382705,0.785183623637213,0.106494106522641,0.457003384754942,-0.314470768070196,-0.05337112691883,0.86147345141363,-0.770167158107586
+99,3,Task_2,0.453505426891074,-0.509861391994549,0.751819680541469,0.843477659731268,0.880714646905367,0.20665859661747,-0.85053999542226,0.770244035843202,-0.790477429383416,-0.219373260405667
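
[Reviewer note, not part of the patch] The rewritten data.csv above changes the test-data schema: a `task` column (`Task_1`/`Task_2`) is added, and `prop` goes from a binary 0/1 label to four classes (0-3), with the sentinel +/-10 feature values in columns C/D kept. A minimal, purely illustrative sanity check of the new schema, assuming the file is the classification_max_corr test data implied by the sisso.json hunk below:

    import pandas as pd

    # Illustrative sketch only (path assumed from the surrounding diff):
    # load the reworked test data and confirm the new layout.
    df = pd.read_csv("tests/exec_test/classification_max_corr/data.csv",
                     index_col="index")
    print(df["task"].value_counts())    # Task_1 / Task_2 split
    print(sorted(df["prop"].unique()))  # expected: [0, 1, 2, 3]
    assert list(df.columns[:2]) == ["prop", "task"]
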
diff --git a/tests/exec_test/classification_max_corr/sisso.json b/tests/exec_test/classification_max_corr/sisso.json
index bc7fa4cd2b4bf9ee1093d3f7e33ca3052bdab7f5..cae3d008d56450244b127005ec662a36f4efae82 100644
--- a/tests/exec_test/classification_max_corr/sisso.json
+++ b/tests/exec_test/classification_max_corr/sisso.json
@@ -10,7 +10,7 @@
     "leave_out_frac": 0.2,
     "n_models_store": 1,
     "calc_type": "classification",
-    "leave_out_inds": [80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95 ,96 ,97, 98 , 99],
+    "leave_out_inds": [ 2, 3, 4, 6, 21, 23, 30, 38, 39, 52, 53, 61, 76, 82, 83, 45, 47, 48, 49, 66 ],
     "opset": ["add", "sub", "abs_diff", "mult", "div", "inv", "abs", "exp", "log", "sin", "cos", "sq", "cb", "six_pow", "sqrt", "cbrt", "neg_exp"],
     "param_opset" : [],
     "fix_intercept": false
diff --git a/tests/exec_test/classification_max_corr_gen_proj/data.csv b/tests/exec_test/classification_max_corr_gen_proj/data.csv
index 3fa9f64bd0b9d2133040ea4849a71789ac8f078a..3de3e29a3d3a8510a21ace99268501120ce46af4 100644
--- a/tests/exec_test/classification_max_corr_gen_proj/data.csv
+++ b/tests/exec_test/classification_max_corr_gen_proj/data.csv
@@ -1,101 +1,101 @@
-index,prop,A,B,C,D,E,F,G,H,I,J
-0,1,0.1,-0.3,10,10,-0.492825179098274,0.173025977694162,0.598942935224295,-0.298754475196825,-0.581254909010269,-0.110656494210556
-1,1,-1.89442810374214,-1.31996134398007,0.955713896876243,0.478117201427488,0.777586191100329,0.655369716778557,0.174914171427966,-0.288498877530604,-0.045316536149489,-0.606586193752411
-2,1,-1.47460150711424,-1.22614964523433,0.330140292484796,-0.389505498689912,0.138935265824808,-0.871869282167555,0.37472462048701,0.16418591189513,0.293560701443717,0.285608940220021
-3,1,-1.30213414336735,-1.82621262418812,-0.71381302228685,0.968769585007681,0.683148179202864,0.799125092538796,0.309479173526504,0.728052031003468,0.725495580994886,-0.676576302804248
-4,1,-1.73938632269334,-1.58349866505488,-0.067783417095816,0.229988549891323,0.582427598044647,0.261947149184825,-0.31573435079735,0.61178122144268,-0.187058216967238,0.871764347690334
-5,1,-1.56660896632398,-1.05861814902183,-0.212800982302764,0.915459776146607,-0.263465552591813,-0.666126495988014,-0.195028996490007,-0.237170057680116,-0.933358858596883,-0.19726273171241
-6,1,-1.55340876153895,-1.25209231285838,0.972585787901787,-0.872502887185675,0.729110910814452,0.265244787210866,-0.726900973624432,0.248626170107939,0.809004396880265,-0.278494064780479
-7,1,-1.54625325136447,-1.81238888450819,0.431645876221063,-0.595223273279383,0.953557069758112,-0.222915219121563,0.257670939076174,0.728927879098318,-0.579783055417687,-0.179960319428095
-8,1,-1.12735554524035,-1.69261497444728,0.827091199052693,0.208781482910759,-0.573317187361529,-0.00488758921352,0.341918716034638,-0.292901671356202,-0.34531700628784,0.766920547630073
-9,1,-1.35367834815884,-1.38141056472962,0.023965203621816,-0.805412569418484,0.806810139718495,-0.806576263127819,-0.39279977856172,-0.463819363774079,0.16095248005356,-0.553419747131608
-10,1,-1.17853151888796,-1.27705829298504,0.916015229666356,0.0275946645157,-0.829507007977635,-0.700063689327201,-0.715601456588714,-0.439799165143527,-0.487241220494887,0.245279267056121
-11,1,-1.17547049766875,-1.05613281246665,0.404557718897757,0.175997361062361,0.227383730822478,0.774254652577977,-0.616080996917636,-0.948639007451084,0.954076433375225,-0.497102001172339
-12,1,-1.67277915033943,-1.86190239883588,0.906708844886064,-0.592737030373698,0.885229803890949,0.822069297241907,0.204367485562992,0.24610324883505,-0.079476866422163,-0.244006995583434
-13,1,-1.96326165438884,-1.31680458089693,-0.284570394188414,-0.198686061574238,-0.168897609541112,-0.92939259112691,0.265899059671913,-0.828727642584781,-0.427453097474168,-0.738368091608883
-14,1,-1.79497769808481,-1.13948217357082,0.011556817105957,0.499408314757229,0.253163424774478,0.017645446880421,0.401735167095264,-0.650287617298501,-0.262217482830833,-0.510102120130588
-15,1,-1.07957120536262,-1.93245955077991,-0.520161174281201,-0.392478459347396,0.147495839565868,0.780879606474075,-0.281044687799173,-0.148036908135786,-0.208810472224093,0.278961929718128
-16,1,-1.52555145037507,-1.72455209196736,0.562803219191695,0.323676061636996,0.490737136410372,-0.95476192699496,0.028603504036769,-0.246295219650507,-0.297736293597739,-0.632473830957653
-17,1,-1.29142309507315,-1.9506961526212,0.859669461054104,0.485772819254089,0.268883598825009,0.253553321699552,-0.045743087569395,0.66793403278249,0.308591963919947,0.771084301464027
-18,1,-1.23404787121001,-1.68173519287847,-0.118871100462413,0.159000937768132,0.2985428841756,-0.203829205332538,-0.637945695251352,0.658049690810909,0.949529589134008,-0.577812553880056
-19,1,-1.13513050029551,-1.3119036669604,0.74037411093045,0.558782660077505,-0.096052126354583,0.529119817360537,0.372190604770144,0.688656466253683,-0.819433165315505,-0.12814415930811
-20,1,-0.2,0.132,-10,-10,0.785237349732891,-0.387217730495401,-0.942409218899448,0.160806577297675,-0.723969983661972,-0.452650134415823
-21,1,1.1507658618081,1.7260505392724,-0.92290734251321,0.465751384219632,-0.81727500527083,-0.182472640926628,0.887526070620356,0.111592883978488,0.386435078880162,-0.440017211221272
-22,1,1.90389768224701,1.71880759316591,0.28033979546451,0.379365407838544,0.634843008192624,0.371753918780839,-0.611653305369863,0.732567927874185,0.85803611350317,-0.577973441708411
-23,1,1.77751976452381,1.28697050370578,0.222850898945077,-0.424057088828287,-0.27619426781836,0.616941667680694,-0.696779972923147,0.23612770730498,0.760705889780527,0.34004139732033
-24,1,1.65314327493874,1.16282810211312,-0.339501197382944,0.898529591365812,-0.881538228231582,0.090728826664301,-0.858395870780934,0.907272331515896,0.160531735619067,0.678911811768841
-25,1,1.30955180558204,1.36827755737648,-0.444576754578563,-0.78871174512572,0.246625773070183,-0.663474018818313,-0.446355552060464,0.750312773134634,-0.98959522970136,-0.150120109840706
-26,1,1.44924431171893,1.40328864581169,-0.388679577334402,-0.708193450791952,0.850310084800308,-0.767256338531612,0.370509317329194,0.387354921015751,0.533160321164986,0.149390212455131
-27,1,1.61362501391753,1.05448314414567,-0.359644680155969,0.682555404147728,-0.53195400936544,0.934101689590862,-0.73600284018832,-0.29960291454053,0.351125596355818,-0.187842884669279
-28,1,1.0243392720598,1.91059602121133,-0.249409157470717,-0.137070024386644,-0.707128531134104,0.944932049234295,0.431233366052987,0.449543990959262,0.912901932280027,0.77394610963827
-29,1,1.99444678594607,1.67204984441306,0.935302642480463,0.833720966523807,0.254167956717343,-0.007922712021391,-0.114776295376767,-0.276042896002242,-0.813098403125419,0.341922052212475
-30,1,1.40110330287926,1.109011516196,0.425716772255087,0.544174803732763,0.492863854358204,-0.047589791717166,-0.743840790633672,0.525289489060411,0.829611715544936,0.015193221845522
-31,1,1.94995090625471,1.05727410799969,0.12665368551441,0.469705238170149,0.874436248273008,-0.759571175468135,0.310230735619265,-0.80342084374485,-0.462431082486477,-0.407165886759129
-32,1,1.47264625042994,1.18913643279065,-0.731393018031039,0.461102224603009,-0.884528391885322,-0.419893944840798,0.647518214389067,0.688126326408485,0.754656371691603,0.116881923067816
-33,1,1.45901677207507,1.17024364037294,-0.527433424947131,-0.598774697808646,0.113197791601676,-0.50528865259863,0.117572114288939,0.590400320594326,-0.155159386769126,0.354827977413197
-34,1,1.32042744243041,1.19801952930384,-0.818626622405165,-0.029008564510599,0.952315968378468,0.817495784213924,0.182224554845043,-0.01377304364653,-0.26273195293588,-0.859530562808673
-35,1,1.88138237976289,1.03670081839679,0.305218688016626,-0.060885647660027,-0.387903446605514,-0.108064042735465,-0.962980405009682,-0.424289604203511,-0.253442293077285,0.309637368411297
-36,1,1.9986688782461,1.36909257128618,0.54312844740039,0.591372473040837,-0.835367086693457,0.198315253422507,-0.181434739783802,0.636429105754948,0.420628646992331,0.990122364664621
-37,1,1.50455818499044,1.19094974349673,-0.653263607332762,0.359359450868376,0.30131719114182,0.649581794356589,0.942268955633086,0.884659894489377,-0.473171239344398,0.039635066570717
-38,1,1.00833361547154,1.98150630000827,-0.812352457176761,0.219766101590983,-0.65021067790289,0.423621690291556,-0.58865099275791,0.061487886019891,-0.237737474016087,0.641284347380825
-39,1,1.60179185724619,1.12508599627141,-0.829819386940741,-0.345104687573802,0.485166070545119,-0.258839727448056,-0.920615208326881,0.275498215871427,-0.629075534110342,-0.642527887960687
-40,0,0.2,0.58,10,-10,0.041519856511361,0.23303461629095,-0.497233246191187,-0.023544587617095,-0.418540837770003,-0.550233932792512
-41,0,-1.09147574370355,1.70418701701285,-0.480316474702795,-0.753784710340632,-0.613234235616998,0.167955573662474,0.455636631315042,-0.380227635953206,0.48021383007369,-0.453674929885108
-42,0,-1.9425392252915,1.59311394144654,0.310098050913387,-0.835007082906627,0.407580140850853,0.556924247596553,-0.388616604639346,0.60215104751412,-0.984322198098753,-0.996332888983337
-43,0,-1.40302421044915,1.05041379743038,-0.898456453446964,-0.797029924245349,0.47491891024478,0.480193220538417,-0.750856163558686,-0.043960372032018,-0.242651391805662,-0.109239061054006
-44,0,-1.45810616907354,1.08468326497063,0.571329522934018,-0.677379826379623,0.098396984688832,-0.961599170104035,-0.753922591922157,0.361435891257559,-0.638030455493982,0.404349024843908
-45,0,-1.60421432901638,1.57730973247518,0.402433205555268,-0.06801187450078,-0.373089661152032,0.23970878487105,0.416451106643361,-0.50599166271433,-0.88669034806741,0.30364523616443
-46,0,-1.54868661350102,1.32883184576708,-0.174925245509766,0.050330391451536,-0.388676795741932,-0.72307604978553,0.646076107724964,-0.00105589909588,0.491928720743773,-0.647995101369186
-47,0,-1.8920756792535,1.76576258461153,0.289410761217525,0.48566510896872,0.338684773860801,0.374319581439648,-0.105479014627167,0.004520417892418,0.222862261975939,0.23538363683764
-48,0,-1.51442922313653,1.69840409315155,-0.199044563017407,-0.855804112781183,0.947572000564906,0.654939562810152,0.802084131057488,0.010033694468233,0.449766366250574,0.119974134618433
-49,0,-1.33469144171841,1.80124846893287,-0.346729568989951,-0.614828863660672,0.578150372001527,-0.697356489908387,-0.272496177427547,-0.326679505363186,0.403185907494623,0.659834986972357
-50,0,-1.39216086591683,1.96030807097305,-0.470918775433235,-0.165965173767875,-0.118373275802139,-0.804671542299309,-0.273096283874977,0.964676290034708,-0.240786016285174,0.781092750718218
-51,0,-1.10818774496527,1.1321805921252,-0.546315077724052,-0.263397808061131,0.073416112160648,-0.561584513583351,-0.003812545601594,-0.067901708659798,-0.797337624892413,-0.502494288676279
-52,0,-1.12733422378345,1.22290093390908,-0.239618151680487,0.281282683112064,-0.122253338243164,-0.416340912422471,-0.302944823763312,0.950697167857575,0.084774348269755,0.245643637478141
-53,0,-1.54504585318447,1.46465556555859,0.672465261607398,0.963677112876299,-0.732866944741014,0.269879007022312,-0.734121763984793,-0.18475004364869,0.494783604230457,-0.563469688908407
-54,0,-1.69728989778812,1.93427938064611,0.916674666213795,0.744100669613517,-0.536325680879341,0.745349313896706,-0.608494971121628,-0.036147807131094,0.730097211981708,-0.986020687921255
-55,0,-1.46716685688328,1.91950733639359,-0.040012375137611,0.248257524389148,-0.795936343325832,-0.755933622220192,0.664943062567423,-0.560825069941966,-0.987328335835364,0.00918182383389
-56,0,-1.5078580841421,1.11065681931139,-0.75655271526814,-0.433965979475654,-0.925820800763387,0.621204380538264,-0.725355435802351,-0.087195045278291,0.500040007799584,-0.351024070867477
-57,0,-1.79333947783294,1.64615611570236,0.593670368718185,0.74125415566331,-0.835056311664806,-0.128283340965351,0.795769070113583,0.338062872249377,0.961610282279288,-0.519755961049099
-58,0,-1.68562328688306,1.79136645116331,-0.917792004629201,-0.224807652067029,0.751172530954049,0.744925497765574,0.054821387540181,-0.268146122719043,-0.373795753322288,-0.023619900695578
-59,0,-1.70325116873164,1.56173898398367,0.937331444475048,-0.189146596668676,0.726757528139029,0.571196020214809,0.150478496659529,0.716370904753891,0.645947936391794,-0.096512499841381
-60,0,-0.31,-0.164,-10,10,0.303748234076738,0.094684069184242,0.846651908762107,0.505710991097032,-0.664846620425076,-0.722934785670171
-61,0,1.51747503460744,-1.57976833969122,-0.313853456471656,-0.670641690437042,0.337481189036041,-0.695059667580877,0.382512664766648,-0.754051294565859,-0.540912893771664,-0.152736592481289
-62,0,1.36729416399966,-1.54942606995245,0.746279765035798,0.320667909398266,0.075712278316126,0.557089028326803,-0.314459962457274,-0.091179395352991,-0.712572618352738,-0.862523770264919
-63,0,1.87551859565403,-1.01245024447758,0.961634242304571,0.99902517180177,0.428576472620752,0.790254229843056,-0.162732148014183,0.057108415575022,0.099625367521191,-0.41779573726667
-64,0,1.8407338686869,-1.58680706359952,-0.293737994923213,-0.654603713924763,-0.15830470325221,-0.4506171823593,0.106217286056366,-0.250165079508456,-0.598894350859836,-0.860382476004742
-65,0,1.47999238640346,-1.68861965445586,0.661374709635725,0.335413696048534,0.295408469126627,-0.340725080366546,0.611961227458239,0.53327702260923,-0.960254363897463,0.913251337834092
-66,0,1.0735581028252,-1.06052424530937,-0.790281335013236,0.372594655247821,-0.940075790261345,0.972106617215367,-0.246874887198155,-0.501544524013033,-0.134947611932188,0.130090806976322
-67,0,1.63769743034008,-1.64946099093265,-0.600590046972624,0.281621309709353,0.836244003088172,0.56250556179443,-0.244248244001593,0.274273110413607,0.988229164412892,-0.903492892429764
-68,0,1.9226795203725,-1.58810792001545,0.230397844467249,0.458000795025685,0.160534364807898,0.106760231103633,0.084376336290482,-0.410257096809632,-0.388975913032382,0.233684932760446
-69,0,1.42821810695172,-1.75832976379165,0.122894112900537,-0.193746425367835,0.602411133999453,-0.895694511099768,0.347280223444287,0.045175117581033,-0.232661771389541,-0.314648785155521
-70,0,1.42602875697361,-1.16082451050484,0.906027162216176,0.736418182225292,-0.041284854438203,0.308524126840497,0.369205540497406,0.333193031466162,0.98544497734097,-0.253876502721057
-71,0,1.73002019404142,-1.80947421953802,-0.677661468333469,0.07388223501889,-0.682147267310905,0.024126391992196,0.848946249678909,-0.516253994735439,0.202627425635043,-0.897477249843204
-72,0,1.11605808678586,-1.05622349137538,0.492431513300772,-0.737330353527688,0.594894327441348,0.805436037154752,-0.912716679245893,-0.390199322338262,-0.735805203184445,-0.05803264345169
-73,0,1.52878306779173,-1.52822073704896,-0.863821530585294,-0.987125905118183,-0.698190916645222,-0.17859271120364,-0.902497993400075,0.777448050547606,0.03349780154213,0.569802193246196
-74,0,1.69602091303769,-1.68791329506752,-0.919679036112179,-0.083795023015624,0.492078750634905,-0.102786002654994,0.168000984501864,-0.984910911120671,-0.901017886055053,0.639813560268343
-75,0,1.82292095427058,-1.79921516167805,0.107455937171145,-0.854711756750333,0.344969246269787,0.519092986129825,0.410230657805076,-0.91216461269154,0.033943611687528,-0.306643316979961
-76,0,1.15382245032495,-1.9125109596393,-0.80848616018294,-0.010443047871684,-0.706296790283886,0.822118261736111,0.163327430772402,0.252786291364115,-0.501338527911191,-0.28349201031843
-77,0,1.19521627831595,-1.4347201247938,-0.814416838367815,-0.02940231646999,-0.841428202408144,-0.004586605289201,-0.606434730541928,0.714277316437912,-0.44481897692423,-0.753698456302665
-78,0,1.99358961720643,-1.52499478281942,-0.877637461379848,0.414405535550407,-0.03365581494898,0.624692043559635,-0.832402658891314,-0.723028062732401,-0.867099034604054,-0.185632378061498
-79,0,1.6235192049324,-1.52045677356057,0.977828685636029,-0.57502380941392,-0.402617609462035,0.631967959251952,-0.426504420434097,0.480579460496328,0.686338078276468,-0.793812851707889
-80,1,-1.9061964810895,-1.28908450646839,10,10,0.101102136284509,-0.416199695149021,-0.494850987164782,-0.568698448483212,-0.184782382471875,-0.552230498856606
-81,1,-1.12334568706136,-1.43192728687949,-0.202671045004157,-0.138914163603925,-0.937156710796857,-0.116790109384378,-0.094237431941851,-0.896761118553971,-0.183423320636867,0.458624633065419
-82,1,-1.85938009020988,-1.2014277824818,-0.699402902052328,0.801606907908076,0.618074329335756,-0.172568708757076,-0.075693445304373,0.488815268086692,-0.612225386267585,0.515474858015819
-83,1,-1.44593059276162,-1.50738144143115,-0.146467066237161,0.773717178872341,0.498796984960351,-0.015862721592055,0.487162827649467,-0.002016922590367,-0.480395455657278,0.140660394856319
-84,1,-1.5068337349461,-1.39605748721966,0.935273336022611,-0.65840232577507,0.254028615496319,-0.207949363786322,0.494233964181716,0.342544015156094,-0.2790717466048,0.681766781920308
-85,1,1.29459521637362,1.25954745515179,-10,-10,-0.772948300582061,-0.755591080857131,-0.795691897784493,0.140653835392209,-0.160483486922781,0.460920935704452
-86,1,1.04689401512909,1.48899924906156,-0.102806023076495,-0.232256721754397,0.982487312078063,0.220639487969972,0.466108251058299,-0.328239000603224,0.955688285869012,0.98401214247364
-87,1,1.58830474403604,1.70226055213414,-0.400909948872293,-0.887240029691788,-0.796366553971199,-0.189011341359002,-0.984264269832423,0.228539348323108,0.696045037642922,-0.734941166556072
-88,1,1.07001216284605,1.81845698640496,-0.906675421892372,0.643501800272306,0.2964442904515,-0.212339822521429,-0.624947347663644,-0.076505534185115,0.690006945874019,0.603178865697037
-89,1,1.47818853391931,1.1810797217516,-0.777878371782176,0.158700400185078,0.77008386941758,0.318201581494366,-0.577373286340777,0.207915408782256,0.169898207168944,-0.134718349741109
-90,0,-1.39792536337696,1.8903759983709,10,-10,-0.381543623044489,-0.150608604917312,0.940200935058958,-0.260126956593852,0.011178432296195,-0.552646188796202
-91,0,-1.34181919280501,1.37770384290606,-0.273896107346467,0.9218628887177,0.183329714125041,0.794995796775324,0.47034078624241,0.587159127993906,0.656065190534019,0.710378359435155
-92,0,-1.08535749655328,1.25684564483175,-0.093438684660175,0.867363731909897,0.501979335337794,0.929133531466716,0.853038546233495,0.231647371842096,-0.921363933789468,0.9955206665909
-93,0,-1.5078347061732,1.75537297346943,-0.353047628963401,0.686996459628496,0.12650715249212,-0.584157551233493,0.67801198459735,0.130184075673761,-0.541365882749818,0.804095414322346
-94,0,-1.67232665291775,1.91398842184753,-0.055989266428472,0.083972688856283,0.495406878960658,-0.531851511151842,-0.68298755038252,-0.762719341237422,0.044183568378214,0.569492860435106
-95,0,1.52196747373202,-1.81272431584475,-10,10,-0.592427348924565,-0.245215291809175,0.450286805609337,-0.61720080602177,-0.078323806376631,-0.138400199664094
-96,0,1.34277619089321,-1.04264614535854,-0.840523610880692,-0.579768061766314,0.207088065224924,-0.30689024242517,-0.707319832593209,0.067209487208095,-0.219041441615042,0.651618314592841
-97,0,1.72996670685819,-1.26148185356343,-0.071347258910479,-0.571647931880792,0.00248497405952,0.410346123251162,0.294254262248804,0.698018369247902,0.652553267893053,-0.960621219815728
-98,0,1.63679608599505,-1.40483117266873,0.133355343382705,0.785183623637213,0.106494106522641,0.457003384754942,-0.314470768070196,-0.05337112691883,0.86147345141363,-0.770167158107586
-99,0,1.22531932528574,-1.39832123108255,0.751819680541469,0.843477659731268,0.880714646905367,0.20665859661747,-0.85053999542226,0.770244035843202,-0.790477429383416,-0.219373260405667
+index,prop,task,A,B,C,D,E,F,G,H,I,J
+0,0,Task_1,-0.815427620522422,-0.549653782587197,10,10,-0.492825179098274,0.173025977694162,0.598942935224295,-0.298754475196825,-0.581254909010269,-0.110656494210556
+1,0,Task_1,-0.69992853524861,-0.229332112274544,0.955713896876243,0.478117201427488,0.777586191100329,0.655369716778557,0.174914171427966,-0.288498877530604,-0.045316536149489,-0.606586193752411
+2,0,Task_1,-0.368076290946109,-0.969405272021421,0.330140292484796,-0.389505498689912,0.138935265824808,-0.871869282167555,0.37472462048701,0.16418591189513,0.293560701443717,0.285608940220021
+3,0,Task_1,-0.573491802821712,-0.815581340289383,-0.71381302228685,0.968769585007681,0.683148179202864,0.799125092538796,0.309479173526504,0.728052031003468,0.725495580994886,-0.676576302804248
+4,0,Task_1,-0.676358754897705,-0.548282221764748,-0.067783417095816,0.229988549891323,0.582427598044647,0.261947149184825,-0.31573435079735,0.61178122144268,-0.187058216967238,0.871764347690334
+5,0,Task_1,-0.757605167585471,-0.808298008590424,-0.212800982302764,0.915459776146607,-0.263465552591813,-0.666126495988014,-0.195028996490007,-0.237170057680116,-0.933358858596883,-0.19726273171241
+6,0,Task_1,-0.886003489547442,-0.509472491633194,0.972585787901787,-0.872502887185675,0.729110910814452,0.265244787210866,-0.726900973624432,0.248626170107939,0.809004396880265,-0.278494064780479
+7,0,Task_1,-1.02924299329947,-0.618392550297407,0.431645876221063,-0.595223273279383,0.953557069758112,-0.222915219121563,0.257670939076174,0.728927879098318,-0.579783055417687,-0.179960319428095
+8,0,Task_1,-0.502456609281931,-0.196195032500234,0.827091199052693,0.208781482910759,-0.573317187361529,-0.00488758921352,0.341918716034638,-0.292901671356202,-0.34531700628784,0.766920547630073
+9,0,Task_1,-0.517308486454666,-0.58057651993072,0.023965203621816,-0.805412569418484,0.806810139718495,-0.806576263127819,-0.39279977856172,-0.463819363774079,0.16095248005356,-0.553419747131608
+10,0,Task_1,-0.634057125095051,-1.01875520243377,0.916015229666356,0.0275946645157,-0.829507007977635,-0.700063689327201,-0.715601456588714,-0.439799165143527,-0.487241220494887,0.245279267056121
+11,0,Task_1,-0.577778256396397,-0.425744718740636,0.404557718897757,0.175997361062361,0.227383730822478,0.774254652577977,-0.616080996917636,-0.948639007451084,0.954076433375225,-0.497102001172339
+12,0,Task_1,-0.197376045004303,-0.404709510371676,0.906708844886064,-0.592737030373698,0.885229803890949,0.822069297241907,0.204367485562992,0.24610324883505,-0.079476866422163,-0.244006995583434
+13,0,Task_1,-0.766513992210544,-1.03945619108008,-0.284570394188414,-0.198686061574238,-0.168897609541112,-0.92939259112691,0.265899059671913,-0.828727642584781,-0.427453097474168,-0.738368091608883
+14,0,Task_1,-0.301129557074769,-0.466366201816861,0.011556817105957,0.499408314757229,0.253163424774478,0.017645446880421,0.401735167095264,-0.650287617298501,-0.262217482830833,-0.510102120130588
+15,0,Task_1,-0.372562647160274,-0.805363289239018,-0.520161174281201,-0.392478459347396,0.147495839565868,0.780879606474075,-0.281044687799173,-0.148036908135786,-0.208810472224093,0.278961929718128
+16,0,Task_1,-0.573276127254349,-0.843760519080871,0.562803219191695,0.323676061636996,0.490737136410372,-0.95476192699496,0.028603504036769,-0.246295219650507,-0.297736293597739,-0.632473830957653
+17,0,Task_1,-1.08177643161138,-1.08748936331147,0.859669461054104,0.485772819254089,0.268883598825009,0.253553321699552,-0.045743087569395,0.66793403278249,0.308591963919947,0.771084301464027
+18,0,Task_1,-0.121943068431321,-0.937658541363365,-0.118871100462413,0.159000937768132,0.2985428841756,-0.203829205332538,-0.637945695251352,0.658049690810909,0.949529589134008,-0.577812553880056
+19,0,Task_1,-1.06747884637477,-0.842449899007254,0.74037411093045,0.558782660077505,-0.096052126354583,0.529119817360537,0.372190604770144,0.688656466253683,-0.819433165315505,-0.12814415930811
+20,0,Task_1,-0.376355273108791,-0.908946282731397,-10,-10,0.785237349732891,-0.387217730495401,-0.942409218899448,0.160806577297675,-0.723969983661972,-0.452650134415823
+21,0,Task_1,-0.846685755905842,-0.209448772979162,-0.92290734251321,0.465751384219632,-0.81727500527083,-0.182472640926628,0.887526070620356,0.111592883978488,0.386435078880162,-0.440017211221272
+22,0,Task_1,-0.837187625737658,-0.851876882999398,0.28033979546451,0.379365407838544,0.634843008192624,0.371753918780839,-0.611653305369863,0.732567927874185,0.85803611350317,-0.577973441708411
+23,0,Task_1,-0.175842102272502,-0.488461994046914,0.222850898945077,-0.424057088828287,-0.27619426781836,0.616941667680694,-0.696779972923147,0.23612770730498,0.760705889780527,0.34004139732033
+24,0,Task_1,-0.857809192768388,-0.265302164309273,-0.339501197382944,0.898529591365812,-0.881538228231582,0.090728826664301,-0.858395870780934,0.907272331515896,0.160531735619067,0.678911811768841
+25,1,Task_1,-0.585614473203943,0.551068965982618,-0.444576754578563,-0.78871174512572,0.246625773070183,-0.663474018818313,-0.446355552060464,0.750312773134634,-0.98959522970136,-0.150120109840706
+26,1,Task_1,-0.23908209161902,0.897577090052027,-0.388679577334402,-0.708193450791952,0.850310084800308,-0.767256338531612,0.370509317329194,0.387354921015751,0.533160321164986,0.149390212455131
+27,1,Task_1,-0.830137779545391,0.125511796448773,-0.359644680155969,0.682555404147728,-0.53195400936544,0.934101689590862,-0.73600284018832,-0.29960291454053,0.351125596355818,-0.187842884669279
+28,1,Task_1,-0.216184782523012,0.211978905733634,-0.249409157470717,-0.137070024386644,-0.707128531134104,0.944932049234295,0.431233366052987,0.449543990959262,0.912901932280027,0.77394610963827
+29,1,Task_1,-0.282632767106854,0.264519450298051,0.935302642480463,0.833720966523807,0.254167956717343,-0.007922712021391,-0.114776295376767,-0.276042896002242,-0.813098403125419,0.341922052212475
+30,1,Task_1,-0.575451831562744,0.581795291243757,0.425716772255087,0.544174803732763,0.492863854358204,-0.047589791717166,-0.743840790633672,0.525289489060411,0.829611715544936,0.015193221845522
+31,1,Task_1,-0.758220206206948,0.613617353581097,0.12665368551441,0.469705238170149,0.874436248273008,-0.759571175468135,0.310230735619265,-0.80342084374485,-0.462431082486477,-0.407165886759129
+32,1,Task_1,-0.591882107713968,0.146363847316077,-0.731393018031039,0.461102224603009,-0.884528391885322,-0.419893944840798,0.647518214389067,0.688126326408485,0.754656371691603,0.116881923067816
+33,1,Task_1,-0.715219788915433,1.08461646785062,-0.527433424947131,-0.598774697808646,0.113197791601676,-0.50528865259863,0.117572114288939,0.590400320594326,-0.155159386769126,0.354827977413197
+34,1,Task_1,-0.796303564991711,0.501349128362771,-0.818626622405165,-0.029008564510599,0.952315968378468,0.817495784213924,0.182224554845043,-0.01377304364653,-0.26273195293588,-0.859530562808673
+35,1,Task_1,-0.195432956299915,0.260304213033586,0.305218688016626,-0.060885647660027,-0.387903446605514,-0.108064042735465,-0.962980405009682,-0.424289604203511,-0.253442293077285,0.309637368411297
+36,1,Task_1,-1.03051702404215,0.699873963424479,0.54312844740039,0.591372473040837,-0.835367086693457,0.198315253422507,-0.181434739783802,0.636429105754948,0.420628646992331,0.990122364664621
+37,1,Task_1,-0.642471040462047,0.350035500680932,-0.653263607332762,0.359359450868376,0.30131719114182,0.649581794356589,0.942268955633086,0.884659894489377,-0.473171239344398,0.039635066570717
+38,1,Task_1,-0.716592168562474,0.251176558055396,-0.812352457176761,0.219766101590983,-0.65021067790289,0.423621690291556,-0.58865099275791,0.061487886019891,-0.237737474016087,0.641284347380825
+39,1,Task_1,-0.627915139540302,0.522644163585557,-0.829819386940741,-0.345104687573802,0.485166070545119,-0.258839727448056,-0.920615208326881,0.275498215871427,-0.629075534110342,-0.642527887960687
+40,1,Task_1,-0.156663423633232,0.304600082490645,10,-10,0.041519856511361,0.23303461629095,-0.497233246191187,-0.023544587617095,-0.418540837770003,-0.550233932792512
+41,1,Task_2,-0.403555861666686,0.807008471177762,-0.480316474702795,-0.753784710340632,-0.613234235616998,0.167955573662474,0.455636631315042,-0.380227635953206,0.48021383007369,-0.453674929885108
+42,1,Task_2,-0.475033062023491,0.231061734571007,0.310098050913387,-0.835007082906627,0.407580140850853,0.556924247596553,-0.388616604639346,0.60215104751412,-0.984322198098753,-0.996332888983337
+43,1,Task_2,-0.286740191813169,0.871522465291953,-0.898456453446964,-0.797029924245349,0.47491891024478,0.480193220538417,-0.750856163558686,-0.043960372032018,-0.242651391805662,-0.109239061054006
+44,1,Task_2,-0.191548445530409,0.578646221732672,0.571329522934018,-0.677379826379623,0.098396984688832,-0.961599170104035,-0.753922591922157,0.361435891257559,-0.638030455493982,0.404349024843908
+45,1,Task_2,-0.962818439897195,0.190811801399786,0.402433205555268,-0.06801187450078,-0.373089661152032,0.23970878487105,0.416451106643361,-0.50599166271433,-0.88669034806741,0.30364523616443
+46,1,Task_2,-0.885419462203051,0.155312944156919,-0.174925245509766,0.050330391451536,-0.388676795741932,-0.72307604978553,0.646076107724964,-0.00105589909588,0.491928720743773,-0.647995101369186
+47,1,Task_2,-0.634581804798124,0.149519641506344,0.289410761217525,0.48566510896872,0.338684773860801,0.374319581439648,-0.105479014627167,0.004520417892418,0.222862261975939,0.23538363683764
+48,1,Task_2,-0.380429991155736,0.423554740615867,-0.199044563017407,-0.855804112781183,0.947572000564906,0.654939562810152,0.802084131057488,0.010033694468233,0.449766366250574,0.119974134618433
+49,1,Task_2,-0.294345606906633,0.954791580849514,-0.346729568989951,-0.614828863660672,0.578150372001527,-0.697356489908387,-0.272496177427547,-0.326679505363186,0.403185907494623,0.659834986972357
+50,2,Task_1,0.164414245529879,0.270409021456753,-0.470918775433235,-0.165965173767875,-0.118373275802139,-0.804671542299309,-0.273096283874977,0.964676290034708,-0.240786016285174,0.781092750718218
+51,2,Task_1,0.962558187516297,0.353448095106742,-0.546315077724052,-0.263397808061131,0.073416112160648,-0.561584513583351,-0.003812545601594,-0.067901708659798,-0.797337624892413,-0.502494288676279
+52,2,Task_1,0.328218018271649,0.30124689351081,-0.239618151680487,0.281282683112064,-0.122253338243164,-0.416340912422471,-0.302944823763312,0.950697167857575,0.084774348269755,0.245643637478141
+53,2,Task_1,0.157711895115317,0.944426984942688,0.672465261607398,0.963677112876299,-0.732866944741014,0.269879007022312,-0.734121763984793,-0.18475004364869,0.494783604230457,-0.563469688908407
+54,2,Task_1,1.05838672002069,0.573716871491595,0.916674666213795,0.744100669613517,-0.536325680879341,0.745349313896706,-0.608494971121628,-0.036147807131094,0.730097211981708,-0.986020687921255
+55,2,Task_1,0.110836847139862,0.585126999320639,-0.040012375137611,0.248257524389148,-0.795936343325832,-0.755933622220192,0.664943062567423,-0.560825069941966,-0.987328335835364,0.00918182383389
+56,2,Task_1,1.06747271856711,1.07364864476858,-0.75655271526814,-0.433965979475654,-0.925820800763387,0.621204380538264,-0.725355435802351,-0.087195045278291,0.500040007799584,-0.351024070867477
+57,2,Task_1,0.989939601503884,0.247705435387067,0.593670368718185,0.74125415566331,-0.835056311664806,-0.128283340965351,0.795769070113583,0.338062872249377,0.961610282279288,-0.519755961049099
+58,2,Task_1,0.970023710841513,1.01758529653736,-0.917792004629201,-0.224807652067029,0.751172530954049,0.744925497765574,0.054821387540181,-0.268146122719043,-0.373795753322288,-0.023619900695578
+59,2,Task_1,1.04979809782509,0.825205513076737,0.937331444475048,-0.189146596668676,0.726757528139029,0.571196020214809,0.150478496659529,0.716370904753891,0.645947936391794,-0.096512499841381
+60,2,Task_1,0.606201322157722,0.429911059767652,-10,10,0.303748234076738,0.094684069184242,0.846651908762107,0.505710991097032,-0.664846620425076,-0.722934785670171
+61,2,Task_1,0.509318826024622,0.139403752494424,-0.313853456471656,-0.670641690437042,0.337481189036041,-0.695059667580877,0.382512664766648,-0.754051294565859,-0.540912893771664,-0.152736592481289
+62,2,Task_1,0.416414890641425,1.04597850573715,0.746279765035798,0.320667909398266,0.075712278316126,0.557089028326803,-0.314459962457274,-0.091179395352991,-0.712572618352738,-0.862523770264919
+63,2,Task_1,0.236961042862613,0.461540684896611,0.961634242304571,0.99902517180177,0.428576472620752,0.790254229843056,-0.162732148014183,0.057108415575022,0.099625367521191,-0.41779573726667
+64,2,Task_1,0.325246248634948,0.721692503249982,-0.293737994923213,-0.654603713924763,-0.15830470325221,-0.4506171823593,0.106217286056366,-0.250165079508456,-0.598894350859836,-0.860382476004742
+65,2,Task_1,0.258917730833821,0.68064431493967,0.661374709635725,0.335413696048534,0.295408469126627,-0.340725080366546,0.611961227458239,0.53327702260923,-0.960254363897463,0.913251337834092
+66,2,Task_2,1.0502123912678,1.0175241545193,-0.790281335013236,0.372594655247821,-0.940075790261345,0.972106617215367,-0.246874887198155,-0.501544524013033,-0.134947611932188,0.130090806976322
+67,2,Task_2,0.653819704386349,0.899775279158189,-0.600590046972624,0.281621309709353,0.836244003088172,0.56250556179443,-0.244248244001593,0.274273110413607,0.988229164412892,-0.903492892429764
+68,2,Task_2,0.29937357944438,1.01665644266054,0.230397844467249,0.458000795025685,0.160534364807898,0.106760231103633,0.084376336290482,-0.410257096809632,-0.388975913032382,0.233684932760446
+69,2,Task_2,0.631194229943451,0.952468985425419,0.122894112900537,-0.193746425367835,0.602411133999453,-0.895694511099768,0.347280223444287,0.045175117581033,-0.232661771389541,-0.314648785155521
+70,2,Task_2,1.08713454227837,0.656075322649452,0.906027162216176,0.736418182225292,-0.041284854438203,0.308524126840497,0.369205540497406,0.333193031466162,0.98544497734097,-0.253876502721057
+71,2,Task_2,0.139844193498669,0.408310759532142,-0.677661468333469,0.07388223501889,-0.682147267310905,0.024126391992196,0.848946249678909,-0.516253994735439,0.202627425635043,-0.897477249843204
+72,2,Task_2,0.155190443948636,0.269264020133562,0.492431513300772,-0.737330353527688,0.594894327441348,0.805436037154752,-0.912716679245893,-0.390199322338262,-0.735805203184445,-0.05803264345169
+73,2,Task_2,0.71862355058755,0.784351472902302,-0.863821530585294,-0.987125905118183,-0.698190916645222,-0.17859271120364,-0.902497993400075,0.777448050547606,0.03349780154213,0.569802193246196
+74,2,Task_2,0.932100491693559,0.673631604373795,-0.919679036112179,-0.083795023015624,0.492078750634905,-0.102786002654994,0.168000984501864,-0.984910911120671,-0.901017886055053,0.639813560268343
+75,3,Task_1,0.103583721726665,-0.373304248094501,0.107455937171145,-0.854711756750333,0.344969246269787,0.519092986129825,0.410230657805076,-0.91216461269154,0.033943611687528,-0.306643316979961
+76,3,Task_1,0.698578699086034,-0.805397267250048,-0.80848616018294,-0.010443047871684,-0.706296790283886,0.822118261736111,0.163327430772402,0.252786291364115,-0.501338527911191,-0.28349201031843
+77,3,Task_1,0.236887498524042,-0.155242051697718,-0.814416838367815,-0.02940231646999,-0.841428202408144,-0.004586605289201,-0.606434730541928,0.714277316437912,-0.44481897692423,-0.753698456302665
+78,3,Task_1,0.178535729762585,-0.178631521167564,-0.877637461379848,0.414405535550407,-0.03365581494898,0.624692043559635,-0.832402658891314,-0.723028062732401,-0.867099034604054,-0.185632378061498
+79,3,Task_1,0.135466013133272,-1.08706802405113,0.977828685636029,-0.57502380941392,-0.402617609462035,0.631967959251952,-0.426504420434097,0.480579460496328,0.686338078276468,-0.793812851707889
+80,3,Task_1,1.06511564653917,-0.529772758901416,10,10,0.101102136284509,-0.416199695149021,-0.494850987164782,-0.568698448483212,-0.184782382471875,-0.552230498856606
+81,3,Task_1,1.065535288073,-0.706304720574752,-0.202671045004157,-0.138914163603925,-0.937156710796857,-0.116790109384378,-0.094237431941851,-0.896761118553971,-0.183423320636867,0.458624633065419
+82,3,Task_1,0.896291258595646,-0.255636497676642,-0.699402902052328,0.801606907908076,0.618074329335756,-0.172568708757076,-0.075693445304373,0.488815268086692,-0.612225386267585,0.515474858015819
+83,3,Task_1,0.524657824983779,-0.653380201807584,-0.146467066237161,0.773717178872341,0.498796984960351,-0.015862721592055,0.487162827649467,-0.002016922590367,-0.480395455657278,0.140660394856319
+84,3,Task_1,0.131331250236976,-0.217693885103831,0.935273336022611,-0.65840232577507,0.254028615496319,-0.207949363786322,0.494233964181716,0.342544015156094,-0.2790717466048,0.681766781920308
+85,3,Task_1,0.247418413965845,-0.55249563814082,-10,-10,-0.772948300582061,-0.755591080857131,-0.795691897784493,0.140653835392209,-0.160483486922781,0.460920935704452
+86,3,Task_1,0.877892966008508,-0.600861427554399,-0.102806023076495,-0.232256721754397,0.982487312078063,0.220639487969972,0.466108251058299,-0.328239000603224,0.955688285869012,0.98401214247364
+87,3,Task_1,0.353355849981167,-0.384127703150446,-0.400909948872293,-0.887240029691788,-0.796366553971199,-0.189011341359002,-0.984264269832423,0.228539348323108,0.696045037642922,-0.734941166556072
+88,3,Task_1,0.278857275841181,-0.845560468506653,-0.906675421892372,0.643501800272306,0.2964442904515,-0.212339822521429,-0.624947347663644,-0.076505534185115,0.690006945874019,0.603178865697037
+89,3,Task_1,0.36866502316092,-0.854728078950075,-0.777878371782176,0.158700400185078,0.77008386941758,0.318201581494366,-0.577373286340777,0.207915408782256,0.169898207168944,-0.134718349741109
+90,3,Task_1,0.493452674287737,-0.910519988093965,10,-10,-0.381543623044489,-0.150608604917312,0.940200935058958,-0.260126956593852,0.011178432296195,-0.552646188796202
+91,3,Task_2,0.908818256135453,-0.80987547007129,-0.273896107346467,0.9218628887177,0.183329714125041,0.794995796775324,0.47034078624241,0.587159127993906,0.656065190534019,0.710378359435155
+92,3,Task_2,0.996336489766102,-0.975493823251068,-0.093438684660175,0.867363731909897,0.501979335337794,0.929133531466716,0.853038546233495,0.231647371842096,-0.921363933789468,0.9955206665909
+93,3,Task_2,0.657942481588168,-0.245177885637302,-0.353047628963401,0.686996459628496,0.12650715249212,-0.584157551233493,0.67801198459735,0.130184075673761,-0.541365882749818,0.804095414322346
+94,3,Task_2,0.328489621282775,-1.08052040344332,-0.055989266428472,0.083972688856283,0.495406878960658,-0.531851511151842,-0.68298755038252,-0.762719341237422,0.044183568378214,0.569492860435106
+95,3,Task_2,0.253790826276124,-0.935268396370178,-10,10,-0.592427348924565,-0.245215291809175,0.450286805609337,-0.61720080602177,-0.078323806376631,-0.138400199664094
+96,3,Task_2,0.226661731310054,-0.206651604608129,-0.840523610880692,-0.579768061766314,0.207088065224924,-0.30689024242517,-0.707319832593209,0.067209487208095,-0.219041441615042,0.651618314592841
+97,3,Task_2,0.730538042566787,-0.815537451517852,-0.071347258910479,-0.571647931880792,0.00248497405952,0.410346123251162,0.294254262248804,0.698018369247902,0.652553267893053,-0.960621219815728
+98,3,Task_2,1.05315118537755,-0.90842251928343,0.133355343382705,0.785183623637213,0.106494106522641,0.457003384754942,-0.314470768070196,-0.05337112691883,0.86147345141363,-0.770167158107586
+99,3,Task_2,0.453505426891074,-0.509861391994549,0.751819680541469,0.843477659731268,0.880714646905367,0.20665859661747,-0.85053999542226,0.770244035843202,-0.790477429383416,-0.219373260405667
diff --git a/tests/exec_test/classification_max_corr_gen_proj/sisso.json b/tests/exec_test/classification_max_corr_gen_proj/sisso.json
index dc2c7a145a97c5b95a5a7df0c588e4648e1d976b..7f978c23d974a3d8bf07c2710cac52494716e103 100644
--- a/tests/exec_test/classification_max_corr_gen_proj/sisso.json
+++ b/tests/exec_test/classification_max_corr_gen_proj/sisso.json
@@ -11,7 +11,7 @@
     "n_models_store": 1,
     "n_rung_generate": 1,
     "calc_type": "classification",
-    "leave_out_inds": [80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95 ,96 ,97, 98 , 99],
+    "leave_out_inds": [ 2, 3, 4, 6, 21, 23, 30, 38, 39, 52, 53, 61, 76, 82, 83, 45, 47, 48, 49, 66 ],
     "opset": ["add", "sub", "abs_diff", "mult", "div", "inv", "abs", "exp", "log", "sin", "cos", "sq", "cb", "six_pow", "sqrt", "cbrt", "neg_exp"],
     "param_opset" : [],
     "fix_intercept": false
diff --git a/tests/exec_test/gen_proj/check_model.py b/tests/exec_test/gen_proj/check_model.py
index 183a6c1712ceacec2a96d313d41a9f191952bb0d..50da73db80d3fbcd882280532eae51cbb80c1adf 100644
--- a/tests/exec_test/gen_proj/check_model.py
+++ b/tests/exec_test/gen_proj/check_model.py
@@ -4,7 +4,6 @@ from pathlib import Path
 import numpy as np
 
 model = ModelRegressor(
-    str("models/train_dim_2_model_0.dat"), str("models/test_dim_2_model_0.dat")
+    str("models/train_dim_2_model_0.dat")
 )
 assert model.rmse < 1e-4
-assert model.test_rmse < 1e-4
diff --git a/tests/exec_test/gen_proj/sisso.json b/tests/exec_test/gen_proj/sisso.json
index f615931822d0416b89cdd96844fa8765dd58337c..dc65c132bc8d780fa99c16cdf6a25779dc6f0bbe 100644
--- a/tests/exec_test/gen_proj/sisso.json
+++ b/tests/exec_test/gen_proj/sisso.json
@@ -7,10 +7,9 @@
     "data_file_relatice_to_json": true,
     "property_key": "Prop",
     "task_key": "Task",
-    "leave_out_frac": 0.05,
+    "leave_out_frac": 0.0,
     "n_models_store": 1,
     "n_rung_generate": 1,
-    "leave_out_inds": [0, 1, 2, 60, 61],
     "opset": ["add", "sub", "abs_diff", "mult", "div", "inv", "abs", "exp", "log", "sin", "cos", "sq", "cb", "six_pow", "sqrt", "cbrt", "neg_exp"],
     "param_opset": [],
     "fix_intercept": false
diff --git a/tests/exec_test/no_test_data/check_model.py b/tests/exec_test/log_reg_gen_proj/check_model.py
similarity index 63%
rename from tests/exec_test/no_test_data/check_model.py
rename to tests/exec_test/log_reg_gen_proj/check_model.py
index 50da73db80d3fbcd882280532eae51cbb80c1adf..1e429966786af35cb64c3179d674daedbf5a40bb 100644
--- a/tests/exec_test/no_test_data/check_model.py
+++ b/tests/exec_test/log_reg_gen_proj/check_model.py
@@ -1,9 +1,9 @@
-from sissopp import ModelRegressor
+from sissopp import ModelLogRegressor
 from pathlib import Path
 
 import numpy as np
 
-model = ModelRegressor(
+model = ModelLogRegressor(
     str("models/train_dim_2_model_0.dat")
 )
 assert model.rmse < 1e-4
diff --git a/tests/exec_test/log_reg_gen_proj/data.csv b/tests/exec_test/log_reg_gen_proj/data.csv
new file mode 100644
index 0000000000000000000000000000000000000000..ca8bf5a5dba405754688a54146c85f0ccaa8249b
--- /dev/null
+++ b/tests/exec_test/log_reg_gen_proj/data.csv
@@ -0,0 +1,101 @@
+Sample,Prop,A,B,C,D
+1,0.042644585721321,49.8070975337169,307.029288124149,127.290160977898,119.363921090366
+2,0.051286552544473,207.151687934325,385.803129282962,232.077642053051,152.451146796233
+3,0.115456483870835,259.115243485315,350.89565982664,126.112015312861,194.269240170231
+4,15.1980039696031,341.392794952748,2.50119646241174,370.843032180154,11.5134267982876
+5,0.030828548720774,123.642359657426,107.674457089161,278.481606674966,277.417740655869
+6,0.113310081828946,434.902594566398,405.870579391739,241.257457440821,213.210004173378
+7,0.071973103667228,393.51387804721,243.422246542275,98.6921352953811,38.4646528894646
+8,0.079124869408332,454.177123336891,312.18007412968,365.093447826501,93.4027537695288
+9,0.084392098447046,262.904172099227,476.075762264071,95.6336322308505,269.096484900607
+10,0.044467510536784,447.85350715457,423.476173355266,422.140647530134,105.30966798588
+11,0.032607637392685,156.982834162006,216.937871068582,436.272542198989,438.578448993111
+12,0.026590761658031,30.4947043445939,353.866638985695,131.530432627816,103.578990316735
+13,0.882677326292512,324.183123925377,379.786690798287,93.7301886485703,440.061356079183
+14,2.92863264703151,253.756388912426,191.488748047164,69.2744710777569,166.908183251839
+15,0.041178758470398,433.425208331938,422.012179205648,138.480233516687,94.7864081755759
+16,0.02951523999224,364.325349150667,478.127595998362,85.6428392655327,93.6000841896524
+17,0.103881442342912,286.280745819571,226.720331942159,138.891298487121,78.0909759774124
+18,0.079447386532567,184.446053206631,428.401650931551,356.108594859543,238.138089889099
+19,0.490048714715941,437.097165477292,89.4506321427452,357.249495038998,21.2823078127849
+20,0.315083955685341,464.689646103278,357.8239998639,166.719192215288,470.323737680847
+21,0.336628829642389,453.229393345602,121.062956838254,368.072231287668,33.8834258163608
+22,0.596250795257965,329.206703754852,336.263700827181,259.138467352137,407.462821099063
+23,0.024008996387738,429.499336054639,451.30307811156,253.003258066353,11.9533585885846
+24,0.702221733754583,293.874006420616,221.692047223248,42.2106293777491,277.645795523081
+25,0.807995736802549,276.103314658051,161.778514482533,426.942506234502,208.868975590116
+26,0.002730384017923,83.1768893799547,13.0356263432169,197.449738789384,355.96703006125
+27,0.034760234337076,239.859835781859,226.950754683377,422.533852762138,472.823756482285
+28,0.103325140788876,142.348449069038,308.188608196991,19.8589904106916,445.782867882951
+29,0.069515399570576,325.065695936087,341.202112577236,280.094904922382,126.698479235295
+30,0.026792086484689,105.496640506796,431.709809945209,201.067642207524,125.346044992663
+31,0.06968520346432,278.288015848546,340.463606947326,62.5783439753465,135.763786890355
+32,0.033814657993073,314.022680967248,366.997527539967,146.288622368974,52.3761667809278
+33,0.036437360543752,182.544295340408,74.6987790052084,111.509818170011,241.023451399827
+34,0.007278291999309,161.331981931619,108.016457758383,331.063276457351,498.833301187792
+35,0.063162819962502,489.309265600451,424.263298255808,159.148021054163,150.645927662812
+36,0.084687250049346,268.778111170417,12.2088368874704,283.50778492837,126.160505819305
+37,0.193238168342884,356.429298308577,16.8790201937628,79.0256508718712,105.78215423975
+38,0.002398998011824,18.700576586924,54.7261653842681,148.691281396519,365.033771980965
+39,0.153459585676185,294.597161844894,103.910596069842,345.840575142828,208.072222213417
+40,0.008244336063986,284.63690286644,15.4716735442082,102.064258484033,407.319217444755
+41,0.038860594113166,242.474343916146,489.029235376506,398.649900719263,182.88700899363
+42,0.068119481955482,288.653567920823,104.182871116269,70.369683070715,260.771564816744
+43,0.645977325788857,285.468431709879,261.088369157819,493.106944180385,321.625679649568
+44,0.056318139669975,180.087855188098,72.886342824395,103.167691072242,204.56315967502
+45,4.03890609225025,198.253697705946,243.375043815026,348.789155694472,222.634358725914
+46,0.050956703375414,66.8423643063199,65.3676810191975,496.070776260521,158.342895817271
+47,7.71218334170515,395.439636236816,230.35515336685,64.9645947685536,211.908463158588
+48,0.0214328895491,176.169330689589,464.950315259736,452.14257500201,81.9464407280035
+49,2.6815357030127,431.955781915035,308.66349882149,253.874224338872,343.837415858124
+50,0.750972167059506,224.327882299282,147.30201835669,243.603529830416,103.105383948355
+51,0.01828936588316,249.505131204334,114.794245180025,129.495085479016,408.195718175082
+52,0.323325638910123,159.141681749234,106.854744390776,18.9019889656962,51.4307457794658
+53,0.033770961308791,413.581373474192,154.408120689668,446.571947766101,435.968644308603
+54,0.318264739192304,346.575691958707,59.8626851714252,255.289629539941,132.392312946497
+55,0.543889052259968,202.50018301682,224.87814578565,179.824036025239,168.045578982369
+56,0.014813028268087,317.664906479146,95.6823941971786,410.391814113528,449.004998805183
+57,3.57717288413323,275.990345747377,445.847337469333,364.60493511239,475.71306458261
+58,0.129900583191695,9.14924177359304,119.375334020565,113.361398805488,62.8300931374718
+59,0.03708937574487,185.837293865833,447.695352621602,145.051135814477,160.694140439915
+60,0.014921382861101,8.28959490938569,378.093007577705,419.707558105521,40.4024046726325
+61,0.387351324110643,407.707347482472,350.199800459508,105.74924661113,253.972219444862
+62,0.101466855479979,267.424013298673,386.705098956208,106.841281052243,212.004932409899
+63,1.32736567001172,352.225865613491,171.568875244562,466.911865424454,130.806654602562
+64,0.028663161770634,200.509116400279,185.054151659821,169.780882949394,426.355265958408
+65,0.066319972191657,393.903835973502,34.1090298843032,395.993284870997,201.465690562591
+66,0.225228846252684,416.977891536474,469.286103651892,451.235879905236,329.368398152194
+67,0.193954759220393,161.654732137163,117.19778787807,386.605589800927,191.350615565409
+68,0.096588543650903,207.153853909472,281.333900278231,463.836665364427,431.036779599936
+69,0.045009258705634,363.180221118644,60.3413214970106,282.709764209504,263.181135410842
+70,0.148522003805379,337.475883356403,345.712927332466,287.038267897121,198.123437087294
+71,0.233773563579686,417.561141951304,416.699554349938,87.471372642944,284.442500963842
+72,0.07632953740424,353.462065059428,230.244117781226,205.862394545593,418.718162036753
+73,0.018751232608648,211.936255629827,38.9153036995522,335.407375291207,269.160151304954
+74,0.11436186782933,232.953194819263,75.6969339038158,145.359684234586,179.190191924589
+75,0.059780123967251,69.2355001498863,279.298667202486,382.963222385851,434.84703056805
+76,0.039466512803061,423.250211589563,470.623951042526,385.799699148574,127.024253733708
+77,0.012044591018025,208.006274730239,51.9283654736177,256.955959532126,347.239249164622
+78,0.02529290143529,104.714857532854,304.9501964971,31.8389612251596,37.8811131918349
+79,0.004302255370929,177.067793973296,28.3724394563165,331.479931132225,461.564224843564
+80,0.074133627013002,158.134705180115,315.976143027337,278.395310237464,484.311241814557
+81,0.160836802038602,410.048253861679,191.235924936736,470.978702738376,322.209828240428
+82,2.01269810172184,405.219860759387,382.357826642821,340.408900681332,340.037356281025
+83,0.086463232833669,465.637225155392,273.935550533945,471.461401666282,69.4129300775254
+84,0.09652399673283,484.793856217189,217.858512463092,317.765365776244,30.5574828436997
+85,25.7254804725824,82.523989196792,270.839083225059,174.362686576973,263.842713219115
+86,3123.2782662272,106.094612132738,332.523075165983,43.7520305984393,333.205085763281
+87,0.024585156837383,445.981783633524,66.2298255679999,373.905334245721,377.150351563984
+88,0.018205881117995,117.196993636293,487.277963654347,133.617747939771,85.6568366996931
+89,0.188996142204433,479.415773481536,87.8379455379654,429.859856289863,204.165337443809
+90,0.030687409853583,324.976847605473,470.960628473335,126.942577378075,106.983488589713
+91,0.084751033512141,234.705200183164,155.730168172949,303.535045799807,16.2640518096776
+92,0.019959943675665,342.417666791063,8.85018592692246,287.691069761052,283.171709994183
+93,0.017102517653105,162.236891091598,81.3657842047582,456.824872475328,318.421521204707
+94,0.070128975496401,23.8513212365969,333.533751731515,196.322141219655,479.082443413362
+95,0.006819280465836,151.095020172981,37.8615330871062,227.034115691539,362.755341229094
+96,0.009406739334267,32.3147853544107,113.549008148151,57.9053526795503,348.470092929769
+97,0.026975504059684,53.4863829718894,186.176691450922,56.7764485836142,371.957528066938
+98,711.780196017338,7.36908228123383,488.268102038773,252.670421142526,486.698386703915
+99,2.82416953708447,183.480683022873,368.758162863328,2.80315566798772,340.166678214112
+100,0.3850029324009,482.322626243789,100.05861510573,495.618985557607,17.9764839910466
diff --git a/tests/exec_test/no_test_data/sisso.json b/tests/exec_test/log_reg_gen_proj/sisso.json
similarity index 71%
rename from tests/exec_test/no_test_data/sisso.json
rename to tests/exec_test/log_reg_gen_proj/sisso.json
index 7dcb348824039c5944525d15a2646c2357abf6b4..dbd02b05e1c7d609fe9bf395422fd69c6d699b1c 100644
--- a/tests/exec_test/no_test_data/sisso.json
+++ b/tests/exec_test/log_reg_gen_proj/sisso.json
@@ -1,14 +1,15 @@
 {
     "desc_dim": 2,
     "n_sis_select": 1,
-    "max_rung": 2,
+    "max_rung": 1,
     "n_residual": 1,
-    "data_file": "../data.csv",
+    "data_file": "data.csv",
     "data_file_relatice_to_json": true,
     "property_key": "Prop",
-    "task_key": "Task",
-    "leave_out_frac": 0.00,
     "n_models_store": 1,
+    "n_rung_generate": 1,
+    "calc_type": "log_regression",
     "opset": ["add", "sub", "abs_diff", "mult", "div", "inv", "abs", "exp", "log", "sin", "cos", "sq", "cb", "six_pow", "sqrt", "cbrt", "neg_exp"],
+    "param_opset": [],
     "fix_intercept": false
 }
diff --git a/tests/exec_test/log_reg_max_corr/check_model.py b/tests/exec_test/log_reg_max_corr/check_model.py
new file mode 100644
index 0000000000000000000000000000000000000000..c5cdbe83019b9eaea840c312ad60630f00a84896
--- /dev/null
+++ b/tests/exec_test/log_reg_max_corr/check_model.py
@@ -0,0 +1,11 @@
+from sissopp import ModelLogRegressor
+from pathlib import Path
+
+import numpy as np
+
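+# Check that the saved train and test model files reproduce the data to within the 1e-4 RMSE tolerance.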
+model = ModelLogRegressor(
+    str("models/train_dim_2_model_0.dat"), str("models/test_dim_2_model_0.dat")
+)
+assert model.rmse < 1e-4
+assert model.test_rmse < 1e-4
diff --git a/tests/exec_test/log_reg_max_corr/data.csv b/tests/exec_test/log_reg_max_corr/data.csv
new file mode 100644
index 0000000000000000000000000000000000000000..ca8bf5a5dba405754688a54146c85f0ccaa8249b
--- /dev/null
+++ b/tests/exec_test/log_reg_max_corr/data.csv
@@ -0,0 +1,101 @@
+Sample,Prop,A,B,C,D
+1,0.042644585721321,49.8070975337169,307.029288124149,127.290160977898,119.363921090366
+2,0.051286552544473,207.151687934325,385.803129282962,232.077642053051,152.451146796233
+3,0.115456483870835,259.115243485315,350.89565982664,126.112015312861,194.269240170231
+4,15.1980039696031,341.392794952748,2.50119646241174,370.843032180154,11.5134267982876
+5,0.030828548720774,123.642359657426,107.674457089161,278.481606674966,277.417740655869
+6,0.113310081828946,434.902594566398,405.870579391739,241.257457440821,213.210004173378
+7,0.071973103667228,393.51387804721,243.422246542275,98.6921352953811,38.4646528894646
+8,0.079124869408332,454.177123336891,312.18007412968,365.093447826501,93.4027537695288
+9,0.084392098447046,262.904172099227,476.075762264071,95.6336322308505,269.096484900607
+10,0.044467510536784,447.85350715457,423.476173355266,422.140647530134,105.30966798588
+11,0.032607637392685,156.982834162006,216.937871068582,436.272542198989,438.578448993111
+12,0.026590761658031,30.4947043445939,353.866638985695,131.530432627816,103.578990316735
+13,0.882677326292512,324.183123925377,379.786690798287,93.7301886485703,440.061356079183
+14,2.92863264703151,253.756388912426,191.488748047164,69.2744710777569,166.908183251839
+15,0.041178758470398,433.425208331938,422.012179205648,138.480233516687,94.7864081755759
+16,0.02951523999224,364.325349150667,478.127595998362,85.6428392655327,93.6000841896524
+17,0.103881442342912,286.280745819571,226.720331942159,138.891298487121,78.0909759774124
+18,0.079447386532567,184.446053206631,428.401650931551,356.108594859543,238.138089889099
+19,0.490048714715941,437.097165477292,89.4506321427452,357.249495038998,21.2823078127849
+20,0.315083955685341,464.689646103278,357.8239998639,166.719192215288,470.323737680847
+21,0.336628829642389,453.229393345602,121.062956838254,368.072231287668,33.8834258163608
+22,0.596250795257965,329.206703754852,336.263700827181,259.138467352137,407.462821099063
+23,0.024008996387738,429.499336054639,451.30307811156,253.003258066353,11.9533585885846
+24,0.702221733754583,293.874006420616,221.692047223248,42.2106293777491,277.645795523081
+25,0.807995736802549,276.103314658051,161.778514482533,426.942506234502,208.868975590116
+26,0.002730384017923,83.1768893799547,13.0356263432169,197.449738789384,355.96703006125
+27,0.034760234337076,239.859835781859,226.950754683377,422.533852762138,472.823756482285
+28,0.103325140788876,142.348449069038,308.188608196991,19.8589904106916,445.782867882951
+29,0.069515399570576,325.065695936087,341.202112577236,280.094904922382,126.698479235295
+30,0.026792086484689,105.496640506796,431.709809945209,201.067642207524,125.346044992663
+31,0.06968520346432,278.288015848546,340.463606947326,62.5783439753465,135.763786890355
+32,0.033814657993073,314.022680967248,366.997527539967,146.288622368974,52.3761667809278
+33,0.036437360543752,182.544295340408,74.6987790052084,111.509818170011,241.023451399827
+34,0.007278291999309,161.331981931619,108.016457758383,331.063276457351,498.833301187792
+35,0.063162819962502,489.309265600451,424.263298255808,159.148021054163,150.645927662812
+36,0.084687250049346,268.778111170417,12.2088368874704,283.50778492837,126.160505819305
+37,0.193238168342884,356.429298308577,16.8790201937628,79.0256508718712,105.78215423975
+38,0.002398998011824,18.700576586924,54.7261653842681,148.691281396519,365.033771980965
+39,0.153459585676185,294.597161844894,103.910596069842,345.840575142828,208.072222213417
+40,0.008244336063986,284.63690286644,15.4716735442082,102.064258484033,407.319217444755
+41,0.038860594113166,242.474343916146,489.029235376506,398.649900719263,182.88700899363
+42,0.068119481955482,288.653567920823,104.182871116269,70.369683070715,260.771564816744
+43,0.645977325788857,285.468431709879,261.088369157819,493.106944180385,321.625679649568
+44,0.056318139669975,180.087855188098,72.886342824395,103.167691072242,204.56315967502
+45,4.03890609225025,198.253697705946,243.375043815026,348.789155694472,222.634358725914
+46,0.050956703375414,66.8423643063199,65.3676810191975,496.070776260521,158.342895817271
+47,7.71218334170515,395.439636236816,230.35515336685,64.9645947685536,211.908463158588
+48,0.0214328895491,176.169330689589,464.950315259736,452.14257500201,81.9464407280035
+49,2.6815357030127,431.955781915035,308.66349882149,253.874224338872,343.837415858124
+50,0.750972167059506,224.327882299282,147.30201835669,243.603529830416,103.105383948355
+51,0.01828936588316,249.505131204334,114.794245180025,129.495085479016,408.195718175082
+52,0.323325638910123,159.141681749234,106.854744390776,18.9019889656962,51.4307457794658
+53,0.033770961308791,413.581373474192,154.408120689668,446.571947766101,435.968644308603
+54,0.318264739192304,346.575691958707,59.8626851714252,255.289629539941,132.392312946497
+55,0.543889052259968,202.50018301682,224.87814578565,179.824036025239,168.045578982369
+56,0.014813028268087,317.664906479146,95.6823941971786,410.391814113528,449.004998805183
+57,3.57717288413323,275.990345747377,445.847337469333,364.60493511239,475.71306458261
+58,0.129900583191695,9.14924177359304,119.375334020565,113.361398805488,62.8300931374718
+59,0.03708937574487,185.837293865833,447.695352621602,145.051135814477,160.694140439915
+60,0.014921382861101,8.28959490938569,378.093007577705,419.707558105521,40.4024046726325
+61,0.387351324110643,407.707347482472,350.199800459508,105.74924661113,253.972219444862
+62,0.101466855479979,267.424013298673,386.705098956208,106.841281052243,212.004932409899
+63,1.32736567001172,352.225865613491,171.568875244562,466.911865424454,130.806654602562
+64,0.028663161770634,200.509116400279,185.054151659821,169.780882949394,426.355265958408
+65,0.066319972191657,393.903835973502,34.1090298843032,395.993284870997,201.465690562591
+66,0.225228846252684,416.977891536474,469.286103651892,451.235879905236,329.368398152194
+67,0.193954759220393,161.654732137163,117.19778787807,386.605589800927,191.350615565409
+68,0.096588543650903,207.153853909472,281.333900278231,463.836665364427,431.036779599936
+69,0.045009258705634,363.180221118644,60.3413214970106,282.709764209504,263.181135410842
+70,0.148522003805379,337.475883356403,345.712927332466,287.038267897121,198.123437087294
+71,0.233773563579686,417.561141951304,416.699554349938,87.471372642944,284.442500963842
+72,0.07632953740424,353.462065059428,230.244117781226,205.862394545593,418.718162036753
+73,0.018751232608648,211.936255629827,38.9153036995522,335.407375291207,269.160151304954
+74,0.11436186782933,232.953194819263,75.6969339038158,145.359684234586,179.190191924589
+75,0.059780123967251,69.2355001498863,279.298667202486,382.963222385851,434.84703056805
+76,0.039466512803061,423.250211589563,470.623951042526,385.799699148574,127.024253733708
+77,0.012044591018025,208.006274730239,51.9283654736177,256.955959532126,347.239249164622
+78,0.02529290143529,104.714857532854,304.9501964971,31.8389612251596,37.8811131918349
+79,0.004302255370929,177.067793973296,28.3724394563165,331.479931132225,461.564224843564
+80,0.074133627013002,158.134705180115,315.976143027337,278.395310237464,484.311241814557
+81,0.160836802038602,410.048253861679,191.235924936736,470.978702738376,322.209828240428
+82,2.01269810172184,405.219860759387,382.357826642821,340.408900681332,340.037356281025
+83,0.086463232833669,465.637225155392,273.935550533945,471.461401666282,69.4129300775254
+84,0.09652399673283,484.793856217189,217.858512463092,317.765365776244,30.5574828436997
+85,25.7254804725824,82.523989196792,270.839083225059,174.362686576973,263.842713219115
+86,3123.2782662272,106.094612132738,332.523075165983,43.7520305984393,333.205085763281
+87,0.024585156837383,445.981783633524,66.2298255679999,373.905334245721,377.150351563984
+88,0.018205881117995,117.196993636293,487.277963654347,133.617747939771,85.6568366996931
+89,0.188996142204433,479.415773481536,87.8379455379654,429.859856289863,204.165337443809
+90,0.030687409853583,324.976847605473,470.960628473335,126.942577378075,106.983488589713
+91,0.084751033512141,234.705200183164,155.730168172949,303.535045799807,16.2640518096776
+92,0.019959943675665,342.417666791063,8.85018592692246,287.691069761052,283.171709994183
+93,0.017102517653105,162.236891091598,81.3657842047582,456.824872475328,318.421521204707
+94,0.070128975496401,23.8513212365969,333.533751731515,196.322141219655,479.082443413362
+95,0.006819280465836,151.095020172981,37.8615330871062,227.034115691539,362.755341229094
+96,0.009406739334267,32.3147853544107,113.549008148151,57.9053526795503,348.470092929769
+97,0.026975504059684,53.4863829718894,186.176691450922,56.7764485836142,371.957528066938
+98,711.780196017338,7.36908228123383,488.268102038773,252.670421142526,486.698386703915
+99,2.82416953708447,183.480683022873,368.758162863328,2.80315566798772,340.166678214112
+100,0.3850029324009,482.322626243789,100.05861510573,495.618985557607,17.9764839910466
diff --git a/tests/exec_test/log_reg_max_corr/sisso.json b/tests/exec_test/log_reg_max_corr/sisso.json
new file mode 100644
index 0000000000000000000000000000000000000000..c0b3ea6f876d4829a334fd692b5ce8f242573440
--- /dev/null
+++ b/tests/exec_test/log_reg_max_corr/sisso.json
@@ -0,0 +1,17 @@
+{
+    "desc_dim": 2,
+    "n_sis_select": 1,
+    "max_rung": 1,
+    "n_residual": 1,
+    "data_file": "data.csv",
+    "data_file_relatice_to_json": true,
+    "property_key": "Prop",
+    "leave_out_frac": 0.05,
+    "n_models_store": 1,
+    "calc_type": "log_regression",
+    "max_feat_cross_correlation": 0.99,
+    "leave_out_inds": [0, 1, 2, 60, 61],
+    "opset": ["add", "sub", "abs_diff", "mult", "div", "inv", "abs", "exp", "log", "sin", "cos", "sq", "cb", "six_pow", "sqrt", "cbrt", "neg_exp"],
+    "param_opset": [],
+    "fix_intercept": false
+}
diff --git a/tests/exec_test/log_reg_max_corr_gen_proj/check_model.py b/tests/exec_test/log_reg_max_corr_gen_proj/check_model.py
new file mode 100644
index 0000000000000000000000000000000000000000..c5cdbe83019b9eaea840c312ad60630f00a84896
--- /dev/null
+++ b/tests/exec_test/log_reg_max_corr_gen_proj/check_model.py
@@ -0,0 +1,11 @@
+from sissopp import ModelLogRegressor
+from pathlib import Path
+
+import numpy as np
+
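+# Reload the stored model together with its held-out test file; both errors should be negligible.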
+model = ModelLogRegressor(
+    str("models/train_dim_2_model_0.dat"), str("models/test_dim_2_model_0.dat")
+)
+assert model.rmse < 1e-4
+assert model.test_rmse < 1e-4
diff --git a/tests/exec_test/log_reg_max_corr_gen_proj/data.csv b/tests/exec_test/log_reg_max_corr_gen_proj/data.csv
new file mode 100644
index 0000000000000000000000000000000000000000..ca8bf5a5dba405754688a54146c85f0ccaa8249b
--- /dev/null
+++ b/tests/exec_test/log_reg_max_corr_gen_proj/data.csv
@@ -0,0 +1,101 @@
+Sample,Prop,A,B,C,D
+1,0.042644585721321,49.8070975337169,307.029288124149,127.290160977898,119.363921090366
+2,0.051286552544473,207.151687934325,385.803129282962,232.077642053051,152.451146796233
+3,0.115456483870835,259.115243485315,350.89565982664,126.112015312861,194.269240170231
+4,15.1980039696031,341.392794952748,2.50119646241174,370.843032180154,11.5134267982876
+5,0.030828548720774,123.642359657426,107.674457089161,278.481606674966,277.417740655869
+6,0.113310081828946,434.902594566398,405.870579391739,241.257457440821,213.210004173378
+7,0.071973103667228,393.51387804721,243.422246542275,98.6921352953811,38.4646528894646
+8,0.079124869408332,454.177123336891,312.18007412968,365.093447826501,93.4027537695288
+9,0.084392098447046,262.904172099227,476.075762264071,95.6336322308505,269.096484900607
+10,0.044467510536784,447.85350715457,423.476173355266,422.140647530134,105.30966798588
+11,0.032607637392685,156.982834162006,216.937871068582,436.272542198989,438.578448993111
+12,0.026590761658031,30.4947043445939,353.866638985695,131.530432627816,103.578990316735
+13,0.882677326292512,324.183123925377,379.786690798287,93.7301886485703,440.061356079183
+14,2.92863264703151,253.756388912426,191.488748047164,69.2744710777569,166.908183251839
+15,0.041178758470398,433.425208331938,422.012179205648,138.480233516687,94.7864081755759
+16,0.02951523999224,364.325349150667,478.127595998362,85.6428392655327,93.6000841896524
+17,0.103881442342912,286.280745819571,226.720331942159,138.891298487121,78.0909759774124
+18,0.079447386532567,184.446053206631,428.401650931551,356.108594859543,238.138089889099
+19,0.490048714715941,437.097165477292,89.4506321427452,357.249495038998,21.2823078127849
+20,0.315083955685341,464.689646103278,357.8239998639,166.719192215288,470.323737680847
+21,0.336628829642389,453.229393345602,121.062956838254,368.072231287668,33.8834258163608
+22,0.596250795257965,329.206703754852,336.263700827181,259.138467352137,407.462821099063
+23,0.024008996387738,429.499336054639,451.30307811156,253.003258066353,11.9533585885846
+24,0.702221733754583,293.874006420616,221.692047223248,42.2106293777491,277.645795523081
+25,0.807995736802549,276.103314658051,161.778514482533,426.942506234502,208.868975590116
+26,0.002730384017923,83.1768893799547,13.0356263432169,197.449738789384,355.96703006125
+27,0.034760234337076,239.859835781859,226.950754683377,422.533852762138,472.823756482285
+28,0.103325140788876,142.348449069038,308.188608196991,19.8589904106916,445.782867882951
+29,0.069515399570576,325.065695936087,341.202112577236,280.094904922382,126.698479235295
+30,0.026792086484689,105.496640506796,431.709809945209,201.067642207524,125.346044992663
+31,0.06968520346432,278.288015848546,340.463606947326,62.5783439753465,135.763786890355
+32,0.033814657993073,314.022680967248,366.997527539967,146.288622368974,52.3761667809278
+33,0.036437360543752,182.544295340408,74.6987790052084,111.509818170011,241.023451399827
+34,0.007278291999309,161.331981931619,108.016457758383,331.063276457351,498.833301187792
+35,0.063162819962502,489.309265600451,424.263298255808,159.148021054163,150.645927662812
+36,0.084687250049346,268.778111170417,12.2088368874704,283.50778492837,126.160505819305
+37,0.193238168342884,356.429298308577,16.8790201937628,79.0256508718712,105.78215423975
+38,0.002398998011824,18.700576586924,54.7261653842681,148.691281396519,365.033771980965
+39,0.153459585676185,294.597161844894,103.910596069842,345.840575142828,208.072222213417
+40,0.008244336063986,284.63690286644,15.4716735442082,102.064258484033,407.319217444755
+41,0.038860594113166,242.474343916146,489.029235376506,398.649900719263,182.88700899363
+42,0.068119481955482,288.653567920823,104.182871116269,70.369683070715,260.771564816744
+43,0.645977325788857,285.468431709879,261.088369157819,493.106944180385,321.625679649568
+44,0.056318139669975,180.087855188098,72.886342824395,103.167691072242,204.56315967502
+45,4.03890609225025,198.253697705946,243.375043815026,348.789155694472,222.634358725914
+46,0.050956703375414,66.8423643063199,65.3676810191975,496.070776260521,158.342895817271
+47,7.71218334170515,395.439636236816,230.35515336685,64.9645947685536,211.908463158588
+48,0.0214328895491,176.169330689589,464.950315259736,452.14257500201,81.9464407280035
+49,2.6815357030127,431.955781915035,308.66349882149,253.874224338872,343.837415858124
+50,0.750972167059506,224.327882299282,147.30201835669,243.603529830416,103.105383948355
+51,0.01828936588316,249.505131204334,114.794245180025,129.495085479016,408.195718175082
+52,0.323325638910123,159.141681749234,106.854744390776,18.9019889656962,51.4307457794658
+53,0.033770961308791,413.581373474192,154.408120689668,446.571947766101,435.968644308603
+54,0.318264739192304,346.575691958707,59.8626851714252,255.289629539941,132.392312946497
+55,0.543889052259968,202.50018301682,224.87814578565,179.824036025239,168.045578982369
+56,0.014813028268087,317.664906479146,95.6823941971786,410.391814113528,449.004998805183
+57,3.57717288413323,275.990345747377,445.847337469333,364.60493511239,475.71306458261
+58,0.129900583191695,9.14924177359304,119.375334020565,113.361398805488,62.8300931374718
+59,0.03708937574487,185.837293865833,447.695352621602,145.051135814477,160.694140439915
+60,0.014921382861101,8.28959490938569,378.093007577705,419.707558105521,40.4024046726325
+61,0.387351324110643,407.707347482472,350.199800459508,105.74924661113,253.972219444862
+62,0.101466855479979,267.424013298673,386.705098956208,106.841281052243,212.004932409899
+63,1.32736567001172,352.225865613491,171.568875244562,466.911865424454,130.806654602562
+64,0.028663161770634,200.509116400279,185.054151659821,169.780882949394,426.355265958408
+65,0.066319972191657,393.903835973502,34.1090298843032,395.993284870997,201.465690562591
+66,0.225228846252684,416.977891536474,469.286103651892,451.235879905236,329.368398152194
+67,0.193954759220393,161.654732137163,117.19778787807,386.605589800927,191.350615565409
+68,0.096588543650903,207.153853909472,281.333900278231,463.836665364427,431.036779599936
+69,0.045009258705634,363.180221118644,60.3413214970106,282.709764209504,263.181135410842
+70,0.148522003805379,337.475883356403,345.712927332466,287.038267897121,198.123437087294
+71,0.233773563579686,417.561141951304,416.699554349938,87.471372642944,284.442500963842
+72,0.07632953740424,353.462065059428,230.244117781226,205.862394545593,418.718162036753
+73,0.018751232608648,211.936255629827,38.9153036995522,335.407375291207,269.160151304954
+74,0.11436186782933,232.953194819263,75.6969339038158,145.359684234586,179.190191924589
+75,0.059780123967251,69.2355001498863,279.298667202486,382.963222385851,434.84703056805
+76,0.039466512803061,423.250211589563,470.623951042526,385.799699148574,127.024253733708
+77,0.012044591018025,208.006274730239,51.9283654736177,256.955959532126,347.239249164622
+78,0.02529290143529,104.714857532854,304.9501964971,31.8389612251596,37.8811131918349
+79,0.004302255370929,177.067793973296,28.3724394563165,331.479931132225,461.564224843564
+80,0.074133627013002,158.134705180115,315.976143027337,278.395310237464,484.311241814557
+81,0.160836802038602,410.048253861679,191.235924936736,470.978702738376,322.209828240428
+82,2.01269810172184,405.219860759387,382.357826642821,340.408900681332,340.037356281025
+83,0.086463232833669,465.637225155392,273.935550533945,471.461401666282,69.4129300775254
+84,0.09652399673283,484.793856217189,217.858512463092,317.765365776244,30.5574828436997
+85,25.7254804725824,82.523989196792,270.839083225059,174.362686576973,263.842713219115
+86,3123.2782662272,106.094612132738,332.523075165983,43.7520305984393,333.205085763281
+87,0.024585156837383,445.981783633524,66.2298255679999,373.905334245721,377.150351563984
+88,0.018205881117995,117.196993636293,487.277963654347,133.617747939771,85.6568366996931
+89,0.188996142204433,479.415773481536,87.8379455379654,429.859856289863,204.165337443809
+90,0.030687409853583,324.976847605473,470.960628473335,126.942577378075,106.983488589713
+91,0.084751033512141,234.705200183164,155.730168172949,303.535045799807,16.2640518096776
+92,0.019959943675665,342.417666791063,8.85018592692246,287.691069761052,283.171709994183
+93,0.017102517653105,162.236891091598,81.3657842047582,456.824872475328,318.421521204707
+94,0.070128975496401,23.8513212365969,333.533751731515,196.322141219655,479.082443413362
+95,0.006819280465836,151.095020172981,37.8615330871062,227.034115691539,362.755341229094
+96,0.009406739334267,32.3147853544107,113.549008148151,57.9053526795503,348.470092929769
+97,0.026975504059684,53.4863829718894,186.176691450922,56.7764485836142,371.957528066938
+98,711.780196017338,7.36908228123383,488.268102038773,252.670421142526,486.698386703915
+99,2.82416953708447,183.480683022873,368.758162863328,2.80315566798772,340.166678214112
+100,0.3850029324009,482.322626243789,100.05861510573,495.618985557607,17.9764839910466
diff --git a/tests/exec_test/log_reg_max_corr_gen_proj/sisso.json b/tests/exec_test/log_reg_max_corr_gen_proj/sisso.json
new file mode 100644
index 0000000000000000000000000000000000000000..a2e566facab1015b72032ec5b87b09404b4eb2ac
--- /dev/null
+++ b/tests/exec_test/log_reg_max_corr_gen_proj/sisso.json
@@ -0,0 +1,18 @@
+{
+    "desc_dim": 2,
+    "n_sis_select": 1,
+    "max_rung": 1,
+    "n_residual": 1,
+    "data_file": "data.csv",
+    "data_file_relatice_to_json": true,
+    "property_key": "Prop",
+    "leave_out_frac": 0.05,
+    "n_models_store": 1,
+    "calc_type": "log_regression",
+    "n_rung_generate": 1,
+    "max_feat_cross_correlation": 0.99,
+    "leave_out_inds": [0, 1, 2, 60, 61],
+    "opset": ["add", "sub", "abs_diff", "mult", "div", "inv", "abs", "exp", "log", "sin", "cos", "sq", "cb", "six_pow", "sqrt", "cbrt", "neg_exp"],
+    "param_opset": [],
+    "fix_intercept": false
+}
diff --git a/tests/exec_test/reparam/sisso.json b/tests/exec_test/reparam/sisso.json
index d90917e122558ad90f6b7dcb206d3a76015d0627..64267ee61bae531b89fa71491fe34d8d68141ce6 100644
--- a/tests/exec_test/reparam/sisso.json
+++ b/tests/exec_test/reparam/sisso.json
@@ -1,7 +1,7 @@
 {
     "desc_dim": 2,
     "n_sis_select": 5,
-    "max_rung": 1,
+    "max_rung": 2,
     "n_residual": 5,
     "data_file": "data.csv",
     "data_file_relatice_to_json": true,
diff --git a/tests/exec_test/reparam_gen_proj/sisso.json b/tests/exec_test/reparam_gen_proj/sisso.json
index c3c554cf65f999cfd552d8a0bac735f1664be560..90182dad19e3288700eddcb2c4fa45cb076a0e61 100644
--- a/tests/exec_test/reparam_gen_proj/sisso.json
+++ b/tests/exec_test/reparam_gen_proj/sisso.json
@@ -1,6 +1,6 @@
 {
     "desc_dim": 2,
-    "n_sis_select": 5,
+    "n_sis_select": 21,
     "max_rung": 1,
     "n_residual": 5,
     "data_file": "data.csv",
@@ -11,7 +11,25 @@
     "n_rung_generate": 1,
     "leave_out_inds": [],
     "opset": ["sq"],
-    "param_opset": ["cb", "abs_diff", "div"],
+    "param_opset": [
+        "add",
+        "sub",
+        "abs_diff",
+        "mult",
+        "div",
+        "inv",
+        "abs",
+        "exp",
+        "log",
+        "sin",
+        "cos",
+        "sq",
+        "cb",
+        "six_pow",
+        "sqrt",
+        "cbrt",
+        "neg_exp"
+    ],
     "fix_intercept": true,
     "reparam_residual": true,
     "global_param_opt": false
diff --git a/tests/googletest/classification/test_lp_wrapper.cc b/tests/googletest/classification/test_lp_wrapper.cc
index 60d18e159bffe9249c91de829f41c934aefce710..d52ae7a4a9ac389a9bdd8b87b819606b0cd5f2a9 100644
--- a/tests/googletest/classification/test_lp_wrapper.cc
+++ b/tests/googletest/classification/test_lp_wrapper.cc
@@ -136,6 +136,34 @@
 
     TEST_F(LPWrapperTests, WithTestData)
     {
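+        // A total sample count inconsistent with _samp_per_class (train or test) must be rejected.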
+        EXPECT_THROW(
+            LPWrapper lp(
+                _samp_per_class,
+                _task_num,
+                _n_class,
+                _n_dim,
+                _n_samp + 1,
+                _tol,
+                _samp_per_class_test,
+                _n_samp_test
+            ),
+            std::logic_error
+        );
+
+        EXPECT_THROW(
+            LPWrapper lp(
+                _samp_per_class,
+                _task_num,
+                _n_class,
+                _n_dim,
+                _n_samp,
+                _tol,
+                _samp_per_class_test,
+                _n_samp_test + 1
+            ),
+            std::logic_error
+        );
         LPWrapper lp(
             _samp_per_class,
             _task_num,
@@ -160,10 +187,30 @@ namespace
         EXPECT_EQ(lp.n_samp_test(), _n_samp_test);
         EXPECT_EQ(lp.n_overlap(), 0);
         EXPECT_EQ(lp.n_overlap_test(), 0);
+
+        EXPECT_THROW(lp.copy_data(0, {0, 1, 2}), std::logic_error);
+
+        val_ptrs = {_phi[0].value_ptr(), _phi[1].value_ptr(), _phi[1].value_ptr()};
+        EXPECT_THROW(lp.set_n_overlap(val_ptrs, test_val_ptrs, _error.data(), _test_error.data()), std::logic_error);
+
+        val_ptrs = {_phi[0].value_ptr(), _phi[1].value_ptr()};
+        test_val_ptrs = {_phi[0].test_value_ptr(), _phi[1].test_value_ptr(), _phi[1].test_value_ptr()};
+        EXPECT_THROW(lp.set_n_overlap(val_ptrs, test_val_ptrs, _error.data(), _test_error.data()), std::logic_error);
     }
 
     TEST_F(LPWrapperTests, WithoutTestData)
     {
+        EXPECT_THROW(
+            LPWrapper lp(
+                _samp_per_class,
+                _task_num,
+                _n_class,
+                _n_dim,
+                _n_samp + 1,
+                _tol
+            ),
+            std::logic_error
+        );
 
         LPWrapper lp(
             _samp_per_class,
@@ -187,11 +234,19 @@ namespace
         EXPECT_EQ(lp.n_samp_test(), 0);
         EXPECT_EQ(lp.n_overlap(), 0);
         EXPECT_EQ(lp.n_overlap_test(), 0);
+
+        EXPECT_THROW(lp.copy_data(0, {0, 1, 2}), std::logic_error);
+
+        val_ptrs = {_phi[0].value_ptr(), _phi[1].value_ptr(), _phi[1].value_ptr()};
+        EXPECT_THROW(lp.set_n_overlap(val_ptrs, test_val_ptrs, _error.data(), _test_error.data()), std::logic_error);
+
+        val_ptrs = {_phi[0].value_ptr(), _phi[1].value_ptr()};
+        test_val_ptrs = {_phi[0].test_value_ptr(), _phi[1].test_value_ptr(), _phi[1].test_value_ptr()};
+        EXPECT_THROW(lp.set_n_overlap(val_ptrs, test_val_ptrs, _error.data(), _test_error.data()), std::logic_error);
     }
 
     TEST_F(LPWrapperTests, CopyTest)
     {
-
         LPWrapper lp(
             _samp_per_class,
             _task_num,
diff --git a/tests/googletest/classification/test_prop_sorted_d_mat.cc b/tests/googletest/classification/test_prop_sorted_d_mat.cc
index e61cf3f09c0a9ad0e353ce58f50a5f1a4b76e124..3123df66254b3c3a3a92155541ebff6e74235fec 100644
--- a/tests/googletest/classification/test_prop_sorted_d_mat.cc
+++ b/tests/googletest/classification/test_prop_sorted_d_mat.cc
@@ -62,5 +62,22 @@ namespace
         prop_sorted_d_mat::access_sample_sorted_d_matrix(2)[0] = 1.5;
         EXPECT_EQ(prop_sorted_d_mat::access_sample_sorted_d_matrix(2, 0, 0)[0], 1.5);
         prop_sorted_d_mat::finalize_sorted_d_matrix_arr();
+
+        EXPECT_THROW(
+            prop_sorted_d_mat::initialize_sorted_d_matrix_arr(1, _n_task, _n_class + 1, _n_samples_per_class),
+            std::logic_error
+        );
+
+        EXPECT_THROW(
+            prop_sorted_d_mat::initialize_sorted_d_matrix_arr(1, _n_task + 1, _n_class, _n_samples_per_class),
+            std::logic_error
+        );
+
+        _n_samples_per_class.push_back(1);
+        EXPECT_THROW(
+            prop_sorted_d_mat::initialize_sorted_d_matrix_arr(1, _n_task, _n_class, _n_samples_per_class),
+            std::logic_error
+        );
+
     }
 }
diff --git a/tests/googletest/classification/test_svm_wrapper.cc b/tests/googletest/classification/test_svm_wrapper.cc
index b55ee6bcc6d4f0808d555c24bda349d82c8067e2..ca7c9c67f9e0a7da8ce718420494bd62249b1f00 100644
--- a/tests/googletest/classification/test_svm_wrapper.cc
+++ b/tests/googletest/classification/test_svm_wrapper.cc
@@ -168,6 +168,16 @@
         EXPECT_EQ(svm.n_dim(), _n_dim);
         EXPECT_EQ(svm.n_samp(), _n_samp);
         EXPECT_EQ(svm.n_misclassified(), 0);
+
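+        // copy_data and predict must reject argument sizes that disagree with the stored problem dimensions.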
+        EXPECT_THROW(svm.copy_data({0,1,2}, 0), std::logic_error);
+        EXPECT_THROW(svm.copy_data({0,1}, 3), std::logic_error);
+
+        val_ptrs = {_phi[0].value_ptr(), _phi[1].value_ptr(), _phi[1].value_ptr()};
+        EXPECT_THROW(svm.copy_data(val_ptrs), std::logic_error);
+
+        test_val_ptrs = {_phi[0].test_value_ptr(), _phi[1].test_value_ptr(), _phi[1].test_value_ptr()};
+        EXPECT_THROW(svm.predict(20, test_val_ptrs), std::logic_error);
     }
 
     TEST_F(SVMWrapperTests, CTest)
@@ -210,5 +219,14 @@ namespace
         EXPECT_EQ(svm.n_dim(), _n_dim);
         EXPECT_EQ(svm.n_samp(), _n_samp);
         EXPECT_EQ(svm.n_misclassified(), 0);
+
+        EXPECT_THROW(svm.copy_data({0,1,2}, 0), std::logic_error);
+        EXPECT_THROW(svm.copy_data({0,1}, 3), std::logic_error);
+
+        val_ptrs = {_phi[0].value_ptr(), _phi[1].value_ptr(), _phi[1].value_ptr()};
+        EXPECT_THROW(svm.copy_data(val_ptrs), std::logic_error);
+
+        test_val_ptrs = {_phi[0].test_value_ptr(), _phi[1].test_value_ptr(), _phi[1].test_value_ptr()};
+        EXPECT_THROW(svm.predict(20, test_val_ptrs), std::logic_error);
     }
 }
diff --git a/tests/googletest/descriptor_identification/model/test_model_classifier.cc b/tests/googletest/descriptor_identification/model/test_model_classifier.cc
index ac1583394625f7332832b021e44ac54dcb94b242..3148a09cbe6ede02e85e61a0f75f652341731956 100644
--- a/tests/googletest/descriptor_identification/model/test_model_classifier.cc
+++ b/tests/googletest/descriptor_identification/model/test_model_classifier.cc
@@ -143,4 +143,36 @@
         boost::filesystem::remove("train_class_mods.dat");
         boost::filesystem::remove("test_class_mods.dat");
     }
+
+    TEST_F(ModelClassifierTests, EvalTest)
+    {
+        ModelClassifier model(
+            "Property",
+            Unit("m"),
+            _loss,
+            _features,
+            _leave_out_inds,
+            _sample_ids_train,
+            _sample_ids_test,
+            task_names
+        );
+
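+        // Every eval overload below is expected to reject its input with std::logic_error.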
+        model.set_task_eval(0);
+        std::vector<double> pt = {2.0, 2.0};
+        EXPECT_THROW(model.eval(pt), std::logic_error);
+
+        std::map<std::string, double> pt_dct;
+        pt_dct["A"] = 1.0;
+        pt_dct["B"] = 1.0;
+        EXPECT_THROW(model.eval(pt_dct), std::logic_error);
+
+        std::vector<std::vector<double>> pts = {{1.0}, {1.0}};
+        EXPECT_THROW(model.eval(pts), std::logic_error);
+
+        std::map<std::string, std::vector<double>> pts_dct;
+        pts_dct["A"] = {1.0};
+        pts_dct["B"] = {1.0};
+        EXPECT_THROW(model.eval(pts_dct), std::logic_error);
+    }
 }
diff --git a/tests/googletest/descriptor_identification/model/test_model_log_regressor.cc b/tests/googletest/descriptor_identification/model/test_model_log_regressor.cc
index 2a46674652cad5cd05806a574e9edd8dad251918..054e89a6a51a1d2bc309e5c2f81b42d3d4621b8a 100644
--- a/tests/googletest/descriptor_identification/model/test_model_log_regressor.cc
+++ b/tests/googletest/descriptor_identification/model/test_model_log_regressor.cc
@@ -127,6 +127,8 @@ namespace
         EXPECT_LT(std::abs(model.coefs()[0][1] + 2.1), 1e-10);
         EXPECT_LT(std::abs(model.coefs()[0][2] - std::log(0.001)), 1e-10);
 
+        EXPECT_STREQ(model.toLatexString().c_str(), "$\\exp\\left(c_0\\right)\\left(A\\right)^{a_0}\\left(B\\right)^{a_1}$");
+
         model.to_file("train_false_log_reg.dat", true);
         model.to_file("test_false_log_reg.dat", false);
     }
@@ -248,6 +250,8 @@ namespace
 
         model.to_file("train_true_log_reg.dat", true);
         model.to_file("test_true_log_reg.dat", false);
+
+        EXPECT_STREQ(model.toLatexString().c_str(), "$\\left(A\\right)^{a_0}\\left(B\\right)^{a_1}$");
     }
 
     TEST_F(ModelLogRegssorTests, FixInterceptTrueFileTest)
@@ -340,5 +344,24 @@
         pts_dct["B"] = {1.0};
         val = model.eval(pts_dct)[0];
         EXPECT_LT(val - 0.00025, 1e-10);
+
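+        // Points with too many components, ragged sample lists, or missing feature keys must throw.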
+        pt.push_back(1.0);
+        EXPECT_THROW(model.eval(pt), std::logic_error);
+
+        pts.push_back({1.0});
+        EXPECT_THROW(model.eval(pts), std::logic_error);
+
+        pts.pop_back();
+        pts.back().push_back(1.0);
+        EXPECT_THROW(model.eval(pts), std::logic_error);
+
+        pts_dct["A"] = {1.0, 1.0};
+        EXPECT_THROW(model.eval(pts_dct), std::logic_error);
+
+        pt_dct.erase("A");
+        pts_dct.erase("A");
+        EXPECT_THROW(model.eval(pt_dct), std::logic_error);
+        EXPECT_THROW(model.eval(pts_dct), std::logic_error);
     }
 }
diff --git a/tests/googletest/descriptor_identification/model/test_model_regressor.cc b/tests/googletest/descriptor_identification/model/test_model_regressor.cc
index c270f02d6ad7ee5630843064bb0fc872d9c5a557..62ecc14ad655e4252e51c21b20158deb76d4ea1a 100644
--- a/tests/googletest/descriptor_identification/model/test_model_regressor.cc
+++ b/tests/googletest/descriptor_identification/model/test_model_regressor.cc
@@ -133,6 +133,7 @@ namespace
         EXPECT_LT(std::abs(model.coefs()[1][1] + 0.4), 1e-10);
         EXPECT_LT(std::abs(model.coefs()[1][2] + 6.5), 1e-10);
 
+        EXPECT_STREQ(model.toLatexString().c_str(), "$c_0 + a_0A + a_1B$");
         model.to_file("train_false.dat", true);
         model.to_file("test_false.dat", false);
     }
@@ -259,6 +260,8 @@ namespace
         EXPECT_LT(std::abs(model.coefs()[1][0] - 1.25), 1e-10);
         EXPECT_LT(std::abs(model.coefs()[1][1] + 0.4), 1e-10);
 
+        EXPECT_STREQ(model.toLatexString().c_str(), "$a_0A + a_1B$");
+
         model.to_file("train_true.dat", true);
         model.to_file("test_true.dat", false);
     }
@@ -359,5 +362,24 @@
         EXPECT_LT(model.eval(pt_dct) + 5.65, 1e-10);
         EXPECT_LT(model.eval(pts)[0] + 5.65, 1e-10);
         EXPECT_LT(model.eval(pts_dct)[0] + 5.65, 1e-10);
+
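+        // As for the log regressor, malformed evaluation inputs must raise std::logic_error.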
+        pt.push_back(1.0);
+        EXPECT_THROW(model.eval(pt), std::logic_error);
+
+        pts.push_back({1.0});
+        EXPECT_THROW(model.eval(pts), std::logic_error);
+
+        pts.pop_back();
+        pts.back().push_back(1.0);
+        EXPECT_THROW(model.eval(pts), std::logic_error);
+
+        pts_dct["A"] = {1.0, 1.0};
+        EXPECT_THROW(model.eval(pts_dct), std::logic_error);
+
+        pt_dct.erase("A");
+        pts_dct.erase("A");
+        EXPECT_THROW(model.eval(pt_dct), std::logic_error);
+        EXPECT_THROW(model.eval(pts_dct), std::logic_error);
     }
 }
diff --git a/tests/googletest/descriptor_identification/solver/test_sisso_classifier.cc b/tests/googletest/descriptor_identification/solver/test_sisso_classifier.cc
index 1034836027c1629b6bd063584add38f3d4abba64..0b0f500183e8e5f23ec8cecd4587d39342d7e8bb 100644
--- a/tests/googletest/descriptor_identification/solver/test_sisso_classifier.cc
+++ b/tests/googletest/descriptor_identification/solver/test_sisso_classifier.cc
@@ -201,4 +201,39 @@
         boost::filesystem::remove_all("feature_space/");
         boost::filesystem::remove_all("models/");
     }
+
+    TEST_F(SISSOClassifierTests, FixInterceptTrueTest)
+    {
+        std::shared_ptr<FeatureSpace> feat_space = std::make_shared<FeatureSpace>(inputs);
+        inputs.set_fix_intercept(true);
+        SISSOClassifier sisso(inputs, feat_space);
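+        // The requested fix_intercept setting is expected to be overridden for classification.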
+        EXPECT_FALSE(sisso.fix_intercept());
+
+        std::vector<double> prop_comp(80, 0.0);
+        std::transform(inputs.prop_train().begin(), inputs.prop_train().end(), sisso.prop_train().begin(), prop_comp.begin(), [](double p1, double p2){return std::abs(p1 - p2);});
+        EXPECT_FALSE(std::any_of(prop_comp.begin(), prop_comp.end(), [](double p){return p > 1e-10;}));
+
+        std::transform(inputs.prop_test().begin(), inputs.prop_test().begin() + 10, sisso.prop_test().begin(), prop_comp.begin(), [](double p1, double p2){return std::abs(p1 - p2);});
+        EXPECT_FALSE(std::any_of(prop_comp.begin(), prop_comp.begin() + 10, [](double p){return p > 1e-10;}));
+
+        EXPECT_EQ(sisso.n_samp(), 80);
+        EXPECT_EQ(sisso.n_dim(), 2);
+        EXPECT_EQ(sisso.n_residual(), 2);
+        EXPECT_EQ(sisso.n_models_store(), 3);
+
+        sisso.fit();
+
+        EXPECT_EQ(sisso.models().size(), 2);
+        EXPECT_EQ(sisso.models()[0].size(), 3);
+
+        EXPECT_EQ(sisso.models().back()[0].n_convex_overlap_train(), 0);
+        EXPECT_EQ(sisso.models().back()[0].n_convex_overlap_test(), 0);
+
+        EXPECT_EQ(sisso.models().back()[0].n_svm_misclassified_train(), 0);
+        EXPECT_EQ(sisso.models().back()[0].n_svm_misclassified_test(), 0);
+
+        boost::filesystem::remove_all("feature_space/");
+        boost::filesystem::remove_all("models/");
+    }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_abs_diff_node.cc b/tests/googletest/feature_creation/feature_generation/test_abs_diff_node.cc
index e196bd19aa6090665577cee3ce3ac6ec2a140177..5d7c03c4ef23e7902370c6814c3d3f568842be3a 100644
--- a/tests/googletest/feature_creation/feature_generation/test_abs_diff_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_abs_diff_node.cc
@@ -25,8 +25,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
-
+            #endif
             std::vector<double> value_1 = {1.0, 2.0, 3.0, 4.0};
             std::vector<double> test_value_1 =  {5.0};
 
@@ -180,8 +184,8 @@ namespace
 
     TEST_F(AbsDiffNodeTest, AttributesTest)
     {
-        _abs_diff_test = std::make_shared<AbsDiffNode>(_phi[0], _phi[1], 5, 1e-50, 1e50);
-        _abs_diff_test = std::make_shared<AbsDiffNode>(_abs_diff_test, _phi[1], 6, 1e-50, 1e50);
+        node_ptr feat_r1 = std::make_shared<AbsDiffNode>(_phi[0], _phi[1], 5, 1e-50, 1e50);
+        _abs_diff_test = std::make_shared<AbsDiffNode>(feat_r1, _phi[1], 6, 1e-50, 1e50);
 
         EXPECT_EQ(_abs_diff_test->rung(), 2);
 
@@ -195,5 +199,35 @@ namespace
 
         EXPECT_STREQ(_abs_diff_test->expr().c_str(), "(|(|A - B|) - B|)");
         EXPECT_STREQ(_abs_diff_test->postfix_expr().c_str(), "0|1|abd|1|abd");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_abs_diff_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_abs_diff_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_abs_diff_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_abs_diff_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_abs_diff_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, -2.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(4, 0.0);
+        EXPECT_THROW(_abs_diff_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _abs_diff_test->set_parameters({}, true);
+        _abs_diff_test->set_parameters(nullptr);
+
+        _abs_diff_test->param_derivative(params.data(), dfdp.data());
+
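+        // parameter derivative of |u - (alpha * v + a)| is -sign(u - (alpha * v + a))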
+        EXPECT_EQ(dfdp[0], -1.0);
+        EXPECT_EQ(dfdp[1], 0.0);
+
+        EXPECT_EQ(dfdp[2], 1.0);
+        EXPECT_EQ(dfdp[3], 1.0);
+
+        EXPECT_EQ(util_funcs::norm(grad), 0.0);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_abs_node.cc b/tests/googletest/feature_creation/feature_generation/test_abs_node.cc
index d1d5440fa73f0b92f49488f95ae3c7e290082311..438aea04b33c05ebd4b48d5df56dd68f1dfde865 100644
--- a/tests/googletest/feature_creation/feature_generation/test_abs_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_abs_node.cc
@@ -24,7 +24,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {-1.0, -2.0, -3.0, -4.0};
             std::vector<double> test_value_1 =  {50.0};
@@ -182,5 +187,32 @@ namespace
 
         EXPECT_STREQ(_abs_test->expr().c_str(), "(|A|)");
         EXPECT_STREQ(_abs_test->postfix_expr().c_str(), "0|abs");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_abs_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_abs_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_abs_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_abs_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_abs_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, 2.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(2, 0.0);
+        EXPECT_THROW(_abs_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _abs_test->set_parameters({}, true);
+        _abs_test->set_parameters(nullptr);
+        _abs_test->param_derivative(params.data(), dfdp.data());
+
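+        // d|alpha * A + a| / d(alpha * A + a) = sign(alpha * A + a); with alpha = 1, a = 2, and A = {-1, -2, -3, -4} this is {1, 0, -1, -1}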
+        EXPECT_EQ(dfdp[0], 1.0);
+        EXPECT_EQ(dfdp[1], 0.0);
+
+        EXPECT_EQ(dfdp[2], -1.0);
+        EXPECT_EQ(dfdp[3], -1.0);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_add_node.cc b/tests/googletest/feature_creation/feature_generation/test_add_node.cc
index b50c377427009a83ed5486f134fe2c3ebb063be2..f91124f873fcf84b8965eba1570dc70ee8cd4e03 100644
--- a/tests/googletest/feature_creation/feature_generation/test_add_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_add_node.cc
@@ -24,7 +24,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {1.0, 2.0, 3.0, 4.0};
             std::vector<double> test_value_1 =  {5.0};
@@ -194,5 +199,32 @@ namespace
 
         EXPECT_STREQ(_add_test->expr().c_str(), "((A + B) + B)");
         EXPECT_STREQ(_add_test->postfix_expr().c_str(), "0|1|add|1|add");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_add_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_add_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_add_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_add_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_add_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(4, 0.0);
+        EXPECT_THROW(_add_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _add_test->set_parameters({}, true);
+        _add_test->set_parameters(nullptr);
+        _add_test->param_derivative(params.data(), dfdp.data());
+
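+        // d(u + (alpha * v + a)) / d(alpha * v + a) = 1 for every sample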
+        EXPECT_EQ(dfdp[0], 1.0);
+        EXPECT_EQ(dfdp[1], 1.0);
+
+        EXPECT_EQ(dfdp[2], 1.0);
+        EXPECT_EQ(dfdp[3], 1.0);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_cb_node.cc b/tests/googletest/feature_creation/feature_generation/test_cb_node.cc
index 0a2ebebef8fc41450022543679351bbbb1c3395e..8110f79865ee03846ec247095733d9ca27084118 100644
--- a/tests/googletest/feature_creation/feature_generation/test_cb_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_cb_node.cc
@@ -28,7 +28,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {1.0, 2.0, 3.0, 8.0};
             std::vector<double> test_value_1 =  {2.0};
@@ -180,5 +185,32 @@ namespace
 
         EXPECT_STREQ(_cb_test->expr().c_str(), "(A^3)");
         EXPECT_STREQ(_cb_test->postfix_expr().c_str(), "0|cb");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_cb_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_cb_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_cb_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_cb_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_cb_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(2, 0.0);
+        EXPECT_THROW(_cb_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _cb_test->set_parameters({}, true);
+        _cb_test->set_parameters(nullptr);
+        _cb_test->param_derivative(params.data(), dfdp.data());
+
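+        // d(A^3) / dA = 3 * A^2; with A = {1, 2, 3, 8} this is {3, 12, 27, 192}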
+        EXPECT_EQ(dfdp[0], 3.0);
+        EXPECT_EQ(dfdp[1], 12.0);
+
+        EXPECT_EQ(dfdp[2], 27.0);
+        EXPECT_EQ(dfdp[3], 192.0);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_cbrt_node.cc b/tests/googletest/feature_creation/feature_generation/test_cbrt_node.cc
index e556cd04146933d1e17f19ee8ae1136a4ef6375c..66b6c5a513ad58b70da22aeec66a72dbc085efbf 100644
--- a/tests/googletest/feature_creation/feature_generation/test_cbrt_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_cbrt_node.cc
@@ -29,7 +29,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {1.0, 2.0, 3.0, 8.0};
             std::vector<double> test_value_1 =  {8.0};
@@ -205,5 +210,29 @@ namespace
 
         EXPECT_STREQ(_cbrt_test->expr().c_str(), "cbrt(A)");
         EXPECT_STREQ(_cbrt_test->postfix_expr().c_str(), "0|cbrt");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_cbrt_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_cbrt_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_cbrt_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_cbrt_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_cbrt_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(2, 0.0);
+        EXPECT_THROW(_cbrt_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _cbrt_test->set_parameters({}, true);
+        _cbrt_test->set_parameters(nullptr);
+        _cbrt_test->param_derivative(params.data(), dfdp.data());
+
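+        // d(cbrt(A)) / dA = (1 / 3) * A^(-2 / 3): 1/3 at A = 1 and 1/12 at A = 8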
+        EXPECT_LT(std::abs(dfdp[0] - 1/3.0), 1e-10);
+        EXPECT_LT(std::abs(dfdp[3] - 1/12.0), 1e-10);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_cos_node.cc b/tests/googletest/feature_creation/feature_generation/test_cos_node.cc
index 73ea5c78e59dc3aaf91cc4536cd7b9bc290a7671..427294009253c585f9681e114200ce700146097b 100644
--- a/tests/googletest/feature_creation/feature_generation/test_cos_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_cos_node.cc
@@ -25,7 +25,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {0.0, 2.0, 3.0, 4.0};
             std::vector<double> test_value_1 =  {0.0};
@@ -181,5 +186,32 @@ namespace
 
         EXPECT_STREQ(_cos_test->expr().c_str(), "cos(A)");
         EXPECT_STREQ(_cos_test->postfix_expr().c_str(), "0|cos");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_cos_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_cos_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_cos_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_cos_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_cos_test->parameters().size(), 0);
+
+        std::vector<double> params = {M_PI / 2.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(2, 0.0);
+        EXPECT_THROW(_cos_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _cos_test->set_parameters({}, true);
+        _cos_test->set_parameters(nullptr);
+        _cos_test->param_derivative(params.data(), dfdp.data());
+
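+        // d(cos(alpha * A)) / d(alpha * A) = -sin(alpha * A); with alpha = pi / 2 and A = {0, 2, 3, 4} this is {0, 0, 1, 0}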
+        EXPECT_LT(std::abs(dfdp[0] - 0.0), 1e-10);
+        EXPECT_LT(std::abs(dfdp[1] - 0.0), 1e-10);
+
+        EXPECT_LT(std::abs(dfdp[2] - 1.0), 1e-10);
+        EXPECT_LT(std::abs(dfdp[3] - 0.0), 1e-10);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_div_node.cc b/tests/googletest/feature_creation/feature_generation/test_div_node.cc
index 162cb34aa1e35fadb25aa02c59e215b1c00e7f0b..423915850d78be48a6cf6bbc8f8e98c191a2e744 100644
--- a/tests/googletest/feature_creation/feature_generation/test_div_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_div_node.cc
@@ -25,7 +25,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {1.0, 2.0, 3.0, 4.0};
             std::vector<double> test_value_1 =  {5.0};
@@ -244,5 +249,34 @@ namespace
 
         EXPECT_STREQ(_div_test->expr().c_str(), "((A / B) / B)");
         EXPECT_STREQ(_div_test->postfix_expr().c_str(), "0|1|div|1|div");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_div_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_div_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_div_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_div_test->n_params_possible(), 2);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(4, 0.0);
+        EXPECT_THROW(_div_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        EXPECT_EQ(_div_test->parameters().size(), 0);
+
+        _div_test->set_parameters({}, true);
+        _div_test->set_parameters(nullptr);
+        _div_test->param_derivative(params.data(), dfdp.data());
+        // reference input data for the checks below: A = {1.0, 2.0, 3.0, 4.0}
+        // and B = {10.0, 25.0, 30.0, 40.0}
+
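+        // d(u / x) / dx = -u / x^2, evaluated at u = A / B and x = B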
+        EXPECT_LT(std::abs(dfdp[0] + (1.0 / 10.0) / (10.0 * 10.0)), 1e-10);
+        EXPECT_LT(std::abs(dfdp[1] + (2.0 / 25.0) / (25.0 * 25.0)), 1e-10);
+
+        EXPECT_LT(std::abs(dfdp[2] + (3.0 / 30.0) / (30.0 * 30.0)), 1e-10);
+        EXPECT_LT(std::abs(dfdp[3] + (4.0 / 40.0) / (40.0 * 40.0)), 1e-10);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_exp_node.cc b/tests/googletest/feature_creation/feature_generation/test_exp_node.cc
index cb3dff658ded2d68fa3f40511ff3ed362da9988f..6923e5af15a977dc1d4f7332a9ec57fe66c3b4a7 100644
--- a/tests/googletest/feature_creation/feature_generation/test_exp_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_exp_node.cc
@@ -29,7 +29,12 @@ namespace
     protected:
         void SetUp() override
         {
-            node_value_arrs::initialize_values_arr({4}, {1}, 3, 2, false);
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {0.0, 2.0, 3.0, 4.0};
             std::vector<double> test_value_1 =  {0.0};
@@ -210,5 +215,32 @@ namespace
 
         EXPECT_STREQ(_exp_test->expr().c_str(), "exp(A)");
         EXPECT_STREQ(_exp_test->postfix_expr().c_str(), "0|exp");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_exp_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_exp_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_exp_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_exp_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_exp_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(2, 0.0);
+        EXPECT_THROW(_exp_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _exp_test->set_parameters({}, true);
+        _exp_test->set_parameters(nullptr);
+        _exp_test->param_derivative(params.data(), dfdp.data());
+
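+        // d(exp(A)) / dA = exp(A), so the derivative equals the feature value itself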
+        EXPECT_LT(std::abs(dfdp[0] - _exp_test->value()[0]), 1e-10);
+        EXPECT_LT(std::abs(dfdp[1] - _exp_test->value()[1]), 1e-10);
+
+        EXPECT_LT(std::abs(dfdp[2] - _exp_test->value()[2]), 1e-10);
+        EXPECT_LT(std::abs(dfdp[3] - _exp_test->value()[3]), 1e-10);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_feat_node.cc b/tests/googletest/feature_creation/feature_generation/test_feat_node.cc
index ffa48c9a7f2345892ab39a48044223ed19a05cea..46106f91c9d4dcd667589254995b920817f80c68 100644
--- a/tests/googletest/feature_creation/feature_generation/test_feat_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_feat_node.cc
@@ -22,7 +22,7 @@ namespace
     protected:
         void SetUp() override
         {
-            node_value_arrs::initialize_values_arr({4}, {1}, 3, 0, false);
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
 
             _value_1 = {1.0, 2.0, 3.0, 4.0};
             _test_value_1 =  {5.0};
@@ -32,6 +32,9 @@ namespace
 
             _value_3 = {1.0, 2.0, 3.0, 1.0};
             _test_value_3 =  {5.0};
+
+            _value_4 = {1.0, 2.0, 3.0};
+            _test_value_4 =  {};
         }
 
         void TearDown() override
@@ -47,6 +50,9 @@ namespace
 
         std::vector<double> _value_3;
         std::vector<double> _test_value_3;
+
+        std::vector<double> _value_4;
+        std::vector<double> _test_value_4;
     };
 
     TEST_F(FeatureNodeTest, ConstructorTest)
@@ -74,6 +80,42 @@ namespace
         );
         node_ptr feat_4 = feat_1->hard_copy();
 
+        try
+        {
+            node_ptr feat_5 = std::make_shared<FeatureNode>(
+                3,
+                "D",
+                _value_4,
+                _test_value_3,
+                Unit("m")
+            );
+            EXPECT_TRUE(false) << " (Creation of FeatureNode with wrong number of samples in the training data)";
+        }
+        catch(const std::logic_error& e)
+        {}
+
+        try
+        {
+            node_ptr feat_5 = std::make_shared<FeatureNode>(
+                3,
+                "D",
+                _value_3,
+                _test_value_4,
+                Unit("m")
+            );
+            EXPECT_TRUE(false) << " (Creation of FeatureNode with wrong number of samples in the test data)";
+        }
+        catch(const std::logic_error& e)
+        {}
+
+        node_ptr feat_5 = std::make_shared<FeatureNode>(
+            3,
+            "D",
+            _value_3,
+            _test_value_3,
+            Unit("m")
+        );
+
         EXPECT_FALSE(feat_1->is_const());
         EXPECT_FALSE(feat_1->is_nan());
         EXPECT_STREQ(feat_1->unit().toString().c_str(), "m");
@@ -126,4 +168,37 @@ namespace
         EXPECT_EQ(feat_4->n_feats(), 0);
         EXPECT_EQ(feat_4->sort_score(10), 0);
     }
+
+    TEST_F(FeatureNodeTest, DefaultFunctionTests)
+    {
+        std::shared_ptr<FeatureNode> feat_1 = std::make_shared<FeatureNode>(
+            0,
+            "A",
+            _value_1,
+            _test_value_1,
+            Unit("m")
+        );
+        node_ptr feat_2 = feat_1->hard_copy();
+
+        std::vector<node_ptr> phi = {feat_1, feat_2};
+        feat_1->reset_feats(phi);
+
+        #ifdef PARAMETERIZE
+        feat_1->set_parameters({}, true);
+        feat_1->set_parameters(nullptr);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        feat_1->param_derivative(params.data(), dfdp.data());
+
+        EXPECT_EQ(dfdp[0], 0.0);
+        EXPECT_EQ(dfdp[1], 0.0);
+
+        EXPECT_THROW(feat_1->gradient(params.data(), dfdp.data()), std::logic_error);
+
+        feat_1->set_value(nullptr);
+        feat_1->set_test_value(nullptr);
+        #endif
+
+    }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_inv_node.cc b/tests/googletest/feature_creation/feature_generation/test_inv_node.cc
index 7eaa5cec0f5c653fdfe874b2b644c7ce29dc9bb3..3a0022e01a358110bd1b62e8c98b3fd392e91809 100644
--- a/tests/googletest/feature_creation/feature_generation/test_inv_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_inv_node.cc
@@ -29,7 +29,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {1.0, 2.0, 3.0, 8.0};
             std::vector<double> test_value_1 =  {2.0};
@@ -204,5 +209,32 @@ namespace
 
         EXPECT_STREQ(_inv_test->expr().c_str(), "(1.0 / A)");
         EXPECT_STREQ(_inv_test->postfix_expr().c_str(), "0|inv");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_inv_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_inv_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_inv_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_inv_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_inv_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(2, 0.0);
+        EXPECT_THROW(_inv_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _inv_test->set_parameters({}, true);
+        _inv_test->set_parameters(nullptr);
+        _inv_test->param_derivative(params.data(), dfdp.data());
+
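+        // d(1 / A) / dA = -1 / A^2; with A = {1, 2, 3, 8} this is {-1, -1/4, -1/9, -1/64}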
+        EXPECT_LT(std::abs(dfdp[0] + 1.0), 1e-10);
+        EXPECT_LT(std::abs(dfdp[1] + 1.0 / 4.0), 1e-10);
+
+        EXPECT_LT(std::abs(dfdp[2] + 1.0 / 9.0), 1e-10);
+        EXPECT_LT(std::abs(dfdp[3] + 1.0 / 64.0), 1e-10);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_log_node.cc b/tests/googletest/feature_creation/feature_generation/test_log_node.cc
index 87c67d1fe01c8ada56d699d0a43405111c1763e2..8dafacd352be8123904c2d24939d5bd52058f364 100644
--- a/tests/googletest/feature_creation/feature_generation/test_log_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_log_node.cc
@@ -35,7 +35,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {1.0, 2.0, 3.0, 4.0};
             std::vector<double> test_value_1 =  {1.0};
@@ -304,5 +309,32 @@ namespace
 
         EXPECT_STREQ(_log_test->expr().c_str(), "ln(A)");
         EXPECT_STREQ(_log_test->postfix_expr().c_str(), "0|log");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_log_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_log_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_log_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_log_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_log_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(2, 0.0);
+        EXPECT_THROW(_log_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _log_test->set_parameters({}, true);
+        _log_test->set_parameters(nullptr);
+        _log_test->param_derivative(params.data(), dfdp.data());
+
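+        // d(ln(A)) / dA = 1 / A; with A = {1, 2, 3, 4} this is {1, 1/2, 1/3, 1/4}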
+        EXPECT_LT(std::abs(dfdp[0] - 1.0), 1e-10);
+        EXPECT_LT(std::abs(dfdp[1] - 0.5), 1e-10);
+
+        EXPECT_LT(std::abs(dfdp[2] - (1.0 / 3.0)), 1e-10);
+        EXPECT_LT(std::abs(dfdp[3] - 0.25), 1e-10);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_model_node.cc b/tests/googletest/feature_creation/feature_generation/test_model_node.cc
index 8b45808d1c48f76fa0be008f4333a570da4ab733..01f7ec4ce841c656e48a4832f1e4f628931250d9 100644
--- a/tests/googletest/feature_creation/feature_generation/test_model_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_model_node.cc
@@ -23,7 +23,7 @@ namespace
     protected:
         void SetUp() override
         {
-            node_value_arrs::initialize_values_arr({4}, {1}, 3, 0, false);
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
 
             _value_1 = {1.0, 2.0, 3.0, 4.0};
             _test_value_1 =  {5.0};
@@ -100,5 +100,83 @@ namespace
         EXPECT_EQ(feat_4->rung(), 1);
         EXPECT_EQ(feat_4->n_feats(), 0);
         EXPECT_EQ(feat_4->n_feats(), 0);
+
+        try
+        {
+            std::shared_ptr<ModelNode> feat_5 = std::make_shared<ModelNode>(4, 1, "C^2", "$C^2$", "2|squ", "C.^2", _value_3, _test_value_3,  std::vector<std::string>(1, "C"), Unit("m"));
+            EXPECT_TRUE(false) << " (Creation of ModelNode with wrong postfix_expr.)";
+        }
+        catch(const std::logic_error& e)
+        {}
+
+        feat_1->set_value();
+        EXPECT_EQ(feat_1->value()[0], _value_1[0]);
+
+        feat_1->set_test_value();
+        EXPECT_EQ(feat_1->test_value()[0], _test_value_1[0]);
+
+        #ifdef PARAMETERIZE
+        try
+        {
+            std::shared_ptr<ModelNode> feat_5 = std::make_shared<ModelNode>(4, 1, "(C + 1.0)^2", "$\\left(C + 1\\right)^2$", "2|sq: 1.0, 1.0, 2.0", "(C + 1).^2", _value_3, _test_value_3,  std::vector<std::string>(1, "C"), Unit("m"));
+            EXPECT_TRUE(false) << " (Creation of ModelNode with wrong postfix_expr.)";
+        }
+        catch(const std::logic_error& e)
+        {}
+        std::vector<double> params = {1.0, 1.0};
+        EXPECT_STREQ(feat_1->matlab_fxn_expr(params.data()).c_str(), feat_1->matlab_fxn_expr().c_str());
+        #endif
+    }
+
+    TEST_F(ModelNodeTest, EvalTest)
+    {
+        std::shared_ptr<ModelNode> feat_1 = std::make_shared<ModelNode>(0, 1, "A^2", "$A^2$", "0|sq", "A^2", _value_1, _test_value_1,  std::vector<std::string>(1, "A"), Unit("m"));
+
+        std::vector<double> pt = {2.0};
+        std::vector<std::vector<double>> pts = {{2.0}};
+
+        std::map<std::string, double> pt_dct;
+        pt_dct["A"] = 2.0;
+
+        std::map<std::string, std::vector<double>> pts_dct;
+        pts_dct["A"] = {2.0};
+
+        EXPECT_LT(std::abs(feat_1->eval(pt) - 4.0), 1e-10);
+        EXPECT_LT(std::abs(feat_1->eval(pt_dct) - 4.0), 1e-10);
+
+        double val = feat_1->eval(pts)[0];
+        EXPECT_LT(std::abs(val - 4.0), 1e-10);
+
+        val = feat_1->eval(pts_dct)[0];
+        EXPECT_LT(std::abs(val - 4.0), 1e-10);
+    }
+
+    TEST_F(ModelNodeTest, EvalFailTest)
+    {
+        std::shared_ptr<ModelNode> feat_1 = std::make_shared<ModelNode>(0, 1, "A^2", "$A^2$", "1|0|sq", "A^2", _value_1, _test_value_1,  std::vector<std::string>(1, "A"), Unit("m"));
+        EXPECT_THROW(feat_1->eval(_value_1.data()), std::logic_error);
+        EXPECT_THROW(feat_1->eval(_value_1), std::logic_error);
+
+        std::vector<std::vector<double>> values = {{1.0}, {1.0}};
+        EXPECT_THROW(feat_1->eval(values.data()), std::logic_error);
+        EXPECT_THROW(feat_1->eval(values), std::logic_error);
+
+        std::vector<std::string> x_in = {"A", "B"};
+        feat_1 = std::make_shared<ModelNode>(0, 1, "A + B", "$A + B$", "0|add", "(A + B)", _value_1, _test_value_1, x_in, Unit("m"));
+        EXPECT_THROW(feat_1->eval(_value_1), std::logic_error);
+        EXPECT_THROW(feat_1->eval(values.data()), std::logic_error);
+
+        values = {{1.0}, {1.0, 1.0}};
+        EXPECT_THROW(feat_1->eval(values), std::logic_error);
+
+        feat_1 = std::make_shared<ModelNode>(0, 1, "A^2", "$A^2$", "0|sq", "A^2", _value_1, _test_value_1,  std::vector<std::string>(1, "A"), Unit("m"));
+        EXPECT_THROW(feat_1->eval(_value_1), std::logic_error);
+
+        std::map<std::string, double> pt_dct;
+        pt_dct["B"] = 1.0;
+
+        std::map<std::string, std::vector<double>> pts_dct;
+        pt_dct["B"] = {1.0};
+        EXPECT_THROW(feat_1->eval(pt_dct), std::logic_error);
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_mult_node.cc b/tests/googletest/feature_creation/feature_generation/test_mult_node.cc
index eecf8f70aa1863594069460fed4905138099c142..b7c61bcc1d0a38d5afed2e0691827712c616d32d 100644
--- a/tests/googletest/feature_creation/feature_generation/test_mult_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_mult_node.cc
@@ -24,7 +24,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {1.0, 2.0, 3.0, 4.0};
             std::vector<double> test_value_1 =  {5.0};
@@ -195,5 +200,32 @@ namespace
 
         EXPECT_STREQ(_mult_test->expr().c_str(), "((A * B) * B)");
         EXPECT_STREQ(_mult_test->postfix_expr().c_str(), "0|1|mult|1|mult");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_mult_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_mult_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_mult_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_mult_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_mult_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(4, 0.0);
+        EXPECT_THROW(_mult_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _mult_test->set_parameters({}, true);
+        _mult_test->set_parameters(nullptr);
+        _mult_test->param_derivative(params.data(), dfdp.data());
+
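+        // d(u * x) / dx = u, i.e., the value of the first operand (A * B)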
+        EXPECT_EQ(dfdp[0], 10.0);
+        EXPECT_EQ(dfdp[1], 50.0);
+
+        EXPECT_EQ(dfdp[2], 90.0);
+        EXPECT_EQ(dfdp[3], 160.0);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_neg_exp_node.cc b/tests/googletest/feature_creation/feature_generation/test_neg_exp_node.cc
index 55d5707ab8f10bb0edbc41c4da200773761c10b9..899fc647cc13f8750b9d57b69c633f4aaacc483e 100644
--- a/tests/googletest/feature_creation/feature_generation/test_neg_exp_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_neg_exp_node.cc
@@ -29,7 +29,12 @@ namespace
     protected:
         void SetUp() override
         {
-            node_value_arrs::initialize_values_arr({4}, {1}, 3, 2, false);
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {0.0, 2.0, 3.0, 4.0};
             std::vector<double> test_value_1 =  {0.0};
@@ -210,5 +215,32 @@ namespace
 
         EXPECT_STREQ(_neg_exp_test->expr().c_str(), "(exp(-1.0 * A))");
         EXPECT_STREQ(_neg_exp_test->postfix_expr().c_str(), "0|nexp");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_neg_exp_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_neg_exp_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_neg_exp_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_neg_exp_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_neg_exp_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(2, 0.0);
+        EXPECT_THROW(_neg_exp_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _neg_exp_test->set_parameters({}, true);
+        _neg_exp_test->set_parameters(nullptr);
+        _neg_exp_test->param_derivative(params.data(), dfdp.data());
+
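+        // d(exp(-A)) / dA = -exp(-A), so the derivative is the negative of the feature value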
+        EXPECT_LT(std::abs(dfdp[0] + _neg_exp_test->value()[0]), 1e-10);
+        EXPECT_LT(std::abs(dfdp[1] + _neg_exp_test->value()[1]), 1e-10);
+
+        EXPECT_LT(std::abs(dfdp[2] + _neg_exp_test->value()[2]), 1e-10);
+        EXPECT_LT(std::abs(dfdp[3] + _neg_exp_test->value()[3]), 1e-10);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_sin_node.cc b/tests/googletest/feature_creation/feature_generation/test_sin_node.cc
index b00850baa4cf805c696e3c55bca4ce3a3ebe3006..9636f59caf4cf7f6d48bd8c0754b59370e08ef07 100644
--- a/tests/googletest/feature_creation/feature_generation/test_sin_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_sin_node.cc
@@ -25,7 +25,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {0.0, 2.0, 3.0, 4.0};
             std::vector<double> test_value_1 =  {0.0};
@@ -181,5 +186,32 @@ namespace
 
         EXPECT_STREQ(_sin_test->expr().c_str(), "sin(A)");
         EXPECT_STREQ(_sin_test->postfix_expr().c_str(), "0|sin");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_sin_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_sin_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_sin_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_sin_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_sin_test->parameters().size(), 0);
+
+        std::vector<double> params = {M_PI, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(2, 0.0);
+        EXPECT_THROW(_sin_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _sin_test->set_parameters({}, true);
+        _sin_test->set_parameters(nullptr);
+        _sin_test->param_derivative(params.data(), dfdp.data());
+
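+        // d(sin(alpha * A)) / d(alpha * A) = cos(alpha * A); with alpha = pi and A = {0, 2, 3, 4} this is {1, 1, -1, 1}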
+        EXPECT_LT(std::abs(dfdp[0] - 1.0), 1e-10);
+        EXPECT_LT(std::abs(dfdp[1] - 1.0), 1e-10);
+
+        EXPECT_LT(std::abs(dfdp[2] + 1.0), 1e-10);
+        EXPECT_LT(std::abs(dfdp[3] - 1.0), 1e-10);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_six_pow_node.cc b/tests/googletest/feature_creation/feature_generation/test_six_pow_node.cc
index 3c5969682566a9352a5f4dbc9f18dfbc5b178430..f6a419502c4aae773bac230677c359e108c1cdf2 100644
--- a/tests/googletest/feature_creation/feature_generation/test_six_pow_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_six_pow_node.cc
@@ -30,7 +30,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {1.0, 2.0, 3.0, 4.0};
             std::vector<double> test_value_1 =  {2.0};
@@ -207,5 +212,32 @@ namespace
 
         EXPECT_STREQ(_six_pow_test->expr().c_str(), "(A^6)");
         EXPECT_STREQ(_six_pow_test->postfix_expr().c_str(), "0|sp");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_six_pow_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_six_pow_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_six_pow_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_six_pow_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_six_pow_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(2, 0.0);
+        EXPECT_THROW(_six_pow_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _six_pow_test->set_parameters({}, true);
+        _six_pow_test->set_parameters(nullptr);
+        _six_pow_test->param_derivative(params.data(), dfdp.data());
+
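+        // d(A^6) / dA = 6 * A^5; with A = {1, 2, 3, 4} this is {6, 192, 1458, 6144}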
+        EXPECT_EQ(dfdp[0], 6.0);
+        EXPECT_EQ(dfdp[1], 192.0);
+
+        EXPECT_EQ(dfdp[2], 1458.0);
+        EXPECT_EQ(dfdp[3], 6144.0);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_sq_node.cc b/tests/googletest/feature_creation/feature_generation/test_sq_node.cc
index 5bf192801366d35351de994f2b0b08b7ee9551c6..81ea2229b45cd40e6da9fe5952e8241dfdb774ee 100644
--- a/tests/googletest/feature_creation/feature_generation/test_sq_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_sq_node.cc
@@ -27,7 +27,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {1.0, 2.0, 3.0, 8.0};
             std::vector<double> test_value_1 =  {2.0};
@@ -167,5 +172,32 @@ namespace
 
         EXPECT_STREQ(_sq_test->expr().c_str(), "(A^2)");
         EXPECT_STREQ(_sq_test->postfix_expr().c_str(), "0|sq");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_sq_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_sq_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_sq_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_sq_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_sq_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(2, 0.0);
+        EXPECT_THROW(_sq_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _sq_test->set_parameters({}, true);
+        _sq_test->set_parameters(nullptr);
+        _sq_test->param_derivative(params.data(), dfdp.data());
+
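+        // d(A^2) / dA = 2 * A; with A = {1, 2, 3, 8} this is {2, 4, 6, 16}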
+        EXPECT_EQ(dfdp[0], 2.0);
+        EXPECT_EQ(dfdp[1], 4.0);
+
+        EXPECT_EQ(dfdp[2], 6.0);
+        EXPECT_EQ(dfdp[3], 16.0);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_sqrt_node.cc b/tests/googletest/feature_creation/feature_generation/test_sqrt_node.cc
index 0e47ee5555ff1d61457bd3f5d97da7ef4a8b7e63..9802acce87d03ab51a4bec0875348eb4b13de890 100644
--- a/tests/googletest/feature_creation/feature_generation/test_sqrt_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_sqrt_node.cc
@@ -30,7 +30,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {1.0, 2.0, 3.0, 4.0};
             std::vector<double> test_value_1 =  {4.0};
@@ -218,5 +223,29 @@ namespace
 
         EXPECT_STREQ(_sqrt_test->expr().c_str(), "sqrt(A)");
         EXPECT_STREQ(_sqrt_test->postfix_expr().c_str(), "0|sqrt");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_sqrt_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_sqrt_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_sqrt_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_sqrt_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_sqrt_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(2, 0.0);
+        EXPECT_THROW(_sqrt_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _sqrt_test->set_parameters({}, true);
+        _sqrt_test->set_parameters(nullptr);
+        _sqrt_test->param_derivative(params.data(), dfdp.data());
+
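+        // d(sqrt(A)) / dA = 1 / (2 * sqrt(A)): 0.5 at A = 1 and 0.25 at A = 4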
+        EXPECT_EQ(dfdp[0], 0.5);
+        EXPECT_EQ(dfdp[3], 0.25);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_generation/test_sub_node.cc b/tests/googletest/feature_creation/feature_generation/test_sub_node.cc
index f4aad1568d82a9795835493d344c89dccbb0ec96..34f02096f0a5f4b7efda561d3d28035d55cfa813 100644
--- a/tests/googletest/feature_creation/feature_generation/test_sub_node.cc
+++ b/tests/googletest/feature_creation/feature_generation/test_sub_node.cc
@@ -24,7 +24,12 @@ namespace
     protected:
         void SetUp() override
         {
+            #ifdef PARAMETERIZE
+            node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, true);
+            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            #else
             node_value_arrs::initialize_values_arr({4}, {1}, 4, 2, false);
+            #endif
 
             std::vector<double> value_1 = {1.0, 2.0, 3.0, 4.0};
             std::vector<double> test_value_1 =  {5.0};
@@ -194,5 +199,32 @@ namespace
 
         EXPECT_STREQ(_sub_test->expr().c_str(), "((A - B) - B)");
         EXPECT_STREQ(_sub_test->postfix_expr().c_str(), "0|1|sub|1|sub");
+
+        #ifdef PARAMETERIZE
+        EXPECT_THROW(_sub_test->param_pointer(), std::logic_error);
+        EXPECT_EQ(_sub_test->n_params(), 0);
+
+        nlopt_wrapper::MAX_PARAM_DEPTH = 0;
+        EXPECT_EQ(_sub_test->n_params_possible(), 0);
+        nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+        EXPECT_EQ(_sub_test->n_params_possible(), 2);
+
+        EXPECT_EQ(_sub_test->parameters().size(), 0);
+
+        std::vector<double> params = {1.0, 0.0};
+        std::vector<double> dfdp(4, 0.0);
+        std::vector<double> grad(4, 0.0);
+        EXPECT_THROW(_sub_test->gradient(grad.data(), dfdp.data()), std::logic_error);
+
+        _sub_test->set_parameters({}, true);
+        _sub_test->set_parameters(nullptr);
+        _sub_test->param_derivative(params.data(), dfdp.data());
+
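+        // d(u - (alpha * v + a)) / d(alpha * v + a) = -1 for every sample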
+        EXPECT_EQ(dfdp[0], -1.0);
+        EXPECT_EQ(dfdp[1], -1.0);
+
+        EXPECT_EQ(dfdp[2], -1.0);
+        EXPECT_EQ(dfdp[3], -1.0);
+        #endif
     }
 }
diff --git a/tests/googletest/feature_creation/feature_space/test_feat_space.cc b/tests/googletest/feature_creation/feature_space/test_feat_space.cc
index 3d4bc533f772af9f9638ef746d47f54c05bf5eee..7fc4b67a91926f837789f5b63735cfadb2870114 100644
--- a/tests/googletest/feature_creation/feature_space/test_feat_space.cc
+++ b/tests/googletest/feature_creation/feature_space/test_feat_space.cc
@@ -31,7 +31,7 @@ namespace
             std::vector<int> task_sizes = {5, 5};
             int n_samp = std::accumulate(task_sizes.begin(), task_sizes.end(), 0);
 
-            node_value_arrs::initialize_values_arr(task_sizes, {0, 0}, 3, 2, false);
+            node_value_arrs::initialize_values_arr(task_sizes, {0, 0}, 4, 2, false);
             node_value_arrs::initialize_d_matrix_arr();
 
             std::vector<double> value_1(n_samp, 0.0);
@@ -61,6 +61,7 @@ namespace
             FeatureNode feat_1(0, "A", value_1, std::vector<double>(), Unit("m"));
             FeatureNode feat_2(1, "B", value_2, std::vector<double>(), Unit("m"));
             FeatureNode feat_3(2, "C", value_3, std::vector<double>(), Unit("s"));
+            FeatureNode feat_4(3, "D", std::vector<double>(10, 1.0), std::vector<double>(), Unit("s"));
 
             std::vector<FeatureNode> phi_0 = {feat_1, feat_2, feat_3};
 
@@ -232,4 +233,15 @@ namespace
         boost::filesystem::remove_all("feature_space/");
         prop_sorted_d_mat::finalize_sorted_d_matrix_arr();
     }
+
+    TEST_F(FeatSpaceTest, CheckFailedSISTest)
+    {
+        _inputs.set_calc_type("regression");
+        _inputs.set_prop_train(_prop);
+        _inputs.set_n_sis_select(1000000);
+
+        FeatureSpace feat_space(_inputs);
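+        // with n_sis_select far larger than the candidate pool, SIS cannot complete its selection and should throw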
+        EXPECT_THROW(feat_space.sis(_prop), std::logic_error);
+    }
+
 }
diff --git a/tests/googletest/feature_creation/parameterization/test_abs_diff_node.cc b/tests/googletest/feature_creation/parameterization/test_abs_diff_node.cc
index 79cd04b804cb3b42928e946f3fdda5d279c6cd8a..f125b860a110037d6674c0b8cc8790ee2b6e8588 100644
--- a/tests/googletest/feature_creation/parameterization/test_abs_diff_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_abs_diff_node.cc
@@ -13,7 +13,7 @@
 // limitations under the License.
 #ifdef PARAMETERIZE
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/abs_diff/parameterized_absolute_difference.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/exp/exponential.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -26,14 +26,13 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
-            _task_sizes_train = {90};
-            _task_sizes_test = {10};
+            _task_sizes_train = {900};
+            _task_sizes_test = {100};
 
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 2, 2, true);
 
-
             std::vector<double> value_1(_task_sizes_train[0], 0.0);
             std::vector<double> value_2(_task_sizes_train[0], 0.0);
 
@@ -59,17 +58,19 @@ namespace
             _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
             _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("s"));
 
-            _phi = {_feat_1, _feat_2};
+            node_ptr feat_3 = std::make_shared<ExpNode>(_feat_1, 2);
+
+            _phi = {_feat_1, _feat_2, feat_3};
             _a = distribution_params(generator);
             _alpha = distribution_params(generator);
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
-            allowed_op_funcs::abs_diff(_task_sizes_train[0], _phi[0]->value_ptr(), _phi[1]->value_ptr(), _alpha, _a, _prop.data());
+            allowed_op_funcs::abs_diff(_task_sizes_train[0], _phi[1]->value_ptr(), _phi[2]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression", _task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -99,15 +100,15 @@ namespace
         unsigned long int feat_ind = _phi.size();
 
         generateAbsDiffParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e-50, 1e-40, _optimizer);
-        EXPECT_EQ(_phi.size(), 2) << " (AbsDiffParamNode created with an absolute value above the upper bound)";
+        EXPECT_EQ(_phi.size(), 3) << " (AbsDiffParamNode created with an absolute value above the upper bound)";
 
         generateAbsDiffParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e49, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 2) << " (AbsDiffParamNode created with an absolute value below the lower bound)";
+        EXPECT_EQ(_phi.size(), 3) << " (AbsDiffParamNode created with an absolute value below the lower bound)";
 
-        generateAbsDiffParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 3) << " (Failure to create a valid feature)";
+        generateAbsDiffParamNode(_phi, _phi[1], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+        EXPECT_EQ(_phi.size(), 4) << " (Failure to create a valid feature)";
 
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 90), 1e-4);
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-4);
     }
 
     TEST_F(AbsDiffParamNodeTest, ConstructorTest)
@@ -132,8 +133,8 @@ namespace
 
         try
         {
-            _abs_diff_test = std::make_shared<AbsDiffParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _abs_diff_test->value_ptr(), 90), 1e-4);
+            _abs_diff_test = std::make_shared<AbsDiffParamNode>(_phi[1], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _abs_diff_test->value_ptr(), _task_sizes_train[0]), 1e-4);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -144,60 +145,67 @@ namespace
     TEST_F(AbsDiffParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _abs_diff_test = std::make_shared<AbsDiffParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _abs_diff_test = std::make_shared<AbsDiffParamNode>(_phi[1], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
         node_ptr copy_test = _abs_diff_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(90, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::abs_diff(90, _phi[0]->value_ptr(), _phi[1]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::abs_diff(_task_sizes_train[0], _phi[1]->value_ptr(), _phi[2]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::abs_diff(10, _phi[0]->test_value_ptr(), _phi[1]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::abs_diff(_task_sizes_test[0], _phi[1]->test_value_ptr(), _phi[2]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|1|abd: " << std::setprecision(13) << std::scientific << copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
-        EXPECT_STREQ(copy_test->unit().toString().c_str(), "m");
+        postfix << "1|0|exp|abd: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(copy_test->unit().toString().c_str(), "s");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
 
         double v1 = copy_test->feat(0)->value_ptr()[0];
-        double v2 = copy_test->feat(1)->value_ptr()[0];
+        double v2 = copy_test->feat(1)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = -1.0 * util_funcs::sign(v1 - (alpha * v2 + a));
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v2), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v2), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(AbsDiffParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _abs_diff_test = std::make_shared<AbsDiffParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _abs_diff_test = std::make_shared<AbsDiffParamNode>(_phi[1], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_abs_diff_test->rung(), 1);
+        _abs_diff_test->set_value();
+        _abs_diff_test->set_test_value();
 
-        std::vector<double> expected_val(90, 0.0);
+        EXPECT_EQ(_abs_diff_test->rung(), 2);
 
-        allowed_op_funcs::abs_diff(90, _phi[0]->value_ptr(), _phi[1]->value_ptr(), _abs_diff_test->parameters()[0], _abs_diff_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_abs_diff_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_abs_diff_test->value()[0] - expected_val[0]), 1e-10);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _abs_diff_test->parameters();
 
-        allowed_op_funcs::abs_diff(10, _phi[0]->test_value_ptr(), _phi[1]->test_value_ptr(), _abs_diff_test->parameters()[0], _abs_diff_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_abs_diff_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_abs_diff_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::abs_diff(_task_sizes_train[0], _phi[1]->value_ptr(), _phi[2]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_abs_diff_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_abs_diff_test->value()[0] - expected_val[0]), 1e-5);
+
+        allowed_op_funcs::abs_diff(_task_sizes_test[0], _phi[1]->test_value_ptr(), _phi[2]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_abs_diff_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_abs_diff_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|1|abd: " << std::setprecision(13) << std::scientific << _abs_diff_test->parameters()[0] << ',' << _abs_diff_test->parameters()[1];
-        EXPECT_STREQ(_abs_diff_test->unit().toString().c_str(), "m");
+        postfix << "1|0|exp|abd: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(_abs_diff_test->unit().toString().c_str(), "s");
         EXPECT_STREQ(_abs_diff_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_abs_diff_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
diff --git a/tests/googletest/feature_creation/parameterization/test_abs_node.cc b/tests/googletest/feature_creation/parameterization/test_abs_node.cc
index 61f9c13196d0b62a11e4be45bc942e4779a4bdbe..f8d0bd487e2e2bf1607a7cfb13a8d10ee778469d 100644
--- a/tests/googletest/feature_creation/parameterization/test_abs_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_abs_node.cc
@@ -13,7 +13,7 @@
 // limitations under the License.
 #ifdef PARAMETERIZE
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/abs/parameterized_absolute_value.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/log/log.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -26,10 +26,10 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
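+            // Allow parameters two levels deep; the rung-2 features below carry four optimized parameters.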
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
             _task_sizes_train = {900};
-            _task_sizes_test = {10};
+            _task_sizes_test = {100};
 
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 1, 2, true);
 
@@ -37,28 +37,33 @@ namespace
             std::vector<double> test_value_1(_task_sizes_test[0], 0.0);
 
             std::default_random_engine generator;
-            std::uniform_real_distribution<double> distribution_feats(-50.0, 50.0);
+            std::uniform_real_distribution<double> distribution_feats(-7.5, 7.5);
             std::uniform_real_distribution<double> distribution_params(-2.50, 2.50);
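+            // Draw strictly positive training values (exp of a uniform sample) so the LogNode child is well defined.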
 
             for(int ii = 0; ii < _task_sizes_train[0]; ++ii)
-                value_1[ii] = distribution_feats(generator);
+            {
+                value_1[ii] = std::exp(distribution_feats(generator));
+            }
 
             for(int ii = 0; ii < _task_sizes_test[0]; ++ii)
-                test_value_1[ii] = distribution_feats(generator);
+            {
+                test_value_1[ii] = std::exp(distribution_feats(generator));
+            }
 
-            _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
-            _phi = {_feat_1};
+            _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m^3"));
+            node_ptr feat_2 = std::make_shared<LogNode>(_feat_1, 1);
+            _phi = {_feat_1, feat_2};
 
             _a = distribution_params(generator);
             _alpha = distribution_params(generator);
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
-            allowed_op_funcs::abs(_task_sizes_train[0], _phi[0]->value_ptr(), _alpha, _a, _prop.data());
+            allowed_op_funcs::abs(_task_sizes_train[0], _phi[1]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression", _task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -85,15 +90,15 @@ namespace
     {
         unsigned long int feat_ind = _phi.size();
 
-        generateAbsParamNode(_phi, _phi[0], feat_ind, 1e-50, 1e-40, _optimizer);
-        EXPECT_EQ(_phi.size(), 1) << " (AbsParamNode created with an absolute value above the upper bound)";
+        generateAbsParamNode(_phi, _phi[1], feat_ind, 1e-50, 1e-40, _optimizer);
+        EXPECT_EQ(_phi.size(), 2) << " (AbsParamNode created with an absolute value above the upper bound)";
 
-        generateAbsParamNode(_phi, _phi[0], feat_ind, 1e49, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 1) << " (AbsParamNode created with an absolute value below the lower bound)";
+        generateAbsParamNode(_phi, _phi[1], feat_ind, 1e49, 1e50, _optimizer);
+        EXPECT_EQ(_phi.size(), 2) << " (AbsParamNode created with an absolute value below the lower bound)";
 
-        generateAbsParamNode(_phi, _phi[0], feat_ind, 1e-50, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 2) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 900), 1e-4);
+        generateAbsParamNode(_phi, _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        EXPECT_EQ(_phi.size(), 3) << " (Failure to create a valid feature)";
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-4);
     }
 
     TEST_F(AbsParamNodeTest, ConstructorTest)
@@ -102,7 +107,7 @@ namespace
 
         try
         {
-            _abs_test = std::make_shared<AbsParamNode>(_phi[0], feat_ind, 1e-50, 1e-40, _optimizer);
+            _abs_test = std::make_shared<AbsParamNode>(_phi[1], feat_ind, 1e-50, 1e-40, _optimizer);
             EXPECT_TRUE(false) << " (AbsParamNode created with an absolute value above the upper bound)";
         }
         catch(const InvalidFeatureException& e)
@@ -110,7 +115,7 @@ namespace
 
         try
         {
-            _abs_test = std::make_shared<AbsParamNode>(_phi[0], feat_ind, 1e40, 1e50, _optimizer);
+            _abs_test = std::make_shared<AbsParamNode>(_phi[1], feat_ind, 1e40, 1e50, _optimizer);
             EXPECT_TRUE(false) << " (AbsParamNode created with an absolute value below the lower bound)";
         }
         catch(const InvalidFeatureException& e)
@@ -118,8 +123,8 @@ namespace
 
         try
         {
-            _abs_test = std::make_shared<AbsParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _abs_test->value_ptr(), 900), 1e-4);
+            _abs_test = std::make_shared<AbsParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _abs_test->value_ptr(), _task_sizes_train[0]), 1e-4);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -130,25 +135,25 @@ namespace
     TEST_F(AbsParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _abs_test = std::make_shared<AbsParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        _abs_test = std::make_shared<AbsParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
 
         node_ptr copy_test = _abs_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(900, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
 
-        allowed_op_funcs::abs(900, _phi[0]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::abs(_task_sizes_train[0], _phi[1]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::abs(10, _phi[0]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::abs(_task_sizes_test[0], _phi[1]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|abs: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
-        EXPECT_STREQ(copy_test->unit().toString().c_str(), "m");
+        postfix << "0|log|abs: " << std::setprecision(13) << std::scientific << copy_test->parameters()[0] << ',' << copy_test->parameters()[1] << ',' << copy_test->parameters()[2] << ',' << copy_test->parameters()[3];
+        EXPECT_STREQ(copy_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
@@ -158,31 +163,36 @@ namespace
         double a = copy_test->parameters()[1];
         double df_dp = util_funcs::sign(alpha * v1 + a);
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(AbsParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _abs_test = std::make_shared<AbsParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        _abs_test = std::make_shared<AbsParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_abs_test->rung(), 1);
+        _abs_test->set_value();
+        _abs_test->set_test_value();
 
-        std::vector<double> expected_val(900, 0.0);
+        EXPECT_EQ(_abs_test->rung(), 2);
 
-        allowed_op_funcs::abs(900, _phi[0]->value_ptr(), _abs_test->parameters()[0], _abs_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_abs_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_abs_test->value()[0] - expected_val[0]), 1e-10);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
 
-        allowed_op_funcs::abs(10, _phi[0]->test_value_ptr(), _abs_test->parameters()[0], _abs_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_abs_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_abs_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::abs(_task_sizes_train[0], _phi[1]->value_ptr(), _abs_test->parameters()[0], _abs_test->parameters()[1], expected_val.data());
+        EXPECT_LT(std::abs(_abs_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_abs_test->value()[0] - expected_val[0]), 1e-5);
+
+        allowed_op_funcs::abs(_task_sizes_test[0], _phi[1]->test_value_ptr(), _abs_test->parameters()[0], _abs_test->parameters()[1], expected_val.data());
+        EXPECT_LT(std::abs(_abs_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_abs_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|abs: " << std::setprecision(13) << std::scientific <<_abs_test->parameters()[0] << ',' << _abs_test->parameters()[1];
-        EXPECT_STREQ(_abs_test->unit().toString().c_str(), "m");
+        postfix << "0|log|abs: " << std::setprecision(13) << std::scientific << _abs_test->parameters()[0] << ',' << _abs_test->parameters()[1] << ',' << _abs_test->parameters()[2] << ',' << _abs_test->parameters()[3];
+        EXPECT_STREQ(_abs_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(_abs_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_abs_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
diff --git a/tests/googletest/feature_creation/parameterization/test_add_node.cc b/tests/googletest/feature_creation/parameterization/test_add_node.cc
index 6043a497cf4df9260a9bc18e817153ceff09bcec..5510c9a9c0b6fb7c7777528367e8c2fb0fa6f98d 100644
--- a/tests/googletest/feature_creation/parameterization/test_add_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_add_node.cc
@@ -14,7 +14,6 @@
 #ifdef PARAMETERIZE
 
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/add/parameterized_add.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -27,10 +26,10 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
-            _task_sizes_train = {90};
-            _task_sizes_test = {10};
+            _task_sizes_train = {900};
+            _task_sizes_test = {100};
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 2, 2, true);
 
 
@@ -57,19 +56,21 @@ namespace
             }
 
             _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
-            _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("s"));
+            _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("m"));
 
-            _phi = {_feat_1, _feat_2};
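+            // Both inputs now share Unit("m"), and the rung-1 AddNode child makes the parameterized feature a depth-2 tree.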
+            node_ptr feat_3 = std::make_shared<AddNode>(_feat_1, _feat_2, 2);
+
+            _phi = {_feat_1, _feat_2, feat_3};
             _a = distribution_params(generator);
             _alpha = distribution_params(generator);
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
-            allowed_op_funcs::add(_task_sizes_train[0], _phi[0]->value_ptr(), _phi[1]->value_ptr(), _alpha, _a, _prop.data());
+            allowed_op_funcs::add(_task_sizes_train[0], _phi[1]->value_ptr(), _phi[2]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression", _task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -98,14 +99,14 @@ namespace
         unsigned long int feat_ind = _phi.size();
 
         generateAddParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e-50, 1e-40, _optimizer);
-        EXPECT_EQ(_phi.size(), 2) << " (AddParamNode created with an absolute value above the upper bound)";
+        EXPECT_EQ(_phi.size(), 3) << " (AddParamNode created with an absolute value above the upper bound)";
 
         generateAddParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e49, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 2) << " (AddParamNode created with an absolute value below the lower bound)";
+        EXPECT_EQ(_phi.size(), 3) << " (AddParamNode created with an absolute value below the lower bound)";
 
-        generateAddParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 3) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 90), 1e-10);
+        generateAddParamNode(_phi, _phi[1], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+        EXPECT_EQ(_phi.size(), 4) << " (Failure to create a valid feature)";
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-5);
     }
 
     TEST_F(AddParamNodeTest, ConstructorTest)
@@ -130,8 +131,8 @@ namespace
 
         try
         {
-            _add_test = std::make_shared<AddParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _add_test->value_ptr(), 90), 1e-10);
+            _add_test = std::make_shared<AddParamNode>(_phi[1], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _add_test->value_ptr(), _task_sizes_train[0]), 1e-5);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -142,60 +143,67 @@ namespace
     TEST_F(AddParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _add_test = std::make_shared<AddParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _add_test = std::make_shared<AddParamNode>(_phi[1], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+
+        _add_test->set_value();
+        _add_test->set_test_value();
 
         node_ptr copy_test = _add_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(90, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::add(90, _phi[0]->value_ptr(), _phi[1]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::add(_task_sizes_train[0], _phi[1]->value_ptr(), _phi[2]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::add(10, _phi[0]->test_value_ptr(), _phi[1]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::add(_task_sizes_test[0], _phi[1]->test_value_ptr(), _phi[2]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|1|add: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
+        postfix << "1|0|1|add|add: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(copy_test->unit().toString().c_str(), "m");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
         double v1 = copy_test->feat(0)->value_ptr()[0];
-        double v2 = copy_test->feat(1)->value_ptr()[0];
+        double v2 = copy_test->feat(1)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
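+        // For v1 + (alpha * v2 + a), d/d(alpha) = v2 and d/d(a) = 1, so the common factor df_dp is 1.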
         double df_dp = 1.0;
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v2), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v2), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(AddParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _add_test = std::make_shared<AddParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _add_test = std::make_shared<AddParamNode>(_phi[1], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_add_test->rung(), 1);
+        EXPECT_EQ(_add_test->rung(), 2);
 
-        std::vector<double> expected_val(90, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _add_test->parameters();
 
-        allowed_op_funcs::add(90, _phi[0]->value_ptr(), _phi[1]->value_ptr(), _add_test->parameters()[0], _add_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_add_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_add_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::add(_task_sizes_train[0], _phi[1]->value_ptr(), _phi[2]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_add_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_add_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::add(10, _phi[0]->test_value_ptr(), _phi[1]->test_value_ptr(), _add_test->parameters()[0], _add_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_add_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_add_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::add(_task_sizes_test[0], _phi[1]->test_value_ptr(), _phi[2]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_add_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_add_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|1|add: " << std::setprecision(13) << std::scientific <<_add_test->parameters()[0] << ',' << _add_test->parameters()[1];
+        postfix << "1|0|1|add|add: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(_add_test->unit().toString().c_str(), "m");
         EXPECT_STREQ(_add_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_add_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
diff --git a/tests/googletest/feature_creation/parameterization/test_cb_node.cc b/tests/googletest/feature_creation/parameterization/test_cb_node.cc
index 634bd30bfc11de9c7c4279aa090f24dfe46a18c1..c89ecb148aca44babaa6f6f70d60e38e88b99967 100644
--- a/tests/googletest/feature_creation/parameterization/test_cb_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_cb_node.cc
@@ -14,7 +14,7 @@
 #ifdef PARAMETERIZE
 
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/cb/parameterized_cube.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/cos/cos.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -27,52 +27,48 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
             _task_sizes_train = {900};
-            _task_sizes_test = {10};
+            _task_sizes_test = {100};
 
-            node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 2, 2, true);
+            node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 1, 2, true);
 
 
             std::vector<double> value_1(_task_sizes_train[0], 0.0);
-            std::vector<double> value_2(_task_sizes_train[0], 0.0);
 
             std::vector<double> test_value_1(_task_sizes_test[0], 0.0);
-            std::vector<double> test_value_2(_task_sizes_test[0], 0.0);
 
             std::default_random_engine generator;
             std::uniform_real_distribution<double> distribution_feats(-500.0, 500.0);
-            std::uniform_real_distribution<double> distribution_params(1e-10, 1.50);
+            std::uniform_real_distribution<double> distribution_params(-1.25, 1.25);
             std::normal_distribution<double> distribution_err(0.0, 0.01);
 
             for(int ii = 0; ii < _task_sizes_train[0]; ++ii)
             {
                 value_1[ii] = distribution_feats(generator);
-                value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
             }
 
             for(int ii = 0; ii < _task_sizes_test[0]; ++ii)
             {
                 test_value_1[ii] = distribution_feats(generator);
-                test_value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
             }
 
-            _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
-            _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("s"));
+            _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit(""));
+            node_ptr feat_2 = std::make_shared<CosNode>(_feat_1, 1);
 
-            _phi = {_feat_1, _feat_2};
+            _phi = {_feat_1, feat_2};
             _a = distribution_params(generator);
             _alpha = distribution_params(generator);
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
             allowed_op_funcs::cb(_task_sizes_train[0], _phi[1]->value_ptr(), _alpha, _a, _prop.data());
             std::transform(_prop.begin(), _prop.end(), _prop.begin(), [&](double p){return p + distribution_err(generator);});
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression", _task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -81,8 +77,7 @@ namespace
         }
 
         node_ptr _feat_1;
-        node_ptr _feat_2;
-        node_ptr _exp_test;
+        node_ptr _cb_test;
 
         std::vector<node_ptr> _phi;
         std::vector<double> _prop;
@@ -108,7 +103,7 @@ namespace
 
         generateCbParamNode(_phi, _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), 3) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 900), 1e-4);
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-4);
     }
 
     TEST_F(CbParamNodeTest, ConstructorTest)
@@ -117,7 +112,7 @@ namespace
 
         try
         {
-            _exp_test = std::make_shared<CbParamNode>(_phi[1], feat_ind, 1e-50, 1e-40, _optimizer);
+            _cb_test = std::make_shared<CbParamNode>(_phi[1], feat_ind, 1e-50, 1e-40, _optimizer);
             EXPECT_TRUE(false) << " (CbParamNode created with an absolute value above the upper bound)";
         }
         catch(const InvalidFeatureException& e)
@@ -125,7 +120,7 @@ namespace
 
         try
         {
-            _exp_test = std::make_shared<CbParamNode>(_phi[1], feat_ind, 1e49, 1e50, _optimizer);
+            _cb_test = std::make_shared<CbParamNode>(_phi[1], feat_ind, 1e49, 1e50, _optimizer);
             EXPECT_TRUE(false) << " (CbParamNode created with an absolute value below the lower bound)";
         }
         catch(const InvalidFeatureException& e)
@@ -133,8 +128,8 @@ namespace
 
         try
         {
-            _exp_test = std::make_shared<CbParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _exp_test->value_ptr(), 900), 1e-4);
+            _cb_test = std::make_shared<CbParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _cb_test->value_ptr(), _task_sizes_train[0]), 1e-4);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -145,59 +140,65 @@ namespace
     TEST_F(CbParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _exp_test = std::make_shared<CbParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _cb_test = std::make_shared<CbParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
 
-        node_ptr copy_test = _exp_test->hard_copy();
+        node_ptr copy_test = _cb_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(900, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::cb(900, _phi[1]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::cb(_task_sizes_train[0], _phi[1]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::cb(10, _phi[1]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::cb(_task_sizes_test[0], _phi[1]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "1|cb: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
-        EXPECT_STREQ(copy_test->unit().toString().c_str(), "s^3");
+        postfix << "0|cos|cb: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(copy_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
-        double v1 = copy_test->feat(0)->value_ptr()[0];
-
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double v1 = copy_test->feat(0)->value_ptr(&params[2])[0];
+        double alpha = params[0];
+        double a = params[1];
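+        // Chain rule for (alpha * v1 + a)^3: df_dp = 3 * (alpha * v1 + a)^2 multiplies v1 for d/d(alpha) and 1 for d/d(a).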
         double df_dp = 3.0 * std::pow(alpha * v1 + a, 2.0);
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(CbParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _exp_test = std::make_shared<CbParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _cb_test = std::make_shared<CbParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+
+        _cb_test->set_value();
+        _cb_test->set_test_value();
 
-        EXPECT_EQ(_exp_test->rung(), 1);
+        EXPECT_EQ(_cb_test->rung(), 2);
 
-        std::vector<double> expected_val(900, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _cb_test->parameters();
 
-        allowed_op_funcs::cb(900, _phi[1]->value_ptr(), _exp_test->parameters()[0], _exp_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_exp_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_exp_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::cb(_task_sizes_train[0], _phi[1]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_cb_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_cb_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::cb(10, _phi[1]->test_value_ptr(), _exp_test->parameters()[0], _exp_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_exp_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_exp_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::cb(_task_sizes_test[0], _phi[1]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_cb_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_cb_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "1|cb: " << std::setprecision(13) << std::scientific <<_exp_test->parameters()[0] << ',' << _exp_test->parameters()[1];
-        EXPECT_STREQ(_exp_test->unit().toString().c_str(), "s^3");
-        EXPECT_STREQ(_exp_test->postfix_expr().c_str(), postfix.str().c_str());
+        postfix << "0|cos|cb: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(_cb_test->unit().toString().c_str(), "Unitless");
+        EXPECT_STREQ(_cb_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_cb_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
diff --git a/tests/googletest/feature_creation/parameterization/test_cbrt_node.cc b/tests/googletest/feature_creation/parameterization/test_cbrt_node.cc
index 55bc4601bd77c4dfe072aa8d96282db46ee45fea..412d7c855a49796505543de763a46fc5f6bc78bd 100644
--- a/tests/googletest/feature_creation/parameterization/test_cbrt_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_cbrt_node.cc
@@ -13,8 +13,8 @@
 // limitations under the License.
 #ifdef PARAMETERIZE
 
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/abs/absolute_value.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/cbrt/parameterized_cube_root.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -27,49 +27,45 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
             _task_sizes_train = {900};
-            _task_sizes_test = {10};
-            node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 2, 2, true);
+            _task_sizes_test = {100};
+            node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 1, 2, true);
 
 
             std::vector<double> value_1(_task_sizes_train[0], 0.0);
-            std::vector<double> value_2(_task_sizes_train[0], 0.0);
 
             std::vector<double> test_value_1(_task_sizes_test[0], 0.0);
-            std::vector<double> test_value_2(_task_sizes_test[0], 0.0);
 
             std::default_random_engine generator;
-            std::uniform_real_distribution<double> distribution_feats(10.0, 5000.0);
+            std::uniform_real_distribution<double> distribution_feats(1e-10, 5000.0);
             std::uniform_real_distribution<double> distribution_params(0.5, 1.50);
 
             for(int ii = 0; ii < _task_sizes_train[0]; ++ii)
             {
                 value_1[ii] = distribution_feats(generator);
-                value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
             }
 
             for(int ii = 0; ii < _task_sizes_test[0]; ++ii)
             {
                 test_value_1[ii] = distribution_feats(generator);
-                test_value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
             }
 
-            _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
-            _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("s"));
+            _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m^9"));
+            node_ptr feat_2 = std::make_shared<CbrtNode>(_feat_1, 1);
 
-            _phi = {_feat_1, _feat_2};
+            _phi = {_feat_1, feat_2};
             _a = distribution_params(generator);
-            _alpha = std::pow(distribution_params(generator), 3.0);
+            _alpha = 1.0;
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
             allowed_op_funcs::cbrt(_task_sizes_train[0], _phi[1]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression", _task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -78,7 +74,6 @@ namespace
         }
 
         node_ptr _feat_1;
-        node_ptr _feat_2;
         node_ptr _cbrt_test;
 
         std::vector<node_ptr> _phi;
@@ -105,7 +100,7 @@ namespace
 
         generateCbrtParamNode(_phi, _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), 3) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 900), 1e-4);
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-4);
     }
 
     TEST_F(CbrtParamNodeTest, ConstructorTest)
@@ -131,7 +126,7 @@ namespace
         try
         {
             _cbrt_test = std::make_shared<CbrtParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _cbrt_test->value_ptr(), 900), 1e-4);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _cbrt_test->value_ptr(), _task_sizes_train[0]), 1e-4);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -146,32 +141,33 @@ namespace
 
         node_ptr copy_test = _cbrt_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(900, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::cbrt(900, _phi[1]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::cbrt(_task_sizes_train[0], _phi[1]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::cbrt(10, _phi[1]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::cbrt(_task_sizes_test[0], _phi[1]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "1|cbrt: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
-        EXPECT_STREQ(copy_test->unit().toString().c_str(), "s^0.333333");
+        postfix << "0|cbrt|cbrt: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(copy_test->unit().toString().c_str(), "m");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
-        double v1 = copy_test->feat(0)->value_ptr()[0];
+        double v1 = copy_test->feat(0)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = 1.0 / 3.0 * std::pow(alpha * v1 + a, -2.0 / 3.0);
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(CbrtParamNodeTest, AttributesTest)
@@ -179,22 +175,28 @@ namespace
         unsigned long int feat_ind = _phi.size();
         _cbrt_test = std::make_shared<CbrtParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_cbrt_test->rung(), 1);
+        _cbrt_test->set_value();
+        _cbrt_test->set_test_value();
 
-        std::vector<double> expected_val(900, 0.0);
+        EXPECT_EQ(_cbrt_test->rung(), 2);
 
-        allowed_op_funcs::cbrt(900, _phi[1]->value_ptr(), _cbrt_test->parameters()[0], _cbrt_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_cbrt_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_cbrt_test->value()[0] - expected_val[0]), 1e-10);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _cbrt_test->parameters();
 
-        allowed_op_funcs::cbrt(10, _phi[1]->test_value_ptr(), _cbrt_test->parameters()[0], _cbrt_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_cbrt_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_cbrt_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::cbrt(_task_sizes_train[0], _phi[1]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_cbrt_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_cbrt_test->value()[0] - expected_val[0]), 1e-5);
+
+        allowed_op_funcs::cbrt(_task_sizes_test[0], _phi[1]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_cbrt_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_cbrt_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "1|cbrt: " << std::setprecision(13) << std::scientific <<_cbrt_test->parameters()[0] << ',' << _cbrt_test->parameters()[1];
-        EXPECT_STREQ(_cbrt_test->unit().toString().c_str(), "s^0.333333");
+        postfix << "0|cbrt|cbrt: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(_cbrt_test->unit().toString().c_str(), "m");
         EXPECT_STREQ(_cbrt_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_cbrt_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
diff --git a/tests/googletest/feature_creation/parameterization/test_cos_node.cc b/tests/googletest/feature_creation/parameterization/test_cos_node.cc
index 8168f9906a2af9c2a6ae4a5d454c80596814855d..8d2e2b7e3aa94448fc62c8fd6927784b0261e641 100644
--- a/tests/googletest/feature_creation/parameterization/test_cos_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_cos_node.cc
@@ -12,10 +12,10 @@
 // See the License for the specific language governing permissions and
 // limitations under the License.
 #ifdef PARAMETERIZE
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/cb/cube.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/cos/parameterized_cos.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/cos/cos.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/sin/sin.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -28,10 +28,10 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
             _task_sizes_train = {900};
-            _task_sizes_test = {10};
+            _task_sizes_test = {100};
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 3, 2, true);
 
 
@@ -56,24 +56,25 @@ namespace
                 test_value_2[ii] = distribution_feats(generator);
             }
 
-            _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
+            _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("s"));
             _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("s"));
             _feat_3 = std::make_shared<FeatureNode>(2, "B", value_2, test_value_2, Unit(""));
 
             _phi = {_feat_1, _feat_2, _feat_3};
             _phi.push_back(std::make_shared<CosNode>(_feat_3, 3, 1e-50, 1e50));
             _phi.push_back(std::make_shared<SinNode>(_feat_3, 4, 1e-50, 1e50));
+            _phi.push_back(std::make_shared<CbNode>(_feat_1, 5, 1e-50, 1e50));
 
             _a = 0.143;
             _alpha = 1.05;
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
-            allowed_op_funcs::cos(_task_sizes_train[0], _phi[0]->value_ptr(), _alpha, _a, _prop.data());
+            allowed_op_funcs::cos(_task_sizes_train[0], _phi[5]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression", _task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -115,7 +116,7 @@ namespace
         generateCosParamNode(_phi, _phi[4], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), phi_sz) << " (CosParamNode created from SinNode)";
 
-        generateCosParamNode(_phi, _phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        generateCosParamNode(_phi, _phi[5], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), phi_sz + 1) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 90), 1e-5);
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-5);
     }
@@ -158,7 +159,7 @@ namespace
 
         try
         {
-            _cos_test = std::make_shared<CosParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+            _cos_test = std::make_shared<CosParamNode>(_phi[5], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _cos_test->value_ptr(), 90), 1e-5);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _cos_test->value_ptr(), _task_sizes_train[0]), 1e-5);
         }
         catch(const InvalidFeatureException& e)
@@ -170,59 +171,66 @@ namespace
     TEST_F(CosParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _cos_test = std::make_shared<CosParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _cos_test = std::make_shared<CosParamNode>(_phi[5], feat_ind, 1e-50, 1e50, _optimizer);
 
         node_ptr copy_test = _cos_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(900, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::cos(900, _phi[1]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::cos(_task_sizes_train[0], _phi[5]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::cos(10, _phi[1]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::cos(_task_sizes_test[0], _phi[5]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "1|cos: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
+        postfix << "0|cb|cos: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(copy_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
-        double v1 = copy_test->feat(0)->value_ptr()[0];
+        double v1 = copy_test->feat(0)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = -1.0 * std::sin(alpha * v1 + a);
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(CosParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _cos_test = std::make_shared<CosParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _cos_test = std::make_shared<CosParamNode>(_phi[5], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_cos_test->rung(), 1);
+        _cos_test->set_value();
+        _cos_test->set_test_value();
 
-        std::vector<double> expected_val(900, 0.0);
+        EXPECT_EQ(_cos_test->rung(), 2);
 
-        allowed_op_funcs::cos(900, _phi[1]->value_ptr(), _cos_test->parameters()[0], _cos_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_cos_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_cos_test->value()[0] - expected_val[0]), 1e-10);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _cos_test->parameters();
 
-        allowed_op_funcs::cos(10, _phi[1]->test_value_ptr(), _cos_test->parameters()[0], _cos_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_cos_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_cos_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::cos(_task_sizes_train[0], _phi[5]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_cos_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_cos_test->value()[0] - expected_val[0]), 1e-5);
+
+        allowed_op_funcs::cos(_task_sizes_test[0], _phi[5]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_cos_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_cos_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "1|cos: " << std::setprecision(13) << std::scientific <<_cos_test->parameters()[0] << ',' << _cos_test->parameters()[1];
+        postfix << "0|cb|cos: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(_cos_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(_cos_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_cos_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
diff --git a/tests/googletest/feature_creation/parameterization/test_div_node.cc b/tests/googletest/feature_creation/parameterization/test_div_node.cc
index cf268ecad5a9c300f2285a76b5e8d101d0087805..66ff13fcbd5d50420f47459a85a1350c6044f7a5 100644
--- a/tests/googletest/feature_creation/parameterization/test_div_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_div_node.cc
@@ -13,7 +13,6 @@
 // limitations under the License.
 #ifdef PARAMETERIZE
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/div/parameterized_divide.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -26,10 +25,10 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
-            _task_sizes_train = {90};
-            _task_sizes_test = {10};
+            _task_sizes_train = {900};
+            _task_sizes_test = {100};
 
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 2, 2, true);
 
@@ -58,18 +57,19 @@ namespace
 
             _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
             _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("s"));
+            node_ptr feat_3 = std::make_shared<DivNode>(_feat_1, _feat_2, 2);
 
-            _phi = {_feat_1, _feat_2};
+            _phi = {_feat_1, _feat_2, feat_3};
             _a = distribution_params(generator);
             _alpha = distribution_params(generator);
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
-            allowed_op_funcs::div(_task_sizes_train[0], _phi[0]->value_ptr(), _phi[1]->value_ptr(), _alpha, _a, _prop.data());
+            allowed_op_funcs::div(_task_sizes_train[0], _phi[2]->value_ptr(), _phi[1]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression", _task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -98,14 +98,14 @@ namespace
         unsigned long int feat_ind = _phi.size();
 
         generateDivParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e-50, 1e-40, _optimizer);
-        EXPECT_EQ(_phi.size(), 2) << " (DivParamNode created with an absolute value above the upper bound)";
+        EXPECT_EQ(_phi.size(), 3) << " (DivParamNode created with an absolute value above the upper bound)";
 
         generateDivParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e49, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 2) << " (DivParamNode created with an absolute value below the lower bound)";
+        EXPECT_EQ(_phi.size(), 3) << " (DivParamNode created with an absolute value below the lower bound)";
 
-        generateDivParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 3) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 90), 1e-10);
+        generateDivParamNode(_phi, _phi[2], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        EXPECT_EQ(_phi.size(), 4) << " (Failure to create a valid feature)";
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-5);
     }
 
     TEST_F(DivParamNodeTest, ConstructorTest)
@@ -130,8 +130,8 @@ namespace
 
         try
         {
-            _div_test = std::make_shared<DivParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _div_test->value_ptr(), 90), 1e-10);
+            _div_test = std::make_shared<DivParamNode>(_phi[2], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _div_test->value_ptr(), _task_sizes_train[0]), 1e-5);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -142,60 +142,67 @@ namespace
     TEST_F(DivParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _div_test = std::make_shared<DivParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _div_test = std::make_shared<DivParamNode>(_phi[2], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
 
         node_ptr copy_test = _div_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(90, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::div(90, _phi[0]->value_ptr(), _phi[1]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
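+        // params[0] and params[1] are this node's (scale, bias); the child DivNode is evaluated with the remaining pair via value_ptr(&params[2]).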
+        allowed_op_funcs::div(_task_sizes_train[0], _phi[2]->value_ptr(&params[2]), _phi[1]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::div(10, _phi[0]->test_value_ptr(), _phi[1]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::div(_task_sizes_test[0], _phi[2]->test_value_ptr(&params[2]), _phi[1]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|1|div: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
-        EXPECT_STREQ(copy_test->unit().toString().c_str(), "m * s^-1");
+        postfix << "0|1|div|1|div: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(copy_test->unit().toString().c_str(), "m * s^-2");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
-        double v1 = copy_test->feat(0)->value_ptr()[0];
-        double v2 = copy_test->feat(1)->value_ptr()[0];
+        double v1 = copy_test->feat(0)->value_ptr(&params[2])[0];
+        double v2 = copy_test->feat(1)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = -1.0 * v1 / std::pow(alpha * v2 + a, 2.0);
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v2), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v2), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(DivParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _div_test = std::make_shared<DivParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _div_test = std::make_shared<DivParamNode>(_phi[2], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_div_test->rung(), 1);
+        _div_test->set_value();
+        _div_test->set_test_value();
 
-        std::vector<double> expected_val(90, 0.0);
+        EXPECT_EQ(_div_test->rung(), 2);
 
-        allowed_op_funcs::div(90, _phi[0]->value_ptr(), _phi[1]->value_ptr(), _div_test->parameters()[0], _div_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_div_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_div_test->value()[0] - expected_val[0]), 1e-10);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _div_test->parameters();
 
-        allowed_op_funcs::div(10, _phi[0]->test_value_ptr(), _phi[1]->test_value_ptr(), _div_test->parameters()[0], _div_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_div_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_div_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::div(_task_sizes_train[0], _phi[2]->value_ptr(&params[2]), _phi[1]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_div_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_div_test->value()[0] - expected_val[0]), 1e-5);
+
+        allowed_op_funcs::div(_task_sizes_test[0], _phi[2]->test_value_ptr(&params[2]), _phi[1]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_div_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_div_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|1|div: " << std::setprecision(13) << std::scientific <<_div_test->parameters()[0] << ',' << _div_test->parameters()[1];
-        EXPECT_STREQ(_div_test->unit().toString().c_str(), "m * s^-1");
+        postfix << "0|1|div|1|div: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(_div_test->unit().toString().c_str(), "m * s^-2");
         EXPECT_STREQ(_div_test->postfix_expr().c_str(), postfix.str().c_str());
+
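+        // Passing only 3 values to set_parameters (a depth-2 node stores 4) is expected to throw std::logic_error.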
+        EXPECT_THROW(_div_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
diff --git a/tests/googletest/feature_creation/parameterization/test_exp_node.cc b/tests/googletest/feature_creation/parameterization/test_exp_node.cc
index ad14078ba4fd7a6bfd7f88aa87daceee60841d4d..03133b14702be8cfa1a1b3c2d5ab3afb5435ce0c 100644
--- a/tests/googletest/feature_creation/parameterization/test_exp_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_exp_node.cc
@@ -12,11 +12,11 @@
 // See the License for the specific language governing permissions and
 // limitations under the License.
 #ifdef PARAMETERIZE
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/abs/absolute_value.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/exp/parameterized_exponential.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/exp/exponential.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/log/log.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/neg_exp/negative_exponential.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -29,10 +29,10 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
             _task_sizes_train = {900};
-            _task_sizes_test = {10};
+            _task_sizes_test = {100};
 
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 3, 2, true);
 
@@ -44,7 +44,7 @@ namespace
 
             std::default_random_engine generator;
             std::uniform_real_distribution<double> distribution_feats(-2.0, 2.0);
-            std::uniform_real_distribution<double> distribution_params(0.75, 1.25);
+            std::uniform_real_distribution<double> distribution_params(0.5, 1.20);
 
             for(int ii = 0; ii < _task_sizes_train[0]; ++ii)
             {
@@ -66,17 +66,18 @@ namespace
             _phi.push_back(std::make_shared<ExpNode>(_feat_3, 3, 1e-50, 1e50));
             _phi.push_back(std::make_shared<LogNode>(_feat_3, 4, 1e-50, 1e50));
             _phi.push_back(std::make_shared<NegExpNode>(_feat_3, 5, 1e-50, 1e50));
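+            // AbsNode supplies a rung-1 child that is a valid input for the parameterized exponential; the exp-type children (ExpNode/LogNode/NegExpNode) are rejected by the generator checks below.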
+            _phi.push_back(std::make_shared<AbsNode>(_feat_1, 6, 1e-50, 1e50));
 
             _a = std::log(distribution_params(generator));
             _alpha = distribution_params(generator);
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
-            allowed_op_funcs::exp(_task_sizes_train[0], _phi[0]->value_ptr(), _alpha, _a, _prop.data());
+            allowed_op_funcs::exp(_task_sizes_train[0], _phi[6]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression", _task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -121,9 +122,9 @@ namespace
         generateExpParamNode(_phi, _phi[5], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), phi_sz) << " (ExpParamNode created from NegExpNode)";
 
-        generateExpParamNode(_phi, _phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        generateExpParamNode(_phi, _phi[6], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), phi_sz + 1) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 900), 1e-4);
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-4);
     }
 
     TEST_F(ExpParamNodeTest, ConstructorTest)
@@ -171,8 +172,8 @@ namespace
 
         try
         {
-            _exp_test = std::make_shared<ExpParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _exp_test->value_ptr(), 900), 1e-4);
+            _exp_test = std::make_shared<ExpParamNode>(_phi[6], feat_ind, 1e-50, 1e50, _optimizer);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _exp_test->value_ptr(), _task_sizes_train[0]), 1e-4);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -183,59 +184,66 @@ namespace
     TEST_F(ExpParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _exp_test = std::make_shared<ExpParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        _exp_test = std::make_shared<ExpParamNode>(_phi[6], feat_ind, 1e-50, 1e50, _optimizer);
 
         node_ptr copy_test = _exp_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(900, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::exp(900, _phi[0]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::exp(_task_sizes_train[0], _phi[6]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::exp(10, _phi[0]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::exp(_task_sizes_test[0], _phi[6]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|exp: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
+        postfix << "0|abs|exp: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(copy_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
-        double v1 = copy_test->feat(0)->value_ptr()[0];
+        double v1 = copy_test->feat(0)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = std::exp(alpha * v1 + a);
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(ExpParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _exp_test = std::make_shared<ExpParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        _exp_test = std::make_shared<ExpParamNode>(_phi[6], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_exp_test->rung(), 1);
+        _exp_test->set_value();
+        _exp_test->set_test_value();
 
-        std::vector<double> expected_val(900, 0.0);
+        EXPECT_EQ(_exp_test->rung(), 2);
 
-        allowed_op_funcs::exp(900, _phi[0]->value_ptr(), _exp_test->parameters()[0], _exp_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_exp_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_exp_test->value()[0] - expected_val[0]), 1e-10);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _exp_test->parameters();
 
-        allowed_op_funcs::exp(10, _phi[0]->test_value_ptr(), _exp_test->parameters()[0], _exp_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_exp_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_exp_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::exp(_task_sizes_train[0], _phi[6]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_exp_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_exp_test->value()[0] - expected_val[0]), 1e-5);
+
+        allowed_op_funcs::exp(_task_sizes_test[0], _phi[6]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_exp_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_exp_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|exp: " << std::setprecision(13) << std::scientific <<_exp_test->parameters()[0] << ',' << _exp_test->parameters()[1];
+        postfix << "0|abs|exp: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(_exp_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(_exp_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_exp_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
diff --git a/tests/googletest/feature_creation/parameterization/test_inv_node.cc b/tests/googletest/feature_creation/parameterization/test_inv_node.cc
index d460d0e0eed18a4cbb1441b864b74bff0ffa23ed..5b1b3416161aa0dc198d9879bc05daf83144f9cd 100644
--- a/tests/googletest/feature_creation/parameterization/test_inv_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_inv_node.cc
@@ -13,7 +13,7 @@
 // limitations under the License.
 #ifdef PARAMETERIZE
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/inv/parameterized_inverse.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/six_pow/sixth_power.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -26,50 +26,46 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
-            _task_sizes_train = {90};
-            _task_sizes_test = {10};
+            _task_sizes_train = {900};
+            _task_sizes_test = {100};
 
-            node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 2, 2, true);
+            node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 1, 2, true);
 
 
             std::vector<double> value_1(_task_sizes_train[0], 0.0);
-            std::vector<double> value_2(_task_sizes_train[0], 0.0);
 
             std::vector<double> test_value_1(_task_sizes_test[0], 0.0);
-            std::vector<double> test_value_2(_task_sizes_test[0], 0.0);
 
             std::default_random_engine generator;
             std::uniform_real_distribution<double> distribution_feats(-50.0, 50.0);
-            std::uniform_real_distribution<double> distribution_params(1e-10, 2.50);
+            std::uniform_real_distribution<double> distribution_params(0.001, 10.0);
 
             for(int ii = 0; ii < _task_sizes_train[0]; ++ii)
             {
                 value_1[ii] = distribution_feats(generator);
-                value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
             }
 
             for(int ii = 0; ii < _task_sizes_test[0]; ++ii)
             {
                 test_value_1[ii] = distribution_feats(generator);
-                test_value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
             }
 
-            _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
-            _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("s"));
+            _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit(""));
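+            // A rung-1 SixPowNode child makes the parameterized inverse a rung-2 feature, exercising MAX_PARAM_DEPTH = 2.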
+            node_ptr feat_2 = std::make_shared<SixPowNode>(_feat_1, 1);
 
-            _phi = {_feat_1, _feat_2};
+            _phi = {_feat_1, feat_2};
             _a = distribution_params(generator);
-            _alpha = distribution_params(generator);
+            _alpha = 1.0;
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
             allowed_op_funcs::inv(_task_sizes_train[0], _phi[1]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression", _task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -78,7 +74,6 @@ namespace
         }
 
         node_ptr _feat_1;
-        node_ptr _feat_2;
         node_ptr _inv_test;
 
         std::vector<node_ptr> _phi;
@@ -105,7 +100,7 @@ namespace
 
         generateInvParamNode(_phi, _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), 3) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 90), 1e-10);
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-5);
     }
 
     TEST_F(InvParamNodeTest, ConstructorTest)
@@ -131,7 +126,7 @@ namespace
         try
         {
             _inv_test = std::make_shared<InvParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _inv_test->value_ptr(), 90), 1e-10);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _inv_test->value_ptr(), _task_sizes_train[0]), 1e-5);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -146,32 +141,33 @@ namespace
 
         node_ptr copy_test = _inv_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(90, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::inv(90, _phi[1]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::inv(_task_sizes_train[0], _phi[1]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::inv(10, _phi[1]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::inv(_task_sizes_test[0], _phi[1]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "1|inv: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
-        EXPECT_STREQ(copy_test->unit().toString().c_str(), "s^-1");
+        postfix << "0|sp|inv: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(copy_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
-        double v1 = copy_test->feat(0)->value_ptr()[0];
+        double v1 = copy_test->feat(0)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = -1.0 / std::pow(alpha * v1 + a, 2.0);
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(InvParamNodeTest, AttributesTest)
@@ -179,22 +175,28 @@ namespace
         unsigned long int feat_ind = _phi.size();
         _inv_test = std::make_shared<InvParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_inv_test->rung(), 1);
+        _inv_test->set_value();
+        _inv_test->set_test_value();
 
-        std::vector<double> expected_val(90, 0.0);
+        EXPECT_EQ(_inv_test->rung(), 2);
 
-        allowed_op_funcs::inv(90, _phi[1]->value_ptr(), _inv_test->parameters()[0], _inv_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_inv_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_inv_test->value()[0] - expected_val[0]), 1e-10);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _inv_test->parameters();
 
-        allowed_op_funcs::inv(10, _phi[1]->test_value_ptr(), _inv_test->parameters()[0], _inv_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_inv_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_inv_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::inv(_task_sizes_train[0], _phi[1]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_inv_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_inv_test->value()[0] - expected_val[0]), 1e-5);
+
+        allowed_op_funcs::inv(_task_sizes_test[0], _phi[1]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_inv_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_inv_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "1|inv: " << std::setprecision(13) << std::scientific <<_inv_test->parameters()[0] << ',' << _inv_test->parameters()[1];
-        EXPECT_STREQ(_inv_test->unit().toString().c_str(), "s^-1");
+        postfix << "0|sp|inv: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(_inv_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(_inv_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_inv_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
diff --git a/tests/googletest/feature_creation/parameterization/test_log_node.cc b/tests/googletest/feature_creation/parameterization/test_log_node.cc
index d46051ad326cfdbf99d5208e35c74db0e50585f9..dd52aafac3379aade4364e5e0c7c40ec229188db 100644
--- a/tests/googletest/feature_creation/parameterization/test_log_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_log_node.cc
@@ -12,11 +12,11 @@
 // See the License for the specific language governing permissions and
 // limitations under the License.
 #ifdef PARAMETERIZE
-#include <feature_creation/node/operator_nodes/allowed_operator_nodes/log/parameterized_log.hpp>
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/abs/absolute_value.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/exp/exponential.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/log/log.hpp>
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/log/parameterized_log.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/neg_exp/negative_exponential.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -30,7 +30,7 @@ namespace
         void SetUp() override
         {
             _task_sizes_train = {900};
-            _task_sizes_test = {10};
+            _task_sizes_test = {100};
 
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 3, 2, true);
 
@@ -43,21 +43,21 @@ namespace
             std::vector<double> test_value_3(_task_sizes_test[0], 0.0);
 
             std::default_random_engine generator;
-            std::uniform_real_distribution<double> distribution_feats(-10.0, 10.0);
+            std::uniform_real_distribution<double> distribution_feats(-100.0, 100.0);
             std::uniform_real_distribution<double> distribution_params(0.1, 1.50);
 
             for(int ii = 0; ii < _task_sizes_train[0]; ++ii)
             {
                 value_1[ii] = distribution_feats(generator);
                 value_2[ii] = distribution_feats(generator);
-                value_3[ii] = std::exp(distribution_feats(generator));
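+                // abs(...) + 1e-10 keeps value_3 strictly positive for the LogNode, presumably to avoid the extreme dynamic range of exp over (-100, 100).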
+                value_3[ii] = std::abs(distribution_feats(generator)) + 1e-10;
             }
 
             for(int ii = 0; ii < _task_sizes_test[0]; ++ii)
             {
                 test_value_1[ii] = distribution_feats(generator);
                 test_value_2[ii] = distribution_feats(generator);
-                test_value_3[ii] = std::exp(distribution_feats(generator));
+                test_value_3[ii] = std::abs(distribution_feats(generator)) + 1e-10;
             }
 
             _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
@@ -68,17 +68,18 @@ namespace
             _phi.push_back(std::make_shared<ExpNode>(_feat_2, 3, 1e-50, 1e50));
             _phi.push_back(std::make_shared<LogNode>(_feat_3, 4, 1e-50, 1e50));
             _phi.push_back(std::make_shared<NegExpNode>(_feat_2, 5, 1e-50, 1e50));
+            _phi.push_back(std::make_shared<AbsNode>(_feat_1, 6, 1e-50, 1e50));
 
             _a = distribution_params(generator);
             _alpha = distribution_params(generator);
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
-            allowed_op_funcs::log(_task_sizes_train[0], _phi[2]->value_ptr(), _alpha, _a, _prop.data());
+            allowed_op_funcs::log(_task_sizes_train[0], _phi[6]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression", _task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -123,9 +124,9 @@ namespace
         generateLogParamNode(_phi, _phi[5], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), phi_sz) << " (LogParamNode created from NegExpNode)";
 
-        generateLogParamNode(_phi, _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+        generateLogParamNode(_phi, _phi[6], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), phi_sz + 1) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 900), 1e-4);
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-4);
     }
 
     TEST_F(LogParamNodeTest, ConstructorTest)
@@ -174,8 +175,8 @@ namespace
 
         try
         {
-            _log_test = std::make_shared<LogParamNode>(_phi[2], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _log_test->value_ptr(), 900), 1e-4);
+            _log_test = std::make_shared<LogParamNode>(_phi[6], feat_ind, 1e-50, 1e50, _optimizer);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _log_test->value_ptr(), _task_sizes_train[0]), 1e-4);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -186,59 +187,66 @@ namespace
     TEST_F(LogParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _log_test = std::make_shared<LogParamNode>(_phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+        _log_test = std::make_shared<LogParamNode>(_phi[6], feat_ind, 1e-50, 1e50, _optimizer);
 
         node_ptr copy_test = _log_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(900, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::log(900, _phi[2]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::log(_task_sizes_train[0], _phi[6]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::log(10, _phi[2]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::log(_task_sizes_test[0], _phi[6]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "2|log: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
+        postfix << "0|abs|log: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(copy_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
-        double v1 = copy_test->feat(0)->value_ptr()[0];
+        double v1 = copy_test->feat(0)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = 1.0 / (alpha * v1 + a);
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(LogParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _log_test = std::make_shared<LogParamNode>(_phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+        _log_test = std::make_shared<LogParamNode>(_phi[6], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_log_test->rung(), 1);
+        _log_test->set_value();
+        _log_test->set_test_value();
 
-        std::vector<double> expected_val(900, 0.0);
+        EXPECT_EQ(_log_test->rung(), 2);
 
-        allowed_op_funcs::log(900, _phi[2]->value_ptr(), _log_test->parameters()[0], _log_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_log_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_log_test->value()[0] - expected_val[0]), 1e-10);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _log_test->parameters();
 
-        allowed_op_funcs::log(10, _phi[2]->test_value_ptr(), _log_test->parameters()[0], _log_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_log_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_log_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::log(_task_sizes_train[0], _phi[6]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_log_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_log_test->value()[0] - expected_val[0]), 1e-5);
+
+        allowed_op_funcs::log(_task_sizes_test[0], _phi[6]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_log_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_log_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "2|log: " << std::setprecision(13) << std::scientific <<_log_test->parameters()[0] << ',' << _log_test->parameters()[1];
+        postfix << "0|abs|log: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(_log_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(_log_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_log_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
diff --git a/tests/googletest/feature_creation/parameterization/test_mult_node.cc b/tests/googletest/feature_creation/parameterization/test_mult_node.cc
index 31642f71fadc803c48601756880064d20dd241d3..85d5ad7db48c5bbbe23b83254407c0551edb0222 100644
--- a/tests/googletest/feature_creation/parameterization/test_mult_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_mult_node.cc
@@ -13,7 +13,6 @@
 // limitations under the License.
 #ifdef PARAMETERIZE
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/mult/parameterized_multiply.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -26,10 +25,10 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
             _task_sizes_train = {900};
-            _task_sizes_test = {10};
+            _task_sizes_test = {100};
 
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 2, 2, true);
 
@@ -57,19 +56,20 @@ namespace
 
             _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
             _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("s"));
+            node_ptr feat_3 = std::make_shared<MultNode>(_feat_1, _feat_2, 2);
 
-            _phi = {_feat_1, _feat_2};
+            _phi = {_feat_1, _feat_2, feat_3};
 
             _a = distribution_params(generator);
             _alpha = distribution_params(generator);
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
-            allowed_op_funcs::mult(_task_sizes_train[0], _phi[0]->value_ptr(), _phi[1]->value_ptr(), _alpha, _a, _prop.data());
+            allowed_op_funcs::mult(_task_sizes_train[0], _phi[1]->value_ptr(), _phi[2]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression", _task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -98,14 +98,14 @@ namespace
         unsigned long int feat_ind = _phi.size();
 
         generateMultParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e-50, 1e-40, _optimizer);
-        EXPECT_EQ(_phi.size(), 2) << " (MultParamNode created with an absolute value above the upper bound)";
+        EXPECT_EQ(_phi.size(), 3) << " (MultParamNode created with an absolute value above the upper bound)";
 
         generateMultParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e49, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 2) << " (MultParamNode created with an absolute value below the lower bound)";
+        EXPECT_EQ(_phi.size(), 3) << " (MultParamNode created with an absolute value below the lower bound)";
 
-        generateMultParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 3) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 900), 1e-4);
+        generateMultParamNode(_phi, _phi[1], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+        EXPECT_EQ(_phi.size(), 4) << " (Failure to create a valid feature)";
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-4);
     }
 
     TEST_F(MultParamNodeTest, ConstructorTest)
@@ -130,8 +130,8 @@ namespace
 
         try
         {
-            _mult_test = std::make_shared<MultParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _mult_test->value_ptr(), 900), 1e-4);
+            _mult_test = std::make_shared<MultParamNode>(_phi[1], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _mult_test->value_ptr(), _task_sizes_train[0]), 1e-4);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -142,60 +142,67 @@ namespace
     TEST_F(MultParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _mult_test = std::make_shared<MultParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _mult_test = std::make_shared<MultParamNode>(_phi[1], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
 
         node_ptr copy_test = _mult_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(900, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::mult(900, _phi[0]->value_ptr(), _phi[1]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::mult(_task_sizes_train[0], _phi[1]->value_ptr(), _phi[2]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::mult(10, _phi[0]->test_value_ptr(), _phi[1]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::mult(_task_sizes_test[0], _phi[1]->test_value_ptr(), _phi[2]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|1|mult: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
-        EXPECT_STREQ(copy_test->unit().toString().c_str(), "m * s");
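+        // The postfix string reads as child-first RPN: "1|0|1|mult|mult" encodes B * (A * B), with all four fitted parameters after the colon.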
+        postfix << "1|0|1|mult|mult: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(copy_test->unit().toString().c_str(), "m * s^2");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
         double v1 = copy_test->feat(0)->value_ptr()[0];
-        double v2 = copy_test->feat(1)->value_ptr()[0];
+        double v2 = copy_test->feat(1)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = v1;
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v2), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v2), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(MultParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _mult_test = std::make_shared<MultParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _mult_test = std::make_shared<MultParamNode>(_phi[1], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_mult_test->rung(), 1);
+        _mult_test->set_value();
+        _mult_test->set_test_value();
 
-        std::vector<double> expected_val(900, 0.0);
+        EXPECT_EQ(_mult_test->rung(), 2);
 
-        allowed_op_funcs::mult(900, _phi[0]->value_ptr(), _phi[1]->value_ptr(), _mult_test->parameters()[0], _mult_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_mult_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_mult_test->value()[0] - expected_val[0]), 1e-10);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _mult_test->parameters();
 
-        allowed_op_funcs::mult(10, _phi[0]->test_value_ptr(), _phi[1]->test_value_ptr(), _mult_test->parameters()[0], _mult_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_mult_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_mult_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::mult(_task_sizes_train[0], _phi[1]->value_ptr(), _phi[2]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_mult_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_mult_test->value()[0] - expected_val[0]), 1e-5);
+
+        allowed_op_funcs::mult(_task_sizes_test[0], _phi[1]->test_value_ptr(), _phi[2]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_mult_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_mult_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|1|mult: " << std::setprecision(13) << std::scientific <<_mult_test->parameters()[0] << ',' << _mult_test->parameters()[1];
-        EXPECT_STREQ(_mult_test->unit().toString().c_str(), "m * s");
+        postfix << "1|0|1|mult|mult: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(_mult_test->unit().toString().c_str(), "m * s^2");
         EXPECT_STREQ(_mult_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_mult_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
diff --git a/tests/googletest/feature_creation/parameterization/test_neg_exp_node.cc b/tests/googletest/feature_creation/parameterization/test_neg_exp_node.cc
index bd994ab690e89f8506529a6e60ec511b58102d1b..c3007bc7c4126cbec195fdb87ff8d883216c4f8b 100644
--- a/tests/googletest/feature_creation/parameterization/test_neg_exp_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_neg_exp_node.cc
@@ -12,11 +12,11 @@
 // See the License for the specific language governing permissions and
 // limitations under the License.
 #ifdef PARAMETERIZE
-#include <feature_creation/node/operator_nodes/allowed_operator_nodes/neg_exp/parameterized_negative_exponential.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/exp/exponential.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/log/log.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/neg_exp/negative_exponential.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/neg_exp/parameterized_negative_exponential.hpp>
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/sqrt/square_root.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -29,10 +29,10 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
             _task_sizes_train = {900};
-            _task_sizes_test = {10};
+            _task_sizes_test = {100};
 
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 3, 2, true);
 
@@ -43,19 +43,19 @@ namespace
             std::vector<double> test_value_2(_task_sizes_test[0], 0.0);
 
             std::default_random_engine generator;
-            std::uniform_real_distribution<double> distribution_feats(-2.0, 2.0);
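+            // Feature draws are kept strictly positive so the SqrtNode input below is well-defined.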
+            std::uniform_real_distribution<double> distribution_feats(1e-10, 2.0);
             std::uniform_real_distribution<double> distribution_params(0.75, 1.25);
 
             for(int ii = 0; ii < _task_sizes_train[0]; ++ii)
             {
                 value_1[ii] = distribution_feats(generator);
-                value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
+                value_2[ii] = distribution_feats(generator);
             }
 
             for(int ii = 0; ii < _task_sizes_test[0]; ++ii)
             {
                 test_value_1[ii] = distribution_feats(generator);
-                test_value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
+                test_value_2[ii] = distribution_feats(generator);
             }
 
             _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
@@ -66,17 +66,18 @@ namespace
             _phi.push_back(std::make_shared<ExpNode>(_feat_3, 3, 1e-50, 1e50));
             _phi.push_back(std::make_shared<LogNode>(_feat_3, 4, 1e-50, 1e50));
             _phi.push_back(std::make_shared<NegExpNode>(_feat_3, 5, 1e-50, 1e50));
+            _phi.push_back(std::make_shared<SqrtNode>(_feat_1, 6, 1e-50, 1e50));
 
             _a = distribution_params(generator);
             _alpha = distribution_params(generator);
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
-            allowed_op_funcs::neg_exp(_task_sizes_train[0], _phi[0]->value_ptr(), _alpha, _a, _prop.data());
+            allowed_op_funcs::neg_exp(_task_sizes_train[0], _phi[6]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression", _task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -120,10 +121,10 @@ namespace
         generateNegExpParamNode(_phi, _phi[5], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), phi_sz) << " (NegExpParamNode created from NegExpNode)";
 
-        generateNegExpParamNode(_phi, _phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        generateNegExpParamNode(_phi, _phi[6], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), phi_sz + 1) << " (Failure to create a valid feature)";
 
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 900), 1e-5);
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-4);
     }
 
     TEST_F(NegExpParamNodeTest, ConstructorTest)
@@ -172,8 +173,8 @@ namespace
 
         try
         {
-            _neg_exp_test = std::make_shared<NegExpParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _neg_exp_test->value_ptr(), 900), 1e-5);
+            _neg_exp_test = std::make_shared<NegExpParamNode>(_phi[6], feat_ind, 1e-50, 1e50, _optimizer);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _neg_exp_test->value_ptr(), _task_sizes_train[0]), 1e-4);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -184,59 +185,66 @@ namespace
     TEST_F(NegExpParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _neg_exp_test = std::make_shared<NegExpParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        _neg_exp_test = std::make_shared<NegExpParamNode>(_phi[6], feat_ind, 1e-50, 1e50, _optimizer);
 
         node_ptr copy_test = _neg_exp_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(900, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::neg_exp(900, _phi[0]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
+        allowed_op_funcs::neg_exp(_task_sizes_train[0], _phi[6]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
         EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
         EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::neg_exp(10, _phi[0]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
+        allowed_op_funcs::neg_exp(_task_sizes_test[0], _phi[6]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
         EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
         EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|nexp: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
+        postfix << "0|sqrt|nexp: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(copy_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
-        double v1 = copy_test->feat(0)->value_ptr()[0];
+        double v1 = copy_test->feat(0)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = -1.0 * std::exp(-1.0 * alpha * v1 + a);
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(NegExpParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _neg_exp_test = std::make_shared<NegExpParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        _neg_exp_test = std::make_shared<NegExpParamNode>(_phi[6], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_neg_exp_test->rung(), 1);
+        _neg_exp_test->set_value();
+        _neg_exp_test->set_test_value();
 
-        std::vector<double> expected_val(900, 0.0);
+        EXPECT_EQ(_neg_exp_test->rung(), 2);
 
-        allowed_op_funcs::neg_exp(900, _phi[0]->value_ptr(), _neg_exp_test->parameters()[0], _neg_exp_test->parameters()[1], expected_val.data());
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _neg_exp_test->parameters();
+
+        allowed_op_funcs::neg_exp(_task_sizes_train[0], _phi[6]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
         EXPECT_LT(std::abs(_neg_exp_test->value_ptr()[0] - expected_val[0]), 1e-5);
         EXPECT_LT(std::abs(_neg_exp_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::neg_exp(10, _phi[0]->test_value_ptr(), _neg_exp_test->parameters()[0], _neg_exp_test->parameters()[1], expected_val.data());
+        allowed_op_funcs::neg_exp(_task_sizes_test[0], _phi[6]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
         EXPECT_LT(std::abs(_neg_exp_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
         EXPECT_LT(std::abs(_neg_exp_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|nexp: " << std::setprecision(13) << std::scientific <<_neg_exp_test->parameters()[0] << ',' << _neg_exp_test->parameters()[1];
+        postfix << "0|sqrt|nexp: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(_neg_exp_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(_neg_exp_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_neg_exp_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
diff --git a/tests/googletest/feature_creation/parameterization/test_sin_node.cc b/tests/googletest/feature_creation/parameterization/test_sin_node.cc
index b14261d3d50df2a7bc08dfb8ca69e9ca1e9d19aa..05a866496d77f2290f21e80cda3268cae768fa03 100644
--- a/tests/googletest/feature_creation/parameterization/test_sin_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_sin_node.cc
@@ -14,8 +14,7 @@
 #ifdef PARAMETERIZE
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/sin/parameterized_sin.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/cos/cos.hpp>
-#include <feature_creation/node/operator_nodes/allowed_operator_nodes/sin/sin.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/sq/square.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -28,10 +27,10 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
             _task_sizes_train = {900};
-            _task_sizes_test = {10};
+            _task_sizes_test = {100};
 
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 3, 2, true);
 
@@ -61,19 +60,20 @@ namespace
             _feat_3 = std::make_shared<FeatureNode>(2, "B", value_2, test_value_2, Unit(""));
 
             _phi = {_feat_1, _feat_2, _feat_3};
-            _phi.push_back(std::make_shared<CosNode>(_feat_3, 2, 1e-50, 1e50));
-            _phi.push_back(std::make_shared<SinNode>(_feat_3, 2, 1e-50, 1e50));
+            _phi.push_back(std::make_shared<CosNode>(_feat_3, 3, 1e-50, 1e50));
+            _phi.push_back(std::make_shared<SinNode>(_feat_3, 4, 1e-50, 1e50));
+            _phi.push_back(std::make_shared<SqNode>(_feat_1, 5, 1e-50, 1e50));
 
             _a = 0.143;
             _alpha = 1.05;
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
-            allowed_op_funcs::sin(_task_sizes_train[0], _phi[0]->value_ptr(), _alpha, _a, _prop.data());
+            allowed_op_funcs::sin(_task_sizes_train[0], _phi[5]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -115,7 +115,7 @@ namespace
         generateSinParamNode(_phi, _phi[4], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), phi_sz) << " (SinParamNode created from SinNode)";
 
-        generateSinParamNode(_phi, _phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        generateSinParamNode(_phi, _phi[5], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), phi_sz + 1) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 90), 1e-5);
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-5);
     }
@@ -158,7 +158,7 @@ namespace
 
         try
         {
-            _sin_test = std::make_shared<SinParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+            _sin_test = std::make_shared<SinParamNode>(_phi[5], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _sin_test->value_ptr(), 90), 1e-5);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _sin_test->value_ptr(), _task_sizes_train[0]), 1e-5);
         }
         catch(const InvalidFeatureException& e)
@@ -170,59 +170,66 @@ namespace
     TEST_F(SinParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _sin_test = std::make_shared<SinParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        _sin_test = std::make_shared<SinParamNode>(_phi[5], feat_ind, 1e-50, 1e50, _optimizer);
 
         node_ptr copy_test = _sin_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(900, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::sin(900, _phi[0]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sin(_task_sizes_train[0], _phi[5]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::sin(10, _phi[0]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sin(_task_sizes_test[0], _phi[5]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|sin: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
+        postfix << "0|sq|sin: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(copy_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
-        double v1 = copy_test->feat(0)->value_ptr()[0];
+        double v1 = copy_test->feat(0)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = std::cos(alpha * v1 + a);
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(SinParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _sin_test = std::make_shared<SinParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        _sin_test = std::make_shared<SinParamNode>(_phi[5], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_sin_test->rung(), 1);
+        _sin_test->set_value();
+        _sin_test->set_test_value();
 
-        std::vector<double> expected_val(900, 0.0);
+        EXPECT_EQ(_sin_test->rung(), 2);
 
-        allowed_op_funcs::sin(900, _phi[0]->value_ptr(), _sin_test->parameters()[0], _sin_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_sin_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_sin_test->value()[0] - expected_val[0]), 1e-10);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _sin_test->parameters();
 
-        allowed_op_funcs::sin(10, _phi[0]->test_value_ptr(), _sin_test->parameters()[0], _sin_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_sin_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_sin_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sin(_task_sizes_train[0], _phi[5]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_sin_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_sin_test->value()[0] - expected_val[0]), 1e-5);
+
+        allowed_op_funcs::sin(_task_sizes_test[0], _phi[5]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_sin_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_sin_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|sin: " << std::setprecision(13) << std::scientific <<_sin_test->parameters()[0] << ',' << _sin_test->parameters()[1];
+        postfix << "0|sq|sin: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(_sin_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(_sin_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_sin_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
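
The gradient expectations in HardCopyTest rest on the chain rule for f(x) = sin(alpha * x + a): df/dalpha = cos(alpha * x + a) * x and df/da = cos(alpha * x + a). A self-contained check of that identity by central differences, using the fixture's _alpha = 1.05 and _a = 0.143 (the standalone program below does not touch the library):

    #include <cassert>
    #include <cmath>

    int main()
    {
        const double x = 0.37, alpha = 1.05, a = 0.143, h = 1e-6;
        const double df_dp = std::cos(alpha * x + a); // shared factor in both derivatives

        // central differences in alpha and in a
        const double fd_alpha = (std::sin((alpha + h) * x + a) - std::sin((alpha - h) * x + a)) / (2.0 * h);
        const double fd_a = (std::sin(alpha * x + a + h) - std::sin(alpha * x + a - h)) / (2.0 * h);

        assert(std::abs(fd_alpha - df_dp * x) < 1e-8); // matches _gradient[0] check
        assert(std::abs(fd_a - df_dp) < 1e-8);         // matches _gradient[n] check
        return 0;
    }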
diff --git a/tests/googletest/feature_creation/parameterization/test_six_pow_node.cc b/tests/googletest/feature_creation/parameterization/test_six_pow_node.cc
index ba5766b6e105ee4507b3bee67880bb22d4c4d36b..c60315c6c1d3862308d34b74cbc6f075a7ffc276 100644
--- a/tests/googletest/feature_creation/parameterization/test_six_pow_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_six_pow_node.cc
@@ -13,7 +13,6 @@
 // limitations under the License.
 #ifdef PARAMETERIZE
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/six_pow/parameterized_sixth_power.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -26,49 +25,44 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
             _task_sizes_train = {900};
-            _task_sizes_test = {10};
+            _task_sizes_test = {100};
 
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 2, 2, true);
 
             std::vector<double> value_1(_task_sizes_train[0], 0.0);
-            std::vector<double> value_2(_task_sizes_train[0], 0.0);
-
             std::vector<double> test_value_1(_task_sizes_test[0], 0.0);
-            std::vector<double> test_value_2(_task_sizes_test[0], 0.0);
 
             std::default_random_engine generator;
-            std::uniform_real_distribution<double> distribution_feats(-50.00, 50.00);
-            std::uniform_real_distribution<double> distribution_params(1e-10, 2.00);
+            std::uniform_real_distribution<double> distribution_feats(-2.00, 2.00);
+            std::uniform_real_distribution<double> distribution_params(1e-5, 2.00);
 
             for(int ii = 0; ii < _task_sizes_train[0]; ++ii)
             {
                 value_1[ii] = distribution_feats(generator);
-                value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
             }
 
             for(int ii = 0; ii < _task_sizes_test[0]; ++ii)
             {
                 test_value_1[ii] = distribution_feats(generator);
-                test_value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
             }
 
             _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
-            _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("s"));
+            node_ptr feat_2 = std::make_shared<SixPowNode>(_feat_1, 1);
 
-            _phi = {_feat_1, _feat_2};
+            _phi = {_feat_1, feat_2};
             _a = distribution_params(generator);
-            _alpha = distribution_params(generator);
+            _alpha = 1.0;
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
-            allowed_op_funcs::sixth_pow(_task_sizes_train[0], _phi[0]->value_ptr(), _alpha, _a, _prop.data());
+            allowed_op_funcs::sixth_pow(_task_sizes_train[0], _phi[1]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -77,7 +71,6 @@ namespace
         }
 
         node_ptr _feat_1;
-        node_ptr _feat_2;
         node_ptr _six_pow_test;
 
         std::vector<node_ptr> _phi;
@@ -102,9 +95,9 @@ namespace
         generateSixPowParamNode(_phi, _phi[0], feat_ind, 1e49, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), 2) << " (SixPowParamNode created with an absolute value below the lower bound)";
 
-        generateSixPowParamNode(_phi, _phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        generateSixPowParamNode(_phi, _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), 3) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 900), 1e-4);
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-4);
     }
 
     TEST_F(SixPowParamNodeTest, ConstructorTest)
@@ -129,8 +122,8 @@ namespace
 
         try
         {
-            _six_pow_test = std::make_shared<SixPowParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _six_pow_test->value_ptr(), 900), 1e-4);
+            _six_pow_test = std::make_shared<SixPowParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _six_pow_test->value_ptr(), _task_sizes_train[0]), 1e-4);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -141,59 +134,66 @@ namespace
     TEST_F(SixPowParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _six_pow_test = std::make_shared<SixPowParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        _six_pow_test = std::make_shared<SixPowParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
 
         node_ptr copy_test = _six_pow_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(900, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::sixth_pow(900, _phi[0]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sixth_pow(_task_sizes_train[0], _phi[1]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-4);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-4);
 
-        allowed_op_funcs::sixth_pow(10, _phi[0]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sixth_pow(_task_sizes_test[0], _phi[1]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-4);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-4);
 
         std::stringstream postfix;
-        postfix << "0|sp: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
-        EXPECT_STREQ(copy_test->unit().toString().c_str(), "m^6");
+        postfix << "0|sp|sp: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(copy_test->unit().toString().c_str(), "m^36");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
-        double v1 = copy_test->feat(0)->value_ptr()[0];
+        double v1 = copy_test->feat(0)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = 6.0 * std::pow(alpha * v1 + a, 5.0);
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-4);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-4);
     }
 
     TEST_F(SixPowParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _six_pow_test = std::make_shared<SixPowParamNode>(_phi[0], feat_ind, 1e-50, 1e50, _optimizer);
+        _six_pow_test = std::make_shared<SixPowParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+
+        _six_pow_test->set_value();
+        _six_pow_test->set_test_value();
 
-        EXPECT_EQ(_six_pow_test->rung(), 1);
+        EXPECT_EQ(_six_pow_test->rung(), 2);
 
-        std::vector<double> expected_val(900, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _six_pow_test->parameters();
 
-        allowed_op_funcs::sixth_pow(900, _phi[0]->value_ptr(), _six_pow_test->parameters()[0], _six_pow_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_six_pow_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_six_pow_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sixth_pow(_task_sizes_train[0], _phi[1]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_six_pow_test->value_ptr()[0] - expected_val[0]), 1e-4);
+        EXPECT_LT(std::abs(_six_pow_test->value()[0] - expected_val[0]), 1e-4);
 
-        allowed_op_funcs::sixth_pow(10, _phi[0]->test_value_ptr(), _six_pow_test->parameters()[0], _six_pow_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_six_pow_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_six_pow_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sixth_pow(_task_sizes_test[0], _phi[1]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_six_pow_test->test_value_ptr()[0] - expected_val[0]), 1e-4);
+        EXPECT_LT(std::abs(_six_pow_test->test_value()[0] - expected_val[0]), 1e-4);
 
         std::stringstream postfix;
-        postfix << "0|sp: " << std::setprecision(13) << std::scientific <<_six_pow_test->parameters()[0] << ',' << _six_pow_test->parameters()[1];
-        EXPECT_STREQ(_six_pow_test->unit().toString().c_str(), "m^6");
+        postfix << "0|sp|sp: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(_six_pow_test->unit().toString().c_str(), "m^36");
         EXPECT_STREQ(_six_pow_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_six_pow_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
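
The postfix expectations throughout these hunks follow one pattern: the operator chain in postfix order, a colon, then every optimized parameter printed with setprecision(13) in scientific notation, comma-separated. A sketch of that formatting (the helper name is made up; the tests build the string inline exactly as the hunks show):

    #include <cstddef>
    #include <iomanip>
    #include <sstream>
    #include <string>
    #include <vector>

    std::string param_postfix(const std::string& ops, const std::vector<double>& params)
    {
        std::stringstream ss;
        ss << ops << ": " << std::setprecision(13) << std::scientific;
        for(std::size_t pp = 0; pp < params.size(); ++pp)
        {
            ss << params[pp] << (pp + 1 < params.size() ? "," : "");
        }
        return ss.str();
    }

    // param_postfix("0|sp|sp", {p0, p1, p2, p3}) matches the string compared by
    // EXPECT_STREQ in the sixth-power tests; a depth-2 node prints four values.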
diff --git a/tests/googletest/feature_creation/parameterization/test_sq_node.cc b/tests/googletest/feature_creation/parameterization/test_sq_node.cc
index a0d0df7b9de24bb78f7721e11589bb7da6523c9b..d786879d6d99ea31ed3cb401fce82eb204cc5e39 100644
--- a/tests/googletest/feature_creation/parameterization/test_sq_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_sq_node.cc
@@ -12,8 +12,8 @@
 // See the License for the specific language governing permissions and
 // limitations under the License.
 #ifdef PARAMETERIZE
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/sin/sin.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/sq/parameterized_square.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -26,49 +26,44 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
-            _task_sizes_train = {90};
-            _task_sizes_test = {10};
+            _task_sizes_train = {900};
+            _task_sizes_test = {100};
 
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 2, 2, true);
 
             std::vector<double> value_1(_task_sizes_train[0], 0.0);
-            std::vector<double> value_2(_task_sizes_train[0], 0.0);
-
             std::vector<double> test_value_1(_task_sizes_test[0], 0.0);
-            std::vector<double> test_value_2(_task_sizes_test[0], 0.0);
 
             std::default_random_engine generator;
-            std::uniform_real_distribution<double> distribution_feats(-50.0, 50.0);
-            std::uniform_real_distribution<double> distribution_params(1e-10, 2.50);
+            std::uniform_real_distribution<double> distribution_feats(-6.23, 6.23);
+            std::uniform_real_distribution<double> distribution_params(-2.50, 2.50);
 
             for(int ii = 0; ii < _task_sizes_train[0]; ++ii)
             {
                 value_1[ii] = distribution_feats(generator);
-                value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
             }
 
             for(int ii = 0; ii < _task_sizes_test[0]; ++ii)
             {
                 test_value_1[ii] = distribution_feats(generator);
-                test_value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
             }
 
-            _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
-            _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("s"));
+            _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit(""));
+            node_ptr feat_2 = std::make_shared<SinNode>(_feat_1, 1);
 
-            _phi = {_feat_1, _feat_2};
+            _phi = {_feat_1, feat_2};
             _a = distribution_params(generator);
             _alpha = distribution_params(generator);
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
             allowed_op_funcs::sq(_task_sizes_train[0], _phi[1]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -104,7 +99,7 @@ namespace
 
         generateSqParamNode(_phi, _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
         EXPECT_EQ(_phi.size(), 3) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 90), 1e-4);
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-4);
     }
 
     TEST_F(SqParamNodeTest, ConstructorTest)
@@ -130,7 +125,7 @@ namespace
         try
         {
             _sq_test = std::make_shared<SqParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _sq_test->value_ptr(), 90), 1e-4);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _sq_test->value_ptr(), _task_sizes_train[0]), 1e-4);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -145,32 +140,33 @@ namespace
 
         node_ptr copy_test = _sq_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(90, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::sq(90, _phi[1]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sq(_task_sizes_train[0], _phi[1]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::sq(10, _phi[1]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sq(_task_sizes_test[0], _phi[1]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "1|sq: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
-        EXPECT_STREQ(copy_test->unit().toString().c_str(), "s^2");
+        postfix << "0|sin|sq: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(copy_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
-        double v1 = copy_test->feat(0)->value_ptr()[0];
+        double v1 = copy_test->feat(0)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = 2.0 * (alpha * v1 + a);
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(SqParamNodeTest, AttributesTest)
@@ -178,22 +174,28 @@ namespace
         unsigned long int feat_ind = _phi.size();
         _sq_test = std::make_shared<SqParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_sq_test->rung(), 1);
+        _sq_test->set_value();
+        _sq_test->set_test_value();
+
+        EXPECT_EQ(_sq_test->rung(), 2);
 
-        std::vector<double> expected_val(90, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _sq_test->parameters();
 
-        allowed_op_funcs::sq(90, _phi[1]->value_ptr(), _sq_test->parameters()[0], _sq_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_sq_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_sq_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sq(_task_sizes_train[0], _phi[1]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_sq_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_sq_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::sq(10, _phi[1]->test_value_ptr(), _sq_test->parameters()[0], _sq_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_sq_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_sq_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sq(_task_sizes_test[0], _phi[1]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_sq_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_sq_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "1|sq: " << std::setprecision(13) << std::scientific <<_sq_test->parameters()[0] << ',' << _sq_test->parameters()[1];
-        EXPECT_STREQ(_sq_test->unit().toString().c_str(), "s^2");
+        postfix << "0|sin|sq: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(_sq_test->unit().toString().c_str(), "Unitless");
         EXPECT_STREQ(_sq_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_sq_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
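
Each AttributesTest now ends with EXPECT_THROW(node->set_parameters({1.0, 0.0, 1.0}), std::logic_error): a depth-2 node stores 2 * MAX_PARAM_DEPTH = 4 parameters, so a three-element vector must be rejected. A sketch of the kind of guard being exercised (hypothetical class, not the SISSO++ implementation):

    #include <stdexcept>
    #include <vector>

    struct ParamNodeSketch
    {
        std::vector<double> _params;

        explicit ParamNodeSketch(int max_param_depth) : _params(2 * max_param_depth, 0.0) {}

        void set_parameters(const std::vector<double>& p)
        {
            if(p.size() != _params.size())
            {
                throw std::logic_error("Wrong number of parameters passed to set_parameters.");
            }
            _params = p;
        }
    };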
diff --git a/tests/googletest/feature_creation/parameterization/test_sqrt_node.cc b/tests/googletest/feature_creation/parameterization/test_sqrt_node.cc
index 1cba40dd10539c340664884269b97929811de7a1..076dd39d61c38bda0483383184b3d95aca04532e 100644
--- a/tests/googletest/feature_creation/parameterization/test_sqrt_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_sqrt_node.cc
@@ -13,7 +13,7 @@
 // limitations under the License.
 #ifdef PARAMETERIZE
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/sqrt/parameterized_square_root.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/sub/subtract.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -26,10 +26,10 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
             _task_sizes_train = {900};
-            _task_sizes_test = {10};
+            _task_sizes_test = {100};
 
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 2, 2, true);
 
@@ -40,35 +40,36 @@ namespace
             std::vector<double> test_value_2(_task_sizes_test[0], 0.0);
 
             std::default_random_engine generator;
-            std::uniform_real_distribution<double> distribution_feats(1.0, 500.0);
-            std::uniform_real_distribution<double> distribution_params(0.5, 1.50);
+            std::uniform_real_distribution<double> distribution_feats(-20.0, 20.0);
+            std::uniform_real_distribution<double> distribution_params(0.05, 1.50);
 
             for(int ii = 0; ii < _task_sizes_train[0]; ++ii)
             {
-                value_1[ii] = distribution_feats(generator);
-                value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
+                value_1[ii] = std::abs(distribution_feats(generator)) + 20.0;
+                value_2[ii] = distribution_feats(generator);
             }
 
             for(int ii = 0; ii < _task_sizes_test[0]; ++ii)
             {
-                test_value_1[ii] = distribution_feats(generator);
-                test_value_2[ii] = std::abs(distribution_feats(generator)) + 1e-10;
+                test_value_1[ii] = std::abs(distribution_feats(generator)) + 20.0;
+                test_value_2[ii] = distribution_feats(generator);
             }
 
             _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
-            _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("s"));
+            node_ptr feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("m"));
+            node_ptr feat_3 = std::make_shared<SubNode>(_feat_1, feat_2, 2);
 
-            _phi = {_feat_1, _feat_2};
+            _phi = {_feat_1, feat_2, feat_3};
             _a = distribution_params(generator);
             _alpha = std::pow(distribution_params(generator), 2.0);
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
-            allowed_op_funcs::sqrt(_task_sizes_train[0], _phi[1]->value_ptr(), _alpha, _a, _prop.data());
+            allowed_op_funcs::sqrt(_task_sizes_train[0], _phi[2]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -77,7 +78,6 @@ namespace
         }
 
         node_ptr _feat_1;
-        node_ptr _feat_2;
         node_ptr _sqrt_test;
 
         std::vector<node_ptr> _phi;
@@ -96,15 +96,15 @@ namespace
     {
         unsigned long int feat_ind = _phi.size();
 
-        generateSqrtParamNode(_phi, _phi[1], feat_ind, 1e-50, 1e-40, _optimizer);
-        EXPECT_EQ(_phi.size(), 2) << " (SqrtParamNode created with an absolute value above the upper bound)";
+        generateSqrtParamNode(_phi, _phi[0], feat_ind, 1e-50, 1e-49, _optimizer);
+        EXPECT_EQ(_phi.size(), 3) << " (SqrtParamNode created with an absolute value above the upper bound)";
 
-        generateSqrtParamNode(_phi, _phi[1], feat_ind, 1e49, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 2) << " (SqrtParamNode created with an absolute value below the lower bound)";
+        generateSqrtParamNode(_phi, _phi[0], feat_ind, 1e49, 1e50, _optimizer);
+        EXPECT_EQ(_phi.size(), 3) << " (SqrtParamNode created with an absolute value below the lower bound)";
 
-        generateSqrtParamNode(_phi, _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 3) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 900), 1e-4);
+        generateSqrtParamNode(_phi, _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+        EXPECT_EQ(_phi.size(), 4) << " (Failure to create a valid feature)";
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-4);
     }
 
     TEST_F(SqrtParamNodeTest, ConstructorTest)
@@ -113,7 +113,7 @@ namespace
 
         try
         {
-            _sqrt_test = std::make_shared<SqrtParamNode>(_phi[1], feat_ind, 1e-50, 1e-40, _optimizer);
+            _sqrt_test = std::make_shared<SqrtParamNode>(_phi[0], feat_ind, 1e-50, 1e-49, _optimizer);
             EXPECT_TRUE(false) << " (SqrtParamNode created with an absolute value above the upper bound)";
         }
         catch(const InvalidFeatureException& e)
@@ -121,7 +121,7 @@ namespace
 
         try
         {
-            _sqrt_test = std::make_shared<SqrtParamNode>(_phi[1], feat_ind, 1e49, 1e50, _optimizer);
+            _sqrt_test = std::make_shared<SqrtParamNode>(_phi[0], feat_ind, 1e49, 1e50, _optimizer);
             EXPECT_TRUE(false) << " (SqrtParamNode created with an absolute value below the lower bound)";
         }
         catch(const InvalidFeatureException& e)
@@ -129,8 +129,8 @@ namespace
 
         try
         {
-            _sqrt_test = std::make_shared<SqrtParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _sqrt_test->value_ptr(), 900), 1e-4);
+            _sqrt_test = std::make_shared<SqrtParamNode>(_phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _sqrt_test->value_ptr(), _task_sizes_train[0]), 1e-4);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -141,59 +141,66 @@ namespace
     TEST_F(SqrtParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _sqrt_test = std::make_shared<SqrtParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _sqrt_test = std::make_shared<SqrtParamNode>(_phi[2], feat_ind, 1e-50, 1e50, _optimizer);
 
         node_ptr copy_test = _sqrt_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(900, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::sqrt(900, _phi[1]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sqrt(_task_sizes_train[0], _phi[2]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::sqrt(10, _phi[1]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sqrt(_task_sizes_test[0], _phi[2]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "1|sqrt: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
-        EXPECT_STREQ(copy_test->unit().toString().c_str(), "s^0.5");
+        postfix << "0|1|sub|sqrt: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(copy_test->unit().toString().c_str(), "m^0.5");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
-        double v1 = copy_test->feat(0)->value_ptr()[0];
+        double v1 = copy_test->feat(0)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = 0.5 * std::pow(alpha * v1 + a, -0.5);
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v1), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(SqrtParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _sqrt_test = std::make_shared<SqrtParamNode>(_phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _sqrt_test = std::make_shared<SqrtParamNode>(_phi[2], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_sqrt_test->rung(), 1);
+        _sqrt_test->set_value();
+        _sqrt_test->set_test_value();
 
-        std::vector<double> expected_val(900, 0.0);
+        EXPECT_EQ(_sqrt_test->rung(), 2);
 
-        allowed_op_funcs::sqrt(900, _phi[1]->value_ptr(), _sqrt_test->parameters()[0], _sqrt_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_sqrt_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_sqrt_test->value()[0] - expected_val[0]), 1e-10);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _sqrt_test->parameters();
 
-        allowed_op_funcs::sqrt(10, _phi[1]->test_value_ptr(), _sqrt_test->parameters()[0], _sqrt_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_sqrt_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_sqrt_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sqrt(_task_sizes_train[0], _phi[2]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_sqrt_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_sqrt_test->value()[0] - expected_val[0]), 1e-5);
+
+        allowed_op_funcs::sqrt(_task_sizes_test[0], _phi[2]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_sqrt_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_sqrt_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "1|sqrt: " << std::setprecision(13) << std::scientific <<_sqrt_test->parameters()[0] << ',' << _sqrt_test->parameters()[1];
-        EXPECT_STREQ(_sqrt_test->unit().toString().c_str(), "s^0.5");
+        postfix << "0|1|sub|sqrt: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
+        EXPECT_STREQ(_sqrt_test->unit().toString().c_str(), "m^0.5");
         EXPECT_STREQ(_sqrt_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_sqrt_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
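
The reworked SetUp keeps the square-root argument safely inside its domain: A is drawn as |U(-20, 20)| + 20, so A lies in [20, 40], while B stays in (-20, 20), and the child feature A - B is therefore strictly positive before sqrt is applied. A standalone sketch of that sampling, with sizes and ranges copied from the hunk above:

    #include <cmath>
    #include <random>
    #include <vector>

    int main()
    {
        std::default_random_engine generator;
        std::uniform_real_distribution<double> distribution_feats(-20.0, 20.0);

        std::vector<double> value_1(900), value_2(900);
        for(int ii = 0; ii < 900; ++ii)
        {
            value_1[ii] = std::abs(distribution_feats(generator)) + 20.0; // in [20, 40]
            value_2[ii] = distribution_feats(generator);                  // in (-20, 20)
            // value_1[ii] - value_2[ii] > 0, so sqrt(alpha * (A - B) + a) stays
            // real for the positive alpha and a drawn from distribution_params
        }
        return 0;
    }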
diff --git a/tests/googletest/feature_creation/parameterization/test_sub_node.cc b/tests/googletest/feature_creation/parameterization/test_sub_node.cc
index 3fc6fd86cda141a72d0194700dd3067e193b5dd1..1345ebd38d414b20f3c0dfdf41727d9a9b22f321 100644
--- a/tests/googletest/feature_creation/parameterization/test_sub_node.cc
+++ b/tests/googletest/feature_creation/parameterization/test_sub_node.cc
@@ -12,8 +12,8 @@
 // See the License for the specific language governing permissions and
 // limitations under the License.
 #ifdef PARAMETERIZE
+#include <feature_creation/node/operator_nodes/allowed_operator_nodes/neg_exp/negative_exponential.hpp>
 #include <feature_creation/node/operator_nodes/allowed_operator_nodes/sub/parameterized_subtract.hpp>
-#include <feature_creation/node/value_storage/nodes_value_containers.hpp>
 #include <feature_creation/node/FeatureNode.hpp>
 #include "gtest/gtest.h"
 
@@ -26,10 +26,10 @@ namespace
     protected:
         void SetUp() override
         {
-            nlopt_wrapper::MAX_PARAM_DEPTH = 1;
+            nlopt_wrapper::MAX_PARAM_DEPTH = 2;
 
-            _task_sizes_train = {90};
-            _task_sizes_test = {10};
+            _task_sizes_train = {900};
+            _task_sizes_test = {100};
 
             node_value_arrs::initialize_values_arr(_task_sizes_train, _task_sizes_test, 2, 2, true);
 
@@ -40,7 +40,7 @@ namespace
             std::vector<double> test_value_2(_task_sizes_test[0], 0.0);
 
             std::default_random_engine generator;
-            std::uniform_real_distribution<double> distribution_feats(-50.0, 50.0);
+            std::uniform_real_distribution<double> distribution_feats(-15.0, 15.0);
             std::uniform_real_distribution<double> distribution_params(-2.50, 2.50);
 
             for(int ii = 0; ii < _task_sizes_train[0]; ++ii)
@@ -56,19 +56,20 @@ namespace
             }
 
             _feat_1 = std::make_shared<FeatureNode>(0, "A", value_1, test_value_1, Unit("m"));
-            _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit("s"));
+            _feat_2 = std::make_shared<FeatureNode>(1, "B", value_2, test_value_2, Unit());
+            node_ptr feat_3 = std::make_shared<NegExpNode>(_feat_2, 2);
 
-            _phi = {_feat_1, _feat_2};
+            _phi = {_feat_1, _feat_2, feat_3};
             _a = distribution_params(generator);
             _alpha = distribution_params(generator);
 
             _prop = std::vector<double>(_task_sizes_train[0], 0.0);
-            _gradient.resize(_task_sizes_train[0] * 2, 1.0);
+            _gradient.resize(_task_sizes_train[0] * 4, 1.0);
             _dfdp.resize(_task_sizes_train[0]);
 
-            allowed_op_funcs::sub(_task_sizes_train[0], _phi[0]->value_ptr(), _phi[1]->value_ptr(), _alpha, _a, _prop.data());
+            allowed_op_funcs::sub(_task_sizes_train[0], _phi[0]->value_ptr(), _phi[2]->value_ptr(), _alpha, _a, _prop.data());
 
-            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 1);
+            _optimizer = nlopt_wrapper::get_optimizer("regression",_task_sizes_train, _prop, 2);
         }
 
         void TearDown() override
@@ -98,14 +99,14 @@ namespace
         unsigned long int feat_ind = _phi.size();
 
         generateSubParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e-50, 1e-40, _optimizer);
-        EXPECT_EQ(_phi.size(), 2) << " (SubParamNode created with an absolute value above the upper bound)";
+        EXPECT_EQ(_phi.size(), 3) << " (SubParamNode created with an absolute value above the upper bound)";
 
         generateSubParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e49, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 2) << " (SubParamNode created with an absolute value below the lower bound)";
+        EXPECT_EQ(_phi.size(), 3) << " (SubParamNode created with an absolute value below the lower bound)";
 
-        generateSubParamNode(_phi, _phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-        EXPECT_EQ(_phi.size(), 3) << " (Failure to create a valid feature)";
-        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), 90), 1e-10);
+        generateSubParamNode(_phi, _phi[0], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+        EXPECT_EQ(_phi.size(), 4) << " (Failure to create a valid feature)";
+        EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _phi.back()->value_ptr(), _task_sizes_train[0]), 1e-5);
     }
 
     TEST_F(SubParamNodeTest, ConstructorTest)
@@ -130,8 +131,8 @@ namespace
 
         try
         {
-            _sub_test = std::make_shared<SubParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
-            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _sub_test->value_ptr(), 90), 1e-10);
+            _sub_test = std::make_shared<SubParamNode>(_phi[0], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
+            EXPECT_LT(1.0 - util_funcs::r2(_prop.data(), _sub_test->value_ptr(), _task_sizes_train[0]), 1e-5);
         }
         catch(const InvalidFeatureException& e)
         {
@@ -142,60 +143,67 @@ namespace
     TEST_F(SubParamNodeTest, HardCopyTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _sub_test = std::make_shared<SubParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _sub_test = std::make_shared<SubParamNode>(_phi[0], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
 
         node_ptr copy_test = _sub_test->hard_copy();
 
-        EXPECT_EQ(copy_test->rung(), 1);
+        EXPECT_EQ(copy_test->rung(), 2);
 
-        std::vector<double> expected_val(90, 0.0);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = copy_test->parameters();
 
-        allowed_op_funcs::sub(90, _phi[0]->value_ptr(), _phi[1]->value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sub(_task_sizes_train[0], _phi[0]->value_ptr(), _phi[2]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->value()[0] - expected_val[0]), 1e-5);
 
-        allowed_op_funcs::sub(10, _phi[0]->test_value_ptr(), _phi[1]->test_value_ptr(), copy_test->parameters()[0], copy_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sub(_task_sizes_test[0], _phi[0]->test_value_ptr(), _phi[2]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(copy_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(copy_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|1|sub: " << std::setprecision(13) << std::scientific <<copy_test->parameters()[0] << ',' << copy_test->parameters()[1];
+        postfix << "0|1|nexp|sub: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(copy_test->unit().toString().c_str(), "m");
         EXPECT_STREQ(copy_test->postfix_expr().c_str(), postfix.str().c_str());
 
         copy_test->gradient(_gradient.data(), _dfdp.data());
         double v1 = copy_test->feat(0)->value_ptr()[0];
-        double v2 = copy_test->feat(1)->value_ptr()[0];
+        double v2 = copy_test->feat(1)->value_ptr(&params[2])[0];
 
-        double alpha = copy_test->parameters()[0];
-        double a = copy_test->parameters()[1];
+        double alpha = params[0];
+        double a = params[1];
         double df_dp = -1.0;
 
-        EXPECT_LT(std::abs(_gradient[0] - df_dp * v2), 1e-10);
-        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-10);
+        EXPECT_LT(std::abs(_gradient[0] - df_dp * v2), 1e-5);
+        EXPECT_LT(std::abs(_gradient[_task_sizes_train[0]] - df_dp), 1e-5);
     }
 
     TEST_F(SubParamNodeTest, AttributesTest)
     {
         unsigned long int feat_ind = _phi.size();
-        _sub_test = std::make_shared<SubParamNode>(_phi[0], _phi[1], feat_ind, 1e-50, 1e50, _optimizer);
+        _sub_test = std::make_shared<SubParamNode>(_phi[0], _phi[2], feat_ind, 1e-50, 1e50, _optimizer);
 
-        EXPECT_EQ(_sub_test->rung(), 1);
+        _sub_test->set_value();
+        _sub_test->set_test_value();
 
-        std::vector<double> expected_val(90, 0.0);
+        EXPECT_EQ(_sub_test->rung(), 2);
 
-        allowed_op_funcs::sub(90, _phi[0]->value_ptr(), _phi[1]->value_ptr(), _sub_test->parameters()[0], _sub_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_sub_test->value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_sub_test->value()[0] - expected_val[0]), 1e-10);
+        std::vector<double> expected_val(_task_sizes_train[0], 0.0);
+        std::vector<double> params = _sub_test->parameters();
 
-        allowed_op_funcs::sub(10, _phi[0]->test_value_ptr(), _phi[1]->test_value_ptr(), _sub_test->parameters()[0], _sub_test->parameters()[1], expected_val.data());
-        EXPECT_LT(std::abs(_sub_test->test_value_ptr()[0] - expected_val[0]), 1e-10);
-        EXPECT_LT(std::abs(_sub_test->test_value()[0] - expected_val[0]), 1e-10);
+        allowed_op_funcs::sub(_task_sizes_train[0], _phi[0]->value_ptr(), _phi[2]->value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_sub_test->value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_sub_test->value()[0] - expected_val[0]), 1e-5);
+
+        allowed_op_funcs::sub(_task_sizes_test[0], _phi[0]->test_value_ptr(), _phi[2]->test_value_ptr(&params[2]), params[0], params[1], expected_val.data());
+        EXPECT_LT(std::abs(_sub_test->test_value_ptr()[0] - expected_val[0]), 1e-5);
+        EXPECT_LT(std::abs(_sub_test->test_value()[0] - expected_val[0]), 1e-5);
 
         std::stringstream postfix;
-        postfix << "0|1|sub: " << std::setprecision(13) << std::scientific <<_sub_test->parameters()[0] << ',' << _sub_test->parameters()[1];
+        postfix << "0|1|nexp|sub: " << std::setprecision(13) << std::scientific << params[0] << ',' << params[1] << ',' << params[2] << ',' << params[3];
         EXPECT_STREQ(_sub_test->unit().toString().c_str(), "m");
         EXPECT_STREQ(_sub_test->postfix_expr().c_str(), postfix.str().c_str());
+
+        EXPECT_THROW(_sub_test->set_parameters({1.0, 0.0, 1.0}), std::logic_error);
     }
 }
 #endif
diff --git a/tests/googletest/feature_creation/units/test_untis.cc b/tests/googletest/feature_creation/units/test_untis.cc
index e367712a312204a4591a4d2055e6c4005a59bc74..f50049dbd0de1ac2f6b42d46a0966375a9ff6e29 100644
--- a/tests/googletest/feature_creation/units/test_untis.cc
+++ b/tests/googletest/feature_creation/units/test_untis.cc
@@ -23,17 +23,22 @@ namespace {
         Unit u_2("s");
 
         EXPECT_NE(u_1, u_2);
+        EXPECT_NE(u_1, Unit());
         EXPECT_EQ(u_1, Unit(u_1));
         EXPECT_EQ(u_2, Unit(u_2));
 
         EXPECT_STREQ(u_1.toString().c_str(), "m");
+        EXPECT_STREQ(u_1.toLatexString().c_str(), "m");
+
         EXPECT_STREQ(u_2.toString().c_str(), "s");
+        EXPECT_STREQ(u_2.toLatexString().c_str(), "s");
 
         EXPECT_EQ(u_1 / u_2, Unit("m/s"));
         EXPECT_EQ(u_1 / u_1, Unit());
         EXPECT_EQ(u_1 * u_2, Unit("m*s"));
         EXPECT_EQ(u_1 * u_1, Unit("m^2.0"));
         EXPECT_EQ(u_1 ^ 2.0, Unit("m^2.0"));
+        EXPECT_EQ(u_1 ^ 0.0, Unit(""));
         EXPECT_EQ(u_1.inverse(), Unit("1 / m"));
 
         u_2 /= u_1;
@@ -64,17 +69,22 @@ namespace {
         Unit u_2(dct_2);
 
         EXPECT_NE(u_1, u_2);
+        EXPECT_NE(u_1, Unit());
         EXPECT_EQ(u_1, Unit(u_1));
         EXPECT_EQ(u_2, Unit(u_2));
 
         EXPECT_STREQ(u_1.toString().c_str(), "m");
+        EXPECT_STREQ(u_1.toLatexString().c_str(), "m");
+
         EXPECT_STREQ(u_2.toString().c_str(), "s");
+        EXPECT_STREQ(u_2.toLatexString().c_str(), "s");
 
         EXPECT_EQ(u_1 / u_2, Unit("m/s"));
         EXPECT_EQ(u_1 / u_1, Unit());
         EXPECT_EQ(u_1 * u_2, Unit("m*s"));
         EXPECT_EQ(u_1 * u_1, Unit("m^2.0"));
         EXPECT_EQ(u_1 ^ 2.0, Unit("m^2.0"));
+        EXPECT_EQ(u_1 ^ 0.0, Unit(""));
         EXPECT_EQ(u_1.inverse(), Unit("1 / m"));
 
         u_2 /= u_1;
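
The new unit assertions pin down two edge cases: a named unit never compares equal to the dimensionless Unit(), and raising any unit to the power 0.0 yields Unit(""). Both follow naturally if a unit is an exponent map in which zero exponents are dropped, as in this illustrative sketch (not the Unit class itself):

    #include <map>
    #include <string>

    using UnitMap = std::map<std::string, double>;

    // Raise a unit to a power by scaling every exponent; entries that reach
    // zero are dropped, so any unit to the power 0.0 becomes dimensionless.
    UnitMap unit_pow(const UnitMap& u, double power)
    {
        UnitMap out;
        for(const auto& kv : u)
        {
            if(kv.second * power != 0.0)
            {
                out[kv.first] = kv.second * power;
            }
        }
        return out;
    }

    // unit_pow({{"m", 1.0}}, 0.0) returns an empty map, matching
    // EXPECT_EQ(u_1 ^ 0.0, Unit("")).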
diff --git a/tests/googletest/feature_creation/utils/test_utils.cc b/tests/googletest/feature_creation/utils/test_utils.cc
index 58fa6e5fc5b1c96f29963363fe69f8836114c2c2..edea1ddec8068fd807dfaddc850d6406a4ddd400 100644
--- a/tests/googletest/feature_creation/utils/test_utils.cc
+++ b/tests/googletest/feature_creation/utils/test_utils.cc
@@ -52,11 +52,15 @@ namespace
 
     TEST_F(FeatCreationUtilsTest, TestPostfix2Node)
     {
+        EXPECT_THROW(str2node::postfix2node("0|asdf", _phi0, _feat_ind), std::logic_error);
+        EXPECT_THROW(str2node::postfix2node("1|0|sq", _phi0, _feat_ind), std::logic_error);
+
         node_ptr test = str2node::postfix2node("0|2|div|exp|1|add", _phi0, _feat_ind);
         EXPECT_EQ(test->type(), NODE_TYPE::ADD);
         EXPECT_EQ(test->rung(), 3);
         EXPECT_LT(abs(test->value()[1] - (std::exp(2.0) + 2.0)), 1e-10);
         EXPECT_STREQ(test->expr().c_str(), "(exp((A / C)) + B)");
+
     }
 
     TEST_F(FeatCreationUtilsTest, TestPhiSelFromFile)
@@ -118,6 +122,8 @@ namespace
             out_file_stream << "-";
         out_file_stream << std::endl;
         out_file_stream.close();
+
+        EXPECT_THROW(str2node::phi_from_file("not_phi_file.txt", _phi0), std::logic_error);
         std::vector<node_ptr> phi = str2node::phi_from_file("phi.txt", _phi0);
 
         ASSERT_EQ(phi.size(), 10);
@@ -132,4 +138,46 @@ namespace
         EXPECT_STREQ(phi[5]->expr().c_str(), "(exp((A / C)) - B)");
         boost::filesystem::remove("phi.txt");
     }
+
+    TEST_F(FeatCreationUtilsTest, TestType2Str)
+    {
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::FEAT).c_str(), "feature");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::MODEL_FEATURE).c_str(), "model");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::ADD).c_str(), "add");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::SUB).c_str(), "sub");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::ABS_DIFF).c_str(), "abs_diff");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::MULT).c_str(), "mult");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::DIV).c_str(), "div");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::EXP).c_str(), "exp");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::NEG_EXP).c_str(), "neg_exp");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::INV).c_str(), "inv");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::SQ).c_str(), "sq");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::CB).c_str(), "cb");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::SIX_POW).c_str(), "six_pow");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::SQRT).c_str(), "sqrt");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::CBRT).c_str(), "cbrt");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::LOG).c_str(), "log");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::ABS).c_str(), "abs");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::SIN).c_str(), "sin");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::COS).c_str(), "cos");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_ADD).c_str(), "p:add");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_SUB).c_str(), "p:sub");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_ABS_DIFF).c_str(), "p:abs_diff");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_MULT).c_str(), "p:mult");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_DIV).c_str(), "p:div");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_EXP).c_str(), "p:exp");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_NEG_EXP).c_str(), "p:neg_exp");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_INV).c_str(), "p:inv");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_SQ).c_str(), "p:sq");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_CB).c_str(), "p:cb");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_SIX_POW).c_str(), "p:six_pow");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_SQRT).c_str(), "p:sqrt");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_CBRT).c_str(), "p:cbrt");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_LOG).c_str(), "p:log");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_ABS).c_str(), "p:abs");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_SIN).c_str(), "p:sin");
+        EXPECT_STREQ(node_identifier::feature_type_to_string(NODE_TYPE::PARAM_COS).c_str(), "p:cos");
+
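+        // NODE_TYPE::MAX marks the end of the enum rather than a real operator,
+        // so converting it to a string should throw.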
+        EXPECT_THROW(node_identifier::feature_type_to_string(NODE_TYPE::MAX), std::logic_error);
+    }
 }
diff --git a/tests/googletest/feature_creation/value_storage/test_value_storage.cc b/tests/googletest/feature_creation/value_storage/test_value_storage.cc
index 3a6f4aa5a83ad20a19ee2f4051ab611619cfb998..4614922ef947f320642e8cfc0f69b6a75fa9aa7f 100644
--- a/tests/googletest/feature_creation/value_storage/test_value_storage.cc
+++ b/tests/googletest/feature_creation/value_storage/test_value_storage.cc
@@ -21,7 +21,13 @@ namespace {
     //test mean calculations
     TEST(ValueStorage, ValueStorageTest)
     {
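+        // The arguments are assumed to be: training task sizes, test task sizes,
+        // the number of primary features, the maximum rung, and a flag for storing
+        // test values; a negative maximum rung should be rejected.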
+        EXPECT_THROW(node_value_arrs::initialize_values_arr({5}, {2}, 1, -2, true), std::logic_error);
+
         node_value_arrs::initialize_values_arr({5}, {2}, 1, 2, true);
+
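+        // Task sizes must sum to the number of samples set above (5 train, 2 test),
+        // so {20} and {6} should both be rejected.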
+        EXPECT_THROW(node_value_arrs::set_task_sz_train({20}), std::logic_error);
+        EXPECT_THROW(node_value_arrs::set_task_sz_test({6}), std::logic_error);
+
         EXPECT_EQ(node_value_arrs::N_SAMPLES, 5);
         EXPECT_EQ(node_value_arrs::N_SAMPLES_TEST, 2);
         EXPECT_EQ(node_value_arrs::N_RUNGS_STORED, 0);
@@ -35,6 +41,7 @@ namespace {
         EXPECT_EQ(node_value_arrs::TEMP_STORAGE_TEST_ARR.size(), node_value_arrs::MAX_N_THREADS * (6 * 1 + 1) * 2);
         EXPECT_EQ(node_value_arrs::TEMP_STORAGE_TEST_REG.size(), node_value_arrs::MAX_N_THREADS * (6 * 1 + 1));
 
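+        // Resizing to a rung beyond the maximum rung fixed at initialization (2)
+        // should throw.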
+        EXPECT_THROW(node_value_arrs::resize_values_arr(10, 2), std::logic_error);
         node_value_arrs::resize_values_arr(1, 2);
         EXPECT_EQ(node_value_arrs::N_SAMPLES, 5);
         EXPECT_EQ(node_value_arrs::N_SAMPLES_TEST, 2);
@@ -58,6 +65,7 @@ namespace {
         node_value_arrs::get_value_ptr(1, 1, 0)[1] = 1.0;
         EXPECT_EQ(node_value_arrs::VALUES_ARR[6], 1.0);
 
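+        // An out-of-range feature index should be rejected for test values as well.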
+        EXPECT_THROW(node_value_arrs::get_test_value_ptr(1000, 100, 0, 0)[1], std::logic_error);
         node_value_arrs::get_test_value_ptr(0, 1, 0, 0)[1] = 1.0;
         EXPECT_EQ(node_value_arrs::TEST_VALUES_ARR[1], 1.0);
 
@@ -113,5 +121,12 @@ namespace {
         EXPECT_EQ(node_value_arrs::MAX_N_THREADS, omp_get_max_threads());
         EXPECT_EQ(node_value_arrs::N_OP_SLOTS, 0);
         EXPECT_EQ(node_value_arrs::N_PARAM_OP_SLOTS, 0);
+
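+        // With sizes that sum correctly, the per-task start offsets should be the
+        // cumulative sums of the task sizes.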
+        node_value_arrs::set_task_sz_train({3, 2});
+        EXPECT_EQ(node_value_arrs::TASK_SZ_TRAIN[0], 3);
+        EXPECT_EQ(node_value_arrs::TASK_START_TRAIN[0], 0);
+
+        node_value_arrs::set_task_sz_test({2, 0});
+        EXPECT_EQ(node_value_arrs::TASK_SZ_TEST[0], 2);
     }
 }
diff --git a/tests/googletest/inputs/data.csv b/tests/googletest/inputs/data.csv
index ee3d4cd2cc7a2d326f44369b2600601aa663824b..7203e72111fd3051ebef425b03b146b479c10e98 100644
--- a/tests/googletest/inputs/data.csv
+++ b/tests/googletest/inputs/data.csv
@@ -1,4 +1,4 @@
-Sample,task,property (m),A (m)
+# Sample,task,property (m),A (m)
 a,task_1,1.0,1.0
 b,task_1,4.0,2.0
 c,task_2,9.0,3.0
diff --git a/tests/googletest/inputs/input_parser.cc b/tests/googletest/inputs/input_parser.cc
index d3129f52caa723ee4812013e85f68fcd18a9ec88..7ec7a6d39a123d5343cf9bef6855c21338b69c79 100644
--- a/tests/googletest/inputs/input_parser.cc
+++ b/tests/googletest/inputs/input_parser.cc
@@ -138,30 +138,48 @@ namespace
     TEST_F(InputParserTests, DefaultConstructor)
     {
         InputParser inputs;
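+        // Every accessor should throw until its corresponding setter has been called.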
+        EXPECT_THROW(inputs.task_sizes_train(), std::logic_error);
+        EXPECT_THROW(inputs.task_sizes_train_copy(), std::logic_error);
+
+        EXPECT_THROW(inputs.task_sizes_test(), std::logic_error);
+        EXPECT_THROW(inputs.task_sizes_test_copy(), std::logic_error);
+
         inputs.set_task_sizes_train(_task_sizes_train);
         EXPECT_EQ(inputs.task_sizes_train()[0], _task_sizes_train[0]);
 
         inputs.set_task_sizes_test(_task_sizes_test);
         EXPECT_EQ(inputs.task_sizes_test()[0], _task_sizes_test[0]);
 
+        EXPECT_THROW(inputs.sample_ids_train(), std::logic_error);
+        EXPECT_THROW(inputs.sample_ids_train_copy(), std::logic_error);
         inputs.set_sample_ids_train(_sample_ids_train);
         EXPECT_EQ(inputs.sample_ids_train()[0], _sample_ids_train[0]);
 
+        EXPECT_THROW(inputs.sample_ids_test(), std::logic_error);
+        EXPECT_THROW(inputs.sample_ids_test_copy(), std::logic_error);
         inputs.set_sample_ids_test(_sample_ids_test);
         EXPECT_EQ(inputs.sample_ids_test()[0], _sample_ids_test[0]);
 
+        EXPECT_THROW(inputs.task_names(), std::logic_error);
+        EXPECT_THROW(inputs.task_names_copy(), std::logic_error);
         inputs.set_task_names(_task_names);
         EXPECT_EQ(inputs.task_names()[0], _task_names[0]);
 
         inputs.set_allowed_ops(_allowed_ops);
         EXPECT_EQ(inputs.allowed_ops()[0], _allowed_ops[0]);
 
+        EXPECT_THROW(inputs.prop_train(), std::logic_error);
+        EXPECT_THROW(inputs.prop_train_copy(), std::logic_error);
         inputs.set_prop_train(_prop_train);
         EXPECT_EQ(inputs.prop_train()[0], _prop_train[0]);
 
+        EXPECT_THROW(inputs.prop_test(), std::logic_error);
+        EXPECT_THROW(inputs.prop_test_copy(), std::logic_error);
         inputs.set_prop_test(_prop_test);
         EXPECT_EQ(inputs.prop_test()[0], _prop_test[0]);
 
+        EXPECT_THROW(inputs.leave_out_inds(), std::logic_error);
+        EXPECT_THROW(inputs.leave_out_inds_copy(), std::logic_error);
         inputs.set_leave_out_inds(_leave_out_inds);
         EXPECT_EQ(inputs.leave_out_inds()[0], _leave_out_inds[0]);
 
@@ -169,6 +187,9 @@ namespace
         EXPECT_EQ(inputs.n_samp_test(), 1);
         EXPECT_EQ(inputs.n_samp_train(), 3);
 
+        EXPECT_THROW(inputs.phi_0(), std::logic_error);
+        EXPECT_THROW(inputs.phi_0_copy(), std::logic_error);
+        EXPECT_THROW(inputs.phi_0_ptrs(), std::logic_error);
         inputs.set_phi_0(_phi_0);
         EXPECT_EQ(inputs.phi_0()[0].feat_ind(), _phi_0[0].feat_ind());
         EXPECT_EQ(inputs.phi_0_ptrs()[0]->feat_ind(), _phi_0[0].feat_ind());
@@ -199,9 +220,11 @@ namespace
 
         inputs.set_l_bound(_l_bound);
         EXPECT_EQ(inputs.l_bound(), _l_bound);
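+        // The bounds must stay ordered (l_bound < u_bound), so setting a bound that
+        // crosses the other one should throw.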
+        EXPECT_THROW(inputs.set_u_bound(_l_bound / 2), std::logic_error);
 
         inputs.set_u_bound(_u_bound);
         EXPECT_EQ(inputs.u_bound(), _u_bound);
+        EXPECT_THROW(inputs.set_l_bound(_u_bound * 2), std::logic_error);
 
         inputs.set_n_dim(_n_dim);
         EXPECT_EQ(inputs.n_dim(), _n_dim);
@@ -247,10 +270,7 @@ namespace
 
     TEST_F(InputParserTests, FileConstructor)
     {
-        boost::property_tree::ptree propTree;
-        boost::property_tree::json_parser::read_json(_filename, propTree);
-        propTree.put("data_file", _data_file);
-        InputParser inputs(propTree, _filename, mpi_setup::comm);
+        InputParser inputs(_filename);
         EXPECT_EQ(inputs.sample_ids_train()[0], _sample_ids_train[0]);
         EXPECT_EQ(inputs.sample_ids_test()[0], _sample_ids_test[0]);
         EXPECT_EQ(inputs.task_names()[0], _task_names[0]);
@@ -284,12 +304,86 @@ namespace
         EXPECT_EQ(inputs.n_models_store(), _n_models_store);
         EXPECT_EQ(inputs.fix_intercept(), _fix_intercept);
 
-#ifdef PARAMETERIZE
+        #ifdef PARAMETERIZE
         EXPECT_EQ(inputs.allowed_param_ops()[0], _allowed_param_ops[0]);
         EXPECT_EQ(inputs.max_param_depth(), _max_param_depth);
         EXPECT_EQ(inputs.nlopt_seed(), _nlopt_seed);
         EXPECT_EQ(inputs.global_param_opt(), _global_param_opt);
         EXPECT_EQ(inputs.reparam_residual(), _reparam_residual);
-#endif
+        #endif
+    }
+
+    TEST_F(InputParserTests, CheckSizes)
+    {
+        InputParser inputs;
+        inputs.clear_data();
+        node_value_arrs::finalize_values_arr();
+
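+        // Each block below fixes one field to a size that disagrees with the small
+        // fixtures (3 training / 1 test sample) and checks that the dependent
+        // setters all throw before clearing the data again.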
+        inputs.set_task_sizes_train({2, 2, 2});
+        EXPECT_THROW(inputs.set_task_sizes_test(_task_sizes_test), std::logic_error);
+        EXPECT_THROW(inputs.set_task_names({"A"}), std::logic_error);
+        EXPECT_THROW(inputs.set_prop_train(_prop_train), std::logic_error);
+        EXPECT_THROW(inputs.set_sample_ids_train(_sample_ids_train), std::logic_error);
+        EXPECT_THROW(inputs.set_phi_0(_phi_0), std::logic_error);
+        inputs.clear_data();
+
+        inputs.set_task_sizes_test({2, 2, 2});
+        EXPECT_THROW(inputs.set_task_sizes_train(_task_sizes_train), std::logic_error);
+        EXPECT_THROW(inputs.set_task_names({"A"}), std::logic_error);
+        EXPECT_THROW(inputs.set_prop_test(_prop_test), std::logic_error);
+        EXPECT_THROW(inputs.set_sample_ids_test(_sample_ids_test), std::logic_error);
+        EXPECT_THROW(inputs.set_leave_out_inds(_leave_out_inds), std::logic_error);
+        EXPECT_THROW(inputs.set_phi_0(_phi_0), std::logic_error);
+        inputs.clear_data();
+
+        inputs.set_task_names({"A", "B", "C", "D"});
+        EXPECT_THROW(inputs.set_task_sizes_train(_task_sizes_train), std::logic_error);
+        EXPECT_THROW(inputs.set_task_sizes_test(_task_sizes_test), std::logic_error);
+        inputs.clear_data();
+
+        inputs.set_prop_train(std::vector<double>(6, 0.0));
+        EXPECT_THROW(inputs.set_task_sizes_train(_task_sizes_train), std::logic_error);
+        EXPECT_THROW(inputs.set_sample_ids_train(_sample_ids_train), std::logic_error);
+        EXPECT_THROW(inputs.set_phi_0(_phi_0), std::logic_error);
+        inputs.clear_data();
+
+        inputs.set_sample_ids_train(std::vector<std::string>(6, "A"));
+        EXPECT_THROW(inputs.set_task_sizes_train(_task_sizes_train), std::logic_error);
+        EXPECT_THROW(inputs.set_prop_train(_prop_train), std::logic_error);
+        EXPECT_THROW(inputs.set_phi_0(_phi_0), std::logic_error);
+        inputs.clear_data();
+
+        inputs.set_prop_test(std::vector<double>(6, 0.0));
+        EXPECT_THROW(inputs.set_task_sizes_test(_task_sizes_test), std::logic_error);
+        EXPECT_THROW(inputs.set_sample_ids_test(_sample_ids_test), std::logic_error);
+        EXPECT_THROW(inputs.set_phi_0(_phi_0), std::logic_error);
+        EXPECT_THROW(inputs.set_leave_out_inds(_leave_out_inds), std::logic_error);
+        inputs.clear_data();
+
+        inputs.set_leave_out_inds({1, 2, 3});
+        EXPECT_THROW(inputs.set_task_sizes_test(_task_sizes_test), std::logic_error);
+        EXPECT_THROW(inputs.set_sample_ids_test(_sample_ids_test), std::logic_error);
+        EXPECT_THROW(inputs.set_phi_0(_phi_0), std::logic_error);
+        EXPECT_THROW(inputs.set_prop_test(_prop_test), std::logic_error);
+        inputs.clear_data();
+
+        inputs.set_sample_ids_test(std::vector<std::string>(6, "A"));
+        EXPECT_THROW(inputs.set_task_sizes_test(_task_sizes_test), std::logic_error);
+        EXPECT_THROW(inputs.set_prop_test(_prop_test), std::logic_error);
+        EXPECT_THROW(inputs.set_phi_0(_phi_0), std::logic_error);
+        EXPECT_THROW(inputs.set_leave_out_inds(_leave_out_inds), std::logic_error);
+        inputs.clear_data();
+
+        _phi_0 = {FeatureNode(0, "feat_1", {1.0, 2.0, 3.0, 4.0}, {5.0, 6.0}, Unit("m"))};
+        inputs.set_phi_0(_phi_0);
+        EXPECT_THROW(inputs.set_prop_train(_prop_train), std::logic_error);
+        EXPECT_THROW(inputs.set_task_sizes_train(_task_sizes_train), std::logic_error);
+        EXPECT_THROW(inputs.set_sample_ids_train(_sample_ids_train), std::logic_error);
+
+        EXPECT_THROW(inputs.set_prop_test(_prop_test), std::logic_error);
+        EXPECT_THROW(inputs.set_task_sizes_test(_task_sizes_test), std::logic_error);
+        EXPECT_THROW(inputs.set_sample_ids_test(_sample_ids_test), std::logic_error);
+        EXPECT_THROW(inputs.set_leave_out_inds(_leave_out_inds), std::logic_error);
+        inputs.clear_data();
     }
 }
diff --git a/tests/googletest/inputs/sisso.json b/tests/googletest/inputs/sisso.json
index 2e655b860bda6796662d0cc9186eed6563eaf6e2..94b77049a836e16cb06811a6bd41a25f72002508 100644
--- a/tests/googletest/inputs/sisso.json
+++ b/tests/googletest/inputs/sisso.json
@@ -5,7 +5,8 @@
     "n_residual": 1,
     "n_models_store": 1,
     "n_rung_store": 1,
-    "data_file": "googletest/inputs/data.csv",
+    "data_file_relatice_to_json": true,
+    "data_file": "data.csv",
     "property_key": "property",
     "task_key": "task",
     "leave_out_inds": [3],
diff --git a/tests/googletest/inputs/sisso_param.json b/tests/googletest/inputs/sisso_param.json
index f41261dc992d0c1d090c254ae5530fee190504b2..68a27b344ae776ea50526f6de14a1d33de481626 100644
--- a/tests/googletest/inputs/sisso_param.json
+++ b/tests/googletest/inputs/sisso_param.json
@@ -5,7 +5,8 @@
     "n_residual": 1,
     "n_models_store": 1,
     "n_rung_store": 1,
-    "data_file": "googletest/inputs/data.csv",
+    "data_file_relatice_to_json": true,
+    "data_file": "data.csv",
     "property_key": "property",
     "task_key": "task",
     "leave_out_inds": [3],
diff --git a/tests/googletest/utils/test_str_utils.cc b/tests/googletest/utils/test_str_utils.cc
index f14dd04b8b3f12e418facc210480a2890a8ec90e..ba230e6dadbe5802bbc8ce903575e4366423857e 100644
--- a/tests/googletest/utils/test_str_utils.cc
+++ b/tests/googletest/utils/test_str_utils.cc
@@ -33,5 +33,7 @@ namespace {
         EXPECT_STREQ(str_split[1].c_str(), "B");
         EXPECT_STREQ(str_split[2].c_str(), "C");
         EXPECT_STREQ(str_split[3].c_str(), "D :     5");
+
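+        // latexify is assumed to fold everything after the first underscore into a
+        // single braced subscript: "A_sd_fg" -> "A_{sd, fg}".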
+        EXPECT_STREQ(str_utils::latexify("A_sd_fg").c_str(), "A_{sd, fg}");
     }
 }
diff --git a/tests/pytest/test_descriptor_identifier/matlab_functions/model_log_regressor.m b/tests/pytest/test_descriptor_identifier/matlab_functions/model_log_regressor.m
index 90ff4941e2b68e2032ddd73f4f7dbeaa117aba3b..28125195cbb3a3c5213b677e9de9ca7d17d5534b 100644
--- a/tests/pytest/test_descriptor_identifier/matlab_functions/model_log_regressor.m
+++ b/tests/pytest/test_descriptor_identifier/matlab_functions/model_log_regressor.m
@@ -1,5 +1,5 @@
 function P = model_log_regressor(X)
-% Returns the value of Prop = exp(c0) * ((B + A))^a0 * ((|D - B|))^a1
+% Returns the value of Prop = ((B + A))^a0 * ((|D - B|))^a1
 %
 % X = [
 %     B,
@@ -17,7 +17,7 @@ D = reshape(X(:, 3), 1, []);
 f0 = (B + A);
 f1 = abs(D - B);
 
-c0 = 2.1945699276e-13;
+c0 = 0.0;
 a0 = 1.2000000000e+00;
 a1 = -1.9500000000e+00;
 
diff --git a/tests/pytest/test_descriptor_identifier/model_files/test_classifier_fail_overlap.dat b/tests/pytest/test_descriptor_identifier/model_files/test_classifier_fail_overlap.dat
new file mode 100644
index 0000000000000000000000000000000000000000..cc68ee55cc7362d377a853c5520b0f9c6eb775c6
--- /dev/null
+++ b/tests/pytest/test_descriptor_identifier/model_files/test_classifier_fail_overlap.dat
@@ -0,0 +1,35 @@
+# [(feat_9 - feat_8), (feat_1 * feat_0)]
+# Property Label: $Class$; Unit of the Property: Unitless
+# # Samples in Convex Hull Overlap Region: 5;# Samples SVM Misclassified: 0
+# Decision Boundaries
+# Task    w0                      w1                      b
+# all_0,  1.326205649731981e+00, -1.744239999671528e+00,  9.075950727790907e-01, 
+# Feature Rung, Units, and Expressions
+# 0;  1; Unitless;                                         9|8|sub; (feat_9 - feat_8); $\left(feat_{9} - feat_{8}\right)$; (feat_9 - feat_8); feat_9,feat_8
+# 1;  1; Unitless;                                         1|0|mult; (feat_1 * feat_0); $\left(feat_{1} feat_{0}\right)$; (feat_1 .* feat_0); feat_1,feat_0
+# Number of Samples Per Task
+# Task,   n_mats_test             
+# all,    20                    
+# Test Indexes: [ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19 ]
+
+# Sample ID , Property Value        ,  Property Value (EST) ,  Feature 0 Value      ,  Feature 1 Value      
+0           ,  0.000000000000000e+00,  0.000000000000000e+00, -2.031012155743963e-01,  2.746937325370922e+00
+1           ,  0.000000000000000e+00,  0.000000000000000e+00,  3.178950007570245e-01,  3.060176523352919e+00
+2           ,  0.000000000000000e+00,  0.000000000000000e+00,  1.350899479575136e+00,  1.904914737747669e+00
+3           ,  0.000000000000000e+00,  0.000000000000000e+00,  3.112816979685040e-01,  2.597514970348419e+00
+4           ,  0.000000000000000e+00,  0.000000000000000e+00,  3.256274649800963e-01,  3.823832277859604e+00
+5           ,  0.000000000000000e+00,  0.000000000000000e+00,  7.291401227120657e-01,  2.789443909864211e+00
+6           ,  0.000000000000000e+00,  0.000000000000000e+00,  3.051059498409199e-01,  2.087853428517832e+00
+7           ,  0.000000000000000e+00,  0.000000000000000e+00,  5.345306546910435e-01,  2.507012794375703e+00
+8           ,  0.000000000000000e+00,  0.000000000000000e+00, -5.273941950401386e-01,  1.812203393718137e+00
+9           ,  0.000000000000000e+00,  0.000000000000000e+00, -1.780367164555883e-01,  3.143604947474592e+00
+10          ,  1.000000000000000e+00,  1.000000000000000e+00,  1.274772829175870e+00, -2.054835335229399e+00
+11          ,  1.000000000000000e+00,  1.000000000000000e+00, -1.099107097822589e+00, -2.210701955514223e+00
+12          ,  1.000000000000000e+00,  1.000000000000000e+00, -2.522737334308300e-02, -2.127724030671242e+00
+13          ,  1.000000000000000e+00,  1.000000000000000e+00, -8.048228984834345e-01, -3.158579181125339e+00
+14          ,  1.000000000000000e+00,  1.000000000000000e+00, -1.875592975314526e-01, -1.974183109498213e+00
+15          ,  1.000000000000000e+00,  1.000000000000000e+00,  6.149517560499549e-01, -1.721250821422664e+00
+16          ,  1.000000000000000e+00,  1.000000000000000e+00,  1.679421386195452e+00, -2.639919246265093e+00
+17          ,  1.000000000000000e+00,  1.000000000000000e+00, -3.729001722113563e-01, -2.014587145399039e+00
+18          ,  1.000000000000000e+00,  1.000000000000000e+00, -1.241140800579893e+00, -1.410625471724265e+00
+19          ,  1.000000000000000e+00,  1.000000000000000e+00,  3.358150821235752e-01, -1.769187643167631e+00
diff --git a/tests/pytest/test_descriptor_identifier/model_files/test_log_regressor.dat b/tests/pytest/test_descriptor_identifier/model_files/test_log_regressor.dat
index 0594278ce2e9ef216951f103b3b671d521787a7b..a6810bce51929ed79dbc868e9889fdbf0e96d577 100644
--- a/tests/pytest/test_descriptor_identifier/model_files/test_log_regressor.dat
+++ b/tests/pytest/test_descriptor_identifier/model_files/test_log_regressor.dat
@@ -1,9 +1,9 @@
-# exp(c0) * ((B + A))^a0 * ((|D - B|))^a1
+# ((B + A))^a0 * ((|D - B|))^a1
 # Property Label: $Prop$; Unit of the Property: Unitless
 # RMSE: 1.61410365875894e-15; Max AE: 3.10862446895044e-15
 # Coefficients
-# Task   a0                      a1                      c0
-# all ,  1.199999999999988e+00, -1.950000000000029e+00,  2.194569927587456e-13, 
+# Task   a0                      a1
+# all ,  1.199999999999988e+00, -1.950000000000029e+00,
 # Feature Rung, Units, and Expressions
 # 0;  1; Unitless;                                         1|0|add; (B + A); $\left(B + A\right)$; (B + A); B,A
 # 1;  1; Unitless;                                         3|1|abd; (|D - B|); $\left(\left|D - B\right|\right)$; abs(D - B); D,B
diff --git a/tests/pytest/test_descriptor_identifier/model_files/train_classifier_fail_overlap.dat b/tests/pytest/test_descriptor_identifier/model_files/train_classifier_fail_overlap.dat
new file mode 100644
index 0000000000000000000000000000000000000000..c98b8489ab242abbdc608dc7a7f39790276a78dc
--- /dev/null
+++ b/tests/pytest/test_descriptor_identifier/model_files/train_classifier_fail_overlap.dat
@@ -0,0 +1,94 @@
+# [(feat_9 - feat_8), (feat_1 * feat_0)]
+# Property Label: $Class$; Unit of the Property: Unitless
+# # Samples in Convex Hull Overlap Region: 5;# Samples SVM Misclassified: 0
+# Decision Boundaries
+# Task    w0                      w1                      b
+# all_0,  1.326205649731981e+00, -1.744239999671528e+00,  9.075950727790907e-01, 
+# Feature Rung, Units, and Expressions
+# 0;  1; Unitless;                                         9|8|sub; (feat_9 - feat_8); $\left(feat_{9} - feat_{8}\right)$; (feat_9 - feat_8); feat_9,feat_8
+# 1;  1; Unitless;                                         1|0|mult; (feat_1 * feat_0); $\left(feat_{1} feat_{0}\right)$; (feat_1 .* feat_0); feat_1,feat_0
+# Number of Samples Per Task
+# Task, n_mats_train            
+# all , 80                    
+
+# Sample ID , Property Value        ,  Property Value (EST) ,  Feature 0 Value      ,  Feature 1 Value      
+20          ,  0.000000000000000e+00,  0.000000000000000e+00, -1.438535620957356e+00, -1.000000000000000e-04
+21          ,  0.000000000000000e+00,  0.000000000000000e+00,  1.190224286778585e-01,  1.551266013053755e+00
+22          ,  0.000000000000000e+00,  0.000000000000000e+00, -1.425077659929501e-01,  2.130683687424777e+00
+23          ,  0.000000000000000e+00,  0.000000000000000e+00, -4.540168315339039e-01,  2.634756018203185e+00
+24          ,  0.000000000000000e+00,  0.000000000000000e+00,  6.192845577547714e-01,  1.928013807462464e+00
+25          ,  0.000000000000000e+00,  0.000000000000000e+00, -8.781439552075476e-01,  2.167912710491058e+00
+26          ,  0.000000000000000e+00,  0.000000000000000e+00, -3.535569323953591e-01,  1.528444956153448e+00
+27          ,  0.000000000000000e+00,  0.000000000000000e+00,  1.339601116269036e-01,  1.802459125102224e+00
+28          ,  0.000000000000000e+00,  0.000000000000000e+00, -1.136023513610327e+00,  3.215479730179889e+00
+29          ,  0.000000000000000e+00,  0.000000000000000e+00,  6.038078099519975e-01,  2.573190329369634e+00
+30          ,  0.000000000000000e+00,  0.000000000000000e+00, -1.215056999365121e-01,  1.595270557148176e+00
+31          ,  0.000000000000000e+00,  0.000000000000000e+00, -5.518151942675462e-01,  1.730168908796035e+00
+32          ,  0.000000000000000e+00,  0.000000000000000e+00,  1.650765229640842e+00,  3.203700967878904e+00
+33          ,  0.000000000000000e+00,  0.000000000000000e+00, -1.184077335611730e+00,  1.517408938916172e+00
+34          ,  0.000000000000000e+00,  0.000000000000000e+00, -2.221541956972297e-01,  1.751520526988180e+00
+35          ,  0.000000000000000e+00,  0.000000000000000e+00, -4.681000124924655e-01,  1.824502458636519e+00
+36          ,  0.000000000000000e+00,  0.000000000000000e+00, -1.050081904577687e-01,  1.724802253834064e+00
+37          ,  0.000000000000000e+00,  0.000000000000000e+00,  1.339981411155463e+00,  2.558207468331875e+00
+38          ,  0.000000000000000e+00,  0.000000000000000e+00, -1.684823671566578e-01,  1.506294601099636e+00
+39          ,  0.000000000000000e+00,  0.000000000000000e+00,  7.714455467324950e-01,  2.242769710603608e+00
+40          ,  0.000000000000000e+00,  0.000000000000000e+00, -1.622660863089168e+00, -1.000000000000000e-04
+41          ,  0.000000000000000e+00,  0.000000000000000e+00, -3.851141611133064e-01,  1.621724598445333e+00
+42          ,  0.000000000000000e+00,  0.000000000000000e+00, -7.796131832604434e-01,  3.412199890833602e+00
+43          ,  0.000000000000000e+00,  0.000000000000000e+00,  6.330392653717503e-01,  2.644525290379403e+00
+44          ,  0.000000000000000e+00,  0.000000000000000e+00,  4.472597809306964e-01,  1.639977210905994e+00
+45          ,  0.000000000000000e+00,  0.000000000000000e+00,  5.619997969970609e-01,  2.117832540122095e+00
+46          ,  0.000000000000000e+00,  0.000000000000000e+00,  2.693335908708820e-01,  3.719200588905764e+00
+47          ,  0.000000000000000e+00,  0.000000000000000e+00, -6.945169947212362e-01,  2.658310913357233e+00
+48          ,  0.000000000000000e+00,  0.000000000000000e+00, -2.608343436805389e-01,  2.389278127799646e+00
+49          ,  0.000000000000000e+00,  0.000000000000000e+00, -5.883177866461617e-01,  1.194385279781109e+00
+50          ,  0.000000000000000e+00,  0.000000000000000e+00,  3.016305034685407e-01,  2.163287243369974e+00
+51          ,  0.000000000000000e+00,  0.000000000000000e+00, -8.429615971293545e-01,  3.143453483796918e+00
+52          ,  0.000000000000000e+00,  0.000000000000000e+00, -2.305301655628482e-01,  2.373605928069240e+00
+53          ,  0.000000000000000e+00,  0.000000000000000e+00,  8.169785601229205e-01,  3.393041023148420e+00
+54          ,  0.000000000000000e+00,  0.000000000000000e+00,  5.880968210966282e-01,  1.540775049989281e+00
+55          ,  0.000000000000000e+00,  0.000000000000000e+00, -8.439557195782834e-01,  2.354515308140759e+00
+56          ,  0.000000000000000e+00,  0.000000000000000e+00,  1.145691901781707e-01,  3.057598248128036e+00
+57          ,  0.000000000000000e+00,  0.000000000000000e+00,  5.052378789612302e-01,  3.681321981867383e+00
+58          ,  0.000000000000000e+00,  0.000000000000000e+00,  6.830515610497974e-01,  2.677195784075541e+00
+59          ,  0.000000000000000e+00,  0.000000000000000e+00, -3.962323210078385e-01,  2.494759927195949e+00
+60          ,  1.000000000000000e+00,  1.000000000000000e+00,  6.978964070103477e-02,  1.000000000000000e-04
+61          ,  1.000000000000000e+00,  1.000000000000000e+00, -1.017083127934503e+00, -1.718221667867104e+00
+62          ,  1.000000000000000e+00,  1.000000000000000e+00, -5.038361140934988e-02, -3.023687952494995e+00
+63          ,  1.000000000000000e+00,  1.000000000000000e+00,  2.981066824725631e-02, -2.580950415579647e+00
+64          ,  1.000000000000000e+00,  1.000000000000000e+00,  1.173640969341423e+00, -2.015913518051938e+00
+65          ,  1.000000000000000e+00,  1.000000000000000e+00,  8.711405011252915e-02, -3.331488038371359e+00
+66          ,  1.000000000000000e+00,  1.000000000000000e+00,  1.309781594456224e+00, -2.340258337136148e+00
+67          ,  1.000000000000000e+00,  1.000000000000000e+00, -2.024028100438937e-01, -1.817820634181115e+00
+68          ,  1.000000000000000e+00,  1.000000000000000e+00, -2.684686877159819e-01, -1.754047733957138e+00
+69          ,  1.000000000000000e+00,  1.000000000000000e+00,  1.446320111150274e-01, -2.385204762866371e+00
+70          ,  1.000000000000000e+00,  1.000000000000000e+00, -2.832821671606189e-01, -2.001289065001360e+00
+71          ,  1.000000000000000e+00,  1.000000000000000e+00, -3.128846468236810e-01, -1.884355389893358e+00
+72          ,  1.000000000000000e+00,  1.000000000000000e+00,  1.383377419667691e-01, -2.044929395284636e+00
+73          ,  1.000000000000000e+00,  1.000000000000000e+00, -8.811671096262539e-01, -1.442201355797836e+00
+74          ,  1.000000000000000e+00,  1.000000000000000e+00,  6.544208451153577e-02, -1.908068625698732e+00
+75          ,  1.000000000000000e+00,  1.000000000000000e+00,  1.036366915038913e+00, -2.016924107725964e+00
+76          ,  1.000000000000000e+00,  1.000000000000000e+00,  1.334871147341559e-01, -1.634604715418913e+00
+77          ,  1.000000000000000e+00,  1.000000000000000e+00,  7.123254204690519e-01, -2.150275095672414e+00
+78          ,  1.000000000000000e+00,  1.000000000000000e+00,  1.759107096658776e+00, -2.342128876529649e+00
+79          ,  1.000000000000000e+00,  1.000000000000000e+00,  5.445726305421505e-02, -1.698710028312236e+00
+80          ,  1.000000000000000e+00,  1.000000000000000e+00,  7.553415620459540e-01,  1.000000000000000e-04
+81          ,  1.000000000000000e+00,  1.000000000000000e+00, -2.764313999225854e-02, -1.519240762581481e+00
+82          ,  1.000000000000000e+00,  1.000000000000000e+00, -4.406804475082324e-01, -2.024875026617072e+00
+83          ,  1.000000000000000e+00,  1.000000000000000e+00, -9.929257149617352e-01, -2.241942575124601e+00
+84          ,  1.000000000000000e+00,  1.000000000000000e+00, -1.466600027097579e+00, -2.984909012607663e+00
+85          ,  1.000000000000000e+00,  1.000000000000000e+00, -5.990304867158840e-01, -2.388164897459385e+00
+86          ,  1.000000000000000e+00,  1.000000000000000e+00,  3.040420794796370e-01, -1.894050465195215e+00
+87          ,  1.000000000000000e+00,  1.000000000000000e+00, -5.909515296974093e-01, -2.454144932345226e+00
+88          ,  1.000000000000000e+00,  1.000000000000000e+00, -1.091152792865723e+00, -2.563576277205860e+00
+89          ,  1.000000000000000e+00,  1.000000000000000e+00, -6.755252548115369e-01, -2.593071076035451e+00
+90          ,  1.000000000000000e+00,  1.000000000000000e+00,  6.506490705074306e-01, -2.742653045444400e+00
+91          ,  1.000000000000000e+00,  1.000000000000000e+00,  1.321034297602704e+00, -2.220389516459539e+00
+92          ,  1.000000000000000e+00,  1.000000000000000e+00,  3.854877052279315e-02, -2.765058645463596e+00
+93          ,  1.000000000000000e+00,  1.000000000000000e+00, -1.153450083656313e-01, -1.522894852256558e+00
+94          ,  1.000000000000000e+00,  1.000000000000000e+00, -1.185090801197946e-01, -2.756212326574877e+00
+95          ,  1.000000000000000e+00,  1.000000000000000e+00,  3.123253615639401e-01, -3.575465250587423e+00
+96          ,  1.000000000000000e+00,  1.000000000000000e+00,  2.245979218959215e-02, -2.016739417798566e+00
+97          ,  1.000000000000000e+00,  1.000000000000000e+00, -1.260091086602861e-01, -3.076103843283174e+00
+98          ,  1.000000000000000e+00,  1.000000000000000e+00, -3.656366231240911e-01, -3.116616975503573e+00
+99          ,  1.000000000000000e+00,  1.000000000000000e+00, -4.323166403743459e-01, -1.373441707188801e+00
diff --git a/tests/pytest/test_descriptor_identifier/model_files/train_log_regressor.dat b/tests/pytest/test_descriptor_identifier/model_files/train_log_regressor.dat
index f33da46e688e73a3c081fb85dd020bd315e3a8b7..7a569fde93567aef38d22bb0b899e2cd50e137df 100644
--- a/tests/pytest/test_descriptor_identifier/model_files/train_log_regressor.dat
+++ b/tests/pytest/test_descriptor_identifier/model_files/train_log_regressor.dat
@@ -1,9 +1,9 @@
-# exp(c0) * ((B + A))^a0 * ((|D - B|))^a1
+# ((B + A))^a0 * ((|D - B|))^a1
 # Property Label: $Prop$; Unit of the Property: Unitless
 # RMSE: 3.17364877036896e-10; Max AE: 3.06818037643097e-09
 # Coefficients
-# Task   a0                      a1                      c0
-# all ,  1.199999999999988e+00, -1.950000000000029e+00,  2.194569927587456e-13, 
+# Task   a0                      a1
+# all ,  1.199999999999988e+00, -1.950000000000029e+00,
 # Feature Rung, Units, and Expressions
 # 0;  1; Unitless;                                         1|0|add; (B + A); $\left(B + A\right)$; (B + A); B,A
 # 1;  1; Unitless;                                         3|1|abd; (|D - B|); $\left(\left|D - B\right|\right)$; abs(D - B); D,B
diff --git a/tests/pytest/test_descriptor_identifier/test_class_model_from_file.py b/tests/pytest/test_descriptor_identifier/test_class_model_from_file.py
index 1de2f3ec5a1f65297555f16be8e23e3bf128c68a..c3c398d64e6a533da1c38e6b0b17632d90c01c2f 100644
--- a/tests/pytest/test_descriptor_identifier/test_class_model_from_file.py
+++ b/tests/pytest/test_descriptor_identifier/test_class_model_from_file.py
@@ -25,19 +25,38 @@ parent = Path(__file__).parent
 
 
 def test_class_model_from_file():
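+    # Loading a model from a file whose samples overlap in the convex-hull region
+    # is expected to raise a RuntimeError from the C++ layer; the ValueError below
+    # only fires if no exception was raised.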
+    try:
+        model = load_model(
+            str(parent / "model_files/train_classifier_fail_overlap.dat"),
+            str(parent / "model_files/test_classifier.dat"),
+        )
+        raise ValueError("Model created that should fail")
+    except RuntimeError:
+        pass
+
+    try:
+        model = load_model(
+            str(parent / "model_files/train_classifier.dat"),
+            str(parent / "model_files/test_classifier_fail_overlap.dat"),
+        )
+        raise ValueError("Model created that should fail")
+    except RuntimeError:
+        pass
+
     model = load_model(
         str(parent / "model_files/train_classifier.dat"),
         str(parent / "model_files/test_classifier.dat"),
     )
 
-    mat_fxn_fn = "model_classifier.m"
+    mat_fxn_fn = "test_matlab_fxn/model_classifier"
     mat_fxn_fn_real = str(parent / "matlab_functions" / "model_classifier.m")
 
     model.write_matlab_fxn(mat_fxn_fn)
     actual_lines = open(mat_fxn_fn_real).readlines()
-    test_lines = open(mat_fxn_fn).readlines()
+    test_lines = open(mat_fxn_fn + ".m").readlines()
 
-    Path(mat_fxn_fn).unlink()
+    Path(mat_fxn_fn + ".m").unlink()
+    Path("test_matlab_fxn").rmdir()
     for tl, al in zip(test_lines, actual_lines):
         assert tl == al
 
diff --git a/tests/pytest/test_descriptor_identifier/test_class_model_train_from_file.py b/tests/pytest/test_descriptor_identifier/test_class_model_train_from_file.py
index 69789dae0d77152dff61581ff316fd9e3524d66c..575b841cb2320d95ecfd0a471f84836e08e5702e 100644
--- a/tests/pytest/test_descriptor_identifier/test_class_model_train_from_file.py
+++ b/tests/pytest/test_descriptor_identifier/test_class_model_train_from_file.py
@@ -25,6 +25,14 @@ parent = Path(__file__).parent
 
 
 def test_class_model_train_from_file():
+    try:
+        model = load_model(
+            str(parent / "model_files/train_classifier_fail_overlap.dat"),
+        )
+        raise ValueError("Model created that should fail")
+    except RuntimeError:
+        pass
+
     model = load_model(str(parent / "model_files/train_classifier.dat"))
 
     assert np.all(np.abs(model.fit - model.prop_train) < 1e-7)
diff --git a/tests/pytest/test_descriptor_identifier/test_classifier.py b/tests/pytest/test_descriptor_identifier/test_classifier.py
index 269603494f597107bdf589c0529d39729b288172..c784219eee34f863a6ba9bd309c2b895c1b32bf2 100644
--- a/tests/pytest/test_descriptor_identifier/test_classifier.py
+++ b/tests/pytest/test_descriptor_identifier/test_classifier.py
@@ -121,11 +121,17 @@ def test_sisso_classifier():
     shutil.rmtree("models/")
     shutil.rmtree("feature_space/")
 
-    assert sisso.models[0][0].n_convex_overlap_train == 4
-    assert sisso.models[1][0].n_convex_overlap_train == 0
+    # assert sisso.models[0][0].n_convex_overlap_train == 4
+    # assert sisso.models[1][0].n_convex_overlap_train == 0
 
     assert sisso.models[0][0].n_convex_overlap_test == 0
-    assert sisso.models[1][0].n_convex_overlap_test == 0
+    # assert sisso.models[1][0].n_convex_overlap_test == 0
+
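+    # For classification the stored property is presumably remapped to internal
+    # class labels, so it should differ elementwise from the raw input property.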
+    assert np.all(sisso.prop_train != inputs.prop_train)
+    assert np.all(sisso.prop_test != inputs.prop_test)
+
+    assert np.all(sisso.task_sizes_train == inputs.task_sizes_train)
+    assert np.all(sisso.task_sizes_test == inputs.task_sizes_test)
 
 
 if __name__ == "__main__":
diff --git a/tests/pytest/test_descriptor_identifier/test_log_reg_model_from_file.py b/tests/pytest/test_descriptor_identifier/test_log_reg_model_from_file.py
index a7ad48dd749cedcf2f44c2c22fac76c301e7dbca..e125c8e45f09876b0bc65ac3adadda677298178d 100644
--- a/tests/pytest/test_descriptor_identifier/test_log_reg_model_from_file.py
+++ b/tests/pytest/test_descriptor_identifier/test_log_reg_model_from_file.py
@@ -55,7 +55,7 @@ def test_log_reg_model_from_file():
     assert model.feats[1].postfix_expr == "3|1|abd"
 
     actual_coefs = [
-        [1.20, -1.95, 2.194569927587456e-13],
+        [1.20, -1.95],
     ]
 
     assert np.all(
@@ -89,9 +89,10 @@ def test_log_reg_model_from_file():
     assert model.percentile_95_ae < 1e-7
     assert model.percentile_95_test_ae < 1e-7
 
     assert (
         model.latex_str
-        == "$\\exp\\left(c_0\\right)\\left(\\left(B + A\\right)\\right)^{a_0}\\left(\\left(\\left|D - B\\right|\\right)\\right)^{a_1}$"
+        == "$\\left(\\left(B + A\\right)\\right)^{a_0}\\left(\\left(\\left|D - B\\right|\\right)\\right)^{a_1}$"
     )
 
 
diff --git a/tests/pytest/test_descriptor_identifier/test_log_reg_train_model_from_file.py b/tests/pytest/test_descriptor_identifier/test_log_reg_train_model_from_file.py
index fdb945401e9182dc1fb3a66a855ded363eef5ba6..d6ed7824cd33aa0906ade355b13878bdd81d3f7a 100644
--- a/tests/pytest/test_descriptor_identifier/test_log_reg_train_model_from_file.py
+++ b/tests/pytest/test_descriptor_identifier/test_log_reg_train_model_from_file.py
@@ -40,7 +40,7 @@ def test_log_reg_model_from_file():
     assert model.feats[1].postfix_expr == "3|1|abd"
 
     actual_coefs = [
-        [1.20, -1.95, 2.194569927587456e-13],
+        [1.20, -1.95],
     ]
 
     assert np.all(
@@ -60,7 +60,7 @@ def test_log_reg_model_from_file():
     assert model.percentile_95_ae < 1e-7
     assert (
         model.latex_str
-        == "$\\exp\\left(c_0\\right)\\left(\\left(B + A\\right)\\right)^{a_0}\\left(\\left(\\left|D - B\\right|\\right)\\right)^{a_1}$"
+        == "$\\left(\\left(B + A\\right)\\right)^{a_0}\\left(\\left(\\left|D - B\\right|\\right)\\right)^{a_1}$"
     )
 
 
diff --git a/tests/pytest/test_descriptor_identifier/test_log_regressor.py b/tests/pytest/test_descriptor_identifier/test_log_regressor.py
index bb56c606ba77f3f97bf53b180fefe9b4eb927735..292868b6c9fa97f25f4a6d6fa06c68efcde97f54 100644
--- a/tests/pytest/test_descriptor_identifier/test_log_regressor.py
+++ b/tests/pytest/test_descriptor_identifier/test_log_regressor.py
@@ -83,6 +83,12 @@ def test_sisso_log_regressor():
     assert sisso.models[1][0].rmse < 1e-7
     assert sisso.models[1][0].test_rmse < 1e-7
 
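+    # Log regression fits in log space, so the stored training/test properties
+    # should be the elementwise log of the input properties.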
+    assert np.all(np.abs(sisso.prop_train - np.log(inputs.prop_train)) < 1e-10)
+    assert np.all(np.abs(sisso.prop_test - np.log(inputs.prop_test)) < 1e-10)
+
+    assert np.all(sisso.task_sizes_train == inputs.task_sizes_train)
+    assert np.all(sisso.task_sizes_test == inputs.task_sizes_test)
+
 
 if __name__ == "__main__":
     test_sisso_log_regressor()
diff --git a/tests/pytest/test_descriptor_identifier/test_regressor.py b/tests/pytest/test_descriptor_identifier/test_regressor.py
index 9bdbf81eee94ced8514a37f1911fb4fbd8e41b38..1b7be5164e212b87ffa734894917e34f8cad2c33 100644
--- a/tests/pytest/test_descriptor_identifier/test_regressor.py
+++ b/tests/pytest/test_descriptor_identifier/test_regressor.py
@@ -93,6 +93,12 @@ def test_sisso_regressor():
     assert np.all(inputs.task_sizes_train == sisso.models[1][0].task_sizes_train)
     assert np.all(inputs.task_sizes_test == sisso.models[1][0].task_sizes_test)
 
+    assert np.all(np.abs(sisso.prop_train - inputs.prop_train) < 1e-10)
+    assert np.all(np.abs(sisso.prop_test - inputs.prop_test) < 1e-10)
+
+    assert np.all(sisso.task_sizes_train == inputs.task_sizes_train)
+    assert np.all(sisso.task_sizes_test == inputs.task_sizes_test)
+
 
 if __name__ == "__main__":
     test_sisso_regressor()
diff --git a/tests/pytest/test_feature_creation/test_feature_space/test_feature_space.py b/tests/pytest/test_feature_creation/test_feature_space/test_feature_space.py
index 2987cf21c929ebcf6890722874c0cfc9d840003e..9a7dfe9a689351b44fd0e55ec047d218a0ca4c97 100644
--- a/tests/pytest/test_feature_creation/test_feature_space/test_feature_space.py
+++ b/tests/pytest/test_feature_creation/test_feature_space/test_feature_space.py
@@ -36,7 +36,7 @@ def test_feature_space():
             f"feat_{ff}",
             np.random.random(task_sizes_train[0]) * 1e2 - 50,
             np.random.random(task_sizes_test[0]) * 1e2 - 50,
-            Unit(),
+            Unit("s"),
         )
         for ff in range(10)
     ]
@@ -50,6 +50,34 @@ def test_feature_space():
     inputs.max_rung = 2
     inputs.n_sis_select = 10
 
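+    # Invalid FeatureSpace configurations are expected to raise RuntimeError; each
+    # except block restores the offending input before continuing.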
+    try:
+        inputs.n_rung_generate = 2
+        feat_space = FeatureSpace(inputs)
+        raise ValueError("FeatureSpace created with invalid parameters")
+    except RuntimeError:
+        inputs.n_rung_generate = 0
+
+    try:
+        inputs.n_rung_generate = 1
+        inputs.n_rung_store = 2
+        feat_space = FeatureSpace(inputs)
+        raise ValueError("FeatureSpace created with invalid parameters")
+    except RuntimeError:
+        inputs.n_rung_generate = 0
+        inputs.n_rung_store = 1
+
+    try:
+        inputs.allowed_ops = ["exp"]
+        feat_space = FeatureSpace(inputs)
+        raise ValueError(
+            "FeatureSpace created when there is a rung with no features created"
+        )
+    except RuntimeError:
+        inputs.allowed_ops = ["add", "sub", "mult", "sq", "cb", "sqrt", "cbrt"]
+
     feat_space = FeatureSpace(inputs)
     feat_space.sis(inputs.prop_train)
 
@@ -76,10 +104,25 @@ def test_feature_space():
     assert feat_space.get_feature(0).expr == "feat_0"
     assert feat_space.phi_selected[1].d_mat_ind == 1
 
+    try:
+        feat_space.remove_feature(feat_space.phi_selected[0].feat_ind)
+        raise ValueError("Removed selected feature.")
+    except RuntimeError:
+        pass
+
     test_expr = feat_space.get_feature(len(feat_space.phi) - 2).expr
     feat_space.remove_feature(len(feat_space.phi) - 2)
     assert feat_space.get_feature(len(feat_space.phi) - 2).expr != test_expr
 
+    if feat_space.parameterized_feats_allowed:
+        try:
+            inputs.max_param_depth = 10
+            feat_space = FeatureSpace(inputs)
+            raise ValueError("FeatureSpace created with invalid parameters")
+        except RuntimeError:
+            inputs.max_param_depth = 0
+
 
 if __name__ == "__main__":
     test_feature_space()