NIFTy issues
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues

## Issue #354: Nifty Documentation 7->8 (Vincent Eberle, 2022-12-01)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/354

# Changes in the Documentation
We have been thinking about changing the default [NIFTy documentation](http://ift.pages.mpcdf.de/nifty/) to nifty8, since it is now effectively the stable version.
I can think of two different scenarios:
1) We drop the online nifty7 documentation.
2) We move the nifty8 index.html into the base directory and the nifty7 one into a 'nifty7' subdirectory (exactly the reverse of the current layout).
Neither is much work. Personally, I would prefer the first option.
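Option 2 could be sketched roughly like this in the pages job (all paths and directory names below are hypothetical placeholders, not the actual CI layout):

```shell
# Hypothetical sketch of option 2: publish the nifty8 docs at the root
# of the pages tree and the nifty7 docs one level down.
set -e
mkdir -p build/nifty7 build/nifty8 public/nifty7
echo nifty8 > build/nifty8/index.html   # stand-ins for the built docs
echo nifty7 > build/nifty7/index.html
cp -r build/nifty8/. public/            # nifty8 index.html at the root
cp -r build/nifty7/. public/nifty7/     # nifty7 under /nifty7/
```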
@gedenhof @pfrank @mtr

Assignee: Vincent Eberle

## Issue #353: Expensive sanity checks in optimize_kl.py? (Martin Reinecke, 2022-11-09)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/353

The file `optimize_KL.py` contains several calls to `check_MPI_equality`, to ensure that certain objects are identical on all MPI tasks. While this is definitely useful, some of these calls try to compare objects that can become huge (which leads to a lot of additional memory/time consumption) and will in extreme cases exceed 2GB, causing MPI failures.
Would it be possible to either remove these cases or simplify the comparison somehow, e.g. by hashing the data and comparing the hash across MPI tasks?
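A hash-based comparison could be sketched like this (a hypothetical helper, not existing NIFTy API; it assumes the pickled representation is deterministic across tasks):

```python
import hashlib
import pickle


def check_mpi_equality_hashed(obj, comm):
    """Compare an object across MPI tasks by exchanging only a short
    digest instead of the full (possibly multi-GB) pickled object.
    Hypothetical sketch; relies on pickle being deterministic."""
    digest = hashlib.sha256(pickle.dumps(obj)).hexdigest()
    if comm is None:          # single-task run: nothing to compare
        return digest
    digests = comm.allgather(digest)   # only ~64 bytes per task
    if any(d != digests[0] for d in digests):
        raise ValueError("object differs between MPI tasks")
    return digest
```

With `comm=None` the check degenerates to computing the digest, which makes single-task testing easy.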
@pfrank, @gedenhof, @kjako

## Issue #352: Minisanity history broken if likelihoods with different names are used (Philipp Arras, 2022-05-13)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/352

Reproducer:
```python
import nifty8 as ift
dom = ift.RGSpace(2)
lh0 = ift.GaussianEnergy(domain=dom).ducktape("inp")
lh1 = ift.GaussianEnergy(domain=dom).ducktape("inp")
lh0.name = "First lh"
lh1.name = "Second lh"
ic = ift.GradientNormController(iteration_limit=2)
mini = ift.SteepestDescent(ic)
def lhs(iglobal):
    if iglobal == 0:
        return lh0
    return lh1
ift.optimize_kl(lhs, 2, 0, mini, ic, None, overwrite=True)
```
Output:
```
Iteration limit reached. Assuming convergence
========================================================
reduced χ² mean # dof
--------------------------------------------------------
Data residuals
First lh 0.0 0.0 2
Latent space
inp 0.0 0.0 2
========================================================
/home/philipp/git/nifty/nifty8/plot.py:342: UserWarning: Attempting to set identical left == right == 0.0 results in singular transformations; automatically expanding.
ax.set_xlim((mi-delta, ma+delta))
========================================================
reduced χ² mean # dof
--------------------------------------------------------
Data residuals
Second lh 0.0 0.0 2
Latent space
inp 0.0 0.0 2
========================================================
Traceback (most recent call last):
File "/home/philipp/asdf.py", line 18, in <module>
ift.optimize_kl(lhs, 2, 0, mini, ic, None, overwrite=True)
File "/home/philipp/git/nifty/nifty8/minimization/optimize_kl.py", line 392, in optimize_kl
_minisanity(likelihood_energy, iglobal, sl, comm, plot_minisanity_history)
File "/home/philipp/git/nifty/nifty8/minimization/optimize_kl.py", line 627, in _minisanity
v = ms_val[k1][k2][k3]
KeyError: 'First lh'
```

Assignee: Lukas Platz

## Issue #351: Correlated field model linearization adjoint very slow if total_N != 0 (Jakob Roth, 2022-03-29)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/351

As noticed by @parras, @pfrank, and me, the adjoint of the linearization of the correlated field model is very slow if total_N != 0.
Here is a demo:
```python
import nifty8 as ift
import numpy as np
sp1 = ift.RGSpace((4000, 4000))
cfmaker = ift.CorrelatedFieldMaker('')
cfmaker.add_fluctuations(sp1, (0.1, 1e-2), (2, .2), (.01, .5), (-4, 2.),
'amp1')
cfmaker.set_amplitude_total_offset(0., (1e-2, 1e-6))
cf0 = cfmaker.finalize(0)
n_tot=1
cfmaker = ift.CorrelatedFieldMaker('', total_N=n_tot)
cfmaker.add_fluctuations(sp1, (0.1, 1e-2), (2, .2), (.01, .5), (-4, 2.),
'amp1', dofdex=np.arange(n_tot))
cfmaker.set_amplitude_total_offset(0., (1e-2, 1e-6), dofdex=np.arange(n_tot))
cf1 = cfmaker.finalize(0)
print("benchmark for total_N = 0")
ift.exec_time(cf0)
print("benchmark for total_N = 1")
ift.exec_time(cf1)
```
I had a quick look at the issue. The problem seems to be the _Distributor operator in the correlated field model file: https://gitlab.mpcdf.mpg.de/ift/nifty/-/blob/NIFTy_8/src/library/correlated_fields.py#L214
For the case of total_N != 0 this _Distributor is called multiple times. The call which is very slow in adjoint direction is in line: https://gitlab.mpcdf.mpg.de/ift/nifty/-/blob/NIFTy_8/src/library/correlated_fields.py#L362 and the line below.
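For intuition, the distribution pattern and its adjoint can be sketched with plain NumPy (this is only an illustration, not NIFTy's actual _Distributor implementation; the real cause of the slowdown may differ):

```python
# Toy sketch of a "distributor": the forward direction broadcasts one
# field to total_N copies; the adjoint sums the copies back together.
# A naive adjoint that copies per index instead of summing in one pass
# scales poorly for large fields, which is one plausible slow spot.
import numpy as np


def distribute(x, total_n):
    # forward: shape (size,) -> (total_n, size), no data copied
    return np.broadcast_to(x, (total_n,) + x.shape)


def distribute_adjoint(y):
    # adjoint of broadcasting is summation over the copies axis
    return y.sum(axis=0)
```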
Note that this _Distributor operator is only used for the case total_N != 0, so the simple case with total_N = 0 is reasonably fast.

## Issue #349: Do not track ipynb and instead use jupytext (Gordian Edenhofer, 2022-05-28)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/349

See https://gitlab.mpcdf.mpg.de/ift/nifty/-/merge_requests/702#note_111326 .

Assignee: Gordian Edenhofer

## Issue #348: Pipeline broken? (Jakob Roth, 2022-01-27)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/348

It seems to me that our pipeline is broken. The check_no_assert step fails for !734 although no assert was introduced.

## Issue #347: More MPI Bugs (Jakob Roth, 2022-02-01)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/347

When I execute the getting_started_3.py script with MPI (`mpiexec -n 2 python getting_started_3.py`), I get the following error message:
```
Traceback (most recent call last):
File "/home/jakob/nifty/demos/getting_started_3.py", line 163, in <module>
main()
File "/home/jakob/nifty/demos/getting_started_3.py", line 153, in main
[pspec.force(mock_position), samples.average(logspec).exp()],
File "/home/jakob/nifty/nifty8/minimization/sample_list.py", line 306, in average
return utilities.allreduce_sum(res, self.comm) / n
File "/home/jakob/nifty/nifty8/utilities.py", line 371, in allreduce_sum
vals[j] = vals[j] + comm.recv(source=who[j+step])
File "/home/jakob/nifty/nifty8/field.py", line 726, in func2
return self._binary_op(other, op)
File "/home/jakob/nifty/nifty8/field.py", line 689, in _binary_op
utilities.check_object_identity(other._domain, self._domain)
File "/home/jakob/nifty/nifty8/utilities.py", line 419, in check_object_identity
raise ValueError(f"Mismatch:\n{obj0}\n{obj1}")
ValueError: Mismatch:
DomainTuple, len: 1
* PowerSpace(harmonic_partner=RGSpace(shape=(128, 128), distances=(1.0, 1.0), harmonic=True), binbounds=None)
DomainTuple, len: 1
* PowerSpace(harmonic_partner=RGSpace(shape=(128, 128), distances=(1.0, 1.0), harmonic=True), binbounds=None)
```
When executing without MPI I get no error.

## Issue #346: Nifty installation (Vincent Eberle, 2022-11-25)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/346

Installation Issues
-------------------
Jakob Roth and I found a concerning issue.
Recently it has not been possible to run `pip install -e .`;
somehow this returns the error below [1].
The installation works fine with plain `pip install .`, however.
You can try to reproduce it by uninstalling nifty and reinstalling it.
Does someone know why this is happening?
@jroth @parras @mtr @gedenhof
[1]

```
ERROR: Command errored out with exit status 1:
 command: /usr/bin/python3 -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/home/veberle/MPA/nifty/setup.py'"'"'; __file__='"'"'/home/veberle/MPA/nifty/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' develop --no-deps --user --prefix=
 cwd: /home/veberle/MPA/nifty/
 Complete output (28 lines):
 running develop
 /tmp/pip-build-env-9z5pmolp/overlay/lib/python3.8/site-packages/setuptools/command/easy_install.py:156: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
   warnings.warn(
 Traceback (most recent call last):
   File "<string>", line 1, in <module>
   File "/home/veberle/MPA/nifty/setup.py", line 33, in <module>
     setup(name="nifty8",
   File "/tmp/pip-build-env-9z5pmolp/overlay/lib/python3.8/site-packages/setuptools/__init__.py", line 153, in setup
     return distutils.core.setup(**attrs)
   File "/tmp/pip-build-env-9z5pmolp/overlay/lib/python3.8/site-packages/setuptools/_distutils/core.py", line 148, in setup
     return run_commands(dist)
   File "/tmp/pip-build-env-9z5pmolp/overlay/lib/python3.8/site-packages/setuptools/_distutils/core.py", line 163, in run_commands
     dist.run_commands()
   File "/tmp/pip-build-env-9z5pmolp/overlay/lib/python3.8/site-packages/setuptools/_distutils/dist.py", line 967, in run_commands
     self.run_command(cmd)
   File "/tmp/pip-build-env-9z5pmolp/overlay/lib/python3.8/site-packages/setuptools/_distutils/dist.py", line 985, in run_command
     cmd_obj.ensure_finalized()
   File "/tmp/pip-build-env-9z5pmolp/overlay/lib/python3.8/site-packages/setuptools/_distutils/cmd.py", line 107, in ensure_finalized
     self.finalize_options()
   File "/tmp/pip-build-env-9z5pmolp/overlay/lib/python3.8/site-packages/setuptools/command/develop.py", line 52, in finalize_options
     easy_install.finalize_options(self)
   File "/tmp/pip-build-env-9z5pmolp/overlay/lib/python3.8/site-packages/setuptools/command/easy_install.py", line 263, in finalize_options
     self._fix_install_dir_for_user_site()
   File "/tmp/pip-build-env-9z5pmolp/overlay/lib/python3.8/site-packages/setuptools/command/easy_install.py", line 375, in _fix_install_dir_for_user_site
     self.select_scheme(scheme_name)
   File "/tmp/pip-build-env-9z5pmolp/overlay/lib/python3.8/site-packages/setuptools/command/easy_install.py", line 716, in select_scheme
     scheme = INSTALL_SCHEMES[name]
 KeyError: 'unix_user'
 ----------------------------------------
ERROR: Can't roll back nifty8; was not uninstalled
ERROR: Command errored out with exit status 1: /usr/bin/python3 -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/home/veberle/MPA/nifty/setup.py'"'"'; __file__='"'"'/home/veberle/MPA/nifty/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' develop --no-deps --user --prefix= Check the logs for full command output.
```

## Issue #345: sample list average crashes for mpi run with map estimator (Jakob Roth, 2022-01-27)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/345

When computing a map estimate with multiple MPI tasks, the sample list average method crashes.
Here is example code that works when executed normally with Python, but crashes in an MPI run:
```python
import numpy as np
import nifty8 as ift
ift.random.push_sseq_from_seed(27)
try:
    from mpi4py import MPI
    comm = MPI.COMM_WORLD
    master = comm.Get_rank() == 0
except ImportError:
    comm = None
    master = True
position_space = ift.RGSpace([128, 128])
op = ift.makeOp(ift.full(position_space, 10.))
noise = 0.1
N = ift.ScalingOperator(position_space, noise, np.float64)
mock_position = ift.from_random(op.domain)
data = op(mock_position) + N.draw_sample()
lh = ift.GaussianEnergy(mean=data, inverse_covariance=N.inverse) @ op
ic_sampling = ift.AbsDeltaEnergyController(
name="Sampling (linear)", deltaE=0.05, iteration_limit=10
)
ic_newton = ift.AbsDeltaEnergyController(
name="Newton", deltaE=0.5, convergence_level=2, iteration_limit=5
)
minimizer = ift.NewtonCG(ic_newton)
def callback(samples, i):
    plot = ift.Plot()
    mean = samples.average(op)
    plot.add(mean, title="Reconstruction", zmin=0, zmax=1)
    if master:
        plot.output()
n_iterations = 3
n_samples = lambda iiter: 0 if iiter < 1 else 2
samples = ift.optimize_kl(
lh,
n_iterations,
n_samples,
minimizer,
ic_sampling,
None,
overwrite=True,
comm=comm,
callback=callback,
)
```

## Issue #344: Unexpected behaviour of `MultiField.scale` (Philipp Arras, 2022-01-31)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/344

The setup:
```python
import nifty8 as ift
import numpy as np
dom = ift.UnstructuredDomain([1])
mdom = {"a": dom, "b": dom}
fld = ift.from_random(mdom)
fld2 = ift.from_random(mdom)
```
I would expect the following line to fail since the argument of scale is not a scalar.
```python
fld3 = fld.scale(fld2)
```
But it doesn't.
Does it at least compute something sensible?
```python
for kk in mdom.keys():
    assert np.all(fld[kk].val * fld2[kk].val == fld3[kk].val)
```
No.

## Issue #341: Correlated Field model, `total_N != 0`: Sample statistic is not correct (Philipp Arras, 2021-12-08)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/341

If I compare prior samples generated by `SimpleCorrelatedField` and `CorrelatedFieldMaker` with `total_N > 0`, the statistics look different.
```python
import nifty8 as ift
offset_mean = 0.
offset_std = None
fluctuations = (0.1, 0.05)
loglogslope = (-4, 1)
dom = ift.RGSpace(2625, 20)
op = ift.SimpleCorrelatedField(dom, offset_mean, offset_std, fluctuations, None, None, loglogslope)
ift.single_plot([op(ift.from_random(op.domain)) for _ in range(20)], name="cfm_single.png")
N = 1
cfm = ift.CorrelatedFieldMaker("", N)
cfm.add_fluctuations(dom, fluctuations, None, None, loglogslope)
cfm.set_amplitude_total_offset(offset_mean, offset_std)
op = cfm.finalize(0)
prior_samples = [op(ift.from_random(op.domain)) for _ in range(10)]
p = ift.Plot()
for ii in range(N):
    p.add([ift.DomainTupleFieldInserter(pp.domain, 0, (ii,)).adjoint(pp) for pp in prior_samples])
p.output(name="cfm_multi.png")
```
Prior samples for SimpleCorrelatedField:
![cfm_single](/uploads/3f9db21b292bccab96ec48c94e3296c6/cfm_single.png)
Prior samples for non-trivial CorrelatedFieldMaker with `N_total = 1`
![cfm_multi](/uploads/564ca8466f3e955ab7218bfe8ff8149f/cfm_multi.png)
ping: @jroth

## Issue #339: optimize_kl: Improvement list (Philipp Arras, 2021-11-26)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/339

- [x] Fix sample plot with ground truth
- [x] Option for storing all intermediate latent positions
- [x] String of model and domain to hdf5
- [x] Save nifty random state
- [x] Pseudo code of optimize_kl into doc string
- [x] plottable spelling?
- [x] Check documentation on callback (iglobal)

Assignee: Philipp Arras

## Issue #337: Possible error in assert_equal and assert_allclose for MultiFields (Philipp Frank, 2021-09-24)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/337

In their current form, both methods `assert_equal` and `assert_allclose` only perform a one-sided test for `MultiField`s. This leads to unexpected behaviour when comparing two fields which are only equal on some sub-domain. In particular, the following test
```python
import nifty8 as ift
d = ift.RGSpace(10)
d = ift.MultiDomain.make({'a' : d, 'b' : d})
f = ift.from_random(d)
fp = f.extract_by_keys(['a',])
ift.extra.assert_equal(fp, f)
```
will pass, but
```python
ift.extra.assert_equal(f, fp)
```
produces an error, as only the keys of the first input are looped over. The same goes for `assert_allclose`.
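A symmetric variant could compare the key sets first; sketched here on plain dicts standing in for MultiFields (names hypothetical, not the NIFTy implementation):

```python
# Two-sided equality check: comparing the key sets up front catches
# extra or missing keys in *either* argument, so the check is symmetric.
def assert_equal_twosided(a, b):
    # `a` and `b` are plain dicts here, standing in for MultiFields
    if a.keys() != b.keys():
        raise AssertionError(f"key mismatch: {set(a) ^ set(b)}")
    for k in a:
        if a[k] != b[k]:
            raise AssertionError(f"values differ for key {k!r}")
```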
This does not look like an intended behaviour to me. @mtr, @parras, am I missing something here?

## Issue #336: Sampling dtypes in sandwich operators (Philipp Arras, 2021-09-22)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/336

I think (but I am not sure) that it is relatively important to fix the FIXME that @pfrank has put here:
https://gitlab.mpcdf.mpg.de/ift/nifty/-/blob/NIFTy_8/src/operators/sandwich_operator.py#L77

## Issue #335: CorrelatedFieldMaker amplitude normalization breaks fluctuation amplitude (Lukas Platz, 2021-08-24)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/335

If one adds more than one amplitude operator to a `CorrelatedFieldMaker` and sets a very small/large offset standard deviation, the field fluctuation amplitude gets distorted.
This is because in `get_normalized_amplitudes()` every amplitude operator gets divided by `self.azm`, and the results of this function are then multiplied via an outer product to create the joint amplitude operator.
In CFs with more than one amplitude operator, this results in the joint amplitude operator being divided by multiple powers of the zero-mode operator, and thus in an incorrect output scaling behavior.
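The double division can be illustrated with a toy computation (plain NumPy with made-up numbers, not NIFTy operators):

```python
# Toy illustration: each "normalized" amplitude is divided by the zero
# mode azm; the outer product of n_amp = 2 of them then carries a factor
# azm**(-2) instead of the intended azm**(-1), so the joint amplitude
# scaling depends on azm whenever more than one amplitude is used.
import numpy as np

azm = 0.01                   # hypothetical zero-mode / offset-std value
a1 = np.ones(4) / azm        # sub-domain amplitude 1, divided by azm
a2 = np.ones(4) / azm        # sub-domain amplitude 2, divided by azm
joint = np.outer(a1, a2)     # joint amplitude over both sub-domains
assert np.isclose(joint[0, 0], azm**-2)  # 1e4, not the intended 1e2
```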
To demonstrate this effect, I created correlated field operators for identical 2d fields, but one with a single amplitude operator for `ift.RGSpace((N, N))` and one with two amplitude operators on `ift.RGSpace(N)`. From top to bottom, I varied the `offset_std` setting passed to `set_amplitude_total_offset()` and in each case plotted a histogram of the sampled fields' standard deviations.
![output](/uploads/6779601e231148c36a77a91aa8e7fd62/bug.png)
One can clearly see that the output fluctuation amplitude is independent of `offset_std` for the 'single' case, as it should be, but not for the 'dual' case.
A fix for this is proposed in merge requests !669 and !670.
To reproduce, run the following code:
```python
import nifty7 as ift
import matplotlib.pyplot as plt
fluct_pars = {
'fluctuations': (1.0, 0.1),
'flexibility': None,
'asperity': None,
'loglogavgslope': (-2.0, 0.2),
'prefix': 'flucts_',
'dofdex': None
}
n_offsets = 5
n_samples = 1000
dom_single = ift.RGSpace((25, 25))
dom_dual = ift.RGSpace(25)
fig, axs = plt.subplots(nrows=n_offsets, ncols=2, figsize=(12, 4 * n_offsets))
for i in range(n_offsets):
    cfmaker_single = ift.CorrelatedFieldMaker(prefix=f'{i}_')
    cfmaker_dual = ift.CorrelatedFieldMaker(prefix=f'{i}_')
    cfmaker_single.add_fluctuations(dom_single, **fluct_pars)
    cfmaker_dual.add_fluctuations(dom_dual, **fluct_pars)
    cfmaker_dual.add_fluctuations(dom_dual, **fluct_pars)
    offset_std = 10 ** -i
    zm_pars = {'offset_mean': 0.,
               'offset_std': (offset_std, offset_std / 10.)}
    cfmaker_single.set_amplitude_total_offset(**zm_pars)
    cfmaker_dual.set_amplitude_total_offset(**zm_pars)
    cf_single = cfmaker_single.finalize(prior_info=0)
    cf_dual = cfmaker_dual.finalize(prior_info=0)
    samples_single = [cf_single(ift.from_random(cf_single.domain)).val.std() for _ in range(n_samples)]
    samples_dual = [cf_dual(ift.from_random(cf_dual.domain)).val.std() for _ in range(n_samples)]
    _ = axs[i, 0].hist(samples_single, bins=20, density=True)
    _ = axs[i, 1].hist(samples_dual, bins=20, density=True)
    axs[i, 0].set_title(f'offset_std = {offset_std:1.0e}, single')
    axs[i, 1].set_title(f'offset_std = {offset_std:1.0e}, dual')
    axs[i, 0].set_xlabel('sample std')
    axs[i, 1].set_xlabel('sample std')
plt.tight_layout()
plt.show()
```
Any comments?

## Issue #334: `MetricGaussianKL` changed its interface (there is no `.make`) but there is no corresponding changelog entry (Gordian Edenhofer, 2021-07-01)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/334

## Issue #333: New feature: Add flag to minisanity that disables terminal colors (Philipp Arras, 2021-08-05)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/333

Assignee: Philipp Arras

## Issue #321: Windows compatibility (Lukas Platz, 2021-05-05)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/321

A collaborator of mine just tried to install Nifty on a Windows machine in an Anaconda environment and had the problem that the symlink from `nifty7` to `src` was apparently breaking the setup. Once she removed it and renamed `src` to `nifty7`, the setup worked.
I have not checked if this is the general behavior under Windows or if it is just her setup, but assume it is the former.
Does anybody else have experience with this and can weigh in?
What was the rationale behind changing the source location and introducing the symlink?
Cheers,
Lukas

## Issue #320: Is `ift.operators.operator._FunctionApplier` exposed to the NIFTy namespace? If not, why? (Lukas Platz, 2021-04-07)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/320

I just had the case where I wanted to prepend a pointwise operator to a given operator. For *ap*pending a pointwise operator, we have the syntax `op.ptw()`, but do we also have a direct way to *pre*pend an operator?
What I came up with on a hunch was `op @ ift.ScalingOperator(op.domain, 1.).abs()`, but that is just horrible.
Is there a simple way to do this that I forgot? If not, why don't we expose `operator._FunctionApplier` as `ift.FunctionApplier`?
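For illustration, the kind of helper being asked about would look roughly like this (plain Python callables standing in for NIFTy operators; all names here are hypothetical):

```python
# Sketch: a standalone pointwise "function applier" that can be composed
# *before* another operator, instead of only appending via op.ptw().
import numpy as np


def function_applier(fname):
    """Return a callable applying the named NumPy ufunc pointwise."""
    func = getattr(np, fname)  # e.g. "exp", "log", "abs"
    return lambda x: func(x)


def compose(outer, inner):
    # stand-in for NIFTy's `outer @ inner` operator composition
    return lambda x: outer(inner(x))


square = lambda x: x * x
op = compose(square, function_applier("exp"))  # computes exp(x)**2
```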
Cheers!

## Issue #319: AttributeError: 'OperatorAdapter' object has no attribute 'duckape' (Gordian Edenhofer, 2021-03-07)
https://gitlab.mpcdf.mpg.de/ift/nifty/-/issues/319

OperatorAdapter, e.g. the adjoint of an operator, should support ducktaping.