CorrelatedFieldMaker amplitude normalization breaks fluctuation amplitude
If one adds more than one amplitude operator to a CorrelatedFieldMaker
and sets a very small or very large offset standard deviation, the fluctuation amplitude of the resulting field gets distorted.
This is because in get_normalized_amplitudes()
every amplitude operator is divided by self.azm,
and the results of this function are then combined via an outer product to build the joint amplitude operator.
In correlated fields with more than one amplitude operator, the joint amplitude operator therefore gets divided by multiple powers of the zero-mode operator, which leads to incorrect output scaling.
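To make the scaling issue explicit, here is a minimal numpy sketch (not NIFTy code; azm, a1 and a2 are placeholder stand-ins for the zero mode and the two amplitude spectra):

import numpy as np

azm = 0.01       # stand-in for the total-offset zero mode
a1 = np.ones(5)  # stand-in for the first amplitude spectrum
a2 = np.ones(5)  # stand-in for the second amplitude spectrum

# Each normalized amplitude carries a factor 1/azm, so the outer product
# gets divided by azm twice ...
joint = np.outer(a1 / azm, a2 / azm)
# ... whereas the joint amplitude should only be divided by azm once.
intended = np.outer(a1, a2) / azm

print(joint.mean() / intended.mean())  # 1/azm = 100.0, the extra scaling factor

This extra 1/azm factor is what makes the 'dual' sample std depend on the offset_std setting in the reproduction below.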
To demonstrate this effect, I created correlated field operators for identical 2d fields: one with a single amplitude operator on ift.RGSpace((N, N)) and one with two amplitude operators on ift.RGSpace(N).
From top to bottom, I varied the offset std setting passed to set_amplitude_total_offset()
and in each case plotted a histogram of the sampled fields' standard deviations.
One can clearly see that the output fluctuation amplitude is independent of the offset_std
for the 'single' case, as it should be, but not for the 'dual' case.
A fix for this is proposed in merge requests !669 (closed) and !670 (closed).
To reproduce, run the following code:
import nifty7 as ift
import matplotlib.pyplot as plt

fluct_pars = {
    'fluctuations': (1.0, 0.1),
    'flexibility': None,
    'asperity': None,
    'loglogavgslope': (-2.0, 0.2),
    'prefix': 'flucts_',
    'dofdex': None
}

n_offsets = 5
n_samples = 1000

dom_single = ift.RGSpace((25, 25))
dom_dual = ift.RGSpace(25)

fig, axs = plt.subplots(nrows=n_offsets, ncols=2, figsize=(12, 4 * n_offsets))
for i in range(n_offsets):
    # 'single': one amplitude operator on the 2d domain;
    # 'dual': two amplitude operators on the 1d domain
    cfmaker_single = ift.CorrelatedFieldMaker(prefix=str(i) + '_')
    cfmaker_dual = ift.CorrelatedFieldMaker(prefix=str(i) + '_')
    cfmaker_single.add_fluctuations(dom_single, **fluct_pars)
    cfmaker_dual.add_fluctuations(dom_dual, **fluct_pars)
    cfmaker_dual.add_fluctuations(dom_dual, **fluct_pars)

    # Shrink the offset std by a factor of ten per row
    offset_std = 10 ** -i
    zm_pars = {'offset_mean': 0.,
               'offset_std': (offset_std, offset_std / 10.)}
    cfmaker_single.set_amplitude_total_offset(**zm_pars)
    cfmaker_dual.set_amplitude_total_offset(**zm_pars)
    cf_single = cfmaker_single.finalize(prior_info=0)
    cf_dual = cfmaker_dual.finalize(prior_info=0)

    # Draw prior samples and record each sample's standard deviation
    samples_single = [cf_single(ift.from_random(cf_single.domain)).val.std()
                      for _ in range(n_samples)]
    samples_dual = [cf_dual(ift.from_random(cf_dual.domain)).val.std()
                    for _ in range(n_samples)]

    _ = axs[i, 0].hist(samples_single, bins=20, density=True)
    _ = axs[i, 1].hist(samples_dual, bins=20, density=True)
    axs[i, 0].set_title(f'offset_std = {offset_std:1.0e}, single')
    axs[i, 1].set_title(f'offset_std = {offset_std:1.0e}, dual')
    axs[i, 0].set_xlabel('sample std')
    axs[i, 1].set_xlabel('sample std')

plt.tight_layout()
plt.show()
Any comments?