
Telecon Notes Sept. 9, 2021

Agenda:

  1. NILC
  2. Kris

=== Notes (Shaul) ===

    • Unni's simulations can do models 91, 92, and 93. Models 96 and 98 are not part of the PySM library and would be more difficult to implement. They are also likely to be more difficult to account for with Commander and thus may require more time investment.
    • consensus: move ahead with 91, 92, and 93, and with Commander 1.
    • Otherwise, Ragnhild is still working on 91 with r=0.
  • NILC: Mathieu has all the results.
    • SH advocates making a table with the various results
    • SH will ask Colin and Alex to participate in telecon and explain the 73%
    • MR should change the results to 73% delensing
  • Small scale synchrotron
    • still waiting for Kris' results

Hi all,

To us, the question “So, what are the lowest and highest frequencies that should be observed, and why?” really seems quite poorly defined, and will never have a concrete answer. It depends on way too many variables, and setting the limit at 10, 15, 20, or 30 GHz (and similarly at the high-frequency end) will to a large extent depend on personal preferences and biases.

Rather than focusing too hard on the specific physics in question at either side, to us it seems more fruitful to consider signal-to-noise ratio, which really is what this is all about. The CMB uncertainty is set essentially by the individual component uncertainties amplified by the condition number of the mixing matrix, so the goal is always to reduce both of these as much as possible. Then it all becomes a question of economy: What's the cheapest way of doing that? Is it by extending the frequency range (which may require new technology or a larger telescope), or is it by adding more detectors near the foreground minimum?
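
To make this concrete, here is a toy sketch (illustrative channel sets and idealized SEDs of our choosing, not any actual pipeline): a two-component generalized-least-squares fit, where for mixing matrix A and unit white noise per channel the per-component errors are sqrt(diag((A^T A)^-1)).

<code python>
import numpy as np

# Toy CMB + synchrotron error forecast for unit white noise per channel.
# Unit conversions are ignored for brevity; channel sets are illustrative.
def errors(freqs_ghz):
    nu = np.asarray(freqs_ghz, dtype=float)
    A = np.column_stack([
        np.ones_like(nu),       # CMB: flat spectrum (idealized)
        (nu / 30.0) ** -3.0,    # synchrotron: nu^-3, pivot at 30 GHz
    ])
    cov = np.linalg.inv(A.T @ A)   # GLS parameter covariance (sigma = 1)
    return np.sqrt(np.diag(cov))   # [CMB error, synchrotron error]

print(errors([30, 90, 150]))      # no low-frequency anchor
print(errors([10, 30, 90, 150]))  # one 10 GHz channel pins down synchrotron
</code>

Adding the single 10 GHz channel shrinks the synchrotron error by roughly the factor-of-27 signal boost discussed below, and improves the CMB error along with it.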

Of course, the economy argument very quickly leads to a desire for extreme frequencies. On the low-frequency side in polarization, which is dominated by synchrotron with a frequency scaling of nu^{-3}, a single 10 GHz detector will see (10/30)^{-3} = 27 times the signal level of a 30 GHz detector. All of that goes linearly into the S/N budget, directly increasing the total S/N by a factor of 27. In other words, a single 10 GHz detector has the same effective S/N as 27^2 = 729 detectors at 30 GHz, since Gaussian noise adds in quadrature, and increasing the detector count affects N, not S. Of course, one 30 GHz detector requires 1/9 of the focal plane area of a 10 GHz detector, but you still need 729/9 = 81 times the focal plane area with a 30 GHz system to achieve the same effective S/N – which is a *lot* more expensive.
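
The arithmetic can be checked in a few lines (a back-of-the-envelope restatement of the scalings above, not an instrument model):

<code python>
# Synchrotron S/N scaling between a 10 GHz and a 30 GHz channel.
nu_lo, nu_hi = 10.0, 30.0  # GHz

signal_ratio = (nu_lo / nu_hi) ** -3            # 27: synchrotron is 27x brighter at 10 GHz
n_det_equiv = signal_ratio ** 2                 # 729: Gaussian noise averages down as sqrt(N)
feed_area_30 = (nu_lo / nu_hi) ** 2             # 1/9: a 30 GHz feed needs 1/9 the area
focal_plane_ratio = n_det_equiv * feed_area_30  # 81: relative focal plane area at 30 GHz

print(f"{signal_ratio:.0f} {n_det_equiv:.0f} {focal_plane_ratio:.0f}")  # 27 729 81
</code>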

So, the point here really is just that the nu^{-3} scaling makes it really economical to achieve high S/N by going to lower frequencies. But where is the actual cut-off? Impossible to say without knowing more about the actual instrument design and the precise physics. But we don't have any reason to think that something very strange happens to the synchrotron spectrum above 10 GHz at least, so if you can fit a 10 GHz detector into your focal plane without requiring a massive redesign, you very likely want to do so, yes. A system with, say, one 10 GHz detector and two 15 GHz detectors would be absolutely brilliant.

The same arguments apply to the high-frequency tail: You want to go as high as possible, without detector issues or thermal dust modelling killing you. Higher than 1 THz would be great. However, on that side there is another issue to keep in mind, and that is the super-exponential drop-off of the CMB spectrum above 400 GHz. The single most important thing is to exploit that, which reduces the condition number of the mixing matrix dramatically. (Essentially, it becomes easier to distinguish between CMB and dust.) So one gains very quickly by going from 400 to 500 GHz, but only more slowly after that, since one then only reduces the conditional dust uncertainty. That's definitely useful – but less useful than breaking the degeneracy between CMB and dust.
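
A quick numerical illustration of that drop-off (a sketch with assumed, Planck-like dust parameters – beta_d = 1.55, T_d = 19.6 K – not fitted values):

<code python>
import numpy as np

h_over_k = 4.799e-11                        # h/k_B in s*K
T_cmb, T_dust, beta_d = 2.725, 19.6, 1.55   # dust parameters are assumed

def cmb_dBdT(nu_ghz):
    """Shape of the CMB anisotropy spectrum dB/dT (arbitrary normalization)."""
    x = h_over_k * nu_ghz * 1e9 / T_cmb
    return x**4 * np.exp(x) / np.expm1(x)**2

def dust_mbb(nu_ghz):
    """Modified-blackbody dust SED (arbitrary normalization)."""
    x = h_over_k * nu_ghz * 1e9 / T_dust
    return nu_ghz**(beta_d + 3.0) / np.expm1(x)

ref = cmb_dBdT(400.0) / dust_mbb(400.0)
for nu in (400.0, 500.0, 700.0, 1000.0):
    r = (cmb_dBdT(nu) / dust_mbb(nu)) / ref
    print(f"{nu:6.0f} GHz: CMB/dust contrast = {r:.3f} x its 400 GHz value")
</code>

The contrast collapses quickly above 400 GHz, which is exactly why a channel out there acts as a nearly CMB-free dust monitor.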

So, an ideal instrument would be something like 10 to 1000 GHz, as one quickly gains S/N by going to the extreme edges. On the opposite side, you definitely do not want to go narrower than 30 to 500 GHz, as then the mixing matrix condition number quickly blows up, and the confusion penalty becomes large. But, really, this is all just a matter of economy: One *can* do this with 60-250 GHz. It just becomes really, really expensive to fit in the millions of detectors required to achieve the necessary S/N, when a couple of extreme-frequency detectors do the same job.
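
The conditioning claim is easy to illustrate numerically (again a toy sketch – three idealized SED shapes and illustrative channel sets, not a proposed design):

<code python>
import numpy as np

h_over_k = 4.799e-11  # h/k_B in s*K

def cmb(nu):    # dB/dT shape of CMB anisotropies, intensity units
    x = h_over_k * nu * 1e9 / 2.725
    return x**4 * np.exp(x) / np.expm1(x)**2

def sync(nu):   # beta_s = -3 in RJ temperature, i.e. nu^-1 in intensity
    return nu ** -1.0

def dust(nu):   # modified blackbody with assumed beta_d = 1.55, T_d = 19.6 K
    x = h_over_k * nu * 1e9 / 19.6
    return nu ** (1.55 + 3.0) / np.expm1(x)

def cond(freqs_ghz):
    nu = np.asarray(freqs_ghz, dtype=float)
    A = np.column_stack([cmb(nu), sync(nu), dust(nu)])
    A /= np.linalg.norm(A, axis=0)   # normalize columns: compare shapes only
    return np.linalg.cond(A)

print("narrow (60-250 GHz): ", cond([60, 90, 150, 220, 250]))
print("wide  (10-1000 GHz): ", cond([10, 30, 90, 220, 500, 1000]))
</code>

The wide channel set should come out markedly better conditioned, which is the whole degeneracy-breaking argument in one number.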

So that's the selling argument we would recommend using: It's not that extreme frequencies are *required* to do the job. It's just that a wide range is much, much cheaper than a narrow-range instrument.

Thanks,

Ingunn and Hans Kristian
