Cosmology with the Square Kilometre Array

A large fraction of my time over the last 18 months has been spent working out parts of the cosmology science case for the Square Kilometre Array, a gigantic new radio telescope that will be built (mostly) across South Africa and Australia over the coming decade. It’s been in the works since the early 1990s and – after surviving the compulsory planning, political wrangling, and cost-cutting phases that all Big Science projects are subjected to – will soon be moving to the stage where metal is actually put into the ground. (Well, soon-ish – the first phase of construction is due for completion in 2023.)

Infographic: SKA will have 8x the sensitivity of LOFAR.

A detailed science case for the SKA was developed around a decade ago, but of course a lot has changed since then. There was a conference in Sicily around this time last year where preliminary updates on all sorts of scientific possibilities were presented, which were then fleshed out into more detailed chapters for the conference proceedings. While a lot of the chapters were put on arXiv in January, it’s good to see that all of them have now been published (online, for free). This is, effectively, the new SKA science book, and it’s interesting to see how it’s grown since its first incarnation.

My contribution has mostly been the stuff on using surveys of neutral hydrogen (HI) to constrain cosmological parameters. I think it’s fair to say that most cosmologists haven’t paid too much attention to the SKA in recent years, apart from those working on the Epoch of Reionisation. This is presumably because it all seemed a bit futuristic; the headline “billion galaxy” spectroscopic redshift survey – one of the original motivations for the SKA – requires Phase 2 of the array, which isn’t due to enter operation until closer to 2030. Other (smaller) large-scale structure experiments will return interesting data long before this.

Artist's impression of the SKA1-MID dish array.

We’ve recently realised that we can do a lot of competitive cosmology with Phase 1, though, using a couple of different survey methods. One option is to perform a continuum survey [pdf], which can be used to detect extremely large numbers of galaxies, albeit without the ability to measure their redshifts. By contrast, HI spectroscopic galaxy surveys rely on detecting the redshifted 21cm line in the frequency spectrum of each galaxy, which requires narrow frequency channels (and thus high sensitivity and long integration times). This is time-consuming, and Phase 1 of the SKA simply isn’t sensitive enough to detect a large enough number of galaxies this way in a reasonable amount of time.
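To make the frequency requirement concrete, here’s a minimal sketch of the relation between redshift and observed frequency for the 21cm line (plain Python; the channel-width figures in the comments are rough order-of-magnitude assumptions, not instrument specifications):

```python
# Rest-frame frequency of the HI 21cm hyperfine line.
NU_REST_MHZ = 1420.405751

def observed_freq_mhz(z):
    """Observed frequency (MHz) of the 21cm line from a galaxy at redshift z."""
    return NU_REST_MHZ / (1.0 + z)

def redshift_from_freq(nu_obs_mhz):
    """Redshift corresponding to an observed 21cm frequency (MHz)."""
    return NU_REST_MHZ / nu_obs_mhz - 1.0

# A galaxy at z = 1 emits at ~710 MHz. Its HI line is only a few
# hundred km/s wide, so resolving it needs channels of width
# dnu ~ nu * (dv / c), i.e. of order 1 MHz or finer.
print(observed_freq_mhz(1.0))    # ~710.2
print(redshift_from_freq(710.2)) # ~1.0
```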

Radio galaxy spectra also exhibit a broad, relatively smooth continuum, however, which can be integrated over a wide frequency range, thus enabling the array to see many more (and fainter) galaxies for a given survey time. Redshift information can’t be extracted, as there are no features in the spectra whose shift can be measured, meaning that one essentially sees a 2D map of the galaxies instead of the full 3D distribution. This loss of information is felt acutely for some purposes – precise constraints on the equation of state of dark energy, w(z), can’t be achieved, for example. But other questions – like whether the matter distribution violates statistical isotropy [pdf], or whether the initial conditions of the Universe were non-Gaussian – can be answered using this technique. The performance of SKA1 in these domains will be highly competitive.

Another option is to perform an intensity mapping survey. This gets around the sensitivity issue by detecting the integrated HI emission from many galaxies over a comparatively large patch of the sky. Redshift information is retained – the redshifted 21cm line is still the cause of the emission – but angular resolution is sacrificed, so that individual galaxies cannot be detected. The resulting maps are of the large-scale matter distribution as traced by the HI distribution. Since the large-scale information is what cosmologists are usually looking for (for example, the baryon acoustic scale, which is used to measure cosmological distances, is something like 10,000 times the size of an individual galaxy), the loss of small angular scales is not so severe, and so this technique can be used to precisely measure quantities like w(z). We explored the relative performance of intensity mapping surveys in a paper last year, and found that, while not quite as good as spectroscopic galaxy survey contemporaries like Euclid, an intensity mapping survey with SKA1 will still be able to put strong (and useful!) constraints on dark energy and other cosmological parameters. This is contingent on solving a number of sticky problems to do with foreground contamination and instrumental effects, however.
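As a sanity check on that “something like 10,000 times” figure, here’s the back-of-envelope arithmetic in code; the round numbers are illustrative assumptions, not survey specifications:

```python
# Comoving BAO scale (sound horizon at the drag epoch): roughly 150 Mpc.
BAO_SCALE_KPC = 150.0e3

# Typical extent of an individual galaxy's HI disk: order 10 kpc.
GALAXY_SIZE_KPC = 15.0

print(BAO_SCALE_KPC / GALAXY_SIZE_KPC)  # 10000.0
```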

The comoving volumes and redshift ranges covered by various future surveys.

The thing I’m probably most excited about, though, is the possibility of measuring the matter distribution on extremely large scales. This will let us study perturbation modes of order the cosmological horizon at relatively late times (redshifts below ~3), where a bunch of neat relativistic effects kick in. These can be used to test fundamental physics in exciting new ways – we can get new handles on inflation, dark energy, and the nature of gravity. With collaborators, I recently put out two papers on this topic – one more general forecast paper, where we look at the detectability of these effects with various survey techniques, and another where we tried to figure out how these effects would change if the theory of gravity were something other than General Relativity. To see these modes, you need an extremely large survey, over a wide redshift range and survey area – and this is just what the SKA will be able to provide, in Phase 1 as well as Phase 2. While it turns out that a photometric galaxy survey with LSST (also a prospect for ~2030) will give the best constraints on the parameters we considered, an intensity mapping survey with SKA1 isn’t far behind, and can happen much sooner.
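To get a feel for the scales involved, here’s a small sketch of the comoving Hubble radius c/H(z) in a flat ΛCDM model; the parameter values are illustrative assumptions, not the ones used in the papers:

```python
import numpy as np

C_KMS = 299792.458  # speed of light in km/s
H0 = 67.0           # Hubble constant in km/s/Mpc (assumed value)
OMEGA_M = 0.31      # matter density parameter (assumed value)

def hubble_rate(z):
    """H(z) in km/s/Mpc for a flat LambdaCDM model."""
    return H0 * np.sqrt(OMEGA_M * (1.0 + z)**3 + (1.0 - OMEGA_M))

# Horizon-scale modes are thousands of Mpc across, so only a survey
# covering a huge volume over a wide redshift range contains more
# than a handful of them.
for z in [0.0, 1.0, 3.0]:
    print(f"z = {z}: c/H(z) ~ {C_KMS / hubble_rate(z):.0f} Mpc")
```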

Cool stuff, no?

About Phil Bull

I'm a Lecturer in Cosmology at Queen Mary University of London. My research focuses on the effects of inhomogeneities on the evolution of the Universe and how we measure it. I'm also keen on stochastic processes, scientific computing, the philosophy of science, and open source stuff.

3 responses to “Cosmology with the Square Kilometre Array”

  • Phillip Helbig

    “this technique can be used to precisely measure quantities like w(z).”

    Usually, one has some simple parametrization of w(z), which of course might not be like the real form of the function at all, or one has some function from a specific model. What is the best way to get some model-independent information on w(z)?

    Or should one concentrate on other things as long as one can fit all (correct) observational data with one value (0.7 or so) of the cosmological constant (i.e. w=-1, independent of z)?

    Of course, there might be a cosmological constant and some other “dark energy” with a different equation of state, much like there might be dark matter and some modification of GR, although such situations are usually not what those who work on such non-standard theories would like to see.

    • Phil Bull

      Yes, I’ve done some work on this problem (and there was an SKA paper on model-independent constraints; see here [pdf]). I don’t think model-independent is the way to go – you can always come up with a “designer model” that will fit the data, and without some physics input you don’t have a good measure of how fine-tuned it is. In a paper from last year, we tried an alternative approach: starting with a simple but very broad class of dark energy theories, we mapped out the prior space of some observational parameters given a set of reasonable “physicality” constraints (while allowing a massive amount of freedom in the parameters of the model itself). We found that the “likely” region for those parameters – which were actually w_0 and w_a – is quite restricted. Cutting into that prior with future observations would allow firm physical conclusions to be drawn, whereas it’s less clear (to me at least) what you learn from most model-independent constraints. You can also do a rigorous Bayesian model comparison analysis if you have a model.
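      For concreteness, w_0 and w_a are the parameters of the standard CPL (Chevallier-Polarski-Linder) parametrization, w(z) = w_0 + w_a z/(1+z), in which a cosmological constant is w_0 = -1, w_a = 0. A minimal sketch (the example numbers are purely illustrative):

      ```python
      def w_cpl(z, w0=-1.0, wa=0.0):
          """CPL equation of state: w(z) = w0 + wa * z / (1 + z)."""
          return w0 + wa * z / (1.0 + z)

      print(w_cpl(0.0))             # -1.0: a cosmological constant at z = 0
      print(w_cpl(1.0, -0.9, 0.3))  # -0.75
      ```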

  • Phillip Helbig

    At the Moriond cosmology meeting in 2012, Richard Battye pointed out in his summary that the dispersion between various measurements of w, or between various measurements and -1, is much smaller than the average error bars on the measurements. In other words, people had found -1.01 +/- 0.2, -0.9 +/- 0.6, -1.0 +/- 0.4 etc.
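    Plugging those example numbers into a quick consistency check makes the point concrete (a sketch using standard inverse-variance weighting; nothing here is specific to Battye's analysis):

    ```python
    import numpy as np

    # The three example measurements of w quoted above.
    w = np.array([-1.01, -0.9, -1.0])
    sigma = np.array([0.2, 0.6, 0.4])

    # Inverse-variance weighted mean and reduced chi-square about it.
    mean = np.average(w, weights=1.0 / sigma**2)
    chi2_dof = np.sum(((w - mean) / sigma)**2) / (len(w) - 1)

    print(f"weighted mean = {mean:.2f}")  # ~ -1.00
    print(f"chi2/dof = {chi2_dof:.2f}")   # << 1: the scatter is much
                                          # smaller than the error bars
    ```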
