<h1>Algorithms for likelihood-free cosmological data analysis</h1>
<h1 id="overview">Overview</h1>
<p>The extraction of physical information from wide and deep astronomical surveys relies on statistical techniques to compare models and observations. A common scenario in cosmology is when we can generate synthetic data through forward simulations, but cannot explicitly formulate the likelihood of the model. The generative process can be extremely general (a noisy non-linear dynamical system involving an unrestricted number of latent variables) and is often computationally expensive. Likelihood-free inference (LFI) provides a framework for performing Bayesian inference in this context, by replacing likelihood calculations with data model evaluations. In its simplest form, LFI takes the form of likelihood-free rejection sampling (LFRS), which tends to be (i) extremely expensive, since many simulated data sets get rejected, and (ii) very limited in the number of parameters that can be treated.</p>
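<p>To make this baseline concrete, below is a minimal sketch of likelihood-free rejection sampling. The <code>simulator</code>, <code>prior_sample</code> and <code>distance</code> callables are hypothetical placeholders for a user-supplied black box and discrepancy measure, not part of any released code.</p>
<pre><code class="language-python">import numpy as np

def lfrs(simulator, prior_sample, distance, observed, epsilon, n_draws):
    """Likelihood-free rejection sampling: propose from the prior, simulate,
    and keep the parameters whose synthetic summaries land close to the data."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()                       # draw from the prior
        summaries = simulator(theta)                 # one black-box evaluation
        if distance(summaries, observed) < epsilon:  # compare with observations
            accepted.append(theta)                   # approximate posterior draw
    return np.array(accepted)
</code></pre>
<p>The acceptance rate collapses as <code>epsilon</code> shrinks or as the dimension grows, which is precisely the inefficiency the two approaches below address.</p>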
<p>In two recent articles, we presented methodological advances aimed at fitting cosmological data with “black-box” numerical models. Each of them addresses one of the shortcomings of LFRS. The first approach, BOLFI, is intended for specific cosmological models (with <script type="math/tex">n \lesssim 10</script> parameters) and a general exploration of parameter space. It combines Gaussian process regression of the distance between observed and simulated data with Bayesian optimization. As a result, the number of required simulations is reduced by several orders of magnitude with respect to LFRS. The second approach, SELFI, allows the inference of <script type="math/tex">n \gtrsim 100</script> parameters (as is necessary for a model-independent parametrization of theory) while assuming stronger prior constraints in parameter space. It relies on a Taylor expansion of the simulator to build an effective posterior distribution. The resulting algorithm allows LFI in much higher-dimensional settings than LFRS.</p>
<h1 id="likelihood-free-inference-of-black-box-data-models">Likelihood-free inference of black-box data models</h1>
<p>Simulator-based statistical models are usually given in terms of numerical “black-boxes”. They provide realistic predictions for artificial observations when supplied with all necessary input parameters. These consist of target parameters as well as nuisance parameters such as initial phases, noise realization, sample variance, etc. This “latent space” often has hundreds to several million dimensions. Once all input parameters are fixed, the black-box typically consists of a simulation step and a data compression step. Black-box models can be written in a hierarchical form and conveniently represented graphically (figure 1).</p>
<p class="figure"><img src="/assets/posts/lfi/black-box_bhm.png" alt="Hierarchical representation of a black-box data model" />
<em>Hierarchical representation of a typical black-box data model. The rounded green boxes represent probability distributions and the purple squares represent deterministic functions. For more details, see figure 1 in Leclercq et al. 2019.<sup id="fnref:3"><a href="#fn:3" class="footnote">1</a></sup></em></p>
<p>The goal of LFI is to find suitable approximations that allow an estimation of the probability distribution of target parameters conditional on observed data summaries, using only black-box evaluations.</p>
<h1 id="bolfi-bayesian-optimization-for-likelihood-free-inference">BOLFI: Bayesian Optimization for Likelihood-Free Inference</h1>
<p>BOLFI (Bayesian Optimization for Likelihood-Free Inference<sup id="fnref:1"><a href="#fn:1" class="footnote">2</a></sup><sup id="fnref:2"><a href="#fn:2" class="footnote">3</a></sup>) is a cutting-edge machine learning algorithm for LFI under the constraint of a very limited simulation budget (typically a few thousand), suitable when the problem has a sufficiently small number of target parameters (<script type="math/tex">n \lesssim 10</script>). Conventional approaches such as LFRS generally require too many simulations, due to their lack of knowledge about how the parameters affect the distance between observed and simulated data. As a response, BOLFI combines Gaussian process regression of this distance to build a surrogate surface with Bayesian Optimization to actively acquire training data (figure 2).</p>
<p class="figure wide"><img src="/assets/posts/lfi/bayesian_optimization.png" alt="Bayesian optimization" />
<em>Illustration of four consecutive steps of Bayesian optimization to learn a test function. For each step, the top panel shows the training data points (red dots) and the Gaussian process regression (blue line and shaded region). The bottom panel shows the acquisition function (solid green line). The next acquisition point, i.e. where to run a simulation to be added to the training set, is shown in orange. For more details, see figure 4 in Leclercq 2018.<sup id="fnref:2:1"><a href="#fn:2" class="footnote">3</a></sup></em></p>
<p>The target parameter space is explored efficiently and in all generality. We extended the method to use the optimal acquisition function for the purpose of minimizing the expected uncertainty in the approximate posterior density, in the parametric approach to likelihood approximation. As a result, the number of required simulations is typically reduced by two to three orders of magnitude, and the proposed acquisition function produces more accurate posterior approximations, as compared to LFRS.</p>
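<p>The following one-dimensional toy sketch illustrates the general regression-plus-acquisition loop using scikit-learn and a standard lower-confidence-bound rule; it is not the optimal acquisition function derived in the paper, and <code>simulate_distance</code> is a hypothetical stand-in for one black-box run followed by a distance computation.</p>
<pre><code class="language-python">import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def bolfi_style_loop(simulate_distance, bounds, n_init=10, n_acq=90, kappa=2.0):
    """Gaussian-process-regress the distance, then repeatedly acquire new
    simulations where the lower confidence bound mu - kappa*sigma is smallest."""
    rng = np.random.default_rng(0)
    X = rng.uniform(*bounds, size=(n_init, 1))          # initial design
    y = np.array([simulate_distance(x[0]) for x in X])  # expensive simulations
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    grid = np.linspace(*bounds, 512).reshape(-1, 1)
    for _ in range(n_acq):
        gp.fit(X, y)
        mu, sigma = gp.predict(grid, return_std=True)
        x_next = grid[np.argmin(mu - kappa * sigma)]    # next acquisition point
        X = np.vstack([X, x_next])
        y = np.append(y, simulate_distance(x_next[0]))
    return gp  # surrogate of the distance, from which a posterior is built
</code></pre>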
<h1 id="selfi-simulator-expansion-for-likelihood-free-inference">SELFI: Simulator Expansion for Likelihood-Free Inference</h1>
<p>Another limitation of conventional approaches to LFI is their inability to scale with the number of target parameters. In order to address problems of high-dimensional inference from black-box data models, we introduced SELFI (Simulator Expansion for Likelihood-Free Inference<sup id="fnref:3:1"><a href="#fn:3" class="footnote">1</a></sup>). Our approach builds upon a novel effective likelihood and upon the linearization of the simulator around an expansion point in parameter space. The workload with SELFI consists of evaluating the covariance matrix and the gradient of data summaries at the expansion point (figure 3). Contrary to likelihood-based Markov Chain Monte Carlo (MCMC) techniques and to BOLFI, this workload is fixed <em>a priori</em> and perfectly parallel.</p>
<p class="figure wide"><img src="/assets/posts/lfi/covariance_gradient.png" alt="Covariance and gradient of the black-box" />
<em>Covariance matrix (left) and gradient (right) of data summaries at the expansion point, evaluated through black-box realizations only. These are the only two ingredients necessary to apply SELFI. For more details, see figures 6 and 7 in Leclercq et al. 2019.<sup id="fnref:3:2"><a href="#fn:3" class="footnote">1</a></sup></em></p>
<p>The effective posterior of the target parameters is then obtained through simple “filter equations,” the form of which is analogous to a Wiener filter. SELFI allows the solution of inference tasks from black-box data models, in much higher dimension than conventional approaches to LFI.</p>
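<p>Schematically, for a Gaussian prior with covariance <code>S</code> centred on the expansion point, the filter equations take the Wiener-like form sketched below. This is the generic linearized-Gaussian update with hypothetical array names, not the SELFI implementation itself.</p>
<pre><code class="language-python">import numpy as np

def selfi_filter(theta0, f0, grad_f, C, S, phi_obs):
    """Effective posterior for a linearized black box (schematic).

    theta0:  expansion point in parameter space      (n,)
    f0:      mean data summaries at theta0           (d,)
    grad_f:  gradient of the summaries at theta0     (d, n)
    C:       covariance of the summaries at theta0   (d, d)
    S:       Gaussian prior covariance               (n, n)
    phi_obs: observed data summaries                 (d,)
    """
    Cinv = np.linalg.inv(C)
    Gamma = np.linalg.inv(np.linalg.inv(S) + grad_f.T @ Cinv @ grad_f)
    theta_mean = theta0 + Gamma @ grad_f.T @ Cinv @ (phi_obs - f0)
    return theta_mean, Gamma  # posterior mean and covariance
</code></pre>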
<h1 id="cosmological-applications-key-results">Cosmological applications: key results</h1>
<p>In the respective papers, we presented the first applications of BOLFI and SELFI to cosmological data analysis.</p>
<h2 id="supernova-cosmology-with-bolfi">Supernova cosmology with BOLFI</h2>
<p>We applied BOLFI to the inference of cosmological parameters from the Joint Lightcurve Analysis (JLA) supernovae data. The model contains two cosmological parameters (the matter density of the Universe <script type="math/tex">\Omega_m</script> and the equation of state of dark energy <script type="math/tex">w</script>) and four nuisance parameters, which are marginalized over. The posterior contours obtained with MCMC, LFRS, and BOLFI are represented in figure 4.</p>
<p class="figure wide"><img src="/assets/posts/lfi/bolfi_jla.png" alt="Supernova cosmology with BOLFI" />
<em>Prior and posterior distributions for the joint inference of the matter density of the Universe, <script type="math/tex">\Omega_m</script>, and the dark energy equation of state, <script type="math/tex">w</script>, from the JLA supernovae data set. BOLFI (red posterior) reduces the number of necessary simulations by two orders of magnitude with respect to LFRS (green posterior) and three orders of magnitude with respect to MCMC (orange posterior). For more details, see figure 7 in Leclercq 2018.<sup id="fnref:2:2"><a href="#fn:2" class="footnote">3</a></sup></em></p>
<p>As can be observed, BOLFI is able to precisely recover the true posterior with as few as 6,000 simulations, which constitutes a reduction by two orders of magnitude with respect to LFRS and three orders of magnitude with respect to MCMC. This reduction in the number of required simulations accelerates the inference massively.</p>
<h2 id="primordial-power-spectrum-and-cosmological-parameters-inference-with-selfi">Primordial power spectrum and cosmological parameters inference with SELFI</h2>
<p>We applied SELFI to a realistic synthetic galaxy survey, with a data model accounting for physical structure formation and incomplete and noisy observations. This data model is provided by the publicly-available <strong>Simbelmynë</strong> code, a hierarchical probabilistic simulator of galaxy survey data.<sup id="fnref:4"><a href="#fn:4" class="footnote">4</a></sup> Through this application, we showed that the use of non-linear numerical models allows the galaxy power spectrum to be fitted up to at least <script type="math/tex">k_\mathrm{max} = 0.5~h/\mathrm{Mpc}</script>, which represents an increase by a factor of <script type="math/tex">\sim~5</script> in the number of modes used, with respect to traditional techniques. The result is an unbiased inference of the primordial power spectrum (living in <script type="math/tex">n =100</script> dimensions) across the entire range of scales considered, including a high-fidelity reconstruction of baryon acoustic oscillations (figure 5).</p>
<p class="figure wide"><img src="/assets/posts/lfi/selfi_power_spectrum.png" alt="Primordial power spectrum reconstruction with SELFI" />
<em>Primordial power spectrum inference with SELFI from a realistic synthetic galaxy survey. In spite of survey complications which limit the information captured, the inference is unbiased and the signature of baryon acoustic oscillations is well reconstructed up to <script type="math/tex">k \approx 0.3~h/\mathrm{Mpc}</script>, with 5 inferred acoustic peaks, a result which could be improved using more volume (this analysis uses <script type="math/tex">(1~\mathrm{Gpc}/h)^3</script>). For more details, see figure 10 in Leclercq et al. 2019.<sup id="fnref:3:3"><a href="#fn:3" class="footnote">1</a></sup></em></p>
<p>The primordial power spectrum can be seen as a largely agnostic and model-independent parametrization of theory, relying only on weak assumptions (isotropy and Gaussianity). Using the linearized black-box, it can be easily translated <em>a posteriori</em> to constraints on specific cosmological models without (or with minimal) loss of information. For instance, constraints on the parameters of the standard cosmological model, for two different synthetic data realizations (with different input cosmologies, phase and noise realizations), are shown in figure 6.</p>
<p class="figure wide"><img src="/assets/posts/lfi/selfi_cosmology.png" alt="Inference of cosmological parameters with SELFI" />
<em>Cosmological parameter inference using a linearized black-box model of galaxy surveys. The prior is shown in blue, and the effective posteriors for two different data realizations are shown in red and green.</em></p>
<p>We therefore obtain an unbiased and robust measurement of cosmological parameters.</p>
<div class="footnotes">
<ol>
<li id="fn:3">
<p>F. Leclercq, W. Enzi, J. Jasche & A. Heavens 2019, <em>Primordial power spectrum and cosmology from black-box galaxy surveys</em>, submitted to MNRAS, <a href="https://arxiv.org/pdf/1902.10149">arxiv 1902.10149</a> <img class="inline-logo" src="/assets/images/arxiv.png" alt="arxiv" /> <a href="#fnref:3" class="reversefootnote">↩</a> <a href="#fnref:3:1" class="reversefootnote">↩<sup>2</sup></a> <a href="#fnref:3:2" class="reversefootnote">↩<sup>3</sup></a> <a href="#fnref:3:3" class="reversefootnote">↩<sup>4</sup></a></p>
</li>
<li id="fn:1">
<p>M. U. Gutmann & J. Corander 2016, <em>Bayesian Optimization for Likelihood-Free Inference of Simulator-Based Statistical Models</em>, Journal of Machine Learning Research <strong>17</strong>, 1 (2016), <a href="https://arxiv.org/pdf/1501.03291">arxiv 1501.03291</a> <img class="inline-logo" src="/assets/images/arxiv.png" alt="arxiv" /> <a href="#fnref:1" class="reversefootnote">↩</a></p>
</li>
<li id="fn:2">
<p>F. Leclercq 2018, <em>Bayesian optimisation for likelihood-free cosmological inference</em>, Physical Review D <strong>98</strong>, 063511 (2018), <a href="https://arxiv.org/pdf/1805.07152">arxiv 1805.07152</a> <img class="inline-logo" src="/assets/images/arxiv.png" alt="arxiv" /> <a href="#fnref:2" class="reversefootnote">↩</a> <a href="#fnref:2:1" class="reversefootnote">↩<sup>2</sup></a> <a href="#fnref:2:2" class="reversefootnote">↩<sup>3</sup></a></p>
</li>
<li id="fn:4">
<p>The Simbelmynë code: <a href="http://simbelmyne.florent-leclercq.eu">homepage</a> <a href="#fnref:4" class="reversefootnote">↩</a></p>
</li>
</ol>
</div>
<h1>Painting halos from 3D dark matter fields</h1>
<h1 id="overview">Overview</h1>
<p>Investigating the formation and evolution of dark matter halos, as the key building blocks of cosmic
large-scale structure, is essential for constraining various cosmological models and further
understanding our Universe. The highly non-linear dynamics involved nevertheless renders this a
complex problem, with computationally costly simulations of gravitational structure formation
currently the only tool to compute the non-linear evolution from initial conditions, yielding mock
dark matter halo catalogues as the main output. However, repeatedly running very large pure dark
matter simulations to generate mock observations of the full Universe is not feasible, as it
requires a prohibitive amount of memory and disk storage. A way to emulate such simulations,
quickly and reliably, would be of use to a wide community as a new method for data analysis and
light cone production for the next cosmological survey missions such as Euclid and the Large
Synoptic Survey Telescope (LSST). In this
context, we employ a deep learning approach to construct an emulator to learn the mapping from dark
matter density to halo fields.</p>
<h1 id="halo-painting-network">Halo painting network</h1>
<p>Our physical mapping network is inspired by a recently proposed variant of generative models, known
as generative adversarial networks (GANs). In particular, we use the key ideas in training WGANs,
i.e. GANs optimized using the Wasserstein distance, to ensure that our network is able to paint halos
well. A schematic of this Wasserstein mapping framework is provided in Fig. 1. Our generator is the
halo painting network whose role is to learn the underlying non-linear relationship between the input
3D density field and the corresponding halo count distribution. Our critic provides as output the
approximately learned Wasserstein distance between the real and predicted halo distributions.
Intuitively, this Wasserstein distance can be interpreted as the amount of work required to transform
a given probability distribution into the desired target distribution. This distance therefore
corresponds to the loss function that must be minimized to train the halo painting network.</p>
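<p>In pseudo-PyTorch, the two objectives can be sketched as follows; <code>critic</code> and <code>generator</code> stand for arbitrary torch modules, and the published setup additionally includes stabilising ingredients (such as a gradient penalty enforcing the critic's Lipschitz constraint) that are omitted here.</p>
<pre><code class="language-python">import torch

def wasserstein_losses(critic, generator, density, real_halos):
    """Critic and generator objectives for Wasserstein training (a sketch);
    in practice the two losses are optimized in alternating steps."""
    fake_halos = generator(density)
    # The critic's mean score gap approximates the Wasserstein distance
    # between the real and painted halo-field distributions.
    w_estimate = critic(real_halos).mean() - critic(fake_halos.detach()).mean()
    critic_loss = -w_estimate                    # critic ascends the estimate
    generator_loss = -critic(fake_halos).mean()  # generator descends it
    return critic_loss, generator_loss
</code></pre>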
<p class="figure wide"><img src="/assets/posts/halo_painting/WGN_schematic.jpg" alt="Schematic representation of Wasserstein halo painting network" />
<em>Schematic representation of Wasserstein halo painting network implemented in this work.
The role of the generator is to learn the underlying non-linear relationship between the
input 3D density field and the corresponding halo count distribution. The difference
between the output of the critic for the real and predicted halo distributions is the
approximately learnt Wasserstein distance and is used as the loss function which must be
minimized to train the generator.</em></p>
<h1 id="remarkable-performance-of-halo-painting-emulator">Remarkable performance of halo painting emulator</h1>
<p>We showcased the performance of our halo painting model using quantitative diagnostics. As a
preliminary qualitative assessment, we performed a visual comparison. Fig. 2 depicts the reference
and predicted halo distributions. The qualitative agreement is impressive, implying that the halo
painting network is
capable of mapping the complex structures of the cosmic web, such as halos, filaments and voids, to
the corresponding distribution of halo counts.</p>
<p class="figure wide"><img src="/assets/posts/halo_painting/visual_comparison_N500.jpg" alt="Visual comparison" />
<em>Prediction of 3D halo field by our halo painting model for a slice of depth <script type="math/tex">\sim 100h^{-1}</script> Mpc
and side length of <script type="math/tex">\sim2000h^{-1}</script> Mpc. A blind validation dataset is shown in the top right
panel, with the predicted halo count depicted below it. The corresponding second order Lagrangian Perturbation Theory (2LPT) density field is
displayed in the top left panel, with the difference between the reference and predicted halo
distributions depicted in the lower left panel. A visual comparison of the reference and predicted
halo count distributions indicates qualitatively the efficacy of our halo painting network.</em></p>
<h2 id="power-spectrum">Power spectrum</h2>
<p>As a quantitative assessment, the standard practice in cosmology is to use summary statistics.
These summary statistics provide a reliable metric to evaluate our halo painting network in
terms of their capacity to encode essential information. Assuming the cosmological density field
is approximately a Gaussian random field, as is the case on the large scales or at earlier times,
the power spectrum provides a sufficient description of the field. We therefore demonstrated
the capability of our network in reproducing the power spectrum of the reference halos. The left
panel of Fig. 3 illustrates the extremely close agreement of the 3D power spectra of the reference
and predicted halo fields.</p>
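<p>For reference, a standard FFT estimator of the isotropically binned power spectrum of a periodic box can be sketched as below (assuming an overdensity field on a regular grid; bins left empty yield NaN in this simple version). The transfer function discussed next is then simply <code>np.sqrt(pk_pred / pk_ref)</code>.</p>
<pre><code class="language-python">import numpy as np

def power_spectrum(field, box_size, n_bins=32):
    """Isotropically binned power spectrum of a 3D field in a periodic box."""
    n = field.shape[0]
    dk = np.fft.rfftn(field)
    pk3d = np.abs(dk) ** 2 * box_size**3 / n**6         # P(k) normalization
    kf = 2 * np.pi / box_size                           # fundamental mode
    kx = np.fft.fftfreq(n, d=1.0 / n) * kf              # wavenumbers per axis
    kz = np.fft.rfftfreq(n, d=1.0 / n) * kf
    kmag = np.sqrt(kx[:, None, None]**2 + kx[None, :, None]**2
                   + kz[None, None, :]**2)
    bins = np.linspace(kf, kmag.max(), n_bins + 1)
    which = np.digitize(kmag.ravel(), bins)
    pk = np.array([pk3d.ravel()[which == i].mean()
                   for i in range(1, n_bins + 1)])
    return 0.5 * (bins[1:] + bins[:-1]), pk             # bin centres, P(k)
</code></pre>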
<p>We investigated the influence of the fiducial cosmology adopted for the simulations on the efficacy
of our halo mapping model. In the right panel of Fig. 3, we show the network predictions for two
cosmology variants in terms of their respective transfer functions, i.e. the square root of
the ratio of the predicted to reference power spectra. The corresponding transfer
functions show a deviation of about <script type="math/tex">10\%</script> from the reference power spectra of their respective
real halo distributions on the smallest and largest scales. This shows that our halo painting model
is slightly sensitive to the underlying cosmology at the level of the power spectrum.</p>
<p class="figure wide"><img src="/assets/posts/halo_painting/Pk_cosmo_variation.jpg" alt="3D power spectra of reference and predicted halo fields" />
<em>Left panel: Summary statistics of the 3D power spectra of the reference and predicted halo fields
for one thousand randomly selected patches. The solid lines indicate their respective means, while
the shaded regions indicate their respective <script type="math/tex">1\sigma</script> confidence regions, i.e. 68% probability
volume. The above diagnostics demonstrate the ability of our halo painting model to reproduce the
characteristic statistics of the reference halo fields and therefore provide substantial
quantitative evidence for the performance of our neural network in mapping 3D density fields to
their corresponding halo distributions. Right panel: The corresponding transfer functions highlight
the consistency between the power spectra reconstructed from the predicted and real halo fields for
the three cosmology variants, with the deviation from their respective reference spectra being below
<script type="math/tex">10\%</script>.</em></p>
<h2 id="bispectrum">Bispectrum</h2>
<p>The non-linear dynamics involved in gravitational evolution of cosmic structures contributes to a
certain degree of non-Gaussianity of the cosmic density field on the small scales. Higher-order
statistics are therefore required to characterize this non-Gaussian field. We used the bispectrum
to quantify the spatial distribution of the density and halo fields. The bispectra reconstructed
from the second order Lagrangian Perturbation Theory (2LPT), reference and predicted halo fields are displayed in Fig. 4. In particular, we show
the bispectra for given small- and large-scale configurations. The 2LPT halo field corresponds
to a statistical description of the halo distribution, derived from the 2LPT density field, which
is valid, by construction, at the level of two-point statistics and on large scales. This allows
us to make a fair comparison between the clustering of the respective halo fields. The left panels
of Fig. 4 demonstrate that our halo painting network reproduces the non-linear halo field both on
the small and large scales, and is therefore capable of mapping the complex cosmic structures
apparent in the reference halo field. Our network predictions also show a significant improvement
over the corresponding 2LPT halo fields. In the right panels of Fig. 4, we find that there is a
more significant dependence of our network on the fiducial cosmology at the level of higher-order statistics.</p>
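<p>A minimal FFT estimator for a single triangle configuration can be sketched as follows: each wavevector bin is isolated with a spherical-shell filter, and the three filtered fields are multiplied in configuration space and averaged over the box. This version is unnormalized (a full estimator divides by the number of fundamental triangles in the configuration) and purely illustrative.</p>
<pre><code class="language-python">import numpy as np

def k_grid(n, box_size):
    """|k| on the rfftn grid of an n^3 periodic box."""
    kf = 2 * np.pi / box_size
    kx = np.fft.fftfreq(n, d=1.0 / n) * kf
    kz = np.fft.rfftfreq(n, d=1.0 / n) * kf
    return np.sqrt(kx[:, None, None]**2 + kx[None, :, None]**2
                   + kz[None, None, :]**2)

def bispectrum(field, box_size, k1, k2, k3, dk):
    """Unnormalized bispectrum estimate for the triangle (k1, k2, k3)."""
    kmag = k_grid(field.shape[0], box_size)
    fk = np.fft.rfftn(field)
    def shell(kc):
        # keep only the Fourier modes in a shell of half-width dk around kc
        return np.fft.irfftn(np.where(np.abs(kmag - kc) < dk, fk, 0),
                             s=field.shape)
    return (shell(k1) * shell(k2) * shell(k3)).mean()
</code></pre>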
<p class="figure wide"><img src="/assets/posts/halo_painting/bispectrum_cosmo_variation.jpg" alt="3D bispectra of reference and predicted halo fields" />
<em>Left panels: Summary statistics of the 3D bispectra of the 2LPT, reference and predicted halo
fields for given small- and large-scale configurations, as indicated by their respective titles.
In both cases, there is a close agreement between the bispectra from the reference and predicted
halo distributions. Our network predictions are a significant improvement over the corresponding
2LPT halo fields. Right panels: Deviation from the 3D bispectra of the reference halo distributions
of the corresponding predictions for the two cosmology variants. The above bispectrum diagnostics
show that our network is more sensitive to the fiducial cosmology than at the level of the power spectrum.
The <script type="math/tex">1\sigma</script> confidence regions for five hundred randomly selected patches are depicted in each panel.</em></p>
<h1 id="key-advantages">Key advantages</h1>
<ul>
<li>Extremely efficient once trained. Our emulator is capable of rapidly predicting the halo
distribution from a computationally cheap cosmic density field. For instance, the network
prediction for a <script type="math/tex">256^3</script> simulation requires roughly one second on an NVIDIA Quadro P6000.</li>
<li>Can predict the 3D halo distribution for any arbitrary simulation box size. A large simulation box,
therefore, does not require tiling of smaller sub-elements. More importantly, this implies that our
neural network can be trained on smaller simulations and subsequently used to predict large halo
distributions.</li>
<li>Encodes mass information of halos, such that our method can predict the mass distribution of halos.</li>
<li>Allows us to bypass ad hoc galaxy bias models and work in terms of better understood models.</li>
</ul>
<h1 id="potential-applications">Potential applications</h1>
<ul>
<li>Fast generation of mock halo catalogues and light cone production. This would be useful for the data
analysis of upcoming large galaxy surveys of unprecedented sizes.</li>
<li>To fill in small-scale structure at a high resolution from low resolution large-scale simulations.</li>
<li>As a component in Bayesian forward modelling techniques for large-scale structure inference (cf. BORG)
or cosmological parameter inference (cf. ALTAIR) to accelerate the scientific process, rendering
detailed and high-resolution analyses feasible. This would provide statistically interpretable results,
while maintaining the scientific rigour.</li>
</ul>
<h1 id="references">References</h1>
<ul>
<li>D. Kodi Ramanah, T. Charnock & G. Lavaux, 2019, submitted to PRD, <a href="https://arxiv.org/pdf/1903.10524">arxiv 1903.10524</a> <img class="inline-logo" src="/assets/images/arxiv.png" alt="arxiv" /></li>
<li>A notebook tutorial to paint the halos of the article: <a href="https://nbviewer.jupyter.org/github/doogesh/halo_painting/blob/master/wasserstein_halo_mapping_network.ipynb">notebook</a></li>
<li>Source code repository: <a href="https://github.com/doogesh/halo_painting">https://github.com/doogesh/halo_painting</a></li>
</ul>
<h1>Bayesian treatment of unknown foregrounds</h1>
<h1 id="overview">Overview</h1>
<p>To probe the Universe on the cosmological scales, we employ large galaxy redshift
catalogues which encode the spatial distribution of galaxies. However, these galaxy
surveys are contaminated by various effects, such as the contamination from dust,
stars and the atmosphere, commonly referred to as foregrounds. Conventional methods
for the treatment of such contaminations rely on a sufficiently precise estimate of
the map of expected foreground contaminants to account for them in the statistical
analysis. Such approaches exploit the fact that the sources and mechanisms involved
in the generation of these contaminants are well-known.</p>
<p>But how can we ensure robust cosmological inference from galaxy surveys if we are
facing as yet unknown foreground contaminations? In particular, the next generation
of surveys (e.g. <a href="https://www.euclid-ec.org/">Euclid</a>, <a href="https://www.lsst.org/">LSST</a>)
will not be limited by noise but by such systematic effects. We propose a novel
likelihood<sup id="fnref:K"><a href="#fn:K" class="footnote">1</a></sup> which accurately accounts for and corrects the effects of unknown foreground
contaminations. Robust likelihood approaches, as presented below, have a potentially
crucial role in optimizing the scientific returns of state-of-the-art surveys.</p>
<h1 id="robust-likelihood">Robust likelihood</h1>
<p>The underlying conceptual framework of our novel likelihood relies on the
marginalization of the unknown large-scale foreground contamination amplitudes. To
this end, we need to label voxels having the same foreground modulation and this is
encoded via a colour indexing scheme that groups the voxels into a collection of
angular patches. This requires the construction of a sky map which is divided into
regions of a given angular scale, with each region denoted by a specific colour, as
illustrated in Fig. 1 (a). The corresponding representation on a 3D grid results in
a 3D distribution of patches, with a given slice of the coloured grid depicted
in Fig. 1 (b). The collection of voxels belonging to a particular patch is employed
in the computation of the robust likelihood.</p>
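<p>As an illustration, the colouring step can be sketched with a HEALPix pixelisation: each voxel is labelled by the angular pixel its line of sight from the observer falls into. The grid geometry and <code>nside</code> below are illustrative choices (and the observer is assumed not to sit exactly on a voxel centre), not the values used in the paper.</p>
<pre><code class="language-python">import numpy as np
import healpy as hp

def colour_voxels(n, box_size, observer, nside=4):
    """Label each voxel of an n^3 grid by the HEALPix pixel (the 'colour')
    that its line of sight from the observer falls into."""
    ax = (np.arange(n) + 0.5) * box_size / n       # voxel centres
    x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
    dx, dy, dz = x - observer[0], y - observer[1], z - observer[2]
    return hp.vec2pix(nside, dx, dy, dz)           # patch label per voxel
</code></pre>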
<p>Our proposed data model is conceptually straightforward and provides a maximally
ignorant approach to deal with unknown systematics, with the colouring scheme being
independent of any prior foreground information. As such, the numerical implementation
of our novel likelihood is generic and does not require any adjustments to the other
components in the forward modelling framework of BORG (Bayesian Origin Reconstruction
from Galaxies) for the inference of non-linear cosmic structures.</p>
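<p>To convey the idea behind the likelihood itself, here is a schematic numerical version: a Poisson likelihood whose unknown multiplicative foreground amplitude is marginalized over a flat prior independently in each colour patch. The amplitude grid and prior range are illustrative, and the marginalization in the paper differs in detail.</p>
<pre><code class="language-python">import numpy as np
from scipy.special import gammaln, logsumexp

def robust_log_likelihood(counts, rate, colours,
                          alphas=np.linspace(0.2, 2.0, 64)):
    """Patch-wise marginalization of an unknown foreground amplitude alpha.
    `rate` holds the predicted (strictly positive) galaxy counts per voxel."""
    logL = 0.0
    for c in np.unique(colours):
        sel = colours == c
        n, lam = counts[sel], rate[sel]
        # patch log-likelihood for each candidate amplitude alpha
        ll = np.array([np.sum(n * np.log(a * lam) - a * lam - gammaln(n + 1))
                       for a in alphas])
        logL += logsumexp(ll) - np.log(len(alphas))  # flat-prior average
    return logL
</code></pre>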
<p class="figure wide"><img src="/assets/posts/robust/colours.jpg" alt="Colour indexing scheme on the sphere" />
<em>(a) Schematic to illustrate the colour indexing of the survey elements. Colours are
assigned to patches of a given angular scale. (b) Slice through the 3D coloured box
resulting from the extrusion of the colour indexing scheme on the left panel onto a
3D grid. This collection of coloured patches is subsequently employed in the
computation of the robust likelihood.</em></p>
<h1 id="comparison-with-a-standard-poissonian-likelihood-analysis">Comparison with a standard Poissonian likelihood analysis</h1>
<p>We showcased the application of our robust likelihood to a mock data set with
significant foreground contaminations and evaluated its performance via a comparison
with an analysis employing a standard Poissonian likelihood, as typically used in
modern large-scale structure analyses. The results illustrated below clearly
demonstrate the efficacy of our proposed likelihood in robustly dealing with unknown
foreground contaminations for the inference of non-linearly evolved dark matter
density fields and the underlying cosmological power spectra from deep galaxy
redshift surveys.</p>
<h2 id="inferred-dark-matter-density-fields">Inferred dark matter density fields</h2>
<p>We first study the impact of the large-scale contamination on the inferred non-linearly
evolved density field. We compare the ensemble mean density fields and
corresponding standard deviations for the two Markov chains obtained using BORG with
the Poissonian and novel likelihoods, respectively, illustrated in the top and bottom
panels of Fig. 2, for a particular slice of the 3D density field. As can be deduced from
the top left panel of Fig. 2, the standard Poissonian analysis results in spurious
effects in the density field, particularly close to the boundaries of the survey since
these are the regions that are the most affected by the dust contamination. In contrast,
our novel likelihood analysis yields a homogeneous density distribution through the
entire observed domain, with the filamentary nature of the present-day density field
clearly seen. From this visual comparison, it is evident that our novel likelihood is
more robust against unknown large-scale contaminations.</p>
<p class="figure wide"><img src="/assets/posts/robust/panels_density.png" alt="Inferred density fields" />
<em>Mean and estimated uncertainty of the non-linearly evolved density fields, computed
from the sampled realizations of the respective Markov chains obtained from both the
Poissonian (upper panels) and novel likelihood (lower panels) analyses, with the same
slice through the 3D fields being depicted. Unlike our robust data model, the standard
Poissonian analysis yields some artefacts in the reconstructed density field,
particularly near the survey boundary, where the foreground contamination is stronger.</em></p>
<h2 id="reconstructed-matter-power-spectra">Reconstructed matter power spectra</h2>
<p>From the realizations of our inferred 3D initial density field, we can reconstruct the
corresponding matter power spectra and compare them to the prior cosmological power
spectrum adopted for the mock generation. The top panels of Fig. 3 illustrate the
inferred power spectra for both likelihood analyses, with the bottom panels displaying
the ratio of the a posteriori power spectra to the prior power spectrum. While the
standard Poissonian analysis yields excessive power on the large scales due to the
artefacts in the inferred density field, the analysis with our novel likelihood allows
us to recover an unbiased power spectrum across the full range of Fourier modes.</p>
<p class="figure wide"><img src="/assets/posts/robust/Pk.jpg" alt="Reconstructed power spectra from likelihood analysis" />
<em>Reconstructed power spectra from the inferred initial conditions from the BORG analysis
for the robust likelihood (left panel) and the Poissonian likelihood (right panel).
The power spectra of the individual realizations, after the initial burn-in phase, from
the robust likelihood analysis possess the correct power across all scales considered,
demonstrating that the foregrounds have been properly accounted for. In contrast, the
standard Poissonian analysis exhibits spurious power artefacts due to the unknown
foreground contaminations, yielding excessive power on these scales.</em></p>
<div class="footnotes">
<ol>
<li id="fn:K">
<p>N. Porqueres, D. Kodi Ramanah, J. Jasche, G. Lavaux, 2018, submitted to A&A, <a href="https://arxiv.org/pdf/1812.05113">arxiv 1812.05113</a> <img class="inline-logo" src="/assets/images/arxiv.png" alt="arxiv" /> <a href="#fnref:K" class="reversefootnote">↩</a></p>
</li>
</ol>
</div>
<h1>Precision cosmology with expansion</h1>
<h1 id="overview">Overview</h1>
<p>The exploration of the Universe at large relies mostly on the use of large
galaxy surveys, i.e. compilations of the positions and optical properties of
galaxies on the sky. These surveys are either photometric, when only wide band
observations are available, or spectroscopic, for which the emission of each
galaxy has been finely measured at different wavelengths. From the luminous
properties we derive the ‘redshift’ of each galaxy, i.e. its total apparent
recession velocity.</p>
<p>Sophisticated and optimal data analysis techniques for cosmological inference
from galaxy redshift surveys are in increasing demand to cope with the present
and upcoming avalanches of cosmological data (e.g.
<a href="https://www.euclid-ec.org/">Euclid</a>, <a href="https://www.darkenergysurvey.org/">DES</a>,
<a href="https://www.desi.lbl.gov/">DESI</a>), and therefore optimize the scientific
returns of the missions. This is all the more critical as each survey brings
us closer to a full census of the galaxy distribution in our patch of the Universe.
We are thus running out of exploitable information on our Universe. In our
latest article<sup id="fnref:K"><a href="#fn:K" class="footnote">1</a></sup> (slides are also available<sup id="fnref:T"><a href="#fn:T" class="footnote">2</a></sup>), we present, for the first time, a non-linear Bayesian
inference framework to constrain cosmological parameters using a kind of
anisotropy visible in galaxy redshift surveys, via an application of the
Alcock-Paczyński (AP) test. This novel approach extracts several orders of
magnitude more information from the cosmological expansion compared to classical
approaches, to infer cosmological parameters and jointly reconstruct the
underlying 3D dark matter density field.</p>
<h1 id="alcock-paczyński-test">Alcock-Paczyński test</h1>
<div class="figure movie">
<div class="holder_video">
<video class="video-js" controls="" loop="" preload="metadata" data-setup='{"fluid": true}'><source src="/assets/posts/altair/cosmo_larger_ellipse_N256_small.mp4" type="video/mp4" /> <p class="vjs-no-js">To view this video please enable javascript, and consider upgrading to a web browser that <a href="https://videojs.com/html5-video-support/" target="_blank">supports HTML5 video</a>.</p> </video>
</div>
<em>A closed trajectory in the (<script type="math/tex">\Omega_{\mathrm{m}}</script>, <script type="math/tex">w_0</script>) plane, depicting the cosmological dependence of the cosmic expansion history for a fixed set of initial density conditions and a fixed power spectrum.</em>
</div>
<p>The Alcock-Paczyński (AP) test is a cosmological test of the expansion of the
Universe and its geometry. The main advantage is that it is independent of the
evolution of galaxies but depends only on the geometry of the Universe. The
assumption of incorrect cosmological parameters in data analysis yields
distortions in the appearance of any spherical object or isotropic statistical
distribution. The AP test provides a pathway to exploit this resulting spurious
anisotropy to constrain the cosmological parameters. In this work, we invoke the
AP test to ensure that the underlying geometrical properties of isotropy of the
Universe are maintained. As such, the key underlying assumption relies purely on
the geometrical properties of the cosmological principle.</p>
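<p>The size of the effect is easy to quantify: analysing data generated in one cosmology with another rescales apparent transverse sizes by the ratio of angular diameter distances, and apparent radial sizes by the inverse ratio of Hubble rates. The sketch below, using astropy with purely illustrative parameter values, returns the resulting anisotropy factor.</p>
<pre><code class="language-python">import numpy as np
from astropy.cosmology import wCDM

def ap_distortion(z, true=(0.31, -1.0), assumed=(0.35, -0.9), H0=70.0):
    """Ratio of the transverse to radial rescalings of apparent sizes when a
    'true' (Omega_m, w0) cosmology is analysed with an 'assumed' one."""
    c_true = wCDM(H0=H0, Om0=true[0], Ode0=1 - true[0], w0=true[1])
    c_assumed = wCDM(H0=H0, Om0=assumed[0], Ode0=1 - assumed[0], w0=assumed[1])
    transverse = (c_assumed.angular_diameter_distance(z)
                  / c_true.angular_diameter_distance(z))
    radial = c_true.H(z) / c_assumed.H(z)
    return (transverse / radial).value   # 1 means no spurious anisotropy
</code></pre>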
<h1 id="inference-machinery">Inference machinery</h1>
<p>To encode the AP test, we developed an extension to the hierarchical Bayesian
inference machinery of BORG (Bayesian Origin Reconstruction from Galaxies),
originally developed for the non-linear reconstruction of large-scale
structures. Our physical model of the non-linearly evolved density field, as
probed by galaxy surveys, employs Lagrangian perturbation theory (LPT) to
connect Gaussian initial conditions to the final density field, followed by a
coordinate transformation to obtain the redshift space representation for
comparison with data. We implement a sophisticated Hamiltonian Monte Carlo
sampler to generate realizations of 3D primordial and present-day matter
fluctuations from a non-Gaussian LPT-Poissonian density posterior given a set of
observations. Our augmented framework with cosmological applications is
designated as ALTAIR (ALcock-Paczyński consTrAIned Reconstruction).</p>
<p>The essence of this AP test can be summarized as follows: The Bayesian inference
machinery explores the various cosmological expansion histories and selects the
cosmology-dependent evolution pathways which yield isotropic correlations of the
galaxy density field in comoving coordinates, thereby constraining cosmology. In particular, we sample
the present-day values of the matter density and dark energy equation of state parameters,
i.e. <script type="math/tex">\Omega_{\mathrm{m}}</script> and <script type="math/tex">w_0</script>, respectively. The reconstruction
scheme employed in ALTAIR is depicted in Figure 2.</p>
<p class="figure wide"><img src="/assets/posts/altair/reconstruction_schematic.jpg" alt="Schematic of the reconstruction pipeline" />
<em>This schematic illustrates the reconstruction pipeline of ALTAIR. The forward
model consists of a chain of various components for the non-linear evolution
from initial conditions and the subsequent transformation from comoving to
redshift space for the application of the AP test. This consequently transforms
the initial density field into a set of predicted observables, i.e. a galaxy
distribution in redshift space, for comparison with data via a likelihood or
posterior analysis.</em></p>
<h1 id="key-results">Key results</h1>
<p>We have showcased the performance of ALTAIR on a mock galaxy catalogue that
emulates the features of the SDSS-III survey. The main aspects of our
investigation are summarized below.</p>
<h2 id="tight-cosmological-constraints">Tight cosmological constraints</h2>
<p>The marginal and joint posterior distributions for the cosmological parameters
are displayed in Figure 3, demonstrating the capability of ALTAIR to infer tight
constraints. Our AP test fully exploits the high information content from the
cosmic expansion as a result of probing a deep redshift range, where the
distortion is more pronounced.</p>
<p class="figure wide"><img src="/assets/posts/altair/seaborn_subplot_posteriors.jpg" alt="Cosmological constraints" />
<em>The marginal and joint posteriors for <script type="math/tex">\Omega_{\mathrm{m}}</script> and <script type="math/tex">w_0</script>
illustrate the potential of ALTAIR to yield tight cosmological constraints from
present and next-generation galaxy redshift surveys.</em></p>
<p>With baryon acoustic oscillations (BAOs) being a robust standard ruler, the AP
test has been utilized for the simultaneous measurement of the Hubble parameter
and angular diameter distance of distant galaxies. Therefore, as a comparison,
we depict the corresponding constraints obtained via BAO measurements from the
SDSS-III (Data Release 12) in Figure 4. These BAO constraints have not been
combined with Planck measurements, which would significantly tighten the
constraints. Nevertheless, this highlights the significant potential
constraining power of our AP test, compared to standard BAO analyses, while
being at least as robust.</p>
<p class="figure"><img src="/assets/posts/altair/error_ellipses_BAO_altair_inset.jpg" alt="Comparison of cosmological constraints from BAO measurements and our implementation of AP test" />
<em>Comparison of cosmological constraints from BAO measurements (SDSS-III, DR12)
and our implementation of AP test in ALTAIR. The ellipses denote their
respective 1-sigma confidence regions, centered on the fiducial cosmological
parameters. Note that the BAO constraints have not been combined with Planck
CMB measurements. This demonstrates the potential constraining power of our AP
test compared to standard BAO analyses, with the inset focusing on the ALTAIR
constraints where the fiducial cosmology is depicted in dashed lines.</em></p>
<h2 id="robustness-to-a-misspecified-model">Robustness to a misspecified model</h2>
<p>The main strength of our implementation of the AP test lies in its robustness to
a misspecified model and its inherent approximations, thereby near-optimally
exploiting the model predictions, without relying on its accuracy in modelling
the scale dependence of the correlations of the field.</p>
<p>We demonstrated this robustness of our AP test by employing a modified prior
power spectrum in the inference procedure. By adopting a different cosmology
(<script type="math/tex">\Omega_{\mathrm{m}} = 0.40</script> and <script type="math/tex">w_0 = -0.85</script>), we modify the shape of the
power spectrum, and subsequently apply ALTAIR on the same mock catalogue. As
shown in Figure 5, we recover the fiducial cosmological parameters employed in
the mock generation, although with uncertainties roughly 15% larger than for the
original run. This test case therefore explicitly highlights the
robustness of our implementation of the AP test to a misspecified model since it
does not optimize the information from the scale dependence of the correlations
of the density field, but rather from the isotropy of the field.</p>
<p class="figure wide"><img src="/assets/posts/altair/seaborn_subplot_posteriors_diff_Pk.jpg" alt="Cosmological constraints with modified prior" />
<em>Same as Figure 3, but employing a different prior power spectrum
(<script type="math/tex">\Omega_{\mathrm{m}} = 0.40</script> and <script type="math/tex">w_0 = -0.85</script>). By recovering the
fiducial cosmological parameters employed in the mock generation, this test
case explicitly highlights the robustness of our approach to the shape of the
prior power spectrum adopted. The corresponding uncertainties are around 15%
larger than for the original run.</em></p>
<h2 id="extremely-weak-dependence-on-galaxy-bias">Extremely weak dependence on galaxy bias</h2>
<p>The robustness of our method to model misspecification yields another key
aspect, which is that the cosmological constraints show extremely weak
dependence on the currently unresolved phenomenon of galaxy bias. This yields
two crucial advantages:</p>
<ul>
<li>
<p>This is especially interesting as the lack of a sufficient description of this
bias remains a potential limiting factor for standard approaches.</p>
</li>
<li>
<p>This also implies that our method does not depend on the absolute density
fluctuation amplitudes. This is therefore among the first methods to extract a
large amount of information from statistics other than that of direct density
contrast correlations, without relying on the power spectrum or bispectrum,
thereby providing complementary information to state-of-the-art techniques.</p>
</li>
</ul>
<div class="footnotes">
<ol>
<li id="fn:K">
<p>D. Kodi Ramanah, G. Lavaux, J. Jasche & B. D. Wandelt, 2018, submitted to A&A, <a href="https://arxiv.org/pdf/1808.07496">arxiv 1808.07496</a> <img class="inline-logo" src="/assets/images/arxiv.png" alt="arxiv" /> <a href="#fnref:K" class="reversefootnote">↩</a></p>
</li>
<li id="fn:T">
<p>Talk by Doogesh Kodi Ramanah <a href="/assets/talks/DKR_Oxford_JC2018.pdf">(slides)</a> <a href="#fnref:T" class="reversefootnote">↩</a></p>
</li>
</ol>
</div>
<h1>Fifth force on galaxy cluster scale</h1>
<h1 id="overview">Overview</h1>
<p>Although the current cosmological paradigm – <script type="math/tex">\Lambda</script>CDM – is remarkably successful at explaining a great range of observations, a number of puzzles suggest that it may need to be extended. Generic extensions introduce new fields alongside the metric tensor of General Relativity, which couple to matter and induce new interactions between objects. Called “fifth forces” because they supplement the four known fundamental forces of nature, these new interactions are the smoking guns of new physics.</p>
<p>Physicists have been searching for fifth forces in the Solar System and laboratory for several decades, placing ever tighter constraints on their strength and range. Recently, however, it has become clear that many classes of extensions to <script type="math/tex">\Lambda</script>CDM would not be expected to produce observable deviations from General Relativity in these regimes. This is due to a property of the field equations known as <em>screening</em>, which implies that the fifth force effectively decouples from matter in high density regions such as the interior of the Milky Way. To probe the fifth forces of screened theories, we therefore need tests beyond the Milky Way, in the low density environments of the Universe at large.</p>
<h1 id="cosmic-cartography">Cosmic Cartography</h1>
<p>A first step to testing screening is to identify which regions of the local Universe would be expected to be screened or unscreened in specific theories, based on the regions’ densities or gravitational field strengths. To do this, we combined the BORG-PM algorithm<sup id="fnref:BORGPM"><a href="#fn:BORGPM" class="footnote">1</a></sup> with a model of small-scale structure to reconstruct three measures of the gravitational field – Newtonian potential, acceleration and spacetime curvature – out to redshift <script type="math/tex">\sim0.05</script> <sup id="fnref:D18a"><a href="#fn:D18a" class="footnote">2</a></sup>. Figure 1 shows a slice through the Newtonian potential field: blue regions are those of weak gravitational field, which are most likely to harbour unscreened galaxies within which a fifth force is manifest. Newtonian potential is specifically relevant to the “chameleon” and “symmetron” screening mechanisms; acceleration and curvature govern the degree of screening under the “kinetic” and “Vainshtein” mechanisms respectively. Our maps – publicly available on Desmond’s <a href="https://www2.physics.ox.ac.uk/contacts/people/desmond">website</a> – provide each of these screening proxies at any point in space within <script type="math/tex">\sim 200 h^{-1}</script> Mpc.</p>
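<p>The potential proxy can be sketched numerically: given an overdensity field on a grid, the comoving Poisson equation is solved in one line of Fourier space. The function below is an illustrative reconstruction of this kind of screening-proxy map (at scale factor a = 1, with box_size in Mpc and H0 in km/s/Mpc, so the potential comes out in (km/s)^2), not the pipeline used in the paper.</p>
<pre><code class="language-python">import numpy as np

def newtonian_potential(delta, box_size, omega_m=0.3, h=0.7):
    """FFT solution of the comoving Poisson equation
    phi_k = -1.5 * Omega_m * H0^2 * delta_k / k^2 (at a = 1)."""
    n = delta.shape[0]
    kf = 2 * np.pi / box_size
    kx = np.fft.fftfreq(n, d=1.0 / n) * kf
    kz = np.fft.rfftfreq(n, d=1.0 / n) * kf
    k2 = kx[:, None, None]**2 + kx[None, :, None]**2 + kz[None, None, :]**2
    k2[0, 0, 0] = 1.0                       # avoid dividing the zero mode
    H0 = 100.0 * h                          # km/s/Mpc
    phi_k = -1.5 * omega_m * H0**2 * np.fft.rfftn(delta) / k2
    phi_k[0, 0, 0] = 0.0                    # potential defined up to a constant
    return np.fft.irfftn(phi_k, s=delta.shape)
</code></pre>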
<p class="figure"><img src="/assets/posts/fifth_force/fig1.png" alt="Gravitational potential" />
<em>Contour plot of the gravitational potential across a 300 Mpc x 300 Mpc slice of the local universe (1 Mpc = 3.26 million light-years). The Milky Way is located at x=y=0. From Desmond et al 2018(a)<sup id="fnref:D18a:1"><a href="#fn:D18a" class="footnote">2</a></sup>.</em></p>
<h1 id="searching-for-new-forces">Searching for new forces</h1>
<p class="figure"><img src="/assets/posts/fifth_force/fig2a.png" alt="Conservative analysis" />
<em>A conservative analysis of the separation of stars and gas in galaxies in different gravitational environments produces precise constraints on the strength and range of a screened or unscreened fifth force. The region above the line is excluded. From Desmond et al 2018(b)<sup id="fnref:D18b"><a href="#fn:D18b" class="footnote">3</a></sup>.</em></p>
<p>Now knowing which galaxies ought to be screened and which not, we can search for observational differences between them. These differences arise because stars in otherwise unscreened galaxies are themselves dense, and therefore self-screen. Thus while gas and dark matter interact with surrounding mass via a fifth force, the stars do not, so that the various components of galaxies fall at different rates in an external field. In particular, the stellar disk lags behind the gas disk and dark matter halo in the direction of the external fifth force field. This has two observational consequences, which we have studied in detail:</p>
<ul>
<li>
<p>An offset between the centroids of optical (stellar) and HI (gas) emission<sup id="fnref:D18b:1"><a href="#fn:D18b" class="footnote">3</a></sup> <sup id="fnref:D18c"><a href="#fn:D18c" class="footnote">4</a></sup></p>
</li>
<li>
<p>A U-shaped warp in the stellar disk, bending away from the direction of the fifth force <sup id="fnref:D18d"><a href="#fn:D18d" class="footnote">5</a></sup></p>
</li>
</ul>
<p>In both cases we achieve sensitivity to fifth forces with strength ~1% that of gravity, for ranges <script type="math/tex">\sim0.5-50</script> Mpc. Assuming highly conservative observational uncertainties we place the strongest constraints to date on fifth-force properties at the scale of galaxies and their environments, as shown in Figure 2. Using a more realistic model for observational uncertainties, the analyses provide independent yet fully-compatible evidence for a screened fifth force of range <script type="math/tex">\lambda_C \simeq 2</script> Mpc and strength <script type="math/tex">\Delta G/G_N \simeq 0.02</script> (Figure 3). This is well below the detection threshold of any previous experiment. We caution however that baryonic physics may confound this inference; we will explore this in future work, alongside devising novel probes of other types of fundamental physics with our inference framework, such as dark matter self-interactions (Pardo et al 2018, in prep).</p>
<p class="figure"><img src="/assets/posts/fifth_force/fig3.png" alt="Less conservative analysis" />
<em>A less conservative analysis suggests the action of a screened fifth force operating on scales <script type="math/tex">\sim 2</script> Mpc, shown here from the study of galactic warps. The plot shows the increase in goodness-of-fit of the model over General Relativity as a function of fifth-force range. The dashed lines show the results of analysing mock data with a fifth-force signal injected by hand. From Desmond et al 2018(d)<sup id="fnref:D18d:1"><a href="#fn:D18d" class="footnote">5</a></sup>.</em></p>
<div class="footnotes">
<ol>
<li id="fn:BORGPM">
<p>See <a href="/method/2018/07/24/borgpm.html">BORG-PM post</a> and Jasche & Lavaux, 2018, submitted to A&A, <a href="https://arxiv.org/pdf/1806.11117">1806.11117</a> <img class="inline-logo" src="/assets/images/arxiv.png" alt="arxiv" />. <a href="#fnref:BORGPM" class="reversefootnote">↩</a></p>
</li>
<li id="fn:D18a">
<p><a href="http://dx.doi.org/10.1093/mnras/stx3062">MNRAS 474, 3152-3161</a> <img class="inline-logo svg" src="/assets/images/newspaper-solid.svg" alt="journal" />, <a href="https://arxiv.org/abs/1705.02420">arXiv:1705.02420</a> <img class="inline-logo" src="/assets/images/arxiv.png" alt="arxiv" />. <a href="#fnref:D18a" class="reversefootnote">↩</a> <a href="#fnref:D18a:1" class="reversefootnote">↩<sup>2</sup></a></p>
</li>
<li id="fn:D18b">
<p>MNRAS Letters submitted, <a href="https://arxiv.org/abs/1802.07206">arXiv:1802.07206</a> <img class="inline-logo" src="/assets/images/arxiv.png" alt="arxiv" />. <a href="#fnref:D18b" class="reversefootnote">↩</a> <a href="#fnref:D18b:1" class="reversefootnote">↩<sup>2</sup></a></p>
</li>
<li id="fn:D18c">
<p>PRD submitted, <a href="https://arxiv.org/abs/1807.01482">arXiv:1807.01482</a> <img class="inline-logo" src="/assets/images/arxiv.png" alt="arxiv" />. <a href="#fnref:D18c" class="reversefootnote">↩</a></p>
</li>
<li id="fn:D18d">
<p>PRD submitted, <a href="https://arxiv.org/abs/1807.11742">arXiv:1807.11742</a> <img class="inline-logo" src="/assets/images/arxiv.png" alt="arxiv" />. <a href="#fnref:D18d" class="reversefootnote">↩</a> <a href="#fnref:D18d:1" class="reversefootnote">↩<sup>2</sup></a></p>
</li>
</ol>
</div>
<h1>The BORG Particle-Mesh model</h1>
<h1 id="overview-of-the-problem">Overview of the problem</h1>
<p>Accurate analyses of present and next-generation cosmological galaxy surveys
require new ways to handle effects of non-linear gravitational structure
formation processes in data. To address these needs we present an extension of
our previously developed algorithm for Bayesian Origin Reconstruction from
Galaxies to analyse matter clustering at non-linear scales in observations. This
is achieved by incorporating a numerical particle mesh model of gravitational
structure formation into our Bayesian inference framework.</p>
<h1 id="a-new-technology">A new technology</h1>
<p>The algorithm simultaneously infers the three-dimensional primordial matter
fluctuations from which present non-linear observations formed and provides
reconstructions of velocity fields and structure formation histories. The
physical forward modeling approach automatically accounts for the non-Gaussian
features in gravitationally evolved matter density fields and addresses the
redshift space distortion problem associated with peculiar motions of observed
galaxies. Our algorithm employs a hierarchical Bayes approach to jointly account
for various observational effects, such as unknown galaxy biases, selection
effects, and observational noise. Corresponding parameters of the data model are
marginalized out via a sophisticated Markov Chain Monte Carlo approach relying
on a combination of a multiple block sampling framework and an efficient
implementation of a Hamiltonian Monte Carlo sampler. We demonstrate the
performance of the method by applying it to the 2M++ galaxy compilation, tracing
the matter distribution of the Nearby Universe. We show accurate and detailed
inferences of the three-dimensional non-linear dark matter distribution of the
Nearby Universe. As exemplified in the case of the Coma cluster, our method
provides complementary mass estimates that are compatible with those obtained
from weak lensing and X-ray observations. For the first time, we also present a
reconstruction of the vorticity of the non-linear velocity field from
observations. In summary, our method provides plausible and very detailed
inferences of the dark matter and velocity fields of our cosmic neighbourhood.</p>
<p class="figure wide"><img src="/assets/posts/borgpm/chrono_sg.jpg" alt="Chronocosmography of the Nearby Universe" />
<em>This picture illustrates the capability to infer one plausible history of the
formation of large-scale structures. The history reads from left to right, top
to bottom. The final snapshot shows the galaxies overlaid on the inferred
density field.</em></p>
<h1 id="applications-in-cosmology">Applications in cosmology</h1>
<p>Our method has applications in all fields in cosmology, either for direct
measurements of underlying physical parameters or for comparing and correlating
with other observations of the same part of the Universe. In that work, we have only
focused on three aspects: the measurement of masses of clusters and superclusters
of galaxies, the properties of the peculiar velocity field on large scales and
the study of claimed anomalies in the density fluctuations.</p>
<h2 id="cluster-mass-measurements">Cluster mass measurements</h2>
<p>The first direct application is the measurement of the mass of clusters of galaxies. We have defined this mass in the simplest possible fashion: the total mass enclosed within a radius <script type="math/tex">r</script>, or mathematically speaking:
<script type="math/tex">M(r) = \int_0^{r} 4\pi r'^2 \rho(r')~\mathrm{d}r'\,.</script>
We use for reference the mass enclosed if the Universe’s content were strictly homogeneously distributed, or mathematically:
<script type="math/tex">M_\mathrm{mean}(r) = \frac{4\pi}{3} \rho_\mathrm{mean} r^3\,.</script></p>
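<p>On a gridded density field, this enclosed-mass profile reduces to a voxel sum. Below is a minimal sketch, with an illustrative top-hat selection of voxel centres, not the estimator used in the paper.</p>
<pre><code class="language-python">import numpy as np

def enclosed_mass_profile(rho, box_size, center, radii):
    """Mass enclosed within each radius of `center`: a simple voxel sum over
    the density grid rho (mass per unit volume) in a cubic box."""
    n = rho.shape[0]
    cell = box_size / n
    ax = (np.arange(n) + 0.5) * cell                   # voxel centres
    x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
    r = np.sqrt((x - center[0])**2 + (y - center[1])**2
                + (z - center[2])**2)
    return np.array([rho[r < ri].sum() * cell**3 for ri in radii])
</code></pre>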
<p>We showcase our estimator by focusing on one well studied object: the Coma cluster.
The performance of our estimator is given in the figure below. We clearly observe
that the measurement provided by the BORG-PM inference (solid lines and filled
regions) is compatible with the other probes considered in that study.</p>
<p class="figure"><img src="/assets/posts/borgpm/coma_mass.jpg" alt="Coma mass profile" />
<em>The above figure shows the mass profile, i.e. the mass enclosed within a given
distance of the object, derived through different methods and data of
the same cluster of galaxies: Coma. The BORG-PM method is given by the solid red
line (mean mass profile), and gray/dark gray filled regions for the 68% and 95%
limit. The other probes are given with their references and typical enclosed radius.</em></p>
<p>The advantage of our method is that this measurement can be freely reproduced for any structure within the observational boundaries. We have simply isolated a structure in the volume and asked for its mass.</p>
<h2 id="peculiar-velocity-field">Peculiar velocity field</h2>
<p>The second direct result of the analysis is the derivation of the peculiar velocity field
for the covered volume. The peculiar velocity field is notoriously difficult to get
right. Among the reasons, we find:
<ul>
<li>large scale correlations leading to high sensitivity to boundary effects</li>
<li>requirement to have an unbiased total matter density field.</li>
<li>systematic effects arising from the use of redshifts to derive the tracer positions
and their contribution to the mass density (this is the so-called Malmquist bias). The tracers
also have specific radial selection properties, yielding further systematic effects.</li>
</ul>
<p>Classic methods have mostly relied on linear perturbation theory of density fluctuations to
derive estimators of these fields. The BORG-PM method allows a self-consistent derivation of
these fields including non-linearities. This allows, for the first time, a model of
completely non-linear quantities, such as the vorticity of the velocity field.</p>
<p class="figure"><img src="/assets/posts/borgpm/pecvel.jpg" alt="Peculiar velocity field" />
<em>The inferred peculiar velocity field.</em></p>
<h1 id="the-future">The Future</h1>
<p>Some other applications are showcased in the paper (e.g. density anomalies, velocity field vorticity). We have only scratched the surface of the possibilities opened by this kind of inference. We invite the interested reader to have a closer look at the article and see recent related work, notably on the <a href="/method/observations/fifth_force.html">fifth force gravity</a> and <a href="/method/altair.html">Alcock-Paczyński</a> effects.</p>
<h1 id="references">References</h1>
<ul>
<li>J. Jasche & G. Lavaux, 2018, submitted to A&A, <a href="https://arxiv.org/pdf/1806.11117">arxiv 1806.11117</a> <img class="inline-logo" src="/assets/images/arxiv.png" alt="arxiv" /></li>
</ul>