Millennium Run
The Millennium Run, or Millennium Simulation (referring to its size), is a computer N-body simulation used in astrophysics to trace how the matter distribution of the Universe evolved over cosmic time.
Overview
A basic scientific method for testing theories in cosmology is to evaluate their consequences for the observable parts of the universe. One piece of observational evidence is the distribution of matter, including galaxies and intergalactic gas, as observed today. Light emitted by more distant matter must travel longer to reach Earth, so looking at distant objects is like looking further back in time. This means the evolution of the matter distribution over the history of the universe can also be observed directly.
The Millennium Simulation was run in 2005 by the Virgo Consortium, an international group of astrophysicists, on a supercomputer in Garching, Germany.
Since the completion of the Millennium Run simulation in 2005, a series of ever more sophisticated and higher fidelity simulations of the formation of the galaxy population have been built within its stored output and have been made publicly available over the internet. In addition to improving the treatment of the astrophysics of galaxy formation, recent versions have adjusted the parameters of the underlying cosmological model to reflect changing ideas about their precise values. To date (mid-2018) more than 950 published papers have made use of data from the Millennium Run, making it, at least by this measure, the highest impact astrophysical simulation of all time.[3]
Size of the simulation
For the first scientific results, published on June 2, 2005, the Millennium Simulation traced 2160³, or just over 10 billion, "particles". These are not particles in the particle-physics sense; each "particle" instead represents a parcel of dark matter roughly a billion times the mass of the Sun.
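The particle count above follows directly from the published figure of 2160 particles per box side; a quick sketch of the arithmetic:

```python
# Verify the "just over 10 billion" particle count: the Millennium
# Run used 2160 particles along each side of its cubic volume.
n_per_side = 2160
total = n_per_side ** 3
print(f"{total:,}")  # 10,077,696,000 -> just over 10 billion
```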
First results
The Sloan Digital Sky Survey had challenged the then-current understanding of cosmology by finding candidate supermassive black holes powering very bright quasars at large distances, meaning they formed much earlier than initially expected. By successfully producing quasars at such early times, the Millennium Simulation demonstrated that these objects do not contradict our models of the evolution of the universe.
Millennium II
In 2009, the same group ran the 'Millennium II' simulation (MS-II) on a smaller cube (about 400 million light years on a side), with the same number of particles but with each particle representing 6.9 million solar masses. This is a more difficult numerical task, since splitting the computational domain evenly between processors becomes harder once dense clumps of matter form. MS-II used 1.4 million CPU hours over 2048 cores (i.e. about a month) on the Power-6 computer at Garching; a simulation was also run with the same initial conditions and fewer particles to check that features seen in the higher-resolution run were also present at lower resolution.
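The MS-II mass resolution can be cross-checked against the original run. The Millennium particle mass used below (about 8.6 × 10⁸ solar masses/h) and the box sides (500 vs. 100 Mpc/h, a factor of five) are published values not stated in this article, quoted here as assumptions:

```python
# Cross-check: same particle number in a box 5x smaller per side
# means 5**3 = 125x less mass per particle.
M_MILLENNIUM_PARTICLE = 8.6e8   # solar masses/h (assumed published value)
BOX_SIDE_RATIO = 500 / 100      # Millennium box side / MS-II box side

ms2_particle_mass = M_MILLENNIUM_PARTICLE / BOX_SIDE_RATIO ** 3
print(f"MS-II particle mass: {ms2_particle_mass:.2e} solar masses/h")
# -> about 6.9e6, matching the 6.9 million solar masses quoted above
```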
Millennium XXL
In 2010, the 'Millennium XXL' simulation (MXXL) was performed, this time using a much larger cube (over 13 billion light years on a side) and 6720³ particles, each representing 7 billion times the mass of the Sun. The MXXL spans a cosmological volume 216 and 27,000 times the size of the Millennium and MS-II simulation boxes, respectively. The simulation was run on JUROPA, one of the top 15 supercomputers in the world in 2010. It used more than 12,000 cores for an equivalent of 300 years of CPU time, employed 30 terabytes of RAM, and generated more than 100 terabytes of data.[5] Cosmologists use the MXXL simulation to study the distribution of galaxies and dark matter halos on very large scales and how the rarest and most massive structures in the universe came about.
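The volume ratios quoted above follow from the cube of the box-side ratios. The comoving side lengths used below (3000, 500, and 100 Mpc/h for MXXL, Millennium, and MS-II) are published values not stated in this article, assumed here:

```python
# Reproduce the quoted volume ratios from the (assumed) box sides.
MXXL_BOX = 3000.0        # Mpc/h
MILLENNIUM_BOX = 500.0   # Mpc/h
MS2_BOX = 100.0          # Mpc/h

ratio_millennium = (MXXL_BOX / MILLENNIUM_BOX) ** 3  # 6^3
ratio_ms2 = (MXXL_BOX / MS2_BOX) ** 3                # 30^3
print(ratio_millennium, ratio_ms2)  # 216.0 27000.0

# MXXL particle count: 6720 per side, about 300 billion in total
print(f"{6720 ** 3:,}")  # 303,464,448,000
```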
Millennium Run Observatory
In 2012, the Millennium Run Observatory (MRObs) project was launched. The MRObs is a theoretical virtual observatory that combines detailed predictions for the dark matter (from the Millennium simulations) and for the galaxies (from semi-analytical models) with a virtual telescope to synthesize artificial observations. Astrophysicists use these virtual observations to study how the predictions of the Millennium simulations compare to the real universe, to plan future observational surveys, and to calibrate the techniques astronomers use to analyze real observations. A first set of virtual observations produced by the MRObs has been released to the astronomical community for analysis through the MRObs Web portal. The virtual universe can also be accessed through a new online tool, the MRObs browser, which allows users to interact with the Millennium Run Relational Database, where the properties of millions of dark matter halos and their galaxies from the Millennium project are stored. Upgrades to the MRObs framework, and its extension to other types of simulations, are currently being planned.
See also
References
- Springel, Volker; et al. (2005). "Simulations of the formation, evolution, and clustering of galaxies and quasars" (PDF). S2CID 4383030.
- "MPA :: Current Research Highlight :: August 2004". Retrieved 2009-05-28.
- "The Millennium Simulation public page". Retrieved 2017-02-15.
- "Millennium Simulation - The Largest Ever Model of the Universe". Retrieved 2009-05-28.
- "The Millennium-XXL Project: Simulating the Galaxy Population in Dark Energy Universes". Retrieved 2013-07-02.
Further reading
- Springel, Volker; et al. (2005). "Simulations of the formation, evolution, and clustering of galaxies and quasars" (PDF). S2CID 4383030.
- Boylan-Kolchin, Michael; et al. (2009). "Resolving Cosmic Structure Formation with the Millennium-II Simulation". S2CID 9703617.
- Angulo, Raul; et al. (2012). "Scaling relations for galaxy clusters in the Millennium-XXL simulation". Monthly Notices of the Royal Astronomical Society. 426 (3): 2046–2062. S2CID 53692799.
- Overzier, Roderik; et al. (2012). "The Millennium Run Observatory: First Light". Monthly Notices of the Royal Astronomical Society. 428 (1): 778–803. S2CID 119219960.
- Lemson, Gerard; Virgo Consortium (2006). "Halo and Galaxy Formation Histories from the Millennium Simulation: Public release of a VO-oriented and SQL-queryable database for studying the evolution of galaxies in the LambdaCDM cosmogony".