TOPIC: The Millennium Run


RE: The Millennium Run


The Millennium Simulation, also known as the Millennium Run, is a giant computer simulation that traces how structure formed and grew throughout the known Universe.

Read more



__________________



Title: Satellite Galaxies and Fossil Groups in the Millennium Simulation
Authors: L. V. Sales, J. F. Navarro, D. G. Lambas, S. D. M. White, D. J. Croton
(Version v4)

We use a semianalytic galaxy catalogue constructed from the Millennium Simulation to study the satellites of isolated galaxies in the LCDM cosmogony. This sample (~80,000 bright primaries, surrounded by ~178,000 satellites) allows the characterisation, with minimal statistical uncertainty, of the dynamical properties of satellite/primary galaxy systems in a LCDM universe. We find that, overall, the satellite population traces the dark matter rather well: its spatial distribution and kinematics may be approximated by an NFW profile with a mildly anisotropic velocity distribution. The satellites' spatial distribution is also mildly anisotropic, with a well-defined "anti-Holmberg" effect that reflects the misalignment between the major axis and angular momentum of the host halo. The isolation criteria for our primaries pick not only galaxies in sparse environments, but also a number of primaries at the centre of "fossil" groups. We find that the abundance and luminosity function of these unusual systems are in reasonable agreement with the few available observational constraints. We recover the expected L_{host} ∝ \sigma_{sat}^3 relation for LCDM models for truly isolated primaries. Less strict primary selection, however, leads to substantial modification of the scaling relation. Our analysis also highlights a number of difficulties afflicting studies that rely on blind stacking of satellite systems to constrain the mean halo mass of the primary galaxies.

Read more (351kb, PDF)
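
The abstract refers to an NFW (Navarro-Frenk-White) profile for the satellite distribution. As a purely illustrative aid, here is a minimal sketch of that profile in Python; the scale radius and characteristic density below are arbitrary placeholder values, not parameters taken from the paper.

```python
import numpy as np

def nfw_density(r, rho_s, r_s):
    """Navarro-Frenk-White density profile:
    rho(r) = rho_s / [(r / r_s) * (1 + r / r_s)**2]."""
    x = r / r_s
    return rho_s / (x * (1.0 + x) ** 2)

# Illustrative, made-up parameters; the paper fits its own values.
r = np.logspace(-1, 3, 5)                      # radii in kpc
print(nfw_density(r, rho_s=1.0, r_s=200.0))    # density in arbitrary units
```

The profile falls off as r^-1 near the centre and r^-3 at large radii, which is why a single scale radius is enough to characterise it.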

__________________


Cosmological Simulation


(Reposting of old news)
By incorporating the physics of black holes into a highly sophisticated model running on a powerful supercomputing system, an international team of scientists has produced an unprecedented simulation of cosmic evolution that verifies and deepens our understanding of relationships between black holes and the galaxies in which they reside. Called BHCosmo, the simulation shows that black holes are integral to the structure of the cosmos and may help guide users of future telescopes, showing them what to look for as they aim to locate the earliest cosmic events and untangle the history of the universe.     
The research team is led by Carnegie Mellon University and includes scientists from the Harvard-Smithsonian Centre for Astrophysics and the Max Planck Institute for Astrophysics in Germany.

Read more 

__________________


RE: The Millennium Run


Large-scale structure of the Universe: a 3-D simulation.

[youtube=http://youtube.com/watch?v=FFlzyxSQhTc]


__________________


Millennium Simulation


Title: Satellite Galaxies and Fossil Groups in the Millennium Simulation
Authors: L. V. Sales, J. F. Navarro, D. G. Lambas, S. D. M. White, D. J. Croton
(Version v2)

We use a semianalytic galaxy catalogue constructed from the Millennium Simulation to study the satellites of isolated galaxies in the LCDM cosmogony. This sample (~80,000 bright primaries, surrounded by ~178,000 satellites) allows the characterisation, with minimal statistical uncertainty, of the dynamical properties of satellite/primary galaxy systems in a LCDM universe. We find that, overall, the satellite population traces the dark matter rather well: its spatial distribution and kinematics may be approximated by an NFW profile with a mildly anisotropic velocity distribution. The satellites' spatial distribution is also mildly anisotropic, with a well-defined "anti-Holmberg" effect that reflects the misalignment between the major axis and angular momentum of the host halo. The isolation criteria for our primaries pick not only galaxies in sparse environments, but also a number of primaries at the centre of "fossil" groups. We find that the abundance and luminosity function of these unusual systems are in reasonable agreement with the few available observational constraints. We recover the expected L_{host} ∝ \sigma_{sat}^3 relation for LCDM models for truly isolated primaries. Less strict primary selection, however, leads to substantial modification of the scaling relation. Our analysis also highlights a number of difficulties afflicting studies that rely on blind stacking of satellite systems to constrain the mean halo mass of the primary galaxies.


Download (61kb, 560 x 560)

Read more (350kb, PDF)
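
As a purely illustrative check of what the L_{host} ∝ \sigma_{sat}^3 scaling quoted above means in practice, the sketch below fits a power law to synthetic (made-up) host luminosities and satellite velocity dispersions; none of these numbers come from the paper, and the normalisation is hypothetical.

```python
import numpy as np

# Synthetic data obeying sigma_sat ∝ L_host^(1/3), i.e. L_host ∝ sigma_sat^3.
rng = np.random.default_rng(0)
L_host = np.logspace(10, 12, 50)                 # host luminosities, arbitrary units
sigma = 150.0 * (L_host / 1e11) ** (1.0 / 3.0)   # km/s, hypothetical normalisation
sigma *= rng.lognormal(0.0, 0.05, sigma.size)    # add some scatter

# Fit log(sigma) = a + b * log(L); the slope b should come out near 1/3.
b, a = np.polyfit(np.log10(L_host), np.log10(sigma), 1)
print(f"fitted slope: {b:.3f} (expect ~0.333 for L ∝ sigma^3)")
```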

-- Edited by Blobrana at 12:21, 2007-06-18

__________________


Virtual supernova


The dramatic death of a white dwarf star in a violent explosion has been simulated on computers for the first time. Because distant exploding stars are used to track the expansion rate of the cosmos, astronomers say the feat could help in the quest for the ultimate fate of the universe.
However, the result is controversial because another team has failed to replicate it.

Read more

__________________


RE: The Millennium Run


The Universe has guzzled its way through about 20 per cent of its normal matter, or original fuel reserves, according to findings from a survey of the nearby Universe by an international team of astronomers involving researchers at The Australian National University.

The survey, to be released at the General Assembly of the International Astronomical Union in Prague today, revealed that about 20 per cent of the normal matter or fuel that was produced by the Big Bang 14 billion years ago is now in stars, a further 0.1 per cent lies in dust expelled from massive stars (and from which solid structures like the Earth and humans are made), and about 0.01 per cent is in super-massive black holes.
The survey data, which forms a 21st century database called the Millennium Galaxy Catalogue, was gathered from over 100 nights of telescope time in Australia, the Canary Islands and Chile, and contains over ten thousand giant galaxies, each of these containing 10 million to 10 billion stars.
According to the survey leader Dr Simon Driver of St Andrews University, Scotland, the remaining material is almost completely in gaseous form lying both within and between the galaxies, forming a reservoir from which future generations of stars may develop.
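
Taking the survey's quoted percentages at face value, a quick back-of-the-envelope tally (a sketch, not part of the survey itself) shows just how dominant that gaseous reservoir is:

```python
# Fractions of the Universe's normal (baryonic) matter, as quoted above.
stars = 0.20          # ~20 per cent now in stars
dust = 0.001          # ~0.1 per cent in dust expelled from massive stars
black_holes = 0.0001  # ~0.01 per cent in super-massive black holes

gas = 1.0 - (stars + dust + black_holes)
print(f"remaining gas fraction: {gas:.4f} (~{gas * 100:.1f} per cent)")
# roughly 80 per cent, consistent with "almost completely in gaseous form"
```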

"I guess the simplest prognosis is that the Universe will be able to form stars for a further 70 billion years or so, after which it will start to go dark. However, unlike our stewardship of the Earth, the Universe is definitely tightening its belt, with a steady decline in the rate at which new stars are forming" - Dr Simon Driver.

Dr Alister Graham, an astronomer at The Australian National University who worked on the survey, said that the team of researchers were able to determine how much matter is in stars through a ‘cosmic stocktake.’

"We needed to measure the stellar mass within a representative volume of the local Universe. This required accurate and complete distance information for all the galaxies of stars that we imaged. This is where the Australian telescopes played a key role" - Dr Alister Graham.

One of the unique aspects of this program was the careful separation of a galaxy's stars into its central bulge component and surrounding disc-like structure. This allowed the researchers to determine that, on average, roughly half of the stars in galaxies reside in discs and the other half in bulges.

"Measuring the concentration of stars in each galaxy's bulge is what enabled us to determine their central super-massive black hole masses. Some of these are up to one million billion times more massive than the Earth. Once we had these masses it was a simple task of summing them up to determine how much of the Universe's matter is locked away in black holes at the centres of galaxies" - Dr Alister Graham.

Dr Graham said next-generation telescopes such as the Giant Magellan Telescope, currently in production, will enable astronomers to directly measure black hole masses in galaxies ten times further away and thus ten times further back in time.

"In effect, we’ll soon be able to observe how galaxies and their black holes evolved into what we see around us today" - Dr Alister Graham.

Other members of the research team include Paul Allen and Ewan Cameron of The Australian National University, Jochen Liske of the European Southern Observatory, and Roberto De Propris of the Cerro Tololo Inter-American Observatory.
The Millennium Galaxy Catalogue consists of data from the Anglo-Australian Telescope, The Australian National University's 2.3 m telescope at Siding Spring Observatory, the Isaac Newton Telescope and the Telescopio Nazionale Galileo at the Spanish Observatorio del Roque de Los Muchachos of the Instituto de Astrofisica de Canarias, and also from the Gemini and ESO New Technology Telescopes in Chile.
Financial support for this project was jointly provided through grants from the Australian Research Council and the United Kingdom's Particle Physics and Astronomy Research Council.

Source

__________________


Computer Simulations


A wispy collection of atoms and molecules fuels the vast cosmic maelstroms produced by colliding galaxies and merging supermassive black holes, according to some of the most extensive supercomputer simulations ever conducted.

"We found that gas is essential in driving the co-evolution of galaxies and supermassive black holes" - Stelios Kazantzidis, Fellow in the University’s Kavli Institute for Cosmological Physics. He and his collaborators published their results in the April 2005 issue of The Astrophysical Journal and in February on astro-ph, an online repository of astronomical research papers. They also are preparing another study.

The collaboration includes Lucio Mayer from the Swiss Federal Institute of Technology, Zurich; Monica Colpi, University Milano-Bicocca; Piero Madau, University of California, Santa Cruz; Thomas Quinn, University of Washington; and James Wadsley, McMaster University.

"This type of work became possible only recently thanks to the increased power of supercomputers. The combination of both code and hardware improvement makes it possible to simulate in a few months time what had required several years of computation time only four to five years ago." - Lucio Mayer.

Improvements in the development of computer code that describes the relevant physics also helped, he said.
The findings are good news for NASA's proposed LISA (Laser Interferometer Space Antenna) mission. LISA, scheduled for launch in 2015, has the primary objective of searching the early universe for gravitational waves. These waves, which have never been directly detected, are predicted by Einstein's theory of general relativity.

"At very early times in the universe there was a lot of gas in the galaxies, and as the Universe evolves, the gas is consumed by star formation. And large amounts of gas mean more colliding galaxies and merging supermassive black holes. This is important because LISA is detecting gravitational waves. And the strongest source of gravitational waves in the universe will be from colliding supermassive black holes" - Stelios Kazantzidis.

Many galaxies, including the Milky Way galaxy that contains the sun, harbour supermassive black holes at their centre. These black holes are so gravitationally powerful that nothing, including light, can escape their grasp.
Today the Milky Way moves quietly through space by itself, but one day it will collide with its nearest neighbour, the Andromeda galaxy.
Nevertheless, the Milky Way served as a handy model for the galaxies in the merging supermassive black hole simulations. Kazantzidis’s team simulated the collisions of 25 galaxy pairs to identify the key factors leading to supermassive black hole mergers.
For these mergers to occur, the host galaxies must merge first. Two gas-poor galaxies may or may not merge, depending on the structure of the galaxies. But whenever gas-rich galaxies collide in the simulations, supermassive black-hole mergers typically followed.

"The more supermassive black holes that you predict will merge, the larger number of sources that LISA will be able to detect" - Stelios Kazantzidis.

As two galaxies begin to collide, the gas they contain loses energy and migrates to their respective cores. This process increases the density and stability of the galactic cores. When these cores merge, the supermassive black holes they host also merge. When these cores become disrupted, their supermassive black holes fail to merge.
Each simulation conducted by Kazantzidis consumed approximately a month of supercomputing time at the University of Zurich, the Canadian Institute for Theoretical Astrophysics, or the Pittsburgh Supercomputing Centre.
The simulations are the first to simultaneously track physical phenomena over vastly differing scales of time and space.

"The computer can focus most of its power in the region of the system when many things are happening and are happening at a faster pace than somewhere else" - Lucio Mayer.

When galaxies collide, the billions of stars contained in them fly past one another at great distances. But their combined gravitational fields do interact, acting as cosmic brakes on the two galaxies' respective journeys. The galaxies separate, but they come back together, again and again, for a billion years. At each step in the process, the galaxies lose speed and energy.

"They come closer and closer and closer until the end, when they merge" - Stelios Kazantzidis.

The simulations have produced effects that astronomers have observed in telescopic observations of colliding galaxies. Most notable among these is the formation of tidal tails, streams of stars and gas ejected along elongated trajectories by the strong tidal forces during the collision.
On a smaller scale, astronomers also observe that colliding galaxies display increased nuclear activity as indicated by brighter cores and increased star formation.
Despite the success of the simulations, Kazantzidis and his team still work to improve their results.

"It’s a struggle every day to increase the accuracy of the computation" - Stelios Kazantzidis.

Source


__________________


The Millennium Run


One of the biggest computer simulations ever run is illuminating the deepest mysteries of the universe

The simulation by an international team is the biggest ever attempted and shows how structures in the Universe changed and grew over billions of years.
The Millennium Run, as it is dubbed, could help explain observations made by astronomers and shed more light on the Universe's elusive dark energy field.
"What's unique about the simulation is its scope and the level of detail"
Prof Carlos Frenk, University of Durham

The group, dubbed the Virgo Consortium—a name borrowed from the galaxy cluster closest to our own—is creating the largest and most detailed computer model of the universe ever made. While other groups have simulated chunks of the cosmos, the Virgo simulation is going for the whole thing. The cosmologists' best theories about the universe's matter distribution and galaxy formation will become equations, numbers, variables, and other parameters in simulations running on one of Germany's most powerful supercomputers, an IBM Unix cluster at the Max Planck Society's Computing Centre in Garching, near Munich.
The machine has a total of 812 processors and 2 terabytes of memory, giving a peak performance of 4.2 teraflops (trillions of calculations per second).

The fundamental challenge for the Virgo team is to approximate the real universe in a way that is both feasible to compute and fine-grained enough to yield useful insights. The Virgo astrophysicists have tackled it by representing the starting epoch's distribution of matter with 10 billion mass points, many more than any other simulation has ever attempted to use.
The dimensionless points have no real physical meaning; they are just simulation elements, a way of modelling the universe's matter content. Each point is made up of normal and dark matter in proportion to the best current estimates, having a mass a billion times that of our sun, or 2000 trillion trillion trillion (2 × 10^39) kilograms. (The 10 billion particles together account for only 0.003 percent of the observable universe's total mass, but since the universe is homogeneous on the largest scales, the model is more than enough to be representative of the full extent of the cosmos.)
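
The particle mass quoted above is easy to verify: a billion solar masses, with the Sun at roughly 2 × 10^30 kg, works out to about 2 × 10^39 kg per point. A quick sanity check (a sketch only; the figures are approximations):

```python
M_SUN_KG = 1.989e30                 # mass of the Sun in kilograms (approximate)
particle_mass = 1e9 * M_SUN_KG      # each simulation point: ~a billion solar masses
n_particles = 1e10                  # ten billion mass points in the run

print(f"mass per point:       {particle_mass:.2e} kg")                 # ~2e39 kg
print(f"total simulated mass: {n_particles * particle_mass:.2e} kg")   # ~2e49 kg
```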



"We have learned more about the Universe in the last 10 or 20 years than in the whole of human civilisation," - Professor Carlos Frenk, Ogden professor of fundamental physics at the University of Durham and co-author on the Nature report.
"We are now able, using the biggest, fastest supercomputers in the world, to recreate the whole of cosmic history," - Professor Carlos Frenk.
The researchers looked at how the Universe evolved under the influence of the mysterious material called dark matter.

To begin creating their model of the universe, the researchers faced two basic questions: at what moment, precisely, should they start the simulation? And what are the universe's conditions at that very moment? Fortunately, cosmologists believe they have these answers.
According to the inflationary universe theory put forward in the 1980s by Alan Guth, of the Massachusetts Institute of Technology, and Andrei Linde, of Stanford University, the universe swelled at an extraordinarily rapid rate during a tiny fraction of a second immediately after the Big Bang. This exponentially fast expansion amplified minute, quantum-scale fluctuations that existed in the primordial energy field that filled the very early universe. These fluctuations caused matter to clump, and later, gravity created denser and denser aggregates.

The result is that these aggregates, which began as unimaginably small energy fluctuations, much smaller than a proton, in that primeval universe, ultimately evolved into the giant structures that compose the universe's sponge-like web of matter. Even more surprising, most of the mass in this web of matter is not the ordinary stuff that makes up galaxies, stars, planets, and people.
After many experiments and calculations throughout the past decade, most cosmologists now agree on the astounding fact that some 85 percent of the matter in the universe consists of a mysterious substance known as dark matter that cannot be seen directly. They infer its presence by tracking the motions of stars and galaxies: stars are attracted to the centres of galaxies, and galaxies to the centres of galaxy clusters, by gravitational forces that are far greater than visible matter alone can possibly account for. Something else must be out there.

This shadowy substance is made up not of the familiar quarks, electrons, and their derivatives—atoms and molecules—but of some particle that has so far eluded experimenters. Candidates include axions, photinos, neutralinos—all yet to be discovered—among other particles predicted by theorists. The upshot is that, because dark matter is not visible, what astronomers have observed are the contours of the universe's great web, revealed by the light of the stars and galaxy clusters that formed onto the web's nodes, the junctions at which large amounts of matter accumulate.
It's kind of like inferring the shape of a Christmas tree in a pitch-dark room from the positions of the lights strung on it. These stars, galaxies, and other objects that we can see were born from dense aggregates of normal matter embedded in the dark matter of the web.

Dark matter model
According to cosmological theory, soon after the Big Bang, cold dark matter formed the first large structures in the Universe, which then collapsed under their own weight to form vast halos.
The gravitational pull of these halos sucked in normal matter, providing a focus for the formation of galaxies.
The simulation tracked some 10 billion dark matter particles over roughly 13 billion years of cosmic evolution. It incorporated data from satellite observations of the heat left over from the Big Bang, information on the make-up of the Universe and current understanding of the laws of physics on Earth.
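
The real run used highly optimised simulation codes, but the underlying idea of tracking particles under their mutual gravity can be illustrated with a toy direct-summation integrator. The sketch below is an illustration only, in arbitrary units and with a made-up softening length; it is not the method used for the Millennium Run, whose far faster tree- and mesh-based techniques are what make billions of particles feasible.

```python
import numpy as np

def accelerations(pos, mass, G=1.0, soft=0.05):
    """Direct-summation gravitational accelerations (O(N^2));
    real cosmological codes use much faster tree/mesh methods."""
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]    # separation vectors r_j - r_i
    dist2 = (diff ** 2).sum(axis=-1) + soft ** 2             # softened squared distances
    np.fill_diagonal(dist2, np.inf)                          # no self-force
    inv_d3 = dist2 ** -1.5
    return G * (diff * (mass[np.newaxis, :, None] * inv_d3[:, :, None])).sum(axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick leapfrog step (toy units)."""
    vel += 0.5 * dt * accelerations(pos, mass)
    pos += dt * vel
    vel += 0.5 * dt * accelerations(pos, mass)
    return pos, vel

# Tiny demonstration: 100 particles instead of 10 billion.
rng = np.random.default_rng(1)
pos = rng.uniform(-1.0, 1.0, (100, 3))
vel = np.zeros((100, 3))
mass = np.ones(100)
for _ in range(10):
    pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)
print("centre of mass after 10 steps:", pos.mean(axis=0))
```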
"What's unique about the simulation is its scope and the level of detail with which we can re-create the cosmic structures we see around us," - Professor Carlos Frenk.
"Now we have the Millennium Run simulations, we have the predictions of the theory in enough detail that we can see if there is a meshing together of how the world looks on the larger scale and the way we expect it should look according to our theories. It's a way to check our theories." - Sir Martin Rees, Astronomer Royal.

Energy problem
Comparisons between the results of the simulation and astronomical observations are already helping shed light on some unsolved cosmic mysteries.
Some astronomers have previously questioned how radio sources in distant galaxies, known as quasars, could have formed so quickly after the Big Bang under the cold dark matter model.

The Millennium Run simulation demonstrates that such structures form naturally under the model in numbers consistent with data from the Sloan Digital Sky Survey.
The virtual universe may also shed light on the nature of dark energy, which makes up about 73% of the known Universe, and which, Frenk says, is the "number one unsolved problem in physics today - if not science itself".

"Our simulations tell us where to go looking for clues to learn about dark energy. If we want to learn about this we need to look at galaxy clusters, which encode information about the identity of dark energy," - Professor Carlos Frenk.
One of the major cosmological features that the Virgo team relies on to formulate its simulations is something called the cosmic microwave background, a feeble radiation remnant of the Big Bang that astronomers have now studied in great detail. This radiation, a key piece of evidence supporting the inflation theory, was emitted 380 000 years after the Big Bang when protons combined with free electrons to form neutral hydrogen atoms.

If the cosmic particle soup were absolutely smooth, with evenly distributed hydrogen atoms, this radiation would also be smooth all over the place—always the same wherever you look. But as cosmologists pointed their detectors to different parts of the sky, they found small variations in the cosmic microwave background.
These variations were recently minutely detailed by NASA's Wilkinson Microwave Anisotropy Probe, whose first scientific results were made public early last year. In a triumph of modern cosmology, the measured variations correspond precisely with the predictions of inflation theory.

The cosmic microwave background, therefore, gives cosmologists a fairly good picture of the distribution of matter when the universe, with an estimated current age of 13.7 billion years, was still in its infancy, only 380 000 years old. That's the starting point the Virgo group has chosen for its simulations. The main one, the first of a series, dubbed the Millennium Run, was completed this past June.
When data is fully processed within the next few months, the Millennium Run will reveal with unprecedented detail how the cosmos's broad distribution of matter came to be.




http://www.virgo.dur.ac.uk/

__________________