Drillingsraum: Last Friday you celebrated your 60th birthday, so all the best wishes for your future. We would like to give you two presents. The first one is a Millennium Rum, so you'll have something to drink when you celebrate new findings in a future simulation.

Prof. Dr. Simon White: Oh wow, thank you. Looks like a good way to celebrate a birthday. (laughs)

Drillingsraum: The second present has something to do with the curvature of spacetime.

Prof. Dr. Simon White: Is it the Nobel Prize?

Drillingsraum: We'll come back to that later, look forward to it. First, let us talk a little about your work. In 2005, the first Millennium Run was performed. The intention was to simulate structure formation in the universe. Would you mind summarizing the most important results from this simulation?

Prof. Dr. Simon White: Actually it was completed in 2004; it took a long time to do the analysis. The Millennium Run was an attempt to simulate the formation of structure in a large region of the universe, with sufficient resolution to see how individual objects like our own galaxy could form. What was new in this simulation was not so much the technique itself, which had been used before in smaller simulations. It was rather the overall size, which made it possible to see the formation of galaxies like our own throughout a very large region of space, and thereby study whether the population of galaxies that we find around us today is actually consistent with what we would expect, starting from the initial conditions seen directly in the microwave background.

Drillingsraum: To induce structure formation, an initial condition was to add small fluctuations to the distribution of the dark matter density. You already mentioned it: from measurements of the cosmic microwave background we know what these fluctuations should have looked like in the early universe. But where do these fluctuations originally come from?

Prof. Dr. Simon White: Well, we don't know for sure. What matters are the statistical properties of the fluctuations that we see at a time when the universe was about 400,000 years old and a very large factor smaller than at its present age. At that time there were no stars and no galaxies. No structures were larger than individual atoms. The material was almost uniform, but not quite. The deviations from uniformity were small fluctuations which you can think of as sound waves propagating in the material that was present then. These sound waves caused small variations in temperature and density, and those can be directly seen in the microwave background. So the microwave background maps a direct image of these sound waves at the time when the universe was only
sound waves that we see at high redshifts? These must have been caused by something at even earlier times in the universe. We cannot see back that far directly, so we have to try and infer what caused these fluctuations from their statistical properties. The statistical properties that are measured directly in the microwave maps seem consistent with the idea that all these fluctuations were actually caused at a very early epoch of the universe's history, a tiny fraction of a second after the initial Big Bang itself. They may actually be a reflection of quantum mechanical zero-point fluctuations during the so-called "inflation", in which the universe is supposed to have grown in size by a very large factor in a short period of time.

Drillingsraum: The Millennium Run in 2005 was followed by a second one, and recently the supercomputers finished their work on the Millennium-XXL project. Would you please tell us a little bit about the upgrades in these new simulations and what results you expect or have already achieved?

Prof. Dr. Simon White: What was novel in the original Millennium Run was first of all the overall size, which was roughly a factor of 10 larger than previous calculations, and also the fact that we implemented techniques which allowed us to follow the actual formation of visible galaxies in a rough but physically based manner. We were able to predict not only the distribution of the unseen dark matter component of the universe, but also where the things we can actually see should be and what their properties should be. With time the limitations of the original simulation became evident: you would like to have a lot of detail, so you would like to have a lot of resolution, but at the same time you would like to sample a large volume to see a fair fraction of the universe. In practice you always have to make a compromise, and we did the best we could with the Millennium Run. For some things its resolution was not good enough. For example, we didn't have a fine enough resolution to see the formation of the smallest galaxies, and so the second Millennium Run, or as we call it, the "Millennium 2", focussed on a smaller region of space. It was a similar calculation in terms of computer power, but it had a finer resolution, and we used that to try to study the formation of the small galaxies. This second Millennium Run was a very successful program and gave interesting insights into how galaxies actually form and what processes are important
large surveys of the universe. People now want to carry those further, for example with the Euclid satellite, which was selected by ESA yesterday for the next phase of development. This plans to survey the whole visible universe, and studying the statistical properties expected for the galaxy population in such a large volume was the motivation for the Millennium-XXL, which covers a volume of space 200 times larger than that of the original Millennium Simulation. The Millennium XXL actually calculated the forces between 300 billion particles and integrated their equations of motion.

Drillingsraum: The quality of such a simulation is given by the agreement between its results and the actual structure of the universe. To what extent does it therefore make sense to derive future developments of the universe from simulations?

Prof. Dr. Simon White: Well, these simulations only followed the dark matter component. The reason for this is that the dark matter is the dominant gravitating component of the universe today, so it drives the formation of structures, but at these recent times it interacts with other material only through gravity, and this simplifies the treatment. This of course means that we have a very precise simulation of a thing that we can't see, and a much less precise simulation of the things that we can see, which are the galaxies and the other components of the universe that are made of ordinary matter. So much of the work that still remains to be done is to understand how the visible parts of the universe develop in time, how the galaxies form in detail, and which processes are responsible for shaping their properties. There have already been some surprises. One of them, which came from the original Millennium Simulation, was the realisation that to understand the properties of the visible galaxies we have to understand the effects of the black holes in their centres. The actual population of galaxies was shaped by the development of the black holes in their cores. It is not true that this small object in the centre is divorced from the rest of the galaxy, even though the black holes contain only a tenth of a percent of the stellar mass of the galaxy, a tiny fraction. Nevertheless, processes around them can influence the galaxy as a whole, and apparently shape the population and terminate the formation of big galaxies.

Drillingsraum: The Millennium Simulation is sort of a universe within a universe. But how far does the simulated universe actually match the real universe? Did the simulations maybe create some structures that do not fit our observations at all?

Prof. Dr. Simon White: I think there are no definitive conflicts so far, but there are certainly questions where aspects of the simulated universe don't agree with what is observed. At the moment most of these questions concern not the large-scale structure of the universe, say the distribution of material on scales bigger than galaxies, but the properties of galaxies themselves, and in particular their central regions. There are a number of issues related to this, because the structure that is found in the centres of objects in the simulations
formation of the things we see, but they are too complex to follow their development in detail. It is very difficult to decide exactly how the formation of stars and black holes will change the structure in the innermost regions of galaxies. Until we can be sure of that, we can't be certain about the discrepancy between what we see and what we predict from the simulations: we can't be sure whether it is just due to these astrophysical processes or whether it might reflect some deeper problem.
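Prof. White describes the Millennium XXL as computing the gravitational forces between 300 billion particles and integrating their equations of motion. The sketch below only illustrates that basic idea at toy scale: direct-summation gravity combined with a leapfrog (kick-drift-kick) integrator. It is a minimal illustration under assumed code units, particle numbers and softening, not the Millennium method itself; the real simulations use the far more sophisticated tree and mesh force calculations of the GADGET code, in comoving coordinates, precisely because direct summation scales as O(N²).

```python
import numpy as np

# Minimal, illustrative N-body sketch (NOT the Millennium/GADGET code):
# direct-summation gravity plus a leapfrog (kick-drift-kick) integrator.
# Units, particle number and softening are arbitrary assumptions.

G = 1.0           # gravitational constant in code units (assumption)
SOFTENING = 0.05  # Plummer softening to avoid force singularities (assumption)

def accelerations(pos, mass):
    """Pairwise gravitational accelerations by direct summation, O(N^2)."""
    # pos: (N, 3) positions, mass: (N,) masses
    diff = pos[None, :, :] - pos[:, None, :]           # r_j - r_i for every pair
    dist2 = (diff ** 2).sum(axis=-1) + SOFTENING ** 2  # softened squared distances
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                      # exclude self-interaction
    return G * (diff * inv_r3[:, :, None] * mass[None, :, None]).sum(axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick step of the equations of motion."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)  # half kick
    pos = pos + dt * vel                             # drift
    vel = vel + 0.5 * dt * accelerations(pos, mass)  # half kick
    return pos, vel

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    n = 1000                                   # toy particle number (assumption)
    pos = rng.uniform(-1.0, 1.0, size=(n, 3))  # nearly uniform initial positions
    vel = np.zeros((n, 3))
    mass = np.full(n, 1.0 / n)
    for _ in range(100):                       # integrate forward in time
        pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)
    print("mean distance from origin:", np.linalg.norm(pos, axis=1).mean())
```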