
    Models of hierarchical galaxy formation

    A semi-analytic galaxy formation model, N-body GALFORM, is developed which uses outputs from an N-body simulation to follow the merger histories of dark matter halos and treats baryonic processes using the semi-analytic model of Cole et al. We find that, apart from limited mass resolution, the only significant differences between this model and the Monte Carlo-based model of Cole et al. are due to known inaccuracies in the distribution of halo progenitor masses in the Monte Carlo method. N-body GALFORM is used to compare Smoothed Particle Hydrodynamics (SPH) and semi-analytic calculations of radiative cooling in the absence of star formation. We consider two cases: first, a simulation of a representative volume of the Universe with relatively poor mass resolution, and, second, a high-resolution simulation of the formation of a single galaxy. We find good agreement between the models in terms of the mass of gas which cools in each halo, the masses of individual galaxies, and the spatial distribution of the galaxies. The semi-analytic model is then compared with a realistic, high-resolution galaxy simulation which includes prescriptions for star formation and feedback. A semi-analytic model without feedback is found to best reproduce the masses of the simulated galaxy and its progenitors. This model is used to populate a large volume with semi-analytic galaxies. The resulting luminosity function has an order of magnitude too many galaxies at both high and low luminosities. We conclude that, while SPH and semi-analytic cooling calculations are largely consistent and therefore likely to be reasonably reliable, current numerical models of galaxy formation still contain major uncertainties in the treatment of feedback, which will lead them to predict very different galaxy populations. Further work is required to find simulation algorithms which can simultaneously produce realistic individual galaxies and a population with reasonable statistical properties.
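    To make the semi-analytic cooling calculation concrete, here is a minimal sketch of the standard criterion it rests on: gas at a given radius counts as cooled once its radiative cooling time falls below the age of the halo. This is an illustrative stand-in, not the GALFORM code; the isothermal gas profile, the constant cooling function value, and all parameter values below are simplifying assumptions.

    import numpy as np

    K_B = 1.380649e-16    # Boltzmann constant, erg/K
    M_P = 1.6726e-24      # proton mass, g
    MPC = 3.0857e24       # cm per Mpc
    GYR = 3.156e16        # s per Gyr
    MSUN = 1.989e33       # g per solar mass

    def t_cool(n_h, temp, lam):
        # Cooling time of fully ionized primordial gas (n ~ 2.3 n_H):
        # t_cool = (3/2) n k T / (n_H^2 Lambda)
        return 1.5 * 2.3 * n_h * K_B * temp / (n_h**2 * lam)

    def cooled_gas_mass(m_gas, r_vir_mpc, t_vir, t_age_gyr, lam=1e-23):
        # Assumed singular isothermal gas profile, rho ~ r^-2, so enclosed
        # mass grows linearly with radius; gas inside the radius where
        # t_cool equals the halo age counts as cooled.
        r = np.linspace(0.01, 1.0, 500) * r_vir_mpc * MPC          # cm
        rho = m_gas * MSUN / (4 * np.pi * r_vir_mpc * MPC * r**2)  # g/cm^3
        n_h = 0.75 * rho / M_P                                     # hydrogen number density
        cooled = t_cool(n_h, t_vir, lam) < t_age_gyr * GYR
        r_cool = r[cooled].max() if cooled.any() else 0.0
        return m_gas * r_cool / (r_vir_mpc * MPC)                  # M(r) ~ r

    # e.g. ~1e11 Msun of hot gas in a Milky Way-scale halo at T_vir ~ 1e6 K
    print(f"cooled mass: {cooled_gas_mass(1e11, 0.2, 1e6, 10.0):.2e} Msun")

    The quantity being compared between the SPH and semi-analytic calculations in the abstract is essentially this cooled mass per halo.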

    The FLAMINGO project: revisiting the S8 tension and the role of baryonic physics

    A number of recent studies have found evidence for a tension between observations of large-scale structure (LSS) and the predictions of the standard model of cosmology with the cosmological parameters fit to the cosmic microwave background (CMB). The origin of this ‘S8 tension’ remains unclear, but possibilities include new physics beyond the standard model, unaccounted-for systematic errors in the observational measurements, and/or uncertainties in the role that baryons play. Here, we carefully examine the latter possibility using the new FLAMINGO suite of large-volume cosmological hydrodynamical simulations. We project the simulations onto observable harmonic space and compare with observational measurements of the power and cross-power spectra of cosmic shear, CMB lensing, and the thermal Sunyaev-Zel’dovich (tSZ) effect. We explore the dependence of the predictions on box size, resolution, and cosmological parameters, including the neutrino mass and the efficiency and nature of baryonic ‘feedback’. Despite the wide range of astrophysical behaviours simulated, we find that baryonic effects are not sufficiently large to remove the S8 tension. Consistent with recent studies, we find that the CMB lensing power spectrum is in excellent agreement with the standard model, while the cosmic shear power spectrum, the tSZ effect power spectrum, and the cross-spectra between shear, CMB lensing, and the tSZ effect are all in varying degrees of tension with the CMB-specified standard model. These results suggest that some mechanism is required to slow the growth of fluctuations at late times and/or on non-linear scales, but that it is unlikely that baryon physics is driving this modification.
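    For reference, the S8 statistic at the centre of this tension combines the amplitude of matter fluctuations sigma_8 with the matter density Omega_m. A minimal sketch of the standard definition follows; the parameter values are round illustrative numbers, not the FLAMINGO or survey fits.

    import math

    def s8(sigma8, omega_m):
        # S8 = sigma_8 * sqrt(Omega_m / 0.3)
        return sigma8 * math.sqrt(omega_m / 0.3)

    # CMB-calibrated parameters tend to give a higher S8 than late-time
    # lensing measurements; the gap between them is the "S8 tension".
    print(f"CMB-like:     S8 = {s8(0.81, 0.31):.3f}")
    print(f"lensing-like: S8 = {s8(0.76, 0.30):.3f}")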

    Physics Mining of Multi-Source Data Sets

    Powerful new parallel data mining algorithms can produce diagnostic and prognostic numerical models and analyses from observational data. These techniques yield higher-resolution measures of environmental parameters than previously possible by fusing synoptic imagery and time-series measurements. The techniques are general, apply to raster, vector, and scalar observational data, and can be used in all Earth- and environmental-science domains. Because they can be highly automated and are parallel, they scale to large spatial domains and are well suited to change and gap detection. This makes it possible to analyze spatial and temporal gaps in information, and facilitates within-mission replanning to optimize the allocation of observational resources. The basis of the innovation is the extension of a recently developed set of algorithms, packaged into MineTool, to multi-variate time-series data. MineTool is unique in that it automates the various steps of the data mining process, thus making it amenable to autonomous analysis of large data sets. Unlike techniques such as artificial neural nets, which yield a black-box solution, MineTool's outcome is always an analytical model in parametric form that expresses the output in terms of the input variables. This has the advantage that the derived equation can then be used to gain insight into the physical relevance and relative importance of the parameters and coefficients in the model; this is referred to as physics mining of data. The capabilities of MineTool are extended to include both supervised and unsupervised algorithms, to handle multi-type data sets, and to run in parallel.
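    A minimal sketch of what "an analytical model in parametric form" means in practice: fit coefficients over a small dictionary of candidate terms and report the result as an explicit equation, in contrast to a black-box fit. This is not MineTool's algorithm, just an illustrative stand-in using ordinary least squares on synthetic data.

    import numpy as np

    rng = np.random.default_rng(0)
    x1, x2 = rng.uniform(0, 1, (2, 200))
    # Synthetic "observations" with a known underlying physical law
    y = 3.0 * x1**2 - 1.5 * x1 * x2 + 0.2 + rng.normal(0, 0.01, 200)

    # Candidate terms form the model's vocabulary
    terms = {"1": np.ones_like(x1), "x1": x1, "x2": x2,
             "x1^2": x1**2, "x2^2": x2**2, "x1*x2": x1 * x2}
    A = np.column_stack(list(terms.values()))
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    # Report the fitted model as a readable equation, dropping tiny terms;
    # the coefficients, not a black box, carry the physical interpretation
    eq = " + ".join(f"{c:.3f}*{t}" for t, c in zip(terms, coef) if abs(c) > 0.05)
    print("y =", eq)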

    Cosmic cookery: making a stereoscopic 3D animated movie

    This paper describes our experience making a short stereoscopic movie visualizing the development of structure in the universe during the 13.7 billion years from the Big Bang to the present day. Aimed at a general audience for the Royal Society's 2005 Summer Science Exhibition, the movie illustrates how the latest cosmological theories based on dark matter and dark energy are capable of producing structures as complex as spiral galaxies, and allows the viewer to directly compare observations from the real universe with theoretical results. 3D is an inherent feature of the cosmology data sets, and stereoscopic visualization provides a natural way to present the images to the viewer, in addition to allowing researchers to visualize these vast, complex data sets. The presentation of the movie used passive, linearly polarized projection onto a 2 m wide screen, but it was also required to play back on a Sharp RD3D display and in anaglyph projection at venues without dedicated stereoscopic display equipment. Additionally, lenticular prints were made from key images in the movie. We discuss the following technical challenges during the stereoscopic production process: 1) controlling the depth presentation, 2) editing the stereoscopic sequences, and 3) generating compressed movies in display-specific formats. We conclude that the generation of high-quality stereoscopic movie content using desktop tools and equipment is feasible. This does require careful quality control and manual intervention, but we believe these overheads are worthwhile when presenting inherently 3D data, as the result is significantly increased impact and better understanding of complex 3D scenes.
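    For challenge 1), controlling the depth presentation comes down to keeping on-screen parallax within comfortable limits. Below is a minimal sketch of the standard geometry for parallel stereo cameras with a horizontal image shift placing the convergence distance on the screen plane; the camera and scene parameters are illustrative assumptions, and only the 2 m screen width comes from the text above.

    def screen_parallax_mm(interaxial, convergence, depth,
                           focal=35.0, sensor_w=36.0, screen_w=2000.0):
        # Parallax on screen in mm; positive = behind the screen plane.
        # interaxial, convergence, depth in scene units; focal/sensor in mm;
        # screen width in mm (2 m screen, as at the exhibition).
        sensor_parallax = focal * interaxial * (1.0 / convergence - 1.0 / depth)
        return sensor_parallax * screen_w / sensor_w

    # Background at 10x the convergence distance: divergence beyond the
    # ~65 mm human eye separation is uncomfortable, so if the parallax is
    # too large, the interaxial separation must be reduced.
    p = screen_parallax_mm(interaxial=0.5, convergence=10.0, depth=100.0)
    print(f"background parallax: {p:.1f} mm ({'ok' if p < 65 else 'too large'})")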

    Lessons learned from 104 years of mobile observatories [poster]

    Poster session IN13B-1211, presented 10 December 2007 at the AGU Fall Meeting, 10–14 December 2007, San Francisco, CA, USA.

    As the oceanographic community ventures into a new era of integrated observatories, it may be helpful to look back on the era of "mobile observatories" to see what cyberinfrastructure lessons might be learned. For example, SIO has been operating research vessels for 104 years, supporting a wide range of disciplines: marine geology and geophysics, physical oceanography, geochemistry, biology, seismology, ecology, fisheries, and acoustics. In the last 6 years progress has been made with diverse data types, formats, and media, resulting in a fully searchable online SIOExplorer Digital Library of more than 800 cruises (http://SIOExplorer.ucsd.edu). Public access to SIOExplorer is considerable, with 795,351 files (206 GB) downloaded last year. During the last 3 years the efforts have been extended to WHOI, with a "Multi-Institution Testbed for Scalable Digital Archiving" funded by the Library of Congress and NSF (IIS 0455998). The project has created a prototype digital library of data from both institutions, including cruises, Alvin submersible dives, and ROVs. In the process, the team encountered technical and cultural issues that will be facing the observatory community in the near future.

    Technological Lessons Learned: Shipboard data from multiple institutions are extraordinarily diverse, and provide a good training ground for observatories. Data are gathered from a wide range of authorities, laboratories, servers, and media, with little documentation. Conflicting versions exist, generated by alternative processes. Domain- and institution-specific issues were addressed during initial staging. Data files were categorized and metadata harvested with automated procedures. With our second-generation approach to staging, we achieve higher levels of automation with greater use of controlled vocabularies. Database and XML-based procedures deal with the diversity of raw metadata values and map them to agreed-upon standard values, in collaboration with the Marine Metadata Interoperability (MMI) community. All objects are tagged with an expert level, thus serving an educational audience as well as research users. After staging, publication into the digital library is completely automated. The technical challenges have been largely overcome, thanks to a scalable, federated digital library architecture from the San Diego Supercomputer Center, implemented at SIO, WHOI, and other sites. The metadata design is flexible, supporting modular blocks of metadata tailored to the needs of instruments, samples, documents, derived products, cruises, or dives, as appropriate. Controlled metadata vocabularies, with content and definitions negotiated by all parties, are critical. Metadata may be mapped to required external standards and formats as needed.

    Cultural Lessons Learned: The cultural challenges have been more formidable than expected. They became most apparent during attempts to categorize and stage digital data objects across two institutions, each with its own naming conventions and practices, generally undocumented and evolving across decades. Whether the questions concerned data ownership, collection techniques, data diversity, or institutional practices, the solution involved a joint discussion with scientists, data managers, technicians, and archivists, working together. Because metadata discussions go on endlessly, significant benefit comes from dictionaries with definitions of all community-authorized metadata values.

    Funding provided by the Library of Congress and NSF (IIS 0455998).
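    A minimal sketch of the controlled-vocabulary mapping step described above, where raw, institution-specific values are normalized to agreed-upon standard terms; the vocabulary and raw values here are invented examples, not the actual SIOExplorer or MMI dictionaries.

    # Community-authorized terms, each with the raw synonyms observed in the wild
    CONTROLLED = {
        "instrument.ctd": ["ctd", "sbe911", "seabird ctd"],
        "instrument.multibeam": ["multibeam", "em120", "seabeam"],
        "platform.rov": ["rov", "jason", "remotely operated vehicle"],
    }
    # Invert to a lookup table from raw synonym -> authorized term
    LOOKUP = {syn: term for term, syns in CONTROLLED.items() for syn in syns}

    def normalize(raw_value):
        # Return the authorized term, or flag the value for expert review
        return LOOKUP.get(raw_value.strip().lower(), "UNMAPPED:" + raw_value)

    for raw in ["SeaBird CTD", "EM120", "gravimeter"]:
        print(raw, "->", normalize(raw))

    Values that fall through to UNMAPPED are exactly the cases that, per the abstract, required joint discussion among scientists, data managers, technicians, and archivists.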