
    Rapid Simulations of Halo and Subhalo Clustering

    The analysis of cosmological galaxy surveys requires realistic simulations for their interpretation. Forward modelling is a powerful method to simulate galaxy clustering without the need for a complex underlying model. This approach requires fast cosmological simulations with high resolution and large volume, to resolve small dark matter halos associated with single galaxies. In this work, we present fast halo and subhalo clustering simulations based on the Lagrangian perturbation theory code PINOCCHIO, which generates halos and merger trees. The subhalo progenitors are extracted from the merger history and the survival of subhalos is modelled. We introduce a new fitting function for the subhalo merger time, which includes a redshift dependence of the fitting parameters. The spatial distribution of subhalos within their hosts is modelled using a number density profile. We compare our simulations with the halo finder ROCKSTAR applied to the full N-body code GADGET-2. The subhalo velocity function and the correlation function of halos and subhalos are in good agreement. We investigate the effect of the chosen number density profile on the resulting subhalo clustering. Our simulation is approximate yet realistic and significantly faster than a full N-body simulation combined with a halo finder. The fast halo and subhalo clustering simulations offer good prospects for galaxy forward models using subhalo abundance matching.

    Comment: 28 pages, 10 figures. Accepted for publication in JCAP
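
    The abstract does not specify the functional form of the subhalo number density profile. As a hedged illustration of the general technique, the sketch below places subhalos inside a host by inverse-transform sampling an assumed NFW-like profile; the profile choice, the concentration value, and all function names are assumptions for illustration, not the paper's calibration.

```python
import numpy as np

def sample_subhalo_positions(n_sub, r_vir, c=5.0, rng=None):
    """Place n_sub subhalos inside a host of virial radius r_vir by
    inverse-transform sampling an assumed NFW-like number density
    profile with concentration c, plus isotropic angles.

    The profile and concentration are illustrative assumptions, not
    the calibrated profile from the paper.
    """
    rng = rng or np.random.default_rng()
    # Enclosed-number fraction of an NFW profile: m(x) = ln(1+x) - x/(1+x)
    m = lambda x: np.log1p(x) - x / (1.0 + x)
    x_grid = np.linspace(1e-3, c, 2048)   # x = r / r_s, out to c = r_vir / r_s
    cdf = m(x_grid) / m(c)                # normalised cumulative profile
    r = np.interp(rng.uniform(size=n_sub), cdf, x_grid) / c * r_vir
    # Isotropic angular positions
    phi = rng.uniform(0.0, 2.0 * np.pi, n_sub)
    cos_t = rng.uniform(-1.0, 1.0, n_sub)
    sin_t = np.sqrt(1.0 - cos_t**2)
    return np.column_stack([r * sin_t * np.cos(phi),
                            r * sin_t * np.sin(phi),
                            r * cos_t])
```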

    Evaluation of phylogenetic reconstruction methods using bacterial whole genomes: a simulation based study

    Background: Phylogenetic reconstruction is a necessary first step in many analyses which use whole genome sequence data from bacterial populations. There are many available methods to infer phylogenies, and these have various advantages and disadvantages, but few unbiased comparisons of the range of approaches have been made. Methods: We simulated data from a defined "true tree" using a realistic evolutionary model. We built phylogenies from these data using a range of methods, compared the reconstructed trees to the true tree using two measures, and noted the computational time needed for each phylogenetic reconstruction. We also used real data from Streptococcus pneumoniae alignments to compare individual core gene trees to a core genome tree. Results: We found that, as expected, maximum likelihood trees from good quality alignments were the most accurate, but also the most computationally intensive. Using less computationally intensive phylogenetic reconstruction methods, we were able to obtain results of comparable accuracy; in particular, approximate results can be obtained rapidly using genetic distance-based methods. In real data we found that highly conserved core genes, such as those involved in translation, gave an inaccurate tree topology, whereas genes involved in recombination events gave inaccurate branch lengths. We also show a tree-of-trees, relating the results of the different phylogenetic reconstructions to each other. Conclusions: We recommend three approaches, depending on requirements for accuracy and computational time. Quicker approaches that do not perform full maximum likelihood optimisation may be useful for many analyses requiring a phylogeny, as generating a high quality input alignment is likely to be the major limiting factor for accurate tree topology. We have publicly released our simulated data and code to enable further comparisons.
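
    As a minimal illustration of the rapid distance-based route recommended in the conclusions, a neighbour-joining tree can be built from pairwise genetic distances in a few lines (the alignment file name is hypothetical, and Biopython is assumed to be available; this is a generic sketch, not the paper's exact pipeline):

```python
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

# Quick distance-based tree from a whole genome alignment.
aln = AlignIO.read("core_genome_alignment.fasta", "fasta")  # hypothetical file
dm = DistanceCalculator("identity").get_distance(aln)  # pairwise genetic distances
tree = DistanceTreeConstructor().nj(dm)                # neighbour joining
Phylo.draw_ascii(tree)
```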

    Soft set theory based decision support system for mining electronic government dataset

    Electronic government (e-gov) is applied to support performance and create more efficient and effective public services. Grouping data with soft set theory can be considered a decision-making technique for determining the maturity level of e-government use. So far, the uncertainty of the data obtained through questionnaires has not been fully exploited as a reference for government in setting the direction of future e-gov development policy. This study presents the maximum attribute relative (MAR) approach, based on soft set theory, to classify attribute options. The results show that facilitating conditions (FC) are the strongest variable influencing people to use e-government, followed by performance expectancy (PE) and system quality (SQ). The results provide useful information for decision makers to make policies for their citizens and potentially provide recommendations on how to design and develop e-government systems that improve public services.
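
    The paper's exact MAR formula is not given in the abstract. As a loose sketch of the underlying idea, the snippet below represents questionnaire responses as a binary soft set (respondents by attributes) and ranks attributes by relative support; treat this as a simplified stand-in for MAR, with entirely hypothetical data.

```python
import numpy as np

# Hypothetical questionnaire data: rows are respondents, columns are
# e-gov variables; 1 means the respondent endorsed the attribute.
# Ranking by relative support is a simplified stand-in for the MAR
# measure, not the paper's exact computation.
variables = ["FC", "PE", "SQ"]
responses = np.array([
    [1, 1, 0],
    [1, 0, 1],
    [1, 1, 1],
    [0, 1, 0],
    [1, 0, 1],
])

support = responses.mean(axis=0)  # fraction of respondents per attribute
for name, s in sorted(zip(variables, support), key=lambda t: -t[1]):
    print(f"{name}: {s:.2f}")
```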

    Efficient intra- and inter-night linking of asteroid detections using kd-trees

    The Panoramic Survey Telescope And Rapid Response System (Pan-STARRS) under development at the University of Hawaii's Institute for Astronomy is creating the first fully automated end-to-end Moving Object Processing System (MOPS) in the world. It will be capable of identifying detections of moving objects in our solar system, linking those detections within and between nights, attributing those detections to known objects, calculating initial and differentially-corrected orbits for linked detections, precovering detections when they exist, and performing orbit identification. Here we describe new kd-tree and variable-tree algorithms that allow fast, efficient, scalable linking of intra- and inter-night detections. Using a pseudo-realistic simulation of the Pan-STARRS survey strategy incorporating weather, astrometric accuracy and false detections, we have achieved nearly 100% efficiency and accuracy for intra-night linking and nearly 100% efficiency for inter-night linking within a lunation. At realistic sky-plane densities for both real and false detections the intra-night linking of detections into 'tracks' currently has an accuracy of 0.3%. Successful tests of the MOPS on real source detections from the Spacewatch asteroid survey indicate that the MOPS is capable of identifying asteroids in real data.

    Comment: Accepted to Icarus
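
    To make the intra-night linking idea concrete, the toy sketch below uses a kd-tree to pair detections from two exposures that fall within an assumed maximum apparent-motion radius; the coordinates, tolerance, and flat-sky treatment are illustrative assumptions, not MOPS internals.

```python
import numpy as np
from scipy.spatial import cKDTree

# Toy intra-night linking: pair detections from two exposures that lie
# within an assumed maximum apparent-motion radius (flat-sky approximation).
rng = np.random.default_rng(0)
exposure1 = rng.uniform(0.0, 1.0, size=(10_000, 2))            # (RA, Dec) in degrees
exposure2 = exposure1 + rng.normal(0.0, 0.01, exposure1.shape)  # objects moved slightly

max_motion_deg = 0.05  # assumed upper bound on inter-exposure motion
tree = cKDTree(exposure2)
# For each detection in exposure 1, find candidate matches in exposure 2
candidates = tree.query_ball_point(exposure1, r=max_motion_deg)
tracklets = [(i, j) for i, js in enumerate(candidates) for j in js]
print(len(tracklets), "candidate intra-night pairs")
```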

    Active Learning of Multiple Source Multiple Destination Topologies

    We consider the problem of inferring the topology of a network with M sources and N receivers (hereafter referred to as an M-by-N network), by sending probes between the sources and receivers. Prior work has shown that this problem can be decomposed into two parts: first, infer smaller subnetwork components (i.e., 1-by-N's or 2-by-2's) and then merge these components to identify the M-by-N topology. In this paper, we focus on the second part, which had previously received less attention in the literature. In particular, we assume that a 1-by-N topology is given and that all 2-by-2 components can be queried and learned using end-to-end probes. The problem is which 2-by-2's to query and how to merge them with the given 1-by-N, so as to exactly identify the 2-by-N topology, and optimize a number of performance metrics, including the number of queries (which directly translates into measurement bandwidth), time complexity, and memory usage. We provide a lower bound, ⌈N/2⌉, on the number of 2-by-2's required by any active learning algorithm and propose two greedy algorithms. The first algorithm follows the framework of multiple hypothesis testing, in particular Generalized Binary Search (GBS), since our problem is one of active learning from 2-by-2 queries. The second algorithm is called the Receiver Elimination Algorithm (REA) and follows a bottom-up approach: at every step, it selects two receivers, queries the corresponding 2-by-2, and merges it with the given 1-by-N; it requires exactly N−1 steps, which is much less than all (N choose 2) possible 2-by-2's. Simulation results over synthetic and realistic topologies demonstrate that both algorithms correctly identify the 2-by-N topology and are near-optimal, but REA is more efficient in practice.
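
    The query-count comparison in the abstract is easy to check directly: REA needs exactly N−1 of the 2-by-2 queries, against the ⌈N/2⌉ lower bound and (N choose 2) for exhaustive querying.

```python
from math import ceil, comb

# Query counts quoted in the abstract, for a range of receiver counts N:
# exhaustive querying of all 2-by-2's, REA's N-1 queries, and the
# ceil(N/2) lower bound for any active learning algorithm.
for n in (4, 8, 16, 32, 64):
    print(f"N={n:3d}  all 2x2s={comb(n, 2):5d}  REA={n - 1:3d}  "
          f"lower bound={ceil(n / 2):3d}")
```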

    Human motion modeling and simulation by anatomical approach

    Instantly generating any desired realistic human motion remains a great challenge in virtual human simulation. In this paper, a novel emotion-affected motion classification and an anatomical motion classification are presented, together with motion capture and parameterisation methods. The framework for a novel anatomical approach to modelling human motion in the HTR (Hierarchical Translations and Rotations) file format is also described. This anatomical approach to human motion modelling has the potential to generate a practically unlimited range of desired human motions from a compact motion database. An architecture for the real-time generation of new motions is also proposed.
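
    As a sketch of the kind of data an HTR (Hierarchical Translations and Rotations) file carries, the structures below model a skeleton hierarchy with per-frame translations and rotations for each joint relative to its parent; the field names are illustrative, not the exact HTR specification.

```python
from dataclasses import dataclass, field

@dataclass
class JointTrack:
    """One joint: its place in the hierarchy plus per-frame motion."""
    name: str
    parent: str | None  # None for the root joint
    translations: list[tuple[float, float, float]] = field(default_factory=list)
    rotations: list[tuple[float, float, float]] = field(default_factory=list)  # Euler angles, degrees

@dataclass
class MotionClip:
    """A captured or generated motion: frame rate plus all joint tracks."""
    frame_rate: float
    joints: dict[str, JointTrack] = field(default_factory=dict)

    def children_of(self, joint_name: str) -> list[str]:
        # Walk the hierarchy: all joints whose parent is joint_name.
        return [n for n, j in self.joints.items() if j.parent == joint_name]
```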