4,857 research outputs found

    The Two-Component Virial Theorem and the Physical Properties of Stellar Systems

    Get PDF
    Motivated by present indirect evidence that galaxies are surrounded by dark matter halos, we investigate whether their physical properties can be described by a formulation of the virial theorem that explicitly takes into account the gravitational potential term representing the interaction of the dark halo with the baryonic, or luminous, component. Our analysis shows that the application of such a "two-component virial theorem" accounts not only for the scaling relations displayed, in particular, by elliptical galaxies, but also for the observed properties of all virialized stellar systems, ranging from globular clusters to galaxy clusters. Comment: 13 pages, 2 figures, LaTeX, corrected a few typos. This version matches the published version.
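
    As a rough illustration of the idea (notation assumed here, not necessarily that of the paper), the scalar form of such a two-component virial theorem can be sketched as

    \begin{equation}
      2T_B + W_{BB} + W_{BD} = 0 ,
    \end{equation}

    where $T_B$ is the kinetic energy of the baryonic (luminous) component, $W_{BB}$ its self-gravitational energy, and $W_{BD}$ the potential energy of the baryon-halo interaction; setting $W_{BD} = 0$ recovers the usual one-component virial theorem.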

    Structure and dynamics of the supercluster of galaxies SC0028-0005

    Get PDF
    According to the standard cosmological scenario, superclusters are objects that have just passed the turn-around point and are collapsing. The dynamics of very few superclusters have been analysed up to now. In this paper we study the supercluster SC0028-0005, at redshift 0.22, identify the most prominent groups and/or clusters that make up the supercluster, and investigate the dynamical state of this structure. For the membership identification, we have used photometric and spectroscopic data from SDSS-DR10, finding 6 main structures in a flat spatial distribution. We have also used a deep multi-band observation with MegaCam/CFHT to estimate the mass distribution through the weak-lensing effect. For the dynamical analysis, we have determined the relative distances along the line of sight within the supercluster using the Fundamental Plane of early-type galaxies. Finally, we have computed the peculiar velocities of each of the main structures. The 3D distribution suggests that SC0028-0005 is indeed a collapsing supercluster, supporting the formation scenario of these structures. Using the spherical collapse model, we estimate that the mass within $r = 10$ Mpc should lie between 4 and $16 \times 10^{15}\,M_\odot$. The farthest detected members of the supercluster suggest that within $\sim 60$ Mpc the density contrast is $\delta \sim 3$ with respect to the critical density at $z = 0.22$, implying a total mass of $\sim 4.6$--$16 \times 10^{17}\,M_\odot$, most of which is in the form of low-mass galaxy groups or smaller substructures. Comment: 12 pages, 9 figures, Accepted for publication in MNRAS
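
    As a back-of-envelope consistency check (assuming a flat $\Lambda$CDM cosmology with $\Omega_m \simeq 0.3$ and $h \simeq 0.7$; these values are assumptions, not taken from the paper), a sphere of radius $r$ with density contrast $\delta$ relative to the critical density encloses

    \begin{equation}
      M \simeq \frac{4\pi}{3}\, r^{3}\, \rho_c(z)\,(1+\delta)
        \simeq \frac{4\pi}{3}\,(60~\mathrm{Mpc})^{3}
        \times 1.7\times10^{11}\,M_\odot\,\mathrm{Mpc}^{-3} \times 4
        \approx 6\times10^{17}\,M_\odot ,
    \end{equation}

    which indeed falls inside the quoted $4.6$--$16\times10^{17}\,M_\odot$ range.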

    Function Allocation for Humans and Automation in the Context of Team Dynamics

    Get PDF
    Within Human Factors Engineering, a decision-making process called function allocation (FA) is used during the design life cycle of complex systems to distribute the system functions, typically identified through a functional requirements analysis, to all human and automated machine agents (or teammates) involved in controlling the system. Most FA methods make allocation decisions primarily by comparing the capabilities of humans and automation, and then by considering secondary factors such as cost, regulations, and the health and safety of workers. The primary analysis of the strengths and weaknesses of humans and machines, however, is almost always carried out in terms of individual human or machine capabilities. Yet FA is fundamentally about teamwork, in that the goal of the FA decision-making process is to determine the optimal allocation of functions among agents. Given this framing of FA, and the increasing use and sophistication of automation, there are two related social psychological issues that current FA methods need to address more thoroughly. First, many principles for effective human teamwork are not considered as central decision points or in the iterative hypothesis-and-testing phase of most FA methods, despite the fact that social factors have numerous positive and negative effects on individual and team capabilities. Second, social psychological factors affecting team performance can be difficult to translate to automated agents, and most FA methods currently do not account for this effect. The implications of these issues are discussed.

    Bibliografía de los peces de agua dulce de la Argentina Suplemento 1996-2002.

    Get PDF
    Between 1981 and 1995, we published five bibliographic lists (López et al., 1981, 1982, 1987, 1989 and 1995) that included the publications referring to Argentine freshwater fishes, as well as related general information published since Ringuelet et al. (1967). We included Uruguayan papers until 1989, when it became apparent that our access to those materials was not complete. Other bibliographic collections were published on several subjects (López et al., 1991, 1993, and Ferriz et al., 1998). In the foreword to the 1995 list, López stated that it was difficult to assess the actual usefulness of the lists, since they were seldom quoted in research papers. This consideration, along with the wide success of electronic databases, led us to discontinue the series, since its only goal had been to increase the knowledge of, and access to, local research papers and foreign publications of interest to local researchers. Regrettably, our experience indicates that access to scientific literature is still, if not difficult, somewhat arbitrary. Apart from that, the continuous work on diverse research lines at the División Zoología Vertebrados of the Museo de La Plata leads to the accumulation of a wealth of information which may be sorted out for the use of others without much difficulty. At present, electronically supported databases appear to be the simplest way to do this. The future will show whether this method is more efficient than the preceding one. In this issue we have included papers published between 1996 and 2002, and it is our purpose to update the list on a yearly basis. The papers included cover the Argentine fish fauna and some related subjects of more general interest. Naturally, some involuntary omissions will occur when addressing this subject. Any corrections and/or additions will be incorporated in following versions, and all information is most welcome.

    Signals of confinement in Green functions of SU(2) Yang-Mills theory

    Full text link
    The vortex picture of confinement is employed to explore the signals of confinement in Yang-Mills Green functions. Using SU(2) lattice gauge theory, it has been well established that the removal of the center vortices from the lattice configurations results in the loss of confinement. The running coupling constant and the gluon and ghost form factors are studied in Landau gauge for both cases, the full and the vortex-removed theory. In the latter case, a strong suppression of the running coupling constant and the gluon form factor at low momenta is observed. At the same time, the singularity of the ghost form factor at vanishing momentum disappears. This observation establishes an intimate correlation between the ghost singularity and confinement. The result also shows that a removal of the vortices generates a theory for which Zwanziger's horizon condition for confinement is no longer satisfied. Comment: 4 pages, 4 figures
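
    For context, in the convention commonly used in Landau-gauge studies the nonperturbative running coupling is assembled from the gluon form factor $Z(p^2)$ and the ghost form factor $G(p^2)$ (a standard definition assumed here, not quoted from the paper):

    \begin{equation}
      \alpha(p^2) \;=\; \alpha(\mu^2)\, Z(p^2,\mu^2)\, G^{2}(p^2,\mu^2) ,
    \end{equation}

    so an infrared suppression of $Z$ together with the loss of the ghost singularity after vortex removal directly flattens $\alpha(p^2)$ at low momenta.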

    Disaggregated Computing. An Evaluation of Current Trends for Datacentres

    Get PDF
    Next-generation data centers will likely be based on the emerging paradigm of disaggregated function-blocks-as-a-unit, departing from the current state of mainboard-as-a-unit. Multiple functional blocks or bricks, such as compute, memory and peripherals, will be spread through the entire system and interconnected via one or multiple high-speed networks. The amount of memory available will be very large and will be distributed among multiple bricks. This new architecture brings various benefits that are desirable in today's data centers, such as fine-grained technology upgrade cycles, fine-grained resource allocation, and access to a larger amount of memory and accelerators. An analysis of the impact and benefits of memory disaggregation is presented in this paper. One of the biggest challenges when analyzing these architectures is that memory accesses must be modeled correctly in order to obtain accurate results. However, modeling every memory access would generate a high overhead that can make the simulation unfeasible for real data center applications. A model to represent and analyze memory disaggregation has been designed, and a statistics-based, queuing-based full-system simulator was developed to rapidly and accurately analyze application performance in disaggregated systems. With a mean error of 10%, simulation results indicate that the network layers may introduce overheads that degrade applications' performance by up to 66%. Initial results also suggest that low memory-access bandwidth may degrade applications' performance by up to 20%. This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 687632 (dReDBox project) and TIN2015-65316-P - Computacion de Altas Prestaciones VII. Peer Reviewed. Postprint (published version)
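
    To make the modeling idea concrete, the following minimal Python sketch (an illustration only, with assumed latency numbers; it is not the dReDBox simulator described in the paper) shows how a queuing-style penalty on remote-memory accesses inflates the mean access latency:

    import random

    def remote_access_latency(local_ns=90.0, network_ns=500.0, utilization=0.6):
        # Toy model: a remote (disaggregated) access pays the local DRAM latency
        # plus a network round trip inflated by an M/M/1-style queueing delay,
        # which grows as utilization / (1 - utilization).
        queueing_ns = network_ns * utilization / (1.0 - utilization)
        return local_ns + network_ns + queueing_ns

    def mean_latency(n_accesses=100_000, remote_fraction=0.4, local_ns=90.0):
        # Average latency when a fraction of accesses lands on a remote memory
        # brick and the rest hit local DRAM (all parameters are assumptions).
        total = 0.0
        for _ in range(n_accesses):
            if random.random() < remote_fraction:
                total += remote_access_latency(local_ns=local_ns)
            else:
                total += local_ns
        return total / n_accesses

    if __name__ == "__main__":
        print(f"mean access latency: {mean_latency():.1f} ns")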

    Reachability problems for PAMs

    Get PDF
    Piecewise affine maps (PAMs) are frequently used as a reference model to show that reachability questions in other systems are open. The reachability problem for one-dimensional PAMs is still open even if we define it with only two intervals. As the main contribution of this paper we introduce new techniques for solving reachability problems based on p-adic norms and weights, and show decidability for two classes of maps. Then we show the connections between topological properties of PAM orbits, reachability problems and the representation of numbers in a rational base system. Finally we show a particular instance where the uniform distribution of the original orbit may not remain uniform or even dense after making regular shifts and taking the fractional part in that sequence. Comment: 16 pages
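
    As a minimal illustration of the object under study (coefficients chosen arbitrarily, not taken from the paper), a one-dimensional PAM with two intervals and the naive orbit-following approach to reachability can be sketched in Python as follows:

    from fractions import Fraction as F

    # A toy one-dimensional piecewise affine map on [0, 1) with two intervals:
    #   f(x) = (3/2) x        if x < 1/2
    #   f(x) = (3/4) x + 1/8  otherwise
    def pam(x: F) -> F:
        return F(3, 2) * x if x < F(1, 2) else F(3, 4) * x + F(1, 8)

    def reaches(x0: F, target: F, max_steps: int = 10_000) -> bool:
        # Semi-decision procedure: follow the orbit and report True if the
        # target is hit.  A False answer after max_steps proves nothing --
        # that gap is exactly the open reachability problem for such maps.
        x = x0
        for _ in range(max_steps):
            if x == target:
                return True
            x = pam(x)
        return False

    print(reaches(F(1, 3), F(1, 2)))  # True: f(1/3) = 1/2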

    Selective flexibility of side-chain residues improves VEGFR-2 docking score using autodock vina

    Get PDF
    Selective side-chain residue flexibility is an option available in the AutoDock Vina docking software. This approach is promising as it attempts to provide a more realistic ligand-protein interaction environment without an unmanageable increase in computer processing time. However, studies validating this approach are still scarce. VEGFR-2 (vascular endothelial growth factor receptor 2), a known protein target for anti-angiogenic agents, was used in this study. Four residues present in the VEGFR-2 kinase site were selected and made flexible: Lys866, Glu885, Cys917 and Asp1044. The docking scores for all possible combinations of flexible residues were compared to the docking scores obtained using a rigid conformation. The best overall docking scores were obtained using the Glu885 flexible conformation, with Pearson and Spearman rank correlation values of 0.568 and 0.543, respectively, and a 51% increase in computer processing time. Using different VEGFR-2 X-ray structures, a similar trend was observed, with the Glu885 flexible conformation presenting the best scores. This study demonstrates that careful use of selective side-chain residue flexibility can improve AutoDock Vina docking score accuracy without a significant increase in computer processing time. This methodology proved to be a valuable tool in drug design when using VEGFR-2, and will probably also be useful if applied to other protein targets. The authors are grateful to the Foundation for Science and Technology (Portugal) and COMPETE/QREN/EU for financial support through research project PTDC/QUI-QUI/111060/2009; Rui M.V. Abreu thanks grant SFRH/PROTEC/49450/2009.
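
    For readers unfamiliar with the workflow, the sketch below shows how such a run is typically launched once the receptor has been split into a rigid part and a flexible part; all file names, box parameters and the wrapper itself are placeholders, not the setup used in the study:

    import subprocess

    # Illustrative AutoDock Vina call with selective side-chain flexibility.
    # The receptor is prepared beforehand as a rigid PDBQT plus a flexible
    # PDBQT containing the chosen side chains (e.g. Lys866, Glu885, Cys917,
    # Asp1044); file names and search-box values here are placeholders.
    subprocess.run([
        "vina",
        "--receptor", "vegfr2_rigid.pdbqt",
        "--flex", "vegfr2_flex.pdbqt",
        "--ligand", "ligand.pdbqt",
        "--center_x", "0.0", "--center_y", "0.0", "--center_z", "0.0",
        "--size_x", "20", "--size_y", "20", "--size_z", "20",
        "--out", "docked_poses.pdbqt",
    ], check=True)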