1,705 research outputs found

    Dancing with black holes

    We describe efforts over the last six years to implement regularization methods suitable for studying one or more interacting black holes by direct N-body simulations. Three different methods have been adapted to large-N systems: (i) Time-Transformed Leapfrog, (ii) Wheel-Spoke, and (iii) Algorithmic Regularization. These methods have been tried out with some success on GRAPE-type computers. Special emphasis has also been devoted to including post-Newtonian terms, with application to moderately massive black holes in stellar clusters. Some examples of simulations leading to coalescence by gravitational radiation are presented to illustrate the practical usefulness of such methods.
    Comment: 8 figures, 10 pages, to appear in "Dynamical Evolution of Dense Stellar Systems", ed. E. Vesperini

    Accelerating NBODY6 with Graphics Processing Units

    We describe the use of Graphics Processing Units (GPUs) for speeding up the code NBODY6, which is widely used for direct N-body simulations. Over the years, the N^2 nature of the direct force calculation has proved a barrier to extending the particle number. Following an early introduction of force polynomials and individual time-steps, the calculation cost was first reduced by the introduction of a neighbour scheme. After a decade of GRAPE computers, which speeded up the force calculation further, we are now in the era of GPUs, where relatively small hardware systems are highly cost-effective. A significant gain in efficiency is achieved by employing the GPU to obtain the so-called regular force, which typically involves some 99 percent of the particles, while the remaining local forces are evaluated on the host. However, the latter operation is performed up to 20 times more frequently and may still account for a significant cost. This effort is reduced by parallel SSE/AVX procedures in which each interaction term is calculated using mainly single precision. We also discuss further strategies connected with the coordinate and velocity prediction required by the integration scheme. This leaves hard binaries and multiple close encounters, which are treated by several regularization methods. The present NBODY6-GPU code is well balanced for simulations in the particle range 10^4 - 2 x 10^5 on a dual GPU system attached to a standard PC.
    Comment: 8 pages, 3 figures, 2 tables, MNRAS accepted
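The N^2 direct summation discussed above is the cost that GPUs and GRAPE hardware were used to absorb. As a minimal sketch only (the function name and Plummer-softening parameter are illustrative, not taken from NBODY6):

```python
import math

def direct_sum_acc(pos, mass, eps2=1e-4):
    """O(N^2) direct-summation gravitational accelerations.

    pos  : list of [x, y, z] positions
    mass : list of particle masses (G = 1 units)
    eps2 : Plummer softening length squared (illustrative default)
    """
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = dx[0] ** 2 + dx[1] ** 2 + dx[2] ** 2 + eps2
            inv_r3 = 1.0 / (r2 * math.sqrt(r2))
            for k in range(3):
                acc[i][k] += mass[j] * dx[k] * inv_r3
    return acc
```

The double loop is what scales as N^2; the neighbour scheme mentioned in the abstract splits this sum into a slowly varying "regular" part over distant particles and a frequently re-evaluated "irregular" part over nearby ones.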

    6th and 8th Order Hermite Integrator for N-body Simulations

    We present sixth- and eighth-order Hermite integrators for astrophysical N-body simulations, which use the derivatives of accelerations up to second order (snap) and third order (crackle). These schemes do not require previous values for the corrector, and require only one previous value to construct the predictor. Thus, they are fairly easy to implement. The additional cost of calculating the higher-order derivatives is not very high: even for the eighth-order scheme, the number of floating-point operations for the force calculation is only about two times larger than for the traditional fourth-order Hermite scheme. The sixth-order scheme is better than the traditional fourth-order scheme in most cases, and when the required accuracy is very high, the eighth-order one is the best. These high-order schemes have several practical advantages. For example, they allow a larger number of particles to be integrated in parallel than the fourth-order scheme does, resulting in higher execution efficiency on both general-purpose parallel computers and GRAPE systems.
    Comment: 21 pages, 6 figures, New Astronomy accepted
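For orientation, the traditional fourth-order Hermite scheme that these higher-order integrators extend can be sketched on a one-dimensional harmonic oscillator (where a = -x and jerk = -v); the function names and step size below are illustrative:

```python
def hermite4_step(x, v, dt, acc_jerk):
    """One fourth-order Hermite predictor-corrector step.

    acc_jerk(x, v) must return (acceleration, jerk).
    """
    a0, j0 = acc_jerk(x, v)
    # Predictor: Taylor series using acceleration and jerk only.
    xp = x + v * dt + a0 * dt ** 2 / 2 + j0 * dt ** 3 / 6
    vp = v + a0 * dt + j0 * dt ** 2 / 2
    # Single force evaluation at the predicted point.
    a1, j1 = acc_jerk(xp, vp)
    # Corrector: Hermite interpolation of the force over the step.
    v1 = v + (a0 + a1) * dt / 2 + (j0 - j1) * dt ** 2 / 12
    x1 = x + (v + v1) * dt / 2 + (a0 - a1) * dt ** 2 / 12
    return x1, v1

def sho(x, v):
    # Harmonic oscillator test problem: a = -x, jerk = da/dt = -v.
    return -x, -v

x, v, dt = 1.0, 0.0, 0.01
for _ in range(1000):               # integrate to t = 10
    x, v = hermite4_step(x, v, dt, sho)
energy = 0.5 * (x * x + v * v)      # conserved quantity; exact value 0.5
```

The sixth- and eighth-order schemes of the abstract follow the same predict-evaluate-correct pattern, but the force evaluation additionally returns snap (and crackle), raising the order of both the predictor and the corrector.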

    Mergers and ejections of black holes in globular clusters

    We report on results of fully consistent N-body simulations of globular cluster models with N = 100,000 members containing neutron stars and black holes. Using the improved `algorithmic regularization' method of Hellström and Mikkola for compact subsystems, the new code NBODY7 enables, for the first time, general relativistic coalescence to be achieved with post-Newtonian terms and realistic parameters. Following an early stage of mass segregation, a few black holes form a small dense core which usually leads to the formation of one dominant binary. The subsequent evolution by dynamical shrinkage involves the competing processes of ejection and merger by radiation energy loss. Unless the binary is ejected, long-lived triple systems often exhibit Kozai cycles with extremely high inner eccentricity (e > 0.999) which may terminate in coalescence at a few Schwarzschild radii. A characteristic feature is that ordinary stars as well as black holes and even BH binaries are ejected with high velocities. On the basis of the models studied so far, the results suggest a limited growth of a few remaining stellar-mass black holes in globular clusters.
    Comment: 8 pages, 9 figures, accepted MNRAS, small typo corrected
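The extreme inner eccentricities quoted above can be related to the initial mutual inclination: in the standard quadrupole, test-particle limit of Kozai-Lidov theory, an initially circular inner orbit reaches e_max = sqrt(1 - (5/3) cos^2 i_0). A minimal sketch of this textbook formula (not taken from the NBODY7 code):

```python
import math

def kozai_e_max(i0_deg):
    """Maximum inner eccentricity of a Kozai cycle.

    Quadrupole test-particle limit, initially circular inner orbit;
    i0_deg is the initial mutual inclination in degrees.
    """
    c2 = math.cos(math.radians(i0_deg)) ** 2
    return math.sqrt(max(0.0, 1.0 - 5.0 * c2 / 3.0))
```

Below the critical inclination arccos(sqrt(3/5)) = 39.2 degrees the cycle does not operate (e_max = 0), while e > 0.999 requires inclinations within roughly two degrees of 90 degrees, consistent with such events being rare but decisive when they occur.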

    Star Cluster Simulations: The State of the Art

    This paper concentrates on four key tools for performing star cluster simulations developed during the last decade which are sufficient to handle all the relevant dynamical aspects. First we discuss briefly the Hermite integration scheme, which is simple to use and highly efficient for advancing the single particles. The main numerical challenge is in dealing with weakly and strongly perturbed hard binaries. A new treatment of the classical Kustaanheimo-Stiefel two-body regularization has proved to be more accurate for studying binaries than previous algorithms based on divided differences or Hermite integration. This formulation employs a Taylor series expansion combined with the Stumpff functions, still with one force evaluation per step, which gives exact solutions for unperturbed motion and is at least comparable to the polynomial methods for large perturbations. Strong interactions between hard binaries and single stars or other binaries are studied by chain regularization, which ensures an unbiased outcome for chaotic motions. A new semi-analytical stability criterion for hierarchical systems has been adopted, and the long-term effects on the inner binary are now treated by averaging techniques for cases of interest. These modifications describe consistent changes of the orbital variables due to large Kozai cycles and tidal dissipation. The range of astrophysical processes which can now be considered by N-body simulations includes tidal capture, circularization, mass transfer by Roche-lobe overflow, as well as physical collisions, where the masses and radii of individual stars are modelled by synthetic stellar evolution.
    Comment: Accepted by Cel. Mech. Dyn. Astron., 12 pages including figures
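The Stumpff functions that enter the Taylor-series treatment of Kustaanheimo-Stiefel regularization have the standard series definition c_k(z) = sum_{n>=0} (-z)^n / (k + 2n)!, and for z > 0 reduce to c_0(z) = cos(sqrt(z)) and c_1(z) = sin(sqrt(z))/sqrt(z). A direct-summation sketch for illustration only (the actual NBODY6 implementation differs):

```python
import math

def stumpff(k, z, terms=25):
    """Stumpff function c_k(z) = sum_{n>=0} (-z)^n / (k + 2n)!,
    evaluated by truncated series (adequate for small |z|)."""
    return sum((-z) ** n / math.factorial(k + 2 * n) for n in range(terms))
```

At z = 0 each c_k reduces to 1/k!, which is how the Stumpff-corrected Taylor series falls back to the ordinary one; the z-dependent terms are what make the unperturbed (Kepler) solution exact per step.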

    The Formation of a Bound Star Cluster: From the Orion Nebula Cluster to the Pleiades

    (shortened) Direct N-body calculations are presented of the formation of Galactic clusters using GasEx, a variant of the code Nbody6. The calculations focus on the possible evolution of the Orion Nebula Cluster (ONC) by assuming that the embedded OB stars explosively drove out 2/3 of its mass in the form of gas about 0.4 Myr ago. A bound cluster forms readily and survives for 150 Myr despite additional mass loss from the large number of massive stars and the Galactic tidal field. This is the first time that cluster formation has been obtained under such realistic conditions. The cluster contains about 1/3 of the initial 10^4 stars and resembles the Pleiades Cluster to a remarkable degree, implying that an ONC-like cluster may have been a precursor of the Pleiades. This scenario predicts the present expansion velocity of the ONC, which will be measurable by upcoming astrometric space missions (DIVA and GAIA). These missions should also detect the original Pleiades members as an associated expanding young Galactic-field sub-population. The results arrived at here suggest that Galactic clusters form as the nuclei of expanding OB associations.
    Comment: MNRAS, in press, 36 pages, 15 figures; replacement version contains adjustments for consistency with the published version

    ErzÀhlmechanismen 2.0: Ein abermaliger Blick auf die Digitalliteratur [Storytelling Mechanisms 2.0: Another Look at Digital Literature]

    This paper offers a discussion of how narrative texts work mechanically, based on the storytelling mechanism theory constructed by Yan Zheng (2016) and the cybertext theory introduced by Espen Aarseth (1997). It agrees with Zheng's finding that there are no intrinsic differences between digital literature and literature presented on other platforms in terms of their mechanical textual behaviours. However, this paper also points out the limitations of Zheng's theory in its previous stage. To address these, this paper differentiates the storytelling mechanism theory from its theoretical origin, cybertext theory, avoids the problems found in both theories, and further develops Zheng's typology of narrative texts to include 35 logical questions that provide more meticulous views for enquiring about a narrative text. Moreover, this paper constructs a map that can be used to demonstrate visually how a narrative text is produced with collaborative efforts from the three elements of the storytelling mechanism.
    • 

    corecore