
    Models for the 3-D axisymmetric gravitational potential of the Milky Way Galaxy - A detailed modelling of the Galactic disk

    Aims. Galaxy mass models based on simple, analytical density-potential pairs have been widely proposed in the literature. Disk models constrained by kinematic data alone provide information on the global disk structure only very near the Galactic plane. We attempt to circumvent this issue by constructing disk mass models whose three-dimensional structures are constrained by a recent Galactic star-counts model in the near-infrared and by observations of the hydrogen distribution in the disk. Our main aim is to provide models for the gravitational potential of the Galaxy that are fully analytical but also offer a more realistic description of the density distribution in the disk component. Methods. From the disk model directly based on the observations (here divided into the thin and thick stellar disks and the HI and H2 disk subcomponents), we produce fitted mass models by combining three Miyamoto-Nagai disk profiles of any "model order" (1, 2, or 3) for each disk subcomponent. The Miyamoto-Nagai disks are combined with models for the bulge and "dark halo" components, and the total set of parameters is adjusted to observational kinematic constraints. A model that includes a ring density structure in the disk, beyond the solar Galactic radius, is also investigated. Results. The Galactic mass models provide very good matches to the imposed observational constraints. In particular, the model with the ring density structure provides a greater contribution of the disk to the rotational support inside the solar circle. The gravitational potential models and their associated force fields are described in closed analytical form and are also compatible with our best knowledge of the stellar and gas distributions in the disk component. The gravitational potential models are well suited for investigations of orbits in the Galactic disk. Comment: 22 pages, 13 figures, 11 tables, accepted for publication in A&A
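    For context, the Miyamoto-Nagai density-potential pair used as the building block above has a well-known closed form, Φ(R, z) = −GM / √(R² + (a + √(z² + b²))²). The Python sketch below evaluates this potential and the mid-plane rotation-curve contribution of a sum of such disks; the masses and scale parameters are illustrative placeholders, not the fitted values from the paper.

```python
# Minimal sketch (not the paper's fitted model): a single Miyamoto-Nagai
# disk potential and its mid-plane rotation curve. The paper combines
# three such profiles per disk subcomponent; the parameter values below
# are illustrative placeholders, not the fitted values from the paper.
import numpy as np

G = 4.300917270e-6  # gravitational constant in kpc (km/s)^2 / Msun

def mn_potential(R, z, M, a, b):
    """Miyamoto-Nagai potential Phi(R, z) in (km/s)^2.

    R, z : cylindrical coordinates in kpc
    M    : disk mass in Msun
    a, b : radial scale length and vertical scale height in kpc
    """
    return -G * M / np.sqrt(R**2 + (a + np.sqrt(z**2 + b**2))**2)

def mn_vcirc(R, M, a, b):
    """Mid-plane circular velocity v_c(R) = sqrt(R dPhi/dR) in km/s."""
    return np.sqrt(G * M * R**2 / (R**2 + (a + b)**2)**1.5)

# A multi-component model is just the sum of the potentials, and v_c^2 is
# additive; e.g. three MN disks (hypothetical parameters) at the solar radius:
components = [(4.0e10, 3.0, 0.3), (1.5e10, 5.0, 0.3), (0.5e10, 7.0, 0.1)]
R_sun = 8.0  # kpc
v2 = sum(mn_vcirc(R_sun, M, a, b)**2 for M, a, b in components)
print(f"disk contribution to v_c at R_sun: {np.sqrt(v2):.1f} km/s")
```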

    Socio-economic and technological profile of soybean producers in Paraná.

    Sampling and data collection; analysis model; analysis and discussion of the data: land distribution and use; economic and social infrastructure; labor; machinery, equipment, and animals; production systems; farm management indicators; cooperativism; technology and input use; lime; fertilizers; green manure; soil decompaction; soil management methods; crop rotation; winter crop alternatives in Paraná; handling of chemical products; herbicide use and main weeds; soybean pests and the chemical products used; soybean diseases and their control; harvest losses; soybean seeds.

    Untangling Fine-Grained Code Changes

    After working for some time, developers commit their code changes to a version control system. When doing so, they often bundle unrelated changes (e.g., a bug fix and a refactoring) in a single commit, thus creating a so-called tangled commit. Sharing tangled commits is problematic because it makes review, reversion, and integration of these commits harder and historical analyses of the project less reliable. Researchers have worked on untangling existing commits, i.e., finding which part of a commit relates to which task. In this paper, we contribute to this line of work in two ways: (1) a publicly available dataset of untangled code changes, created with the help of two developers who accurately split their code changes into self-contained tasks over a period of four months; and (2) a novel approach, EpiceaUntangler, that helps developers share untangled commits (a.k.a. atomic commits) by using fine-grained code change information. EpiceaUntangler is built and tested on the publicly available dataset, and further evaluated by deploying it to 7 developers, who used it for 2 weeks. We recorded a median success rate of 91% and an average of 75% in automatically creating clusters of untangled fine-grained code changes.
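    The abstract does not spell out EpiceaUntangler's algorithm, but the general idea of grouping fine-grained change events into untangled clusters can be sketched as follows. The ChangeEvent fields, the similarity heuristic, and the threshold are hypothetical illustrations, not the tool's actual implementation.

```python
# Hypothetical sketch of the general idea behind untangling: score how
# related two fine-grained change events are, then group events whose
# similarity exceeds a threshold. These names and heuristics are
# illustrative; they are not EpiceaUntangler's API or algorithm.
from dataclasses import dataclass

@dataclass
class ChangeEvent:
    entity: str       # e.g. fully qualified method name
    package: str      # enclosing package
    timestamp: float  # seconds since the start of the session

def similarity(a: ChangeEvent, b: ChangeEvent) -> float:
    """Crude relatedness score in [0, 1] from code locality and recency."""
    score = 0.0
    if a.entity == b.entity:
        score += 0.6
    elif a.package == b.package:
        score += 0.3
    if abs(a.timestamp - b.timestamp) < 300:  # within 5 minutes
        score += 0.4
    return min(score, 1.0)

def untangle(events: list[ChangeEvent], threshold: float = 0.5):
    """Greedy single-link clustering: an event joins the first cluster
    containing an event it is sufficiently similar to."""
    clusters: list[list[ChangeEvent]] = []
    for ev in events:
        for cluster in clusters:
            if any(similarity(ev, other) >= threshold for other in cluster):
                cluster.append(ev)
                break
        else:
            clusters.append([ev])
    return clusters
```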

    The peculiar Na-O anticorrelation of the bulge globular cluster NGC 6440

    Context. Galactic Globular Clusters (GCs) are essential tools for understanding the earliest epochs of the Milky Way, since they are among the oldest objects in the Universe and can be used to trace its formation and evolution. Current studies, using high-resolution spectroscopy for many stars in each of a large sample of GCs, allow us to develop a detailed observational picture of their formation and their relation with the Galaxy. However, it is necessary to complete this picture by including GCs that belong to all major Galactic components, including the Bulge. Aims. Our aim is to perform a detailed chemical analysis of the bulge GC NGC 6440 in order to determine whether this object hosts Multiple Populations (MPs) and to investigate its relation with the Bulge of the Milky Way and with the other Galactic GCs, especially those associated with the Bulge, which remain largely poorly studied. Methods. We determined the stellar parameters and the chemical abundances of light elements (Na, Al), iron-peak elements (Fe, Sc, Mn, Co, Ni), α-elements (O, Mg, Si, Ca, Ti), and heavy elements (Ba, Eu) in seven red giant members of NGC 6440, using high-resolution spectroscopy from FLAMES@UVES. Results. We found a mean iron content of [Fe/H] = −0.50 ± 0.03 dex, in agreement with other studies, and no internal iron spread. On the other hand, Na and Al show a significant intrinsic spread, but the cluster exhibits no significant O-Na anticorrelation and no Mg-Al anticorrelation. The α-elements agree well with the Bulge field-star trend, although they lie at the high-α end and are also higher than those of other GCs of comparable metallicity. The heavy elements are dominated by the r-process, indicating a strong contribution from SNe II. The chemical analysis suggests an origin similar to that of the Bulge field stars. Comment: 12 pages, 13 figures, accepted for publication in A&A
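    As an aside, a standard way to test for an intrinsic abundance spread beyond the measurement errors (as in the "no internal iron spread" result) is a maximum-likelihood fit of a mean plus an intrinsic dispersion. A minimal sketch, using made-up abundance values rather than the paper's NGC 6440 measurements:

```python
# Minimal sketch (not the paper's pipeline): estimate the mean [Fe/H]
# and any intrinsic spread from per-star abundances with measurement
# errors, by maximizing the likelihood of x_i ~ N(mu, err_i^2 + s^2).
# The abundance values below are placeholders, not NGC 6440 data.
import numpy as np
from scipy.optimize import minimize

feh = np.array([-0.52, -0.48, -0.51, -0.49, -0.50, -0.53, -0.47])
err = np.full_like(feh, 0.05)  # per-star measurement errors in dex

def neg_log_like(params):
    mu, log_s = params
    var = err**2 + np.exp(log_s)**2  # measurement + intrinsic variance
    return 0.5 * np.sum(np.log(var) + (feh - mu)**2 / var)

res = minimize(neg_log_like, x0=[feh.mean(), np.log(0.05)])
mu, s = res.x[0], np.exp(res.x[1])
print(f"mean [Fe/H] = {mu:.2f} dex, intrinsic spread = {s:.3f} dex")
# An intrinsic spread consistent with zero supports a "no internal
# iron spread" conclusion.
```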

    Faster Than Real Time Tsunami Warning with Associated Hazard Uncertainties

    Tsunamis are unpredictable events, catastrophic in their potential for the destruction of human lives and the economy. The unpredictability of their occurrence poses a challenge to the tsunami community, as it is difficult to obtain estimates of recurrence rates and severity from the tsunamigenic records. Accurate and efficient mathematical/computational modeling is thus called upon to provide tsunami forecasts and hazard assessments. Compounding this challenge for warning centres is the physical nature of tsunamis, which can travel at extremely high speeds in the open ocean or be generated close to the shoreline. Thus, tsunami forecasts must be not only accurate but also delivered under severe time constraints. In the immediate aftermath of a tsunamigenic earthquake, there are uncertainties in the source, such as its location, rupture geometry, depth, and magnitude. Ideally, these uncertainties should be represented in a tsunami warning. In practice, however, quantifying the uncertainties in the hazard intensity (i.e., the maximum tsunami amplitude) due to the uncertainties in the source is not feasible, since it requires a large number of high-resolution simulations. We approximate the functionally complex and computationally expensive high-resolution tsunami simulations with a simple and cheap statistical emulator. A workflow integrating the entire chain of components, from the tsunami source to the quantification of hazard uncertainties, is developed here: quantification of uncertainties in tsunamigenic earthquake sources; high-resolution simulation of tsunami scenarios using the GPU version of Volna-OP2 on a non-uniform mesh for an ensemble of sources; construction of an emulator using the simulations as training data; and prediction of hazard intensities with associated uncertainties using the emulator. Thus, with the massively parallelized finite-volume tsunami code Volna-OP2 at the heart of the workflow, we use statistical emulation to compute uncertainties in hazard intensity at locations of interest. Such an integration also balances the trade-off between computationally expensive simulations and the desired accuracy of uncertainties, within given time constraints. The developed workflow is fully generic and independent of the source studied here (the 1945 Makran earthquake).
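    The abstract does not name the emulator family; a common choice for this kind of simulation surrogate is a Gaussian process. A minimal sketch of the emulation step under that assumption, with synthetic stand-ins for the Volna-OP2 training runs and hypothetical source-parameter ranges:

```python
# Minimal sketch of the emulation step, assuming a Gaussian-process
# emulator (the abstract says "statistical emulator" without naming one).
# Inputs: earthquake source parameters; output: maximum tsunami amplitude
# at one location of interest. The training data here are synthetic
# placeholders standing in for Volna-OP2 simulation results.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, Matern

rng = np.random.default_rng(0)

# Hypothetical source ensemble: columns = (magnitude, depth_km, lon, lat)
X_train = rng.uniform([7.5, 5.0, 60.0, 24.0],
                      [8.5, 30.0, 66.0, 26.0], size=(50, 4))
# Stand-in for simulated maximum amplitude (m) at one coastal gauge
y_train = (0.8 * (X_train[:, 0] - 7.0) ** 2 - 0.02 * X_train[:, 1]
           + 0.05 * rng.standard_normal(50))

kernel = ConstantKernel(1.0) * Matern(length_scale=[0.3, 5.0, 1.0, 0.5], nu=2.5)
emulator = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
emulator.fit(X_train, y_train)

# Prediction for a new source is near-instant, unlike a full simulation;
# the predictive std quantifies the hazard uncertainty at this gauge.
X_new = np.array([[8.1, 12.0, 63.0, 25.0]])
mean, std = emulator.predict(X_new, return_std=True)
print(f"predicted max amplitude: {mean[0]:.2f} m +/- {std[0]:.2f} m")
```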