
    The circumstellar disk of FS Tau B - A self-consistent model based on observations in the mid-infrared with NACO

    Protoplanetary disks are a byproduct of the star formation process. In the dense mid-plane of these disks, planetesimals and planets are expected to form. The first step in planet formation is the growth of dust particles from submicrometer-sized grains to macroscopic mm-sized aggregates. Grain growth is accompanied by radial drift and vertical segregation of the particles within the disk. To understand this essential evolutionary step, spatially resolved multi-wavelength observations as well as photometric data are necessary, since they reflect the properties of both the disk and the dust. We present the first spatially resolved image, obtained with NACO at the VLT in the L_p band, of the near edge-on protoplanetary disk FS Tau B. Based on this new image, a previously published Hubble image in the H band, and the spectral energy distribution from optical to millimeter wavelengths, we derive constraints on the spatial dust distribution and the progress of grain growth. For this purpose we perform disk modeling using the radiative transfer code MC3D. Radial drift and vertical sedimentation of the dust are not considered. We find a best-fit model featuring a disk that extends from 2 AU to several hundred AU with a moderately decreasing surface density and a disk mass of M_disk = 2.8 × 10⁻² M_⊙. The inclination amounts to i = 80°. Our findings indicate that substantial dust grain growth has taken place and that grains of a size equal to or larger than 1 mm are present in the disk. In conclusion, the parameters describing the vertical density distribution are better constrained than those describing the radial disk structure. (Comment: 10 pages, 9 figures, 2 tables)
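
A model of this kind is built on a parametrized dust density distribution that the radiative transfer code evaluates. The short Python sketch below illustrates one common parametrization of such a disk (a power-law radial profile with a flared Gaussian vertical profile); the function name and all numerical values are illustrative assumptions, not the MC3D setup or the fitted FS Tau B parameters.

```python
import numpy as np

def disk_density(r_au, z_au, r_in=2.0, r_out=300.0,
                 h0=10.0, r_ref=100.0, beta=1.1, alpha=2.1, rho0=1.0):
    """Illustrative flared-disk density rho(r, z) in arbitrary units.

    Power-law radial falloff with a Gaussian vertical profile whose scale
    height flares as h(r) = h0 * (r / r_ref)**beta. All parameter values
    are placeholders, not the fitted FS Tau B values.
    """
    r = np.asarray(r_au, dtype=float)
    z = np.asarray(z_au, dtype=float)
    h = h0 * (r / r_ref) ** beta                            # flared scale height
    rho = rho0 * (r / r_ref) ** (-alpha) * np.exp(-0.5 * (z / h) ** 2)
    return np.where((r >= r_in) & (r <= r_out), rho, 0.0)   # truncate at the disk edges

# Example: midplane density at 50 AU versus one scale height above the midplane
h_50 = 10.0 * (50.0 / 100.0) ** 1.1
print(disk_density(50.0, 0.0), disk_density(50.0, h_50))
```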

    Multi-wavelength modeling of young stellar objects with Monte Carlo methods

    The present work investigates Young Stellar Objects (YSOs), a class of astrophysical systems that has moved into the focus of research in recent decades with the discovery of protoplanetary disks and exoplanets. A protoplanetary disk consists mostly of gas and about 1% cosmic dust, which dominates the propagation of electromagnetic radiation in the continuum due to its high opacity and hence determines the thermal structure and appearance of the object. This dust component is subject to a characteristic differentiation caused by sedimentation and coagulation that is tightly coupled to disk evolution and, in particular, to the formation of planets. The aim of this work is the development of a method for modeling YSOs that enables the simultaneous fitting of observations from different sources, thereby reducing degeneracies. The core of this method is the solution of the inverse radiative transfer problem by recasting it as an optimization problem that can be solved using the Monte Carlo method Simulated Annealing (SA).
The first part of this work covers the fundamentals of star formation and the evolution of YSOs, the properties of cosmic dust, and its role in interstellar extinction. The continuum radiative transfer equation is derived from a kinetic model for elastic scattering. The second part describes the methods of this work. Observations with astronomical instruments and data reduction are introduced in the context of the observing campaigns carried out for this work. After an introduction to Monte Carlo (MC) methods, the Metropolis-Hastings algorithm is described, the implementation of the optimization method SA derived from it is presented, and the solution of the radiative transfer equation with MC is briefly outlined. The part concludes with modeling using the maximum-likelihood method and a comparison of the SA implementation presented here with the canonical approach of fitting against a precomputed model database. The third part consists of studies of the three systems HH 30, V4046 Sgr, and DoAr 33 with the method developed in part II. For each object, after a short introduction and a discussion of the most important previous studies, the observations used are presented, the chosen disk model is motivated, and the implementation of the study is described. Each study closes with a summary and discussion of the results.
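
As a generic illustration of the optimization approach described above (and not the thesis implementation), the following minimal Python sketch shows Simulated Annealing with a Metropolis acceptance step applied to an arbitrary chi-square function; all parameter names and values are assumptions.

```python
import numpy as np

def simulated_annealing(chi2, x0, step=0.1, t0=1.0, cooling=0.95, n_iter=2000, seed=0):
    """Minimal Simulated Annealing sketch with Metropolis acceptance.

    chi2: callable mapping a parameter vector to a goodness-of-fit value,
          e.g. a chi-square comparing simulated and observed images or SEDs.
    Generic textbook version, not the implementation described in the thesis.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    fx = chi2(x)
    best_x, best_fx = x.copy(), fx
    t = t0
    for _ in range(n_iter):
        x_new = x + rng.normal(scale=step, size=x.shape)   # propose a random step
        f_new = chi2(x_new)
        # Metropolis criterion: always accept improvements, sometimes accept worse states
        if f_new < fx or rng.random() < np.exp((fx - f_new) / t):
            x, fx = x_new, f_new
            if fx < best_fx:
                best_x, best_fx = x.copy(), fx
        t *= cooling                                       # lower the temperature
    return best_x, best_fx

# Toy usage: recover the minimum of a simple quadratic "chi-square" at (3, 3)
best, val = simulated_annealing(lambda p: float(np.sum((p - 3.0) ** 2)), x0=[0.0, 0.0])
print(best, val)
```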

    Energy efficiency: what has research delivered in the last 40 years?

    This article presents a critical assessment of 40 years of research that may be brought under the umbrella of energy efficiency, spanning different aggregations and domains: from individual producing and consuming agents, to economy-wide effects, to the role of innovation, to the influence of policy. After 40 years of research, energy efficiency initiatives are generally perceived as highly effective. Innovation has contributed to lowering energy technology costs and increasing energy productivity. Energy efficiency programs have in many cases reduced energy use per unit of economic output and have been associated with net improvements in welfare, emission reductions, or both. Rebound effects at the macro level still warrant careful policy attention, as they may be nontrivial. The complexity of energy efficiency dynamics calls for further methodological and empirical advances, multidisciplinary approaches, and granular data at the service level for research in this field to be of greatest societal benefit.

    Study of the B⁺ → J/ψΛ̄p decay in proton-proton collisions at √s = 8 TeV

    A study of the B⁺ → J/ψΛ̄p decay using proton-proton collision data collected at √s = 8 TeV by the CMS experiment at the LHC, corresponding to an integrated luminosity of 19.6 fb⁻¹, is presented. The ratio of branching fractions B(B⁺ → J/ψΛ̄p)/B(B⁺ → J/ψK*(892)⁺) is measured to be (1.054 ± 0.057 (stat) ± 0.035 (syst) ± 0.011 (B))%, where the last uncertainty reflects the uncertainties in the world-average branching fractions of the Λ̄ and K*(892)⁺ decays to the reconstructed final states. The invariant mass distributions of the J/ψΛ̄, J/ψp, and Λ̄p systems produced in the B⁺ → J/ψΛ̄p decay are investigated and found to be inconsistent with the pure phase space hypothesis. The analysis is extended by using a model-independent angular amplitude analysis, which shows that the observed invariant mass distributions are consistent with contributions from excited kaons decaying to the Λ̄p system.
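
For orientation, the three quoted uncertainty components of the measured ratio can be combined in quadrature under the usual assumption that they are independent; the short sketch below performs only that arithmetic and is not part of the CMS analysis.

```python
import math

# Measured ratio of branching fractions (in percent) and its quoted uncertainty components
ratio = 1.054
stat, syst, br = 0.057, 0.035, 0.011

# Assuming the components are independent, combine them in quadrature
total = math.sqrt(stat**2 + syst**2 + br**2)
print(f"B(B+ -> J/psi Lambda-bar p) / B(B+ -> J/psi K*(892)+) = ({ratio:.3f} +/- {total:.3f})%")
```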

    Search for new neutral Higgs bosons through the H → ZA → ℓ⁺ℓ⁻bb̄ process in pp collisions at √s = 13 TeV

    This paper reports on a search for an extension to the scalar sector of the standard model, where a new CP-even (odd) boson decays to a Z boson and a lighter CP-odd (even) boson, and the latter further decays to a b quark pair. The Z boson is reconstructed via its decays to electron or muon pairs. The analysed data were recorded in proton-proton collisions at a center-of-mass energy √s = 13 TeV, collected by the CMS experiment at the LHC during 2016, corresponding to an integrated luminosity of 35.9 fb⁻¹. Data and predictions from the standard model are in agreement within the uncertainties. Upper limits at 95% confidence level are set on the production cross section times branching fraction, with masses of the new bosons up to 1000 GeV. The results are interpreted in the context of the two-Higgs-doublet model.
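
As a rough illustration of what such a limit means, the sketch below computes a Bayesian 95% CL upper limit on the product of cross section and branching fraction for a single counting experiment with a flat signal prior. This is a pedagogical approximation, not the statistical procedure used in the paper, and all inputs except the 35.9 fb⁻¹ luminosity are hypothetical.

```python
from scipy.optimize import brentq
from scipy.stats import poisson

def upper_limit_xsec_times_br(n_obs, bkg, lumi_fb, eff, cl=0.95):
    """Toy Bayesian upper limit on sigma*B for one counting channel.

    Flat prior on the signal yield s >= 0; posterior proportional to
    Poisson(n_obs; s + bkg). Pedagogical approximation, not the CLs
    procedure used in LHC analyses.
    """
    norm = poisson.cdf(n_obs, bkg)  # integral of the likelihood over s >= 0
    def excess_coverage(s_up):
        return (norm - poisson.cdf(n_obs, s_up + bkg)) / norm - cl
    s_up = brentq(excess_coverage, 0.0, 1000.0)  # signal-yield limit in events
    return s_up / (lumi_fb * eff)                # cross-section limit in fb

# Hypothetical inputs for illustration (only the luminosity is from the paper)
print(upper_limit_xsec_times_br(n_obs=5, bkg=4.2, lumi_fb=35.9, eff=0.10))
```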