Near-infrared adaptive optics imaging of high redshift quasars
The properties of high redshift quasar host galaxies are studied, in order to
investigate the connection between galaxy evolution, nuclear activity, and the
formation of supermassive black holes. We combine new near-infrared
observations of three high redshift quasars (2 < z < 3), obtained at the ESO
Very Large Telescope equipped with adaptive optics, with selected data from the
literature. For the three new objects we were able to detect and characterize
the properties of the host galaxy, found to be consistent with those of massive
elliptical galaxies of M(R) ~ -24.7 for the one radio loud quasar, and M(R) ~
-23.8 for the two radio quiet quasars. When combined with existing data at
lower redshift, these new observations depict a scenario where the host
galaxies of radio loud quasars are seen to follow the expected trend of
luminous (~5L*) elliptical galaxies undergoing passive evolution. This trend is
remarkably similar to that followed by radio galaxies at z > 1.5. Radio quiet
quasar hosts also follow a similar trend, but at a lower average luminosity
(~0.5 mag dimmer). The data indicate that quasar host galaxies are already
fully formed at epochs as early as ~2 Gyr after the Big Bang and then passively
fade in luminosity to the present epoch. Comment: Accepted for publication in ApJ, 24 pages, 10 figures.
Physical limitations of phased array antennas
In this paper, the bounds on the Q-factor, a quantity inversely proportional
to bandwidth, are derived and investigated for narrow-band phased array
antennas. Arrays in free space and above a ground plane are considered. The
Q-factor bound is determined by solving a minimization problem over the
electric current density. The support of these current densities is on an
element-enclosing region, and the bound holds for lossless antenna elements
enclosed in this region. The Q-factor minimization problem is formulated as a
quadratically constrained quadratic optimization problem that is solved either
by a semi-definite relaxation or an eigenvalue-based method. We illustrate
numerically how these bounds can be used to determine trade-off relations
between the Q-factor and other design specifications: element form-factor,
size, efficiency, scanning capabilities, and polarization purity. Comment: 12 pages, 11 figures.
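A rough feel for the eigenvalue-based route mentioned above can be had from the following minimal Python sketch, which minimises a generalised Rayleigh quotient Q(I) = I^H W I / I^H R I over current vectors I. The matrices W and R here are random positive-definite placeholders standing in for the stored-energy and radiated-power operators on the element-enclosing region; this is not the authors' formulation.

    # Minimal sketch of an eigenvalue-based lower bound on the Q-factor.
    # W and R are random SPD placeholders for the method-of-moments matrices
    # assembled over the element-enclosing region.
    import numpy as np
    from scipy.linalg import eigh

    rng = np.random.default_rng(0)
    n = 40                                   # number of current basis functions

    def random_spd(n):
        a = rng.standard_normal((n, n))
        return a @ a.T + n * np.eye(n)       # well-conditioned SPD matrix

    W = random_spd(n)                        # stored-energy matrix (placeholder)
    R = random_spd(n)                        # radiated-power matrix (placeholder)

    # Q(I) = I^H W I / I^H R I is a generalised Rayleigh quotient; its minimum
    # over all current vectors I is the smallest generalised eigenvalue of (W, R).
    q_lower_bound = eigh(W, R, eigvals_only=True)[0]
    print(f"Q-factor lower bound for this (toy) element region: {q_lower_bound:.3f}")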
Temporal changes in outcome following intensive care unit treatment after traumatic brain injury : a 17-year experience in a large academic neurosurgical centre
Traumatic brain injury (TBI) is a major cause of morbidity and mortality. However, it remains undetermined whether long-term outcomes after TBI have improved over the past two decades. We conducted a retrospective analysis of consecutive TBI patients admitted to an academic neurosurgical ICU during 1999-2015. Primary outcomes of interest were 6-month all-cause mortality (available for all patients) and 6-month Glasgow Outcome Scale (GOS, available from 2005 onwards). GOS was dichotomized into favourable and unfavourable functional outcome. Temporal changes in outcome were assessed using multivariate logistic regression analysis, adjusting for age, sex, GCS motor score, pupillary light responsiveness, Marshall CT classification and major extracranial injury. Altogether, 3193 patients were included. During the study period, patient age and admission Glasgow Coma Scale score increased, while the overall TBI severity did not change. Overall unadjusted 6-month mortality was 25% and overall unadjusted unfavourable outcome (2005-2015) was 44%. There was no reduction in the adjusted odds of 6-month mortality (OR 0.98; 95% CI 0.96-1.00), but the adjusted odds of favourable functional outcome significantly increased (OR 1.08; 95% CI 1.04-1.11). Subgroup analysis showed outcome improvements only in specific subgroups (conservatively treated patients, moderate-to-severe TBI patients, middle-aged patients). During the past two decades, mortality after significant TBI has remained largely unchanged, but the odds of favourable functional outcome have increased significantly in specific subgroups, implying an improvement in quality of care. These developments have been paralleled by notable changes in patient characteristics, emphasizing the importance of continuous epidemiological monitoring.
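A minimal sketch of how such an adjusted temporal trend can be estimated (illustrative only; the column names and data layout are hypothetical, not taken from the study):

    # Illustrative sketch of the adjusted temporal-trend analysis described
    # above; not the study's code. 'df' is assumed to hold one row per patient
    # with hypothetical column names.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    def fit_temporal_trend(df: pd.DataFrame):
        """Adjusted odds ratio of 6-month mortality per admission year."""
        model = smf.logit(
            "mortality_6mo ~ admission_year + age + C(sex) + gcs_motor"
            " + C(pupils_reactive) + C(marshall_ct) + C(major_extracranial_injury)",
            data=df,
        ).fit(disp=False)
        or_per_year = float(np.exp(model.params["admission_year"]))
        ci_low, ci_high = np.exp(model.conf_int().loc["admission_year"])
        return or_per_year, (float(ci_low), float(ci_high))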
The cosmic evolution of quasar host galaxies
We present near-infrared imaging of the host galaxies of 17 quasars in the
redshift range 1 < z < 2, carried out at the ESO VLT UT1 8m telescope under
excellent seeing conditions (~0.4 arcsec). The sample includes radio-loud (RLQ)
and radio-quiet (RQQ) quasars with similar distribution of redshift and optical
luminosity. For all the observed objects but one we have derived the global
properties of the surrounding nebulosity. The host galaxies of both types of
quasars follow the expected trend in luminosity of massive ellipticals
undergoing simple passive evolution, but there is a systematic difference by a
factor ~2 in the host luminosity between RLQs and RQQs (M_K(RLQ) = -27.55 +-
0.12 and M_K(RQQ) = -26.83 +- 0.25). Comparison with quasar hosts at similar
and lower redshift indicates that the difference in the host luminosity between
RLQs and RQQs remains the same from z = 2 to the present epoch. No significant
correlation is found between the nuclear and the host luminosities. Assuming
that the host luminosity is proportional to the black hole mass, as observed in
nearby massive spheroids, these quasars emit at very different levels (spread
~1.5 dex) with respect to their Eddington luminosity and with the same
distribution for RLQs and RQQs. Apart from a factor of ~2 difference in
luminosity, the hosts of RLQs and RQQs appear to follow the same cosmic
evolution as massive inactive spheroids. Our results support a view where
nuclear activity can occur in all luminous ellipticals without producing a
significant change in their global properties and evolution. Quasar hosts
appear to be already well formed at z ~2, in disagreement with models for the
joint formation and evolution of galaxies and active nuclei based on the
hierarchical structure formation scenario. Comment: Astrophysical Journal, accepted; 34 pages.
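To make the Eddington-luminosity argument concrete, a purely illustrative sketch of the conversion chain from host M_K to an Eddington ratio is given below; the M_BH-L_K coefficients and the solar K-band magnitude are representative placeholder values, not those adopted in the paper.

    # Illustrative conversion from host K-band magnitude to an Eddington ratio,
    # following the chain of assumptions in the abstract (host light traces
    # black hole mass). All coefficients are placeholders.
    M_K_SUN = 3.28                    # absolute K magnitude of the Sun (approx.)

    def eddington_ratio(m_k_host: float, log_l_nuc_erg_s: float) -> float:
        """Nuclear luminosity relative to Eddington, estimated from host M_K."""
        log_lk_solar = -0.4 * (m_k_host - M_K_SUN)       # host K-band luminosity
        log_mbh = 8.2 + 1.1 * (log_lk_solar - 10.9)      # placeholder M_BH-L_K relation
        log_ledd = 38.1 + log_mbh                        # L_Edd ~ 1.26e38 (M/Msun) erg/s
        return 10 ** (log_l_nuc_erg_s - log_ledd)

    # Example: an RLQ host at M_K = -27.5 with nuclear luminosity 1e46 erg/s
    print(eddington_ratio(-27.5, 46.0))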
Freezing and chemical preservatives alter the stable isotope values of carbon and nitrogen of the Asiatic clam (Corbicula fluminea)
We tested the impacts of the most common sample preservation methods used for aquatic sample materials on the stable isotope ratios of carbon and nitrogen in clams, a typical baseline indicator organism for many aquatic food web studies utilising stable isotope analysis (SIA). In addition to the common chemical preservatives ethanol and formalin, we also assessed the potential impact of freezing on ÎŽ13C and ÎŽ15N values, and compared the preserved samples against freshly dried and analysed samples. All preservation methods, including freezing, had significant impacts on ÎŽ13C and ÎŽ15N values, and the effects were in general greater on the carbon isotope values (1.3-2.2% difference) than on the nitrogen isotope values (0.9-1.0% difference). However, the shifts produced by preservation were rather consistent within each method over the whole 1-year experiment, allowing them to be accounted for if clams are intended for use in retrospective stable isotope studies.
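Because the preservation shifts were consistent within each method, they can in principle be corrected for. A minimal sketch of such a correction follows; the offsets are placeholders, not the values reported in the study.

    # Minimal sketch of applying method-specific preservation corrections so
    # that preserved clam samples can be compared with fresh-dried reference
    # values. The offsets below are placeholders.
    PRESERVATION_OFFSETS = {          # mean (preserved - fresh) shift, per method
        "frozen":   {"d13C": -1.5, "d15N": +0.9},
        "ethanol":  {"d13C": +2.0, "d15N": +1.0},
        "formalin": {"d13C": -2.2, "d15N": +0.9},
    }

    def correct_isotope_value(value: float, isotope: str, method: str) -> float:
        """Remove the average shift introduced by a given preservation method."""
        return value - PRESERVATION_OFFSETS[method][isotope]

    # Example: an ethanol-preserved sample measured at d13C = -27.0
    print(correct_isotope_value(-27.0, "d13C", "ethanol"))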
Unconscious trauma patients: outcome differences between southern Finland and Germany - lesson learned from trauma-registry comparisons
PURPOSE: International trauma registry comparisons are scarce and lack standardised methodology. Recently, we performed a 6-year comparison between southern Finland and Germany. Because an outcome difference emerged in the subgroup of unconscious trauma patients, we aimed to identify factors associated with this difference and to further explore the role of trauma registries in evaluating trauma-care quality. METHODS: Unconscious patients [Glasgow Coma Scale (GCS) 3-8] with severe blunt trauma [Injury Severity Score (ISS) ≄ 16] from Helsinki University Hospital's trauma registry (TR-THEL) and the German Trauma Registry (TR-DGU) were compared from 2006 to 2011. The primary outcome measure was 30-day in-hospital mortality. Expected mortality was calculated with the Revised Injury Severity Classification (RISC) score. Patients were separated into clinically relevant subgroups, for which standardised mortality ratios (SMR) were calculated and compared between the two trauma registries in order to identify patient groups explaining the outcome differences. RESULTS: Of the 5243 patients included from the TR-DGU and 398 from the TR-THEL, nine subgroups were identified and analyzed separately. Poorer outcome appeared in the Finnish patients with penetrating head injury and in Finnish patients under 60 years with isolated head injury [TR-DGU SMR = 1.06 (95% CI = 0.94-1.18) vs. TR-THEL SMR = 2.35 (95% CI = 1.20-3.50), p = 0.001, and TR-DGU SMR = 1.01 (95% CI = 0.87-1.16) vs. TR-THEL SMR = 1.40 (95% CI = 0.99-1.81), p = 0.030]. A closer analysis of these subgroups in the TR-THEL revealed early treatment limitations due to their very poor prognosis, which was not accounted for by the RISC. CONCLUSION: Trauma registry comparison has several pitfalls that need acknowledgement: the explanation for outcome differences between trauma systems can be a coincidence, a weakness in the scoring system, true variation in the standard of care, or hospitals' reluctance to include patients with a hopeless prognosis in the registry. We believe, however, that such comparisons are a feasible method for quality control.
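A minimal sketch of the subgroup SMR comparison described above (the registry columns and the confidence-interval approximation are illustrative assumptions, not the registries' actual methodology):

    # Sketch of a standardised mortality ratio (SMR) with an approximate CI.
    # 'pred_mortality' stands in for RISC-predicted death probabilities.
    import numpy as np

    def smr_with_ci(died: np.ndarray, pred_mortality: np.ndarray, z: float = 1.96):
        """SMR = observed / expected deaths, with an approximate 95% CI."""
        observed = died.sum()
        expected = pred_mortality.sum()
        smr = observed / expected
        se = np.sqrt(observed) / expected        # Poisson approximation
        return smr, (smr - z * se, smr + z * se)

    # Example with toy data for one subgroup
    rng = np.random.default_rng(1)
    pred = rng.uniform(0.1, 0.8, size=200)
    died = (rng.uniform(size=200) < pred * 1.2).astype(int)   # worse than expected
    print(smr_with_ci(died, pred))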
Multimedia delivery in the future internet
The term "Networked Media" implies that all kinds of media, including text, images, 3D graphics, audio
and video, are produced, distributed, shared, managed and consumed on-line through various networks,
like the Internet, Fiber, WiFi, WiMAX, GPRS, 3G and so on, in a convergent manner [1]. This white
paper is the contribution of the Media Delivery Platform (MDP) cluster and aims to cover the
challenges facing Networked Media in the transition to the Future Internet.
The Internet has evolved and changed the way we work and live. End users of the Internet have been
confronted with a bewildering range of media, services and applications, and with technological
innovations concerning media formats, wireless networks, terminal types and capabilities, and there is
little evidence that the pace of this innovation is slowing. Today, over one billion users access the
Internet on a regular basis, more than 100 million users have downloaded at least one (multi)media file,
and over 47 million of them do so regularly, searching in more than 160 Exabytes of content. In the near
future these numbers are expected to rise exponentially. Internet content is expected to increase by at
least a factor of 6, rising to more than 990 Exabytes before 2012, fuelled mainly by the users themselves.
Moreover, it is envisaged that in the near- to mid-term future the Internet will provide the means to
share and distribute (new) multimedia content and services with superior quality and striking flexibility,
in a trusted and personalized way, improving citizens' quality of life, working conditions, edutainment and safety.
In this evolving environment, new transport protocols, new multimedia encoding schemes, cross-layer
in-the-network adaptation, machine-to-machine communication (including RFIDs), rich 3D content as well as
community networks and the use of peer-to-peer (P2P) overlays are expected to generate new models of
interaction and cooperation, and to support enhanced perceived quality of experience (PQoE) and
innovative applications 'on the move', like virtual collaboration environments, personalised
services/media, virtual sport groups, on-line gaming and edutainment. In this context, interaction with
content, combined with interactive/multimedia search capabilities across distributed repositories,
opportunistic P2P networks and dynamic adaptation to the characteristics of diverse mobile terminals,
is expected to contribute towards such a vision.
Based on work that has taken place in a number of EC co-funded projects in Framework Programme 6 (FP6)
and Framework Programme 7 (FP7), a group of experts and technology visionaries have voluntarily
contributed to this white paper, aiming to describe the status, the state of the art, the challenges and
the way ahead in the area of content-aware media delivery platforms.
Silvicultural Interventions Drive the Changes in Soil Organic Carbon in Romanian Forests According to Two Model Simulations
We investigated the effects of forest management on the carbon (C) dynamics in Romanian forest soils, using two model simulations: CBM-CFS3 and Yasso15. Default parametrization of the models and harmonized litterfall simulated by CBM provided satisfactory results when compared to observed data from the National Forest Inventory (NFI). We explored a stratification approach to investigate whether it improved the soil C predictions. For stratification on forest types only, the NRMSE (i.e., the normalized RMSE of simulated vs. NFI values) was approximately 26% for both models; the NRMSE fell to 13% when the stratification was based on climate only. Assuming the continuation of current forest management practices for a period of 50 years, both models simulated a very small C sink over the simulation period (0.05 Mg C ha^-1 yr^-1). Yet, a change towards extensive forest management practices would yield a constant, minor accumulation of soil C, while more intensive practices would yield a constant, minor loss of soil C. For the maximum wood supply scenario (the entire volume increment is removed by silvicultural interventions during the simulated period), Yasso15 resulted in larger emissions (-0.3 Mg C ha^-1 yr^-1) than CBM (-0.1 Mg C ha^-1 yr^-1). Under the 'no interventions' scenario, both models simulated a stable accumulation of C, which was nevertheless larger in Yasso15 (0.35 Mg C ha^-1 yr^-1) than in CBM-CFS3 (0.18 Mg C ha^-1 yr^-1). The simulation of C stock change showed a strong "start-up" effect during the first decade of the simulation for both models, explained by the difference in litterfall applied to each scenario compared to the spin-up scenario. Stratification at regional scale based on climate and forest types represented a reasonable spatial stratification that improved the prediction of soil C stock and stock change.
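A minimal sketch of the stratified NRMSE evaluation mentioned above (column names are hypothetical; NRMSE is taken as the RMSE normalised by the observed mean, in percent):

    # Sketch of a per-stratum normalised RMSE of simulated soil C against
    # observations. Column names ('obs_c', 'sim_c', 'stratum') are hypothetical.
    import numpy as np
    import pandas as pd

    def nrmse(obs: np.ndarray, sim: np.ndarray) -> float:
        """RMSE normalised by the mean of the observations, in percent."""
        rmse = np.sqrt(np.mean((sim - obs) ** 2))
        return 100.0 * rmse / np.mean(obs)

    def nrmse_by_stratum(df: pd.DataFrame) -> pd.Series:
        return df.groupby("stratum").apply(
            lambda g: nrmse(g["obs_c"].to_numpy(), g["sim_c"].to_numpy())
        )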
Fluctuating loops and glassy dynamics of a pinned line in two dimensions
We represent the slow, glassy equilibrium dynamics of a line in a
two-dimensional random potential landscape as driven by an array of
asymptotically independent two-state systems, or loops, fluctuating on all
length scales. The assumption of independence enables a fairly complete
analytic description. We obtain good agreement with Monte Carlo simulations
when the free energy barriers separating the two sides of a loop of size L are
drawn from a distribution whose width and mean scale as L^(1/3), in agreement
with recent results for scaling of such barriers.Comment: 11 pages, 4 Postscript figure