Changes in the trajectory of the radio jet in 0735+178?
We present multi-epoch 8.4 and 43 GHz Very Long Baseline Array images of the
BL Lac object 0735+178. The images confirm the presence of a twisted jet with
two sharp apparent bends of about 90 degrees within two milliarcseconds of the
core, resembling a helix in projection. The observed twisted geometry could be
the result of precession of the jet inlet, but is more likely produced by
pressure gradients in the external medium through which the jet propagates.
Quasi-stationary components are observed at the locations of the 90-degree
bends, possibly produced by differential Doppler boosting. Identification of
components across epochs, since the earliest VLBI observations of this source
in 1979.2, proves difficult due to the sometimes large time gaps between
observations. One possible identification suggests the existence of
superluminal components following non-ballistic trajectories. However, in
images obtained after mid-1995,
components show a remarkable tendency to cluster near several jet positions,
suggesting a different scenario in which components have remained nearly
stationary in time at least since mid-1995. Comparison with the earlier
published data, covering more than 19 years of observations, suggests a
striking qualitative change in the jet trajectory sometime between mid-1992 and
mid-1995, with the twisted jet structure with stationary components becoming
apparent only at the later epochs. This would require a re-evaluation of the
physical parameters estimated for 0735+178, such as the viewing
angle, the plasma bulk Lorentz factor, and those deduced from them.

Comment: 18 pages, 5 figures, accepted for publication in MNRAS
A framework for the simulation of structural software evolution
This is the author's accepted manuscript. The final published article is available from the link below. Copyright @ 2008 ACM.

As functionality is added to an aging piece of software, its original design and structure will tend to erode. This can lead to high coupling, low cohesion and other undesirable effects associated with spaghetti architectures. The underlying forces that cause such degradation have been the subject of much research. However, progress in this field is slow, as its complexity makes it difficult to isolate the causal flows leading to these effects. This is further complicated by the difficulty of generating enough empirical data, in sufficient quantity, and attributing such data to specific points in the causal chain. This article describes a framework for simulating the structural evolution of software. A complete simulation model is built by incrementally adding modules to the framework, each of which contributes an individual evolutionary effect. These effects are then combined to form a multifaceted simulation that evolves a fictitious code base in a manner approximating real-world behavior. We describe the underlying principles and structures of our framework from a theoretical and user perspective; a validation of a simple set of evolutionary parameters is then provided, and three empirical software studies generated from open-source software (OSS) are used to support claims and generated results. The research illustrates how simulation can be used to investigate a complex and under-researched area of the development cycle. It also shows the value of incorporating certain human traits into a simulation: factors that, in real-world system development, can significantly influence evolutionary structures.
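The modular idea described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' framework: the code base is modeled as a module-dependency map, and each "effect" is a function that mutates it in one way; composing effects gives an incremental, multifaceted simulation.

```python
import random

# Hypothetical sketch of a modular evolution simulator: the fictitious
# code base is a dict mapping module names to their dependency sets.

def add_feature(codebase, rng):
    """Growth effect: add a new module depending on one existing module."""
    new = f"m{len(codebase)}"
    codebase[new] = {rng.choice(list(codebase))} if codebase else set()

def couple(codebase, rng):
    """Erosion effect: add a dependency between two random modules."""
    if len(codebase) < 2:
        return
    a, b = rng.sample(list(codebase), 2)
    codebase[a].add(b)

def simulate(effects, steps, seed=0):
    """Apply every effect once per step, starting from a single module."""
    rng = random.Random(seed)
    codebase = {"m0": set()}
    for _ in range(steps):
        for effect in effects:
            effect(codebase, rng)
    return codebase

cb = simulate([add_feature, couple], steps=50)
avg_coupling = sum(len(deps) for deps in cb.values()) / len(cb)
print(len(cb), round(avg_coupling, 2))
```

Adding a further effect (e.g. a refactoring step that removes dependencies) is just another function in the `effects` list, which mirrors the paper's point about building the model incrementally.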
VLBI and Single Dish Monitoring of 3C84 in the Period of 2009-2011
The radio galaxy 3C 84 is a representative of gamma-ray-bright misaligned
active galactic nuclei (AGNs) and one of the best laboratories to study the
radio properties of the sub-pc jet in connection with the gamma-ray emission.
In order to identify possible radio counterparts of the gamma-ray emissions in
3C 84, we study the change in structure within the central 1 pc and the light
curve of sub-pc-size components C1, C2, and C3. We search for any correlation
between changes in the radio components and the gamma-ray flares by making use
of VLBI and single dish data. Throughout the radio monitoring spanning over two
GeV gamma-ray flares detected by the {\it Fermi}-LAT and the MAGIC Cherenkov
Telescope in the periods of 2009 April to May and 2010 June to August, the
total radio flux density increases on average. This flux increase mostly
originates in C3. Although the gamma-ray flares occur on timescales of days
to weeks, no clear correlation with the radio light curve on this timescale is
found. No new prominent components or morphological changes associated with
the gamma-ray flares are found in the VLBI images.

Comment: 6 pages, 3 figures, accepted for publication in MNRAS Letters
The milliarcsecond-scale jet of PKS 0735+178 during quiescence
We present polarimetric 5 GHz to 43 GHz VLBI observations of the BL Lacertae
object PKS 0735+178, spanning March 1996 to May 2000. Comparison with previous
and later observations suggests that the overall kinematic and structural
properties of the jet are greatly influenced by its activity. Time intervals of
enhanced activity, as reported before 1993 and after 2000 by other studies, are
followed by highly superluminal motion along a rectilinear jet. In contrast,
the less active state during which we performed our observations shows subluminal or
slow superluminal jet features propagating through a twisted jet with two sharp
bends of about 90 deg. within the innermost three-milliarcsecond jet structure.
Proper motion estimates from the data presented here allow us to constrain the
jet viewing angle to values < 9 deg., and the bulk Lorentz factor to be between
2 and 4.

Comment: 11 pages, 12 figures. Accepted for publication in A&A
Building Program Vector Representations for Deep Learning
Deep learning has made significant breakthroughs in various fields of
artificial intelligence. Advantages of deep learning include the ability to
capture highly complicated features with little manual feature engineering.
However, it is still virtually impossible to use deep learning to analyze
programs since deep architectures cannot be trained effectively with pure back
propagation. In this pioneering paper, we propose the "coding criterion" to
build program vector representations, which are the premise of deep learning
for program analysis. Our representation learning approach directly makes deep
learning a reality in this new field. We evaluate the learned vector
representations both qualitatively and quantitatively. We conclude, based on
the experiments, that the coding criterion is successful in building program
representations. To evaluate whether deep learning is beneficial for program
analysis, we feed the representations to deep neural networks, and achieve
higher accuracy in the program classification task than "shallow" methods, such
as logistic regression and the support vector machine. This result confirms the
feasibility of deep learning to analyze programs. It also gives preliminary
evidence of its success in this new field. We believe deep learning will become
an outstanding technique for program analysis in the near future.

Comment: This paper was submitted to ICSE'1
Imaging an Event Horizon: Mitigation of Source Variability of Sagittarius A*
The black hole in the center of the Galaxy, associated with the compact
source Sagittarius A* (Sgr A*), is predicted to cast a shadow upon the emission
of the surrounding plasma flow, which encodes the influence of general
relativity in the strong-field regime. The Event Horizon Telescope (EHT) is a
Very Long Baseline Interferometry (VLBI) network with a goal of imaging nearby
supermassive black holes (in particular Sgr A* and M87) with angular resolution
sufficient to observe strong gravity effects near the event horizon. General
relativistic magnetohydrodynamic (GRMHD) simulations show that radio emission
from Sgr A* exhibits variability on timescales of minutes, much shorter than
the duration of a typical VLBI imaging experiment, which usually takes several
hours. A changing source structure during the observations, however, violates
one of the basic assumptions needed for aperture synthesis in radio
interferometry imaging to work. By simulating realistic EHT observations of a
model movie of Sgr A*, we demonstrate that an image of the average quiescent
emission, featuring the characteristic black hole shadow and photon ring
predicted by general relativity, can nonetheless be obtained by observing over
multiple days and subsequent processing of the visibilities (scaling,
averaging, and smoothing) before imaging. Moreover, it is shown that this
procedure can be combined with an existing method to mitigate the effects of
interstellar scattering. Taken together, these techniques allow the black hole
shadow in the Galactic center to be recovered in the reconstructed image.

Comment: 10 pages, 12 figures, accepted for publication in ApJ
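The visibility-processing step described above (scaling, then averaging) can be sketched with synthetic data. This is a hedged toy model, not the EHT pipeline: each "day" observes the same static complex visibilities plus an independent variable component and an arbitrary flux scale; normalizing each day and averaging suppresses the variability relative to any single-day snapshot.

```python
import numpy as np

# Toy model of variability mitigation: average per-day complex
# visibilities of a time-variable source after scaling each day's data,
# so the static (quiescent) structure dominates the result.

rng = np.random.default_rng(1)
n_days, n_vis = 10, 256

# Static structure (stand-in for the quiescent emission).
static = rng.normal(size=n_vis) + 1j * rng.normal(size=n_vis)

days = []
for _ in range(n_days):
    # Independent variable component and day-to-day flux variation.
    flare = 0.5 * (rng.normal(size=n_vis) + 1j * rng.normal(size=n_vis))
    total_flux = 1.0 + 0.3 * rng.random()
    days.append(total_flux * (static + flare))

# Crude per-day scaling, then averaging across days.
scaled = [d / np.abs(d).mean() for d in days]
avg = np.mean(scaled, axis=0)

# Averaging should bring the data closer to the static structure than
# any single day's snapshot is.
static_n = static / np.abs(static).mean()
err_avg = np.abs(avg - static_n).mean()
err_day = np.abs(scaled[0] - static_n).mean()
print(err_avg < err_day)
```

The reduction follows from the independent variable components averaging toward zero across days, the same statistical idea behind observing over multiple days before imaging.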
Current Results from the RRFID Kinematic Survey: Apparent Speeds from the First Five Years of Data
We present current results from our ongoing project to study the parsec-scale
relativistic jet kinematics of sources in the U.S. Naval Observatory's Radio
Reference Frame Image Database (RRFID). The RRFID consists of snapshot
observations using the VLBA plus up to 9 additional antennas at 8 and 2 GHz.
The Image Database currently contains about 3000 images of 450 sources from
1994 to 2004, with some sources having images at 20 epochs or more. We have now
completed analysis of the 8 GHz images for all sources observed at 3 or more
epochs from 1994 to 1998. The completed analysis comprises 966 images of 87
sources, or an average of 11 epochs per source. Apparent jet speeds have been
measured for these sources, and the resulting speed distribution has been
compared with results obtained by other large VLBI surveys. The measured
apparent speed distribution agrees with those found by the 2 cm survey and
Caltech-Jodrell Bank (CJ) survey; however, when a source-by-source comparison
is done with the 2 cm survey results, significant disagreement is found in the
apparent speed measurements for a number of sources. This disagreement can be
traced in most cases to either an insufficient time baseline for the current
RRFID results, or to apparent component mis-identification in the 2 cm survey
results caused by insufficient time sampling. These results emphasize the need
for long time baselines and dense time sampling for multi-epoch monitoring of
relativistic jets.

Comment: 4 pages, to be published in the Proceedings of the 7th European VLBI
Network Symposium
Multicriteria Analysis of Neural Network Forecasting Models: An Application to German Regional Labour Markets
This paper develops a flexible multi-dimensional assessment method for the comparison of different statistical-econometric techniques based on learning mechanisms, with a view to analysing and forecasting regional labour markets. The aim of this paper is twofold. A first major objective is to explore the use of a standard choice tool, namely Multicriteria Analysis (MCA), in order to cope with the intrinsic methodological uncertainty in the choice of a suitable statistical-econometric learning technique for regional labour market analysis. MCA is applied here to support choices on the performance of various models (based on classes of Neural Network (NN) techniques) that serve to generate employment forecasts in West Germany at a regional/district level. A second objective of the paper is to analyse the methodological potential of a blend of approaches (NN-MCA) in order to extend the analysis framework to other economic research domains where formal models are not available but a variety of statistical data is present. The paper offers a basis for a more balanced judgement of the performance of rival statistical tests.
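A minimal form of the MCA idea can be sketched as a weighted sum of normalized criteria. The model names, criteria, and numbers below are hypothetical, purely to show the mechanics of ranking competing forecasting models; lower raw values are treated as better for every criterion.

```python
# Hedged sketch of simple multicriteria analysis (MCA): rank models by a
# weighted sum of min-max-normalized criteria (all hypothetical values;
# lower is better for each criterion).

models = {
    "NN-A": {"rmse": 0.12, "train_min": 30, "instability": 0.2},
    "NN-B": {"rmse": 0.10, "train_min": 90, "instability": 0.5},
    "baseline": {"rmse": 0.15, "train_min": 1, "instability": 0.1},
}
weights = {"rmse": 0.6, "train_min": 0.2, "instability": 0.2}

def mca_score(name):
    """Weighted sum of criteria, each min-max normalized to [0, 1]."""
    total = 0.0
    for crit, w in weights.items():
        vals = [m[crit] for m in models.values()]
        lo, hi = min(vals), max(vals)
        norm = (models[name][crit] - lo) / (hi - lo)
        total += w * norm
    return total

ranking = sorted(models, key=mca_score)
print(ranking)  # -> ['NN-A', 'NN-B', 'baseline']
```

The weights encode the analyst's priorities (here, forecast accuracy dominates), which is exactly the kind of explicit trade-off judgement the paper argues MCA makes transparent when comparing rival techniques.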