Inside-out growth or inside-out quenching? Clues from colour gradients of local galaxies
We constrain the spatial gradient of star formation history within galaxies
using the colour gradients in NUV-u and u-i for a local spatially-resolved
galaxy sample. By splitting each galaxy into an inner and an outer part, we
find that most galaxies show negative gradients in these two colours. We first
rule out gradients in dust extinction and in metallicity as the dominant
sources of the colour gradients. Then, using stellar population models, we
explore variations in star formation history to explain the colour gradients.
As shown by our earlier work, a two-phase SFH consisting of an early secular
evolution (growth) phase and a subsequent rapid evolution (quenching) phase is
necessary to explain the observed colour distributions among galaxies. We
explore two different inside-out growth models and two different inside-out
quenching models by varying parameters of the SFH between inner and outer
regions of galaxies. Two of the models can explain the observed range of colour
gradients in NUV-u and u-i colours. We further distinguish them using an
additional constraint provided by the u-i colour gradient distribution, under
the assumption of constant galaxy formation rate and a common SFH followed by
most galaxies. We find the best model is an inside-out growth model in which
the inner region has a shorter e-folding time scale in the growth phase than
the outer region. More spatially resolved ultraviolet (UV) observations are
needed to improve the significance of the result.
Comment: 11 pages, 7 figures, accepted for publication in MNRAS
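To make the two-phase SFH described above concrete, here is a minimal Python sketch, assuming an exponentially declining star formation rate with e-folding timescale tau_growth in the growth phase and a faster exponential decline after a quenching time t_q; the functional forms and all parameter values are illustrative assumptions, not taken from the paper.

    import numpy as np

    # Two-phase SFH sketch: secular growth with e-folding timescale
    # tau_growth, then rapid quenching with timescale tau_quench after
    # time t_q. All values are illustrative.
    def sfr(t, tau_growth, t_q, tau_quench):
        growth = np.exp(-t / tau_growth)
        quench = np.exp(-t_q / tau_growth) * np.exp(-(t - t_q) / tau_quench)
        return np.where(t < t_q, growth, quench)

    t = np.linspace(0.0, 13.0, 1000)   # cosmic time in Gyr
    # Inside-out growth: the inner region has the shorter e-folding
    # timescale, so it assembles its stellar mass earlier.
    sfr_inner = sfr(t, tau_growth=2.0, t_q=10.0, tau_quench=0.5)
    sfr_outer = sfr(t, tau_growth=5.0, t_q=10.0, tau_quench=0.5)

    def mass_weighted_age(t, s):
        # Mean age (at the final time t[-1]) of stars formed at rate s(t).
        return np.sum((t[-1] - t) * s) / np.sum(s)

    # The inner region comes out older, hence redder, which is the sense
    # of the negative NUV-u and u-i gradients discussed above.
    print(mass_weighted_age(t, sfr_inner), mass_weighted_age(t, sfr_outer))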
D$^2$: Decentralized Training over Decentralized Data
When training a machine learning model with multiple workers, each of which
collects data from its own sources, it is most useful when the data collected
by different workers are {\em unique} and {\em different}.
Ironically, recent analysis of decentralized parallel stochastic gradient
descent (D-PSGD) relies on the assumption that the data hosted on different
workers are {\em not too different}. In this paper, we ask the question: {\em
Can we design a decentralized parallel stochastic gradient descent algorithm
that is less sensitive to the data variance across workers?} We present
D$^2$, a novel decentralized parallel stochastic gradient descent algorithm
designed for large data variance among workers (imprecisely, "decentralized"
data). The core of D$^2$ is a variance reduction extension of
the standard D-PSGD algorithm, which improves the convergence rate from
$O\left(\frac{\sigma}{\sqrt{nT}} + \frac{(n\zeta^2)^{1/3}}{T^{2/3}}\right)$ to
$O\left(\frac{\sigma}{\sqrt{nT}}\right)$, where $\zeta^2$ denotes the variance
among data on different workers. As a result, D$^2$ is
robust to data variance among workers. We empirically evaluate D$^2$ on image
classification tasks in which each worker has access to only the data of a
limited set of labels, and find that D$^2$ significantly outperforms D-PSGD.
Design and Evaluation of Digital Baseband Converter Sub-channel Delay Compensation Method on Bandwidth Synthesis
The effect of sub-channel delay on bandwidth synthesis is investigated in order to eliminate the “phase step” phenomenon observed in bandwidth synthesis during tests of the CDBE (Chinese Digital Backend). Through formula derivation, we show that sub-channel delay can cause phase discontinuities between different sub-channels. Theoretical analysis shows that sub-channel delay can induce bandwidth synthesis errors in group delay measurements of a linear system. Furthermore, in the differential delay measurement between two stations, bandwidth synthesis errors may occur when the LO (Local Oscillator) frequency differences of corresponding sub-channels are not identical. Error-free conditions are discussed for different applications. The phase errors among different sub-channels can be removed manually, but the most effective approach is to compensate for the sub-channel delay. A sub-channel delay calculation method based on Modelsim is proposed, and the compensation method is detailed. Simulations and field experiments are presented to verify the approach.
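As a rough illustration of the “phase step” effect, the Python sketch below models an uncompensated delay tau_k in sub-channel k as a phase slope 2*pi*(f - f_LO,k)*tau_k within that sub-channel, which is discontinuous at the sub-channel boundaries; the bandwidth, LO frequencies, and delay values are invented for illustration and are not CDBE parameters.

    import numpy as np

    bw = 32e6                                          # sub-channel bandwidth (Hz)
    lo = np.array([0.0, 32e6, 64e6, 96e6])             # sub-channel LO frequencies
    tau = np.array([1.2e-9, 3.5e-9, -0.8e-9, 2.1e-9])  # per-sub-channel delays (s)

    f = np.linspace(0, 128e6, 4096, endpoint=False)    # synthesized band
    k = (f // bw).astype(int)                          # sub-channel index per bin

    # Uncompensated phase: linear within each sub-channel, with steps
    # at the sub-channel boundaries ("phase step").
    phase = 2 * np.pi * (f - lo[k]) * tau[k]

    # Compensation: remove the per-sub-channel phase slope using known
    # delays (the paper obtains them via a Modelsim-based calculation).
    compensated = phase - 2 * np.pi * (f - lo[k]) * tau[k]

    print(np.abs(np.diff(phase)).max())   # large jumps at the boundaries
    print(np.abs(compensated).max())      # zero once the delays are exact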
- …