
    The evolution of the X-ray phase lags during the outbursts of the black hole candidate GX 339-4

    Owing to the frequency and reproducibility of its outbursts, the black-hole candidate GX 339-4 has become the standard against which the outbursts of other black-hole candidates are compared. Here we present the first systematic study of the evolution of the X-ray lags of the broad-band variability component (0.008-5 Hz) in GX 339-4 as a function of the position of the source in the hardness-intensity diagram. The hard photons always lag the soft ones, consistent with previous results. In the low-hard state the lags correlate with X-ray intensity, and as the source starts the transition to the intermediate/soft states, the lags first increase more rapidly and then appear to reach a maximum, although the exact evolution depends on the outburst and the energy band used to calculate the lags. The time of the maximum of the lags appears to coincide with a sudden drop of the optical/NIR flux and of the fractional rms amplitude of the broad-band component in the power spectrum, and with the appearance of a thermal component in the X-ray spectra, strongly suggesting that the lags can be very useful for understanding the physical changes that GX 339-4 undergoes during an outburst. We find strong evidence for a connection between the evolution of the cut-off energy of the hard component in the energy spectrum and the phase lags, suggesting that the average magnitude of the lags is correlated with the properties of the corona/jet rather than those of the disc. Finally, we show that the lags in GX 339-4 evolve in a similar manner to those of the black-hole candidate Cygnus X-1, suggesting that similar phenomena could be observable in other black-hole systems. Comment: 13 pages, 8 figures, accepted for publication in MNRAS
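    As a concrete illustration of the lag measurement described above, the sketch below computes a frequency-averaged phase lag between a soft-band and a hard-band light curve from their cross-spectrum. The band limits match the 0.008-5 Hz range quoted in the abstract, but the time resolution, the single-segment treatment and the toy light curves are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def average_phase_lag(soft, hard, dt, fmin=0.008, fmax=5.0):
    """Frequency-averaged phase lag of `hard` with respect to `soft`.

    Both inputs are simultaneous, evenly sampled count-rate series with
    time resolution `dt` (s).  With the sign convention used below, a
    positive angle means the hard photons lag the soft ones.
    """
    freqs = np.fft.rfftfreq(len(soft), d=dt)
    s = np.fft.rfft(soft - soft.mean())
    h = np.fft.rfft(hard - hard.mean())
    cross = s * np.conj(h)                    # cross-spectrum, soft vs hard
    band = (freqs >= fmin) & (freqs <= fmax)
    # Phase of the band-averaged cross vector = average phase lag (rad).
    return np.angle(cross[band].sum())

# Toy example: the "hard" band is the "soft" band delayed by 2 bins,
# so the recovered lag should come out positive.
rng = np.random.default_rng(0)
dt = 0.01                                     # 10 ms bins
soft = rng.poisson(100, 2**17).astype(float)
hard = np.roll(soft, 2) + rng.normal(0.0, 1.0, soft.size)
print(average_phase_lag(soft, hard, dt))
```

    In a real analysis the cross-spectrum would be averaged over many light-curve segments before taking the phase; the single-segment version above only shows the core of the calculation.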

    Discovery of a correlation between the frequency of the mHz quasi-periodic oscillations and the neutron-star temperature in the low-mass X-ray binary 4U 1636-53

    We detected millihertz quasi-periodic oscillations (QPOs) in an XMM-Newton observation of the neutron-star low-mass X-ray binary 4U 1636-53. These QPOs have been interpreted as marginally stable burning on the neutron-star surface. At the beginning of the observation the QPO was at around 8 mHz, together with a possible second harmonic. About 12 ks into the observation a type I X-ray burst occurred and the QPO disappeared; the QPO reappeared ~25 ks after the burst and remained present until the end of the observation. We divided the observation into four segments to study the evolution of the spectral properties of the source during intervals with and without the mHz QPO. We find that the temperature of the neutron-star surface increases from the QPO segment to the non-QPO segment, and vice versa. We also find a strong correlation between the frequency of the mHz QPO and the temperature of a black-body component in the energy spectrum representing the temperature of the neutron-star surface. Our results are consistent with previous findings that the frequency of the mHz QPO depends on the variation of the heat flux from the neutron-star crust, and therefore support the suggestion that the observed QPO frequency drifts could be caused by the cooling of deeper layers. Comment: accepted for publication in MNRAS
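    The timing-plus-spectral analysis described above can be sketched in two steps: locate the mHz QPO as the strongest power-spectral peak in a narrow frequency band, then test the correlation between the QPO frequency and the fitted black-body temperature. The search band, the use of a rank correlation, and the numbers in the example arrays below are illustrative assumptions only; they are not values from the paper.

```python
import numpy as np
from scipy.stats import spearmanr

def mhz_qpo_frequency(rate, dt, fmin=5e-3, fmax=15e-3):
    """Frequency (Hz) of the strongest power-spectral peak in [fmin, fmax].

    `rate` is an evenly sampled count-rate series with bin size `dt` (s).
    The band brackets the ~8 mHz QPO; its limits are an illustrative choice.
    """
    freqs = np.fft.rfftfreq(len(rate), d=dt)
    power = np.abs(np.fft.rfft(rate - rate.mean())) ** 2
    band = (freqs >= fmin) & (freqs <= fmax)
    return freqs[band][np.argmax(power[band])]

# Made-up per-segment values (QPO frequency in Hz, black-body kT in keV),
# used only to show the correlation step; they are not the paper's numbers.
nu_qpo = np.array([7.2e-3, 7.6e-3, 8.1e-3, 8.5e-3])
kT_bb = np.array([0.51, 0.53, 0.55, 0.58])
rho, p = spearmanr(nu_qpo, kT_bb)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```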

    Corpus specificity in LSA and Word2vec: the role of out-of-domain documents

    Latent Semantic Analysis (LSA) and Word2vec are among the most widely used word embeddings. Despite the popularity of these techniques, the precise mechanisms by which they acquire new semantic relations between words remain unclear. In the present article we investigate whether the capacity of LSA and Word2vec to identify relevant semantic dimensions increases with the size of the corpus. One intuitive hypothesis is that the capacity to identify relevant dimensions should increase as the amount of data increases. However, if the corpus grows in topics that are not specific to the domain of interest, the signal-to-noise ratio may weaken. Here we set out to examine and distinguish between these alternative hypotheses. To investigate the effect of corpus specificity and size on word embeddings, we study two ways of progressively eliminating documents: the elimination of random documents vs. the elimination of documents unrelated to a specific task. We show that Word2vec can take advantage of all the documents, obtaining its best performance when it is trained with the whole corpus. On the contrary, the specialization (removal of out-of-domain documents) of the training corpus, accompanied by a decrease in dimensionality, can increase LSA word-representation quality while speeding up processing time. Furthermore, we show that specialization without a decrease in LSA dimensionality can produce a strong performance reduction in specific tasks. From a cognitive-modeling point of view, we point out that LSA's word-knowledge acquisition may not be efficiently exploiting higher-order co-occurrences and global relations, whereas Word2vec does.
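    A minimal sketch of the pruning experiment described above, assuming scikit-learn for LSA (a TF-IDF term-document matrix followed by truncated SVD) and gensim for Word2vec: documents are removed either at random or because they fall outside the domain of interest, and both embeddings are retrained on what remains. The toy corpus, domain tags and hyperparameters are placeholders, not the authors' setup.

```python
# Illustrative sketch (not the authors' code) of the two pruning regimes:
# dropping random documents vs. dropping out-of-domain documents before
# retraining LSA and Word2vec.
import random
from gensim.models import Word2Vec
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

corpus = [
    ("astro", "black hole accretion disc corona jet"),
    ("astro", "neutron star burst oscillation frequency"),
    ("astro", "x ray binary outburst spectrum"),
    ("astro", "pulsar timing noise power spectrum"),
    ("food", "tomato basil olive oil pasta"),
    ("food", "oven bread flour yeast dough"),
    ("food", "chocolate cake sugar butter"),
    ("food", "grill steak pepper salt"),
]

def prune(corpus, keep_domain=None, frac=0.5, seed=0):
    """Random removal if keep_domain is None, otherwise keep only the
    documents belonging to the domain of interest (specialization)."""
    if keep_domain is None:
        return random.Random(seed).sample(corpus, int(len(corpus) * frac))
    return [doc for doc in corpus if doc[0] == keep_domain]

def train_both(corpus, lsa_dim=2):
    texts = [text for _, text in corpus]
    # LSA: TF-IDF term-document matrix followed by a truncated SVD.
    tfidf = TfidfVectorizer().fit_transform(texts)
    lsa = TruncatedSVD(n_components=lsa_dim).fit(tfidf)
    # Word2vec: skip-gram model trained on the tokenised documents.
    w2v = Word2Vec([t.split() for t in texts], vector_size=50,
                   min_count=1, sg=1, epochs=50)
    return lsa, w2v

lsa_rand, w2v_rand = train_both(prune(corpus))                       # random removal
lsa_spec, w2v_spec = train_both(prune(corpus, keep_domain="astro"))  # specialization
```

    Lowering `lsa_dim` together with the specialization step mirrors the dimensionality reduction that, according to the abstract, accompanies the gains for LSA.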

    Extreme learning machines for reverse engineering of gene regulatory networks from expression time series

    The reconstruction of gene regulatory networks (GRNs) from gene-expression profiles is of growing interest in bioinformatics for understanding the complex regulatory mechanisms in cellular systems. GRNs explicitly represent the cause-effect relations of regulation among a group of genes, and their reconstruction is today a challenging computational problem. Several methods have been proposed, but most of them require different input sources to provide an acceptable prediction. Thus, it is a great challenge to reconstruct a GRN only from temporal gene-expression data. Results: the Extreme Learning Machine (ELM) is a supervised neural model that has gained interest in recent years because of its higher learning rate and better predictive performance than existing supervised models. This work proposes a novel approach to GRN reconstruction in which ELMs are used for modeling the relationships between gene-expression time series. Artificial datasets generated with the well-known benchmark tool used in the DREAM competitions were used, and real datasets with well-known underlying GRNs were used for validation of this novel proposal. The impact of increasing the size of the GRNs was analyzed in detail for the compared methods. The results obtained confirm the superiority of the ELM approach over very recent state-of-the-art methods under the same experimental conditions.
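    A minimal Extreme Learning Machine sketch, assuming the basic ELM recipe (random, untrained input weights; output weights solved by least squares), applied to the kind of task described above: predicting the expression of all genes at the next time point from the current one. The hidden-layer size, activation and the synthetic series are illustrative choices, and the step of extracting regulatory links from the trained model is not shown here.

```python
import numpy as np

class ELMRegressor:
    """Basic Extreme Learning Machine for multi-output regression."""

    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, Y):
        n_features = X.shape[1]
        # Input weights and biases are drawn at random and never trained.
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)          # hidden-layer activations
        # Only the output weights are learned, by least squares.
        self.beta, *_ = np.linalg.lstsq(H, Y, rcond=None)
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Synthetic expression series (T time points x G genes), for shape only.
rng = np.random.default_rng(1)
expr = rng.normal(size=(40, 10))
X, Y = expr[:-1], expr[1:]                        # map x(t) -> x(t+1)
model = ELMRegressor(n_hidden=30).fit(X, Y)
print(model.predict(X).shape)                     # (39, 10)
```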