Adaptive Online Sequential ELM for Concept Drift Tackling
A machine learning method needs to adapt to changes in its environment over time. Such changes are known as concept drift. In this paper, we propose a concept drift tackling method as an enhancement of the Online Sequential Extreme Learning Machine (OS-ELM) and the Constructive Enhancement OS-ELM (CEOS-ELM), adding adaptive capability for classification and regression problems. The scheme, named adaptive OS-ELM (AOS-ELM), is a single-classifier scheme that handles real drift, virtual drift, and hybrid drift well. AOS-ELM also works well for sudden drift and recurrent context changes. The scheme is a simple unified method implemented in a few lines of code. We evaluated AOS-ELM on regression and classification problems using public concept drift data sets (SEA and STAGGER) and other public data sets such as MNIST, USPS, and IDS. Experiments show that our method gives a higher kappa value than a multiclassifier ELM ensemble. Even though AOS-ELM in practice does not need an increase in hidden nodes, we address some issues related to increasing the hidden nodes, such as error conditions and rank values. We propose taking the rank of the pseudoinverse matrix as an indicator parameter to detect the underfitting condition.
Comment: Hindawi Publishing. Computational Intelligence and Neuroscience, Volume 2016 (2016), Article ID 8091267, 17 pages. Received 29 January 2016, Accepted 17 May 2016. Special Issue on "Advances in Neural Networks and Hybrid-Metaheuristics: Theory, Algorithms, and Novel Engineering Applications". Academic Editor: Stefan Hauf
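The rank-based underfitting indicator mentioned in the abstract can be sketched as follows. This is a minimal illustration using a basic (non-sequential) ELM hidden layer; the dimensions, activation, and the comparison against the hidden-node count are illustrative assumptions, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes, not taken from the paper.
n_samples, n_features, n_hidden = 200, 10, 50
X = rng.standard_normal((n_samples, n_features))

# Random hidden-layer mapping, as in a basic ELM.
W = rng.standard_normal((n_features, n_hidden))
b = rng.standard_normal(n_hidden)
H = np.tanh(X @ W + b)            # hidden-layer output matrix

H_pinv = np.linalg.pinv(H)        # Moore-Penrose pseudoinverse
rank = np.linalg.matrix_rank(H_pinv)

# If the rank falls below the number of hidden nodes, the extra nodes
# add no new directions: a possible underfitting signal.
underfitting = rank < n_hidden
print(rank, underfitting)
```

With well-conditioned random data the hidden-layer matrix is full rank, so the flag stays off; a rank deficit after adding nodes is the condition the indicator is meant to catch.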
A survey on OFDM-based elastic core optical networking
Orthogonal frequency-division multiplexing (OFDM) is a modulation technology that has been widely adopted in many new and emerging broadband wireless and wireline communication systems. Due to its capability to transmit a high-speed data stream using multiple spectrally overlapped lower-speed subcarriers, OFDM technology offers the advantages of high spectral efficiency, robustness against inter-carrier and inter-symbol interference, adaptability to severe channel conditions, etc. In recent years, there have been intensive studies on optical OFDM (O-OFDM) transmission technologies, and it is considered a promising technology for future ultra-high-speed optical transmission. Based on O-OFDM technology, a novel elastic optical network architecture with immense flexibility and scalability in spectrum allocation and data rate accommodation could be built to support diverse services and the rapid growth of Internet traffic in the future. In this paper, we present a comprehensive survey on OFDM-based elastic optical network technologies, including basic principles of OFDM, O-OFDM technologies, the architectures of OFDM-based elastic core optical networks, and related key enabling technologies. The main advantages and open issues of OFDM-based elastic core optical networks that are under research are also discussed.
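The core OFDM idea the survey builds on, carrying data on spectrally overlapped orthogonal subcarriers, reduces to an IFFT/FFT pair. A minimal sketch over an ideal (distortion-free) channel, with illustrative parameters: 64 subcarriers, QPSK symbols, and a 16-sample cyclic prefix:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sub = 64            # number of subcarriers (illustrative)
cp_len = 16           # cyclic prefix length (illustrative)

# Random QPSK symbols, one per subcarrier.
bits = rng.integers(0, 4, n_sub)
constellation = np.exp(1j * (np.pi / 4 + np.pi / 2 * np.arange(4)))
symbols = constellation[bits]

# Transmitter: the IFFT turns spectral symbols into one time-domain block.
time_block = np.fft.ifft(symbols)
tx = np.concatenate([time_block[-cp_len:], time_block])  # prepend cyclic prefix

# Receiver (ideal channel): strip the prefix, FFT back to subcarriers,
# then slice each recovered symbol to the nearest constellation point.
rx_symbols = np.fft.fft(tx[cp_len:])
recovered = np.argmin(np.abs(rx_symbols[:, None] - constellation), axis=1)
print(np.array_equal(recovered, bits))
```

The cyclic prefix is what makes a dispersive channel look like a circular convolution, which a real receiver then undoes with one complex gain per subcarrier; that per-subcarrier flexibility is the basis of the elastic spectrum allocation discussed in the survey.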
Generalized Adaptive Network Coding Aided Successive Relaying Based Noncoherent Cooperation
A generalized adaptive network coding (GANC) scheme is conceived for a multi-user, multi-relay scenario, where multiple users transmit independent information streams to a common destination with the aid of multiple relays. The proposed GANC scheme is developed from adaptive network coded cooperation (ANCC) and aims for high flexibility in order to: 1) allow arbitrary channel coding schemes to serve as the cross-layer network coding regime; 2) provide any arbitrary trade-off between throughput and reliability by adjusting the ratio of source nodes to cooperating relay nodes. Furthermore, we incorporate the proposed GANC scheme in a novel successive relaying aided network (SRAN) in order to recover the typical 50% half-duplex relaying-induced throughput loss. However, it is unrealistic to expect that, in addition to carrying out all the relaying functions, the relays could additionally estimate the source-to-relay channels. Hence noncoherent detection is employed in order to obviate power-hungry channel estimation. Finally, we intrinsically amalgamate our GANC scheme with the joint network-channel coding (JNCC) concept into a powerful three-stage concatenated architecture relying on iterative detection, which is specifically designed for the destination node (DN). The proposed scheme is also capable of adapting to rapidly time-varying network topologies, while relying on energy-efficient detection.
A preliminary approach to intelligent x-ray imaging for baggage inspection at airports
Identifying explosives in baggage at airports relies on being able to characterize the materials that make up an X-ray image. If a suspicion is generated during the imaging process (step 1), the image data could be enhanced by adapting the scanning parameters (step 2). This paper addresses the first part of this problem, using textural signatures to recognize and characterize materials and thereby enable system control. Directional Gabor-type filtering was applied to a series of different X-ray images. Images were processed in such a way as to simulate a line-scanning geometry. Based on our experiments with images of industrial standards and our own samples, it was found that different materials could be characterized in terms of the frequency range and orientation of the filters. It was also found that the signal strength generated by the filters could be used as an indicator of visibility, and optimum imaging conditions could be predicted.
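A directional Gabor-type filter of the kind described can be sketched as follows. The kernel size, frequency, orientation, and the synthetic stripe image standing in for a material texture are all illustrative assumptions; the point is that the magnitude response peaks when the filter orientation matches the texture:

```python
import numpy as np

def gabor_kernel(freq, theta, sigma, size=21):
    """Complex Gabor kernel: Gaussian envelope times a complex carrier."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # coordinate along the filter orientation
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.exp(2j * np.pi * freq * xr)

# Synthetic "material" texture: stripes varying along x at frequency 0.1.
cols = np.arange(64)
img = np.tile(np.cos(2 * np.pi * 0.1 * cols), (64, 1))

# Filter response: magnitude of the inner product over a central patch.
# The orientation matched to the texture should respond more strongly,
# which is the signal-strength cue the paper uses.
patch = img[22:43, 22:43]
strength_matched = np.abs(np.sum(patch * gabor_kernel(0.1, 0.0, 4.0)))
strength_orthogonal = np.abs(np.sum(patch * gabor_kernel(0.1, np.pi / 2, 4.0)))
print(strength_matched > strength_orthogonal)
```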
Full Three Dimensional Orbits For Multiple Stars on Close Approaches to the Central Supermassive Black Hole
With the advent of adaptive optics on the W. M. Keck 10 m telescope, two
significant steps forward have been taken in building the case for a
supermassive black hole at the center of the Milky Way and understanding the
black hole's effect on its environment. Using adaptive optics and speckle
imaging to study the motions of stars in the plane of the sky with a precision of approximately ±2 mas over the past 7 years, we have obtained the first simultaneous
orbital solution for multiple stars. Among the included stars, three are newly
identified (S0-16, S0-19, S0-20). The most dramatic orbit is that of the newly
identified star S0-16, which passed a mere 60 AU from the central dark mass at
a velocity of 9,000 km/s in 1999. The orbital analysis results in a new central
dark mass estimate of 3.6(±0.4) x 10^6 (D/8 kpc)^3 M_sun. This dramatically
strengthens the case for a black hole at the center of our Galaxy, by confining
the dark matter to within a radius of 0.0003 pc or 1,000 Rsh and thereby
increasing the inferred dark mass density by four orders of magnitude compared
to earlier estimates.
With the introduction of an adaptive-optics-fed spectrometer, we have
obtained the spectra of these high-velocity stars, which suggest that they are
massive (~15 M_sun), young (<10 Myr) main sequence stars. This presents a major
challenge to star formation theories, given the strong tidal forces that
prevail over all distances reached by these stars in their current orbits and
the difficulty in migrating these stars inward during their lifetime from
further out where tidal forces should no longer preclude star formation.
Comment: 7 pages, 5 figures (abridged abstract)
Controlling the False Discovery Rate in Astrophysical Data Analysis
The False Discovery Rate (FDR) is a new statistical procedure to control the
number of mistakes made when performing multiple hypothesis tests, i.e. when
comparing many data against a given model hypothesis. The key advantage of FDR
is that it allows one to a priori control the average fraction of false
rejections made (when comparing to the null hypothesis) over the total number
of rejections performed. We compare FDR to the standard procedure of rejecting
all tests that do not match the null hypothesis above some arbitrarily chosen
confidence limit, e.g. 2 sigma, or at the 95% confidence level. When using FDR,
we find a similar rate of correct detections, but with significantly fewer
false detections. Moreover, the FDR procedure is quick and easy to compute and
can be trivially adapted to work with correlated data. The purpose of this
paper is to introduce the FDR procedure to the astrophysics community. We
illustrate the power of FDR through several astronomical examples, including
the detection of features against a smooth one-dimensional function, e.g.
seeing the "baryon wiggles" in a power spectrum of matter fluctuations, and
source pixel detection in imaging data. In this era of large datasets and high
precision measurements, FDR provides the means to adaptively control a
scientifically meaningful quantity -- the number of false discoveries made when conducting multiple hypothesis tests.
Comment: 15 pages, 9 figures. Submitted to A
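The step-up rule described in the abstract is commonly implemented as the Benjamini-Hochberg procedure: sort the p-values and reject every test up to the largest p-value that falls under the line (i/N) * alpha. A minimal sketch, with illustrative data mixing a few strong signals into uniform noise:

```python
import numpy as np

def fdr_threshold(p_values, alpha=0.05):
    """Benjamini-Hochberg cutoff; tests with p <= cutoff are rejected."""
    p = np.sort(np.asarray(p_values))
    n = len(p)
    # Step-up rule: find the largest i with p_(i) <= alpha * i / n.
    below = p <= alpha * np.arange(1, n + 1) / n
    if not below.any():
        return 0.0          # no rejections
    return p[np.nonzero(below)[0].max()]

# Illustrative data: 5 strong signals among 95 noise-only p-values.
rng = np.random.default_rng(2)
p_vals = np.concatenate([np.full(5, 1e-4), rng.uniform(size=95)])

cutoff = fdr_threshold(p_vals, alpha=0.05)
n_detected = int(np.sum(p_vals <= cutoff))
print(cutoff, n_detected)
```

Unlike a fixed 2-sigma cut, the cutoff here adapts to the data: the more genuinely significant tests there are, the more generous the rejection threshold becomes, while the expected fraction of false rejections stays at alpha.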
Candidate Coronagraphic Detections of Protoplanetary Disks around Four Young Stars
We present potential detections of H-band scattered light emission around
four young stars, selected from a total sample of 45 young stars observed with
the CIAO coronagraph of the Subaru telescope. Two CTTS, CI Tau and DI Cep, and
two WTTS, LkCa 14 and RXJ 0338.3+1020, were detected. In all four cases, the
extended emission is within the area of the residual PSF halo, and is revealed
only through careful data reduction. We compare the observed extended emission
with simulations of the scattered light emission, to evaluate the plausibility
and nature of the detected emission.
Comment: 9 figures, 40 pages