
    The crossing number of the generalized Petersen graph P(10, 3) is six

    The crossing number of a graph is the least number of crossings of edges among all drawings of the graph in the plane. In this article, we prove that the crossing number of the generalized Petersen graph P(10, 3) is equal to 6. Comment: 11 pages, 31 figures
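    For a concrete sense of the definition, counting the pairwise crossings of one particular straight-line drawing gives an upper bound on the crossing number. The sketch below (illustrative only, not from the paper) counts the crossings of K_5 drawn with its vertices in convex position:

```python
from itertools import combinations
from math import cos, sin, pi

def orient(a, b, c):
    # signed area test: >0 left turn, <0 right turn, 0 collinear
    return (b[0]-a[0])*(c[1]-a[1]) - (b[1]-a[1])*(c[0]-a[0])

def segments_cross(p1, p2, q1, q2):
    # proper interior crossing of two open segments
    d1, d2 = orient(q1, q2, p1), orient(q1, q2, p2)
    d3, d4 = orient(p1, p2, q1), orient(p1, p2, q2)
    return d1*d2 < 0 and d3*d4 < 0

n = 5
pos = {v: (cos(2*pi*v/n), sin(2*pi*v/n)) for v in range(n)}   # regular pentagon
edges = list(combinations(range(n), 2))                       # K_5
crossings = sum(
    1
    for (a, b), (c, d) in combinations(edges, 2)
    if len({a, b, c, d}) == 4 and segments_cross(pos[a], pos[b], pos[c], pos[d])
)
```

    This convex drawing has C(5,4) = 5 crossings (one per 4-subset of vertices), while the crossing number of K_5 is known to be 1 — a reminder that any single drawing only bounds the minimum from above.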

    An upper bound for the crossing number of the bubble-sort graph B_n

    The crossing number of a graph G is the minimum number of pairwise intersections of edges in a drawing of G. Motivated by the recent work [Faria, L., Figueiredo, C.M.H. de, Sykora, O., Vrt'o, I.: An improved upper bound on the crossing number of the hypercube. J. Graph Theory 59, 145-161 (2008)], we give an upper bound on the crossing number of the n-dimensional bubble-sort graph B_n. Comment: 20 pages, 10 figures
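    The bubble-sort graph itself is easy to construct: its vertices are the permutations of {1, …, n}, with an edge whenever two permutations differ by swapping two adjacent positions (a Cayley graph of the symmetric group). A small sketch, not taken from the paper:

```python
from itertools import permutations

def bubble_sort_graph(n):
    # vertices: permutations of 1..n; edges: swap of adjacent positions i, i+1
    verts = list(permutations(range(1, n + 1)))
    edges = set()
    for p in verts:
        for i in range(n - 1):
            q = list(p)
            q[i], q[i + 1] = q[i + 1], q[i]
            edges.add(frozenset((p, tuple(q))))
    return verts, edges

verts, edges = bubble_sort_graph(4)
```

    B_4 has 4! = 24 vertices and, being (n−1)-regular, 24·3/2 = 36 edges.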

    Controllable high-fidelity quantum state transfer and entanglement generation in circuit QED

    We propose a scheme to realize controllable quantum state transfer and entanglement generation among transmon qubits in the typical circuit QED setup, based on adiabatic passage. By designing the time-dependent driving pulses applied to the transmon qubits, we find that fast quantum state transfer can be achieved between any two qubits and that quantum entanglement among the qubits can also be engineered. Furthermore, we numerically analyze the influence of decoherence on our scheme using currently accessible experimental parameters. The results show that our scheme is very robust against both cavity decay and qubit relaxation: the fidelities of the state transfer and entanglement preparation processes can be very high. In addition, our scheme is insensitive to inhomogeneity of the qubit-resonator coupling strengths. Comment: Accepted for publication in Scientific Reports
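    The abstract does not give the pulse shapes, but adiabatic-passage transfer of this kind can be illustrated with a generic three-level STIRAP simulation: two Gaussian pulses applied in the counterintuitive order (Stokes before pump) transfer population from |1⟩ to |3⟩ through a dark state. All parameters below are assumptions chosen for illustration, not the authors' circuit-QED model:

```python
import numpy as np
from scipy.linalg import expm

def omega(t, t0, sigma, amp=20.0):
    # Gaussian pulse envelope (arbitrary units)
    return amp * np.exp(-((t - t0) ** 2) / (2 * sigma ** 2))

dt = 0.01
psi = np.array([1.0, 0.0, 0.0], dtype=complex)        # start in |1>
for t in np.arange(0.0, 10.0, dt):
    op = omega(t, 6.0, 1.2)                           # pump 1<->2, applied later
    os_ = omega(t, 4.0, 1.2)                          # Stokes 2<->3, applied earlier
    # on-resonance RWA Hamiltonian in the {|1>,|2>,|3>} basis
    H = 0.5 * np.array([[0, op, 0], [op, 0, os_], [0, os_, 0]], dtype=complex)
    psi = expm(-1j * H * dt) @ psi                    # one small unitary step

p3 = abs(psi[2]) ** 2                                 # final population in |3>
```

    With these (assumed) adiabatic parameters the final population in |3⟩ comes out close to 1, while the lossy intermediate state |2⟩ stays nearly unpopulated — the qualitative mechanism behind the robustness claimed in the abstract.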

    The crossing numbers of K_m × P_n and K_m × C_n

    The crossing number of a graph G is the minimum number of pairwise intersections of edges in a drawing of G. In this paper, we study the crossing numbers of K_m × P_n and K_m × C_n. Comment: 16 pages, 30 figures
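    Assuming × here denotes the Cartesian product of graphs, as is standard in this crossing-number literature, the graphs under study are easy to generate explicitly. A hypothetical helper, not from the paper:

```python
from itertools import combinations

def cartesian_product_graph(edges1, n1, edges2, n2):
    # Cartesian product: (u,v) ~ (x,y) iff (u=x and v~y) or (v=y and u~x)
    verts = [(u, v) for u in range(n1) for v in range(n2)]
    E1 = set(map(frozenset, edges1))
    E2 = set(map(frozenset, edges2))
    edges = set()
    for (u, v), (x, y) in combinations(verts, 2):
        if (u == x and frozenset((v, y)) in E2) or \
           (v == y and frozenset((u, x)) in E1):
            edges.add(frozenset(((u, v), (x, y))))
    return verts, edges

km = list(combinations(range(4), 2))      # K_4: 6 edges on 4 vertices
pn = [(i, i + 1) for i in range(2)]       # P_3: 2 edges on 3 vertices
verts, edges = cartesian_product_graph(km, 4, pn, 3)
```

    K_4 × P_3 has 4·3 = 12 vertices and |V(P_3)|·|E(K_4)| + |V(K_4)|·|E(P_3)| = 3·6 + 4·2 = 26 edges, matching the general edge-count formula for Cartesian products.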

    The Effectiveness of Instance Normalization: a Strong Baseline for Single Image Dehazing

    We propose a novel deep neural network architecture for the challenging problem of single image dehazing, which aims to recover the clear image from a degraded hazy image. Instead of relying on hand-crafted image priors or explicitly estimating the components of the widely used atmospheric scattering model, our end-to-end system directly generates the clear image from an input hazy image. The proposed network has an encoder-decoder architecture with skip connections and instance normalization. We adopt the convolutional layers of the pre-trained VGG network as the encoder to exploit the representation power of deep features, and demonstrate the effectiveness of instance normalization for image dehazing. Our simple yet effective network outperforms the state-of-the-art methods by a large margin on benchmark datasets.
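    Instance normalization, unlike batch normalization, normalizes each channel of each sample independently of the rest of the batch, which makes it well suited to per-image statistics such as haze. A minimal numpy sketch of the operation itself (illustrative only, not the paper's network):

```python
import numpy as np

def instance_norm(x, eps=1e-5):
    # x: (N, C, H, W); normalize every (sample, channel) plane on its own,
    # so statistics never mix across the batch dimension
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

rng = np.random.default_rng(0)
x = rng.normal(3.0, 2.0, size=(2, 4, 8, 8))   # toy feature maps
y = instance_norm(x)
```

    After the transform, every individual (sample, channel) plane has zero mean and unit variance; a learnable per-channel scale and shift is usually applied afterwards in real networks.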

    The crossing numbers of K_{n,n} − nK_2, K_n × P_2, K_n × P_3 and K_n × C_4

    The crossing number of a graph G is the minimum number of pairwise intersections of edges among all drawings of G. In this paper, we study the crossing numbers of K_{n,n} − nK_2, K_n × P_2, K_n × P_3 and K_n × C_4. Comment: 14 pages, 33 figures

    A survey of sparse representation: algorithms and applications

    Sparse representation has attracted much attention from researchers in the fields of signal processing, image processing, computer vision and pattern recognition. Sparse representation also has a good reputation in both theoretical research and practical applications. Many different algorithms have been proposed for sparse representation. The main purpose of this article is to provide a comprehensive study and an updated review of sparse representation and to supply guidance for researchers. The taxonomy of sparse representation methods can be studied from various viewpoints. For example, in terms of the different norm minimizations used in the sparsity constraint, the methods can be roughly categorized into five groups: sparse representation with l_0-norm minimization, sparse representation with l_p-norm (0 < p < 1) minimization, sparse representation with l_1-norm minimization and sparse representation with l_{2,1}-norm minimization. In this paper, a comprehensive overview of sparse representation is provided. The available sparse representation algorithms can also be empirically categorized into four groups: greedy strategy approximation, constrained optimization, proximity algorithm-based optimization, and homotopy algorithm-based sparse representation. The rationales of the different algorithms in each category are analyzed, and a wide range of sparse representation applications are summarized, which sufficiently reveals the potential of sparse representation theory. Specifically, an experimental comparative study of these sparse representation algorithms is presented. The Matlab code used in this paper is available at: http://www.yongxu.org/lunwen.html. Comment: Published in IEEE Access, Vol. 3, pp. 490-530, 201
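    As one concrete instance of the greedy strategy approximation category, Orthogonal Matching Pursuit (OMP) picks, at each step, the dictionary atom most correlated with the current residual and then re-fits the coefficients by least squares on the chosen support. A minimal numpy sketch on synthetic data (not from the paper's experiments):

```python
import numpy as np

def omp(D, y, k):
    # Orthogonal Matching Pursuit: greedily grow the support, then
    # project y onto the span of the selected atoms at every step.
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
D = rng.normal(size=(30, 60))
D /= np.linalg.norm(D, axis=0)            # unit-norm dictionary atoms
x_true = np.zeros(60)
x_true[[5, 17, 42]] = [1.5, -2.0, 0.8]    # 3-sparse ground truth
y = D @ x_true
x_hat = omp(D, y, k=3)
```

    With a well-conditioned random dictionary and a sufficiently sparse signal, the greedy support selection recovers the signal essentially exactly; the l_1 and l_p relaxations in the survey trade this greedy step for convex or non-convex optimization.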

    Deep Spectral Clustering using Dual Autoencoder Network

    Clustering methods have recently attracted ever-increasing attention in learning and vision. Deep clustering combines embedding and clustering to obtain an optimal embedding subspace for clustering, which can be more effective than conventional clustering methods. In this paper, we propose a joint learning framework for discriminative embedding and spectral clustering. We first devise a dual autoencoder network, which enforces a reconstruction constraint for the latent representations and their noisy versions, to embed the inputs into a latent space for clustering; the learned latent representations are thus more robust to noise. Mutual information estimation is then used to extract more discriminative information from the inputs. Furthermore, a deep spectral clustering method is applied to embed the latent representations into the eigenspace and subsequently cluster them, which fully exploits the relationships between the inputs to achieve optimal clustering results. Experimental results on benchmark datasets show that our method significantly outperforms state-of-the-art clustering approaches.
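    The spectral step can be illustrated in isolation: given an affinity matrix, the low eigenvectors of the graph Laplacian embed the points into a space where the clusters separate trivially. A minimal numpy sketch on a toy two-block affinity (illustrative only; the paper applies this machinery to learned latent representations):

```python
import numpy as np

# Affinity with two dense blocks (nodes 0-4 and 5-9) joined by one weak edge
W = np.zeros((10, 10))
for i in range(10):
    for j in range(10):
        if i != j and (i < 5) == (j < 5):
            W[i, j] = 1.0
W[4, 5] = W[5, 4] = 0.1                   # weak bridge between the blocks

d = W.sum(axis=1)
L = np.diag(d) - W                        # unnormalized graph Laplacian
vals, vecs = np.linalg.eigh(L)            # eigenvalues in ascending order
fiedler = vecs[:, 1]                      # eigenvector of 2nd-smallest eigenvalue
labels = (fiedler > 0).astype(int)        # sign split = 2-way spectral cut
```

    The sign pattern of the Fiedler vector recovers the two blocks exactly; for k clusters one would embed into the first k eigenvectors and run k-means, which is the role of the deep spectral clustering module in the paper.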

    Service Composition in Service-Oriented Wireless Sensor Networks with Persistent Queries

    A service-oriented wireless sensor network (WSN) has recently been proposed as an architecture for rapidly developing applications in WSNs. In WSNs, a query task may require a set of services and may be carried out repeatedly, with a given frequency, during its lifetime. A service composition solution must be provided for each execution of such a persistent query task. Because of energy-saving strategies, some sensors may be scheduled to sleep periodically, so a service composition solution may not remain valid throughout the lifetime of a persistent query. When a query task must be conducted over a new service composition solution, a routing update procedure is invoked, which consumes energy. In this paper, we study service composition design that minimizes the number of service composition solutions used during the lifetime of a persistent query. We also aim to minimize the total service composition cost once the minimum number of required solutions is determined. A greedy algorithm and a dynamic programming algorithm are proposed to achieve these two objectives, respectively. The optimality of both algorithms provides service composition solutions for a persistent query with minimum energy consumption. Comment: IEEE CCNC 200
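    One plausible reading of the first objective is an interval-covering problem: each candidate composition solution is valid over a window of time (determined by sleep schedules), and all executions of the query must be covered by as few solutions as possible. The classic greedy choice is to take, at each uncovered execution, the valid solution whose window extends furthest. The sketch below is a hypothetical illustration of that greedy idea, not the paper's algorithm:

```python
def min_compositions(executions, validity):
    # executions: sorted times at which the persistent query runs
    # validity: (start, end) windows during which a candidate composition
    #   solution stays valid (hypothetical model of sensor sleep schedules)
    chosen, i = [], 0
    while i < len(executions):
        t = executions[i]
        covering = [(s, e) for (s, e) in validity if s <= t <= e]
        if not covering:
            raise ValueError(f"no composition solution valid at time {t}")
        best = max(covering, key=lambda se: se[1])    # extends furthest right
        chosen.append(best)
        while i < len(executions) and executions[i] <= best[1]:
            i += 1                                    # executions covered
    return chosen

runs = [1, 2, 3, 4, 5, 6, 7, 8]
windows = [(0, 3), (2, 5), (4, 8), (1, 2)]
plan = min_compositions(runs, windows)
```

    On this toy instance the greedy rule covers all eight executions with two solutions, (0, 3) and (4, 8); each switch between solutions corresponds to one energy-consuming routing update.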

    The crossing number of the pancake graph P_4 is six

    The crossing number of a graph G is the least number of pairwise crossings of edges among all drawings of G in the plane. The pancake graph is an important topology for interconnecting processors in parallel computers. In this paper, we prove that the crossing number of the pancake graph P_4 is six. Comment: 10 pages, 11 figures
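    The pancake graph P_n has the permutations of {1, …, n} as vertices, with an edge whenever one permutation is obtained from the other by reversing a prefix. A small construction sketch (not from the paper):

```python
from itertools import permutations

def pancake_graph(n):
    # vertices: permutations of 1..n
    # edges: reverse a prefix of length 2..n ("flip the top k pancakes")
    verts = list(permutations(range(1, n + 1)))
    edges = set()
    for p in verts:
        for k in range(2, n + 1):
            q = tuple(reversed(p[:k])) + p[k:]
            edges.add(frozenset((p, q)))
    return verts, edges

verts, edges = pancake_graph(4)
```

    P_4 is (n−1)-regular, so it has 4! = 24 vertices and 24·3/2 = 36 edges — the graph whose crossing number the paper pins down at six.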