
    Robust PCA as Bilinear Decomposition with Outlier-Sparsity Regularization

    Principal component analysis (PCA) is widely used for dimensionality reduction, with well-documented merits in various applications involving high-dimensional data, including computer vision, preference measurement, and bioinformatics. In this context, the fresh look advocated here draws on ideas from variable selection and compressive sampling to robustify PCA against outliers. A least-trimmed squares estimator of a low-rank bilinear factor analysis model is shown to be closely related to the estimator obtained from an $\ell_0$-(pseudo)norm-regularized criterion encouraging sparsity in a matrix that explicitly models the outliers. This connection suggests robust PCA schemes based on convex relaxation, which lead naturally to a family of robust estimators encompassing Huber's optimal M-class as a special case. Outliers are identified by tuning a regularization parameter, which amounts to controlling the sparsity of the outlier matrix along the whole robustification path of (group) least-absolute shrinkage and selection operator (Lasso) solutions. Beyond its neat ties to robust statistics, the developed outlier-aware PCA framework is versatile enough to accommodate novel and scalable algorithms that: i) track the low-rank signal subspace robustly as new data are acquired in real time; and ii) determine principal components robustly in (possibly) infinite-dimensional feature spaces. Synthetic and real data tests corroborate the effectiveness of the proposed robust PCA schemes when used to identify aberrant responses in personality assessment surveys, unveil communities in social networks, and detect intruders in video surveillance data.
    Comment: 30 pages, submitted to IEEE Transactions on Signal Processing
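
    As an illustration of the low-rank-plus-sparse-outlier idea behind this framework (a minimal sketch, not the paper's exact algorithm), the Python snippet below alternates a rank-r truncated SVD with elementwise soft-thresholding of the residual, the Lasso-type update that controls outlier sparsity; the rank r and weight lam are hypothetical stand-ins for the paper's regularization parameter.

```python
# Sketch: fit Y ~ L + O with rank(L) <= r and a sparse outlier matrix O.
import numpy as np

def soft_threshold(X, lam):
    """Elementwise soft-thresholding (prox of the l1 norm)."""
    return np.sign(X) * np.maximum(np.abs(X) - lam, 0.0)

def robust_pca(Y, r=2, lam=1.0, n_iter=50):
    O = np.zeros_like(Y)
    for _ in range(n_iter):
        # Low-rank step: rank-r truncated SVD of the outlier-corrected data.
        U, s, Vt = np.linalg.svd(Y - O, full_matrices=False)
        L = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
        # Outlier step: soft-threshold the residual (Lasso-type update).
        O = soft_threshold(Y - L, lam)
    return L, O

# Toy example: rank-2 data with one planted gross outlier.
rng = np.random.default_rng(0)
Y = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 20))
Y[5, 3] += 25.0
L, O = robust_pca(Y, r=2, lam=3.0)
print("largest recovered outlier at", np.unravel_index(np.abs(O).argmax(), O.shape))
```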

    Faithful qubit distribution assisted by one additional qubit against collective noise

    We propose a scheme for distributing polarization states of a single photon over a collective-noise channel. By adding one extra photon with a fixed polarization, we can protect the state against collective noise via a parity-check measurement and post-selection. While the scheme succeeds only probabilistically, it is simpler and more flexible than schemes utilizing decoherence-free subspaces. An application to the BB84 protocol over a collective-noise channel, which is robust against the Trojan horse attack, is also given.
    Comment: 4 pages, 3 figures; published version in Phys. Rev. Lett.
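
    The sketch below illustrates only the collective-noise model addressed here (the same unknown SU(2) rotation hitting every photon) and the decoherence-free-subspace baseline the abstract compares against, namely that the two-photon singlet is invariant under such noise; it is not the paper's parity-check scheme.

```python
# Collective noise: both qubits see the *same* unknown rotation U.
import numpy as np

def random_su2(rng):
    """Random SU(2): Haar unitary via QR, then force det = 1."""
    z = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
    q, r = np.linalg.qr(z)
    q = q @ np.diag(np.diag(r) / np.abs(np.diag(r)))
    return q / np.sqrt(np.linalg.det(q))

rng = np.random.default_rng(1)
U = random_su2(rng)
# Singlet (|HV> - |VH>)/sqrt(2) in the basis |HH>, |HV>, |VH>, |VV>.
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
noisy = np.kron(U, U) @ singlet
print(np.abs(singlet.conj() @ noisy))  # ~1.0: collective noise acts trivially
```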

    Classical Field Approach to Quantum Weak Measurements

    By generalizing the quantum weak measurement protocol to the case of quantum fields, we show that weak measurements probe an effective classical background field that describes the average field configuration in the spacetime region between pre- and post-selection boundary conditions. The classical field is itself a weak value of the corresponding quantum field operator and satisfies equations of motion that extremize an effective action. Weak measurements perturb this effective action, producing measurable changes to the classical field dynamics. As such, weakly measured effects always correspond to an effective classical field. This general result explains why these effects appear to be robust for pre- and post-selected ensembles, and why they can also be measured using classical field techniques that are not weak for individual excitations of the field.
    Comment: 6 pages, 2 figures, published version
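
    For concreteness, here is a minimal sketch of the textbook single-system weak value $A_w = \langle\phi|A|\psi\rangle / \langle\phi|\psi\rangle$, the quantity the abstract generalizes to field operators; the states and observable are arbitrary toy choices.

```python
# Weak value of observable A for pre-selection |psi> and post-selection <phi|.
import numpy as np

def weak_value(A, pre, post):
    return (post.conj() @ A @ pre) / (post.conj() @ pre)

pauli_z = np.array([[1, 0], [0, -1]], dtype=complex)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)        # pre-select |+>
theta = 3 * np.pi / 4 - 0.05                              # almost orthogonal to |+>
phi = np.array([np.cos(theta), np.sin(theta)], dtype=complex)
print(weak_value(pauli_z, psi, phi).real)  # ~ -20, far outside the eigenvalue range [-1, 1]
```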

    A high-flux BEC source for mobile atom interferometers

    Quantum sensors based on coherent matter-waves are precise measurement devices whose ultimate accuracy is achieved with Bose-Einstein condensates (BEC) in extended free fall. This is ideally realized in microgravity environments such as drop towers, ballistic rockets and space platforms. However, the transition from lab-based BEC machines to robust and mobile sources with comparable performance is a challenging endeavor. Here we report on the realization of a miniaturized setup, generating a flux of $4 \times 10^5$ quantum degenerate $^{87}$Rb atoms every 1.6 s. Ensembles of $1 \times 10^5$ atoms can be produced at a 1 Hz rate. This is achieved by loading a cold atomic beam directly into a multi-layer atom chip that is designed for efficient transfer from laser-cooled to magnetically trapped clouds. The attained flux of degenerate atoms is on par with current lab-based BEC experiments while offering significantly higher repetition rates. Additionally, the flux approaches that of current interferometers employing Raman-type velocity selection of laser-cooled atoms. The compact and robust design allows for mobile operation in a variety of demanding environments and paves the way for transportable high-precision quantum sensors.
    Comment: 22 pages, 6 figures

    Can fuzzy Multi-Criteria Decision Making improve Strategic planning by balanced scorecard?

    Strategic management is crucial for organizational success and competitive advantage in an increasingly turbulent business environment. The balanced scorecard (BSC) is a framework for evaluating strategic management performance which translates strategy into action via various sets of performance measurement indicators. The main objective of this research is to develop a new fuzzy group Multi-Criteria Decision Making (MCDM) model for the strategic plan selection process in the BSC. To this end, the study implements a linguistic extension of the MCDM model for robust selection of strategic plans. The new linguistic reasoning for group decision making aggregates the decision makers' subjective evaluations, creating an opportunity to select more robust strategic plans despite the vagueness and uncertainty of the selection process. A numerical example demonstrates the potential to improve the BSC by applying the proposed model.
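
    A minimal sketch of the kind of linguistic fuzzy aggregation such a model rests on: each decision maker rates a plan with a linguistic term, terms map to triangular fuzzy numbers (TFNs), ratings are averaged across the group, and plans are ranked by the defuzzified (centroid) score. The scale and ratings below are hypothetical, not taken from the paper.

```python
import numpy as np

# Linguistic scale as TFNs (low, mode, high) on a 0-10 axis -- an assumption.
SCALE = {
    "poor":      (0.0, 0.0, 2.5),
    "fair":      (2.5, 5.0, 7.5),
    "good":      (5.0, 7.5, 10.0),
    "excellent": (7.5, 10.0, 10.0),
}

def group_score(ratings):
    """Average the decision makers' TFNs, then defuzzify by centroid."""
    low, mode, high = np.array([SCALE[r] for r in ratings]).mean(axis=0)
    return (low + mode + high) / 3.0  # centroid of a triangular number

plans = {
    "Plan A": ["good", "excellent", "fair"],  # one rating per decision maker
    "Plan B": ["fair", "good", "good"],
}
for name, ratings in sorted(plans.items(), key=lambda p: -group_score(p[1])):
    print(f"{name}: {group_score(ratings):.2f}")
```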