
    Entanglement of four-qubit systems: a geometric atlas with polynomial compass II (the tame world)

    We propose a new approach to the geometry of the four-qubit entanglement classes depending on parameters. More precisely, we use invariant theory and algebraic geometry to describe various stratifications of the Hilbert space by SLOCC-invariant algebraic varieties. The normal forms of the four-qubit classification of Verstraete et al. are interpreted as dense subsets of components of the dual variety of the set of separable states, and an algorithm based on the invariants/covariants of four-qubit quantum states is proposed to identify a state with a SLOCC-equivalent normal form (up to qubit permutation).
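
    A minimal sketch follows; it is not the authors' invariants/covariants algorithm, only an evaluation of the single degree-2 SL-invariant polynomial of four-qubit states (usually denoted H in the invariant-theory literature). The function name and test states are illustrative.

    ```python
    import numpy as np

    def invariant_H(a):
        """Degree-2 SL-invariant of a four-qubit state.

        `a` is a length-16 complex amplitude vector indexed by the four
        qubit bits. H pairs each basis index with its bitwise complement,
        signed by the parity (Hamming weight) of the index.
        """
        a = np.asarray(a, dtype=complex).reshape(16)
        total = 0j
        for x in range(16):
            sign = (-1) ** bin(x).count("1")
            total += sign * a[x] * a[x ^ 0b1111]
        return total / 2  # each unordered pair is visited twice

    # Illustrative check: H is nonzero on GHZ_4 but vanishes on |0000>.
    ghz4 = np.zeros(16, dtype=complex)
    ghz4[0b0000] = ghz4[0b1111] = 1 / np.sqrt(2)
    basis_state = np.zeros(16, dtype=complex)
    basis_state[0] = 1.0
    print(invariant_H(ghz4))         # (0.5+0j)
    print(invariant_H(basis_state))  # 0j
    ```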

    DETERMINANTS OF THE FUNDING VOLATILITY OF INDONESIAN BANKS: A DYNAMIC MODEL

    Illiquidity is at the core of the various currency and banking/financial crises of the 1990s. In the wake of the Asian crisis of 1997/98, the term "systemic liquidity" was coined to refer to adequate arrangements and practices which permit efficient liquidity management and which provide a buffer during financial distress. A constructed balance-sheet-based variable that captures the essence of the risk from systemic liquidity is the funding volatility ratio (FVR). Using data covering January 1990 to July 2003 and employing cointegration techniques, this study attempts to quantify the purported link between the FVR and the measurable determinants of a balanced liquidity infrastructure for Indonesia, the country that suffered the most from the Asian crisis. A good fit is obtained for the dynamic regression model, and estimates of short-run and long-run impacts and elasticities are computed. The FVR is shown to be increasing in the rupiah-US dollar exchange rate, the Jakarta stock market index, the interest rate and the number of banks, and decreasing in the capital-to-asset ratio and the ratio of foreign liabilities to total assets. The best option for lowering the FVR in the short run is increasing bank capital; over the long term, enduring increases in foreign-currency accounts and a reduction in the number of banks seem to hold the best prospect for lowering the FVR. Keywords: autoregressive distributed lag model, cointegration, funding volatility ratio, systemic liquidity, Financial Economics. JEL: C22.
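
    As a rough illustration of the cointegration machinery involved (the paper uses an autoregressive distributed lag specification; the simpler Engle-Granger two-step procedure below is a stand-in), here is a minimal Python sketch on synthetic data. The names fvr and fx_rate, the sample length and all coefficients are illustrative, not the paper's.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.tsa.stattools import coint

    rng = np.random.default_rng(0)
    n = 163  # roughly monthly observations, Jan 1990 - Jul 2003

    # Synthetic stand-ins: a random-walk "exchange rate" and an FVR that
    # tracks it plus stationary noise, so the two series are cointegrated.
    fx_rate = np.cumsum(rng.normal(size=n))
    fvr = 0.6 * fx_rate + rng.normal(scale=0.5, size=n)

    # Step 1: Engle-Granger test for a long-run (cointegrating) relation.
    t_stat, p_value, _ = coint(fvr, fx_rate)
    print(f"Engle-Granger p-value: {p_value:.3f}")

    # Step 2: error-correction model; the coefficient on the lagged
    # residual measures the speed of adjustment to the long-run path.
    long_run = sm.OLS(fvr, sm.add_constant(fx_rate)).fit()
    d_fvr, d_fx = np.diff(fvr), np.diff(fx_rate)
    ecm_term = long_run.resid[:-1]  # u_{t-1}, aligned with the differences
    ecm = sm.OLS(d_fvr, sm.add_constant(np.column_stack([d_fx, ecm_term]))).fit()
    print(ecm.params)  # constant, short-run impact of d_fx, adjustment speed
    ```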

    A Survey on Graph Kernels

    Graph kernels have become an established and widely used technique for solving classification tasks on graphs. This survey gives a comprehensive overview of techniques for kernel-based graph classification developed in the past 15 years. We describe and categorize graph kernels based on properties inherent to their design, such as the nature of their extracted graph features, their method of computation and their applicability to problems in practice. In an extensive experimental evaluation, we study the classification accuracy of a large suite of graph kernels on established benchmarks as well as new datasets. We compare the performance of popular kernels with several baseline methods and study the effect of applying a Gaussian RBF kernel to the metric induced by a graph kernel. In doing so, we find that simple baselines become competitive after this transformation on some datasets. Moreover, we study the extent to which existing graph kernels agree in their predictions (and prediction errors) and obtain a data-driven categorization of kernels as a result. Finally, based on our experimental results, we derive a practitioner's guide to kernel-based graph classification.
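
    One transformation the survey evaluates, applying a Gaussian RBF kernel to the metric induced by a graph kernel, is easy to state concretely. The sketch below assumes a precomputed gram matrix; the toy label-histogram features are illustrative.

    ```python
    import numpy as np

    def rbf_from_kernel(K, sigma=1.0):
        """Apply a Gaussian RBF to the metric induced by a kernel.

        Given a positive semidefinite gram matrix K with K[i, j] = k(x_i, x_j),
        the induced squared distance is d2[i, j] = K[i, i] + K[j, j] - 2 K[i, j],
        and the transformed kernel is exp(-d2 / (2 sigma^2)).
        """
        diag = np.diag(K)
        d2 = diag[:, None] + diag[None, :] - 2.0 * K
        d2 = np.maximum(d2, 0.0)  # guard against tiny negative round-off
        return np.exp(-d2 / (2.0 * sigma ** 2))

    # Toy gram matrix of a simple vertex-label histogram kernel:
    # k(G, G') = <phi(G), phi(G')>, with phi counting vertex labels A/B/C.
    phi = np.array([[3, 1, 0],
                    [2, 2, 0],
                    [0, 1, 3]], dtype=float)
    K = phi @ phi.T
    print(rbf_from_kernel(K, sigma=2.0))
    ```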

    Using VARs and TVP-VARs with many macroeconomic variables

    This paper discusses the challenges faced by the empirical macroeconomist and methods for surmounting them. These challenges arise because macroeconometric models potentially include a large number of variables and allow for time variation in parameters. These considerations lead to models which have a large number of parameters to estimate relative to the number of observations. A wide range of approaches that aim to overcome the resulting problems are surveyed. We stress the related themes of prior shrinkage, model averaging and model selection. Subsequently, we consider a particular modelling approach in detail: the use of dynamic model selection methods with large TVP-VARs. A forecasting exercise involving a large US macroeconomic data set illustrates the practicality and empirical success of our approach.
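
    A minimal sketch of the core difficulty and of prior shrinkage follows: a 20-variable VAR(1) estimated from a short sample by plain OLS and by ridge shrinkage toward zero (a crude stand-in for a Minnesota-style prior; this is not the paper's dynamic model selection algorithm). All dimensions and the shrinkage weight lam are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    T, n = 120, 20            # short sample, many variables
    A_true = 0.5 * np.eye(n)  # persistent but stable dynamics

    # Simulate a VAR(1): y_t = A y_{t-1} + e_t.
    Y = np.zeros((T, n))
    for t in range(1, T):
        Y[t] = Y[t - 1] @ A_true.T + rng.normal(scale=0.1, size=n)

    X, Z = Y[:-1], Y[1:]  # lagged regressors and targets

    # OLS fits n*n = 400 coefficients from 119 observations -> noisy.
    A_ols = np.linalg.lstsq(X, Z, rcond=None)[0].T

    # Ridge shrinkage toward zero; lam plays the role of prior tightness.
    lam = 10.0
    A_ridge = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ Z).T

    for name, A in [("OLS", A_ols), ("ridge", A_ridge)]:
        print(name, "estimation error:", np.linalg.norm(A - A_true))
    ```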

    There are Plane Spanners of Maximum Degree 4

    Let E be the complete Euclidean graph on a set of points embedded in the plane. Given a constant t >= 1, a spanning subgraph G of E is said to be a t-spanner, or simply a spanner, if for any pair of vertices u, v in E the distance between u and v in G is at most t times their distance in E. A spanner is plane if its edges do not cross. This paper considers the question: "What is the smallest maximum degree that can always be achieved for a plane spanner of E?" Without the planarity constraint, it is known that the answer is 3, which is thus the best known lower bound on the degree of any plane spanner. With the planarity requirement, the best known upper bound on the maximum degree is 6, the last in a long sequence of results improving the upper bound. In this paper we show that the complete Euclidean graph always contains a plane spanner of maximum degree at most 4, making a big step toward closing the question. Our construction leads to an efficient algorithm for obtaining the spanner from Chew's L1-Delaunay triangulation.
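
    The t-spanner property itself is directly checkable: a subgraph is a t-spanner exactly when the maximum, over all vertex pairs, of graph distance divided by Euclidean distance is at most t. A minimal sketch with illustrative helper names (this is not the paper's construction):

    ```python
    import numpy as np
    from scipy.sparse.csgraph import shortest_path
    from scipy.spatial.distance import pdist, squareform

    def stretch_factor(points, edges):
        """Max over vertex pairs of graph distance / Euclidean distance.

        `points` is an (n, 2) array; `edges` lists index pairs (i, j) of
        the candidate spanner, weighted by their Euclidean length.
        """
        n = len(points)
        euclid = squareform(pdist(points))
        W = np.full((n, n), np.inf)  # inf marks a missing edge
        for i, j in edges:
            W[i, j] = W[j, i] = euclid[i, j]
        graph_dist = shortest_path(W, method="D", directed=False)
        mask = ~np.eye(n, dtype=bool)
        return float(np.max(graph_dist[mask] / euclid[mask]))

    # Toy example: a unit square with one diagonal; the missing diagonal's
    # endpoints must route around two sides, giving stretch 2/sqrt(2).
    pts = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
    edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
    print(stretch_factor(pts, edges))  # ~1.414
    ```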

    Methodology for Process Improvement Through Basic Components and Focusing on the Resistance to Change.

    This paper describes a multi-model methodology that implements smooth and continuous process improvement, depending on the organization's business goals and allowing users to establish their own improvement implementation pace. The methodology focuses on basic process components known as ‘best practices’. In addition, it covers the following topics: knowledge management and change management. The methodology description and the results of a case study on the project management process are included.

    Cyclical Changes in Short-Run Earnings Mobility in Canada, 1982-1996

    The paper by Charles M. Beach and Ross Finnie represents the first attempt to quantify short-term or cyclical changes in earnings mobility in Canada. Mobility analysis can be seen as a complement to the analysis of income distribution: for a given degree of earnings inequality, more earnings mobility corresponds to greater labour market opportunity. Using longitudinal income-tax-based data, the authors divide the employed population into eight age/sex groups: entry workers (20–24), younger workers (25–34), prime-age workers (35–54), and older workers (55–64) for both sexes. They divide the earnings distribution into lower, middle and upper regions or earnings intervals based on median earnings levels for the distribution as a whole, and calculate the proportion of workers in each group for all years over the 1982–96 period. They also develop transition matrices that show the probability of moving from one earnings interval to another over a one-year period. They find that there have been major cyclical changes in earnings polarization and that these changes have been concentrated in recessions, notably in the 1990–92 downturn. They also find that men in particular experienced a marked decrease in their net probability of upward mobility in the earnings distribution during recessions, as the probability of moving up fell sharply, as did the probability of moving down. The results of the paper are particularly relevant for an understanding of how earnings mobility may be affected by the current economic slowdown. Keywords: earnings mobility, income mobility, mobility, income, earnings, distribution, income distribution, earnings distribution, earnings inequality, recession, cyclical.
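
    A hedged sketch of the transition-matrix construction on synthetic two-year earnings data follows: workers are assigned to lower/middle/upper intervals around median earnings and a row-normalized 3x3 one-year transition matrix is tabulated. The cutoff multipliers (0.75 and 1.25) and the synthetic persistence are illustrative, not the paper's values.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 5000

    def interval(earnings):
        """Assign 0/1/2 for lower/middle/upper intervals around the median."""
        med = np.median(earnings)
        return np.digitize(earnings, [0.75 * med, 1.25 * med])

    # Synthetic two-year earnings panel with persistence across years.
    year1 = rng.lognormal(mean=10.0, sigma=0.5, size=n)
    year2 = year1 * rng.lognormal(mean=0.0, sigma=0.2, size=n)

    s1, s2 = interval(year1), interval(year2)

    # P[i, j] = Pr(interval j in year 2 | interval i in year 1).
    P = np.zeros((3, 3))
    for i, j in zip(s1, s2):
        P[i, j] += 1
    P /= P.sum(axis=1, keepdims=True)
    print(np.round(P, 3))
    ```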

    Global Cosmological Parameters Determined Using Classical Double Radio Galaxies

    A sample of 20 powerful extended radio galaxies with redshifts between zero and two was used to determine constraints on global cosmological parameters. Data for six radio sources were obtained from the VLA archive, analyzed, and combined with the sample of 14 radio galaxies used previously by Guerra & Daly to determine cosmological parameters. The results are consistent with our previous results and indicate that the current value of the mean mass density of the universe is significantly less than the critical value. A universe with $\Omega_m = 1$ is ruled out at 99.0% confidence, and the best-fitting values of $\Omega_m$ are $0.10^{+0.25}_{-0.10}$ and $-0.25^{+0.35}_{-0.25}$, assuming zero space curvature and zero cosmological constant, respectively. Identical results obtain when the low-redshift bin, which includes Cygnus A, is excluded, so the results are independent of whether this source is included. The method does not rely on a zero-redshift normalization. The radio properties of each source are also used to determine the density of the gas in the vicinity of the source and the beam power of the source. The six new radio sources have physical characteristics similar to those found for the original 14 sources. The density of the gas around these radio sources is typical of gas in present-day clusters of galaxies. The beam powers are typically about $10^{45}\,\mathrm{erg\,s^{-1}}$.
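
    This is not the authors' radio-source standard-ruler fit, only a minimal sketch of the quantity such fits constrain: the angular diameter distance $D_A(z; \Omega_m, \Omega_\Lambda)$, computed by numerical integration. The value of H0 and the parameter combinations are illustrative.

    ```python
    import numpy as np
    from scipy.integrate import quad

    C_KM_S = 299792.458  # speed of light, km/s
    H0 = 70.0            # Hubble constant, km/s/Mpc (illustrative)

    def angular_diameter_distance(z, omega_m, omega_lambda):
        """D_A in Mpc for a flat or open matter + Lambda universe."""
        omega_k = 1.0 - omega_m - omega_lambda

        def inv_E(zp):  # 1 / (H(z)/H0)
            return 1.0 / np.sqrt(omega_m * (1 + zp) ** 3
                                 + omega_k * (1 + zp) ** 2 + omega_lambda)

        dc, _ = quad(inv_E, 0.0, z)  # dimensionless comoving distance
        if omega_k > 1e-8:           # open universe: curvature correction
            dc = np.sinh(np.sqrt(omega_k) * dc) / np.sqrt(omega_k)
        return (C_KM_S / H0) * dc / (1.0 + z)

    # A fixed physical ruler at z ~ 2 subtends a smaller angle in a
    # low-density universe, which is the leverage behind such fits.
    for om, ol in [(1.0, 0.0), (0.1, 0.0), (0.25, 0.75)]:
        print(om, ol, round(angular_diameter_distance(2.0, om, ol), 1), "Mpc")
    ```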