
    Complementation, Local Complementation, and Switching in Binary Matroids

    In 2004, Ehrenfeucht, Harju, and Rozenberg showed that any graph on a vertex set V can be obtained from a complete graph on V via a sequence of the operations of complementation, switching edges and non-edges at a vertex, and local complementation. The last operation involves taking the complement in the neighbourhood of a vertex. In this paper, we consider natural generalizations of these operations for binary matroids and explore their behaviour. We characterize all binary matroids obtainable from the binary projective geometry of rank r under the operations of complementation and switching. Moreover, we show that not all binary matroids of rank at most r can be obtained from a projective geometry of rank r via a sequence of the three generalized operations. We introduce a fourth operation and show that, with this additional operation, we are able to obtain all binary matroids. Comment: Fixed an error in the proof of Theorem 5.3. Adv. in Appl. Math. (2020)
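
    A minimal sketch (an illustration, not from the paper) of the three graph-level operations named in the abstract, applied to a 0/1 adjacency matrix with entries in GF(2); the function names and the numpy encoding are assumptions for the example, and the matroid generalizations are not shown.

```python
# Illustrative sketch: the three graph operations of Ehrenfeucht, Harju, and
# Rozenberg on a simple graph stored as a 0/1 adjacency matrix (entries in GF(2)).
# Function names and the numpy encoding are assumptions for this example.
import numpy as np

def complementation(A):
    """Complement the whole graph: flip every off-diagonal entry."""
    n = A.shape[0]
    return (A + np.ones_like(A) - np.eye(n, dtype=A.dtype)) % 2

def switching(A, v):
    """Switch at vertex v: toggle edges and non-edges between v and every other vertex."""
    B = A.copy()
    B[v, :] ^= 1
    B[:, v] ^= 1
    B[v, v] = 0  # keep the diagonal zero (no loops)
    return B

def local_complementation(A, v):
    """Complement the subgraph induced on the neighbourhood of v."""
    B = A.copy()
    nbrs = np.flatnonzero(A[v])
    for i in nbrs:
        for j in nbrs:
            if i != j:
                B[i, j] ^= 1
    return B

# Example: the path 0-1-2-3; local complementation at vertex 1 toggles the
# edge between its neighbours 0 and 2.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=np.uint8)
print(local_complementation(A, 1))
```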

    Learning and the Great Moderation

    We study a stylized theory of the volatility reduction in the U.S. after 1984—the Great Moderation—which attributes part of the stabilization to less volatile shocks and another part to more difficult inference on the part of Bayesian households attempting to learn the latent state of the economy. We use a standard equilibrium business cycle model with technology following an unobserved regime-switching process. After 1984, according to Kim and Nelson (1999a), the variance of U.S. macroeconomic aggregates declined because boom and recession regimes moved closer together, keeping conditional variance unchanged. In our model this makes the signal extraction problem more difficult for Bayesian households, and in response they moderate their behavior, reinforcing the effect of the less volatile stochastic technology and contributing an extra measure of moderation to the economy. We construct example economies in which this learning effect accounts for about 30 percent of a volatility reduction of the magnitude observed in the postwar U.S. data.
    Keywords: business cycles; regime-switching; Bayesian learning; information
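
    A small numerical sketch of the signal extraction mechanism described above (calibration numbers are assumptions, not the paper's): one step of a two-state Hamilton-style Bayesian filter shows that when the boom and recession means move closer together with the conditional variance unchanged, the posterior regime belief responds less to any given observation.

```python
# Illustrative sketch: Bayesian regime-belief updating when technology growth
# switches between a "boom" mean and a "recession" mean with a fixed conditional
# variance. All numbers are assumptions for the example.
import numpy as np

def update_belief(p_boom, y, mu_boom, mu_rec, sigma, p_stay=0.95):
    """One step of a two-state filter: predict with a symmetric transition
    probability, then update the boom probability with observation y."""
    prior = p_stay * p_boom + (1 - p_stay) * (1 - p_boom)

    def density(x, mu):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    num = prior * density(y, mu_boom)
    return num / (num + (1 - prior) * density(y, mu_rec))

y = 0.8  # one observed (noisy) technology growth rate
wide = update_belief(0.5, y, mu_boom=1.0, mu_rec=-1.0, sigma=1.0)    # regimes far apart
narrow = update_belief(0.5, y, mu_boom=0.3, mu_rec=-0.3, sigma=1.0)  # regimes closer together
print(wide, narrow)  # the belief moves further from 0.5 when the regimes are far apart
```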

    Changepoint Detection over Graphs with the Spectral Scan Statistic

    We consider the change-point detection problem of deciding, based on noisy measurements, whether an unknown signal over a given graph is constant or is instead piecewise constant over two connected induced subgraphs of relatively low cut size. We analyze the corresponding generalized likelihood ratio (GLR) statistic and relate it to the problem of finding a sparsest cut in a graph. We develop a tractable relaxation of the GLR statistic based on the combinatorial Laplacian of the graph, which we call the spectral scan statistic, and analyze its properties. We show how its performance as a testing procedure depends directly on the spectrum of the graph, and use this result to explicitly derive its asymptotic properties on a few significant graph topologies. Finally, we demonstrate both theoretically and by simulations that the spectral scan statistic can outperform naive testing procedures based on edge thresholding and χ² testing.
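
    A simplified stand-in (not the paper's exact spectral scan statistic) for the Laplacian-based idea: the energy of the noisy measurements in the low-frequency, non-constant eigenvectors of the combinatorial Laplacian is large when the signal is piecewise constant over a partition of low cut size. The graph, the test signal, and the cutoff k are assumptions for the example.

```python
# Simplified illustration of a Laplacian-based test (not the paper's exact
# spectral scan statistic): project the measurements onto the k non-constant
# Laplacian eigenvectors with the smallest eigenvalues and report the energy.
import numpy as np
import networkx as nx

def low_frequency_energy(G, y, k=5):
    L = nx.laplacian_matrix(G).toarray().astype(float)  # combinatorial Laplacian
    vals, vecs = np.linalg.eigh(L)
    U = vecs[:, 1:k + 1]  # skip the constant eigenvector (eigenvalue 0)
    return float(np.sum((U.T @ y) ** 2))

rng = np.random.default_rng(1)
G = nx.grid_2d_graph(10, 10)
n = G.number_of_nodes()
signal = np.array([1.0 if i < n // 2 else -1.0 for i in range(n)])  # two blocks, low cut size
noise = rng.standard_normal(n)
print(low_frequency_energy(G, signal + noise))  # alternative: energy is large
print(low_frequency_energy(G, noise))           # null: energy stays at the noise level
```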

    Detecting Activations over Graphs using Spanning Tree Wavelet Bases

    We consider the detection of activations over graphs under Gaussian noise, where signals are piecewise constant over the graph. Despite the wide applicability of such a detection algorithm, there has been little success in the development of computationally feasible methods with provable theoretical guarantees for general graph topologies. We cast this as a hypothesis testing problem, and first provide a universal necessary condition for asymptotic distinguishability of the null and alternative hypotheses. We then introduce the spanning tree wavelet basis over graphs, a localized basis that reflects the topology of the graph, and prove that for any spanning tree, this approach can distinguish null from alternative in a low signal-to-noise regime. Lastly, we improve on this result and show that using the uniform spanning tree in the basis construction yields a randomized test with stronger theoretical guarantees that in many cases matches our necessary conditions. Specifically, we obtain near-optimal performance in edge transitive graphs, k-nearest neighbor graphs, and ϵ-graphs.
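
    An illustrative sketch of the underlying idea (not the paper's exact basis construction): every edge of a spanning tree splits the vertices into two components, and a normalized Haar-like contrast across that split gives a localized, tree-adapted test statistic. The graph, the tree choice, and the scaling are assumptions for the example.

```python
# Illustrative sketch: Haar-like contrasts over the splits induced by the edges
# of a spanning tree. Removing one tree edge leaves exactly two components; the
# standardized difference of means across them is a localized test statistic.
import numpy as np
import networkx as nx

def tree_edge_contrasts(G, tree, y):
    nodes = list(G.nodes())
    pos = {v: i for i, v in enumerate(nodes)}
    stats = []
    for e in tree.edges():
        T = tree.copy()
        T.remove_edge(*e)
        comp = next(nx.connected_components(T))       # one side of the split
        mask = np.zeros(len(nodes), dtype=bool)
        mask[[pos[v] for v in comp]] = True
        n1, n2 = mask.sum(), (~mask).sum()
        # Scale so the contrast has unit variance under i.i.d. unit-variance noise.
        contrast = (y[mask].mean() - y[~mask].mean()) * np.sqrt(n1 * n2 / (n1 + n2))
        stats.append((e, contrast))
    return stats

rng = np.random.default_rng(2)
G = nx.grid_2d_graph(8, 8)
tree = nx.minimum_spanning_tree(G)                    # any spanning tree works for the sketch
nodes = list(G.nodes())
y = np.array([1.0 if r < 4 else 0.0 for r, _ in nodes]) + 0.5 * rng.standard_normal(len(nodes))
edge, stat = max(tree_edge_contrasts(G, tree, y), key=lambda t: abs(t[1]))
print(edge, stat)  # the largest contrast flags a candidate activated region
```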

    Inventory Mistakes and the Great Moderation

    Why did the volatility of U.S. real GDP decline by more than the volatility of final sales with the Great Moderation in the mid-1980s? One possible explanation is that firms shifted their inventory behaviour towards a greater emphasis on production smoothing. We investigate the role of inventories in the Great Moderation by estimating an unobserved components model that identifies inventory and sales shocks and their propagation in the aggregate data. Our findings suggest little evidence of increased production smoothing. Instead, a reduction in inventory mistakes explains the excess volatility reduction in output relative to sales. The inventory mistakes are informational errors related to production that must be set in advance and their reduction also helps to explain the changed forecasting role of inventories since the mid-1980s. Our findings provide an optimistic prognosis for the continuation of the Great Moderation despite the dramatic movements in output during the recent economic crisis.
    Keywords: inventories; unobserved components model; inventory mistakes; production smoothing; Great Moderation
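
    A back-of-the-envelope sketch of the accounting behind the "excess" volatility reduction (stylized numbers, not the paper's estimates): since output equals final sales plus inventory investment, shrinking the variance of inventory mistakes lowers output volatility by more than sales volatility.

```python
# Stylized example: output = final sales + inventory investment, so reducing the
# variance of inventory "mistakes" (modeled here as noise uncorrelated with sales)
# shrinks output volatility relative to sales volatility. Numbers are assumptions.
import numpy as np

rng = np.random.default_rng(3)
T = 100_000
sales = rng.standard_normal(T)

def output_std(mistake_std):
    inventory_investment = mistake_std * rng.standard_normal(T)
    return np.std(sales + inventory_investment)

print(output_std(1.0))  # large inventory mistakes: output far more volatile than sales
print(output_std(0.3))  # smaller mistakes: output volatility close to sales volatility
print(np.std(sales))    # sales volatility itself is unchanged in this stylized example
```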

    Worldwide macroeconomic stability and monetary policy rules

    We study the interaction of multiple large economies in dynamic stochastic general equilibrium. Each economy has a monetary policymaker that attempts to control the economy through the use of a linear nominal interest rate feedback rule. We show how the determinacy of worldwide equilibrium depends on the joint behavior of policymakers worldwide. We also show how indeterminacy exposes all economies to endogenous volatility, even ones where monetary policy may be judged appropriate from a closed economy perspective. We construct and discuss two quantitative cases. In the 1970s, worldwide equilibrium was characterized by a two-dimensional indeterminacy, despite U.S. adherence to a version of the Taylor principle. In the last 15 years, worldwide equilibrium was still characterized by a one-dimensional indeterminacy, leaving all economies exposed to endogenous volatility. Our analysis provides a rationale for a type of international policy coordination, and the gains to coordination in the sense of avoiding indeterminacy may be large.
    Keywords: Keynesian economics; Monetary policy; Inflation (Finance)
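
    A textbook-style, closed-economy illustration of the determinacy logic referenced above (not the paper's multi-country model): in a flexible-price Fisher-equation economy with the rule i_t = φ·π_t, expected inflation satisfies E_t[π_{t+1}] = φ·π_t, and a unique bounded equilibrium requires |φ| > 1 (the Taylor principle). The abstract's point is that with several large economies, the analogous eigenvalue condition depends on all policymakers' rule coefficients jointly.

```python
# Blanchard-Kahn-style determinacy check (textbook illustration, not the paper's
# model): a linear forward-looking system is determinate when the number of
# eigenvalues outside the unit circle equals the number of non-predetermined variables.
import numpy as np

def is_determinate(transition, n_forward):
    eigvals = np.linalg.eigvals(transition)
    return int(np.sum(np.abs(eigvals) > 1.0)) == n_forward

# One economy, one forward-looking variable (inflation), E_t[pi_{t+1}] = phi * pi_t:
print(is_determinate(np.array([[1.5]]), n_forward=1))  # Taylor principle holds -> determinate
print(is_determinate(np.array([[0.8]]), n_forward=1))  # passive rule -> indeterminate
```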
