
    "Information-Based Economy" and Educational System

    "Information-Based Economy", which is today's economy that is a proof and indicator of development level for the countries now on, comes on the scene with its new organizing model on its infrastructure, which is called "Information Society". The phenomenon of administration introduces to "e-Government" for reinforcing the roots of "Information-Based Economy" now. Having a systematic knowledge of the relation between "Information-Based Economy", "Information Society" and "e-Government" as a whole composes of the theme of this study. For this purpose, a questionnaire has been conducted in the Ministry of National Education, which is responsible for forming the society of the future, to understand whether there is a systematic knowledge on the relation between "Information-Based Economy", "Information Society" and "e-Government" as a whole. Moreover, it has been aimed to discover what the mental formulations of participants are. Questionnaire results reveal that there is no systematic knowledge on the relation between "Information-Based Economy", "Information Society" and "e-Government" as a whole in the Ministry of National Education, and that the participants are apt to perceive "e-Government" within the context in which they are in terms of professions, status and backgrounds. Questionnaire results also show that the responses given by the participants concerning "e-Government" are more or less the same due to the hierarchical organization of knowledge and official knowledge in particular.Information-Based Economy, Information Society, E-Government, Educational System

    International Financial Instability in a World of Currencies Hierarchy

    The 1990s witnessed an increase in international financial turbulence. Indeed, the frequency, size, geographic extension, and social costs of financial crises have made the topic a global policy issue. An array of policy actions has been advocated to prevent crises from happening again. A major controversial question is whether efforts should be directed towards national reforms in emerging markets or, rather, towards a new design of the international payments system. After a critical review of the standing proposals, this paper contends that the debate has not yet fully explored one of the problems of international instability, namely the problem raised by international payments in a world of currencies of diverse quality. As Keynes firmly contended, the monetary side of the (global) economy is not a neutral factor. In fact, the differing degrees of “international moneyness” that make currencies unequal may be one of the fundamental factors behind any model of international financial instability. Viewed in this light, a major redesign of the international payments system is warranted, and the options seem limited to either world dollarization or the ‘bancor’ solution. Recent reformulations of Keynes’s original ‘bancor’ proposal seem to be a more viable alternative to either the status quo or world dollarization.
    Keywords: Currency hierarchy; Currency crises; Banking crises; Capital flows; International monetary arrangements and institutions.

    "The "Keynesian Moment" in Policymaking, the Perils Ahead, and a Flow-of-funds Interpretation of Fiscal Policy"

    With the global crisis, the policy stance around the world has been shaken by massive government and central bank efforts to prevent the meltdown of markets, banks, and the economy. Fiscal packages, in varied sizes, have been adopted throughout the world after years of proclaimed fiscal containment. This change in policy regime, though dubbed the "Keynesian moment," is a "short-run fix" that reflects temporary acceptance of fiscal deficits at a time of political emergency, and contrasts with John Maynard Keynes’s long-run policy propositions. More important, it is doomed to be ineffective if the degree of tolerance of fiscal deficits is too low for full employment. Keynes’s view that outside the gold standard fiscal policies face real, not financial, constraints is illustrated by means of a simple flow-of-funds model. This shows that government deficits do not take financial resources from the private sector, and that demand for net financial savings by the private sector can be met by a rising trade surplus at the cost of reduced consumption, or by a rising government deficit financed by the monopoly supply of central bank credit. Fiscal deficits can thus be considered functional to the objective of supplying the private sector with a provision of financial wealth sufficient to restore demand. By contrast, tax hikes and/or spending cuts aimed at reducing the public deficit lower the available savings of the private sector, and, if adopted too soon, will force the adjustment by way of a reduction of demand and standard of living. This notion, however, is not applicable to the euro area, where constraints have been deliberately created that limit public deficits and the supply of central bank credit, thus introducing national solvency risks. This is a crucial flaw in the institutional structure of Euroland, where monetary sovereignty has been removed from all existing fiscal authorities. Absent a reassessment of its design, the euro area is facing a deflationary tendency that may further erode the economic welfare of the region.
    Keywords: Government and the Monetary System; Fiscal Policy; Keynes; Euro Area
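
    The flow-of-funds reasoning summarized above rests on the standard sectoral-balances accounting identity. The snippet below is a minimal LaTeX sketch of that identity, with notation chosen for exposition rather than taken from the paper.

        % Sectoral balances: net private financial saving equals the government
        % deficit plus the external (trade) surplus. Symbols are illustrative:
        % S_p - I_p : private sector net financial saving
        % G - T     : government deficit
        % X - M     : trade surplus
        \[
          (S_p - I_p) = (G - T) + (X - M)
        \]
        % Read this way, cutting the public deficit (G - T) without a rising
        % trade surplus necessarily lowers the private sector's net saving.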

    Stock-bond correlation and the bond quality ratio: Removing the discount factor to generate a “deflated” stock index

    This paper investigates the cyclical co-movements between US stocks and interest rates by testing a simple model where divergence between stock and bond price behavior is explained by “stock market strength,” where the latter depends on the market climate about future corporate profits—as captured by the corporate bond quality ratio—and an unexplained stock market sentiment. Using two different regression techniques to check for robustness, we find evidence of a statistically significant cyclical correlation between stocks and bonds. On the basis of this finding, we then present a methodology to “deflate” a stock price index such that we can compare stock market strength over time. This is obtained by removing the effect of a changing discount rate—as measured by our regressions—on stock prices. For example, viewed in this light, the past five years in the US stock market reveal a wider fluctuation in stock market strength than we can observe on the basis of stock price indices alone.
    Keywords: Stock-bond correlation, Market sentiment, Stock price.
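
    As an illustration of the kind of procedure described above, the sketch below regresses stock index returns on a discount-rate proxy and a bond quality ratio, then rebuilds a "deflated" index by removing the fitted discount-rate component. The column names, the choice of plain OLS, and the proxies are assumptions for exposition, not the paper's actual data or specification.

        # Illustrative sketch only: OLS-based "deflation" of a stock index by
        # removing the component of returns explained by a changing discount rate.
        # The DataFrame layout and column names below are hypothetical.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        def deflate_stock_index(df: pd.DataFrame) -> pd.Series:
            """df columns (hypothetical): 'stock_index', 'bond_yield', 'quality_ratio'."""
            y = np.log(df["stock_index"]).diff().dropna()
            X = sm.add_constant(df[["bond_yield", "quality_ratio"]].diff().loc[y.index])
            model = sm.OLS(y, X).fit()

            # Subtract only the part of log returns attributed to the discount-rate proxy.
            discount_effect = model.params["bond_yield"] * X["bond_yield"]
            deflated_log_returns = y - discount_effect

            # Rebuild an index level from the adjusted returns (base = first observation).
            return np.exp(deflated_log_returns.cumsum()) * df["stock_index"].iloc[0]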

    A brief study of some aspects of early father-child relationship

    Thesis (M.S.)--Boston University

    Community-aware network sparsification

    Network sparsification aims to reduce the number of edges of a network while maintaining its structural properties; such properties include shortest paths, cuts, spectral measures, or network modularity. Sparsification has multiple applications, such as speeding up graph-mining algorithms, graph visualization, and identifying the important network edges. In this paper we consider a novel formulation of the network-sparsification problem. In addition to the network, we also consider as input a set of communities. The goal is to sparsify the network so as to preserve the network structure with respect to the given communities. We introduce two variants of the community-aware sparsification problem, leading to sparsifiers that satisfy different community connectedness properties. From the technical point of view, we prove hardness results and devise effective approximation algorithms. Our experimental results on a large collection of datasets demonstrate the effectiveness of our algorithms.
    https://epubs.siam.org/doi/10.1137/1.9781611974973.48
    Accepted manuscript
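
    As a concrete, if simplistic, illustration of the connectedness-preserving flavor of the problem, the sketch below keeps a spanning forest inside each given community. This baseline is an assumption for exposition only and is not the approximation algorithm developed in the paper.

        # Baseline sketch: sparsify while keeping each given community internally
        # connected by retaining a spanning forest inside each community.
        # Not the paper's algorithm; purely illustrative.
        import networkx as nx

        def community_aware_sparsify(G: nx.Graph, communities: list) -> nx.Graph:
            H = nx.Graph()
            H.add_nodes_from(G.nodes)
            for community in communities:
                induced = G.subgraph(community)
                # Keep just enough edges to preserve connectivity inside the community.
                for component in nx.connected_components(induced):
                    tree = nx.minimum_spanning_tree(induced.subgraph(component))
                    H.add_edges_from(tree.edges(data=True))
            return H

        # Toy usage: two overlapping communities on the karate club graph.
        G = nx.karate_club_graph()
        H = community_aware_sparsify(G, [set(range(0, 18)), set(range(15, 34))])
        print(G.number_of_edges(), "->", H.number_of_edges())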

    A Divide-and-Conquer Algorithm for Betweenness Centrality

    The problem of efficiently computing the betweenness centrality of nodes has been researched extensively. To date, the best known exact and centralized algorithm for this task is the algorithm proposed in 2001 by Brandes. The contribution of our paper is Brandes++, an algorithm for exact and efficient computation of betweenness centrality. The crux of our algorithm is that we create a sketch of the graph, which we call the skeleton, by replacing subgraphs with simpler graph structures. Depending on the underlying graph structure, by using this skeleton and keeping appropriate summaries, Brandes++ can achieve significantly lower running times. Extensive experimental evaluation on real-life datasets demonstrates the efficacy of our algorithm for different types of graphs. We release our code for the benefit of the research community.
    Comment: A shorter version of this paper appeared in SIAM Data Mining 201
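
    Since Brandes++ is positioned against the classic Brandes (2001) algorithm, a compact sketch of that baseline for unweighted graphs is given below. The adjacency-dictionary interface is an assumption for illustration, and the skeleton and summary machinery of Brandes++ itself is not reproduced.

        # Baseline Brandes (2001) betweenness centrality for undirected,
        # unweighted graphs, given as a dict mapping node -> iterable of neighbors.
        from collections import deque

        def brandes_betweenness(adj):
            bc = {v: 0.0 for v in adj}
            for s in adj:
                # BFS from s with shortest-path counting.
                stack, preds = [], {v: [] for v in adj}
                sigma = {v: 0 for v in adj}; sigma[s] = 1
                dist = {v: -1 for v in adj}; dist[s] = 0
                queue = deque([s])
                while queue:
                    v = queue.popleft()
                    stack.append(v)
                    for w in adj[v]:
                        if dist[w] < 0:             # w discovered for the first time
                            dist[w] = dist[v] + 1
                            queue.append(w)
                        if dist[w] == dist[v] + 1:  # shortest path to w goes through v
                            sigma[w] += sigma[v]
                            preds[w].append(v)
                # Back-propagate dependencies in order of non-increasing distance.
                delta = {v: 0.0 for v in adj}
                while stack:
                    w = stack.pop()
                    for v in preds[w]:
                        delta[v] += (sigma[v] / sigma[w]) * (1 + delta[w])
                    if w != s:
                        bc[w] += delta[w]
            for v in bc:
                bc[v] /= 2.0                        # undirected: each pair counted twice
            return bc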

    Learning-based predictive control for linear systems: a unitary approach

    A comprehensive approach addressing identification and control for learning-based Model Predictive Control (MPC) of linear systems is presented. The design technique yields a data-driven MPC law based on a dataset collected from the working plant. The method is indirect, i.e., it relies on a model learning phase and a model-based control design phase, devised in an integrated manner. The model learning phase has a twofold outcome: first, different optimal p-step-ahead prediction models are obtained, to be used in the MPC cost function; second, a perturbed state-space model is derived, to be used for robust constraint satisfaction. Resorting to Set Membership techniques, a characterization of the bounded model uncertainties is obtained, which is a key feature for a successful application of the robust control algorithm. In the control design phase, a robust MPC law is proposed, able to track piecewise constant reference signals with guaranteed recursive feasibility and convergence properties. The controller embeds multistep predictors in the cost function, ensures robust constraint satisfaction thanks to the learnt uncertainty model, and can deal with possibly infeasible reference values. The proposed approach is tested in a numerical example.
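
    To make the model-learning phase concrete, the sketch below fits p-step-ahead linear predictors by least squares from logged state and input data. The data layout and function names are assumptions, and the Set Membership uncertainty bounds and the robust MPC law itself are not shown.

        # Minimal sketch: least-squares p-step-ahead predictors of the form
        # x_{k+p} ~ Theta_p @ [x_k, u_k, ..., u_{k+p-1}], fit from plant logs.
        import numpy as np

        def fit_multistep_predictors(X, U, horizon):
            """X: (N, nx) state log, U: (N, nu) input log. Returns one Theta_p per step."""
            N = X.shape[0]
            predictors = []
            for p in range(1, horizon + 1):
                rows = N - p
                # Regressor: current state plus the p inputs applied over the horizon.
                Phi = np.hstack([X[:rows]] + [U[j:j + rows] for j in range(p)])
                Y = X[p:p + rows]
                Theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
                predictors.append(Theta.T)
            return predictors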