
    Response of Firm Agent Network to Exogenous Shock

    This paper describes an agent-based model of interacting firms in which firm agents rationally invest capital and labor to maximize payoff. Both transactions and production are taken into account. First, the performance of individual firms on a real transaction network is simulated; the simulation quantitatively reproduces the cumulative probability distributions of revenue, material cost, capital, and labor. Then the response of the firms to an exogenous shock, defined as a sudden change in gross domestic product, is discussed. A longer tail in the cumulative probability and a skewed distribution of growth rates are observed for the high-growth scenario.
    Comment: 8 pages, 9 figures, APFA
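
    A minimal sketch of a payoff-maximizing firm agent, assuming Cobb-Douglas production and linear factor costs (the abstract does not specify the actual payoff function or network dynamics, so these are illustrative assumptions); scaling the output price stands in for the GDP shock:

```python
import numpy as np

# Sketch only: Cobb-Douglas production with linear factor costs is an assumption,
# not the paper's specification.
def profit(k, l, price=1.0, a=0.3, b=0.6, r=0.1, w=1.0):
    """Payoff = revenue from production minus capital and labor costs."""
    return price * (k ** a) * (l ** b) - r * k - w * l

def best_response(price=1.0):
    """Grid search for the (capital, labor) pair that maximizes the payoff."""
    grid = np.linspace(0.1, 50.0, 300)
    K, L = np.meshgrid(grid, grid)
    pi = profit(K, L, price=price)
    i, j = np.unravel_index(np.argmax(pi), pi.shape)
    return K[i, j], L[i, j], pi[i, j]

# The exogenous shock is represented here by scaling the output price,
# a crude stand-in for the sudden change in GDP discussed in the abstract.
for shock in (1.0, 1.2):  # baseline vs. high-growth scenario
    k, l, pi = best_response(price=shock)
    print(f"price={shock:.1f}  capital={k:.2f}  labor={l:.2f}  profit={pi:.3f}")
```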

    Monitoring Networked Applications With Incremental Quantile Estimation

    Networked applications have software components that reside on different computers. Email, for example, has database, processing, and user interface components that can be distributed across a network and shared by users in different locations or work groups. End-to-end performance and reliability metrics describe the software quality experienced by these groups of users, taking into account all the software components in the pipeline. Each user produces only some of the data needed to understand the quality of the application for the group, so group performance metrics are obtained by combining summary statistics that each end computer periodically (and automatically) sends to a central server. The group quality metrics usually focus on medians and tail quantiles rather than on averages. Distributed quantile estimation is challenging, though, especially when passing large amounts of data around the network solely to compute quality metrics is undesirable. This paper describes an Incremental Quantile (IQ) estimation method that is designed for performance monitoring at arbitrary levels of network aggregation and time resolution when only a limited amount of data can be transferred. Applications to both real and simulated data are provided.
    Comment: This paper is commented on in [arXiv:0708.0317], [arXiv:0708.0336], and [arXiv:0708.0338]; rejoinder in [arXiv:0708.0339]. Published at http://dx.doi.org/10.1214/088342306000000583 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
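
    For illustration, here is a toy stochastic-approximation quantile tracker, not the paper's IQ algorithm: each end computer maintains a handful of running quantile estimates from its own stream and periodically ships only those numbers to a central server, which performs a naive average merge.

```python
import numpy as np

class RunningQuantiles:
    """Track a few quantiles of a stream with constant memory (illustrative only)."""

    def __init__(self, probs=(0.5, 0.9, 0.99)):
        self.probs = np.array(probs)
        self.q = None   # current quantile estimates
        self.n = 0      # number of observations seen

    def update(self, x):
        self.n += 1
        if self.q is None:
            self.q = np.full(len(self.probs), float(x))
            return
        step = 1.0 / np.sqrt(self.n)  # decaying step size
        # Robbins-Monro style update: move up with weight p, down with weight 1 - p.
        self.q += step * np.where(x > self.q, self.probs, self.probs - 1.0)

# Two hosts observing latency-like (lognormal) data, then a naive central merge.
rng = np.random.default_rng(1)
hosts = [RunningQuantiles() for _ in range(2)]
for host in hosts:
    for x in rng.lognormal(mean=0.0, sigma=1.0, size=10_000):
        host.update(x)

merged = np.mean([h.q for h in hosts], axis=0)
print("per-host estimates:", [np.round(h.q, 2) for h in hosts])
print("merged estimates  :", np.round(merged, 2))
```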

    Convolutional Deblurring for Natural Imaging

    In this paper, we propose a novel design of image deblurring in the form of a one-shot convolution filter that can be directly convolved with naturally blurred images for restoration. Optical blurring is a common drawback in many imaging applications that suffer from optical imperfections. Despite numerous deconvolution methods that blindly estimate the blur in either inclusive or exclusive form, they remain practically challenging due to high computational cost and low image reconstruction quality. Both high accuracy and high speed are prerequisites for high-throughput imaging platforms in digital archiving. In such platforms, deblurring is required after image acquisition and before images are stored, previewed, or processed for high-level interpretation. Therefore, on-the-fly correction of such images is important to avoid possible time delays, mitigate computational expenses, and increase image perception quality. We bridge this gap by synthesizing a deconvolution kernel as a linear combination of Finite Impulse Response (FIR) even-derivative filters that can be directly convolved with blurry input images to boost the frequency fall-off of the Point Spread Function (PSF) associated with the optical blur. We employ a Gaussian low-pass filter to decouple the image denoising problem from image edge deblurring. Furthermore, we propose a blind approach to estimate the PSF statistics for two models, Gaussian and Laplacian, that are common in many imaging pipelines. Thorough experiments are designed to test and validate the efficiency of the proposed method using 2054 naturally blurred images across six imaging applications and seven state-of-the-art deconvolution methods.
    Comment: 15 pages, for publication in IEEE Transactions on Image Processing
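
    As a hedged sketch of the one-shot idea, the snippet below builds a deblurring kernel from the identity and a discrete Laplacian (an even-derivative FIR filter), using a first-order approximation of the inverse Gaussian PSF; the paper's kernel synthesis and blind PSF estimation are more elaborate, and the PSF width here is assumed known.

```python
import numpy as np
from scipy import ndimage

sigma = 1.2  # assumed (known) Gaussian PSF width; the paper estimates it blindly

# Truncated expansion of the inverse Gaussian, 1/G(w) ~ 1 + (sigma^2/2) w^2,
# corresponds in the spatial domain to identity minus (sigma^2/2) times the Laplacian.
identity = np.zeros((3, 3))
identity[1, 1] = 1.0
laplacian = np.array([[0.0,  1.0, 0.0],
                      [1.0, -4.0, 1.0],
                      [0.0,  1.0, 0.0]])
deblur_kernel = identity - 0.5 * sigma**2 * laplacian  # one-shot FIR kernel

# Demo on a synthetic image: blur a smooth scene, then convolve once to restore it.
rng = np.random.default_rng(2)
sharp = ndimage.gaussian_filter(rng.random((64, 64)), 3.0)
blurred = ndimage.gaussian_filter(sharp, sigma)
restored = ndimage.convolve(blurred, deblur_kernel, mode="reflect")

print("RMSE blurred :", np.sqrt(np.mean((blurred - sharp) ** 2)))
print("RMSE restored:", np.sqrt(np.mean((restored - sharp) ** 2)))
```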

    Calibration of optimal execution of financial transactions in the presence of transient market impact

    Trading large volumes of a financial asset in order-driven markets requires the use of algorithmic execution, dividing the volume into many transactions in order to minimize costs due to market impact. A proper design of an optimal execution strategy strongly depends on careful modeling of market impact, i.e., how the price reacts to trades. In this paper we consider a recently introduced market impact model (Bouchaud et al., 2004), which has the property of describing both the volume and the temporal dependence of the price change due to trading. We show how this model can also be used to describe price impact in aggregated trade time or in real time. We then solve analytically, and calibrate with real data, the optimal execution problem for both risk-neutral and risk-averse investors, and we derive an efficient frontier of optimal execution. When we include spread costs the problem must be solved numerically, and we show that the introduction of such costs regularizes the solution.
    Comment: 31 pages, 8 figures
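
    To illustrate the cost structure of such a transient-impact ("propagator") model rather than the paper's calibration: the expected cost of a trading schedule x is, up to constants, a quadratic form x'Ax with entries built from an assumed decay kernel G(|i - j|), and the risk-neutral schedule minimizing it for a fixed total volume has the closed form used below.

```python
import numpy as np

N = 20      # number of child orders
X = 1.0     # total volume to execute (normalized)
beta = 0.4  # assumed power-law decay exponent of the propagator (illustrative)

lags = np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
G = 1.0 / (1.0 + lags) ** beta   # assumed decay kernel G(|i - j|)

# Minimize x' G x subject to sum(x) = X (risk-neutral case, no spread costs).
ones = np.ones(N)
w = np.linalg.solve(G, ones)
x_opt = X * w / (ones @ w)

cost_opt = x_opt @ G @ x_opt
cost_uniform = (X / N * ones) @ G @ (X / N * ones)
print("uniform-schedule cost:", round(cost_uniform, 4))
print("optimal-schedule cost:", round(cost_opt, 4))
print("optimal schedule     :", np.round(x_opt, 3))
```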

    Money and interest rates under a reserves operating target

    This study examines the short-run dynamic relationships between nonborrowed reserves, the federal funds rate, and transaction accounts using daily data from 1979 through 1982. Separate models are estimated for each day of the week, and simulation experiments are performed. The results suggest that the funds rate responded quite rapidly to a change in nonborrowed reserves, but that the short-run nonborrowed reserves multiplier for transaction accounts was only about 18 percent of its theoretical maximum. In addition, the Federal Reserve appeared to accommodate about 65 percent of a permanent shock to money, and lagged reserve requirements seemed to delay depository institutions' response to a money shock.
    Keywords: Interest rates; Bank reserves
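
    A back-of-the-envelope reading of the multiplier finding; the 12 percent reserve-requirement ratio below is a hypothetical number for illustration, not a figure from the study.

```python
# The theoretical maximum reserves multiplier is the reciprocal of the
# required reserve ratio; the study finds the short-run multiplier for
# transaction accounts is only about 18 percent of that maximum.
required_reserve_ratio = 0.12          # hypothetical, for illustration
theoretical_max_multiplier = 1.0 / required_reserve_ratio
estimated_short_run_multiplier = 0.18 * theoretical_max_multiplier

print(f"theoretical maximum multiplier: {theoretical_max_multiplier:.1f}")
print(f"implied short-run multiplier  : {estimated_short_run_multiplier:.1f}")
```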

    Consumer Search on the Internet

    This paper uses consumer search data to explain search frictions in online markets within the context of an equilibrium search model. I use a novel dataset of consumer online browsing and purchasing behavior, which tracks all consumer search prior to each transaction. Using observed search intensities from the online book industry, I estimate search cost distributions that allow for asymmetric consumer sampling. Research on consumer search often assumes a symmetric sampling rule for analytical convenience despite its lack of realism. Search behavior in the online book industry is quite limited: in only 25 percent of the transactions did consumers visit more than one bookstore's website. The industry is characterized by a strong consumer preference for certain retailers. Accounting for unequal consumer sampling halves the search cost estimates, from 1.8 to 0.9 dollars per search, in the online book industry. Analysis of time spent online suggests substitution between the time consumers spend searching and the relative opportunity cost of their time. Retired people, those with lower education levels, and minorities (with the exception of Hispanics) spent significantly more time searching for a book online. There is a negative relationship between income levels and time spent searching.
    Keywords: consumer search, internet, search costs
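
    A stylized sketch of the underlying search logic under symmetric sampling (for illustration only; the paper's estimator allows asymmetric retailer sampling): the gain from an extra search is the expected drop in the minimum price found so far, and observed shares of consumers searching k times pin down points of the search-cost distribution at those gains. The prices and shares below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
prices = rng.uniform(8.0, 12.0, size=(100_000, 5))  # hypothetical price dispersion ($)

# Expected minimum price after k searches, k = 1..5, and the marginal gain of
# each additional search.
exp_min = [prices[:, :k].min(axis=1).mean() for k in range(1, 6)]
marginal_gain = [exp_min[k - 1] - exp_min[k] for k in range(1, 5)]

# Hypothetical observed shares of consumers who searched exactly k stores.
shares = np.array([0.75, 0.15, 0.06, 0.03, 0.01])
searched_at_least = np.cumsum(shares[::-1])[::-1][1:]  # P(search >= k+1)

# A consumer makes the (k+1)-th search iff her cost is below its marginal gain,
# so each share pins down a point of the search-cost CDF.
for k, (gain, p) in enumerate(zip(marginal_gain, searched_at_least), start=2):
    print(f"gain of search {k}: ${gain:.2f}  ->  F(cost) ~ {p:.2f}")
```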