
    Constraint satisfaction adaptive neural network and heuristics combined approaches for generalized job-shop scheduling

    Copyright © 2000 IEEE. This paper presents a constraint satisfaction adaptive neural network, together with several heuristics, to solve the generalized job-shop scheduling problem, one of the NP-complete constraint satisfaction problems. The proposed neural network can be easily constructed and can adaptively adjust its connection weights and unit biases based on the sequence and resource constraints of the job-shop scheduling problem during processing. Several heuristics that can be combined with the neural network are also presented. In the combined approaches, the neural network is used to obtain feasible solutions, while the heuristic algorithms are used to improve the performance of the neural network and the quality of the obtained solutions. Simulations have shown that the proposed neural network and its combined approaches are efficient with respect to both solution quality and solving speed. This work was supported by the Chinese National Natural Science Foundation under Grant 69684005, the Chinese National High-Tech Program under Grant 863-511-9609-003, and the EPSRC under Grant GR/L81468.
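
    The abstract describes the network only at a high level, so the following is a minimal, hypothetical Python sketch of the underlying idea: operation start times play the role of units, and violated sequence and resource constraints drive adjustments until a feasible schedule emerges. It is a simplified repair-style iteration, not the authors' adaptive neural network; the toy instance and the violations helper are invented for illustration.

    # Illustrative sketch (not the authors' network): constraint-driven
    # iterative repair for a tiny job-shop instance.

    # Each job is a list of (machine, duration) pairs; operations within a job
    # must run in order, and a machine can process one operation at a time.
    jobs = [
        [(0, 3), (1, 2)],   # job 0: machine 0 for 3, then machine 1 for 2
        [(1, 4), (0, 1)],   # job 1: machine 1 for 4, then machine 0 for 1
    ]

    # start[j][k] = start time of the k-th operation of job j (the "units")
    start = [[0 for _ in job] for job in jobs]

    def violations():
        """Collect violated constraints as (job, op, required_minimum_start)."""
        out = []
        for j, job in enumerate(jobs):
            for k in range(1, len(job)):                 # sequence constraints
                prev_end = start[j][k - 1] + job[k - 1][1]
                if start[j][k] < prev_end:
                    out.append((j, k, prev_end))
        ops = [(j, k) for j, job in enumerate(jobs) for k in range(len(job))]
        for a in range(len(ops)):                        # resource constraints
            for b in range(a + 1, len(ops)):
                (j1, k1), (j2, k2) = ops[a], ops[b]
                m1, d1 = jobs[j1][k1]
                m2, d2 = jobs[j2][k2]
                if m1 == m2:
                    s1, s2 = start[j1][k1], start[j2][k2]
                    if s1 < s2 + d2 and s2 < s1 + d1:    # overlap on a machine
                        # push the later-starting operation after the earlier one
                        if s1 <= s2:
                            out.append((j2, k2, s1 + d1))
                        else:
                            out.append((j1, k1, s2 + d2))
        return out

    # Iterate until no constraint is violated, i.e. the schedule is feasible.
    for _ in range(100):
        v = violations()
        if not v:
            break
        for j, k, min_start in v:
            start[j][k] = max(start[j][k], min_start)

    print(start)   # a feasible (not necessarily optimal) schedule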

    Earnings Management and Long-Run Stock Underperformance of Private Placements

    The study investigates whether private placement issuers manipulate their earnings around the time of issuance and how earnings management affects long-run stock performance. We find that managers of U.S. private placement issuers tend to engage in income-increasing earnings management in the year prior to the issuance of private placements. We further speculate that earnings management serves as a likely source of investor over-optimism at the time of private placements. Supporting this speculation, we find evidence that the income-increasing accounting accruals made at the time of private placements predict post-issue long-term stock underperformance. The study contributes to the large body of literature on earnings manipulation around the time of securities issuance.

    Compressive Mechanism: Utilizing Sparse Representation in Differential Privacy

    Differential privacy provides the first theoretical foundation with a provable privacy guarantee against adversaries with arbitrary prior knowledge. The main idea for achieving differential privacy is to inject random noise into statistical query results. Besides correctness, the most important goal in the design of a differentially private mechanism is to reduce the effect of the random noise, ensuring that the noisy results remain useful. This paper proposes the \emph{compressive mechanism}, a novel solution based on the state-of-the-art compression technique known as \emph{compressive sensing}. Compressive sensing is a well-suited theoretical tool for compact synopsis construction using random projections. In this paper, we show that the amount of noise is significantly reduced from $O(\sqrt{n})$ to $O(\log(n))$ when the noise-insertion procedure is carried out on the synopsis samples instead of the original database. As an extension, we also apply the proposed compressive mechanism to the problem of continual release of statistical results. Extensive experiments on real datasets justify our accuracy claims. Comment: 20 pages, 6 figures.
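
    The noise-reduction argument is easier to picture with a small, hedged Python sketch of the general idea (not the paper's exact mechanism): project a sparse histogram with a random matrix, add Laplace noise to the few measurements, and recover the histogram by sparse regression. The sensitivity bound, the Lasso recovery step, and all parameter choices below are simplifying assumptions.

    # Minimal sketch of the compressive-mechanism idea: noise is added to the
    # compressed synopsis rather than to the full histogram.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)

    n, m, k = 1024, 128, 10          # histogram length, measurements, sparsity
    epsilon = 1.0                    # differential-privacy budget

    # A k-sparse "histogram" standing in for the database's query answers.
    x = np.zeros(n)
    x[rng.choice(n, size=k, replace=False)] = rng.integers(10, 100, size=k)

    # Random projection (compressive-sensing synopsis).
    Phi = rng.normal(size=(m, n)) / np.sqrt(m)
    y = Phi @ x

    # Changing one record changes one histogram cell by 1, so the L1 change in
    # the measurements is at most the largest column L1 norm of Phi.
    sensitivity = np.abs(Phi).sum(axis=0).max()
    y_noisy = y + rng.laplace(scale=sensitivity / epsilon, size=m)

    # Sparse recovery of the histogram from the noisy synopsis.
    x_hat = Lasso(alpha=0.1, fit_intercept=False, max_iter=10000).fit(Phi, y_noisy).coef_

    print("max absolute error:", np.abs(x_hat - x).max())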

    On a Novel Class of Integrable ODEs Related to the Painlev\'e Equations

    One of the authors has recently introduced the concept of conjugate Hamiltonian systems: the solution of the equation $h=H(p,q,t)$, where $H$ is a given Hamiltonian containing $t$ explicitly, yields the function $t=T(p,q,h)$, which defines a new Hamiltonian system with Hamiltonian $T$ and independent variable $h$. By employing this construction and by using the fact that the classical Painlev\'e equations are Hamiltonian systems, it is straightforward to associate with each Painlev\'e equation two new integrable ODEs. Here, we investigate the conjugate Painlev\'e II equations. In particular, for these novel integrable ODEs, we present a Lax pair formulation, as well as a class of implicit solutions. We also construct conjugate equations associated with the Painlev\'e I and Painlev\'e IV equations. Comment: This paper is dedicated to Professor T. Bountis on the occasion of his 60th birthday with appreciation of his important contributions to "Nonlinear Science".
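
    For readers unfamiliar with the construction, here is a short derivation under the usual sign conventions (a plausible reading, not necessarily the authors' exact normalisation). Differentiating the identity $H(p,q,T(p,q,h))=h$ gives $T_p=-H_p/H_t$, $T_q=-H_q/H_t$ and $T_h=1/H_t$. Along a trajectory of $H$ one has $\dot q=H_p$, $\dot p=-H_q$ and $\dot h=H_t$, hence $dq/dh=H_p/H_t=-\partial T/\partial p$ and $dp/dh=-H_q/H_t=\partial T/\partial q$, which is a Hamiltonian system in the independent variable $h$ with Hamiltonian $T$ once the roles of the canonical pair are exchanged (equivalently, with Hamiltonian $-T$ in the original pair).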

    Scalable Unbalanced Optimal Transport using Generative Adversarial Networks

    Generative adversarial networks (GANs) are an expressive class of neural generative models with tremendous success in modeling high-dimensional continuous measures. In this paper, we present a scalable method for unbalanced optimal transport (OT) based on the generative-adversarial framework. We formulate unbalanced OT as a problem of simultaneously learning a transport map and a scaling factor that push a source measure to a target measure in a cost-optimal manner. In addition, we propose an algorithm for solving this problem based on stochastic alternating gradient updates, similar in practice to GANs. We also provide theoretical justification for this formulation, showing that it is closely related to an existing static formulation by Liero et al. (2018), and perform numerical experiments demonstrating how this methodology can be applied to population modeling.
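
    The alternating scheme can be pictured with a short, hypothetical PyTorch sketch; the toy measures, network sizes, quadratic ground cost, quadratic mass-change penalty, and weight clipping below are illustrative assumptions, not the authors' exact objective or architecture.

    # Schematic sketch of GAN-style unbalanced OT: a transport map T and a
    # nonnegative scaling factor xi are trained against an adversary f that
    # compares the scaled push-forward of the source with the target.
    import torch
    import torch.nn as nn

    def mlp(din, dout):
        return nn.Sequential(nn.Linear(din, 64), nn.ReLU(),
                             nn.Linear(64, 64), nn.ReLU(),
                             nn.Linear(64, dout))

    dim = 2
    T = mlp(dim, dim)                                  # transport map
    xi = nn.Sequential(mlp(dim, 1), nn.Softplus())     # nonnegative scaling factor
    f = mlp(dim, 1)                                    # adversary on target space

    opt_gen = torch.optim.Adam(list(T.parameters()) + list(xi.parameters()), lr=1e-4)
    opt_adv = torch.optim.Adam(f.parameters(), lr=1e-4)

    def sample_source(n):                              # toy source measure
        return torch.randn(n, dim)

    def sample_target(n):                              # toy target measure (shifted)
        return torch.randn(n, dim) + torch.tensor([3.0, 0.0])

    lam = 1.0                                          # weight of the mass-change penalty
    for step in range(1000):
        x, y = sample_source(256), sample_target(256)

        # Adversary step: separate the scaled push-forward from the target.
        adv_gap = f(y).mean() - (xi(x).detach() * f(T(x).detach())).mean()
        opt_adv.zero_grad()
        (-adv_gap).backward()
        opt_adv.step()
        for p in f.parameters():                       # crude Lipschitz control
            p.data.clamp_(-0.1, 0.1)

        # Generator step: transport cost + mass penalty + adversarial term
        # (the f(y) term is constant with respect to T and xi, so it is omitted).
        s = xi(x)
        cost = (s * ((x - T(x)) ** 2).sum(dim=1, keepdim=True)).mean()
        mass_penalty = lam * ((s - 1.0) ** 2).mean()
        adv_term = (s * f(T(x))).mean()
        loss = cost + mass_penalty + adv_term
        opt_gen.zero_grad()
        loss.backward()
        opt_gen.step()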