
    Computability and analysis: the legacy of Alan Turing

    We discuss the legacy of Alan Turing and his impact on computability and analysis. (Comment: 49 pages)

    "Rotterdam econometrics": publications of the econometric institute 1956-2005

    This paper contains a list of all publications over the period 1956-2005, as reported in the Rotterdam Econometric Institute Reprint series during 1957-2005.

    Determination of forest road surface roughness by Kinect depth imaging

    Roughness is a dynamic property of the gravel road surface that affects safety, ride comfort, vehicle tyre life and maintenance costs. A rapid survey of gravel road condition is fundamental for effective maintenance planning and the definition of intervention priorities. Different non-contact techniques such as laser scanning, ultrasonic sensors and photogrammetry have recently been proposed to reconstruct the three-dimensional topography of the road surface and allow extraction of roughness metrics. The application of the Microsoft Kinect™ depth camera is proposed and discussed here for the collection of 3D data sets from gravel roads, to be used for the quantification of surface roughness. The objectives are to: i) verify the applicability of the Kinect sensor for characterization of different forest roads, ii) identify the appropriateness and potential of different roughness parameters and iii) analyse the correlation with vibrations recorded by 3-axis accelerometers installed on different vehicles. The test applied the Kinect depth camera to the surface roughness determination of four forest gravel roads and one well-maintained asphalt road used as a reference. Different vehicles (mountain bike, off-road motorcycle, ATV, 4WD car and compact crossover) were included in the experiment in order to measure vibration intensity when travelling on the different road surface conditions. Correlations between the extracted roughness parameters and the vibration levels of the tested vehicles were then verified. Coefficients of determination between 0.76 and 0.97 were found between average surface roughness and the standard deviation of relative accelerations, with higher values for lighter vehicles.
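
    A minimal sketch of the kind of analysis the abstract describes: computing an average surface roughness metric from a Kinect-style depth map, summarizing accelerometer traces by their standard deviation, and relating the two through a coefficient of determination. This is not the authors' code; the function names, array shapes and synthetic data below are illustrative assumptions only.

    # Illustrative sketch (not the paper's code): average roughness (Ra) from a
    # depth map vs. the standard deviation of 3-axis accelerations, related by R^2.
    import numpy as np

    def average_roughness(depth_map: np.ndarray) -> float:
        """Ra: mean absolute deviation of surface heights from the mean plane."""
        heights = depth_map - depth_map.mean()
        return float(np.mean(np.abs(heights)))

    def vibration_level(accel_xyz: np.ndarray) -> float:
        """Standard deviation of the resultant 3-axis acceleration signal."""
        magnitude = np.linalg.norm(accel_xyz, axis=1)
        return float(np.std(magnitude))

    def coefficient_of_determination(x: np.ndarray, y: np.ndarray) -> float:
        """R^2 of a simple linear fit y ~ a*x + b."""
        a, b = np.polyfit(x, y, deg=1)
        ss_res = np.sum((y - (a * x + b)) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        return float(1.0 - ss_res / ss_tot)

    # Hypothetical data: one depth map and one accelerometer trace per road segment.
    rng = np.random.default_rng(0)
    scales = (0.002, 0.01, 0.02, 0.03)
    depth_maps = [rng.normal(scale=s, size=(480, 640)) for s in scales]
    accel_traces = [rng.normal(scale=2.0 + 40.0 * s, size=(1000, 3)) for s in scales]

    ra = np.array([average_roughness(d) for d in depth_maps])
    vib = np.array([vibration_level(a) for a in accel_traces])
    print("R^2 between Ra and vibration level:", coefficient_of_determination(ra, vib))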

    Low Complexity Regularization of Linear Inverse Problems

    Inverse problems and regularization theory form a central theme in contemporary signal processing, where the goal is to reconstruct an unknown signal from partial, indirect, and possibly noisy measurements of it. A now standard method for recovering the unknown signal is to solve a convex optimization problem that enforces some prior knowledge about its structure. This has proved efficient in many problems routinely encountered in imaging sciences, statistics and machine learning. This chapter delivers a review of recent advances in the field where the regularization prior promotes solutions conforming to some notion of simplicity/low complexity. These priors encompass as popular examples sparsity and group sparsity (to capture the compressibility of natural signals and images), total variation and analysis sparsity (to promote piecewise regularity), and low rank (as a natural extension of sparsity to matrix-valued data). Our aim is to provide a unified treatment of all these regularizations under a single umbrella, namely the theory of partial smoothness. This framework is very general and accommodates all the low-complexity regularizers just mentioned, as well as many others. Partial smoothness turns out to be the canonical way to encode low-dimensional models that can be linear spaces or more general smooth manifolds. This review is intended to serve as a one-stop shop toward the understanding of the theoretical properties of the so-regularized solutions. It covers a large spectrum including: (i) recovery guarantees and stability to noise, both in terms of $\ell^2$-stability and model (manifold) identification; (ii) sensitivity analysis to perturbations of the parameters involved (in particular the observations), with applications to unbiased risk estimation; (iii) convergence properties of the forward-backward proximal splitting scheme, which is particularly well suited to solve the corresponding large-scale regularized optimization problem.
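
    To make the forward-backward proximal splitting scheme mentioned in point (iii) concrete, here is a minimal sketch applied to the classical sparsity prior (the lasso problem min_x 0.5*||y - Ax||_2^2 + lam*||x||_1), where the backward step is the soft-thresholding proximal operator. The problem sizes, regularization parameter and synthetic data are assumptions for illustration, not taken from the chapter.

    # Forward-backward (ISTA-style) splitting for l1-regularized least squares.
    import numpy as np

    def soft_threshold(x: np.ndarray, tau: float) -> np.ndarray:
        """Proximal operator of tau*||.||_1 (component-wise soft-thresholding)."""
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def forward_backward(A: np.ndarray, y: np.ndarray, lam: float, n_iter: int = 500) -> np.ndarray:
        """Forward (gradient) step on the smooth data-fidelity term,
        backward (proximal) step on the nonsmooth l1 regularizer."""
        step = 1.0 / np.linalg.norm(A, ord=2) ** 2   # 1 / Lipschitz constant of the gradient
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - y)                            # gradient of 0.5*||y - Ax||^2
            x = soft_threshold(x - step * grad, step * lam)     # prox of step*lam*||.||_1
        return x

    # Hypothetical compressed-sensing instance: sparse signal, random measurements.
    rng = np.random.default_rng(1)
    n, p, k = 64, 256, 5
    A = rng.normal(size=(n, p)) / np.sqrt(n)
    x_true = np.zeros(p)
    x_true[rng.choice(p, k, replace=False)] = rng.normal(size=k)
    y = A @ x_true + 0.01 * rng.normal(size=n)
    x_hat = forward_backward(A, y, lam=0.05)
    print("nonzeros in estimate:", np.count_nonzero(np.abs(x_hat) > 1e-3), "/ true:", k)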

    Contracting innovations and the evolution of clearing and settlement methods at futures exchanges

    Defining futures contracts as substitutes for associated cash transactions enables a discussion of the evolution of controls over contract nonperformance risk. These controls are incorporated into exchange methods for clearing contracts. Three clearing methods are discussed: direct, ringing and complete. The incidence and operation of each are described. Direct-clearing systems feature bilateral contracts with terms specified by the counterparties to the contract. Exchanges relying on direct-clearing systems chiefly serve as mediators in trade disputes. Ringing is shown to facilitate contract offset by increasing the number of potential counterparties. Ringing settlements reduce counterparty credit risk by reducing the accumulation of dependencies as contracts are offset. Ringing settlements also lower the cost of maintaining open contract positions, chiefly by lowering the amount of required margin deposits. Exchanges employing ringing methods generally adopted a clearinghouse to handle payments. Complete clearing interposes the clearinghouse as counterparty to every contract. This measure ensures that contracts are fungible with respect to both the underlying commodity and counterparty risk.
    Keywords: Contracts; Futures; Clearinghouses (Banking)
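
    A toy numerical sketch of the offsetting effect the abstract describes: under direct clearing every bilateral contract stays open against its original counterparty, whereas interposing a clearinghouse (complete clearing) reduces each trader's exposure to a single net position. The traders, quantities and accounting convention below are invented purely for illustration and are not taken from the paper.

    # Gross bilateral exposure (direct clearing) vs. net exposure after
    # multilateral netting through a central clearinghouse (complete clearing).
    from collections import defaultdict

    # Hypothetical bilateral positions: (buyer, seller, number of contracts).
    trades = [
        ("A", "B", 10),
        ("B", "C", 10),
        ("C", "A", 10),   # A-B-C form a "ring" whose positions can be offset
        ("A", "C", 5),
    ]

    # Direct clearing: every bilateral contract remains open with its counterparty.
    gross_exposure = sum(qty for _, _, qty in trades)

    # Complete clearing: the clearinghouse becomes counterparty to each side,
    # so each trader's obligation collapses to a single net long or short position.
    net = defaultdict(int)
    for buyer, seller, qty in trades:
        net[buyer] += qty    # long position
        net[seller] -= qty   # short position
    net_open_interest = sum(q for q in net.values() if q > 0)  # matched by equal shorts

    print("gross (direct) exposure:", gross_exposure)        # 35 contracts
    print("net (cleared) open interest:", net_open_interest)  # 5 contracts after offsetting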