
    Level set percolation for random interlacements and the Gaussian free field

    We consider continuous-time random interlacements on Z^d, d ≥ 3, and investigate the percolation model where a site x of Z^d is occupied if the total amount of time spent at x by all the trajectories of the interlacement at level u > 0 exceeds some given non-negative parameter, and empty otherwise. Thus, the set of occupied sites forms a subset of the interlacement at level u. We also investigate percolation properties of empty sites. A recent isomorphism theorem arXiv:1111.4818 of Sznitman enables us to "translate" some of the relevant questions into the language of level-set percolation for the Gaussian free field on Z^d, d ≥ 3, for which useful tools have been developed in arXiv:1202.5172. We also gain new insights of independent interest concerning "two-sided" level-set percolation, where a site x of Z^d is occupied if and only if the absolute value of the field variable at that site exceeds a given non-negative level. Comment: 32 pages, 1 figure.
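    As a sketch of the three level sets described in this abstract (the notation below is illustrative, not quoted from the paper): writing L_{x,u} for the total time spent at x by the interlacement trajectories at level u > 0, φ for the Gaussian free field on Z^d, and α, h ≥ 0 for the thresholds,

    \[
        \{x \in \mathbb{Z}^d : L_{x,u} \ge \alpha\}, \qquad
        \{x \in \mathbb{Z}^d : \varphi_x \ge h\}, \qquad
        \{x \in \mathbb{Z}^d : |\varphi_x| \ge h\},
    \]

    the first being the set of occupied sites of the interlacement model, the second the (one-sided) level set of the free field, and the third the "two-sided" level set mentioned at the end of the abstract.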

    Time series averaging from a probabilistic interpretation of time-elastic kernels

    In the light of regularized dynamic time warping kernels, this paper reconsiders the concept of a time elastic centroid (TEC) for a set of time series. From this perspective, we first show how the TEC can be addressed as a preimage problem. Unfortunately, this preimage problem is ill-posed, may suffer from over-fitting, especially for long time series, and obtaining even a sub-optimal solution involves heavy computational costs. We then derive two new algorithms based on a probabilistic interpretation of kernel alignment matrices, expressed in terms of probability distributions over sets of alignment paths. The first algorithm is an iterative agglomerative heuristic inspired by the state-of-the-art DTW barycenter averaging (DBA) algorithm, which was proposed specifically for the Dynamic Time Warping measure. The second algorithm performs a classical averaging of the aligned samples but also averages the times of occurrence of the aligned samples, using a straightforward progressive agglomerative heuristic. An experiment on 45 time series datasets compares the classification error rates of first-nearest-neighbor classifiers that represent each category by a single medoid or centroid estimate. It shows that: i) centroid-based approaches significantly outperform medoid-based approaches; ii) in the considered experiments, the two proposed algorithms outperform the state-of-the-art DBA algorithm; and iii) the second proposed algorithm, which averages jointly in the sample space and along the time axis, emerges as the most robust time elastic averaging heuristic, with an interesting noise-reduction capability. Index Terms: time series averaging, time elastic kernel, Dynamic Time Warping, time series clustering and classification.
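    As a point of reference for the DBA baseline mentioned above (this is a minimal sketch of DTW-barycenter-style averaging, not the paper's probabilistic kernel-based algorithms; the names dtw_path and dba_update are hypothetical):

    import numpy as np

    def dtw_path(a, b):
        """Return one optimal DTW alignment path [(i, j), ...] between 1-D series a and b."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = (a[i - 1] - b[j - 1]) ** 2
                cost[i, j] = d + min(cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1])
        # Backtrack from (n, m) to recover the alignment path.
        path, i, j = [], n, m
        while i > 0 and j > 0:
            path.append((i - 1, j - 1))
            step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
            if step == 0:
                i, j = i - 1, j - 1
            elif step == 1:
                i -= 1
            else:
                j -= 1
        return path[::-1]

    def dba_update(centroid, series_list):
        """One DBA-style update: each centroid sample becomes the mean of all
        samples that DTW aligns to that position."""
        buckets = [[] for _ in centroid]
        for s in series_list:
            for ci, si in dtw_path(centroid, s):
                buckets[ci].append(s[si])
        return np.array([np.mean(b) if b else c for b, c in zip(buckets, centroid)])

    # Usage sketch: noisy sine waves of slightly different lengths.
    rng = np.random.default_rng(0)
    series_list = [np.sin(np.linspace(0, 6, 50 + k)) + 0.1 * rng.standard_normal(50 + k)
                   for k in range(5)]
    centroid = series_list[0].copy()
    for _ in range(10):  # iterate until the centroid stabilizes in practice
        centroid = dba_update(centroid, series_list)

    Each update aligns every series to the current centroid and replaces each centroid sample by the mean of the samples aligned to it; the paper's second algorithm additionally averages the times of occurrence of the aligned samples.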

    Exchange functionals based on finite uniform electron gases

    We show how one can construct a simple exchange functional by extending the well-known local-density approximation (LDA) to finite uniform electron gases. This new generalized local-density approximation (GLDA) functional uses only two quantities: the electron density ρ and the curvature of the Fermi hole α. This alternative "rung 2" functional can be easily coupled with generalized-gradient approximation (GGA) functionals to form a new family of "rung 3" meta-GGA (MGGA) functionals that we have named factorizable MGGAs (FMGGAs). Comparisons are made with various LDA, GGA and MGGA functionals for atoms and molecules. Comment: 20 pages, 5 figures and 2 tables.
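    The abstract does not spell out the working equations; as a point of reference, the quantity α that the meta-GGA literature uses to measure the curvature of the Fermi hole is conventionally defined as below (this standard definition is assumed here, not quoted from the paper):

    \[
        \alpha = \frac{\tau - \tau^{\mathrm{W}}}{\tau^{\mathrm{unif}}}, \qquad
        \tau^{\mathrm{W}} = \frac{|\nabla\rho|^{2}}{8\rho}, \qquad
        \tau^{\mathrm{unif}} = \tfrac{3}{10}\,(3\pi^{2})^{2/3}\rho^{5/3},
    \]

    where τ is the kinetic energy density, τ^W its von Weizsäcker (single-orbital) limit, and τ^unif its uniform-electron-gas value.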

    Advertising budgeting practices of Belgian industrial marketers.

    The author reports on the results of a survey of a random sample of 102 Belgian industrial companies, which measured which budget-setting processes companies use, how they set budgets, and the resulting budget composition. The objective of the study was, first, to compare the results with international practice and, second, to try to explain the companies' budgeting practices as a function of company, product and market characteristics measured in the same survey. The major conclusions are mixed: on the negative side, we found a lot of heterogeneity in process usage, budget-setting rules and media shares, but not much variance that could be explained with the available independent variables. On the positive side, we found that Belgian companies are 'well behaved' according to expectations based on marketing theory; their use of specific communication objectives, for example, rests on sound principles. One of the major conclusions is that product type is the major determinant of the communication behavior of companies, together with company size, while market factors play a minor role. These results clearly underline the need for effect-measurement studies that would help companies set the size of their communication budgets and allocate these budgets over specific media.