
    Two cheers for the urban white paper

    In November 2000, the government finally published its Urban White Paper. Our Towns and Cities: The Future appeared over a year after the Rogers Urban Task Force report, to which it provided an indirect official response, and no less than 23 years after the last such statement of government urban policy.

    Periodogram and likelihood periodicity search in the SNO solar neutrino data

    In this work a detailed spectral analysis for periodicity search of the time series of the 8B solar neutrino flux released by the SNO Collaboration is presented. The data have been publicly released with truncation of the event times to the unit of day (1-day binning); they are thus suited to undergo the traditional Lomb-Scargle analysis for periodicity investigation, as well as an extension of that method based on a likelihood approach. The results of the analysis presented here confirm the absence of modulation signatures in the SNO data. For completeness, a more refined "1-day binned" likelihood is also illustrated, which approximates the unbinned likelihood methodology, based upon the availability of the full time information, adopted by the SNO Collaboration. Finally, this work is completed with two different joint analyses of the SNO and Super-Kamiokande data, respectively over the common and the entire data-taking periods. While both analyses reinforce the case for the constancy of the neutrino flux, the latter in addition provides evidence of the detection, at the 99.7% confidence level, of the annual modulation spectral line due to the eccentricity of the Earth's orbit around the Sun.
    Comment: 27 pages, 29 figures. Joint periodicity analysis of the SNO and Super-Kamiokande data added. Accepted for publication in Phys. Rev.
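    A Lomb-Scargle search like the one described above can be sketched in a few lines; the time stamps, period grid, and injected annual modulation below are synthetic stand-ins, not the SNO release.

```python
# Minimal Lomb-Scargle periodicity search on irregularly sampled,
# day-binned flux data (synthetic example, not the SNO data set).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
# 400 observation days, irregularly spread over ~4 years.
days = np.sort(rng.choice(np.arange(1, 1500), size=400, replace=False)).astype(float)
# Flux with an injected annual modulation plus Gaussian noise.
flux = 5.0 + 0.5 * np.cos(2 * np.pi * days / 365.25) + rng.normal(0, 0.3, days.size)

# Trial periods from 10 days up to ~2 years, as angular frequencies.
periods = np.linspace(10, 730, 2000)
omega = 2 * np.pi / periods

# lombscargle expects zero-mean data for the classical normalization.
power = lombscargle(days, flux - flux.mean(), omega, normalize=True)
best_period = periods[np.argmax(power)]
print(f"strongest candidate period: {best_period:.1f} days")
```

    A likelihood-based extension, as used in the paper, would replace the periodogram with a fit of a modulated rate model to the binned event times, but the frequency scan proceeds the same way.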

    Assessing the contribution of shallow and deep knowledge sources for word sense disambiguation

    Corpus-based techniques have proved to be very beneficial in the development of efficient and accurate approaches to word sense disambiguation (WSD), despite the fact that they generally represent relatively shallow knowledge. It has always been thought, however, that WSD could also benefit from deeper knowledge sources. We describe a novel approach to WSD that uses inductive logic programming to learn theories from first-order logic representations, allowing corpus-based evidence to be combined with any kind of background knowledge. This approach has been shown to be effective over several disambiguation tasks using a combination of deep and shallow knowledge sources. It is important to understand the contribution of the various knowledge sources used in such a system. This paper investigates the contribution of nine knowledge sources to the performance of the disambiguation models produced for the SemEval-2007 English lexical sample task. The outcome of this analysis will assist future work on WSD in concentrating on the most useful knowledge sources.

    Testing linear hypotheses in high-dimensional regressions

    For a multivariate linear model, Wilks' likelihood ratio test (LRT) constitutes one of the cornerstone tools. However, the computation of its quantiles under the null or the alternative requires complex analytic approximations and, more importantly, these distributional approximations are feasible only for a moderate dimension of the dependent variable, say p ≤ 20. On the other hand, assuming that the data dimension p as well as the number q of regression variables are fixed while the sample size n grows, several asymptotic approximations have been proposed in the literature for Wilks' Λ, including the widely used chi-square approximation. In this paper, we consider necessary modifications to Wilks' test in a high-dimensional context, specifically assuming a high data dimension p and a large sample size n. Based on recent random matrix theory, the correction we propose to Wilks' test is asymptotically Gaussian under the null, and simulations demonstrate that the corrected LRT has very satisfactory size and power, not only in the large p, large n context but also for moderately large data dimensions such as p = 30 or p = 50. As a byproduct, we give a reason explaining why the standard chi-square approximation fails for high-dimensional data. We also introduce a new procedure for the classical multiple sample significance test in MANOVA which is valid for high-dimensional data.
    Comment: Accepted 02/2012 for publication in "Statistics". 20 pages, 2 figures, and 2 tables.
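    To make the setting concrete, here is a minimal sketch of Wilks' LRT with Bartlett's classical chi-square approximation for testing H0: B = 0 in the model Y = XB + E. The sizes and data are illustrative, and this is the uncorrected test whose breakdown for large p motivates the paper's random-matrix correction.

```python
# Wilks' lambda and Bartlett's chi-square approximation for the
# multivariate regression null H0: B = 0 (illustrative sketch).
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
n, p, q = 200, 5, 3          # sample size, response dimension, regressors
X = rng.normal(size=(n, q))
Y = rng.normal(size=(n, p))  # generated under the null (no effect)

B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
resid = Y - X @ B_hat
E = resid.T @ resid                  # residual SSCP matrix
H = (X @ B_hat).T @ (X @ B_hat)      # hypothesis SSCP for H0: B = 0
wilks_lambda = np.linalg.det(E) / np.linalg.det(E + H)

# Bartlett's correction with error df m = n - q and hypothesis df q;
# this is the approximation that fails once p grows with n.
m = n - q
stat = -(m - (p - q + 1) / 2) * np.log(wilks_lambda)
p_value = chi2.sf(stat, df=p * q)
print(f"Wilks lambda = {wilks_lambda:.4f}, p-value = {p_value:.3f}")
```

    The paper's corrected statistic instead recenters and rescales log Λ so that it is asymptotically Gaussian when p and n grow together.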

    Preface: Advances in post-processing and blending of deterministic and ensemble forecasts

    The special issue on advances in post-processing and blending of deterministic and ensemble forecasts is the outcome of several successive, successful sessions organized at the General Assembly of the European Geosciences Union. Statistical post-processing and blending of forecasts currently receive considerable attention and development effort in many countries as a means of producing optimal forecasts. Ten contributions have been received, covering key aspects of current concerns in statistical post-processing, namely the restoration of inter-variable dependences, the impact of model changes on the statistical relationships and how to cope with it, operational implementation at forecasting centers, the development of appropriate metrics for forecast verification, and finally two specific applications, to snow forecasts and to seasonal forecasts of the North Atlantic Oscillation.

    Microscopic mechanism for mechanical polishing of diamond (110) surfaces

    Mechanically induced degradation of diamond, as occurs during polishing, is studied using total-energy pseudopotential calculations. The strong asymmetry in the rate of polishing between different directions on the diamond (110) surface is explained in terms of an atomistic mechanism for nano-groove formation. The post-polishing surface morphology and the nature of the polishing residue predicted by this mechanism are consistent with experimental evidence.
    Comment: 4 pages, 5 figures.

    Optical quenching and recovery of photoconductivity in single-crystal diamond

    We study the photocurrent induced by pulsed-light illumination (pulse duration of several nanoseconds) of single-crystal diamond containing nitrogen impurities. Application of additional continuous-wave light of the same wavelength quenches the pulsed photocurrent. Characterization of the optically quenched photocurrent and its recovery is important for the development of diamond-based electronics and sensing.

    A New Technique for Finding Needles in Haystacks: A Geometric Approach to Distinguishing Between a New Source and Random Fluctuations

    We propose a new test statistic based on a score process for determining the statistical significance of a putative signal that may be a small perturbation to a noisy experimental background. We derive the reference distribution for this score test statistic; it has an elegant geometrical interpretation as well as broad applicability. We illustrate the technique in the context of a model problem from high-energy particle physics. Monte Carlo experimental results confirm that the score test results in a significantly improved rate of signal detection.
    Comment: 5 pages, 4 figures.
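    For intuition, a standard score test for a signal-strength parameter in a binned Poisson counting model can be sketched as follows. This is the textbook statistic for a signal of known shape and location, not the paper's geometric score-process extension, and all numbers are made up.

```python
# Score test for mu in the counting model n_i ~ Poisson(b_i + mu * s_i),
# evaluated at mu = 0 (illustrative sketch with synthetic data).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
b = np.full(50, 100.0)                               # known background per bin
s = np.exp(-0.5 * ((np.arange(50) - 25) / 3) ** 2)   # Gaussian signal template
n = rng.poisson(b + 30.0 * s)                        # data with an injected signal

score = np.sum(s * (n - b) / b)      # d(log-likelihood)/d(mu) at mu = 0
info = np.sum(s ** 2 / b)            # Fisher information at mu = 0
z = score / np.sqrt(info)            # asymptotically N(0, 1) under H0
p_value = norm.sf(z)                 # one-sided p-value
print(f"z = {z:.2f}, one-sided p = {p_value:.3f}")
```

    When the signal's location is itself unknown, the score becomes a process indexed by that location, and the paper's reference distribution accounts for the maximization over it.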

    Population Cycling in Space-Limited Organisms Subject to Density-Dependent Predation

    We present a population model with density-dependent disturbance. The model is motivated by, and is illustrated with, data on the percentage of space covered by barnacles on quadrats of rock in the intertidal zone. The autocorrelation function observed indicates population cycling. This autocorrelation function is predicted qualitatively and quantitatively by the detailed model we present. The general version of the model suggests the following rules regarding cycling in space-limited communities subject to density-dependent disturbances. These rules may apply to any space-limited community where a density-dependent disturbance, such as fire or wind in plant communities, reduces population densities to very low levels. We propose that the period of the cycle will be approximately equal to the time it takes the community to reach a critical density plus the average time between disturbance events when the density is above that critical density. The cycling will only be clear from autocorrelation data if the growth process is relatively consistent, there is a critical density (which the sessile organism reaches and passes) above which the probability of disturbance increases rapidly, and the time to reach the critical density is at least twice the average time between disturbance events.
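    The proposed rule (cycle period ≈ time to reach the critical density plus the mean waiting time above it) can be illustrated with a toy simulation; the logistic growth, threshold, and disturbance hazard below are illustrative assumptions, not the fitted barnacle model of the paper.

```python
# Toy model: cover grows logistically; above a critical density,
# a disturbance arrives with constant per-step probability and
# resets the population to near zero.
import numpy as np

rng = np.random.default_rng(3)
r, K, crit, hazard = 0.5, 1.0, 0.6, 0.2   # growth rate, capacity, threshold, disturbance prob.
x, series = 0.05, []
for t in range(2000):
    x += r * x * (1 - x / K)              # logistic growth of cover fraction
    if x > crit and rng.random() < hazard:
        x = 0.05                          # disturbance resets the population
    series.append(x)

series = np.array(series)
# Resets are the large downward jumps; mean spacing = observed cycle period.
resets = np.where(np.diff(series) < -0.3)[0]
mean_period = np.mean(np.diff(resets))
print(f"mean observed period: {mean_period:.1f} steps")
```

    With these parameters the cover takes roughly 8 steps to pass the threshold and then waits on average 1/hazard = 5 steps for a disturbance, so the rule predicts a period of about 13 steps, which the simulated reset spacing reproduces.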