3,908 research outputs found

    Two cheers for the urban white paper

    In November 2000, the government finally published its Urban White Paper. Our Towns and Cities: The Future appeared over a year after the Rogers Urban Task Force report, to which it provided an indirect official response, and no less than 23 years after the last such statement of government urban policy.

    Periodogram and likelihood periodicity search in the SNO solar neutrino data

    This work presents a detailed spectral analysis searching for periodicities in the time series of the 8B solar neutrino flux released by the SNO Collaboration. The data have been publicly released with the event times truncated to the unit of one day (1-day binning); they are thus suited to the traditional Lomb-Scargle periodicity analysis, as well as to an extension of that method based on a likelihood approach. The results presented here confirm the absence of modulation signatures in the SNO data. For completeness, a more refined "1-day binned" likelihood is also illustrated, which approximates the unbinned likelihood methodology, based on the full time information, adopted by the SNO Collaboration. Finally, the work is completed with two joint analyses of the SNO and Super-Kamiokande data, over their common data-taking period and over the entire data-taking period, respectively. While both analyses reinforce the case for a constant neutrino flux, the latter additionally provides detection at the 99.7% confidence level of the annual modulation spectral line due to the eccentricity of the Earth's orbit around the Sun. Comment: 27 pages, 29 figures. Joint periodicity analysis of the SNO and Super-Kamiokande data added. Accepted for publication in Phys. Rev.
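    The Lomb-Scargle scan mentioned above can be illustrated in a few lines. The sketch below is not the SNO analysis itself: it scans a synthetic, unevenly sampled, day-binned flux series for a periodic signal, and the array names (days, flux) and the injected annual modulation are purely illustrative assumptions.

        # Hedged illustration of a Lomb-Scargle periodicity scan on a day-binned,
        # unevenly sampled time series; the data here are synthetic, not SNO data.
        import numpy as np
        from scipy.signal import lombscargle

        rng = np.random.default_rng(0)
        days = np.sort(rng.choice(np.arange(3650.0), size=500, replace=False))  # event days
        flux = (1.0
                + 0.02 * np.cos(2 * np.pi * days / 365.25)   # small annual modulation
                + 0.05 * rng.standard_normal(days.size))     # noise

        periods = np.linspace(10.0, 1000.0, 2000)    # trial periods in days
        ang_freqs = 2 * np.pi / periods              # scipy expects angular frequencies
        power = lombscargle(days, flux - flux.mean(), ang_freqs, normalize=True)

        print(f"strongest periodicity near {periods[np.argmax(power)]:.1f} days")

    A likelihood-based extension of the kind described in the abstract would replace the periodogram power with a likelihood scan over the same trial frequencies.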

    Assessing the contribution of shallow and deep knowledge sources for word sense disambiguation

    Corpus-based techniques have proved very beneficial in developing efficient and accurate approaches to word sense disambiguation (WSD), despite the fact that they generally represent relatively shallow knowledge. It has always been thought, however, that WSD could also benefit from deeper knowledge sources. We describe a novel approach to WSD that uses inductive logic programming to learn theories from first-order logic representations, allowing corpus-based evidence to be combined with any kind of background knowledge. This approach has been shown to be effective over several disambiguation tasks using a combination of deep and shallow knowledge sources. It is important to understand the contribution of the various knowledge sources used in such a system. This paper investigates the contribution of nine knowledge sources to the performance of the disambiguation models produced for the SemEval-2007 English lexical sample task. The outcome of this analysis will assist future work on WSD in concentrating on the most useful knowledge sources.
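    As a rough, hedged sketch of the kind of knowledge-source contribution analysis described above (not the paper's inductive-logic-programming system), one can train a simple classifier repeatedly, leaving out one feature group at a time; the feature-group names and data below are invented placeholders.

        # Leave-one-source-out ablation with a plain bag-of-features classifier.
        # Feature groups and labels are hypothetical stand-ins for real WSD knowledge sources.
        import numpy as np
        from sklearn.naive_bayes import BernoulliNB
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_samples = 400
        groups = {                      # hypothetical knowledge sources -> binary feature columns
            "collocations": rng.integers(0, 2, (n_samples, 30)),
            "pos_tags":     rng.integers(0, 2, (n_samples, 10)),
            "syntax":       rng.integers(0, 2, (n_samples, 20)),
        }
        y = rng.integers(0, 3, n_samples)   # hypothetical sense labels

        def accuracy(active):
            X = np.hstack([groups[g] for g in active])
            return cross_val_score(BernoulliNB(), X, y, cv=5).mean()

        full = accuracy(list(groups))
        for g in groups:
            drop = accuracy([h for h in groups if h != g])
            print(f"without {g:13s}: accuracy change {drop - full:+.3f}")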

    Testing linear hypotheses in high-dimensional regressions

    For a multivariate linear model, Wilks' likelihood ratio test (LRT) constitutes one of the cornerstone tools. However, the computation of its quantiles under the null or the alternative requires complex analytic approximations, and more importantly, these distributional approximations are feasible only for a moderate dimension of the dependent variable, say p ≤ 20. On the other hand, assuming that the data dimension p as well as the number q of regression variables are fixed while the sample size n grows, several asymptotic approximations have been proposed in the literature for Wilks' Λ, including the widely used chi-square approximation. In this paper, we consider necessary modifications to Wilks' test in a high-dimensional context, specifically assuming a high data dimension p and a large sample size n. Based on recent random matrix theory, the correction we propose to Wilks' test is asymptotically Gaussian under the null, and simulations demonstrate that the corrected LRT has very satisfactory size and power, certainly in the large-p, large-n context, but also for moderately large data dimensions such as p = 30 or p = 50. As a byproduct, we explain why the standard chi-square approximation fails for high-dimensional data. We also introduce a new procedure for the classical multiple-sample significance test in MANOVA which is valid for high-dimensional data. Comment: Accepted 02/2012 for publication in "Statistics". 20 pages, 2 figures and 2 tables.
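    To make the quantities concrete, here is a minimal sketch (not the paper's corrected test): it computes Wilks' Λ for testing B = 0 in a multivariate linear model Y = XB + E on data simulated under the null, and applies the classical Bartlett chi-square approximation that the abstract says breaks down when p grows with n; all names and sizes are illustrative.

        # Wilks' Lambda and its Bartlett chi-square approximation on simulated null data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        n, p, q = 200, 10, 3                     # sample size, response dimension, regressors
        X = rng.standard_normal((n, q))
        Y = rng.standard_normal((n, p))          # generated under the null: Y independent of X

        beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
        resid = Y - X @ beta_hat
        E = resid.T @ resid                      # error SSCP matrix
        H = (X @ beta_hat).T @ (X @ beta_hat)    # hypothesis SSCP matrix

        wilks_lambda = np.linalg.det(E) / np.linalg.det(E + H)

        # Bartlett's approximation: -(n_E - (p - q + 1)/2) * log(Lambda) ~ chi2(p*q),
        # with error degrees of freedom n_E = n - q for this intercept-free model.
        bartlett = -((n - q) - (p - q + 1) / 2) * np.log(wilks_lambda)
        print(f"Wilks' Lambda = {wilks_lambda:.4f}, "
              f"chi2 p-value = {stats.chi2.sf(bartlett, df=p * q):.3f}")

    When p becomes comparable to n, the chi-square reference distribution used above is no longer accurate, which is the regime the random-matrix correction in the paper targets.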

    Microscopic mechanism for mechanical polishing of diamond (110) surfaces

    Mechanically induced degradation of diamond, as occurs during polishing, is studied using total-energy pseudopotential calculations. The strong asymmetry in the rate of polishing between different directions on the diamond (110) surface is explained in terms of an atomistic mechanism for nano-groove formation. The post-polishing surface morphology and the nature of the polishing residue predicted by this mechanism are consistent with experimental evidence. Comment: 4 pages, 5 figures.

    Focusing of Intense Subpicosecond Laser Pulses in Wedge Targets

    Two-dimensional particle-in-cell simulations characterizing the interaction of ultraintense short-pulse lasers in the range 10^18 ≤ I ≤ 10^20 W/cm^2 with converging target geometries are presented. Seeking to examine intensity amplification in high-power laser systems, where focal spots are typically non-diffraction-limited, we describe key dynamical features as the injected laser intensity and the convergence angle of the target are systematically varied. We find that laser pulses are focused down to a wavelength, with the peak intensity amplified by an order of magnitude beyond its vacuum value, and we develop a simple model for how the peak location moves back towards the injection plane over time. This performance is sustained over hundreds of femtoseconds and scales to laser intensities beyond 10^20 W/cm^2 at 1 μm wavelength. Comment: 5 pages, 6 figures, accepted for publication in Physics of Plasmas.

    Thermodynamics of spinning D3-branes

    Spinning black three-branes in type IIB supergravity are thermodynamically stable up to a critical value of the angular momentum density. Inside the region of thermodynamic stability, the free energy from supergravity is roughly reproduced by a naive model based on free N=4 super-Yang-Mills theory on the world-volume. The field theory model correctly predicts a limit on the angular momentum density, but near this limit it does not reproduce the critical exponents one can compute from supergravity. Analogies with Bose condensation and modified matrix models are discussed, and a mean-field-theory improvement of the naive model is suggested which corrects the critical exponents. Comment: 20 pages, 1 figure, small improvements.

    Optical quenching and recovery of photoconductivity in single-crystal diamond

    We study the photocurrent induced by pulsed-light illumination (pulse duration of several nanoseconds) of single-crystal diamond containing nitrogen impurities. Application of additional continuous-wave light of the same wavelength quenches the pulsed photocurrent. Characterization of the optically quenched photocurrent and its recovery is important for the development of diamond-based electronics and sensing.

    Preface: Advances in post-processing and blending of deterministic and ensemble forecasts

    The special issue on advances in post-processing and blending of deterministic and ensemble forecasts is the outcome of several successful, successive sessions organized at the General Assembly of the European Geosciences Union. Statistical post-processing and blending of forecasts currently receive considerable attention and development effort in many countries in order to produce optimal forecasts. Ten contributions have been received, covering key aspects of current concerns in statistical post-processing, namely the restoration of inter-variable dependences, the impact of model changes on the statistical relationships and how to cope with it, operational implementation at forecasting centers, the development of appropriate metrics for forecast verification, and finally two specific applications, to snow forecasts and to seasonal forecasts of the North Atlantic Oscillation.