
    Borel Degenerations of Arithmetically Cohen-Macaulay curves in P^3

    We investigate Borel ideals on the Hilbert scheme components of arithmetically Cohen-Macaulay (ACM) codimension two schemes in P^n. We give a basic necessary criterion for a Borel ideal to lie on such a component. Then, considering ACM curves in P^3 on a quadric, we compute in several examples all the Borel ideals on their Hilbert scheme component. Based on this we conjecture which Borel ideals lie on such a component, and for a range of Borel ideals we prove that they do. Comment: 20 pages; shorter and more effective version.
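
    For context, the Borel condition the abstract relies on has a standard characterisation (in characteristic zero, Borel-fixed is equivalent to strongly stable); the formulation below is the textbook one and is not taken from the paper itself:

        % A monomial ideal I in k[x_0, ..., x_n] is Borel-fixed (char 0)
        % precisely when it is strongly stable:
        \[
          m \in I,\quad x_j \mid m,\quad i < j
          \;\Longrightarrow\; \frac{x_i}{x_j}\, m \in I .
        \]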

    Term testing: a case study

    Purpose and background: The litigation world has many examples of cases where the volume of Electronically Stored Information (ESI) demands that litigators use automatic means to assist with document identification, classification, and filtering. This case study describes one such process for one case; it is not a comprehensive analysis of the entire case, only of the Term Testing portion. Term Testing is an analytical practice of refining match terms by running in-depth analysis on a sampling of documents. The goal of term testing is to reduce, as much as possible, the number of false negatives (relevant/privileged documents with no match, also known as “misdetections”) and false positives (documents matched but not actually relevant/privileged). The case was an employment discrimination suit against a government agency. The collection effort turned up common sources of ESI: hard drives, network shares, CDs and DVDs, and routine e-mail storage and backups. Initial collection, interviews, and reviews had revealed that a few key documents, such as old versions of policies, had not been retained or collected. Then an unexpected source of information was unearthed: one network administrator had been running an unauthorized “just-in-case” tracer on the email system, outside the agency’s document retention policies, which created dozens of tapes holding millions of encrypted, compressed emails covering more years than the agency’s routine email backups. The agency decided to process and review these tracer emails for the missing key documents, even though the overall volume of relevant documents would rise dramatically. The agency had clear motivation to reduce the volume of documents flowing into relevancy and privilege reviews, but had concerns about the defensibility of using an automated process to determine which documents would never be reviewed. The case litigators and Subject Matter Experts (SMEs) decided to use a process of Term Testing to ensure that the automated filtering was both defensible and as accurate as possible.
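
    As a rough illustration of the quantities Term Testing tries to drive down (a minimal sketch with hypothetical document IDs; this is not the workflow or tooling used in the case):

        # Illustrative sketch only. Given SME review labels for a sample and the
        # documents hit by a candidate search term, compute the false-negative /
        # false-positive counts that term testing iterates on.
        def term_test_metrics(sample_relevant: set, sample_irrelevant: set, term_hits: set):
            """All arguments are sets of document IDs from the reviewed sample."""
            false_negatives = sample_relevant - term_hits    # relevant/privileged but missed ("misdetections")
            false_positives = term_hits & sample_irrelevant  # matched but not actually relevant/privileged
            true_positives = term_hits & sample_relevant
            recall = len(true_positives) / len(sample_relevant) if sample_relevant else 0.0
            precision = len(true_positives) / len(term_hits) if term_hits else 0.0
            return {
                "false_negatives": len(false_negatives),
                "false_positives": len(false_positives),
                "recall": recall,
                "precision": precision,
            }

        # Hypothetical example:
        # term_test_metrics({"d1", "d2", "d3"}, {"d4", "d5"}, {"d2", "d3", "d4"})
        # -> 1 false negative, 1 false positive, recall and precision both ~0.67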

    Combining Harmonic Generation and Laser Chirping to Achieve High Spectral Density in Compton Sources

    Recently, various laser-chirping schemes have been investigated with the goal of reducing or eliminating ponderomotive line broadening in Compton or Thomson scattering occurring at high laser intensities. As a next level of detail in the spectrum calculations, we have calculated the line smoothing and broadening expected due to incident beam energy spread within a one-dimensional plane wave model for the incident laser pulse, both for compensated (chirped) and unchirped cases. The scattered compensated distributions are treatable analytically within three models for the envelope of the incident laser pulses: Gaussian, Lorentzian, or hyperbolic secant. We use the new results to demonstrate that laser chirping in Compton sources at high laser intensities: (i) enables the use of higher order harmonics, thereby reducing the required electron beam energies; and (ii) increases the photon yield in a small frequency band beyond that possible with the fundamental without chirping. This combination of chirping and higher harmonics can lead to substantial savings in the design, construction and operational costs of new Compton sources. This is of particular importance to the widely popular laser-plasma-accelerator-based Compton sources, as the improvement in their beam quality enters the regime where chirping is most effective. Comment: 5 pages, 4 figures.
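
    For orientation, a standard plane-wave, on-axis estimate (not a result of this paper) gives the energy of the n-th harmonic scattered from electrons of Lorentz factor gamma by a linearly polarised laser of photon energy E_L and normalised vector potential a_0 as

        \[
          E_n \;\approx\; \frac{4\, n\, \gamma^2\, E_L}{1 + a_0^2/2 + \gamma^2 \theta^2},
        \]

    so for a fixed target photon energy the electron energy can drop by roughly a factor of sqrt(n) when working on harmonic n, while the intensity-dependent a_0^2/2 term is the ponderomotive shift that chirping is designed to compensate.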

    Barefoot running improves economy at high intensities and peak treadmill velocity

    Aim: Barefoot running can improve running economy (RE) compared to shod running at low exercise intensities, but data are lacking for the higher intensities typical of many distance running competitions. The influence of barefoot running on the velocity at maximal oxygen uptake (vVO2max) and peak incremental treadmill test velocity (vmax) is unknown. The present study tested the hypotheses that barefoot running would improve RE, vVO2max and vmax relative to shod running. Methods: Using a balanced within-subject repeated measures design, eight male runners (aged 23.1±4.5 years, height 1.80±0.06 m, mass 73.8±11.5 kg, VO2max 4.08±0.39 L·min-1) completed a familiarization followed by one barefoot and one shod treadmill running trial, 2-14 days apart. Trial sessions consisted of a 5 minute warm-up, a 5 minute rest, then 4×4 minute stages at speeds corresponding to ~67, 75, 84 and 91% of shod VO2max respectively, separated by 1 minute rests. After the 4th stage, treadmill speed was incremented by 0.1 km·h-1 every 15 s until participants reached volitional exhaustion. Results: RE was improved by 4.4±7.0% across intensities in the barefoot condition (P=0.040). The improvement in RE was related to removed shoe mass (r^2=0.80, P=0.003), with the fitted line crossing 0% RE improvement at a total shoe mass of 0.520 kg. Both vVO2max (by 4.5±5.0%, P=0.048) and vmax (by 3.9±4.0%, P=0.030) also improved, but VO2max was unchanged (P=0.747). Conclusion: Barefoot running improves RE at high exercise intensities and increases vVO2max and vmax, but further research is required to clarify the influence of very light shoe weights on RE.
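
    A minimal sketch of how a zero-improvement crossing such as the reported 0.520 kg can be read off a linear fit of RE improvement against removed shoe mass; the arrays below are hypothetical placeholders, not the study's data:

        # Hypothetical illustration only -- placeholder values, not the study's data.
        import numpy as np

        shoe_mass_kg = np.array([0.40, 0.45, 0.55, 0.60, 0.65, 0.70, 0.75, 0.80])   # total mass of removed shoes (hypothetical)
        re_improvement_pct = np.array([-1.5, -0.8, 0.6, 1.1, 1.8, 2.3, 3.0, 3.4])   # % change in running economy (hypothetical)

        slope, intercept = np.polyfit(shoe_mass_kg, re_improvement_pct, 1)
        r = np.corrcoef(shoe_mass_kg, re_improvement_pct)[0, 1]

        # Shoe mass at which the fitted line crosses 0% improvement
        zero_crossing_kg = -intercept / slope
        print(f"r^2 = {r**2:.2f}, 0% improvement at {zero_crossing_kg:.3f} kg")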

    Multispectral scanner data processing over Sam Houston National Forest

    The Edit 9 forest scene, a computer processing technique, and its capability to map timber types in the Sam Houston National Forest are evaluated. Special efforts were made to evaluate existing computer processing techniques for mapping timber types using ERTS-1 and aircraft data, and to open up new research and development areas in forestry data.

    Tri-county pilot study

    The author has identified the following significant results. An area inventory was performed for three southeast Texas counties (Montgomery, Walker, and San Jacinto) totaling 0.65 million hectares. The inventory was performed using a two-level hierarchy. Level 1 was divided into forestland, rangeland, and other land. Forestland was separated into Level 2 categories: pine, hardwood, and mixed; rangeland was not separated further. Results consisted of area statistics for each county and for the entire study site for pine, hardwood, mixed, rangeland, and other land. Color-coded county classification maps were produced for the May data set, and procedures were developed and tested.

    An XMM-Newton observation of the Narrow Line Seyfert 1 Galaxy, Markarian 896

    XMM-Newton observations of the NLS1 Markarian 896 are presented. Over the 2-10 keV band, an iron emission line close to 6.4 keV is seen. The line is just resolved and has an equivalent width of ~170 eV. The broad-band spectrum is well modelled by a power-law slope of gamma ~ 2.03, together with two blackbody components to fit the soft X-ray excess. Using a more physical two-temperature Comptonisation model, a good fit is obtained for an input photon distribution of kT ~ 60 eV and Comptonising electron temperatures of ~0.3 and ~200 keV. The soft excess cannot be explained purely through the reprocessing of a hard X-ray continuum by an ionised disc reflector. Comment: 6 pages, 4 figures, accepted by MNRAS.
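
    For reference, the ~170 eV figure is an equivalent width in the usual sense (the standard definition, not specific to this paper):

        \[
          \mathrm{EW} \;=\; \int \frac{F_E - F_c(E)}{F_c(E)}\, dE ,
        \]

    where F_E is the observed flux density and F_c(E) is the continuum model beneath the line, so the line carries as much flux as a ~170 eV wide slice of the local continuum.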