An integrated geoscience modelling approach to ground water and coastal flood risk
Understanding flood risk in Morayshire has required a holistic work programme that brought together a multidisciplinary scientific team, which used a variety of modelling techniques to investigate groundwater and coastal flood risk within the Lower Findhorn Catchment in Morayshire, north-east Scotland. This work, largely funded by Moray Council, delimits potential zones of groundwater flooding and has informed the design and construction of flood defences in parts of the catchment where both commercial and residential properties were at risk. It also assessed the potential for coastal flooding of the low-lying coastal zone.
Determining the nature of the complex shallow Quaternary strata and the degree of continuity, heterogeneity, and relative permeability of packages of sediment in 3D was critical to establishing the areas at risk of groundwater flooding. A ‘source-to-sink’ approach was adopted, covering not only the floodplain of the River Findhorn but also the interfluves and the surrounding coastal zone. This was achieved by constructing a GSI3D model of the shallow geology, based on extensive and detailed field investigation of the catchment geology and informed by conceptual models of the glacial and postglacial evolution of North East Scotland. The model was calibrated by borehole drilling and trial pitting, and subsequently attributed with permeability values based on pumping-test results and geotechnical analyses. The resulting 3D distribution of shallow subsurface permeability provided a major set of parameters for ZOOM groundwater modelling. This was used, together with outputs from third-party hydrological models, to model groundwater flow directions and changes in the water table relative to base level (sea level), and to establish the potential groundwater component of river flooding in the catchment.
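In outline (a minimal sketch, not the project's actual code; the lithology codes, conductivity values, and grid shape below are assumed purely for illustration), re-attributing the 3D model's units with permeability before handing them to a groundwater flow model might look like:

```python
import numpy as np

# Hypothetical lithology codes on a 3D geological grid (e.g. exported from
# a GSI3D-style model), with illustrative hydraulic conductivities [m/day]
# of the kind one might derive from pumping tests.
LITHOLOGY = {0: "clay", 1: "silt", 2: "sand", 3: "gravel"}
K_BY_LITHOLOGY = {"clay": 1e-4, "silt": 1e-2, "sand": 5.0, "gravel": 50.0}

rng = np.random.default_rng(0)
litho_grid = rng.integers(0, 4, size=(20, 50, 50))  # (layers, rows, cols)

# Re-attribute each cell with a permeability value for the flow model.
k_grid = np.vectorize(lambda c: K_BY_LITHOLOGY[LITHOLOGY[c]])(litho_grid)

# k_grid would then parameterize the cell-by-cell hydraulic conductivity
# field of a groundwater model (ZOOM in the study; the coupling shown
# here is schematic).
print(k_grid.shape, k_grid.min(), k_grid.max())
```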
Regional modelling of changes in relative sea level is also an important component in determining probable future trends in flood risk in the Lower Findhorn. Unusually for the UK, conceptual models and proxies indicate that little sea-level rise has occurred during the last 100 years and that the coastline is prograding. Consequently, modelling indicates that the risk of inundation of this low-lying portion of Morayshire by coastal flooding is slight compared with that from river flooding events.
Combining Harmonic Generation and Laser Chirping to Achieve High Spectral Density in Compton Sources
Recently, various laser-chirping schemes have been investigated with the goal of reducing or eliminating ponderomotive line broadening in Compton or Thomson scattering at high laser intensities. As a next level of detail in the spectrum calculations, we have calculated the line smoothing and broadening expected from incident beam energy spread within a one-dimensional plane-wave model for the incident laser pulse, for both compensated (chirped) and unchirped cases. The scattered compensated distributions are treatable analytically within three models for the envelope of the incident laser pulse: Gaussian, Lorentzian, or hyperbolic secant. We use the new results to demonstrate that laser chirping in Compton sources at high laser intensities: (i) enables the use of higher-order harmonics, thereby reducing the required electron beam energies; and (ii) increases the photon yield in a small frequency band beyond what is possible with the fundamental without chirping. This combination of chirping and higher harmonics can lead to substantial savings in the design, construction, and operational costs of new Compton sources. This is of particular importance to the widely popular laser-plasma-accelerator-based Compton sources, as the improvement in their beam quality enters the regime where chirping is most effective.
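For orientation, the compensation idea can be summarized with standard plane-wave Compton kinematics (the notation is assumed, not reproduced from the paper: a(t) is the normalized laser vector potential, omega_L the instantaneous laser frequency, gamma the electron Lorentz factor):

```latex
% Sketch of the standard compensation argument (assumed notation, not
% reproduced from the paper). On axis, the n-th harmonic scatters to
\[
  \omega'_n(t) \;\approx\; \frac{4\gamma^2\, n\, \omega_L(t)}{1 + a^2(t)/2}.
\]
% Because a(t) varies over the pulse envelope, \omega'_n drifts within the
% pulse (ponderomotive broadening). Chirping the laser so that
\[
  \omega_L(t) \;\propto\; 1 + a^2(t)/2
\]
% holds \omega'_n constant for every harmonic n simultaneously, which is
% what makes narrow-band operation on harmonics n > 1 possible at high
% intensity.
```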
Term testing: a case study
Purpose and background: The litigation world has many examples of cases in which the volume of Electronically Stored Information (ESI) demands that litigators use automated means to assist with document identification, classification, and filtering. This case study describes one such process for one case; it is not a comprehensive analysis of the entire case, only of the Term Testing portion.
Term Testing is an analytical practice of refining match terms by running in-depth analysis on a sample of documents. The goal of term testing is to reduce, as far as possible, the number of false negatives (relevant or privileged documents with no match, also known as “misdetections”) and false positives (documents matched but not actually relevant or privileged).
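As an illustration only (not from the case study; the document texts, labels, and terms below are hypothetical), a term's false negatives and false positives against a hand-reviewed sample might be tallied like this:

```python
import re

# Hypothetical hand-reviewed sample: (document text, reviewer's relevance label).
sample = [
    ("Severance policy v2 attached for review", True),
    ("Lunch order for Friday", False),
    ("Old severance policy superseded in 1998", True),
    ("Policy on parking permits", False),
]

def test_term(pattern, sample):
    """Count false negatives (relevant docs the term misses) and
    false positives (irrelevant docs the term matches)."""
    fn = sum(1 for text, relevant in sample
             if relevant and not re.search(pattern, text, re.I))
    fp = sum(1 for text, relevant in sample
             if not relevant and re.search(pattern, text, re.I))
    return fn, fp

# A broad term over-matches; refining it trades false positives for negatives.
print(test_term(r"policy", sample))            # (0, 1): the parking doc matches
print(test_term(r"severance policy", sample))  # (0, 0) on this tiny sample
```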
The case was an employment discrimination suit against a government agency. The collection effort turned up common sources of ESI: hard drives, network shares, CDs and DVDs, and routine e-mail storage and backups. Initial collection, interviews, and reviews had revealed that a few key documents, such as old versions of policies, had not been retained or collected.
Then an unexpected source of information was unearthed: one network administrator had been running an unauthorized “just-in-case” tracer on the e-mail system, outside the agency’s document retention policies, which had created dozens of tapes holding millions of encrypted, compressed e-mails covering more years than the agency’s routine e-mail backups. The agency decided to process and review these tracer e-mails for the missing key documents, even though the overall volume of relevant documents would rise dramatically.
The agency had clear motivation to reduce the volume of documents flowing into relevancy and privilege reviews, but had concerns about the defensibility of using an automated process to determine which documents would never be reviewed. The case litigators and Subject Matter Experts (SMEs) decided to use a process of Term Testing to ensure that automated filtering was both defensible and as accurate as possible.
Multispectral scanner data processing over Sam Houston National Forest
The Edit 9 forest scene, a computer processing technique, and its capability to map timber types in the Sam Houston National Forest are evaluated. Special efforts were made to evaluate existing computer processing techniques for mapping timber types using ERTS-1 and aircraft data, and to open up new research and development areas in forestry data.
Barefoot running improves economy at high intensities and peak treadmill velocity
Aim: Barefoot running can improve running economy (RE) compared with shod running at low exercise intensities, but data are lacking for the higher intensities typical of many distance-running competitions. The influence of barefoot running on the velocity at maximal oxygen uptake (vVO2max) and on peak incremental treadmill test velocity (vmax) is unknown. The present study tested the hypotheses that barefoot running would improve RE, vVO2max, and vmax relative to shod running.
Methods: Using a balanced within-subject repeated-measures design, eight male runners (age 23.1±4.5 years, height 1.80±0.06 m, mass 73.8±11.5 kg, VO2max 4.08±0.39 L·min⁻¹) completed a familiarization session followed by one barefoot and one shod treadmill running trial, 2-14 days apart. Trial sessions consisted of a 5-minute warm-up and 5 minutes of rest, followed by 4 × 4-minute stages at speeds corresponding to approximately 67, 75, 84, and 91% of shod VO2max, separated by 1-minute rests. After the fourth stage, treadmill speed was incremented by 0.1 km·h⁻¹ every 15 s until participants reached volitional exhaustion.
Results: RE improved by 4.4±7.0% across intensities in the barefoot condition (P=0.040). The improvement in RE was related to the removed shoe mass (r²=0.80, P=0.003), with the regression intercept giving 0% RE improvement at a total shoe mass of 0.520 kg. Both vVO2max (by 4.5±5.0%, P=0.048) and vmax (by 3.9±4.0%, P=0.030) also improved, but VO2max was unchanged (P=0.747).
Conclusion: Barefoot running improves RE at high exercise intensities and increases vVO2max and vmax, but further research is required to clarify the influence of very light shoe weights on RE.
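As an aside, the reported zero-improvement shoe mass is simply the x-intercept of the fitted regression line; with per-runner data it could be recovered as below (the numbers are invented stand-ins, not the study's data):

```python
import numpy as np

# Hypothetical (shoe mass removed [kg], RE improvement [%]) pairs,
# standing in for the study's eight runners.
mass = np.array([0.55, 0.62, 0.70, 0.48, 0.66, 0.58, 0.73, 0.60])
re_gain = np.array([2.5, 3.5, 6.1, -1.0, 3.2, 0.5, 7.2, 4.0])

slope, intercept = np.polyfit(mass, re_gain, 1)   # least-squares line
r2 = np.corrcoef(mass, re_gain)[0, 1] ** 2

# Shoe mass at which the fitted improvement line crosses 0%:
zero_mass = -intercept / slope
print(f"r^2 = {r2:.2f}, zero-improvement mass = {zero_mass:.3f} kg")
```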
Borel Degenerations of Arithmetically Cohen-Macaulay curves in P^3
We investigate Borel ideals on the Hilbert scheme components of arithmetically Cohen-Macaulay (ACM) codimension-two schemes in P^n. We give a basic necessary criterion for a Borel ideal to lie on such a component. Then, considering ACM curves in P^3 on a quadric, we compute in several examples all the Borel ideals on their Hilbert scheme component. Based on this we conjecture which Borel ideals lie on such a component, and for a range of Borel ideals we prove that they do.
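For background (standard material, not a result of the paper): in characteristic zero, a monomial ideal is Borel(-fixed) exactly when it is closed under moving variables "up", as sketched below.

```latex
% Background definition, not a result of the paper: in characteristic 0,
% a monomial ideal I \subseteq k[x_0,\dots,x_n] is Borel(-fixed) iff
\[
  m \in I,\quad x_j \mid m,\quad i < j
  \;\Longrightarrow\;
  \frac{x_i\, m}{x_j} \in I .
\]
% Equivalently, I is fixed by the action of upper-triangular coordinate
% changes, which is why generic initial ideals are Borel-fixed.
```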
Tri-county pilot study
The author has identified the following significant results. An area inventory was performed for three southeast Texas counties (Montgomery, Walker, and San Jacinto) totaling 0.65 million hectares. The inventory used a two-level hierarchy: Level 1 was divided into forestland, rangeland, and other land. Forestland was separated into the Level 2 categories pine, hardwood, and mixed; rangeland was not subdivided further. Results consisted of area statistics for each county and for the entire study site for pine, hardwood, mixed, rangeland, and other land. Color-coded county classification maps were produced for the May data set, and procedures were developed and tested.
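Purely as an illustration of the two-level tallying (the class codes, pixel counts, and pixel footprint below are invented, not from the study):

```python
# Hypothetical mapping from classified pixel labels to the two-level scheme.
LEVEL2_TO_LEVEL1 = {
    "pine": "forestland", "hardwood": "forestland", "mixed": "forestland",
    "rangeland": "rangeland", "other": "other land",
}

# Invented per-class pixel counts for one county; ha_per_pixel is an
# assumed nominal pixel footprint in hectares.
pixel_counts = {"pine": 41000, "hardwood": 22000, "mixed": 15000,
                "rangeland": 9000, "other": 13000}
ha_per_pixel = 0.45

# Roll Level 2 class areas up into Level 1 area statistics.
level1_area = {}
for cls, n in pixel_counts.items():
    lvl1 = LEVEL2_TO_LEVEL1[cls]
    level1_area[lvl1] = level1_area.get(lvl1, 0.0) + n * ha_per_pixel

print(level1_area)   # per-county Level 1 area statistics in hectares
```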
