Potentially Active Faults in Dam Foundations
The paper contains information on existing dams founded on active faults, a summary of pertinent lessons learned from study of historic fault breaks and fault mechanisms, recommended practice for evaluation of active faults, and opinions concerning design of dams on active faults. While a dam site with an active fault should be avoided if possible, if a reservoir is vitally needed and a better site is not available, it is reasonable practice to construct a conservatively designed embankment dam. Concrete dams on active faults, or near some major active faults, are not advisable. For evaluation of fault activity, geological studies usually must be carried a considerable distance from the dam site, a departure from recent past practice. Experience of the last few years with many fault studies indicates that thorough geological investigations with modern techniques will usually provide sufficient evidence to allow a judgement on the activity or inactivity of a fault.
- shell gap reduction in neutron-rich systems and cross-shell excitations in O
Excited states in O were populated in the reaction Be(C,) at Florida State University. Charged particles were detected with a particle telescope consisting of four annularly segmented Si surface barrier detectors, and radiation was detected with the FSU detector array. Five new states were observed below 6 MeV from the - and -- coincidence data. Shell model calculations suggest that most of the newly observed states are core-excited 1p-1h excitations across the shell gap. Comparisons between experimental data and calculations for the neutron-rich O and F isotopes imply a steady reduction of the - shell gap as neutrons are added.
Probabilities of Large Earthquakes in the San Francisco Bay Region, California
In 1987 a Working Group on California Earthquake Probabilities was organized by the U.S. Geological
Survey at the recommendation of the National Earthquake Prediction Evaluation Council (NEPEC). The
membership included representatives from private industry, academia, and the U.S. Geological Survey. The
Working Group computed long-term probabilities of earthquakes along the major faults of the San Andreas
fault system on the basis of consensus interpretations of information then available. Faults considered by the
Working Group included the San Andreas fault proper, the San Jacinto and Imperial faults of southern
California, and the Hayward fault of northern California. The Working Group issued a final report of its
findings in 1988 (Working Group, 1988) that was reviewed and endorsed by NEPEC.
As a consequence of the magnitude 7.1 Loma Prieta, California, earthquake of October 17, 1989, a
second Working Group on California Earthquake Probabilities was organized under the auspices of NEPEC.
Its charge was to review and, as necessary, revise the findings of the 1988 report on the probability of large
earthquakes in the San Francisco Bay region. In particular, the Working Group was requested to examine the
probabilities of large earthquakes in the context of new interpretations or physical changes resulting from the
Loma Prieta earthquake. In addition, it was to consider new information pertaining to the San Andreas and other
faults in the region obtained subsequent to the release of the 1988 report. Insofar as modified techniques and
improved data have been used in this study, the same approach might also, of course, modify the probabilities
for southern California. This reevaluation has, however, been specifically limited to the San Francisco Bay
region.
This report is intended to summarize the collective knowledge and judgments of a diverse group of
earthquake scientists to assist in formulation of rational earthquake policies. A considerable body of information
about active faults in the San Francisco Bay region leads to the conclusion that major earthquakes are likely
within the next tens of years. Several techniques can be used to compute probabilities of future earthquakes,
although there are uncertainties about the validity of specific assumptions or models that must be made when
applying these techniques. The body of this report describes the data and detailed assumptions that lead to
specific probabilities for different fault segments. Additional data and future advances in our understanding of
earthquake physics may alter the way that these probabilities are estimated. Even though this uncertainty must
be acknowledged, we emphasize that the findings of this report are supported by other lines of argument and
are consistent with our best understanding of the likelihood for the occurrence of earthquakes in the San
Francisco Bay region.
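The long-term probabilities described above are conditional probabilities derived from a recurrence-interval model. A minimal sketch of the calculation, assuming a lognormal recurrence distribution (the general approach of such studies) and purely hypothetical segment parameters — the Working Group's actual segment data are not reproduced here:

```python
# Conditional probability of a segment-rupturing earthquake, assuming a
# lognormal distribution of recurrence intervals. Parameters are hypothetical.
from math import log, sqrt, erf

def lognorm_cdf(t, median, sigma):
    """CDF of a lognormal recurrence-interval distribution."""
    if t <= 0:
        return 0.0
    return 0.5 * (1.0 + erf(log(t / median) / (sigma * sqrt(2.0))))

def conditional_probability(elapsed, window, median, sigma):
    """P(event within `window` years | no event in the past `elapsed` years)."""
    f_elapsed = lognorm_cdf(elapsed, median, sigma)
    f_future = lognorm_cdf(elapsed + window, median, sigma)
    return (f_future - f_elapsed) / (1.0 - f_elapsed)

# Hypothetical segment: median recurrence 160 yr, lognormal sigma 0.4,
# 83 years elapsed since the last rupture, 30-year forecast window.
p = conditional_probability(83.0, 30.0, 160.0, 0.4)
print(f"30-year conditional probability: {p:.2f}")
```

The key feature of this formulation is that the probability depends on the time elapsed since the last rupture: as a segment's open interval grows toward and past the median recurrence time, the conditional probability rises, which is why the probabilities must be recomputed after an event such as Loma Prieta.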
Methods for Minimizing the Confounding Effects of Word Length in the Analysis of Phonotactic Probability and Neighborhood Density
This is the author's accepted manuscript; the original is available at http://jslhr.pubs.asha.org/article.aspx?articleid=1781521&resultClick=3
Recent research suggests that phonotactic probability (the likelihood of occurrence of a sound sequence) and neighborhood density (the number of words phonologically similar to a given word) influence spoken language processing and acquisition across the lifespan in both normal and clinical populations. The majority of research in this area has tended to focus on controlled laboratory studies rather than naturalistic data such as spontaneous speech samples or elicited probes. One difficulty in applying current measures of phonotactic probability and neighborhood density to more naturalistic samples is the significant correlation between these variables and word length. This study examines several alternative transformations of phonotactic probability and neighborhood density as a means of reducing or eliminating this correlation with word length. Computational analyses of the words in a large database and reanalysis of archival data supported the use of z scores for the analysis of phonotactic probability as a continuous variable and the use of median transformation scores for the analysis of phonotactic probability as a dichotomous variable. Neighborhood density results were less clear; analysis of neighborhood density as a continuous variable warrants further investigation to differentiate the utility of z scores from that of median transformation scores. Furthermore, balanced dichotomous coding of neighborhood density was difficult to achieve, suggesting that analysis of neighborhood density as a dichotomous variable should be approached with caution. Recommendations for future application and analyses are discussed.
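The z-score idea discussed above can be sketched as standardizing raw phonotactic probabilities within word-length groups, so the transformed score no longer tracks length. The function name, toy lexicon, and probability values below are illustrative assumptions, not the study's data or code:

```python
# Sketch of a length-conditioned z-score transformation: standardize raw
# phonotactic probabilities within each word-length group so that longer
# words are not penalized simply for multiplying more segment probabilities.
from statistics import mean, stdev
from collections import defaultdict

def length_conditioned_z(words):
    """Map (word, phoneme count, raw probability) triples to z scores
    computed within each word-length group."""
    by_length = defaultdict(list)
    for word, n_phonemes, prob in words:
        by_length[n_phonemes].append(prob)
    z_scores = {}
    for word, n_phonemes, prob in words:
        group = by_length[n_phonemes]
        mu = mean(group)
        sd = stdev(group) if len(group) > 1 else 1.0
        z_scores[word] = (prob - mu) / sd if sd > 0 else 0.0
    return z_scores

# Toy lexicon: (word, phoneme count, raw phonotactic probability).
# Note how raw probabilities drop sharply with length, but z scores do not.
lexicon = [("cat", 3, 0.012), ("dog", 3, 0.006),
           ("rabbit", 5, 0.0004), ("carrot", 5, 0.0009)]
z = length_conditioned_z(lexicon)
```

In this toy example the five-phoneme words have raw probabilities an order of magnitude below the three-phoneme words, yet their z scores occupy the same range, which is the property that makes the transformed variable usable in naturalistic samples of mixed word lengths.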
Mars 2020 Perseverance Rover Mast Camera Zoom (Mastcam-Z) Multispectral, Stereoscopic Imaging Investigation
Mastcam-Z is a multispectral, stereoscopic imaging investigation on the Mars 2020 mission’s Perseverance rover. Mastcam-Z consists of a pair of focusable, 4:1 zoomable cameras that provide broadband red/green/blue and narrowband 400–1000 nm color imaging with fields of view from 25.6° × 19.2° (26 mm focal length at 283 μrad/pixel) to 6.2° × 4.6° (110 mm focal length at 67.4 μrad/pixel). The cameras can resolve (≥ 5 pixels) ∼0.7 mm features at 2 m and ∼3.3 cm features at 100 m distance. Mastcam-Z shares significant heritage with the Mastcam instruments on the Mars Science Laboratory Curiosity rover. Each Mastcam-Z camera consists of zoom, focus, and filter wheel mechanisms and a 1648 × 1214 pixel charge-coupled device detector and electronics. The two Mastcam-Z cameras are mounted with a 24.4 cm stereo baseline and 2.3° total toe-in on a camera plate ∼2 m above the surface on the rover’s Remote Sensing Mast, which provides azimuth and elevation actuation. A separate digital electronics assembly inside the rover provides power, data processing and storage, and the interface to the rover computer. Primary and secondary Mastcam-Z calibration targets mounted on the rover top deck enable tactical reflectance calibration. Mastcam-Z multispectral, stereo, and panoramic images will be used to provide detailed morphology, topography, and geologic context along the rover’s traverse; constrain mineralogic, photometric, and physical properties of surface materials; monitor and characterize atmospheric and astronomical phenomena; and document the rover’s sample extraction and caching locations. Mastcam-Z images will also provide key engineering information to support sample selection and other rover driving and tool/instrument operations decisions.
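The quoted resolving-power figures follow directly from the per-pixel angular resolution: a feature spanning the stated ≥ 5 pixels at the 110 mm focal length (67.4 μrad/pixel) subtends 5 × IFOV × range. A quick check of the arithmetic:

```python
# Smallest resolvable feature ≈ (pixels required) × (IFOV, rad/px) × (range, m),
# using the 110 mm / 67.4 μrad-per-pixel end of the Mastcam-Z zoom.
def min_feature_m(ifov_urad_per_px, range_m, pixels=5):
    return pixels * ifov_urad_per_px * 1e-6 * range_m

at_2m = min_feature_m(67.4, 2.0)      # ≈ 0.0007 m, i.e. ~0.7 mm
at_100m = min_feature_m(67.4, 100.0)  # ≈ 0.034 m, i.e. ~3.4 cm
print(f"{at_2m*1000:.2f} mm at 2 m, {at_100m*100:.1f} cm at 100 m")
```

Both values agree with the ∼0.7 mm and ∼3.3 cm figures quoted in the abstract to within rounding.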
Cost effectiveness of epidural steroid injections to manage chronic lower back pain
Background
The efficacy of epidural steroid injections in the management of chronic low back pain is disputed, yet the technique remains popular amongst physicians and patients alike. This study assesses the cost effectiveness of injections administered in a routine outpatient setting in England.
Methods
Patients attending the Nottingham University Hospitals’ Pain Clinic received two injections of methylprednisolone plus levobupivacaine at different dosages, separated by at least 12 weeks. Prior to each injection, and every week thereafter for 12 weeks, participants completed the EQ-5D health-related quality of life instrument. For each patient for each injection, total health state utility gain relative to baseline was calculated. The cost of the procedure was modelled from observed clinical practice. Cost effectiveness was calculated as procedure cost relative to utility gain.
Results
Thirty-nine patients provided records. Over a 13-week period commencing with injection, mean quality-adjusted life year (QALY) gains per patient for the two dosages were 0.028 (SD 0.063) and 0.021 (SD 0.057). The difference in QALYs gained by dosage was not statistically significant (paired t-test, CI −0.019 to 0.033). Based on modelled resource use and data from other studies, the mean cost of an injection was estimated at £219 (SD 83). The cost-utility ratio of the two injections amounted to £8,975 per QALY gained (CI £5,480 to £22,915). However, at costs equivalent to the tariff price typically paid to providers by health care purchasers, the ratio increased to £27,459 (CI £16,779 to £70,091).
Conclusions
When provided in an outpatient setting, epidural steroid injections are a short-term but nevertheless cost-effective means of managing chronic low back pain. However, designation of the procedure as a day case requires the National Health Service to reimburse providers at a price which pushes the procedure to the margin of cost effectiveness.
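The headline cost-utility figure can be reconstructed from the numbers reported above (mean cost per injection £219; mean QALY gains of 0.028 and 0.021 per injection over 13 weeks). A sketch of the arithmetic, which lands close to the reported £8,975 per QALY, with the small difference reflecting rounding of the published means:

```python
# Cost-utility ratio = total cost of the two injections / total QALYs gained,
# using the means reported in the abstract.
mean_cost_per_injection = 219.0   # GBP, modelled outpatient cost
qaly_gains = [0.028, 0.021]       # mean gain per patient, per injection

total_cost = mean_cost_per_injection * len(qaly_gains)   # £438
total_qaly = sum(qaly_gains)                             # 0.049 QALYs
cost_per_qaly = total_cost / total_qaly
print(f"~£{cost_per_qaly:,.0f} per QALY gained")
```

Substituting the higher tariff price for the modelled outpatient cost in the same calculation is what drives the ratio up toward the £27,459 figure.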
Gold-Catalyzed C–H Bond Functionalization of Metallocenes: Synthesis of Densely Functionalized Ferrocene Derivatives
Balancing with Vibration: A Prelude for “Drift and Act” Balance Control
Stick balancing at the fingertip is a powerful paradigm for the study of the control of human balance. Here we show that the mean stick balancing time is increased by about two-fold when a subject stands on a vibrating platform that produces vertical vibrations at the fingertip (0.001 m, 15–50 Hz). High-speed motion capture measurements in three dimensions demonstrate that vibration does not shorten the neural latency for stick balancing or change the distribution of the changes in speed made by the fingertip during stick balancing, but does decrease the amplitude of the fluctuations in the relative positions of the fingertip and the tip of the stick in the horizontal plane, A(x,y). The findings are interpreted in terms of a time-delayed “drift and act” control mechanism in which controlling movements are made only when controlled variables exceed a threshold, i.e. the stick survival time measures the time to cross a threshold. The amplitude of the oscillations produced by this mechanism can be decreased by parametric excitation. It is shown that a plot of the logarithm of the vibration-induced increase in stick balancing skill, a measure of the mean first passage time, versus the standard deviation of the A(x,y) fluctuations, a measure of the distance to the threshold, is linear, as expected for the times to cross a threshold in a stochastic dynamical system. These observations suggest that the balanced state represents a complex time-dependent state situated in a basin of attraction of comparable size. The fact that vibration can benefit balance control raises the possibility of minimizing the risk of falling through appropriate changes in the design of footwear and the roughness of walking surfaces.
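The "drift and act" idea — let a controlled variable fluctuate freely and intervene only when it exceeds a threshold, with failure when it crosses an outer boundary — can be illustrated with a minimal stochastic simulation. This sketch omits the neural time delay for simplicity, and all parameters are assumptions chosen for illustration, not values fitted to the balancing data:

```python
# Minimal "drift and act" sketch: a state drifts stochastically; a corrective
# movement is made only when it exceeds a threshold; balance is "lost" when
# the state crosses an outer boundary. First-passage time measures survival.
import random

def survival_time(correction, threshold=1.0, boundary=1.5,
                  noise=0.15, max_steps=20_000, seed=0):
    """Number of steps until the state crosses the failure boundary."""
    rng = random.Random(seed)
    x = 0.0
    for step in range(max_steps):
        x += rng.gauss(0.0, noise)     # drift: uncontrolled fluctuation
        if abs(x) > boundary:          # balance lost
            return step
        if abs(x) > threshold:         # act: corrective movement toward 0
            x -= correction * x
    return max_steps                   # survived the whole run

# Mean survival over several runs, with and without corrective action.
seeds = range(20)
with_control = sum(survival_time(0.5, seed=s) for s in seeds) / 20
no_control = sum(survival_time(0.0, seed=s) for s in seeds) / 20
```

Even this crude model reproduces the qualitative point of the abstract: intermittent threshold-triggered corrections dramatically lengthen the mean first-passage time compared with an uncontrolled random walk, and anything that widens the effective distance to the threshold (as the fingertip vibration does for A(x,y)) should extend survival further.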