
    Application of DInSAR-GPS optimization for derivation of fine-scale surface motion maps of Southern California

    A method based on random field theory and the Gibbs-Markov random field equivalency within a Bayesian statistical framework is used to derive 3-D surface motion maps from sparse global positioning system (GPS) measurements and a differential interferometric synthetic aperture radar (DInSAR) interferogram in the southern California region. The minimization of the Gibbs energy function is performed analytically, which is possible when neighboring pixels are considered independent. The problem is well posed, and the solution is unique, stable, and not biased by the continuity condition. The technique produces a 3-D field containing estimates of surface motion on the spatial scale of the DInSAR image, over a given time period, complete with error estimates. Significant improvement in the accuracy of the vertical component and moderate improvement in the accuracy of the horizontal components of velocity are achieved in comparison with the GPS data alone. The method can be expanded to account for other available data sets, such as additional interferograms, lidar, or leveling data, in order to achieve even higher accuracy.
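When neighboring pixels are treated as independent, the Gibbs energy for each pixel is a quadratic in the 3-D motion vector and has a closed-form minimum. A minimal sketch of such a per-pixel fusion is below; the symbols and weighting scheme are assumptions following standard Bayesian least squares, not necessarily the paper's exact formulation:

```python
import numpy as np

def fuse_pixel(v_gps, cov_gps, los_unit, d_insar, sigma_insar):
    """Analytic per-pixel minimum of a quadratic Gibbs energy combining a
    3-D GPS-interpolated motion prior with one DInSAR line-of-sight (LOS)
    range-change observation. Hypothetical simplification for illustration.
    """
    s = np.asarray(los_unit, dtype=float)        # LOS unit vector, shape (3,)
    W_gps = np.linalg.inv(cov_gps)               # prior precision (3x3)
    W_insar = np.outer(s, s) / sigma_insar**2    # rank-1 InSAR precision
    precision = W_gps + W_insar
    rhs = W_gps @ v_gps + s * d_insar / sigma_insar**2
    v_post = np.linalg.solve(precision, rhs)     # fused 3-D motion estimate
    cov_post = np.linalg.inv(precision)          # per-pixel error estimate
    return v_post, cov_post
```

Because the InSAR term is rank-1, it mainly tightens the component of motion along the line of sight (largely vertical for steep look angles), which is consistent with the reported improvement pattern.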

    Gravity changes from a stress evolution earthquake simulation of California

    The gravity signal contains information regarding changes in density at all depths and can be used as a proxy for the strain accumulation in fault networks. A stress evolution time-dependent model was used to create simulated slip histories over the San Andreas Fault network in California. Using a linear sum of the gravity signals from each fault segment in the model, via coseismic gravity Green's functions, a time-dependent gravity model was created. The steady state gravity from the long-term plate motion generates a signal over 5 years with magnitudes of ±~2 μGal, the current limit of portable instrument observations. Moderate to large events generate signal magnitudes in the range of ~10 to ~80 μGal, well within the range of ground-based observations. The complex fault network geometry of California significantly affects the spatial extent of the gravity signal from the three events studied.
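The linear superposition step described above reduces to a matrix product once the per-segment Green's functions are tabulated. A minimal sketch, assuming precomputed Green's-function values (which would come from an elastic dislocation model not reproduced here):

```python
import numpy as np

def gravity_time_series(greens, slip_history):
    """Time-dependent surface gravity change as a linear sum of coseismic
    gravity Green's functions, one per fault segment.

    greens:       (n_segments, n_sites) gravity change per unit slip (e.g. microGal/m)
    slip_history: (n_times, n_segments) cumulative slip on each segment (m)
    Returns:      (n_times, n_sites) gravity change at each observation site.
    """
    return np.asarray(slip_history) @ np.asarray(greens)
```

Linearity is what makes this cheap: each simulated earthquake only updates the slip vector, and the gravity field at all sites follows from one multiply.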

    Why students do not engage in contract cheating

    Contract cheating refers to students paying a third party to complete university assessments for them. Although opportunities for commercial contract cheating are widely available in the form of essay mills, only about 3% of students engage in this behaviour. This study examined the reasons why most students do not engage in contract cheating. Students (n = 1204) completed a survey on why they do not engage in contract cheating, as well as measures of several individual differences, including self-control, grit and the Dark Triad traits. Morality and motivation for learning received the greatest endorsement for why students do not engage in contract cheating. Controlling for gender, individual differences predicted students' reasons for not contract cheating. This study supports the use of criminological theories relating to rational choice, self-control and opportunity to explain why students do not engage in contract cheating. Practically, this study may inform academic policies and assessment design aimed at reducing contract cheating.

    Pattern Informatics and its Application for Optimal Forecasting of Large Earthquakes in Japan

    The Pattern Informatics (PI) technique can be used to detect precursory seismic activation or quiescence and to make earthquake forecasts. Here we apply the PI method for optimal forecasting of large earthquakes in Japan, using the data catalogue maintained by the Japan Meteorological Agency. The PI method is tested to forecast large (magnitude m ≥ 5) earthquakes spanning the time period 1995-2004 in the Kobe region. Visual inspection and statistical testing show that the optimized PI method has forecasting skill, relative to the seismic intensity data often used as a standard null hypothesis. Moreover, we find in a retrospective forecast that the 1995 Kobe earthquake (m = 7.2) falls in a seismically anomalous area. Another approach to testing the forecasting algorithm is to create a future potential map for large (m ≥ 5) earthquake events. This is illustrated using the Kobe and Tokyo regions for the forecast period 2000-2009. Based on the resulting Kobe map, we point out several forecasted areas: the epicentral area of the 1995 Kobe earthquake, the Wakayama area, the Mie area, and the Aichi area. The Tokyo forecast map was created prior to the occurrence of the Oct. 23, 2004 Niigata earthquake (m = 6.8) and its principal aftershocks with m ≥ 5.0. We find that these events occurred in a forecasted area on the Tokyo map. The PI technique for regional seismicity observation thus provides an example showing considerable promise for intermediate-term earthquake forecasting in Japan.
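The core PI idea of flagging cells whose seismicity rate changes anomalously (activation or quiescence) can be illustrated in a few lines. This is a deliberately simplified sketch: the published algorithm also averages over many window start times and normalizes differently, none of which is reproduced here.

```python
import numpy as np

def pi_hotspots(rate_window_a, rate_window_b, threshold=0.5):
    """Flag grid cells whose seismicity rate change between two sliding
    time windows is anomalously large, in the spirit of the PI method.

    rate_window_a, rate_window_b: per-cell earthquake rates in two windows.
    Returns a boolean hotspot map (True = forecast cell).
    """
    a = np.asarray(rate_window_a, float)
    b = np.asarray(rate_window_b, float)
    # normalise each window's rate map to zero mean, unit variance
    za = (a - a.mean()) / a.std()
    zb = (b - b.mean()) / b.std()
    change = np.abs(zb - za)          # captures activation OR quiescence
    dP = change**2                    # squared change ~ probability change
    return dP > threshold * dP.max()  # keep only the strongest anomalies
```

A cell that quiets down relative to the regional trend is flagged just as readily as one that activates, which is the feature that distinguishes PI from a plain rate map.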

    Earthquake forecasting and its verification

    No proven method is currently available for the reliable short time prediction of earthquakes (minutes to months). However, it is possible to make probabilistic hazard assessments for earthquake risk. In this paper we discuss a new approach to earthquake forecasting based on a pattern informatics (PI) method which quantifies temporal variations in seismicity. The output, which is based on an association of small earthquakes with future large earthquakes, is a map of areas in a seismogenic region ('hotspots') where earthquakes are forecast to occur in a future 10-year time span. This approach has been successfully applied to California, to Japan, and on a worldwide basis. Because a sharp decision threshold is used, these forecasts are binary: an earthquake is forecast either to occur or not to occur. The standard approach to the evaluation of a binary forecast is the use of the relative (or receiver) operating characteristic (ROC) diagram, which is a more restrictive test and less subject to bias than maximum likelihood tests. To test our PI method, we made two types of retrospective forecasts for California. The first is the PI method and the second is a relative intensity (RI) forecast based on the hypothesis that future large earthquakes will occur where most smaller earthquakes have occurred in the recent past. While both retrospective forecasts are for the ten-year period 1 January 2000 to 31 December 2009, we performed an interim analysis 5 years into the forecast. The PI method outperforms the RI method under most circumstances.
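The ROC evaluation described above sweeps the decision threshold over a gridded forecast score and plots hit rate against false alarm rate; a forecast with skill lies above the diagonal. A minimal sketch, assuming the PI or RI scores and the observed event cells are supplied as flat arrays:

```python
import numpy as np

def roc_curve(score_map, events):
    """Relative operating characteristic for a gridded binary forecast.

    score_map: per-cell forecast score (e.g. a PI or RI value), 1-D array.
    events:    per-cell 0/1 flags of where large earthquakes occurred.
    Returns (hit_rate, false_alarm_rate) traced as the threshold drops.
    """
    order = np.argsort(score_map)[::-1]          # cells, highest score first
    ev = np.asarray(events, float)[order]
    hits = np.cumsum(ev) / ev.sum()              # hit rate H
    false_alarms = np.cumsum(1 - ev) / (len(ev) - ev.sum())  # false alarm rate F
    return hits, false_alarms

def skill_area(hits, false_alarms):
    """Trapezoid-rule area under the ROC curve; 0.5 is no skill, 1.0 perfect."""
    return float(np.sum(np.diff(false_alarms) * (hits[:-1] + hits[1:]) / 2))
```

Comparing the two forecasts then amounts to comparing their areas (or their curves pointwise) over the same event set.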

    Enhanced motivational interviewing for reducing weight and increasing physical activity in adults with high cardiovascular risk: the MOVE IT three-arm RCT.

    BACKGROUND: Motivational interviewing (MI) enhanced with behaviour change techniques (BCTs) and deployed by health trainers targeting multiple risk factors for cardiovascular disease (CVD) may be more effective than interventions targeting a single risk factor. OBJECTIVES: The clinical effectiveness and cost-effectiveness of an enhanced lifestyle motivational interviewing intervention for patients at high risk of CVD in group settings versus individual settings and usual care (UC) in reducing weight and increasing physical activity (PA) were tested. DESIGN: This was a three-arm, single-blind, parallel randomised controlled trial. SETTING: A total of 135 general practices across all 12 South London Clinical Commissioning Groups were recruited. PARTICIPANTS: A total of 1742 participants aged 40-74 years with a ≥ 20.0% risk of a CVD event in the following 10 years were randomised. INTERVENTIONS: The intervention was designed to integrate MI and cognitive-behavioural therapy (CBT), delivered by trained healthy lifestyle facilitators in 10 sessions over 1 year, in group or individual format. The control group received UC. RANDOMISATION: Simple randomisation was used with computer-generated randomisation blocks. In each block, 10 participants were randomised to the group, individual or UC arm in a 4 : 3 : 3 ratio. Researchers were blind to the allocation. MAIN OUTCOME MEASURES: The primary outcomes are change in weight (kg) from baseline and change in PA (average number of steps per day over 1 week) from baseline at the 24-month follow-up, with an interim follow-up at 12 months. An economic evaluation estimates the relative cost-effectiveness of each intervention. Secondary outcomes include changes in low-density lipoprotein cholesterol and CVD risk score. RESULTS: The mean age of participants was 69.75 years (standard deviation 4.11 years), 85.5% were male and 89.4% were white. 
At the 24-month follow-up, the group and individual intervention arms were not more effective than UC in increasing PA [mean -70.05 steps, 95% confidence interval (CI) -288 to 147.9 steps, and mean 7.24 steps, 95% CI -224.01 to 238.5 steps, respectively] or in reducing weight (mean -0.03 kg, 95% CI -0.49 to 0.44 kg, and mean -0.42 kg, 95% CI -0.93 to 0.09 kg, respectively). At the 12-month follow-up, the group and individual intervention arms were not more effective than UC in increasing PA (mean 131.1 steps, 95% CI -85.28 to 347.48 steps, and mean 210.22 steps, 95% CI -19.46 to 439.91 steps, respectively), but there were reductions in weight for the group and individual intervention arms compared with UC (mean -0.52 kg, 95% CI -0.90 to -0.13 kg, and mean -0.55 kg, 95% CI -0.95 to -0.14 kg, respectively). The group intervention arm was not more effective than the individual intervention arm in improving outcomes at either follow-up point. The group and individual interventions were not cost-effective. CONCLUSIONS: Enhanced MI, in group or individual formats, targeted at members of the general population with high CVD risk is not effective in reducing weight or increasing PA compared with UC. Future work should focus on ensuring objective evidence of high competency in BCTs, identifying those with modifiable factors for CVD risk and improving engagement of patients and primary care. TRIAL REGISTRATION: Current Controlled Trials ISRCTN84864870. FUNDING: This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 23, No. 69. See the NIHR Journals Library website for further project information. This research was part-funded by the NIHR Biomedical Research Centre at South London and Maudsley NHS Foundation Trust and King's College London.
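The randomisation scheme described above (blocks of 10 allocated to the group, individual and UC arms in a 4 : 3 : 3 ratio) can be sketched as follows. This is an illustration of the stated design, not the trial's actual allocation software:

```python
import random

def allocate_blocks(n_participants, seed=0):
    """Blocked random allocation: within each block of 10 participants,
    4 go to the group arm, 3 to the individual arm and 3 to usual care,
    in a random order. Seeded for reproducibility of the illustration.
    """
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_participants:
        block = ["group"] * 4 + ["individual"] * 3 + ["usual care"] * 3
        rng.shuffle(block)             # random order within the block
        allocation.extend(block)
    return allocation[:n_participants]
```

Blocking guarantees the 4 : 3 : 3 ratio holds exactly after every 10 enrolments, which simple randomisation alone would not.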

    Using earthquake intensities to forecast earthquake occurrence times

    It is well known that earthquakes do not occur randomly in space and time. Foreshocks, aftershocks, precursory activation, and quiescence are just some of the patterns recognized by seismologists. Using the Pattern Informatics technique along with relative intensity analysis, we create a scoring method based on time-dependent relative operating characteristic diagrams and show that the occurrences of large earthquakes in California correlate with time intervals where fluctuations in small earthquakes are suppressed relative to the long-term average. We estimate a probability of less than 1% that this coincidence is due to random clustering. Furthermore, we show that the methods used to obtain these results may be applicable to other parts of the world.
