819 research outputs found
Earthquake distribution patterns in Africa: their relationship to variations in lithospheric and geological structure, and their rheological implications
We use teleseismic waveform inversion, along with depth phase analysis, to constrain the centroid depths and source parameters of large African earthquakes. The majority of seismic activity is concentrated along the East African Rift System, with additional active regions along stretches of the continental margins in north and east Africa, and in the Congo Basin. We examine variations in the seismogenic thickness across Africa, based on a total of 227 well-determined earthquake depths, 112 of which are new to this study. Seismogenic thickness varies in correspondence with lithospheric thickness, as determined from surface wave tomography, with regions of thick lithosphere being associated with seismogenic thicknesses of up to 40 km. In regions of thin lithosphere, the seismogenic thickness is typically limited to ≤20 km. Larger seismogenic thicknesses also correlate with regions that have dominant tectonothermal ages of ≥1500 Ma, where the East African Rift passes around the Archean cratons of Africa, through the older Proterozoic mobile belts. These correlations are likely to be related to the production (affected by the method and age of basement formation) and preservation (affected by lithospheric thickness) of a strong, anhydrous lower crust. The Congo Basin contains the only compressional earthquakes in the continental interior. Simple modelling of the forces induced by convective support of the African plate, based on long-wavelength free-air gravity anomalies, indicates that epeirogenic effects are sufficient to account for the localization and occurrence of both extensional and compressional deformation in Africa. Seismicity along the margins of Africa reflects a mixture of oceanic and continental seismogenic characteristics, with earthquakes in places extending to 40 km depth.
Joining the dots: Conditional pass and programmatic assessment enhances recognition of problems with professionalism and factors hampering student progress
<p>Abstract</p> <p>Background</p> <p>Programmatic assessment that looks across a whole year may contribute to better decisions compared with those made from isolated assessments alone. The aim of this study is to describe and evaluate a programmatic system to handle student assessment results that is aligned not only with learning and remediation, but also with defensibility. The key components are standards-based assessments, use of "Conditional Pass", and regular progress meetings.</p> <p>Methods</p> <p>The new assessment system is described. The evaluation is based on years 4-6 of a 6-year medical course. The types of concerns staff had about students were clustered into themes alongside any interventions and outcomes for the students concerned. The likelihoods of passing the year according to type of problem were compared before and after the phasing in of the new assessment system.</p> <p>Results</p> <p>The new system was phased in over four years. In the fourth year of implementation 701 students had 3539 assessment results, of which 4.1% were Conditional Pass. More in-depth analysis of 1516 results available from 447 students revealed that the odds ratio (95% confidence interval) for failure was highest for students with problems identified in more than one part of the course (18.8 (7.7-46.2) p < 0.0001) or with problems with professionalism (17.2 (9.1-33.3) p < 0.0001). The odds ratio for failure was lowest for problems with assignments (0.7 (0.1-5.2) NS). Compared with the previous system, more students failed the year under the new system on the basis of performance during the year (20 or 4.5%, compared with four or 1.1% under the previous system (p < 0.01)).</p> <p>Conclusions</p> <p>The new system detects more students in difficulty and has resulted in less "failure to fail". The requirement to state conditions required to pass has contributed to a paper trail that should improve defensibility.
Most importantly, it has helped detect and act on some of the more difficult areas to assess, such as professionalism.</p>
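The odds ratios quoted in the Results can be reproduced from a 2×2 table of pass/fail counts. The sketch below shows the standard Wald calculation with a normal approximation on the log scale; the counts are hypothetical, chosen only to illustrate the arithmetic, and the abstract does not state the paper's exact estimation method.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = failed with problem,    b = passed with problem,
    c = failed without problem, d = passed without problem."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) under the normal approximation
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for illustration only (not the study's data)
or_, lo, hi = odds_ratio_ci(20, 30, 25, 372)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A wide interval, as in the 18.8 (7.7-46.2) result above, typically reflects small cell counts in the failure column.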
The Student Movement Volume 105 Issue 4: Students Destress in AUSA's Nest
NEWS
AU Adelante Club Hosts Vespers, Joelle Kim
AUSA Hosts The Nest, Amanda Cho
The Gazebo Reopens with GetFood App, Taylor Uphus
PULSE
Cultural Hispanic Catchphrases, Wambui Karanja
Keep Calm and Breathing On (Yourself), Jessica Rim
Meet & Make: Reflections, Masy Domecillo
HUMANS
Event Planning With Malachi Regis, Interviewed by Fitz-Earl McKenzie II
Interview with Michael Nixon: Vice President for Diversity & Inclusion, Interviewed by Abigail Lee
Meet Professor Pedro Navia, Interviewed by Pearl Parker
Torian Hill, Interviewed by TJ Hunter
ARTS & ENTERTAINMENT
Hispanic Artist Feature: Felix Gillett, Megan Napod
Música para el Alma or Music for the Soul, Hannah Cruse
Signal Boost, Alannah Tjhatra
IDEAS
A Defense Against Burnout: Why Meaning Matters, Adoniah Simon
Remembering RBG: Part Two, Lyle Goulbourne
THE LAST WORD
How Does it Change Us?, Daniel Self
Comparative population structure of <i>Plasmodium malariae</i> and <i>Plasmodium falciparum</i> under different transmission settings in Malawi
<b>Background:</b> Described here is the first population genetic study of Plasmodium malariae, the causative agent of quartan malaria. Although not as deadly as Plasmodium falciparum, P. malariae is more common than previously thought, and is frequently found in sympatry and co-infection with P. falciparum, making its study increasingly important. This study compares the population parameters of the two species in two districts of Malawi with different malaria transmission patterns - one seasonal, one perennial - to explore the effects of transmission on population structures.
<BR/>
<b>Methods:</b> Six species-specific microsatellite markers were used to analyse 257 P. malariae samples and 257 P. falciparum samples matched for age, gender and village of residence. Allele sizes were scored to within 2 bp for each locus and haplotypes were constructed from dominant alleles in multiple infections. Analysis of multiplicity of infection (MOI), population differentiation, clustering of haplotypes and linkage disequilibrium was performed for both species. Regression analyses were used to determine association of MOI measurements with clinical malaria parameters.
<BR/>
<b>Results:</b> Multiple-genotype infections within each species were common in both districts, accounting for 86.0% of P. falciparum and 73.2% of P. malariae infections and did not differ significantly with transmission setting. Mean MOI of P. falciparum was increased under perennial transmission compared with seasonal (3.14 vs 2.59, p = 0.008) and was greater in children compared with adults. In contrast, P. malariae mean MOI was similar between transmission settings (2.12 vs 2.11) and there was no difference between children and adults. Population differentiation showed no significant differences between villages or districts for either species. There was no evidence of geographical clustering of haplotypes. Linkage disequilibrium amongst loci was found only for P. falciparum samples from the seasonal transmission setting.
<BR/>
<b>Conclusions:</b> The extent of similarity between P. falciparum and P. malariae population structure described by the high level of multiple infection, the lack of significant population differentiation or haplotype clustering and lack of linkage disequilibrium is surprising given the differences in the biological features of these species that suggest a reduced potential for out-crossing and transmission in P. malariae. The absence of a rise in P. malariae MOI with increased transmission or a reduction in MOI with age could be explained by differences in the duration of infection or degree of immunity compared to P. falciparum
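Multiplicity of infection (MOI) from microsatellite data is conventionally estimated as the largest number of distinct alleles observed at any single genotyped locus in a sample. A minimal sketch of that estimator follows, with hypothetical allele calls; the study's exact scoring pipeline (e.g. the 2 bp binning and dominant-allele haplotyping) is not reproduced here.

```python
def multiplicity_of_infection(sample):
    """Estimate MOI as the maximum number of distinct alleles seen
    at any single locus (a standard microsatellite-based estimator)."""
    return max(len(set(alleles)) for alleles in sample.values())

# Hypothetical allele calls (fragment sizes in bp) at three loci
sample = {
    "locus1": [120, 124],        # two alleles -> mixed infection
    "locus2": [98],              # single allele
    "locus3": [150, 150, 156],   # duplicates collapse to two alleles
}
print(multiplicity_of_infection(sample))
```

Under this definition an MOI of 1 indicates a single-genotype infection, so the mean MOI values above (around 2-3) imply that multiple-genotype infections were the norm in both districts.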
miR-132/212 knockout mice reveal roles for these miRNAs in regulating cortical synaptic transmission and plasticity
miR-132 and miR-212 are two closely related miRNAs encoded in the same intron of a small non-coding gene, which have been suggested to play roles in both immune and neuronal function. We describe here the generation and initial characterisation of a miR-132/212 double knockout mouse. These mice were viable and fertile with no overt adverse phenotype. Analysis of innate immune responses, including TLR-induced cytokine production and IFNβ induction in response to viral infection of primary fibroblasts did not reveal any phenotype in the knockouts. In contrast, the loss of miR-132 and miR-212, while not overtly affecting neuronal morphology, did affect synaptic function. In both hippocampal and neocortical slices miR-132/212 knockout reduced basal synaptic transmission, without affecting paired-pulse facilitation. Hippocampal long-term potentiation (LTP) induced by tetanic stimulation was not affected by miR-132/212 deletion, whilst theta burst LTP was enhanced. In contrast, neocortical theta burst-induced LTP was inhibited by loss of miR-132/212. Together these results indicate that miR-132 and/or miR-212 play a significant role in synaptic function, possibly by regulating the number of postsynaptic AMPA receptors under basal conditions and during activity-dependent synaptic plasticity
Identifying factors likely to influence compliance with diagnostic imaging guideline recommendations for spine disorders among chiropractors in North America: a focus group study using the Theoretical Domains Framework
Background: The Theoretical Domains Framework (TDF) was developed to investigate determinants of specific clinical behaviors and inform the design of interventions to change professional behavior. This framework was used to explore the beliefs of chiropractors in an American Provider Network and two Canadian provinces about their adherence to evidence-based recommendations for spine radiography for uncomplicated back pain. The primary objective of the study was to identify chiropractors’ beliefs about managing uncomplicated back pain without x-rays and to explore barriers and facilitators to implementing evidence-based recommendations on lumbar spine x-rays. A secondary objective was to compare chiropractors in the United States and Canada on their beliefs regarding the use of spine x-rays.
Methods: Six focus groups exploring beliefs about managing back pain without x-rays were conducted with a purposive sample. The interview guide was based upon the TDF. Focus groups were digitally recorded, transcribed verbatim, and analyzed by two independent assessors using thematic content analysis based on the TDF.
Results: Five domains were identified as likely relevant. Key beliefs within these domains included the following: conflicting comments about the potential consequences of not ordering x-rays (risk of missing a pathology, avoiding adverse treatment effects, risks of litigation, determining the treatment plan, and using x-ray-driven techniques contrasted with perceived benefits of minimizing patient radiation exposure and reducing costs; beliefs about consequences); beliefs regarding professional autonomy, professional credibility, lack of standardization, and agreement with guidelines widely varied (social/professional role & identity); the influence of formal training, colleagues, and patients also appeared to be important factors (social influences); conflicting comments regarding levels of confidence and comfort in managing patients without x-rays (belief about capabilities); and guideline awareness and agreements (knowledge).
Conclusions: Chiropractors’ use of diagnostic imaging appears to be influenced by a number of factors. Five key domains may be important considering the presence of conflicting beliefs, evidence of strong beliefs likely to impact the behavior of interest, and the high frequency of beliefs. The results will inform the development of a theory-based survey to help identify potential targets for behavioral-change strategies.
Hypernovae and Other Black-Hole-Forming Supernovae
During the last few years, a number of exceptional core-collapse supernovae
(SNe) have been discovered. The kinetic energies of their explosions are
larger by more than an order of magnitude than the typical values for this
type of SNe, so these SNe have been called `Hypernovae'. We first describe how the
basic properties of hypernovae can be derived from observations and modeling.
These hypernovae seem to come from rather massive stars, thus forming black
holes. On the other hand, there are some examples of massive SNe with only a
small kinetic energy. We suggest that stars with non-rotating black holes are
likely to collapse "quietly" ejecting a small amount of heavy elements (Faint
supernovae). In contrast, stars with rotating black holes are likely to give
rise to very energetic supernovae (Hypernovae). We present distinct
nucleosynthesis features of these two types of "black-hole-forming" supernovae.
Hypernova nucleosynthesis is characterized by larger abundance ratios
(Zn,Co,V,Ti)/Fe and smaller (Mn,Cr)/Fe. Nucleosynthesis in Faint supernovae is
characterized by a large amount of fall-back. We show that the abundance
pattern of the most Fe deficient star, HE0107-5240, and other extremely
metal-poor carbon-rich stars are in good accord with those of
black-hole-forming supernovae, but not pair-instability supernovae. This
suggests that black-hole-forming supernovae made important contributions to the
early Galactic (and cosmic) chemical evolution.
Comment: 49 pages, to be published in "Stellar Collapse" (Astrophysics and
Space Science; Kluwer), ed. C. L. Fryer (2003)
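Abundance ratios such as (Zn,Co,V,Ti)/Fe are conventionally quoted in bracket notation relative to solar values. A minimal sketch of that conversion, using hypothetical number densities chosen only to illustrate the definition:

```python
import math

def x_over_fe(n_x, n_fe, n_x_sun, n_fe_sun):
    """Bracket notation: [X/Fe] = log10(N_X/N_Fe) - log10(N_X/N_Fe)_sun.
    Positive values mean X is enhanced relative to Fe compared with the Sun."""
    return math.log10(n_x / n_fe) - math.log10(n_x_sun / n_fe_sun)

# Hypothetical densities: a star with ten times the solar Zn/Fe ratio
print(x_over_fe(10.0, 1.0, 1.0, 1.0))
```

In these terms, the hypernova signature described above corresponds to positive [Zn/Fe] and [Co/Fe] together with negative [Mn/Fe] and [Cr/Fe].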
Hand sanitisers for reducing illness absences in primary school children in New Zealand: a cluster randomised controlled trial study protocol
<p>Abstract</p> <p>Background</p> <p>New Zealand has relatively high rates of morbidity and mortality from infectious disease compared with other OECD countries, with infectious disease being more prevalent in children compared with others in the population. Consequences of infectious disease in children may have significant economic and social impact beyond the direct effects of the disease on the health of the child, including absence from school, transmission of infectious disease to other pupils, staff, and family members, and time off work for parents/guardians. Reduction of the transmission of infectious disease between children at schools could be an effective way of reducing the community incidence of infectious disease. Alcohol-based no-rinse hand sanitisers provide an alternative hand cleaning technology, for which there is some evidence that they may be effective in achieving this. However, very few studies have investigated the effectiveness of hand sanitisers, and importantly, the potential wider economic implications of this intervention have not been established.</p> <p>Aims</p> <p>The primary objective of this trial is to establish if the provision of hand sanitisers in primary schools in the South Island of New Zealand, in addition to an education session on hand hygiene, reduces the incidence rate of absence episodes due to illness in children. In addition, the trial will establish the cost-effectiveness and conduct a cost-benefit analysis of the intervention in this setting.</p> <p>Methods/Design</p> <p>A cluster randomised controlled trial will be undertaken to establish the effectiveness and cost-effectiveness of hand sanitisers. Sixty-eight primary schools will be recruited from three regions in the South Island of New Zealand. The schools will be randomised, within region, to receive hand sanitisers and an education session on hand hygiene, or an education session on hand hygiene alone.
Fifty pupils from each school in years 1 to 6 (generally aged from 5 to 11 years) will be randomly selected for detailed follow-up about their illness absences, providing a total of 3400 pupils. In addition, absence information will be collected on all children from the school rolls. Investigators not involved in the running of the trial, outcome assessors, and the statistician will be blinded to the group allocation until the analysis is completed.</p> <p>Trial registration</p> <p>ACTRN12609000478213</p>
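The stratified cluster randomisation described above (68 schools allocated within three regions) can be sketched as follows. The school identifiers, region sizes and 1:1 allocation ratio are illustrative assumptions, not details taken from the protocol.

```python
import random

def randomise_within_region(schools_by_region, seed=0):
    """Cluster randomisation stratified by region: within each region,
    half the schools receive sanitiser plus hygiene education and half
    receive the education session alone. Illustrative sketch only."""
    rng = random.Random(seed)  # fixed seed for a reproducible allocation
    allocation = {}
    for region, schools in schools_by_region.items():
        shuffled = schools[:]
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        for school in shuffled[:half]:
            allocation[school] = "sanitiser + education"
        for school in shuffled[half:]:
            allocation[school] = "education only"
    return allocation

# Hypothetical school identifiers across three regions (totalling 68)
regions = {"RegionA": [f"A{i}" for i in range(24)],
           "RegionB": [f"B{i}" for i in range(22)],
           "RegionC": [f"C{i}" for i in range(22)]}
alloc = randomise_within_region(regions)
print(sum(v == "sanitiser + education" for v in alloc.values()))
```

Randomising whole schools rather than individual pupils is what makes this a cluster trial: the intervention operates at school level, so school is the natural unit of allocation and analysis.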
Quantum Measurement Theory in Gravitational-Wave Detectors
The rapid progress in improving the sensitivity of gravitational-wave (GW)
detectors witnessed in recent years has brought the scientific community to
the point where the quantum behaviour of such immense measurement devices as
kilometre-long interferometers starts to matter. The time when their
sensitivity will be limited mainly by the quantum noise of light is around
the corner, and finding ways to reduce it will become a necessity. Therefore,
the primary goal of this review is to familiarize a broad spectrum of readers
with the theory of quantum measurements in the very form in which it finds
application in the area of gravitational-wave
detection. We focus on how quantum noise arises in gravitational-wave
interferometers and what limitations it imposes on the achievable sensitivity.
We start from the very basic concepts and gradually advance to the general
linear quantum measurement theory and its application to the calculation of
quantum noise in the contemporary and planned interferometric detectors of
gravitational radiation of the first and second generation. Special attention
is paid to the concept of Standard Quantum Limit and the methods of its
surmounting.
Comment: 147 pages, 46 figures, 1 table. Published in Living Reviews in
Relativity
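For orientation, the Standard Quantum Limit mentioned above is commonly written, for the strain-referred noise spectral density of a Michelson interferometer with mirror mass M, arm length L and signal frequency Ω, as (up to configuration-dependent factors of order unity):

```latex
S_h^{\mathrm{SQL}}(\Omega) \simeq \frac{8\hbar}{M\,\Omega^{2}L^{2}}
```

It arises from the trade-off between shot noise and quantum radiation-pressure noise, and the methods of surmounting it are the subject of the review.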
Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy
Background
A reliable system for grading operative difficulty of laparoscopic cholecystectomy would standardise description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets.
Methods
Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series of 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall’s tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis.
Results
A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was found to be most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was found to be a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001).
Conclusion
We have shown that an operative difficulty scale can standardise the description of operative findings by surgeons of multiple grades to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty, and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
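The AUROC figures quoted in the Results have a direct rank interpretation: the probability that a randomly chosen case with the outcome (e.g. conversion to open surgery) carries a higher difficulty grade than a randomly chosen case without it, with ties counting one half. A minimal sketch of that computation, using hypothetical grades rather than the study's data:

```python
def auroc(scores_pos, scores_neg):
    """AUROC as the normalised Mann-Whitney U statistic: the fraction of
    positive/negative pairs where the positive case scores higher,
    counting ties as half a win."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical difficulty grades (1-5): converted vs completed cases
converted = [4, 5, 3, 5]
completed = [1, 2, 2, 3, 1]
print(auroc(converted, completed))
```

An AUROC near 0.9, as reported for conversion to open surgery, therefore means the grade almost always ranks converted cases above completed ones.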