Mobile applications for weather and climate information: their use and potential for smallholder farmers
Mobile phones are increasingly used to provide smallholder farmers with agricultural and related information, and there is currently great interest in their scope to communicate climate and weather information. Farmers consistently identify a demand for weather information, and whilst ICTs may be one way of delivering this at scale, there are concerns that they should not be seen as a panacea. At a time when a range of initiatives and projects have been implemented, this paper seeks to draw lessons and identify key considerations to inform the development of future mobile applications that provide climate services to smallholder farmers. A literature review, interviews with key informants and experts, and 15 case study reviews were conducted, focusing principally on Sub-Saharan Africa but including some examples from India.
Despite numerous initiatives, few have developed fully beyond the pilot stage and few have been evaluated. Some of the provision to date has been of questionable value to farmers. A key observation is that relatively little attention has been paid in design to farmers’ needs for, and use of, both the information and the technology, and few attempts have been made to differentiate provision according to gender and other demographic variables. Other factors contributing to success included communications approaches that are interactive and/or involve trusted intermediaries who can add context to, and help interpret, more complex information. Providing weather information alongside other services as ‘bundles’, and in conjunction with complementary communications approaches, appears to work well. An important challenge is how to meet farmers’ needs for location-specific, timely and relevant information in economically sustainable ways. More widely, there are challenges in achieving successful business models and potential conflicts between initiatives driven by mobile network operators and public goals.
The study identified areas of considerable potential, which include: the use of increasingly available mobile data connections to ensure locally relevant content (including both historical climate information and forecasts) is available to farmers in a timely fashion; development of participatory decision-making tools to enable farmers to interpret information for their own contexts and consider implications and management options; use of visual applications and participatory video on mobile devices to enhance learning and advisory services for farmers; and the potential for increased feedback between farmers and service providers, as well as increased knowledge sharing between farmers, afforded by the use of social media.
Iatrogenic CJD due to pituitary-derived growth hormone with genetically determined incubation times of up to 40 years
Patients with iatrogenic Creutzfeldt-Jakob disease due to administration of cadaver-sourced growth hormone during childhood are still being seen in the UK 30 years after cessation of this treatment. Of the 77 patients who have developed iatrogenic Creutzfeldt-Jakob disease, 56 have been genotyped. There has been a marked change over time in the genotype profile at polymorphic codon 129 of the prion protein gene (PRNP), from predominantly valine homozygous to a mixed picture of methionine homozygous and methionine-valine heterozygous. The incubation period of iatrogenic Creutzfeldt-Jakob disease differs significantly between all three genotypes. This experience is in striking contrast with that in France and the USA, which may relate to contamination of different growth hormone batches with different strains of human prions. We describe the clinical, imaging, molecular and autopsy features in 22 of the 24 patients who have developed iatrogenic Creutzfeldt-Jakob disease in the UK since 2003. Mean age at onset of symptoms was 42.7 years. Gait ataxia and lower limb dysaesthesiae were the most frequent presenting symptoms. All had cerebellar signs, and the majority had myoclonus and lower limb pyramidal signs, with relatively preserved cognitive function, when first seen. There was a progressive decline in neurological and cognitive function leading to death after 5-32 (mean 14) months. Despite incubation periods approaching 40 years, the clinical duration in methionine homozygote patients appeared to be shorter than that seen in heterozygote patients. MRI showed restricted diffusion in the basal ganglia, thalamus, hippocampus, frontal and paracentral motor cortex, and cerebellar vermis. The electroencephalogram was abnormal in 15 patients, and cerebrospinal fluid 14-3-3 protein was positive in half the patients. Neuropathological examination was conducted in nine patients. All but one showed synaptic prion deposition with numerous kuru-type plaques in the basal ganglia, anterior frontal and parietal cortex, thalamus and cerebellum. The patient with the shortest clinical duration had an atypical synaptic deposition of abnormal prion protein and no kuru plaques. Taken together, these data provide a remarkable example of the interplay between the strain of the pathogen and the host prion protein genotype. Based on extensive modelling of human prion transmission barriers in transgenic mice expressing human prion protein on a mouse prion protein null background, the temporal distribution of codon 129 genotypes within the cohort of patients with iatrogenic Creutzfeldt-Jakob disease in the UK suggests that there was a point source of infecting prion contamination of growth hormone, derived from a patient with Creutzfeldt-Jakob disease expressing prion protein valine 129.
History of clinical transplantation
The emergence of transplantation has seen the development of increasingly potent immunosuppressive agents, progressively better methods of tissue and organ preservation, refinements in histocompatibility matching, and numerous innovations in surgical techniques. Such efforts in combination ultimately made it possible to successfully engraft all of the organs and bone marrow cells in humans. At a more fundamental level, however, the transplantation enterprise hinged on two seminal turning points. The first was the recognition by Billingham, Brent, and Medawar in 1953 that it was possible to deliberately induce chimerism-associated neonatal tolerance. This discovery escalated over the next 15 years to the first successful bone marrow transplantations in humans in 1968. The second turning point was the demonstration during the early 1960s that canine and human organ allografts could self-induce tolerance with the aid of immunosuppression. By the end of 1962, however, it had been incorrectly concluded that turning points one and two involved different immune mechanisms. The error was not corrected until well into the 1990s. In this historical account, the vast literature that sprang up during the intervening 30 years has been summarized. Although admirably documenting empiric progress in clinical transplantation, its failure to explain organ allograft acceptance predestined organ recipients to lifetime immunosuppression and precluded fundamental changes in treatment policies. After it was discovered in 1992 that long-surviving organ transplant recipients had persistent microchimerism, it was possible to see the mechanistic commonality of organ and bone marrow transplantation. A clarifying central principle of immunology could then be synthesized with which to guide efforts to induce tolerance systematically to human tissues and perhaps ultimately to xenografts.
Contrasting prefrontal cortex contributions to episodic memory dysfunction in behavioural variant frontotemporal dementia and Alzheimer's disease
Recent evidence has questioned the integrity of episodic memory in behavioural variant frontotemporal dementia (bvFTD), where recall performance is impaired to the same extent as in Alzheimer's disease (AD). While these deficits appear to be mediated by divergent patterns of brain atrophy, there is evidence to suggest that certain prefrontal regions are implicated in both patient groups. In this study we sought to further elucidate the dorsolateral (DLPFC) and ventromedial (VMPFC) prefrontal contributions to episodic memory impairment in bvFTD and AD. Performance on episodic memory tasks and neuropsychological measures typically tapping into either DLPFC or VMPFC functions was assessed in 22 bvFTD patients, 32 AD patients and 35 age- and education-matched controls. Behaviourally, the patient groups did not differ on measures of episodic memory recall or DLPFC-mediated executive functions, but bvFTD patients were significantly more impaired on measures of VMPFC-mediated executive functions. Composite measures of the recall, DLPFC and VMPFC task scores were covaried against the T1 MRI scans of all participants to identify regions of atrophy correlating with performance on these tasks. The imaging analysis showed that impaired recall performance is associated with divergent patterns of PFC atrophy in bvFTD and AD: whereas in bvFTD the PFC atrophy covariates for recall encompassed both DLPFC and VMPFC regions, only the DLPFC was implicated in AD. Our results suggest that episodic memory deficits in bvFTD and AD are underpinned by divergent prefrontal mechanisms. Moreover, we argue that these differences are not adequately captured by existing neuropsychological measures.
Influence of phyllosilicate mineral assemblages, fabrics, and fluids on the behavior of the Punchbowl fault, southern California
The reaction in counter-action: how Meisner technique and active analysis complement each other
This article is an investigation into the difference between reaction and counter-action. The question arose during my experience of Active Analysis at the Stanislavski Acting Laboratory at the University of California, Riverside. In the Meisner technique the emphasis lies on instinctive reaction, whereas in Stanislavski’s Active Analysis the action and counter-action are emphasised. Counter-action can be seen as the force working against the main action of the scene, creating conflict. Having extensive knowledge of reaction and experience of counter-action, it became important to understand the difference between the two concepts and the importance of both in actor training and in application to text. Through research into Action-Perception theory, self-regulation and motivation, I attempt to dissect the fundamental discrepancies between the two principles. My findings show that reaction stems from impulse and instinct, whereas counter-action is rooted in motivation. When motivation and instinct are in conflict, self-regulation will attempt to suppress the impulse and override it with an alternative counter-action more suited to the overall motivation. As self-regulation is a limited resource, prolonged use will make this harder to control. Emotion control draws on the same limited resource as self-regulation, suggesting that emotion regulation would be affected by a conflict between instinct and motivation. These conclusions have a strong impact on how emotions are manifested and produced in actors, and warrant a re-evaluation of how actors reach emotional connection to the given circumstances, as well as how emotion is viewed and engaged with in actor training in general.
Vigilance in a Cooperatively Breeding Primate
Collective vigilance is considered a major advantage of group living in animals. We investigated vigilance behavior in wild mustached tamarins (Saguinus mystax), small, arboreal, cooperatively breeding New World primates that form stable mixed-species groups with saddleback tamarins (Saguinus fuscicollis). We aimed 1) to investigate whether vigilance patterns change according to individual activity and 2) to examine whether there is a social component of vigilance in their cooperative and nonaggressive society. We studied 11 factors that may influence vigilance and used these data to interpret the possible functions of vigilance. We observed 44 individuals in 3 mixed-species and 2 single-species groups from 2 populations that differed in population density and home range size. Vigilance changed greatly when individuals were engaged in different activities, and individual vigilance was affected by different sets of factors depending on the activity. As vigilance decreased in proximity to conspecifics and heterospecifics when feeding, and in larger mixed-species groups when resting, we conclude that the predominant function of vigilance in mustached tamarins is predator related. However, the absence of a group size effect in very large single-species groups suggests that vigilance may also function to maintain group cohesion. In the population with higher density and smaller home ranges, individuals also increased their vigilance in home range overlap areas. We found no evidence that mustached tamarins monitor group mates to avoid food stealing or aggression. The effect of heterospecifics on individual vigilance suggests that collective vigilance might have been an important incentive in the evolution of tamarin mixed-species groups.
Escalating costs of self-injury mortality in the 21st century United States: an interstate observational study
Background
Estimating the economic costs of self-injury mortality (SIM) can inform health planning and clinical and public health interventions, serve as a basis for their evaluation, and provide the foundation for broadly disseminating evidence-based policies and practices. SIM is operationalized as a composite of all registered suicides at any age, plus 80% of drug overdose (intoxication) deaths medicolegally classified as ‘accidents’ and 90% of corresponding undetermined (intent) deaths in the age group 15 years and older. It is the long-term practice of the United States (US) Centers for Disease Control and Prevention (CDC) to subsume poisoning (drug and nondrug) deaths under the injury rubric. This study aimed to estimate the magnitude of, and change in, SIM and suicide costs in 2019 dollars for the US, including the 50 states and the District of Columbia.
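In symbols (notation ours, not the authors’), with S denoting all registered suicides at any age, D_acc denoting drug overdose deaths medicolegally classified as accidents among those aged 15 and older, and D_und denoting the corresponding undetermined-intent overdose deaths, the operational definition above reads:
SIM = S + 0.8 × D_acc + 0.9 × D_und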
Methods
Cost estimates were generated from underlying cause-of-death data for 1999/2000 and 2018/2019 from the CDC’s Wide-ranging ONline Data for Epidemiologic Research (WONDER). Estimation utilized the updated version of Medical and Work Loss Cost Estimation Methods for CDC’s Web-based Injury Statistics Query and Reporting System (WISQARS). Exposures were medical expenditures, lost work productivity, and future quality of life loss. Main outcome measures were disaggregated, annual-averaged total and per capita costs of SIM and suicide for the nation and states in 1999/2000 and 2018/2019.
Results
40,834 annual-averaged self-injury deaths were identified in 1999/2000 and 101,325 in 2018/2019. Estimated national costs of SIM rose by 143%, to $1.12 trillion. Ratios of quality-of-life and work losses to medical spending in 2019 US dollars were 1,476 and 526, respectively, in 2018/2019, versus 1,419 and 526 in 1999/2000. Total national suicide costs increased 58%, to $502.7 billion. National per capita costs of SIM doubled over the observation period, to $3,413; per capita costs of the suicide component rose to $1,534. States in the top quintile for per capita SIM cost increases, those whose increases exceeded 152%, were concentrated in the Great Lakes, Southeast, Mideast and New England regions. States in the bottom quintile, those with per capita cost increases below 70%, were located in the Far West, Southwest, Plains, and Rocky Mountain regions. West Virginia exhibited the largest increase, at 263%, and Nevada the smallest, at 22%. Percentage per capita cost increases for suicide were smaller than for SIM. Only the Far West, Southwest and Mideast were not represented in the top quintile, which comprised states with increases of 50% or greater. The bottom quintile comprised states with per capita suicide cost increases below 24%; regions represented were the Far West, Southeast, Mideast and New England. North Dakota and Nevada occupied the extremes of the cost-change continuum, at 75% and −1%, respectively.
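As a rough consistency check on the scale of these figures (our arithmetic, not the authors’): a 143% increase corresponds to a multiplier of 2.43 on the 1999/2000 baseline, and dividing the 2018/2019 national SIM cost by the per capita figure, $1.12 trillion / $3,413 ≈ 328 million, recovers the approximate 2019 US population, so the total and per capita estimates are mutually consistent.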
Conclusion
The economic costs of SIM to society are large and surging. Federal and state prevention and intervention programs should be financed with a clear understanding of the total costs—fiscal, social, and personal—incurred by deaths due to self-injurious behaviors.
Formation and physicochemical properties of crystalline and amorphous salts with different stoichiometries formed between ciprofloxacin and succinic acid
Multi-ionizable compounds, such as dicarboxylic acids, offer the possibility of forming salts of drugs with multiple stoichiometries. Attempts to crystallize ciprofloxacin, a poorly water-soluble, amphoteric molecule, with succinic acid (S) resulted in isolation of ciprofloxacin hemisuccinate (1:1) trihydrate (CHS-I) and ciprofloxacin succinate (2:1) tetrahydrate (CS-I). Anhydrous ciprofloxacin hemisuccinate (CHS-II) and anhydrous ciprofloxacin succinate (CS-II) were also obtained. It was also possible to obtain stoichiometrically equivalent amorphous salt forms, CHS-III and CS-III, by spray drying and milling, respectively, of the drug and acid. Anhydrous CHS and CS had melting points of ∼215 and ∼228 °C, respectively, while the glass transition temperatures of CHS-III and CS-III were ∼101 and ∼79 °C, respectively. Dynamic solubility studies revealed the metastable nature of CS-I in aqueous media, resulting in a transformation of CS-I to a mix of CHS-I and ciprofloxacin 1:3.7 hydrate, consistent with the phase diagram. CS-III was observed to dissolve noncongruently, leading to high and sustainable drug solution concentrations in water at 25 and 37 °C, with a ciprofloxacin concentration of 58.8 ± 1.18 mg/mL after 1 h at 37 °C. This work shows that crystalline salts with multiple stoichiometries and amorphous salts have diverse pharmaceutically relevant properties, including molecular, solid state, and solubility characteristics.
Funding: Solid State Pharmaceutical Cluster (SSPC), supported by Science Foundation Ireland under grant number 07/SRC/B1158
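To make the two stoichiometries concrete (our notation, inferred from the salt nomenclature rather than stated in the abstract): succinic acid is diprotic, so it can protonate the basic piperazine nitrogen of one or two ciprofloxacin molecules. In the hemisuccinate (1:1), one ciprofloxacin cation pairs with a singly deprotonated hydrogen succinate anion, CipH⁺ · HSucc⁻; in the succinate (2:1), two ciprofloxacin cations pair with one doubly deprotonated succinate dianion, (CipH⁺)₂ · Succ²⁻.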