Chemotherapeutic errors in hospitalised cancer patients: attributable damage and extra costs
Background: Despite increasing efforts to enhance patient safety, medication errors in hospitalised patients remain relatively common and can have severe consequences. This study aimed to assess antineoplastic medication errors, both those that reached patients and those that were intercepted, in terms of frequency, severity for patients, and costs.
Methods: A 1-year prospective study was conducted to identify the medication errors that occurred during chemotherapy treatment of cancer patients at a French university hospital. The severity and potential consequences of intercepted errors were independently assessed by two physicians. A cost analysis was performed by simulating the hospital stays the errors would have caused, with estimates based on the costs of diagnosis-related groups.
Results: Among the 6,607 antineoplastic prescriptions, 341 (5.2%) contained at least one error, corresponding to a total of 449 medication errors. Most errors (n = 436), however, were intercepted before the medication was administered to the patient. Prescription errors represented 91% of errors, followed by pharmaceutical (8%) and administration errors (1%). According to an independent estimation, 13.4% of the avoided errors would have resulted in temporary injury and 2.6% in permanent damage, while a further 2.6% would have been life-threatening, meaning that four to eight deaths were avoided. Overall, 13 medication errors reached the patient without causing damage, although two patients required enhanced monitoring. Had the intercepted errors not been discovered, they would have resulted in 216 additional days of hospitalisation and an estimated annual cost of 92,907€, comprising 69,248€ (74%) for hospital stays and 23,658€ (26%) for additional drugs.
Conclusion: Our findings show that very few chemotherapy errors actually reach patients, although errors in the chemotherapy ordering process are frequent and potentially dangerous and costly.
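The cost analysis described above simulates the hospital stays that intercepted errors would have caused and prices them using diagnosis-related groups (DRGs). A minimal sketch of that aggregation, with hypothetical field names and structure (the abstract does not publish the study's actual data model):

```python
# Illustrative sketch of a DRG-based cost simulation for intercepted errors.
# All field names and the per-day costing are assumptions, not the study's
# actual data model.
from dataclasses import dataclass

@dataclass
class InterceptedError:
    extra_hospital_days: int   # simulated extra stay had the error reached the patient
    drg_cost_per_day: float    # cost taken from the matching diagnosis-related group
    additional_drug_cost: float

def simulate_annual_cost(errors: list[InterceptedError]) -> dict:
    """Aggregate simulated hospital-stay and drug costs over one year."""
    stay_cost = sum(e.extra_hospital_days * e.drg_cost_per_day for e in errors)
    drug_cost = sum(e.additional_drug_cost for e in errors)
    return {
        "extra_days": sum(e.extra_hospital_days for e in errors),
        "stay_cost": stay_cost,
        "drug_cost": drug_cost,
        "total_cost": stay_cost + drug_cost,
    }
```

Summing per-error stay and drug costs in this way would yield the two components reported in the abstract (74% hospital stays, 26% additional drugs).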
The European TeleCheck-AF project on remote app-based management of atrial fibrillation during the COVID-19 pandemic: Centre and patient experiences
Aims: TeleCheck-AF is a multicentre international project initiated to maintain care delivery for patients with atrial fibrillation (AF) during COVID-19 through teleconsultations supported by an on-demand, photoplethysmography-based heart rate and rhythm monitoring app (FibriCheck®). We describe the characteristics, inclusion rates, and experiences of the participating centres within the TeleCheck-AF infrastructure, as well as the characteristics and experiences of the recruited patients.
Methods: Three surveys exploring centre characteristics (n = 25), centre experiences (n = 23), and patient experiences (n = 826) were completed. Self-reported patient characteristics were obtained from the app.
Results: Most centres were academic (64%); the remainder were specialised public cardiology/district hospitals (36%). The majority of centres had AF outpatient clinics (64%), and only 36% had AF ablation clinics. The time required to start patient inclusion and the total number of included patients were comparable for centres experienced (56%) and inexperienced in mHealth use. Within 28 weeks, 1930 AF patients were recruited, mainly for remote AF control (31% of patients) and follow-up after AF ablation (42%). The average inclusion rate was highest during the lockdown restrictions and settled at a lower steady state after the restrictions were eased (188 vs. 52 patients recruited per week). The majority (>80%) of centres reported no problems implementing the TeleCheck-AF approach. Of the recruited patients (median age 64 years [55-71], 62% male), 94% agreed that the FibriCheck® app was easy to use.
Conclusions: Despite different health care settings and levels of mHealth experience, the TeleCheck-AF approach could be set up within an extremely short time and was easily used across different European centres during COVID-19.
Effect of temperature on oviposition in four species of the melanogaster group of Drosophila
Crohn's disease of the anus
Early surgical treatment is important, whether the lesions are simple or complicated. Complete debridement, combined with careful management of the skin lesions and prolonged postoperative anal dilatation, gives excellent results.
Does the Musculus Cricopharyngeus Play a Role in the Genesis of Zenker’s Diverticulum? Enzyme Histochemical and Contractility Properties
Continuous Quantitative Monitoring of Mural, Platelet-Dependent, Thrombus Kinetics in the Crushed Rat Femoral Vein
The effects of sampling method and vegetation type on the estimated abundance of Ixodes ricinus ticks in forests
Estimating the spatial and temporal variation in tick abundance is of great economic and ecological importance. Entire-blanket dragging is the most widely used method to sample free-living ixodid ticks. However, this technique is not equally efficient in all vegetation types. The height and structure of the vegetation under study determine not only the likelihood of tick-blanket contact but also the rate of dislodgement. The purpose of this study was therefore to determine whether the alternative strip-blanket is more effective at picking up ticks than the standard entire-blanket. Sampling was carried out in four forest understory vegetation types that differed in height and structure, on five collection dates between April and September 2008. A total of 8,068 Ixodes ricinus ticks were collected (778 adults, 1,920 nymphs, and 5,370 larvae). The highest numbers of ticks were collected along the forest trails, where the dominant vegetation consisted of short grasses. The lowest numbers were collected in bracken-fern-dominated sites, where the vegetation seriously hampered tick sampling. Surprisingly, in each vegetation type, significantly more nymphs and adults were collected using the entire-blanket. The strip-blanket, however, was more effective in collecting larvae, especially in dense and tall vegetation.
Local habitat and landscape affect Ixodes ricinus tick abundances in forests on poor, sandy soils
A large fraction of the forests in northern Belgium consists of homogeneous pine stands on nutrient-poor and acid sandy soils. However, in common with many other parts of Europe, current forest management aims at increasing the share of deciduous and mixed forests. This might create favourable habitats for the tick Ixodes ricinus, which is Europe's main vector of Borrelia burgdorferi sensu lato, the causative agent of Lyme borreliosis in humans. Considering the threat to human health, it is important to know which factors regulate tick abundance. The influence of local habitat and landscape variables on the abundance of I. ricinus ticks was studied by collecting questing larvae, nymphs, and adults at 176 locations in forests in the Campine region (northern Belgium). Both I. ricinus ticks and B. burgdorferi spirochetes occurred throughout the study area, which means that the entire region represents an area of risk for contracting Lyme borreliosis. At the forest stand level, the main tree species and the shrub cover significantly affected the abundance of all life stages of I. ricinus. Abundance was higher in oak stands than in pine stands and increased with increasing shrub cover. Additionally, at the landscape level, a positive effect was found for forest edge length but not for forest cover. These patterns may be explained by the habitat preferences of the tick's main hosts. Our results indicate that forest conversion might indeed create suitable habitats for ticks, which highlights the need for intensive information campaigns and effective tick control measures.
Funding: IWT-Flanders, the Institute for the Promotion of Innovation through Science and Technology in Flanders.
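The abstract reports significant effects of tree species, shrub cover, and forest edge length on tick counts, but does not name the statistical model used. As an illustrative sketch only, a negative binomial GLM (a common choice for overdispersed count data such as tick counts) relating hypothetical counts to these variables might look like this; all variable names and data are invented:

```python
# Illustrative sketch: relating tick counts to habitat and landscape
# variables with a negative binomial GLM. The abstract does not specify
# the model actually used; data and variable names here are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "nymph_count":  [12, 3, 25, 0, 8, 17, 1, 30],
    "tree_species": ["oak", "pine", "oak", "pine", "pine", "oak", "pine", "oak"],
    "shrub_cover":  [60, 10, 75, 5, 20, 55, 15, 80],          # percent cover
    "edge_length":  [1.2, 0.4, 2.0, 0.3, 0.8, 1.5, 0.5, 2.4],  # km of forest edge
})

model = smf.glm(
    "nymph_count ~ tree_species + shrub_cover + edge_length",
    data=df,
    family=sm.families.NegativeBinomial(),
).fit()
print(model.summary())
```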
Negative effects of temperature and atmospheric depositions on the seed viability of common juniper (Juniperus communis)
Background and Aims: Environmental change is increasingly impacting ecosystems worldwide. However, our knowledge about the interacting effects of various drivers of global change on the sexual reproduction of plants, one of their key mechanisms for coping with change, is limited. This study examines populations of the poorly regenerating and threatened common juniper (Juniperus communis) to determine the influence of four drivers of global change (rising temperatures, nitrogen deposition, potentially acidifying deposition, and changing precipitation patterns) on two key developmental phases during sexual reproduction, namely gametogenesis and fertilization (seed phase two, SP2) and embryo development (seed phase three, SP3), and on the ripening time of seeds.
Methods: In 42 populations throughout the distribution range of common juniper in Europe, 11,943 seeds of the two developmental phases were sampled. Seed viability was determined by seed dissection and related to accumulated temperature (expressed as growing degree-days), nitrogen and potentially acidifying deposition (nitrogen plus sulfur), and precipitation data.
Key Results: Precipitation had no influence on seed viability or ripening time. Increasing temperatures had a negative impact on the viability of SP2 and SP3 seeds and decreased the ripening time. Potentially acidifying depositions negatively influenced SP3 seed viability, while enhanced nitrogen deposition led to shorter ripening times.
Conclusions: Higher temperatures and atmospheric deposition affected SP3 seeds more than SP2 seeds. However, this is possibly a delayed effect, as juniper seeds develop largely independently owing to the absence of vascular communication with the parent plant from shortly after fertilization. It is proposed that the failure of natural regeneration in many European juniper populations might be attributed to climate warming as well as enhanced atmospheric deposition of nitrogen and sulfur.
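Accumulated temperature is expressed above as growing degree-days (GDD): the daily mean temperature above a base threshold, summed over the season. A minimal sketch of the standard computation; the 5 °C base temperature here is a common default for temperate species, not a value taken from the study:

```python
# Minimal sketch of the standard growing degree-days (GDD) formula:
# sum over days of max(0, (Tmin + Tmax)/2 - Tbase).
# The 5 °C base is an assumed default, not taken from the study.
def growing_degree_days(tmin: list[float], tmax: list[float], base: float = 5.0) -> float:
    """Sum of daily (mean temperature - base), floored at zero."""
    return sum(
        max(0.0, (lo + hi) / 2.0 - base)
        for lo, hi in zip(tmin, tmax)
    )

# Example: five days of minimum/maximum temperatures in °C
print(growing_degree_days([4, 6, 8, 3, 10], [12, 15, 18, 9, 22]))  # -> 28.5
```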
