The impact of large core and late treatment trials: An update on the modelled annual thrombectomy eligibility of UK stroke patients
© European Stroke Organisation 2024. Introduction: To support decisions about thrombectomy provision, we previously estimated the annual UK population eligible for treatment as approximately 10% of stroke admissions. Since then, eight further randomised trials that could alter the eligibility rate have reported in 2021-23. We updated our estimates of the eligible population using these trials and other recent studies. Patients and methods: An updated decision tree describing the EVT-eligible population among UK stroke admissions was produced. Decision criteria were derived from the highest level of evidence available. For nodes where no specific RCT data existed, evidence was obtained from the latest systematic review(s) or the highest-quality observational data. Results: We estimate that 15,420 (approximately 15%) of admitted UK stroke patients are now eligible for thrombectomy, or 14,930 if advanced brain imaging using MRI/CT perfusion or collateral assessment were used in all patients. This is a 54% increase on our previous 2021 estimate. Over 50% of large artery occlusion (LAO) strokes are now potentially eligible for thrombectomy. The increase in eligibility is principally due to a much larger cohort of later-presenting and/or larger-ischaemic-core patients. Conclusion: Most previously independent LAO stroke patients presenting within 24 h, even in the presence of a large ischaemic core on initial non-contrast CT, should be considered for thrombectomy, with advanced brain imaging used in those presenting beyond 12 h to identify salvageable penumbral brain tissue. Treatment in most patients remains critically time-dependent and our estimates should be interpreted with this in mind.
An observational study of patient characteristics associated with the mode of admission to acute stroke services in North East England
Objective
Effective provision of urgent stroke care relies upon admission to hospital by emergency ambulance and may involve pre-hospital redirection. The proportion and characteristics of patients who do not arrive by emergency ambulance, and their impact on service efficiency, are unclear. To assist in the planning of regional stroke services, we examined the volume, characteristics and prognosis of patients according to their mode of presentation to local services.
Study design and setting
A prospective regional database of consecutive acute stroke admissions was compiled in North East England between 01/09/2010 and 30/09/2011. Case ascertainment and transport mode were checked against hospital coding and ambulance dispatch databases.
Results
Twelve acute stroke units contributed data for a mean of 10.7 months. 2792/3131 (89%) patients received a diagnosis of stroke within 24 hours of admission: 2002 arrived by emergency ambulance, 538 by private transport or non-emergency ambulance, and 252 by unknown mode. Emergency ambulance patients were older (76 vs 69 years), more likely to be from institutional care (10% vs 1%) and more likely to be experiencing total anterior circulation symptoms (27% vs 6%). Thrombolysis treatment was more common following emergency admission (11% vs 4%). However, patients attending without an emergency ambulance had lower inpatient mortality (2% vs 18%), a lower rate of institutionalisation (1% vs 6%) and less need for daily carers (7% vs 16%). 149/155 (96%) of highly dependent patients were admitted by emergency ambulance, but none received thrombolysis.
Conclusion
Presentations of new stroke without emergency ambulance involvement were not unusual but were associated with a better outcome owing to younger age, milder neurological impairment and lower levels of pre-stroke dependency. Most patients with a high level of pre-stroke dependency arrived by emergency ambulance but did not receive thrombolysis. It is important to be aware of easily identifiable demographic groups that differ in their potential to gain from different service configurations.
Chest radiography practice in critically ill patients: a postal survey in the Netherlands
BACKGROUND: To ascertain current chest radiography practice in intensive care units (ICUs) in the Netherlands. METHODS: Postal survey: a questionnaire was sent to all ICUs with >5 beds suitable for mechanical ventilation; pediatric ICUs were excluded. When an ICU performed daily-routine chest radiographs in any group of patients, it was considered a "daily-routine chest radiography" ICU. RESULTS: Of the responding ICUs, 63% practiced a daily-routine strategy, in which chest radiographs are obtained on a daily basis without any specific reason. A daily-routine chest radiography strategy was practiced less frequently in university-affiliated ICUs (50%) than in other ICUs (68%), and in larger ICUs (>20 beds, 50%) than in smaller ICUs (<20 beds, 65%), although these differences were not statistically significant (P > 0.05). Remarkably, physicians who practice a daily-routine strategy consider daily-routine radiographs helpful in guiding daily practice in fewer than 30% of all performed radiographs. Chest radiographs are considered essential for verifying the position of invasive devices (81%) and for diagnosing pneumothorax, pneumonia or acute respiratory distress syndrome (82%, 74% and 69%, respectively). On-demand chest radiographs are obtained after introduction of thoracic drains, central venous lines and endotracheal tubes in 98%, 84% and 75% of responding ICUs, respectively. Chest films are also obtained in case of ventilatory deterioration (49% of responding ICUs), and after cardiopulmonary resuscitation (59%), tracheotomy (58%) and mini-tracheotomy (23%). CONCLUSION: There is a notable lack of consensus on chest radiography practice in the Netherlands. This survey suggests that many intensivists may doubt the value of daily-routine chest radiography but still practice a daily-routine strategy.
Degree correlations in directed scale-free networks
Scale-free networks, in which the distribution of the degrees obeys a power-law, are ubiquitous in the study of complex systems. One basic network property that relates to the structure of the links is the degree assortativity, which is a measure of the correlation between the degrees of the nodes at the ends of the links. Degree correlations are known to affect both the structure of a network and the dynamics of the processes supported thereon, including the resilience to damage, the spread of information and epidemics, and the efficiency of defence mechanisms. Nonetheless, while many studies focus on undirected scale-free networks, the interactions in real-world systems often have a directionality. Here, we investigate the dependence of the degree correlations on the power-law exponents in directed scale-free networks. To perform our study, we consider the problem of building directed networks with a prescribed degree distribution, providing a method for proper generation of power-law-distributed directed degree sequences. Applying this new method, we perform extensive numerical simulations, generating ensembles of directed scale-free networks with exponents between 2 and 3, and measuring ensemble averages of the Pearson correlation coefficients. Our results show that scale-free networks are on average uncorrelated across directed links for three of the four possible degree-degree correlations, namely in-degree to in-degree, in-degree to out-degree, and out-degree to out-degree. However, they exhibit anticorrelation between the number of outgoing connections and the number of incoming ones. The findings are consistent with an entropic origin for the observed disassortativity in biological and technological networks. Comment: 10 pages, 5 figures
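The generate-then-measure pipeline described above can be sketched in a few lines. This is a minimal illustration, not the authors' generation method: it assumes a plain stub-matching configuration model with a crude sum-balancing step, and computes only one of the four correlations (out-degree of source vs in-degree of target across directed links).

```python
import numpy as np

rng = np.random.default_rng(0)

def powerlaw_degrees(n, gamma, kmax):
    """Sample n integer degrees from P(k) ~ k^(-gamma), k = 1..kmax."""
    k = np.arange(1, kmax + 1)
    p = k.astype(float) ** (-gamma)
    p /= p.sum()
    return rng.choice(k, size=n, p=p)

def directed_configuration_edges(k_in, k_out):
    """Randomly match out-stubs to in-stubs (multi-edges/self-loops allowed)."""
    src = np.repeat(np.arange(len(k_out)), k_out)   # one entry per out-stub
    dst = np.repeat(np.arange(len(k_in)), k_in)     # one entry per in-stub
    rng.shuffle(dst)
    return src, dst

n = 20_000
k_in = powerlaw_degrees(n, gamma=2.5, kmax=int(np.sqrt(n)))
k_out = powerlaw_degrees(n, gamma=2.5, kmax=int(np.sqrt(n)))

# The configuration model needs equal numbers of in- and out-stubs;
# crudely patch the difference onto node 0.
diff = int(k_in.sum()) - int(k_out.sum())
if diff > 0:
    k_out[0] += diff
else:
    k_in[0] += -diff

src, dst = directed_configuration_edges(k_in, k_out)

# Out-in degree correlation across directed links: Pearson correlation of
# (out-degree of source, in-degree of target) over all edges.
r = np.corrcoef(k_out[src], k_in[dst])[0, 1]
print(f"out-in degree correlation: {r:.3f}")
```

In a study like the one summarised, this measurement would be repeated over an ensemble of realisations and exponents, and all four in/out degree-degree pairings would be averaged.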
Pre-hospital transdermal glyceryl trinitrate for transient ischaemic attack: Data from the RIGHT-2 trial
© 2024 The Author(s). European Journal of Neurology published by John Wiley & Sons Ltd on behalf of the European Academy of Neurology. Background and purpose: Ambulance trials assessing interventions in suspected stroke patients will recruit patients with currently active symptoms that later resolve into transient ischaemic attack (TIA). The safety and efficacy of glyceryl trinitrate (GTN) was assessed in the pre-specified subgroup of patients with TIA in the Rapid Intervention with Glyceryl Trinitrate in Hypertensive Stroke Trial 2 (RIGHT-2). Methods: RIGHT-2 was a pre-hospital-initiated, multicentre, randomized, sham-controlled, blinded-endpoint trial that randomized patients with presumed ultra-acute stroke within 4 h of symptom onset to transdermal GTN or sham. The final diagnosis was determined by site investigators. The primary outcome was a shift in modified Rankin Scale (mRS) scores at 90 days, analysed using ordinal logistic regression and reported as an adjusted common odds ratio with 95% confidence interval (CI). Secondary outcomes included death or dependence (mRS > 2). Results: In all, 109 of 1149 (9.5%) patients had a final diagnosis of TIA (GTN 57, sham 52), with mean age 73 (SD 13) years; 19 (17.4%) had pre-morbid mRS > 2, and onset to randomization was 80 min (interquartile range 49-105). GTN lowered blood pressure by 7.4/5.2 mmHg compared with sham by hospital arrival. At day 90, GTN had no effect on the shift in mRS scores (common odds ratio for increased dependence 1.47, 95% CI 0.70-3.11) but was associated with increased death or dependence (mRS > 2): GTN 29 (51.8%) versus sham 23 (46.9%), odds ratio 3.86 (95% CI 1.09-13.59). Conclusions: Pre-hospital ultra-acute transdermal GTN did not improve overall functional outcome in patients with investigator-diagnosed TIA compared with sham treatment.
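The secondary outcome above dichotomizes the mRS at >2 (death or dependence), which is conventionally summarised as an odds ratio from a 2x2 table. The sketch below shows the standard unadjusted odds ratio with a Wald 95% CI; the counts are illustrative only, and this simple calculation cannot reproduce the trial's adjusted odds ratio of 3.86.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with Wald 95% CI for a 2x2 table:
    a = treated events, b = treated non-events,
    c = control events, d = control non-events."""
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)      # standard error of log(OR)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only, not the trial's adjusted analysis.
or_, lo, hi = odds_ratio_ci(29, 27, 23, 26)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An adjusted odds ratio, as reported by the trial, would instead come from a logistic (or, for the primary shift outcome, ordinal logistic) regression with baseline covariates.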
Investigating Unique Environmental Contributions to the Neural Representation of Written Words: A Monozygotic Twin Study
The visual word form area (VWFA) is a region of left inferior occipitotemporal cortex that is critically involved in visual word recognition. Previous studies have investigated whether and how experience shapes the functional characteristics of the VWFA by comparing the magnitude of neural responses to words and nonwords. Conflicting results have been obtained, however, perhaps because response magnitude can be influenced by other factors such as attention. In this study, we measured neural activity in monozygotic twins using functional magnetic resonance imaging. This allowed us to quantify differences in unique environmental contributions to neural activation evoked by words, pseudowords, consonant strings, and false fonts in the VWFA and striate cortex. The results demonstrate significantly greater effects of unique environment in the word and pseudoword conditions than in the consonant string and false font conditions, both in the VWFA and in left striate cortex. These findings provide direct evidence for environmental contributions to the neural architecture for reading, and suggest that learning phonology and/or orthographic patterns plays the biggest role in shaping that architecture.
Robust and Task-Independent Spatial Profile of the Visual Word Form Activation in Fusiform Cortex
Written language represents a special category of visual information. There is strong evidence for the existence of a cortical region in ventral occipitotemporal cortex for processing the visual form of written words. However, due to inconsistent findings obtained with different tasks, the level of specialization and selectivity of this so-called visual word form area (VWFA) remains debated. In this study, we examined category selectivity for Chinese characters, a non-alphabetic script, in native Chinese readers. In contrast to traditional approaches that examine response levels in a restricted, predefined region of interest (ROI), a detailed distribution of the BOLD signal across the mid-fusiform cortical surface and the spatial patterns of responses to Chinese characters were obtained. Results show that a region tuned for Chinese characters could be consistently found in the lateral part of the left fusiform gyrus in Chinese readers, and that this spatial pattern of selectivity for written words was not influenced by top-down tasks such as phonological or semantic modulations. These results provide strong support for the robust spatial coding of category-selective responses in the mid-fusiform cortex, and demonstrate the utility of spatial distribution analysis as a more meaningful approach to examining functional magnetic resonance imaging (fMRI) data.
Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?
© 2013 Martin et al.; licensee BioMed Central Ltd. This article has been made available through the Brunel Open Access Publishing Fund. Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions.
Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can account both for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic, but scientifically better-founded, approach to mixture risk assessment.
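The point that probabilistic combination of sub-factors depends on the chosen distributions can be illustrated with a small Monte Carlo experiment. This is a sketch under my own assumptions, not the authors' analysis: two hypothetical sub-factors, each with median 10, are multiplied under two different distributional choices, and the upper percentile of the product (the quantity that governs how conservative the combined factor is) differs between them.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Two sub-factors combined by multiplication, each with median 10, under two
# distributional assumptions: (a) lognormal, (b) uniform on [1, 19].
# These distributions and parameters are illustrative, not from the article.
logn = rng.lognormal(mean=np.log(10), sigma=0.5, size=(2, n)).prod(axis=0)
unif = rng.uniform(1, 19, size=(2, n)).prod(axis=0)

# Same medians going in, but the tail of the product depends on the choice.
for name, prod in [("lognormal", logn), ("uniform", unif)]:
    print(f"{name}: 95th percentile of product = {np.percentile(prod, 95):.1f}")
```

Because regulatory conservatism is judged at the tail rather than at the median, two analysts using the same nominal sub-factors but different distributional assumptions can reach materially different combined factors.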