Crime, Trade Marks, and Soft Trade Policy in the Inter-War Era: Market Realities and the Merchandise Marks Act 1926
In this paper we explore a specific facet of the relationship between trade marks and the criminal law in the UK in the early twentieth century. Fusing insights from legal, business and economic history, we show how, in the inter-war years of the twentieth century, the context of domestic politics and wider international trade policy resulted in a greater focus on the relationship between trade marks and market-place understandings of the national manufacturing origin of products. This context resulted, amongst other things, in the passage of the Merchandise Marks Act 1926: a criminal law measure that stipulated the circumstances in which imported goods were to be marked with an indication of the place of manufacture/production (either a definite indication of the country of origin, or ‘Empire’ or ‘Foreign’) and included, in section 1, a criminal offence regulating the use of, inter alia, a trade mark ‘being or purporting to be’ the trade mark of a UK trader. We show that, during the inter-war period, the criminal law regulating trade marks became entwined with ‘soft’ trade policy, i.e. a means of protecting the domestic/Empire market falling short of ‘hard’ trade policy (protection/tariffs). The proper role of the criminal law regulating trade marks in international trade policy polarised political debate in the legislature, and also involved major commercial and manufacturing organisations, such as the Federation of British Industry, and various chambers of commerce. Using new archival sources, we show how the meaning of the 1926 Act was actualised through its enforcement by the Board of Trade. We explore the problems that confronted the Board of Trade when it enforced the 1926 Act, particularly those stemming from the substantial presence of foreign multinationals located in the UK.
The 1926 Act did not apply to imported goods that had undergone a ‘substantial change’ through a treatment/process in the UK, and the difficulties in applying this test in practice ultimately led to the demise of Board of Trade prosecutions under the 1926 Act, and a renewed focus, on the part of the Board, on its powers to prosecute ‘false trade descriptions’ contained in earlier legislation: section 2(1) of the Merchandise Marks Act 1887.
Non-flushing of IV administration sets: an under-recognised under-dosing risk
Background:
Intravenous (IV) drugs are administered widely, and under-dosing can result in therapy failure. The aim of this study was to quantify the frequency, volume and dose of drug discarded within administration sets in the clinical setting.
Methods:
Residual volumes for 24 different administration sets were measured under controlled conditions in a laboratory. Clinical assessment of current practice regarding post-infusion flushing took place in 6 departments of one teaching hospital in the UK over 7 days. Details of the drug last infused (concentration, diluent and volume) and the type and brand of administration set were collected.
Results:
74% of administration sets were not flushed. Non-flushing exceeded 90% and 61% for gravity and pump infusions respectively (p<0.001) in all areas excluding oncology. Oncology was the only area where flushing was standard practice for all infusions (p<0.001). Mean residual volume of the administration sets was 13.1 ml and 16.7 ml for gravity and pump sets respectively. Antibiotics were commonly infused, and up to 21% of the antibiotic dose was frequently discarded.
Conclusions:
The findings suggest that disposal of substantial volumes of drugs occurs frequently in general hospital areas. Without clear national and local policies, this unrecognised under-dosing will continue.
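The under-dosing arithmetic described above is simple: if a set is not flushed, the drug held in its residual volume is discarded. A minimal sketch using the mean residual volumes reported above (the 100 ml infusion volume is a hypothetical example, not a value from the study):

```python
def discarded_fraction(residual_ml: float, infusion_ml: float) -> float:
    """Fraction of a fully mixed infusion left in the set if it is not flushed."""
    return residual_ml / infusion_ml

# Mean residual volumes reported in the study above.
GRAVITY_RESIDUAL_ML = 13.1
PUMP_RESIDUAL_ML = 16.7

# For a hypothetical 100 ml antibiotic infusion:
print(f"{discarded_fraction(GRAVITY_RESIDUAL_ML, 100):.1%}")  # 13.1%
print(f"{discarded_fraction(PUMP_RESIDUAL_ML, 100):.1%}")     # 16.7%
```

Smaller infusion volumes make the discarded fraction proportionally larger, which is consistent with the "up to 21%" figure reported for some antibiotic doses.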
Preliminary evaluation of a variable compliance joystick for people with multiple sclerosis
Upper-limb fatigue is a common problem that may restrict people with multiple sclerosis (MS) from using their electric powered wheelchair effectively and for long periods of time. The objective of this research is to evaluate whether participants with MS can drive better with a variable compliance joystick (VCJ) and customizable algorithms than with a conventional wheelchair joystick. Eleven participants were randomly assigned to one of two groups. The groups used the VCJ in either compliant or noncompliant isometric mode with a standard algorithm, a personally fitted algorithm, or a personally fitted algorithm with fatigue adaptation running in the background to complete virtual wheelchair driving tasks. Participants with MS showed better driving performance metrics while using the customized algorithms than while using the standard algorithm with the VCJ. Fatigue adaptation algorithms are especially beneficial in improving overall task performance while using the VCJ in isometric mode. The VCJ, along with the personally fitted algorithms and fatigue adaptation algorithms, has the potential to be an effective input interface for wheelchairs.
Diel turbidity cycles in a headwater stream: evidence of nocturnal bioturbation?
Purpose: A small number of recent studies have linked daily cycles in stream turbidity to nocturnal bioturbation by aquatic fauna, principally crayfish, and demonstrated that this process can significantly impact water quality under baseflow conditions. Adding to this limited body of research, we use high-resolution water quality monitoring data to investigate evidence of diel turbidity cycles in a lowland, headwater stream with a known signal crayfish (Pacifastacus leniusculus) population and explore a range of potential causal mechanisms. Materials and methods: Automatic bankside monitoring stations measured turbidity and other water quality parameters at 30-min resolution at three locations on the River Blackwater, Norfolk, UK during 2013. Specifically, we focused on two 20-day periods of baseflow conditions during January and April 2013 which displayed turbidity trends typical of the winter and spring seasons, respectively. The turbidity time-series, which were smoothed with 6.5-hour Savitzky-Golay filters to highlight diel trends, were correlated against temperature, stage, dissolved oxygen and pH to assess the importance of abiotic influences on turbidity. Turbidity was also calibrated against suspended particulate matter (SPM) over a wide range of values via linear regression. Results and discussion: Pronounced diel turbidity cycles were found at two of the three sites under baseflow conditions during April. Spring night-time turbidity values consistently peaked between 21:00 and 04:00, with values increasing by ~10 nephelometric turbidity units (NTU) compared with the lowest recorded daytime values, which occurred between 10:00 and 14:00. This translated into statistically significant increases in median midnight SPM concentration of up to 76% compared with midday, with night-time (18:00 – 05:30) SPM loads also up to 30% higher than those recorded during the daytime (06:00 – 17:30).
Relating turbidity to other water quality parameters exhibiting diel cycles revealed neither any correlation that might indicate a causal link nor any obvious mechanistic connection to explain the temporal turbidity trends. Diel turbidity cycles were less prominent at all sites during the winter. Conclusions: Considering the seasonality and timing of elevated turbidity, visual observations of crayfish activity, and the absence of mechanistic connections with other water quality parameters, the results presented here are consistent with the hypothesis that nocturnal bioturbation is responsible for generating diel turbidity cycles under baseflow conditions in headwater streams. However, further research in a variety of fluvial environments is required to better assess the spatial extent, importance and causal mechanisms of this phenomenon.
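The smoothing step in the methods can be sketched as follows. At 30-min resolution a 6.5-hour Savitzky-Golay window spans 13 samples; the turbidity record here is synthetic (a ~10 NTU peak-to-trough diel cycle plus noise), standing in for the River Blackwater data, which is not reproduced in the abstract:

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic stand-in for a 20-day, 30-min turbidity record (NTU):
# a 24 h cycle of ~10 NTU peak-to-trough plus random noise.
rng = np.random.default_rng(42)
t_hours = np.arange(0, 20 * 24, 0.5)  # 30-min resolution over 20 days
turbidity = (20
             + 5 * np.sin(2 * np.pi * t_hours / 24)
             + rng.normal(0, 2, t_hours.size))

# 6.5-hour Savitzky-Golay filter = 13 samples at 30-min resolution.
smoothed = savgol_filter(turbidity, window_length=13, polyorder=2)

# The filter suppresses high-frequency noise while preserving the
# slow 24 h cycle, making diel trends easier to see.
print(smoothed.shape == turbidity.shape)      # True
print(np.std(smoothed) < np.std(turbidity))   # True
```

The window is deliberately much shorter than the 24 h period of interest, so the diel signal passes through the filter almost unattenuated while sub-hourly noise is damped.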
Cost-Effectiveness of PHMB & betaine wound bed preparation compared with standard care in venous leg ulcers: A cost-utility analysis in the United Kingdom
Background
Wounds cost the United Kingdom (UK) £8.3 billion per year. Venous leg ulcers (VLUs) account for 15% of wounds and can be difficult to heal, increasing nurse visits and resource costs. Recent wound bed preparation consensus recommends wound cleansing and biofilm-disrupting agents. However, because inert cleansers such as tap water or saline are inexpensive, an evaluation of the evidence is required to justify the higher upfront costs of treatment with active cleansers. We undertook a cost-effectiveness analysis of a biofilm-disrupting and cleansing solution and gel, Prontosan® Solution and Gel X (PSGX) (B Braun Medical), compared to the standard practice of using saline solution, for treating VLUs.
Methods
A Markov model was parameterised to the one-year costs and health-related quality of life consequences of treating chronic VLUs with PSGX versus saline solution. Costs are viewed from a UK healthcare payer perspective and include routine care and management of complications. A systematic literature search was performed to inform the clinical parameters of the economic model. Deterministic univariate sensitivity analysis (DSA) and probabilistic sensitivity analysis (PSA) were undertaken.
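As a rough illustration of the modelling approach (not the published model), a one-year Markov cohort model for a single treatment arm can be sketched with two states, unhealed and healed. Every parameter below is a hypothetical placeholder, not a value from the study:

```python
# Two-state weekly Markov cohort model: 'unhealed' -> 'healed' (absorbing).
# All parameters are illustrative placeholders, not the published inputs.
P_HEAL_PER_WEEK = 0.03          # hypothetical weekly healing probability
COST_UNHEALED_PER_WEEK = 60.0   # hypothetical weekly care cost (GBP)
UTILITY = {"unhealed": 0.60, "healed": 0.80}  # hypothetical utilities

def run_model(weeks: int = 52) -> tuple[float, float]:
    """Return (total cost, total QALYs) per patient over the horizon."""
    unhealed, healed = 1.0, 0.0   # cohort proportions in each state
    total_cost, total_qalys = 0.0, 0.0
    for _ in range(weeks):
        newly_healed = unhealed * P_HEAL_PER_WEEK
        unhealed -= newly_healed
        healed += newly_healed
        total_cost += unhealed * COST_UNHEALED_PER_WEEK
        # Weekly utility contribution, annualised to QALYs.
        total_qalys += (unhealed * UTILITY["unhealed"]
                        + healed * UTILITY["healed"]) / 52
    return total_cost, total_qalys

cost, qalys = run_model()
```

Comparing two arms would mean running this with arm-specific healing probabilities and adding the cleanser's upfront cost, then differencing costs and QALYs to get the incremental results reported below.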
Results
PSGX yielded an Incremental Net Monetary Benefit (INMB) of £1,129.65 to £1,042.39 per patient (at a Maximum Willingness to Pay of £30k and £20k per QALY respectively), comprising cost savings of £867.87 and a gain of 0.0087 quality-adjusted life years (QALYs) per patient. PSA indicates a 99.3% probability of PSGX being cost-effective over saline.
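The INMB figures decompose as INMB = λ × ΔQALYs + cost savings. A quick check with the rounded values reported above (the small discrepancy from £1,129.65/£1,042.39 is consistent with the QALY gain having been rounded to 0.0087 for reporting):

```python
delta_qalys = 0.0087   # QALY gain per patient (rounded, as reported)
cost_saving = 867.87   # cost saving per patient (GBP)

for wtp in (30_000, 20_000):  # maximum willingness to pay per QALY (GBP)
    inmb = wtp * delta_qalys + cost_saving
    print(f"lambda = {wtp}: INMB ~ {inmb:.2f}")  # 1128.87 and 1041.87
```

Back-solving from the reported INMBs gives an unrounded QALY gain of about 0.00873 at both thresholds, so the reported figures are internally consistent.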
Conclusions
PSGX for the treatment of VLUs is dominant compared with saline solution in the UK, with expected cost savings within a year and improved patient outcomes.
Buttressing staples with cholecyst-derived extracellular matrix (CEM) reinforces staple lines in an ex vivo peristaltic inflation model
This is the author's accepted manuscript. The final published article is available from the link below. Copyright © Springer Science + Business Media, LLC 2008
Background - Staple line leakage and bleeding are the most common problems associated with the use of surgical staplers for gastrointestinal resection and anastomotic procedures. These complications can be reduced by reinforcing the staple lines with buttressing materials. The current study reports the potential use of cholecyst-derived extracellular matrix (CEM) in non-crosslinked (NCEM) and crosslinked (XCEM) forms, and compares their mechanical performance with clinically available buttress materials [small intestinal submucosa (SIS) and bovine pericardium (BP)] in an ex vivo small intestine model.
Methods - Three crosslinked CEM variants (XCEM0005, XCEM001, and XCEM0033) with different degrees of crosslinking were produced. An ex vivo peristaltic inflation model was established. Porcine small intestine segments were stapled on one end, using buttressed or non-buttressed surgical staplers. The opened, non-stapled ends were connected to a peristaltic pump and pressure transducer and sealed. The staple lines were then exposed to increased intraluminal pressure in a peristaltic manner. Both the leak and burst pressures of the test specimens were recorded.
Results - The leak pressures observed for NCEM (137.8 ± 22.3 mmHg), XCEM0005 (109.1 ± 14.1 mmHg), XCEM001 (150.1 ± 16.0 mmHg), and XCEM0033 (98.8 ± 10.5 mmHg) reinforced staple lines were significantly higher than those of non-buttressed control (28.3 ± 10.8 mmHg) and SIS-buttressed (one and four layers; 62.6 ± 11.8 and 57.6 ± 12.3 mmHg, respectively) staple lines, and comparable to those observed for BP-buttressed staple lines (138.8 ± 3.6 mmHg). Only specimens with reinforced staple lines were able to achieve high intraluminal pressures (rupturing at the intestinal mesentery), indicating that the buttress reinforcements were able to withstand pressures higher than natural tissue (physiological failure).
Conclusions - These findings suggest that the use of CEM and XCEM as buttressing materials is associated with reinforced staple lines and increased leak pressures when compared to non-buttressed staple lines. CEM and XCEM were found to perform comparably with clinically available buttress materials in this ex vivo model.
The influence of task demands, verbal ability and executive functions on item and source memory in Autism Spectrum Disorder
Autism Spectrum Disorder (ASD) is generally associated with difficulties in contextual source memory but not single item memory. There are surprising inconsistencies in the literature, however, that the current study seeks to address by examining item and source memory in age- and ability-matched groups of 22 ASD and 21 comparison adults. Results show that group differences in source memory are moderated by task demands but not by individual differences in verbal ability, executive function or item memory. By contrast, unexpected group differences in item memory could largely be explained by individual differences in source memory. These observations shed light on the factors underlying inconsistent findings in the memory literature in ASD, which has important implications for theory and practice.
A biophysical model of cell adhesion mediated by immunoadhesin drugs and antibodies
A promising direction in drug development is to exploit the ability of
natural killer cells to kill antibody-labeled target cells. Monoclonal
antibodies and drugs designed to elicit this effect typically bind cell-surface
epitopes that are overexpressed on target cells but also present on other
cells. Thus it is important to understand adhesion of cells by antibodies and
similar molecules. We present an equilibrium model of such adhesion,
incorporating heterogeneity in target cell epitope density and epitope
immobility. We compare with experiments on the adhesion of Jurkat T cells to
bilayers containing the relevant natural killer cell receptor, with adhesion
mediated by the drug alefacept. We show that a model in which all target cell
epitopes are mobile and available is inconsistent with the data, suggesting
that more complex mechanisms are at work. We hypothesize that the immobile
epitope fraction may change with cell adhesion, and we find that such a model
is more consistent with the data. We also quantitatively describe the parameter
space in which binding occurs. Our results point toward mechanisms relating
epitope immobility to cell adhesion and offer insight into the activity of an
important class of drugs.
Comment: 13 pages, 5 figures
Clinical risk factor assessment had better discriminative ability than bone mineral density in identifying subjects with vertebral fracture
Summary: This study evaluated the characteristics of patients with vertebral fractures and examined the discriminative ability of clinical risk factors. The findings provide further insights into the possible development of a simple, cost-effective scheme for fracture risk assessment using clinical risk factors to identify high-risk patients for further evaluation. Introduction: Vertebral fractures are the most common complication of osteoporosis. The aim of this study was to evaluate the characteristics of patients with vertebral fractures and to determine the discriminative ability of bone mineral density (BMD) and other clinical risk factors. Methods: A total of 2,178 postmenopausal Southern Chinese women enrolled in the Hong Kong Osteoporosis Study since 1995 were prospectively followed up for fracture outcome. Of these, 1,372 subjects with lateral spine radiographs were included in this study. Baseline demographic, BMD, and clinical risk factor information was obtained from a structured questionnaire. Results: 299 subjects (22%) had prevalent vertebral fractures. The prevalence of vertebral fractures increased with increasing age, increasing number of clinical risk factors, and decreasing BMD. The odds of having a prevalent vertebral fracture per SD reduction in BMD, after adjustment for age, were 1.5 for both the lumbar spine and the femoral neck in Hong Kong Southern Chinese postmenopausal women. Analysis of the receiver operating characteristic curve revealed that bone mineral apparent density did not enhance fracture risk prediction. Subjects with ≥4 clinical risk factors had 2.3-fold higher odds of having a prevalent vertebral fracture, while subjects with ≥4 clinical risk factors plus a low BMD (i.e., femoral neck T-score <-2.5) had 2.6-fold higher odds. Addition of BMD to clinical risk factors did not enhance the discriminative ability to identify subjects with vertebral fracture.
Conclusions: Based on these findings, we recommend that screening efforts should focus on older postmenopausal women with multiple risk factors to identify women who are likely to have a prevalent vertebral fracture. © 2010 The Author(s). Springer Open Choice, 21 Feb 201