The triple decomposition of a fluctuating velocity field in a multiscale flow
A new method for the triple decomposition of a multiscale flow, based on the novel optimal mode decomposition (OMD) technique, is presented. OMD provides the low-order linear dynamics that best fit a given data set and is used to distinguish between the coherent (periodic) part of a flow and the stochastic fluctuation. The method needs no external phase indicator, since the phase information, separate for the coherent structures associated with each length scale introduced into the flow, appears as an output. The proposed technique is compared against two traditional methods of triple decomposition, i.e., bin averaging and proper orthogonal decomposition, using particle image velocimetry data documenting the near wake of a multiscale bar array. It is shown that both traditional methods are unable to provide a reliable estimate of the coherent fluctuation, while the proposed technique performs very well. The crucial result is that coherence peaks are not observed within the spectral properties of the stochastic fluctuation derived with the proposed method, while these properties remain unaltered at the residual frequencies. This demonstrates the method's capability of distinguishing between the two types of fluctuation. Sensitivity to some prescribed parameters is checked, revealing the technique's robustness. Additionally, an example application of the method to the analysis of a multiscale flow is given, i.e., the phase-conditioned transverse integral length is investigated in the near wake region of the multiscale object array.
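For orientation, the sketch below (Python, with invented numbers) shows the bin-averaged triple decomposition that the paper uses as a baseline: the velocity record is split into a time mean, a phase-averaged coherent part, and a stochastic residual. The OMD-based method infers the phase from the data itself; this sketch assumes the shedding frequency is known and uses it as the phase reference.

import numpy as np

# Synthetic velocity record: mean + coherent (periodic) + stochastic parts.
rng = np.random.default_rng(0)
fs, f_shed, n = 1000.0, 25.0, 20000          # sampling rate, shedding freq, samples
t = np.arange(n) / fs
u = 5.0 + 0.8 * np.sin(2 * np.pi * f_shed * t) + 0.3 * rng.standard_normal(n)

u_mean = u.mean()                             # time average U
phase = (2 * np.pi * f_shed * t) % (2 * np.pi)

# Bin-average over phase to estimate the coherent fluctuation at each phase.
n_bins = 32
bins = np.floor(phase / (2 * np.pi) * n_bins).astype(int)
phase_avg = np.array([u[bins == b].mean() for b in range(n_bins)])

u_tilde = phase_avg[bins] - u_mean            # coherent part, sampled back in time
u_prime = u - u_mean - u_tilde                # stochastic residual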
From bench to bedside: Tracing the payback forwards from basic or early clinical research – A preliminary exercise and proposals for a future study
EXECUTIVE SUMMARY
Chapter 1 : Introduction
• The members of the research team from HERG and the Wellcome Trust have conducted previous studies showing that it is possible both to assess the payback from applied health research, and to use bibliometrics to trace the links between generations of research and clinical guidelines. In another of the team’s studies, however, it proved difficult to replicate the major study by Comroe and Dripps (1976) that had identified clinical advances and then worked backwards to show that they had relied on earlier basic research. Therefore, the study reported here sets out to use the methods developed in our previous studies of payback to undertake analysis that starts with more basic or early clinical research and traces the research lines forwards to clinical applications. Whilst this preliminary study involved preparation for a future large-scale study, it was hoped that it would also provide an interesting case study.
• Starting with the research outputs of one team 20 years ago, called the 1st generation papers, the preliminary study has three main elements: standard bibliometric analysis through several generations of papers; categorisation of the citations; and qualitative analysis using questionnaires, critical pathway analysis and interviews to trace the impact of the 1st generation of research.
• Diabetes and cardiology were suggested as possible topics on which to base the study. Initial reviews identified two bodies of research in diabetes as being potentially suitable for reasons such as the continuing activity of key members of the team.
• The research into diabetes conducted in 1981 by George Alberti and his team at Newcastle, and collaborators elsewhere, was selected to provide the case study for this preliminary stage for several reasons. It was thought to have been important science and there was a belief that some of it had made a contribution to clinical practice.
Chapter 2 : Bibliometric analysis
• An original plan to look at publications produced over a three-year period was changed to looking at the output of just one year, 1981, because in that year alone Alberti and colleagues published 29 articles. These form the 1st generation papers, and the average number of citations they received is high. Identifying the citations given to these 29 papers resulted in 799 2nd generation papers and 12,891 3rd generation papers; the numbers involved meant that it was impractical to go beyond the 3rd generation. Within the high overall average, the variation in the number of citations per paper was considerable, ranging from 76 to just one. Similarly, the half-lives of the 29 papers, ie the time taken for an article to receive 50% of its citations, ranged from two years to 11 (a worked example of the half-life calculation follows this chapter's bullet points).
• Articles can be given a Research Level (ie one of four levels from clinical observation to basic) based on the journals in which they appear. Such analysis demonstrates the breadth of Alberti’s work because the 29 articles are spread across all four Research Levels. Crucially, there was not a shift from basic to more clinical levels across the generations. The higher than average number of authors and addresses per paper is testimony to Alberti’s extensive collaborations.
• The funding acknowledgements reveal the high proportion of papers supported, at least partially, by one funder: the British Diabetic Association, now Diabetes UK, which provided core support for Alberti’s Newcastle team.
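To make the half-life measure concrete, here is a minimal sketch in Python; the citation years are invented for illustration and are not data from the study.

import numpy as np

def citation_half_life(pub_year, citing_years):
    """Years until an article accrues 50% of its observed citations."""
    years = np.sort(np.asarray(citing_years))
    k = int(np.ceil(0.5 * len(years))) - 1   # index of the citation crossing 50%
    return years[k] - pub_year

# Illustrative record for one 1981 paper with ten citations.
print(citation_half_life(1981, [1982, 1982, 1983, 1983, 1984,
                                1985, 1987, 1990, 1994, 2001]))
# -> 3 (half of the citations arrived within three years)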
Chapter 3 : Categorisation of citations
• Traditional citation analysis does not allow identification of the importance of the cited article to the citing article, and therefore limits the ability to use citation analysis to trace the impact of basic or early research on later research. We conducted a review of the literature on the meaning of citations.
• From this review, a template was devised that allowed the location, nature and importance of citations to be recorded as well as the type of research (basic or clinical) described in the paper. This was used by six assessors on a sample of papers and inter-rater reliability was tested. Further work is required to refine the template and its definitions, and to improve its consistency in application.
• Nevertheless, for initial analysis, it was applied to 623 out of the 799 2nd generation papers. A four point scale was used for the importance of the cited paper to the citing paper. In just 9% of cases was the cited 1st generation paper thought to be in one of the top two categories, ie of Considerable or Essential importance to the citing paper.
• Statistical analysis revealed no relationship between the number of citations a paper received and the proportion of citations where the cited paper was classified as being of high (ie Considerable or Essential) importance to the citing paper. Self-citations, however, were shown to be significantly more likely to be in this category (a sketch of this kind of correlation test follows this chapter's bullet points).
• The classification of the type of research (basic or clinical) by our analysis of each paper broadly agreed with the classification of the journals by Research Level.
• The time constraints involved in applying the template, plus the lack of any overall pattern in terms of correlations between number and importance of citations, might point to the desirability of adopting a more selective approach, guided by qualitative analysis. In any selective approach, however, it is likely that self-citations should feature.
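By way of illustration only, the sketch below (Python, with hypothetical per-paper data) computes a rank correlation between a paper's citation count and the fraction of its citations graded Considerable or Essential. The study does not specify its statistical procedure, so the use of Spearman's rho here is an assumption.

import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-paper records: total citations received, and the fraction
# of those citations graded in the top two importance categories.
citations = np.array([76, 54, 41, 33, 20, 15, 9, 4, 2, 1])
frac_high = np.array([0.05, 0.12, 0.08, 0.10, 0.00, 0.13, 0.11, 0.25, 0.00, 0.00])

rho, p = spearmanr(citations, frac_high)
print(f"Spearman rho = {rho:.2f}, p = {p:.2f}")   # no significant association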
Chapter 4 : Qualitative analysis
• Given the number of co-authors, it seemed appropriate to send them a questionnaire rather than attempt to interview them. Therefore the interviewing was rather more concentrated than originally intended. Only one formal critical pathway was created, but it was undertaken by an expert in the field who worked with Alberti at Newcastle.
• Some problems emerged in taking 1981 as the starting point for the study. Alberti identified 10 selected papers from the 1970s and 1980s that he felt had had most impact on clinical practice. These helped to give us both a better understanding of the payback from our 1st generation, or 1981, papers, and provided further material for analysis.
• Attempting to describe the impact from the 1981 body of work, and from the 10 selected papers, underlines the complex reality of how science advances and influences clinical practice. If they make a contribution at all, most studies make a small, incremental one.
• A few papers, however, have been shown to have a considerably greater impact. A possible key to the level of payback indicated is the enormous breadth of Alberti’s contacts, and fields and methods of working, to which various references were made. This is well illustrated in the account of how the idea for subcutaneous pumps came about. Similarly, the ability to produce the very important guidelines on treating diabetics during surgery, and diabetic coma, partly resulted from the application to clinical problems of the understandings gained from some of the basic/early clinical studies. It is significant that the key papers on these issues, all of which come from the list of 10 selected papers from the 1970s and 1980s, were having an impact on the 1981 work.
• How far the collection of papers from 1981 have been drawn upon in similar ways is less clear. Nevertheless, papers on treating diabetics during open heart surgery, and on bolus delivery of insulin at meal times, were key parts of these wider streams, despite variable citation levels. Furthermore, various papers, including on acarbose, on portal infusion of insulin, and on semi-human insulin, were important steps in bodies of work in their respective areas. The complexity was illustrated by a paper that helped debunk the Chlorpropamide alcohol flushing hypothesis, and thus end a line of scientific enquiry: there was payback in stopping an incorrect line of inquiry, but nothing on which to build.
• Each technique in the qualitative study produced information about the successful subsequent careers followed by many researchers trained through working with Alberti.
• Historical perspectives, and insider expert opinions, were important in the qualitative analysis. Overall, the qualitative methods highlighted some limitations in the bibliometric approach but also showed how aspects of the citation analysis can complement the opinions expressed, for example about the importance of the breadth of Alberti’s work.
Chapter 5 : Lessons learnt and the way forward
• Lessons learnt: a variety of methods can be used successfully to gather considerable data about the payback from a body of research undertaken 20 years ago. Traditional citation analysis alone, however, is not sufficient: the importance of the surgery papers despite their relatively low citation rates illustrates this. The qualitative methods are important and much of the analysis is strengthened by drawing on multiple approaches. Several problems remain, including: identifying a coherent starting point for the analysis; coping with the enormous number of papers involved in later generations; and refining the template for categorising citations and developing ways of fully utilising the results from applying it.
• Preparing for the large-scale study: this preliminary study provides a basis on which to attempt to undertake the larger study we envisaged. Issues now being addressed include identification of the level of bibliometric/citation analysis necessary to complement any qualitative studies. To provide confidence in the findings from an eventual large-scale study, we will need to expand the focus. The study will need to cover at least four sets of case studies. Ideally, each set should focus on a number of research groups working in a country in the same field. We hope there will be sets of case studies in two or three fields and in at least two countries. The issues to be explored will include ones highlighted by this study such as breadth of work, level of collaboration, and the role of core funding.
• Methods for the large-scale study: for each case study we now propose to employ two methodological elements based on the qualitative and quantitative techniques adopted in the preliminary study. They will work in parallel but the quantitative bibliometric analysis would be applied selectively to parts of ‘research lines’ (ie discrete themes of research) identified in the qualitative studies as being important in influencing clinical practice.
• Presenting the findings: each research line could be written up in a standardised document that would use the HERG payback model and categories to describe the impact of that research. We shall use the qualitative and quantitative data to compare and contrast the ‘payback’ of research lines by country and disease, and then identify common factors that correlate with the translation of basic or early clinical research.
• Concluding comments: in the era of ‘evidence based policy’, research funders are looking for value for money in the research they support and for evidence on the effectiveness of different research strategies. In this study we have begun developing a methodology that will allow us to understand the complexity of research development over a series of generations. The utility of the policy research we propose here will only be realised when it is scaled up to cover a number of different fields in different settings.
NHS Executive, London Region
Discovery of a supernova associated with GRB 031203: SMARTS Optical-Infrared Lightcurves from 0.2 to 92 days
Optical and infrared monitoring of the afterglow site of gamma-ray burst (GRB) 031203 has revealed a brightening source embedded in the host galaxy, which we attribute to the presence of a supernova (SN) related to the GRB ("SN 031203"). We present details of the discovery and evolution of SN 031203 from 0.2 to 92 days after the GRB, derived from SMARTS consortium photometry in the I and J bands. A template Type Ic lightcurve, constructed from SN 1998bw photometry, is consistent with the peak brightness of SN 031203, although the lightcurves are not identical. Differential astrometry reveals that the SN, and hence the GRB, occurred less than 300 h_71^-1 pc (3-sigma) from the apparent galaxy center. The peak of the supernova is brighter than the optical afterglow, suggesting that this source is intermediate between a strong GRB and a supernova.
Comment: 11 pages, 3 figures, submitted to ApJ Letters
Optimization of triangular airfoils for Martian helicopters using direct numerical simulations
Mars has a lower atmospheric density than Earth, and the speed of sound is lower due to the planet's atmospheric composition and lower surface temperature. Consequently, Martian rotor blades operate in a low-Reynolds-number compressible regime that is atypical for terrestrial helicopters. Nonconventional airfoils with sharp edges and flat surfaces have shown improved performance under such conditions, and second-order-accurate Reynolds-averaged Navier–Stokes (RANS) and unsteady RANS (URANS) solvers have been combined with genetic algorithms to optimize them. However, flow over such airfoils is characterized by unsteady roll-up of coherent vortices that subsequently break down and transition. Accordingly, RANS/URANS solvers have limited predictive capability, especially at higher angles of attack, where the aforementioned physics are more pronounced. To overcome this limitation, we undertake optimization using high-order direct numerical simulations (DNSs). Specifically, a triangular airfoil is optimized using DNSs. Multi-objective optimization is performed to maximize lift and minimize drag, yielding a Pareto front. Various quantities, including lift spectra and pressure distributions, are analyzed for airfoils on the Pareto front to elucidate the flow physics that yield optimal performance. The optimized airfoils that form the Pareto front achieve up to a 48% increase in lift or a 28% reduction in drag compared to a reference triangular airfoil studied in the Mars Wind Tunnel at Tohoku University. This work constitutes the first use of DNSs for aerodynamic shape optimization.
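As a sketch of the multi-objective step, the following Python snippet extracts a Pareto front (maximise lift, minimise drag) from a set of candidate designs. The lift and drag values are randomly generated stand-ins for DNS-evaluated coefficients, not data from the study.

import numpy as np

def pareto_front(lift, drag):
    """Indices of non-dominated designs: maximise lift, minimise drag."""
    idx = np.argsort(-lift)                # visit designs in order of best lift
    front, best_drag = [], np.inf
    for i in idx:
        if drag[i] < best_drag:            # strictly lower drag than every
            front.append(i)                # higher-lift design seen so far
            best_drag = drag[i]
    return np.array(front)

rng = np.random.default_rng(1)
lift = rng.uniform(0.4, 1.2, 50)                               # hypothetical C_L
drag = 0.1 + 0.15 * lift**2 + 0.02 * rng.standard_normal(50)   # hypothetical C_D
print(pareto_front(lift, drag))            # indices of Pareto-optimal designs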
Economic evaluation of ASCOT-BPLA: Antihypertensive treatment with an amlodipine-based regimen is cost-effective compared to an atenolol-based regimen
Objective: To compare the cost effectiveness of an amlodipine-based strategy and an atenolol-based strategy in the treatment of hypertension in the UK and Sweden.
Design: A prospective, randomised trial complemented with a Markov model to assess long-term costs and health effects (a minimal sketch of such a model follows this abstract).
Setting: Primary care.
Patients: Patients with moderate hypertension and three or more additional risk factors.
Interventions: Amlodipine 5–10 mg with perindopril 4–8 mg added as needed, or atenolol 50–100 mg with bendroflumethiazide 1.25–2.5 mg and potassium added as needed.
Main outcome measures: Cost per cardiovascular event and procedure avoided, and cost per quality-adjusted life-year gained.
Results: In the UK, the cost to avoid one cardiovascular event or procedure would be €18 965, and the cost to gain one quality-adjusted life-year would be €21 875. The corresponding figures for Sweden were €13 210 and €16 856.
Conclusions: Compared with the thresholds applied by NICE and in the Swedish National Board of Health and Welfare’s Guidelines for Cardiac Care, an amlodipine-based regimen is cost effective for the treatment of hypertension compared with an atenolol-based regimen in the population studied.
The study was supported by the principal funding source, Pfizer, New York, USA.
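A minimal sketch, in Python, of the kind of Markov cohort model described in the Design section: a cohort moves yearly between event-free, post-event, and dead states, and discounted costs and QALYs are accumulated for each regimen. All transition probabilities, costs, and utilities below are illustrative placeholders, not the ASCOT-BPLA inputs.

import numpy as np

def run(p_event, drug_cost, n_years=20, disc=0.035):
    # Annual transition matrix: event-free -> {event-free, post-event, dead}.
    P = np.array([[1 - p_event - 0.02, p_event, 0.02],
                  [0.0, 0.90, 0.10],
                  [0.0, 0.00, 1.00]])
    utility = np.array([0.85, 0.70, 0.0])                   # QALY weight per state
    cost = np.array([drug_cost, drug_cost + 2500.0, 0.0])   # annual cost (EUR)
    state = np.array([1.0, 0.0, 0.0])                       # cohort starts event-free
    q = c = 0.0
    for year in range(n_years):
        d = (1 + disc) ** -year                             # discount factor
        q += d * state @ utility
        c += d * state @ cost
        state = state @ P
    return q, c

q_aten, c_aten = run(p_event=0.040, drug_cost=150.0)   # comparator regimen
q_amlo, c_amlo = run(p_event=0.033, drug_cost=450.0)   # intervention regimen
icer = (c_amlo - c_aten) / (q_amlo - q_aten)
print(f"ICER ~ EUR {icer:,.0f} per QALY gained")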
Near field development of artificially generated high Reynolds number turbulent boundary layers
Particle image velocimetry is conducted in the near field of two distinct wall-mounted trips for the artificial generation of a high Reynolds number turbulent boundary layer. The first of these trips consists of high aspect ratio obstacles, which are intended to minimize the influence of their wakes on the near-wall region, contrasting with low aspect ratio trips, which would enhance this influence. A comprehensive study involving flow description, turbulent-nonturbulent interface detection, a low-order model description of the flow, and an exploration of the influence of the wake in the near-wall region is conducted, and two different mechanisms are clearly identified and described. First, high aspect ratio trips generate a wall-driven mechanism characterized by a thinner, sharper, and less tortuous turbulent-nonturbulent interface and a reduced influence of the trips' wake in the near-wall region. Second, low aspect ratio trips generate a wake-driven mechanism in which the turbulent-nonturbulent interface is thicker, less sharply defined, and more tortuous, and the detached wake of the obstacles has a significant influence on the near-wall region. Study of the low-order modeling of the flow field suggests that these two mechanisms may not be exclusive to the particular geometries tested in the present study but, on the contrary, can be explained based on the predominant flow features. In particular, the distinction between these two mechanisms can explain some of the trends that have appeared in the literature over the past decades.
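As an illustration of turbulent-nonturbulent interface detection on PIV-like data, the sketch below (Python) thresholds the local fluctuation kinetic energy and marks turbulent cells that border nonturbulent ones. The kinetic-energy threshold is a common criterion assumed here for illustration, not necessarily the one used in the paper, and the input field is synthetic.

import numpy as np

def tnt_interface(u_fluct, v_fluct, threshold_frac=0.02):
    """Return a mask of interface cells on a 2D PIV-like fluctuation field."""
    k = 0.5 * (u_fluct**2 + v_fluct**2)        # local fluctuation kinetic energy
    turb = k > threshold_frac * k.max()        # binary turbulent/nonturbulent map
    # Interface cells: turbulent cells with at least one nonturbulent neighbour.
    pad = np.pad(turb, 1, constant_values=False)
    all_nbrs_turb = (pad[:-2, 1:-1] & pad[2:, 1:-1] &
                     pad[1:-1, :-2] & pad[1:-1, 2:])
    return turb & ~all_nbrs_turb

rng = np.random.default_rng(2)
decay = np.linspace(1.0, 0.0, 64)              # turbulence decaying toward one edge
u = rng.standard_normal((64, 64)) * decay
v = rng.standard_normal((64, 64)) * decay
iface = tnt_interface(u, v)                    # boolean map of the interface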
Allocating Public Spending Efficiently: Is There a Need for a Better Mechanism to Inform Decisions in the UK and Elsewhere?
In the UK few if any regular processes explicitly address comparisons of value for money between spending in different government departments, despite the existence of mechanisms that could in principle achieve this. This leaves a very important gap in evidence and means that decisions about public spending allocations are likely to miss opportunities to improve social welfare from existing budgets. Greater attention to the development of methods and evidence to better inform the allocation of public sector spending between departments is therefore urgently needed. We identify a number of possible approaches to this, some of which are being used in different countries, and highlight their strengths and weaknesses. We propose a new, pragmatic approach that incorporates a generic descriptive system to measure the disparate outcomes produced by public sector activities in a commensurate manner. Discrete-choice experiments could be used to generate evidence of the relative importance placed on different aspects of public sector outcomes by members of the general public. The proposed approach would thus produce evidence on value for money across departments, underpinned by evidence on public preferences.
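To indicate how a discrete-choice experiment yields relative importance weights, the sketch below fits a conditional logit model to simulated choices between two spending profiles described by outcome attributes. The attributes, data, and true weights are all invented for illustration; the paper does not prescribe this estimator.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n_tasks, n_attrs = 500, 3
X = rng.standard_normal((n_tasks, 2, n_attrs))   # two alternatives per choice task
beta_true = np.array([1.0, 0.6, -0.8])           # e.g. health gain, crime cut, cost
util = X @ beta_true + rng.gumbel(size=(n_tasks, 2))
choice = util.argmax(axis=1)                     # simulated respondent choices

def neg_loglik(beta):
    v = X @ beta                                 # deterministic utility of each option
    ll = v[np.arange(n_tasks), choice] - np.log(np.exp(v).sum(axis=1))
    return -ll.sum()

fit = minimize(neg_loglik, np.zeros(n_attrs))
print(fit.x)   # recovered relative importance weights for the attributes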