An unstructured parallel least-squares spectral element solver for incompressible flow problems
The parallelization of the least-squares spectral element formulation of the Stokes problem has recently been discussed for incompressible flow problems on structured grids. In the present work, the extension to unstructured grids is discussed. It will be shown that, to obtain an efficient and scalable method, two different kinds of data distribution are required, involving a rather complicated parallel conversion between them. Once the data conversion has been performed, a large symmetric positive definite algebraic system has to be solved iteratively. It is well known that the Conjugate Gradient method is a good choice for such systems. To improve the convergence rate of the Conjugate Gradient process, both Jacobi and Additive Schwarz preconditioners are applied. The Additive Schwarz preconditioner is based on domain decomposition and can be implemented such that a preconditioning step corresponds to a parallel matrix-vector product. The new results reveal that the Additive Schwarz preconditioner is very suitable for the p-refinement version of the least-squares spectral element method. To obtain portable programs which may run on distributed-memory multiprocessors, networks of workstations, and shared-memory machines, we use MPI (Message Passing Interface). Numerical simulations have been performed to validate the scalability of the different parts of the proposed method. The experiments entailed simulating several large-scale incompressible flows on a Cray T3E and on an SGI Origin 3800, with the number of processors varying from one to more than one hundred. The results indicate that the present method has very good parallel scaling properties, making it a powerful method for numerical simulations of incompressible flows.
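As a concrete illustration of the solver kernel described above, here is a minimal serial sketch of the preconditioned Conjugate Gradient iteration with a Jacobi (diagonal) preconditioner. This is a sketch under stated assumptions, not the paper's implementation: the paper distributes these vector and matrix operations over MPI processes and swaps in the Additive Schwarz preconditioner as a parallel matrix-vector product. All names below are hypothetical.

```python
import numpy as np

def pcg(A, b, apply_M_inv, tol=1e-10, max_iter=500):
    """Preconditioned Conjugate Gradient for an SPD system A x = b.

    apply_M_inv(r) applies the preconditioner to a residual; for the
    Jacobi preconditioner this is division by the diagonal of A.
    """
    x = np.zeros_like(b)
    r = b - A @ x
    z = apply_M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p                 # in the parallel solver, a distributed
        alpha = rz / (p @ Ap)      # matrix-vector product
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = apply_M_inv(r)         # preconditioning step
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Usage on a small SPD system with a Jacobi preconditioner:
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = pcg(A, b, lambda r: r / np.diag(A))
```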
Parallel Implementation of a Least-Squares Spectral Element Solver for Incompressible Flow Problems
Least-squares spectral element methods are based on two important and successful numerical methods: spectral/hp element methods and least-squares finite element methods. Least-squares methods lead to symmetric positive definite algebraic systems which circumvent the Ladyzhenskaya-Babuška-Brezzi stability condition and consequently allow the use of equal-order interpolation polynomials for all variables. In this paper, we present results obtained with a parallel implementation of the least-squares spectral element solver on a distributed-memory machine (Cray T3E) and on a virtual shared-memory machine (SGI Origin 3800).
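To illustrate why the least-squares formulation yields symmetric positive definite systems, consider a generic first-order system \(\mathcal{L}u = f\); the symbols here are generic placeholders, not notation taken from the paper. The discrete problem minimizes the residual functional

\[
  J(u) = \tfrac{1}{2}\,\lVert \mathcal{L}u - f \rVert_0^2,
  \qquad\text{i.e., find } u_h \in V_h \text{ with }
  (\mathcal{L}u_h,\,\mathcal{L}v_h) = (f,\,\mathcal{L}v_h)
  \quad \forall\, v_h \in V_h.
\]

The bilinear form \((\mathcal{L}u,\mathcal{L}v)\) is symmetric by construction and positive definite whenever \(\mathcal{L}\) is injective on \(V_h\), which is why Conjugate Gradient iterations apply and no inf-sup (LBB) compatibility between the velocity and pressure spaces is required.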
A parallel, state-of-the-art, least-squares spectral element solver for incompressible flow problems
The paper deals with the efficient parallelization of least-squares spectral element methods for incompressible flows. The parallelization of this sort of problem requires two different strategies. On the one hand, the spectral element discretization benefits from an element-by-element parallelization strategy. On the other hand, an efficient strategy to solve the large sparse global systems benefits from a row-wise distribution of data. This requires two different kinds of data distribution, and the conversion between them is rather complicated. In the present paper, the different strategies, together with the conversion between them, are discussed. Moreover, some results obtained on a distributed-memory machine (Cray T3E) are presented.
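Below is a minimal sketch of the kind of data conversion just described, assuming mpi4py and contiguous block ownership of rows in the row-wise phase. Everything here (the entry layout, the even division of rows, the variable names) is an illustrative assumption, not a detail taken from the paper.

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Element-by-element phase: each rank holds (global_row, value)
# contributions produced by its own spectral elements (dummy data here).
local_entries = [(rank * 10 + i, float(i)) for i in range(10)]

n_rows_global = 10 * size
rows_per_rank = n_rows_global // size   # assume rows divide evenly

# Bucket each contribution by the rank that owns its row in the
# row-wise distribution, then exchange the buckets all-to-all.
buckets = [[] for _ in range(size)]
for row, value in local_entries:
    buckets[row // rows_per_rank].append((row, value))
received = comm.alltoall(buckets)

# After the exchange, this rank owns complete rows of the global
# system, which is the layout the iterative solver needs.
my_rows = sorted(entry for bucket in received for entry in bucket)
```

Run under, e.g., `mpiexec -n 4 python convert.py`. A production solver would exchange packed numeric buffers with `Alltoallv` rather than pickled Python lists, but the communication pattern is the same.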
Exploring early combination strategy in Latin American patients with newly diagnosed type 2 diabetes: a sub-analysis of the VERIFY study
Background: Patients with type 2 diabetes mellitus (T2DM) in Latin American countries face challenges in access to healthcare, leading to under-diagnosis, under-achievement of glycemic targets, and long-term complications. Early diagnosis and treatment initiation are of paramount importance in this population due to the high prevalence of risk factors such as obesity and metabolic syndrome. The VERIFY study in patients with newly diagnosed T2DM (across 34 countries) assessed normoglycemic durability over 5 years with an early combination (EC) therapy approach versus the traditional stepwise approach of initiating treatment with metformin monotherapy (MET). Here we present the results from the VERIFY study for participants from eight countries in Latin America. Methods: Newly diagnosed adult patients with T2DM, HbA1c 6.5-7.5%, and body-mass index (BMI) of 22-40 kg/m² were enrolled. The primary endpoint was time to initial treatment failure (TF; HbA1c ≥ 7.0% at two consecutive scheduled visits 13 weeks apart). Time to second TF was evaluated when patients in both groups were receiving and failing on the vildagliptin combination. Safety and tolerability were also assessed for both treatment approaches during the study. Results: A total of 537 eligible patients (58.8% female) were randomly assigned to receive either EC (n = 266) or MET (n = 271). EC significantly reduced the relative risk of initial TF by 47% versus MET [HR (95% CI) 0.53 (0.4, 0.7); p < 0.0001]. Overall, 46.4% versus 66.3% of patients reached the primary endpoint in the EC and MET groups, with a median [interquartile range (IQR)] time to TF of 59.8 (27.5, not evaluable) and 33.4 (12.2, 60.1) months, respectively. The risk of second TF was 31% lower with EC (p = 0.0092). A higher proportion of patients receiving EC maintained durable HbA1c < 7.0%, < 6.5%, and < 6.0%. Both treatment approaches were well tolerated, and only 3.2% of participants discontinued the study due to adverse events. All hypoglycemic events (EC: n = 7; MET: n = 3) were single, mild episodes and did not lead to study discontinuation. Conclusion: Similar to the global population, long-term clinical benefits were achieved more frequently and without tolerability issues with EC versus standard-of-care MET in this Latin American sub-population. This study is registered with ClinicalTrials.gov, NCT01528254.
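A short worked step connects the two headline numbers above: under the proportional-hazards analysis, the reported 47% relative risk reduction is simply the complement of the hazard ratio,

\[
  \mathrm{RRR} = 1 - \mathrm{HR} = 1 - 0.53 = 0.47 \approx 47\%.
\]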
The use of standard calendar software by individuals with acquired brain injury and cognitive complaints: a mixed methods study
PURPOSE: To explore the actual use of standard calendar software by people with acquired brain injury (ABI) and healthy individuals. METHOD: A mixed methods design with qualitative and quantitative analyses of the respondents' use of calendar software. Fifteen individuals with ABI and 15 healthy participants were enrolled. Participants were asked to execute five consecutive tasks using standard calendar software, which resembled everyday use of an electronic calendar. RESULTS: The core processes "task execution" and "information processing" were influenced by internal factors (cognitive and emotional processes and fatigue) as well as environmental factors (software features and distractions). Results obtained by qualitative and quantitative methods showed similar reaction patterns in both groups. However, ABI patients had more cognitive problems and showed stronger emotions during task performance than healthy participants. Healthy participants were more successful and needed less time and mental effort to perform a task. CONCLUSIONS: Although ABI patients were able to use standard calendar software, they became upset more easily, needed more effort, and became tired sooner and more suddenly. Strategies to support ABI patients in the use of calendar software are suggested from multi-disciplinary perspectives.
A pre-specified statistical analysis plan for the VERIFY study: Vildagliptin efficacy in combination with metformin for early treatment of T2DM
Aims: To ensure the integrity of the planned analyses and maximize the clinical utility of the VERIFY study results by describing the detailed concepts behind its statistical analysis plan (SAP) before completion of data collection and study database lock. The SAP will be adhered to for the final primary data analysis of the VERIFY trial. Materials and Methods: Vildagliptin efficacy in combination with metformin for early treatment of T2DM (VERIFY) is an ongoing, multicentre, randomized controlled trial aiming to demonstrate the clinical benefits of glycaemic durability and glucose control achieved with an early combination therapy in newly diagnosed type 2 diabetes (T2DM) patients. Results: The SAP was initially designed at the study protocol conception phase and later modified, as reported here, in collaboration between the steering committee members, statisticians, and the VERIFY study leadership team. All authors were blinded to treatment allocation. An independent statistician has additionally retrieved and presented unblinded data to the independent data safety monitoring committee. An overview of the trial design is provided here, with a focus on the fine-tuning of the analysis plan for the primary efficacy endpoint (risk of initial treatment failure) and for the secondary, exploratory, and pre-specified subgroup analyses. Conclusion: In accordance with optimal trial practice, the details of the statistical analysis and data-handling plan are reported here prior to locking the database. The SAP accords with high-quality standards of internal validity to minimize analysis bias and will enhance the utility of the reported results for improved outcomes in the management of T2DM.
Do-Not-Attempt-Resuscitation orders for people with intellectual disabilities: dilemmas and uncertainties for ID physicians and trainees. The importance of the deliberation process
BACKGROUND: Not much is known about Do-Not-Attempt-Resuscitation (DNAR) decision-making for people with intellectual disabilities (IDs). The aim of this study was to clarify the problems and pitfalls of non-emergency DNAR decision-making for people with IDs, from the perspective of ID physicians. METHODS: This qualitative study was based on semi-structured individual interviews, focus group interviews, and an expert meeting, all recorded digitally and transcribed verbatim. Forty ID physicians and trainees were interviewed about problems, pitfalls, and dilemmas of DNAR decision-making for people with IDs in the Netherlands. Data were analysed using Grounded Theory procedures. RESULTS: The core category identified was 'Patient-related considerations when issuing DNAR orders'. Within this category, medical considerations were the main contributory factor for the ID physicians. Evaluation of quality of life was left to the relatives and was sometimes a cause of conflicts between physicians and relatives. The category of 'The decision-maker role' was as important as that of 'The decision procedure in an organisational context'. The procedure of issuing a non-emergency DNAR order and the embedding of this procedure in the health care organisation were important for the ID physicians. CONCLUSION: The theory we developed clarifies that DNAR decision-making for people with IDs is complex and causes uncertainties. This theory offers a sound basis for training courses that help physicians deal with uncertainties regarding DNAR decision-making, as well as a method for advance care planning. Health care organisations are strongly advised to implement a procedure regarding DNAR decision-making.
Long-Term Glycaemic Durability of Early Combination Therapy Strategy versus Metformin Monotherapy in Korean Patients with Newly Diagnosed Type 2 Diabetes Mellitus
We assessed glycaemic durability with early combination therapy (EC; vildagliptin + metformin [MET], n=22) versus MET monotherapy (n=17) among patients with newly diagnosed type 2 diabetes mellitus (T2DM) enrolled (between 2012 and 2014) in the VERIFY study from Korea (n=39). The primary endpoint was time to initial treatment failure (TF), defined as glycosylated hemoglobin (HbA1c) ≥ 7.0% at two consecutive scheduled visits after randomization (end of period 1). Time to second TF was assessed when both groups were receiving and failing on the combination (end of period 2). With EC, the risk of initial TF was significantly reduced, by 78%, compared to MET (n=3 [15%] vs. n=10 [58.7%], P=0.0228). No secondary TF occurred in the EC group, versus five patients (29.4%) in MET. Patients receiving EC treatment achieved consistently lower HbA1c levels. Both treatment approaches were well tolerated, with no hypoglycaemic events. In Korean patients with newly diagnosed T2DM, EC treatment significantly and consistently improved long-term glycaemic durability compared with MET.
Uncovering treatment burden as a key concept for stroke care: a systematic review of qualitative research
Background: Patients with chronic disease may experience complicated management plans requiring significant personal investment. This has been termed 'treatment burden' and has been associated with unfavourable outcomes. The aim of this systematic review is to examine the qualitative literature on treatment burden in stroke from the patient perspective.
Methods and findings: The search strategy centred on: stroke, treatment burden, patient experience, and qualitative methods. We searched Scopus, CINAHL, Embase, Medline, and PsycINFO, and tracked references, footnotes, and citations. Restrictions included English language and publication dates from January 2000 to February 2013. Two reviewers independently carried out paper screening, data extraction, and data analysis. Data were analysed using framework synthesis, as informed by Normalization Process Theory. Sixty-nine papers were included. Treatment burden includes: (1) making sense of stroke management and planning care, (2) interacting with others, (3) enacting management strategies, and (4) reflecting on management. Health care is fragmented, with poor communication between patients and health care providers. Patients report inadequate information provision. Inpatient care is unsatisfactory, with a perceived lack of empathy from professionals and a shortage of stimulating activities on the ward. Discharge services are poorly coordinated, and accessing health and social care in the community is difficult. The study has potential limitations because it was restricted to studies published in English, and data from low-income countries were scarce.
Conclusions: Stroke management is extremely demanding for patients, and treatment burden is influenced by the micro and macro organisation of health services. Knowledge deficits mean patients are ill equipped to organise their care and develop coping strategies, making adherence less likely. There is a need to transform the approach to care provision so that services are configured to prioritise patient needs rather than those of health care systems.