Executive summary: heart disease and stroke statistics--2013 update: a report from the American Heart Association.
Each year, the American Heart Association (AHA), in conjunction with the Centers for Disease Control and Prevention, the National Institutes of Health, and other government agencies, brings together the most up-to-date statistics on heart disease, stroke, other vascular diseases, and their risk factors and presents them in its Heart Disease and Stroke Statistical Update. The Statistical Update is a valuable resource for researchers, clinicians, healthcare policy makers, media professionals, the lay public, and many others who seek the best national data available on heart disease, stroke, and other cardiovascular disease-related morbidity and mortality and the risks, quality of care, medical procedures and operations, and costs associated with the management of these diseases in a single document. Indeed, since 1999, the Statistical Update has been cited >10 500 times in the literature, based on citations of all annual versions. In 2011 alone, the various Statistical Updates were cited ≈1500 times (data from ISI Web of Science). In recent years, the Statistical Update has undergone some major changes with the addition of new chapters and major updates across multiple areas, as well as increasing the number of ways to access and use the information assembled. For this year's edition, the Statistics Committee, which produces the document for the AHA, updated all of the current chapters with the most recent nationally representative data and inclusion of relevant articles from the literature over the past year. This year's edition also implements a new chapter organization to reflect the spectrum of cardiovascular health behaviors and health factors and risks, as well as subsequent complicating conditions, disease states, and outcomes. Also, the 2013 Statistical Update contains new data on the monitoring and benefits of cardiovascular health in the population, with additional new focus on evidence-based approaches to changing behaviors, implementation strategies, and implications of the AHA's 2020 Impact Goals. Below are a few highlights from this year's Update. © 2013 American Heart Association, Inc.
Executive summary: heart disease and stroke statistics--2014 update: a report from the American Heart Association.
Each year, the American Heart Association (AHA), in conjunction with the Centers for Disease Control and Prevention, the National Institutes of Health, and other government agencies, brings together the most up-to-date statistics on heart disease, stroke, other vascular diseases, and their risk factors and presents them in its Heart Disease and Stroke Statistical Update. The Statistical Update is a critical resource for researchers, clinicians, healthcare policy makers, media professionals, the lay public, and many others who seek the best available national data on heart disease, stroke, and other cardiovascular disease-related morbidity and mortality and the risks, quality of care, use of medical procedures and operations, and costs associated with the management of these diseases in a single document. Indeed, since 1999, the Statistical Update has been cited >10 500 times in the literature, based on citations of all annual versions. In 2012 alone, the various Statistical Updates were cited ≈3500 times (data from Google Scholar). In recent years, the Statistical Update has undergone some major changes with the addition of new chapters and major updates across multiple areas, as well as increasing the number of ways to access and use the information assembled. For this year's edition, the Statistics Committee, which produces the document for the AHA, updated all of the current chapters with the most recent nationally representative data and inclusion of relevant articles from the literature over the past year. This year's edition includes a new chapter on peripheral artery disease, as well as new data on the monitoring and benefits of cardiovascular health in the population, with additional new focus on evidence-based approaches to changing behaviors, implementation strategies, and implications of the AHA's 2020 Impact Goals. Below are a few highlights from this year's Update. © 2013 American Heart Association, Inc.
Clinical correlates of renal dysfunction in hypertensive patients without cardiovascular complications: the REDHY study
Our study aimed to assess the clinical correlates of different degrees of renal dysfunction in a large group of non-diabetic hypertensive patients, free from cardiovascular (CV) complications and known renal diseases, participating in the REDHY (REnal Dysfunction in HYpertension) study. A total of 1856 hypertensive subjects (mean age: 47±14 years), attending our hypertension centre, were evaluated. The glomerular filtration rate (GFR) was estimated by the simplified Modification of Diet in Renal Disease (MDRD) Study prediction equation. A 24-h urine sample was collected to determine the albumin excretion rate (AER). Albuminuria was defined as an AER greater than 20 μg min⁻¹. We used the classification proposed by the US National Kidney Foundation's guidelines for chronic kidney disease (CKD) to define the stages of renal function impairment. In multiple logistic regression analysis, the probability of having stage 1 and stage 2 CKD was significantly higher in subjects with greater values of systolic blood pressure (SBP) and with larger waist circumference. SBP was also positively related to stage 3 CKD. Stage 3 and stages 4–5 CKD were inversely associated with waist circumference and directly associated with serum uric acid. Age was inversely related to stage 1 CKD and directly related to stage 3 CKD. The factors associated with milder forms of kidney dysfunction are, in part, different from those associated with more advanced stages of renal function impairment.
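For readers unfamiliar with the estimating equation this abstract relies on, the following is a minimal sketch of the simplified (4-variable) MDRD Study formula using its commonly published 186-based coefficients. The function name and example values are illustrative assumptions, not taken from the REDHY paper, which may have used a different calibration.

```python
# Sketch of the simplified (4-variable) MDRD Study equation with the
# commonly published 186-based coefficients. Illustrative only; not
# necessarily the exact implementation used by the REDHY investigators.

def egfr_mdrd(serum_creatinine_mg_dl: float, age_years: float,
              female: bool, black: bool) -> float:
    """Estimated GFR in mL/min per 1.73 m^2."""
    gfr = 186.0 * serum_creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        gfr *= 0.742   # standard sex adjustment factor
    if black:
        gfr *= 1.212   # standard race adjustment factor
    return gfr

# Example: a 47-year-old non-Black woman with serum creatinine 1.1 mg/dL.
print(round(egfr_mdrd(1.1, 47, female=True, black=False)))  # ≈ 57
```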
The Challenges of Creativity in Software Organizations
Part 1: Creating Value. Managing creativity has proven to be one of the most important drivers in software development and use. The continuously changing market environment drives companies such as Google, SAS Institute and LEGO to focus on creativity as an increasing necessity when competing through sustained innovation. However, creativity in the information systems (IS) environment is a challenge for most organizations, primarily because they do not know how to strategize creative processes in relation to IS strategies, which causes companies to act ad hoc in their creative endeavors. In this paper, we address the organizational challenges of creativity in software organizations. Grounded in a previous literature review and a rigorous selection process, we identify and present a model of seven important factors for creativity in software organizations. From these factors, we identify 21 challenges that software organizations experience when embarking on creative endeavors and consolidate them into a comprehensive framework. Using an interpretive research study, we further examine the framework by analyzing how the challenges manifest in 27 software organizations. Practitioners can use this study to gain a deeper understanding of creativity in their own business, while researchers can use the framework to gain insight when conducting interpretive field studies of managing creativity.
Memory Efficient Algorithms for the Verification of Temporal Properties
This paper addresses the problem of designing memory-efficient algorithms for the verification of temporal properties of finite-state programs. Both the programs and their desired temporal properties are modeled as automata on infinite words (Büchi automata). Verification is then reduced to checking the emptiness of the automaton resulting from the product of the program and the property. This problem is usually solved by computing the strongly connected components of the graph representing the product automaton. Here, we present algorithms that solve the emptiness problem without explicitly constructing the strongly connected components of the product graph. By allowing the algorithms to err with some probability, we can implement them with a randomly accessed memory of size O(n) bits, where n is the number of states of the graph, instead of the O(n log n) bits that the presently known algorithms require.
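As an illustrative aside, the SCC-free emptiness check described above is closely associated with the nested depth-first search idea. Below is a minimal sketch of that search under assumed interfaces: the names initial, successors, and is_accepting are hypothetical, and this is the deterministic variant; the paper's randomized O(n)-bit scheme would, roughly, replace the explicit visited sets with hashing that may err with some probability.

```python
# Minimal sketch of nested depth-first search for Büchi emptiness: the
# automaton is nonempty iff some reachable accepting state lies on a cycle.
# Deterministic variant with explicit visited sets; interface names are
# illustrative, not from the paper.

def buchi_nonempty(initial, successors, is_accepting):
    """True iff some reachable accepting state of the product lies on a cycle."""
    outer_visited, inner_visited = set(), set()

    def inner_dfs(seed, s):
        # Second (nested) search: look for a cycle back to the accepting seed.
        inner_visited.add(s)
        for t in successors(s):
            if t == seed:
                return True
            if t not in inner_visited and inner_dfs(seed, t):
                return True
        return False

    def outer_dfs(s):
        outer_visited.add(s)
        for t in successors(s):
            if t not in outer_visited and outer_dfs(t):
                return True
        # Postorder: start the nested search from each accepting state.
        # Sharing inner_visited across nested searches keeps the total
        # work linear in the size of the product graph.
        return is_accepting(s) and inner_dfs(s, s)

    return outer_dfs(initial)

# Tiny example: accepting state 1 lies on the cycle 1 -> 2 -> 1.
graph = {0: [1], 1: [2], 2: [1, 3], 3: []}
print(buchi_nonempty(0, lambda s: graph[s], lambda s: s in {1}))  # True
```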