1,092 research outputs found
Organizational improvisation and the reduced usefulness of performance measurement BI functionalities
© 2018 Elsevier Inc. Firms are increasingly turning to business intelligence (BI) systems to support their management control activities, while management accounting researchers are increasingly focused on studying beneficial roles of such systems. The extant research focuses on how performance-enhancing effects of BI systems occur via enhanced managerial learning and knowledge creation. This research has, however, failed to consider how managerial learning and knowledge creation processes can be shaped by fundamental organizational contingencies. This paper ventures into this unexplored space to consider how organizational improvisation may moderate beneficial roles played by BI. We derive the concept of “semi-structuring heuristics” and apply it to theorize that the impact of BI functionalities on performance measurement capabilities is negatively moderated by organizational improvisation. Our hypotheses include two BI constructs (BI-planning functionality and BI-reporting functionality) and two organizational improvisation competences (strategic momentum and organizational flexibility). We test our hypotheses with partial least squares procedures using survey data from 324 top-level managers. We find that BI-planning functionality has a positive effect on performance measurement capabilities that is negatively moderated by both organizational improvisation competences. The only significant effect of BI-reporting functionality is as a positive moderator of the effect of BI-planning functionality. Organizational improvisation competences are quite common and entail managers using only “minimal forms” of performance measurement information. By implication, if the term BI “functionality” connotes usefulness and fitness-for-purpose, then the term appears to be a misnomer in contexts reliant on organizational improvisation.
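The negative moderation result can be read as a simple-slopes pattern: the marginal effect of BI-planning functionality on performance measurement capabilities shrinks as improvisation rises. A minimal sketch with invented coefficients (not the paper's PLS estimates):

```python
# Simple-slopes illustration of a negatively moderated effect.
# Both coefficients are hypothetical, chosen only to show the pattern.
b_bi = 0.40          # assumed main effect of BI-planning functionality
b_int = -0.25        # assumed BI x improvisation interaction term

for improv in (-1.0, 0.0, 1.0):      # improvisation at -1 SD, mean, +1 SD
    slope = b_bi + b_int * improv    # conditional effect of BI on capabilities
    print(f"improvisation {improv:+.1f} SD -> BI effect {slope:+.2f}")
```

The conditional effect falls from +0.65 at low improvisation to +0.15 at high improvisation, which is what a negative interaction coefficient means in practice.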
Direct measurement of antiferromagnetic domain fluctuations
Measurements of magnetic noise emanating from ferromagnets due to domain
motion were first carried out nearly 100 years ago and have underpinned much
science and technology. Antiferromagnets, which carry no net external magnetic
dipole moment, yet have a periodic arrangement of the electron spins extending
over macroscopic distances, should also display magnetic noise, but this must
be sampled at spatial wavelengths of order several interatomic spacings, rather
than the macroscopic scales characteristic of ferromagnets. Here we present the
first direct measurement of the fluctuations in the nanometre-scale spin-
(charge-) density wave superstructure associated with antiferromagnetism in
elemental chromium. The technique used is X-ray Photon Correlation
Spectroscopy, where coherent x-ray diffraction produces a speckle pattern that
serves as a "fingerprint" of a particular magnetic domain configuration. The
temporal evolution of the patterns corresponds to domain walls advancing and
retreating over micron distances. While the domain wall motion is thermally
activated at temperatures above 100K, it is not so at lower temperatures, and
indeed has a rate which saturates at a finite value - consistent with quantum
fluctuations - on cooling below 40K. Our work is important because it provides
an important new measurement tool for antiferromagnetic domain engineering as
well as revealing a fundamental new fact about spin dynamics in the simplest
antiferromagnet. (19 pages, 4 figures)
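In X-ray Photon Correlation Spectroscopy, domain dynamics are quantified through the intensity autocorrelation of the speckle time series, g2(τ) = ⟨I(t)I(t+τ)⟩/⟨I⟩². A minimal numerical sketch of this estimator, using a synthetic single-pixel intensity trace (an AR(1)-driven positive signal standing in for real speckle data):

```python
import math
import random

def g2(intensity, lags):
    """Intensity autocorrelation g2(tau) = <I(t) I(t+tau)> / <I>^2,
    the decorrelation measure extracted from XPCS speckle time series."""
    n = len(intensity)
    mean = sum(intensity) / n
    out = []
    for lag in lags:
        acc = sum(intensity[t] * intensity[t + lag] for t in range(n - lag))
        out.append(acc / ((n - lag) * mean * mean))
    return out

# Synthetic speckle trace: an AR(1) process drives a positive intensity,
# mimicking domain walls wandering on a finite correlation timescale.
random.seed(1)
state, trace = 0.0, []
for _ in range(20000):
    state = 0.95 * state + random.gauss(0.0, 0.1)
    trace.append(math.exp(state))   # exp keeps the intensity positive

g_short, g_long = g2(trace, [1, 500])
print(g_short > g_long)   # correlation decays toward the baseline of 1
```

A slower g2 decay corresponds to slower domain-wall motion; the saturation of the fluctuation rate below 40 K reported above would appear as a g2 decay time that stops growing on cooling.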
Association between proton pump inhibitor therapy and Clostridium difficile infection: a contemporary systematic review and meta-analysis.
Abstract
Introduction
Emerging epidemiological evidence suggests that proton pump inhibitor (PPI) acid-suppression therapy is associated with an increased risk of Clostridium difficile infection (CDI).
Methods
Ovid MEDLINE, EMBASE, ISI Web of Science, and Scopus were searched from 1990 to January 2012 for analytical studies that reported an adjusted effect estimate of the association between PPI use and CDI. We performed random-effects meta-analyses. We used the GRADE framework to interpret the findings.
Results
We identified 47 eligible citations (37 case-control and 14 cohort studies) with 51 corresponding effect estimates. The pooled OR was 1.65 (95% CI, 1.47 to 1.85; I² = 89.9%), with evidence of publication bias suggested by a contour funnel plot. A novel regression-based method was used to adjust for publication bias and resulted in an adjusted pooled OR of 1.51 (95% CI, 1.26 to 1.83). In a speculative analysis that assumes this association is causal, and based on published baseline CDI incidence, the risk of CDI would be very low in the general population taking PPIs, with an estimated NNH of 3925 at 1 year.
Conclusions
In this rigorously conducted systematic review and meta-analysis, we found very low quality evidence (GRADE class) for an association between PPI use and CDI that does not support a cause-effect relationship.
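The NNH figure follows from elementary risk arithmetic once a baseline incidence is assumed. A sketch, treating the adjusted OR as an approximate relative risk for a rare outcome, with an illustrative baseline incidence that is an assumption rather than a value reported by the review:

```python
# Illustrative NNH arithmetic for a rare outcome, where OR approximates RR.
# The baseline 1-year CDI incidence below is an assumed figure, not a value
# taken from the review.
baseline_risk = 0.0005        # assumed ~5 CDI cases per 10,000 per year
adjusted_or = 1.51            # bias-adjusted pooled OR from the abstract
risk_on_ppi = baseline_risk * adjusted_or     # rare-outcome approximation
nnh = 1.0 / (risk_on_ppi - baseline_risk)     # number needed to harm
print(round(nnh))             # ~3.9e3, the same order as the quoted 3925
```

The key point survives any reasonable baseline choice: with a modest OR and a rare outcome, the absolute risk increase is tiny, so the NNH runs to thousands.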
Evidence synthesis as the key to more coherent and efficient research
Background
Systematic review and meta-analysis currently underpin much of evidence-based medicine. Such methodologies bring order to previous research, but future research planning remains relatively incoherent and inefficient.
Methods
To outline a framework for evaluation of health interventions, aimed at increasing coherence and efficiency through i) making better use of information contained within the existing evidence base when designing future studies; and ii) maximising the information available and thus potentially reducing the need for future studies.
Results
The framework presented insists that an up-to-date meta-analysis of existing randomised controlled trials (RCTs) should always be considered before future trials are conducted. Such a meta-analysis should inform critical design issues such as sample size determination. The contexts in which the use of individual patient data meta-analysis and mixed treatment comparisons modelling may be beneficial before further RCTs are conducted are considered. Consideration should also be given to how any newly planned RCT would contribute to the totality of evidence through its incorporation into an updated meta-analysis. We illustrate how new RCTs can have very low power to change the inferences of an existing meta-analysis, particularly when between-study heterogeneity is taken into consideration.
Conclusion
While the collation of existing evidence as the basis for clinical practice is now routine, a more coherent and efficient approach to planning future RCTs to strengthen the evidence base needs to be developed. The framework presented is a proposal for how this situation can be improved.
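The low power of a new RCT to shift a heterogeneous meta-analysis can be seen directly from random-effects weights: the between-study variance τ² caps the weight any single trial, however large, can carry. A sketch with invented variances:

```python
# How between-study heterogeneity caps the influence of one new trial on a
# random-effects pooled estimate. All variances and tau^2 are invented.
def weight_share(new_var, variances, tau2):
    """Fraction of total random-effects weight carried by a new study."""
    w_existing = sum(1.0 / (v + tau2) for v in variances)
    w_new = 1.0 / (new_var + tau2)
    return w_new / (w_existing + w_new)

variances = [0.02, 0.03, 0.02, 0.04, 0.03]   # five hypothetical existing RCTs
mega_trial_var = 0.001                       # one very precise new RCT

no_het = weight_share(mega_trial_var, variances, tau2=0.0)
with_het = weight_share(mega_trial_var, variances, tau2=0.05)
print(f"{no_het:.2f} vs {with_het:.2f}")     # ~0.84 vs ~0.23
```

Without heterogeneity the mega-trial dominates the pooled estimate; with a moderate τ² its weight share collapses to roughly that of any other study, which is why a prospective power calculation against the updated meta-analysis matters.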
The Pioneer Anomaly
Radio-metric Doppler tracking data received from the Pioneer 10 and 11
spacecraft from heliocentric distances of 20-70 AU has consistently indicated
the presence of a small, anomalous, blue-shifted frequency drift uniformly
changing with a rate of ~6 x 10^{-9} Hz/s. Ultimately, the drift was
interpreted as a constant sunward deceleration of each particular spacecraft at
the level of a_P = (8.74 +/- 1.33) x 10^{-10} m/s^2. This apparent violation of
Newton's gravitational inverse-square law has become known as the Pioneer
anomaly; the nature of this anomaly remains unexplained. In this review, we
summarize the current knowledge of the physical properties of the anomaly and
the conditions that led to its detection and characterization. We review
various mechanisms proposed to explain the anomaly and discuss the current
state of efforts to determine its nature. A comprehensive new investigation of
the anomalous behavior of the two Pioneers has begun recently. The new efforts
rely on the much-extended set of radio-metric Doppler data for both spacecraft
in conjunction with the newly available complete record of their telemetry
files and a large archive of original project documentation. As the new study
is yet to report its findings, this review provides the necessary background
for the new results to appear in the near future. In particular, we provide a
significant amount of information on the design, operations and behavior of the
two Pioneers during their entire missions, including descriptions of various
data formats and techniques used for their navigation and radio-science data
analysis. As most of this information was recovered relatively recently, it was
not used in the previous studies of the Pioneer anomaly, but it is critical for
the new investigation. (165 pages, 40 figures, 16 tables; accepted for publication in Living Reviews in Relativity)
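As a rough consistency check, the quoted frequency drift and the anomalous acceleration are linked by Doppler kinematics, df/dt ≈ (a_P/c)·f. A back-of-envelope sketch assuming an S-band reference frequency near 2.29 GHz (an assumption; the review specifies the actual tracking configuration and one-way versus two-way conventions):

```python
# Order-of-magnitude consistency check between the quoted drift and a_P.
# The S-band reference frequency is an assumption for illustration.
c = 2.998e8        # speed of light, m/s
f_ref = 2.29e9     # assumed S-band reference frequency, Hz
a_p = 8.74e-10     # anomalous sunward deceleration, m/s^2

drift = a_p * f_ref / c     # implied frequency drift, Hz/s
print(f"{drift:.2e} Hz/s")  # ~6.7e-9, consistent with the quoted ~6e-9
```

The two numbers quoted in the abstract are therefore two views of the same signal, one in frequency space and one in acceleration.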
Bramwell-Hill modeling for local aortic pulse wave velocity estimation: a validation study with velocity-encoded cardiovascular magnetic resonance and invasive pressure assessment
Background
The Bramwell-Hill model describes the relation between vascular wall stiffness, expressed as aortic distensibility, and the pulse wave velocity (PWV), which is the propagation speed of the systolic pressure wave through the aorta. The main objective of this study was to test the validity of this model locally in the aorta by using PWV assessments based on in-plane velocity-encoded cardiovascular magnetic resonance (CMR), with invasive pressure measurements serving as the gold standard.
Methods
Seventeen patients (14 male, 3 female, mean age ± standard deviation = 57 ± 9 years) awaiting cardiac catheterization were prospectively included. During catheterization, intra-arterial pressure measurements were obtained in the aorta at multiple locations 5.8 cm apart. PWV was determined regionally over the aortic arch and locally in the proximal descending aorta. Subsequently, patients underwent a CMR examination to measure aortic PWV and aortic distension. Distensibility was determined locally from the aortic distension at the proximal descending aorta and the pulse pressure measured invasively during catheterization and non-invasively from brachial cuff assessment. PWV was determined regionally in the aortic arch using through-plane and in-plane velocity-encoded CMR, and locally at the proximal descending aorta using in-plane velocity-encoded CMR. Validity of the Bramwell-Hill model was tested by evaluating associations between distensibility and PWV. Also, theoretical PWV was calculated from distensibility measurements and compared with pressure-assessed PWV.
Results
In-plane velocity-encoded CMR provides a stronger correlation (p = 0.02) between CMR and pressure-assessed PWV than through-plane velocity-encoded CMR (r = 0.69 versus r = 0.26), with a non-significant mean error of 0.2 ± 1.6 m/s for in-plane versus a significant (p = 0.006) error of 1.3 ± 1.7 m/s for through-plane velocity-encoded CMR. The Bramwell-Hill model shows a significantly (p = 0.01) stronger association between distensibility and PWV for local assessment (r = 0.8) than for regional assessment (r = 0.7), both for CMR and for pressure-assessed PWV. Theoretical PWV is strongly correlated (r = 0.8) with pressure-assessed PWV, with a statistically significant (p = 0.04) mean underestimation of 0.6 ± 1.1 m/s. This theoretical PWV estimation is more accurate when invasively assessed pulse pressure is used instead of brachial cuff assessment (p = 0.03).
Conclusions
CMR with in-plane velocity-encoding is the optimal approach for studying Bramwell-Hill associations between local PWV and aortic distensibility. This approach enables non-invasive estimation of local pulse pressure and distensibility.
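The "theoretical PWV" referred to above comes from the Bramwell-Hill relation, PWV = 1/√(ρ·D), where D is the local distensibility. A sketch with illustrative numbers (the blood density and distensibility below are assumptions, not measurements from this study):

```python
import math

# Bramwell-Hill relation: PWV = 1 / sqrt(rho * D).
# All numbers are illustrative assumptions.
rho = 1060.0                            # assumed blood density, kg/m^3
mmhg_to_pa = 133.322                    # unit conversion factor
distensibility = 3.0e-3 / mmhg_to_pa    # assumed 3e-3 per mmHg, in 1/Pa
pwv = 1.0 / math.sqrt(rho * distensibility)
print(f"{pwv:.1f} m/s")                 # ~6.5 m/s, a plausible aortic PWV
```

Because PWV scales with 1/√D, a stiffer (less distensible) aorta propagates the pressure wave faster, which is what the distensibility-PWV associations in the Results test.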
Quantifying Selective Reporting and the Proteus Phenomenon for Multiple Datasets with Similar Bias
Meta-analyses play an important role in synthesizing evidence from diverse studies and datasets that address similar questions. A major obstacle for meta-analyses arises from biases in reporting. In particular, it is speculated that findings which do not achieve formal statistical significance are less likely to be reported than statistically significant findings. Moreover, the patterns of bias can be complex and may also depend on the timing of the research results and their relationship with previously published work. In this paper, we present an approach that is specifically designed to analyze large-scale datasets on published results. Such datasets are currently emerging in diverse research fields, particularly in molecular medicine. We use our approach to investigate a dataset on Alzheimer's disease (AD) that covers 1167 results from case-control studies on 102 genetic markers. We observe that initial studies on a genetic marker tend to be substantially more biased than subsequent replications. The chances for initial, statistically non-significant results to be published are estimated to be about 44% (95% CI, 32% to 63%) relative to statistically significant results, while statistically non-significant replications have almost the same chance of being published as statistically significant replications (84%; 95% CI, 66% to 107%). Early replications tend to be biased against initial findings, an observation previously termed the Proteus phenomenon: the chances of non-significant studies going in the same direction as the initial result are estimated to be lower than the chances of non-significant studies opposing the initial result (73%; 95% CI, 55% to 96%). Such dynamic patterns in bias are difficult to capture by conventional methods, where typically simple publication bias is assumed to operate.
Our approach captures and corrects for complex dynamic patterns of bias, thereby helping to generate conclusions from published results that are more robust to the presence of different coexisting types of selective reporting.
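The "relative chance of publication" quoted above is a ratio of publication probabilities between non-significant and significant results. A toy illustration with invented counts (the paper estimates this quantity with a model, not by direct counting):

```python
# Relative publication chance as a ratio of publication probabilities.
# The counts are invented purely to illustrate the quantity being estimated.
pub_nonsig, total_nonsig = 22, 100    # hypothetical non-significant results
pub_sig, total_sig = 50, 100          # hypothetical significant results

relative_chance = (pub_nonsig / total_nonsig) / (pub_sig / total_sig)
print(relative_chance)                # 0.44 -> a "44% relative chance"
```

A value of 1.0 would mean non-significant results are published just as often as significant ones; values well below 1.0, as estimated for initial studies, indicate selective reporting.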
Systematic review and meta-analysis of the diagnostic accuracy of ultrasonography for deep vein thrombosis
Background
Ultrasound (US) has largely replaced contrast venography as the definitive diagnostic test for deep vein thrombosis (DVT). We aimed to derive a definitive estimate of the diagnostic accuracy of US for clinically suspected DVT and identify study-level factors that might predict accuracy.
Methods
We undertook a systematic review, meta-analysis and meta-regression of diagnostic cohort studies that compared US to contrast venography in patients with suspected DVT. We searched Medline, EMBASE, CINAHL, Web of Science, Cochrane Database of Systematic Reviews, Cochrane Controlled Trials Register, Database of Reviews of Effectiveness, the ACP Journal Club, and citation lists (1966 to April 2004). Random effects meta-analysis was used to derive pooled estimates of sensitivity and specificity. Random effects meta-regression was used to identify study-level covariates that predicted diagnostic performance.
Results
We identified 100 cohorts comparing US to venography in patients with suspected DVT. Overall sensitivity (95% confidence interval) for proximal DVT was 94.2% (93.2 to 95.0), for distal DVT 63.5% (59.8 to 67.0), and specificity was 93.8% (93.1 to 94.4). Duplex US had pooled sensitivity of 96.5% (95.1 to 97.6) for proximal DVT, 71.2% (64.6 to 77.2) for distal DVT and specificity of 94.0% (92.8 to 95.1). Triplex US had pooled sensitivity of 96.4% (94.4 to 97.1) for proximal DVT, 75.2% (67.7 to 81.6) for distal DVT and specificity of 94.3% (92.5 to 95.8). Compression US alone had pooled sensitivity of 93.8% (92.0 to 95.3) for proximal DVT, 56.8% (49.0 to 66.4) for distal DVT and specificity of 97.8% (97.0 to 98.4). Sensitivity was higher in more recently published studies and in cohorts with a higher prevalence of DVT and more proximal DVT, and was lower in cohorts that reported interpretation by a radiologist. Specificity was higher in cohorts that excluded patients with previous DVT. No studies were identified that compared repeat US to venography in all patients. Repeat US appears to have a positive yield of 1.3%, with 89% of these being confirmed by venography.
Conclusion
Combined colour-Doppler US techniques have optimal sensitivity, while compression US has optimal specificity for DVT. However, all estimates are subject to substantial unexplained heterogeneity. The role of repeat scanning is very uncertain and based upon limited data.
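Pooled estimates like those above are typically produced by random-effects meta-analysis. A compact DerSimonian-Laird sketch on invented logit-scale sensitivities (not the review's data), pooled on the logit scale and back-transformed:

```python
import math

def logit(p):
    return math.log(p / (1.0 - p))

def dersimonian_laird(effects, variances):
    """Random-effects pooling with the DerSimonian-Laird tau^2 estimator."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)    # method-of-moments tau^2
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2

# Invented per-study sensitivities and logit-scale variances.
sens_values = [0.95, 0.93, 0.96, 0.90]
variances = [0.05, 0.04, 0.06, 0.05]
pooled_logit, tau2 = dersimonian_laird([logit(s) for s in sens_values], variances)
pooled_sens = 1.0 / (1.0 + math.exp(-pooled_logit))   # back-transform
print(f"pooled sensitivity {pooled_sens:.3f}, tau^2 {tau2:.3f}")
```

When the heterogeneity statistic Q exceeds its degrees of freedom, τ² comes out positive and widens the pooled confidence interval, which is the "substantial unexplained heterogeneity" caveat in the conclusion.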
Phospholipase C-eta enzymes as putative protein kinase C and Ca2+ signalling components in neuronal and neuroendocrine tissues
Phosphoinositol-specific phospholipase C enzymes (PLCs) are central to inositol lipid signalling pathways, facilitating intracellular Ca2+ release and protein kinase C activation. A sixth class of phosphoinositol-specific PLC with a novel domain structure, PLC-eta (PLCeta), has recently been discovered in mammals. Recent research, reviewed here, shows that this class consists of two enzymes, PLCeta1 and PLCeta2. Both enzymes hydrolyze phosphatidylinositol 4,5-bisphosphate, are more sensitive to Ca2+ than other PLC isozymes, and are likely to mediate G-protein-coupled receptor (GPCR) signalling pathways. Both enzymes are expressed in neuron-enriched regions, being abundant in the brain. We demonstrate that they are also expressed in neuroendocrine cell lines. PLCeta enzymes therefore represent novel proteins influencing intracellular Ca2+ dynamics and protein kinase C activation in the brain and neuroendocrine systems, acting as putative mediators of GPCR regulation.