Myocardial fibrosis in stroke survivors
Stroke survivors are most likely to die of cardiac causes, yet few undergo comprehensive cardiac assessment to look for reversible causes. Myocardial fibrosis (MF) is not only the hallmark of cardiomyopathy but also a substrate for sudden cardiac death, ventricular tachyarrhythmia and heart failure. Procollagen type I carboxy-terminal propeptide (PICP) has been identified as a marker of MF. The relationship between PICP and cardiac abnormalities in stroke survivors is unknown. We recently showed, in a randomised placebo-controlled cross-over study, that MF in stroke survivors can be treated with spironolactone and amiloride, with reductions in PICP levels and QTc [1]
Higher Accuracy for Bayesian and Frequentist Inference: Large Sample Theory for Small Sample Likelihood
Recent likelihood theory produces p-values that have remarkable accuracy and wide applicability. The calculations use familiar tools such as maximum likelihood values (MLEs), observed information and parameter rescaling. The usual evaluation of such p-values is by simulations, and such simulations do verify that the global distribution of the p-values is uniform(0, 1), to high accuracy in repeated sampling. The derivation of the p-values, however, asserts a stronger statement, that they have a uniform(0, 1) distribution conditionally, given identified precision information provided by the data. We take a simple regression example that involves exact precision information and use large sample techniques to extract highly accurate information as to the statistical position of the data point with respect to the parameter: specifically, we examine various p-values and Bayesian posterior survivor s-values for validity. With observed data we numerically evaluate the various p-values and s-values, and we also record the related general formulas. We then assess the numerical values for accuracy using Markov chain Monte Carlo (McMC) methods. We also propose some third-order likelihood-based procedures for obtaining means and variances of Bayesian posterior distributions, again followed by McMC assessment. Finally we propose some adaptive McMC methods to improve the simulation acceptance rates. All these methods are based on asymptotic analysis that derives from the effect of additional data. And the methods use simple calculations based on familiar maximizing values and related informations. The example illustrates the general formulas and the ease of calculations, while the McMC assessments demonstrate the numerical validity of the p-values as percentage position of a data point. The example, however, is very simple and transparent, and thus gives little indication that in a wide generality of models the formulas do accurately separate information for almost any parameter of interest, and then do give accurate p-value determinations from that information. As illustration an enigmatic problem in the literature is discussed and simulations are recorded; various examples in the literature are cited. Comment: Published at http://dx.doi.org/10.1214/07-STS240 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
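The acceptance-rate adaptation mentioned in this abstract can be made concrete with a short sketch. The following is a generic random-walk Metropolis sampler that tunes its proposal scale toward a target acceptance rate via a Robbins-Monro update; it illustrates adaptive McMC in general, not the paper's third-order likelihood-based procedures, and the toy target density, step-size schedule and 0.44 target rate are all assumptions for the example.

    import numpy as np

    def adaptive_metropolis(log_post, x0, n_iter=10000, target_accept=0.44):
        # Random-walk Metropolis whose proposal scale adapts so the
        # acceptance rate drifts toward target_accept (0.44 is a common
        # choice for one-dimensional targets). log_post is any
        # user-supplied log posterior density.
        rng = np.random.default_rng(0)
        x, lp_x = float(x0), log_post(x0)
        log_scale = 0.0  # log of the proposal standard deviation
        samples = np.empty(n_iter)
        for i in range(n_iter):
            prop = x + np.exp(log_scale) * rng.standard_normal()
            lp_prop = log_post(prop)
            accepted = np.log(rng.uniform()) < lp_prop - lp_x
            if accepted:
                x, lp_x = prop, lp_prop
            # Robbins-Monro update: grow the step after acceptances,
            # shrink it after rejections, with a decaying gain so the
            # adaptation settles down.
            log_scale += (accepted - target_accept) / (i + 1) ** 0.6
            samples[i] = x
        return samples

    # Toy check: posterior mean and variance of a standard normal target.
    draws = adaptive_metropolis(lambda t: -0.5 * t * t, x0=3.0)
    print(draws[2000:].mean(), draws[2000:].var())

Discarding the first 2,000 draws as burn-in before summarizing mirrors the abstract's use of McMC output to check posterior means and variances.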
A Practitioner’s Approach to Drucker’s Knowledge-Worker Productivity in the 21st Century: A New Model (Part One)
This article examines productivity in the context of the 21st century, focusing on Drucker’s prophecy of knowledge-worker productivity, the power of ‘unified strategy’, organisational interdependence and a practitioner’s approach to knowledge-worker productivity based on Drucker’s six determining factors. From these six factors, three instruments are developed: a nine-building-block questionnaire survey to establish knowledge-worker productivity readiness, a knowledge-worker review session to plan towards organisational interdependence, and a Drucker-based knowledge-worker productivity implementation framework to manage organisational change. The proposal is intended for business organisations but should also accommodate non-business organisations. Knowledge-worker productivity practice is designed to improve productivity and the quality of work, and to empower knowledge workers to accomplish their ‘tasks’ and, consequently, the ‘organisation tasks’ by following an organisational ‘unified strategy’ in an interdependent way that brings about a ‘doing the right thing, the right way’ approach. This article addresses what organisations can do, and how, to enhance productivity from their knowledge workers, embrace creativity and develop innovation for strategic advantage in sustaining growth in the current new economy of global competition. Team commitment is envisaged through the concept of organisational interdependence. In conclusion, a Drucker-based knowledge-worker productivity implementation framework is proposed as a management practice to enhance knowledge-worker productivity for creativity and commitment. It further demonstrates its competitive power by achieving a unified strategy, with implications for organisational change and future applications. Keywords: knowledge worker; creativity; commitment; productivity; change management; organisational interdependence; unified strategy.
Improved turbine disk design to increase reliability of aircraft jet engines
An analytical study was conducted on a bore entry cooled turbine disk for the first stage of the JT8D-17 high pressure turbine, which had the potential to improve disk life over the existing design. The disk analysis included transient and steady-state temperatures, blade loading, creep, low-cycle fatigue, fracture mechanics and manufacturing flaws. The improvement in life of the bore entry cooled turbine disk was determined by comparing it with the existing disk made of both conventional (Waspaloy) and advanced (Astroloy) disk materials. The improvement in crack initiation life of the Astroloy bore entry cooled disk is 87% and 67% over the existing disk made of Waspaloy and Astroloy, respectively. The improvement in crack propagation life is 124% over the Waspaloy and 465% over the Astroloy disks. The available kinetic energies of disk fragments calculated for the three disks indicate a lower fragment energy level for the bore entry cooled turbine disk.
Tidal stability of giant molecular clouds in the Large Magellanic Cloud
Star formation does not occur until the onset of gravitational collapse
inside giant molecular clouds. However, the conditions that initiate cloud
collapse and regulate the star formation process remain poorly understood.
Local processes such as turbulence and magnetic fields can act to promote or
prevent collapse. On larger scales, the galactic potential can also influence
cloud stability and is traditionally assessed by the tidal and shear effects.
In this paper, we examine the stability of giant molecular clouds (GMCs) in the
Large Magellanic Cloud (LMC) against shear and the galactic tide using CO data
from the Magellanic Mopra Assessment (MAGMA) and rotation curve data from the
literature. We calculate the tidal acceleration experienced by individual GMCs
and determine the minimum cloud mass required for tidal stability. We also
calculate the shear parameter, which is a measure of a cloud's susceptibility to
disruption via shearing forces in the galactic disk. We examine whether there
are correlations between the properties and star forming activity of GMCs and
their stability against shear and tidal disruption. We find that the GMCs are
in approximate tidal balance in the LMC, and that shear is unlikely to affect
their further evolution. GMCs with masses close to the minimum stable mass
against tidal disruption are not unusual in terms of their mass, location, or
CO brightness, but we note that GMCs with large velocity dispersion tend to be
more sensitive to tidal instability. We also note that GMCs with smaller radii,
which represent the majority of our sample, tend to more strongly resist tidal
and shear disruption. Our results demonstrate that star formation in the LMC is
not inhibited by tidal or shear instability. Comment: 18 pages, 10 figures, accepted in PAS
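The minimum-stable-mass calculation described in this abstract can be illustrated with a back-of-the-envelope version of the tidal criterion. The sketch below treats the galactic mass interior to the cloud's orbit as a point mass inferred from a flat rotation curve, a deliberate simplification of the rotation-curve-based tidal field used in the paper; the 20 pc radius, 2 kpc galactocentric distance and 60 km/s circular velocity in the example are assumed values, not MAGMA measurements.

    G = 4.302e-3  # gravitational constant in pc (km/s)^2 / Msun

    def minimum_tidal_mass(r_cloud_pc, R_gal_pc, v_circ_kms):
        # Enclosed dynamical mass from the rotation curve: M = v^2 R / G.
        m_gal = v_circ_kms ** 2 * R_gal_pc / G
        # A cloud is tidally stable when self-gravity at its edge beats
        # the differential pull of the enclosed (point) mass:
        #   G M_cl / r^2 >= 2 G M_gal r / R^3  =>  M_cl >= 2 M_gal (r/R)^3
        return 2.0 * m_gal * (r_cloud_pc / R_gal_pc) ** 3

    # Assumed example: a 20 pc cloud orbiting 2 kpc from the LMC centre
    # on a flat 60 km/s rotation curve.
    print(f"{minimum_tidal_mass(20.0, 2000.0, 60.0):.2e} Msun")

Clouds less massive than the returned threshold would be pulled apart by the tide; comparing such thresholds with observed GMC masses is the essence of the stability test described above.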
Axion hot dark matter bounds
We derive cosmological limits on two-component hot dark matter consisting of
neutrinos and axions. We restrict the large-scale structure data to the safely
linear regime, excluding the Lyman-alpha forest. We derive Bayesian credible
regions in the two-parameter space consisting of m_a and sum(m_nu).
Marginalizing over sum(m_nu) provides m_a < 1.02 eV (95% CL). In the absence of axions the same data and methods give sum(m_nu) < 0.63 eV (95% CL). Comment: Contribution to Proc. 4th Patras Workshop on Axions, WIMPs and WISPs (18-21 June 2008, DESY)
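The marginalization step behind a bound like m_a < 1.02 eV can be sketched on a toy gridded posterior: integrate out sum(m_nu), form the cumulative distribution in m_a, and read off the 95% quantile. The Gaussian toy posterior, grid ranges and widths below are invented for illustration and stand in for the real chains and cosmological likelihood.

    import numpy as np

    def upper_bound_95(post_2d, param_grid, nuisance_axis):
        # Integrate out the nuisance parameter on the grid, then invert
        # the normalized cumulative posterior at probability 0.95.
        marginal = post_2d.sum(axis=nuisance_axis)
        cdf = np.cumsum(marginal) / marginal.sum()
        return np.interp(0.95, cdf, param_grid)

    # Toy posterior grid: m_a in [0, 2] eV, sum(m_nu) in [0, 1.5] eV.
    m_a = np.linspace(0.0, 2.0, 400)
    m_nu = np.linspace(0.0, 1.5, 300)
    A, N = np.meshgrid(m_a, m_nu, indexing="ij")
    post = np.exp(-0.5 * ((A / 0.5) ** 2 + (N / 0.3) ** 2))  # fake likelihood
    print(f"m_a < {upper_bound_95(post, m_a, nuisance_axis=1):.2f} eV (95% CL)")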
A Modified "Bottom-up" Thermalization in Heavy Ion Collisions
In the initial stage of the bottom-up picture of thermalization in heavy ion
collisions, the gluon distribution is highly anisotropic which can give rise to
plasma instability. This was not taken into account in the original paper. It
is shown that in the presence of instability there are scaling solutions, which
depend on one parameter, that match smoothly onto the late stage of bottom-up
when thermalization takes place. Comment: 8 pages and 1 embedded figure, talk presented at the Workshop on
"Quark-Gluon Plasma Thermalization", Vienna, Austria, 10-12 August 200
On number fields with nontrivial subfields
What is the probability for a number field of composite degree to have a nontrivial subfield? As the reader might expect, the answer depends heavily on the interpretation of probability. We show that if the fields are enumerated by the smallest height of their generators, the probability is zero, at least if . This is in contrast to what one expects when the fields are enumerated by the discriminant. The main result of this article is an estimate for the number of algebraic numbers of degree and bounded height which generate a field that contains an unspecified subfield of degree . If we get the correct asymptotics as the height tends to infinity.