Shrinking binary and planetary orbits by Kozai cycles with tidal friction
At least two arguments suggest that the orbits of a large fraction of binary
stars and extrasolar planets shrank by 1-2 orders of magnitude after formation:
(i) the physical radius of a star shrinks by a large factor from birth to the
main sequence, yet many main-sequence stars have companions orbiting only a few
stellar radii away, and (ii) in current theories of planet formation, the
region within ~0.1 AU of a protostar is too hot and rarefied for a Jupiter-mass
planet to form, yet many "hot Jupiters" are observed at such distances. We
investigate orbital shrinkage by the combined effects of secular perturbations
from a distant companion star (Kozai oscillations) and tidal friction. We
integrate the relevant equations of motion to predict the distribution of
orbital elements produced by this process. Binary stars with orbital periods of
0.1 to 10 days, with a median of ~2 d, are produced from binaries with much
longer periods (10 d to 10^5 d), consistent with observations indicating that
most or all short-period binaries have distant companions (tertiaries). We also
make two new testable predictions: (1) For periods between 3 and 10 d, the
distribution of the mutual inclination between the inner binary and the
tertiary orbit should peak strongly near 40 deg and 140 deg. (2) Extrasolar
planets whose host stars have a distant binary companion may also undergo this
process, in which case the orbit of the resulting hot Jupiter will typically be
misaligned with the equator of its host star.
Comment: Submitted to ApJ; 18 pages, 10 figures
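A standard quadrupole-order relation may make the predicted peaks concrete (this is textbook Kozai theory, not an equation quoted from the abstract): the cycles conserve the projection of the inner orbit's angular momentum on the outer orbital axis, and they operate only between two critical inclinations,

  \sqrt{1 - e^2}\,\cos i = \mathrm{const}, \qquad
  i_{\mathrm{crit}} = \arccos\sqrt{3/5} \approx 39.2^\circ, \quad
  180^\circ - i_{\mathrm{crit}} \approx 140.8^\circ.

As tidal friction shrinks and circularizes the inner orbit, the oscillation shuts off near these boundaries, consistent with the pile-up of mutual inclinations near 40 deg and 140 deg predicted above.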
Observability of the General Relativistic Precession of Periastra in Exoplanets
The general relativistic precession rate of periastra in close-in exoplanets
can be orders of magnitude larger than that of Mercury. The realization that
some close-in exoplanets have significant
eccentricities raises the possibility that this precession might be detectable.
In this work we explore the observability of the precession of periastra using
radial velocity and transit light curve observations. Our analysis is
independent of the source of precession, which can also have significant
contributions due to additional planets and tidal deformations. We find that
precession of the periastra of the magnitude expected from general relativity
can be detectable in timescales of <~ 10 years with current observational
capabilities by measuring the change in the primary transit duration or in the
time difference between primary and secondary transits. Radial velocity curves
alone would be able to detect this precession for super-massive, close-in
exoplanets orbiting inactive stars, given ~100 data points at each of two
epochs separated by ~20 years. We show that the contribution to the precession
by tidal deformations may dominate the total precession in cases where the
relativistic precession is detectable. Studies of transit durations with Kepler
might need to take into account effects arising from the general relativistic
and tidally induced precession of periastra for systems containing close-in,
eccentric exoplanets. Such studies may be able to detect additional planets
with masses comparable to that of Earth by detecting secular variations in the
transit duration induced by the changing longitude of periastron.
Comment: 13 pages, 5 figures. Accepted for publication in ApJ
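For orientation, the leading-order general relativistic periastron advance per orbital revolution is the textbook expression (not quoted from the abstract)

  \Delta\omega_{\mathrm{GR}} = \frac{6\pi G M_*}{c^2\, a\, (1 - e^2)},

which grows steeply as the semimajor axis a shrinks; this is why close-in, eccentric exoplanets can precess orders of magnitude faster than Mercury.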
On a new observable for measuring the Lense-Thirring effect with Satellite Laser Ranging
In this paper we present a rather extensive error budget for the difference
of the perigees of a pair of supplementary SLR satellites aimed at detecting
the Lense-Thirring effect.
Comment: LaTeX2e, 14 pages, 1 table, no figures. Some changes and additions to
the abstract, Introduction and Conclusions. References updated, typos
corrected. Equation corrected. To appear in General Relativity and
Gravitation
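The use of supplementary satellites can be motivated by the standard Lense-Thirring perigee rate (a textbook expression, not quoted from the abstract),

  \dot{\omega}_{\mathrm{LT}} = -\frac{6\, G J \cos i}{c^2\, a^3\, (1 - e^2)^{3/2}},

where J is Earth's proper angular momentum. For two satellites with supplementary inclinations (i and 180 deg - i), cos i flips sign, so the relativistic perigee rates are opposite, while the dominant classical even-zonal rates, which depend on cos^2 i, are equal; the difference of the perigees therefore doubles the Lense-Thirring signal while cancelling the leading classical systematics.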
An exploratory study looking at the relationship marketing techniques used in the music festival industry
Current issues and trends in the music festival market may affect the
success of an event, and market saturation is at the forefront of these
issues. Previous literature, maintaining the need for a marketing approach
to festivals, identifies the need for strong stakeholder relationships in
order to succeed in a business environment; attention has been focused on
the theory of relationship marketing (RM) because this practice is
recognised as complementary to the marketing of festivals. The very nature
of the music festival as an annual, usually four-day, event means that
effective marketing is needed to maintain connections with the consumer
throughout the year. This article focuses on the RM techniques used within
the music festival industry from the viewpoint of the festival organiser,
in an attempt to establish how festival organisations value and monitor
organisational relationships. It explores the extent to which these
relationships are valued and managed; furthermore, the variations between
these intricate relationships are considered by focusing on those held
with the organisation's consumers and sponsors. The results establish the
importance and relevance of RM to the industry and identify the marketing
communication methods employed to build and maintain such relationships.
In-depth, convergent interviews were conducted with a segment of music
festival organisers from a range of events, and the results were
integrated with the current literature to best exemplify these issues. It
is established that RM has a strong role in today's commercial and
independent music festival industry; technological advances are enabling
organisers to further support online relationships and increase consumer
loyalty. There is a need to expand this research because of the complexity
of organisational relationships and the varying categories of festivals.
High current pulsed positron microprobe
We are developing a low-energy, microscopically focused, pulsed positron beam for defect analysis by positron lifetime spectroscopy, providing a new capability at the 10^10 e^+ s^-1 beam at the Lawrence Livermore National Laboratory electron linac. When completed, the pulsed positron microprobe will enable defect-specific, three-dimensional maps of defect concentrations with sub-micron resolution of defect location. By coupling these data with first-principles calculations of defect-specific positron lifetimes and positron implantation profiles, we will map both the identity and the concentration of defect distributions.
Real-space local polynomial basis for solid-state electronic-structure calculations: A finite-element approach
We present an approach to solid-state electronic-structure calculations based
on the finite-element method. In this method, the basis functions are strictly
local, piecewise polynomials. Because the basis is composed of polynomials, the
method is completely general and its convergence can be controlled
systematically. Because the basis functions are strictly local in real space,
the method allows for variable resolution in real space; produces sparse,
structured matrices, enabling the effective use of iterative solution methods;
and is well suited to parallel implementation. The method thus combines the
significant advantages of both real-space-grid and basis-oriented approaches
and so promises to be particularly well suited for large, accurate ab initio
calculations. We develop the theory of our approach in detail, discuss
advantages and disadvantages, and report initial results, including the first
fully three-dimensional electronic band structures calculated by the method.
Comment: replacement: single spaced, included figures, added journal reference
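The following minimal Python sketch (our illustration under stated assumptions, not code from the paper, which treats the full three-dimensional solid-state problem) shows the ingredients the abstract describes: a strictly local piecewise-polynomial basis whose element integrals produce sparse, structured matrices, with convergence controlled systematically by mesh refinement, here for a one-dimensional model eigenproblem.

# A 1D sketch of a strictly local, piecewise-linear finite-element basis:
# hat functions on a uniform mesh yield tridiagonal (sparse, structured)
# matrices for the model eigenproblem  -1/2 u'' + V(x) u = E u  on [0, L]
# with u(0) = u(L) = 0.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

L, n = 10.0, 200                        # box length, number of elements
h = L / n                               # uniform element size
x = np.linspace(0.0, L, n + 1)[1:-1]    # interior nodes (Dirichlet BCs)
m = n - 1                               # number of basis functions

# Element integrals for piecewise-linear hat functions phi_i:
#   overlap  S_ij = int phi_i phi_j dx
#   kinetic  T_ij = (1/2) int phi_i' phi_j' dx
S = diags([h/6 * np.ones(m-1), 2*h/3 * np.ones(m), h/6 * np.ones(m-1)],
          [-1, 0, 1])
T = 0.5 * diags([-np.ones(m-1)/h, 2*np.ones(m)/h, -np.ones(m-1)/h],
                [-1, 0, 1])

# Harmonic model potential, mass-lumped at the nodes: V_ii ~ V(x_i) * h.
V = diags(0.5 * (x - L/2)**2 * h)

# Sparse generalized eigenproblem (T + V) c = E S c; lowest four states.
E, C = eigsh((T + V).tocsc(), k=4, M=S.tocsc(), sigma=0.0)
print(np.sort(E))   # ~0.5, 1.5, 2.5, 3.5; errors shrink as n grows

Refining the mesh (larger n) drives the computed eigenvalues toward the exact harmonic-oscillator levels, illustrating the systematic convergence control the abstract highlights.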
Metformin: historical overview
Metformin (dimethylbiguanide) has become the preferred first-line oral blood glucose-lowering agent to manage type 2 diabetes. Its history is linked to Galega officinalis (also known as goat's rue), a traditional herbal medicine in Europe, found to be rich in guanidine, which, in 1918, was shown to lower blood glucose. Guanidine derivatives, including metformin, were synthesised and some (not metformin) were used to treat diabetes in the 1920s and 1930s but were discontinued due to toxicity and the increased availability of insulin. Metformin was rediscovered in the search for antimalarial agents in the 1940s and, during clinical tests, proved useful in treating influenza, when it was noted sometimes to lower blood glucose. This property was pursued by the French physician Jean Sterne, who first reported the use of metformin to treat diabetes in 1957. However, metformin received limited attention as it was less potent than other glucose-lowering biguanides (phenformin and buformin), which were generally discontinued in the late 1970s due to the high risk of lactic acidosis. Metformin's future was precarious, its reputation tarnished by association with other biguanides despite evident differences. The ability of metformin to counter insulin resistance and address adult-onset hyperglycaemia without weight gain or increased risk of hypoglycaemia gradually gathered credence in Europe, and after intensive scrutiny metformin was introduced into the USA in 1995. Long-term cardiovascular benefits of metformin were identified by the UK Prospective Diabetes Study (UKPDS) in 1998, providing a new rationale to adopt metformin as initial therapy to manage hyperglycaemia in type 2 diabetes. Sixty years after its introduction in diabetes treatment, metformin has become the most prescribed glucose-lowering medicine worldwide, with the potential for further therapeutic applications.
Stellar structure and compact objects before 1940: Towards relativistic astrophysics
Since the mid-1920s, different strands of research used stars as "physics
laboratories" for investigating the nature of matter under extreme densities
and pressures impossible to realize on Earth. To trace this process, this
paper follows the evolution of the concept of a dense core in stars, which was
important both for an understanding of stellar evolution and as a testing
ground for the fast-evolving field of nuclear physics. In spite of the divide
between physicists and astrophysicists, some key actors working in the
cross-fertilized soil of overlapping but different scientific cultures
formulated models and tentative theories that gradually evolved into more
realistic and structured astrophysical objects. These investigations culminated
in the first contact with general relativity in 1939, when J. Robert
Oppenheimer and his students George Volkoff and Hartland Snyder systematically
applied the theory to the dense core of a collapsing neutron star. This
pioneering application of Einstein's theory to an astrophysical compact object
can be regarded as a milestone in the path eventually leading to the emergence
of relativistic astrophysics in the early 1960s.
Comment: 83 pages, 4 figures, submitted to the European Physical Journal
Artificial Stupidity
Public debate about AI is dominated by Frankenstein Syndrome, the fear that AI will become superhuman and escape human control. Although superintelligence is certainly a possibility, the interest it excites can distract the public from a more imminent concern: the rise of Artificial Stupidity (AS). This article discusses the roots of Frankenstein Syndrome in Mary Shelley's famous novel of 1818. It then provides a philosophical framework for analysing the stupidity of artificial agents, demonstrating that modern intelligent systems can be seen to suffer from 'stupidity of judgement'. Finally, it identifies an alternative literary tradition that exposes the perils and benefits of AS. In the writings of Edmund Spenser, Jonathan Swift and E.T.A. Hoffmann, ASs replace, enslave or delude their human users. More optimistically, Joseph Furphy and Laurence Sterne imagine ASs that can serve human intellect as maps or as pipes. These writers provide a strong counternarrative to the myths that currently drive the AI debate. They identify ways in which even stupid artificial agents can evade human control, for instance by appealing to stereotypes or distancing us from reality. And they underscore the continuing importance of the literary imagination in an increasingly automated society.