
### Quintessence reconstructed: new constraints and tracker viability

We update and extend our previous work reconstructing the potential of a quintessence field from current observational data. We extend the cosmological data set to include new supernova data, plus information from the cosmic microwave background and from baryon acoustic oscillations. We extend the modeling by considering Padé approximant expansions as well as Taylor series, and by using observations to assess the viability of the tracker hypothesis. We find that parameter constraints have improved by a factor of 2, with a strengthening of the preference for the cosmological constant over evolving quintessence models. Present data show some signs, though inconclusive, of favoring tracker models over nontracker models under our assumptions.
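The Taylor-versus-Padé distinction the abstract mentions can be illustrated with a toy example. This is purely illustrative (it is not the paper's quintessence potential): both approximants below are built from the same low-order expansion coefficients of $e^x$, yet the rational Padé form can stay accurate in a regime where the truncated polynomial degrades badly.

```python
import math

def taylor4(x):
    """4th-order Taylor series of exp(x) about x = 0."""
    return sum(x**k / math.factorial(k) for k in range(5))

def pade22(x):
    """[2/2] Pade approximant of exp(x), built from the same expansion."""
    return (1 + x/2 + x**2 / 12) / (1 - x/2 + x**2 / 12)

x = -3.0
truth = math.exp(x)
# For decaying behaviour the rational form tracks exp(x) far better than
# the polynomial truncation carrying the same information:
print(f"exp({x})   = {truth:.4f}")
print(f"Taylor(4)  = {taylor4(x):.4f}  (error {abs(taylor4(x) - truth):.4f})")
print(f"Pade[2/2]  = {pade22(x):.4f}  (error {abs(pade22(x) - truth):.4f})")
```

This is the generic motivation for adding Padé expansions to a reconstruction: a rational approximant can capture asymptotic or pole-like behaviour that no finite Taylor polynomial reproduces.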

### Constraining models of $f(R)$ gravity with Planck and WiggleZ power spectrum data

In order to explain cosmic acceleration without invoking "dark" physics, we consider $f(R)$ modified gravity models, which replace the standard Einstein-Hilbert action in General Relativity with a higher-derivative theory. We use data from the WiggleZ Dark Energy Survey to probe the formation of structure on large scales, which can place tight constraints on these models. We combine the large-scale structure data with measurements of the cosmic microwave background from the Planck surveyor. After parameterising the modification of the action using the Compton wavelength parameter $B_0$, we constrain this parameter using ISiTGR, assuming an initially non-informative log prior probability distribution for this cross-over scale. We find that the addition of the WiggleZ power spectrum improves the constraints on $B_0$ by an order of magnitude over previous results, giving $\log_{10}(B_0) < -4.07$ at the 95% confidence limit. Finally, we test whether adding the lensing amplitude $A_{\rm Lens}$ and the sum of the neutrino masses $\sum m_\nu$ is able to reconcile current tensions present in these parameters, but find $f(R)$ gravity an inadequate explanation.

Comment: 21 pages, 7 figures, matches version published in JCAP. The modified version of ISiTGR used to produce the results in this paper is available at http://isit.g
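The "non-informative log prior" on $B_0$ can be sketched in a few lines: when a scale parameter's order of magnitude is unknown a priori, one samples $\log_{10} B_0$ uniformly, so every decade of $B_0$ carries equal prior mass. The prior range below is an assumption for illustration, not the range used in the paper, and this plain-Python sketch stands in for what ISiTGR does internally.

```python
import random

random.seed(0)
LOG10_B0_MIN, LOG10_B0_MAX = -10.0, 0.0   # assumed range, for illustration

def draw_log_prior():
    """Draw log10(B0) uniformly, i.e. B0 itself is log-uniform."""
    return random.uniform(LOG10_B0_MIN, LOG10_B0_MAX)

samples = [draw_log_prior() for _ in range(100_000)]

# Under this prior each decade of B0 is equally probable a priori,
# e.g. the decade [1e-5, 1e-4) should hold ~1/10 of the samples:
frac_per_decade = sum(1 for s in samples if -5.0 <= s < -4.0) / len(samples)
print(f"prior mass in decade [1e-5, 1e-4): {frac_per_decade:.3f}")
```

The design choice matters because a prior uniform in $B_0$ itself would concentrate essentially all prior mass in the top decade, prejudging the very scale the data are meant to determine.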

### Design Process and Organisational Strategy: A Storytelling Perspective

This paper explores the relationship between design process and organisational strategy through a storytelling perspective by providing a literature review: firstly, in relation to society in general, establishing a contextual background to the research; secondly, by relating this to a) how designers and design researchers examine storytelling within the design process, and b) how organisational strategists theorise storytelling; and finally, by comparing and contrasting the literature to unearth the relevance of using a storytelling perspective and uncover opportunities for understanding how design process impacts organisational strategy. It is apparent that certain underlying principles coexist in adopting a storytelling perspective when employing organisational strategy and design process. Foremost are the human-centred focuses, in particular building relationships and constructing identities. Concerning approaches to storytelling, a shared desire to elicit emotional resonance with audiences exists in the use of characterisation. During collaboration between designers and organisations, stories resulting from the design process will incontrovertibly have the potential to impact that company's people. Examining collaborations between designers and organisations from the perspective of storytelling could lead to a deeper understanding of the impact design can have in an organisation, particularly along the themes of a sense of community, constructing meaning and effecting change within organisations.

### The WiggleZ Dark Energy Survey: constraining the evolution of Newton's constant using the growth rate of structure

We constrain the evolution of Newton's constant using the growth rate of large-scale structure measured by the WiggleZ Dark Energy Survey in the redshift range $0.1 < z < 0.9$. We use these data in two ways. Firstly, we constrain the matter density of the Universe, $\Omega_{\rm m}$ (assuming General Relativity), and use this to construct a diagnostic to detect the presence of an evolving Newton's constant. Secondly, we directly measure the evolution of Newton's constant, $G_{\rm eff}$, that appears in Modified Gravity theories, without assuming General Relativity to be true. The novelty of these approaches is that, contrary to other methods, they do not require knowledge of the expansion history of the Universe, $H(z)$, making them model-independent tests. Our constraint on the second derivative of Newton's constant at the present day, assuming it is slowly evolving as suggested by Big Bang Nucleosynthesis constraints, is $\ddot{G}_{\rm eff}(t_0) = (-1.19 \pm 0.95) \times 10^{-20}\,h^2\,{\rm yr}^{-2}$ using the WiggleZ data, where $h$ is defined via $H_0 = 100\,h\,{\rm km\,s^{-1}\,Mpc^{-1}}$, and $\ddot{G}_{\rm eff}(t_0) = (-3.6 \pm 6.8) \times 10^{-21}\,h^2\,{\rm yr}^{-2}$ using both the WiggleZ and the Sloan Digital Sky Survey Luminous Red Galaxy (SDSS LRG) data, both being consistent with General Relativity. Finally, our constraint on the rms mass fluctuation $\sigma_8$ is $\sigma_8 = 0.75 \pm 0.08$ using the WiggleZ data and $\sigma_8 = 0.77 \pm 0.07$ using both the WiggleZ and the SDSS LRG data, both in good agreement with the latest measurements from the Cosmic Microwave Background radiation.

Comment: 15 pages, 5 figures, 4 tables, changes match the published version
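A back-of-envelope calculation shows what the quoted bound means physically. Assumptions of this sketch (not statements from the paper): the units of ${\rm yr}^{-2}$ suggest the quoted quantity is the second derivative normalised by the present value of $G_{\rm eff}$; $\dot{G}_{\rm eff}(t_0) \approx 0$ per the BBN argument; $h = 0.7$; and the Universe's age is taken as $1.38 \times 10^{10}$ yr. The implied fractional drift of $G$ over a time span $T$ is then roughly $\tfrac{1}{2} |\ddot{G}/G|\, T^2$.

```python
# Hedged order-of-magnitude estimate only; values below are assumptions.
h = 0.7                           # assumed dimensionless Hubble parameter
ddotG_over_G = 1.19e-20 * h**2    # yr^-2, WiggleZ-only central magnitude
T = 1.38e10                       # yr, approximate age of the Universe

# Quadratic drift with dot(G)(t0) ~ 0: dG/G ~ (1/2) |ddot(G)/G| T^2
frac_change = 0.5 * ddotG_over_G * T**2
print(f"|dG/G| accumulated over {T:.2e} yr ~ {frac_change:.2f}")
```

Under these assumptions the central value would allow an order-unity change of $G_{\rm eff}$ over a Hubble time, which is why the result is best read as a consistency check with General Relativity (the error bar comfortably includes zero) rather than a detection.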

### Combining Planck with Large Scale Structure gives strong neutrino mass constraint

We present the strongest current cosmological upper limit on the sum of neutrino masses, $\sum m_\nu < 0.18$ eV (95% confidence). It is obtained by adding observations of the large-scale matter power spectrum from the WiggleZ Dark Energy Survey to observations of the cosmic microwave background from the Planck surveyor, and measurements of the baryon acoustic oscillation scale. The limit is highly sensitive to the priors and assumptions about the neutrino scenario. We explore scenarios with neutrino masses close to the upper limit (degenerate masses), neutrino masses close to the lower limit where the hierarchy plays a role, and the addition of massive or massless sterile species.

Comment: 7 pages, 4 figures. Found bug in analysis which is fixed in v2. The resulting constraints on M_nu remain very strong. Additional info added on hierarchy
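Why the hierarchy matters near the lower limit can be made concrete: the oscillation mass splittings fix a minimum possible sum of masses, and that minimum differs between the normal and inverted orderings. The splitting values below are commonly quoted approximate magnitudes, used here as assumptions for illustration.

```python
import math

DM21_SQ = 7.5e-5   # eV^2, solar splitting (assumed approximate value)
DM31_SQ = 2.5e-3   # eV^2, atmospheric splitting magnitude (assumed)

def min_sum_normal():
    """Normal ordering, lightest state massless:
    m1 = 0, m2 = sqrt(dm21^2), m3 = sqrt(dm31^2)."""
    return math.sqrt(DM21_SQ) + math.sqrt(DM31_SQ)

def min_sum_inverted():
    """Inverted ordering, lightest state massless:
    m3 = 0, m1 = sqrt(dm31^2), m2 = sqrt(dm31^2 + dm21^2)."""
    return math.sqrt(DM31_SQ) + math.sqrt(DM31_SQ + DM21_SQ)

print(f"minimal sum, normal hierarchy:   {min_sum_normal():.3f} eV")
print(f"minimal sum, inverted hierarchy: {min_sum_inverted():.3f} eV")
```

With these inputs the floor is roughly 0.06 eV (normal) versus 0.10 eV (inverted), so a cosmological bound approaching 0.1 eV starts to discriminate between the two orderings, which is exactly the regime the abstract's "lower limit" scenarios probe.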

### Generating Travel Itineraries Based on User Interests

Generally, the present disclosure is directed to generating a travel itinerary for a user based on the user's interests. In particular, in some implementations, the systems and methods of the present disclosure can include or otherwise leverage one or more machine-learned models to predict the interests of a user and generate a travel itinerary based on user preferences and interests.

### Barbour's Black Douglas

A detailed discussion of the representation and characterization of Sir James Douglas (Black Douglas) in John Barbour's poem The Bruce, examining the ways in which Barbour's Douglas is shown not only as the flower of chivalry, but also as a Robin Hood-like denizen of the woods, and arguing that in the most highly colored Douglas episodes, Barbour feints toward the outrageous and transgressive, while also experimenting with his poem's literary structure to incorporate disruption or incursions from a disorderly non-courtly world.

### A Method for Population Segmentation by Event Based Machine Learning

The present disclosure provides systems and methods for automating the entire process of segmentation analytics for users of an application. More particularly, input data is received by a machine learning algorithm. The algorithm inspects the data to determine patterns. Upon discovering patterns of event information, the algorithm generates a list of segmentations with applicable characteristics. An application's users are grouped together, matching user characteristics with those shared by a particular segment. Over time the algorithm learns from an increased amount of data, generating more accurate and comprehensive user segmentations with increased use. Keywords associated with the present disclosure include: machine learning; neural network; events; users; sessions; segmentation; population; event based machine learning.
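The disclosure does not name a specific algorithm, so the following is a minimal sketch, assuming simple k-means over per-user event counts as a stand-in for "discovering patterns of event information" and grouping users into segments. The feature names and data are invented for illustration.

```python
import random

random.seed(1)

# Hypothetical per-user feature vectors: [sessions, purchases].
users = [[20, 9], [22, 8], [19, 10],   # heavy, purchasing users
         [3, 0], [2, 1], [4, 0]]       # light, browsing users

def kmeans(points, k, iters=10):
    """Plain k-means: assign each point to its nearest centre, then
    recompute each centre as the mean of its assigned points."""
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        centers = [[sum(col) / len(cl) for col in zip(*cl)] if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return clusters

segments = kmeans(users, k=2)
for i, seg in enumerate(segments):
    print(f"segment {i}: {seg}")
```

In the disclosure's terms, each resulting cluster is a "segmentation with applicable characteristics", and new users would be assigned to whichever segment's centre best matches their event profile; a production system would learn k and the features from data rather than fix them by hand.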
