26,147 research outputs found
Human-Centered Approaches in Geovisualization Design: Investigating Multiple Methods Through a Long-Term Case Study
Working with three domain specialists we investigate human-centered approaches to geovisualization following an
ISO 13407 taxonomy covering context of use, requirements and early stages of design. Our case study, undertaken over three years, draws attention to repeating trends: that generic approaches fail to elicit adequate requirements for geovis application design; that the use of real data is key to understanding needs and possibilities; and that trust and knowledge must be built and developed with collaborators. These processes take time, but modified human-centered approaches can be effective. A scenario developed through contextual inquiry but supplemented with domain data and graphics is useful to geovis designers. Wireframe, paper and digital prototypes enable successful communication between the specialist and geovis domains when they incorporate real and interesting data, prompting exploratory behaviour and eliciting previously unconsidered requirements. Paper prototypes are particularly successful at eliciting suggestions, especially for novel visualization. Enabling specialists to explore their data freely with a digital prototype is as effective as using a structured task protocol and is easier to administer. Autoethnography has potential for framing the design process. We conclude that a common understanding of context of use, domain data and visualization possibilities is essential to successful geovis design and develops as the design progresses. Human-centered approaches can make a significant contribution here; however, modified approaches, applied with flexibility, are most promising. We advise early, collaborative engagement with data – through simple, transient visual artefacts supported by data sketches and existing designs – before moving to successively more sophisticated data wireframes and data prototypes.
Information, information processing and gravity
I discuss fundamental limits placed on information and information processing
by gravity. Such limits arise because both information and its processing
require energy, while gravitational collapse (formation of a horizon or black
hole) restricts the amount of energy allowed in a finite region. Specifically,
I use a criterion for gravitational collapse called the hoop conjecture. Once
the hoop conjecture is assumed a number of results can be obtained directly:
the existence of a fundamental uncertainty in spatial distance of order the
Planck length, bounds on information (entropy) in a finite region, and a bound
on the rate of information processing in a finite region. In the final section
I discuss some cosmological issues related to the total amount of information
in the universe, and note that almost all detailed aspects of the late universe
are determined by the randomness of quantum outcomes. This paper is based on a
talk presented at a 2007 Bellairs Research Institute (McGill University)
workshop on black holes and quantum information.
Comment: 7 pages, 5 figures, revte
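The bounds sketched above can be illustrated numerically. The formulas below are standard order-of-magnitude Planck-scale estimates (a hoop-conjecture energy bound E ~ c⁴R/2G, a holographic entropy bound S ~ πR²/l_p², and a Margolus–Levitin processing-rate bound ~ 2E/πℏ), quoted up to O(1) factors; they follow the spirit of the abstract but are not necessarily the paper's exact expressions.

```python
import math

# Physical constants (SI units)
G    = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8        # speed of light, m/s
hbar = 1.055e-34      # reduced Planck constant, J s

# Planck length: the fundamental distance-uncertainty scale
l_p = math.sqrt(hbar * G / c**3)          # ~1.6e-35 m

def hoop_energy_bound(R):
    """Max energy in a region of radius R before horizon formation
    (hoop conjecture, up to O(1) factors): E ~ c^4 R / (2G)."""
    return c**4 * R / (2 * G)

def entropy_bound(R):
    """Holographic bound on entropy (in nats) in radius R: S ~ pi R^2 / l_p^2."""
    return math.pi * R**2 / l_p**2

def processing_rate_bound(R):
    """Margolus-Levitin bound on operations per second at the maximum
    energy the region can hold: rate ~ 2 E / (pi hbar)."""
    return 2 * hoop_energy_bound(R) / (math.pi * hbar)

R = 1.0  # a one-metre region
print(f"l_p       ~ {l_p:.2e} m")
print(f"E_max     ~ {hoop_energy_bound(R):.2e} J")
print(f"S_max     ~ {entropy_bound(R):.2e} nats")
print(f"max ops/s ~ {processing_rate_bound(R):.2e}")
```

For a one-metre region this gives roughly 10⁷⁰ nats of maximum entropy and around 10⁷⁷ operations per second, consistent with the abstract's point that gravitational collapse caps both storage and processing in a finite region.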
Understanding geovisualization users and their requirements: a user-centred approach
Mediating geovisualization to potential users and prototyping a geovisualization application
Using the Analytic Hierarchy Process to prioritise candidate improvements to a geovisualization application
X-ray Studies of Two Neutron Stars in 47 Tucanae: Toward Constraints on the Equation of State
We report spectral and variability analysis of two quiescent low mass X-ray
binaries (X5 and X7, previously detected with the ROSAT HRI) in a Chandra
ACIS-I observation of the globular cluster 47 Tuc. X5 demonstrates sharp
eclipses with an 8.666 ± 0.01 hr period, as well as dips showing an increased
N_H column. The thermal spectra of X5 and X7 are well-modeled by unmagnetized
hydrogen atmospheres of hot neutron stars. No hard power law component is
required. A possible edge or absorption feature is identified near 0.64 keV,
perhaps an OV edge from a hot wind. Spectral fits imply that X7 is
significantly more massive than the canonical 1.4 \Msun neutron star mass, with
M>1.8 \Msun for a radius range of 9-14 km, while X5's spectrum is consistent
with a neutron star of mass 1.4 \Msun for the same radius range. Alternatively,
if much of the X-ray luminosity is due to continuing accretion onto the neutron
star surface, the feature may be the 0.87 keV rest-frame absorption complex (O
VIII & other metal lines) intrinsic to the neutron star atmosphere, and a mass
of 1.4 \Msun for X7 may be allowed.Comment: 16 pages, 7 figures, accepted by Ap
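The alternative identification above can be checked with a back-of-the-envelope gravitational-redshift calculation. This is an illustration of the standard relation 1 + z = (1 − 2GM/Rc²)^(−1/2), not the authors' spectral fit: if the observed 0.64 keV feature were the 0.87 keV rest-frame complex seen redshifted from the stellar surface, the implied surface redshift fixes M for an assumed radius R.

```python
import math

G     = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c     = 2.998e8     # speed of light, m/s
M_sun = 1.989e30    # solar mass, kg

def mass_from_redshift(E_rest_keV, E_obs_keV, R_m):
    """Neutron-star mass implied if a line emitted at E_rest is observed
    at E_obs from radius R, via 1+z = E_rest/E_obs = (1 - 2GM/Rc^2)^(-1/2)."""
    ratio = (E_obs_keV / E_rest_keV) ** 2   # equals 1 - 2GM/(R c^2)
    return (1.0 - ratio) * R_m * c**2 / (2.0 * G)

M = mass_from_redshift(0.87, 0.64, 10e3)    # assume R = 10 km
print(f"implied mass ~ {M / M_sun:.2f} M_sun")
```

For R = 10 km this comes out near 1.55 solar masses, i.e. between the two mass regimes discussed in the abstract, which is why the choice of identification matters for the equation-of-state constraint.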
A Block Minorization--Maximization Algorithm for Heteroscedastic Regression
The computation of the maximum likelihood (ML) estimator for heteroscedastic
regression models is considered. The traditional Newton algorithms for the
problem require matrix multiplications and inversions, which are bottlenecks in
modern Big Data contexts. A new Big Data-appropriate minorization--maximization
(MM) algorithm is considered for the computation of the ML estimator. The MM
algorithm is proved to generate monotonically increasing sequences of
likelihood values and to be convergent to a stationary point of the
log-likelihood function. A distributed and parallel implementation of the MM
algorithm is presented and the MM algorithm is shown to have a different time
complexity from the Newton algorithm. Simulation studies demonstrate that the MM
algorithm improves upon the computation time of the Newton algorithm in some
practical scenarios where the number of observations is large.
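A minimal sketch of the blockwise idea, under assumptions of my own: take the common log-linear variance model y_i = β₀ + β₁x_i + ε_i with Var(ε_i) = exp(γ₀ + γ₁x_i), and alternate an exact weighted-least-squares update for the β block with damped Newton steps on the concave γ block. The backtracking line search keeps every update likelihood-nondecreasing, mirroring the monotonicity property the abstract ascribes to the MM algorithm, but this is generic block ascent, not the paper's specific minorizer; all names and the variance model are illustrative.

```python
import math, random

def solve2(A, b):
    """Solve a 2x2 linear system A t = b by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(A[1][1] * b[0] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

def loglik(x, y, beta, gamma):
    """Gaussian log-likelihood: mean b0+b1*x, log-variance g0+g1*x."""
    ll = 0.0
    for xi, yi in zip(x, y):
        lv = gamma[0] + gamma[1] * xi
        r = yi - beta[0] - beta[1] * xi
        ll -= 0.5 * (math.log(2 * math.pi) + lv + r * r * math.exp(-lv))
    return ll

def block_fit(x, y, outer=30, inner=5):
    beta, gamma = [0.0, 0.0], [0.0, 0.0]
    for _ in range(outer):
        # beta block: exact weighted least squares, weights = 1/variance
        w = [math.exp(-(gamma[0] + gamma[1] * xi)) for xi in x]
        A = [[sum(w), sum(wi * xi for wi, xi in zip(w, x))],
             [sum(wi * xi for wi, xi in zip(w, x)),
              sum(wi * xi * xi for wi, xi in zip(w, x))]]
        b = [sum(wi * yi for wi, yi in zip(w, y)),
             sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))]
        beta = solve2(A, b)
        # gamma block: damped Newton steps on the concave log-likelihood
        for _ in range(inner):
            grad, H = [0.0, 0.0], [[0.0, 0.0], [0.0, 0.0]]
            for xi, yi in zip(x, y):
                r2 = (yi - beta[0] - beta[1] * xi) ** 2
                e = r2 * math.exp(-(gamma[0] + gamma[1] * xi))
                g = 0.5 * (e - 1.0)          # score contribution
                grad[0] += g; grad[1] += g * xi
                h = 0.5 * e                  # negative-Hessian contribution
                H[0][0] += h; H[0][1] += h * xi
                H[1][0] += h * xi; H[1][1] += h * xi * xi
            step = solve2(H, grad)           # Newton direction (-Hess)^-1 grad
            t, base = 1.0, loglik(x, y, beta, gamma)
            # backtrack until the likelihood does not decrease
            while t > 1e-4 and loglik(x, y, beta,
                    [gamma[0] + t * step[0], gamma[1] + t * step[1]]) < base:
                t *= 0.5
            gamma = [gamma[0] + t * step[0], gamma[1] + t * step[1]]
    return beta, gamma

# Simulated heteroscedastic data: true beta = (1, 2), true gamma = (-1, 1)
random.seed(0)
x = [random.uniform(-1.0, 1.0) for _ in range(400)]
y = [1.0 + 2.0 * xi + random.gauss(0.0, math.exp(0.5 * (-1.0 + 1.0 * xi)))
     for xi in x]
beta, gamma = block_fit(x, y)
print("beta :", [round(v, 2) for v in beta])
print("gamma:", [round(v, 2) for v in gamma])
```

Each block update touches only small per-observation sums, which is also why this style of algorithm distributes naturally: the sums over observations can be accumulated in parallel shards and combined, avoiding the large matrix inversions of a full Newton step.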