Linked data in BGS and its potential in model fusion
The British Geological Survey has been conducting a pilot project into the use
of Linked Data. Linked Data is a best practice for using the web to expose,
share and connect pieces of data, information and knowledge. It facilitates
connections between previously unrelated data, and lowers the barriers to
linking data currently linked using other methods. In essence, linked data involves publishing snippets of information as
independent ‘triples’, made up of a subject, a predicate and an object. A
subject is referenced by a URI and can represent any resource: a person,
organisation, concept, dataset, model, application etc. A predicate is a
property or relationship assigned to the subject, and is also referenced as a
URI. An object is the value of the property or object of the relationship; this
may be a resource referenced as a URI or a literal value such as a number or
text string.
Data linkages come about because anyone can publish a statement about
anyone else’s resources, and resource URIs for subjects and objects can be
matched up.
Data linkages are also enhanced because anyone can (and should where
possible) re-use anyone else’s predicates, thereby using a common language
to describe information.
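The triple model and the two linkage mechanisms described above can be illustrated with a minimal sketch in plain Python. The URIs, labels and helper function here are illustrative assumptions, not BGS's published data or tooling; `rdfs:label` and `owl:sameAs` are the re-used common predicates.

```python
# Minimal sketch of the triple model: plain Python tuples of
# (subject, predicate, object). All URIs and labels are illustrative,
# not BGS's actual published data.

bgs_triples = [
    # One publisher states a label for its rock-unit resource,
    # re-using the shared rdfs:label predicate.
    ("http://data.bgs.ac.uk/id/Lexicon/NamedRockUnit/CHLK",
     "http://www.w3.org/2000/01/rdf-schema#label",
     "Chalk Group"),
]

dbpedia_triples = [
    # A second, independent publisher links its own resource to the
    # first publisher's URI (owl:sameAs)...
    ("http://dbpedia.org/resource/Chalk_Group",
     "http://www.w3.org/2002/07/owl#sameAs",
     "http://data.bgs.ac.uk/id/Lexicon/NamedRockUnit/CHLK"),
    # ...and states its own label for the same real-world thing.
    ("http://dbpedia.org/resource/Chalk_Group",
     "http://www.w3.org/2000/01/rdf-schema#label",
     "The Chalk"),
]

def merged_labels(*triple_sets):
    """Collect labels per resource, following owl:sameAs links so that
    statements made by independent publishers are merged."""
    triples = [t for ts in triple_sets for t in ts]
    same_as = {s: o for s, p, o in triples if p.endswith("owl#sameAs")}
    labels = {}
    for s, p, o in triples:
        if p.endswith("rdf-schema#label"):
            canonical = same_as.get(s, s)
            labels.setdefault(canonical, set()).add(o)
    return labels
```

Because both publishers use resolvable URIs and shared predicates, `merged_labels` needs no dataset-specific knowledge to combine the two statements.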
BGS’s pilot project is about to publish three of our major vocabularies
(Lexicon of Named Rock Units, Geochronological timescale, Rock
Classification Scheme) and our 625k 2D geological map in linked data form.
We have added links between our resources and those defined in external
linked data sources where possible, including DBPedia (a linked data version
of Wikipedia), the Ordnance Survey and the BBC Wildlife Finder website.
Further work is necessary to improve the links to parallel vocabulary schemes
defined by international organisations.
The benefit of linked data is that, rather than an end-user having to do
investigative work to uncover the syntax and semantics of disparate datasets
in order to integrate them, data published according to the Linked Data
recommendations provides this information up front in an unambiguous and
instantly available form. The user will have all the information at hand to
integrate the data in a logical and scientifically valid way.
This presentation will speculate as to how this approach may be applied to
enable models to communicate and exchange information at run-time, for
example using an interoperable vocabulary for physical properties, spatial and
temporal dimensions and methodologies. Linked data can also be used to
describe a common vocabulary for model parameters and the relationships
and dependencies between them, thereby exposing feedback mechanisms
between separate models or algorithms.
Design and implementation of the node identity internetworking architecture
The Internet Protocol (IP) has proven very flexible, being able to accommodate all kinds of link technologies and supporting a broad range of applications. The basic principles of the original Internet architecture include end-to-end addressing, global routeability and a single namespace of IP addresses that unintentionally serves as both locators and host identifiers. The commercial success and widespread use of the Internet have led to new requirements, which include internetworking over business boundaries, mobility and multi-homing in an untrusted environment. Our approach to satisfying these new requirements is to introduce a new internetworking layer, the node identity layer. Such a layer runs on top of the different versions of IP, but could also run directly on top of other kinds of network technologies, such as MPLS and 2G/3G PDP contexts. This approach enables connectivity across different communication technologies, and supports mobility, multi-homing, and security from the ground up. This paper describes the Node Identity Architecture in detail and discusses the experiences from implementing and running a prototype.
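The core idea of a layer that separates stable identities from changeable locators can be sketched briefly. This is a hedged illustration of the general locator/identifier split, not the paper's actual implementation; the class, method names and example addresses are invented.

```python
# Sketch of a locator/identifier split: a stable node identity is bound
# to one or more interchangeable locators. Illustrative only.
import hashlib

class NodeIdentityLayer:
    """Maps stable node identities to one or more changeable locators."""

    def __init__(self):
        self.bindings = {}  # node identity -> ordered list of locators

    @staticmethod
    def identity_from_key(public_key: bytes) -> str:
        # A cryptographic identity derived from a public key, verifiable
        # independently of any IP address or other locator.
        return hashlib.sha256(public_key).hexdigest()[:32]

    def register(self, identity: str, locator: str) -> None:
        # Multi-homing: one identity may hold several locators at once
        # (an IPv4 address, an IPv6 address, an MPLS label, ...).
        self.bindings.setdefault(identity, []).append(locator)

    def handover(self, identity: str, old: str, new: str) -> None:
        # Mobility: swap a locator in place; sessions bound to the
        # identity are undisturbed by the change of attachment point.
        locs = self.bindings[identity]
        locs[locs.index(old)] = new

    def resolve(self, identity: str) -> list:
        return self.bindings.get(identity, [])

# Usage: a dual-homed node moves its IPv4 attachment point.
layer = NodeIdentityLayer()
node = NodeIdentityLayer.identity_from_key(b"example-public-key")
layer.register(node, "192.0.2.10")
layer.register(node, "2001:db8::10")
layer.handover(node, "192.0.2.10", "198.51.100.7")
```

Upper layers address the node by its identity, so the handover above is invisible to ongoing sessions; only the resolution step changes.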
The dynamics of iterated transportation simulations
Iterating between a router and a traffic micro-simulation is an increasingly
accepted method for doing traffic assignment. This paper, after pointing out
that the analytical theory of simulation-based assignment to-date is
insufficient for some practical cases, presents results of simulation studies
from a real world study. Specifically, we look into the issues of uniqueness,
variability, robustness, and validation. Regarding uniqueness, despite some
cautionary notes from a theoretical point of view, we find no indication of
``meta-stable'' states for the iterations. Variability, however, is considerable.
By variability we mean the variation of the simulation of a given plan set by
just changing the random seed. We show then results from three different
micro-simulations under the same iteration scenario in order to test for the
robustness of the results under different implementations. We find the results
encouraging, including when compared with reality and with a traditional
assignment result.
Keywords: dynamic traffic assignment (DTA); traffic micro-simulation;
TRANSIMS; large-scale simulations; urban planning
Comment: 24 pages, 7 figures.
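The router/micro-simulation iteration the abstract describes can be sketched in a few lines. The toy two-link network, the congestion cost curve and the 10% re-planning fraction below are illustrative assumptions, not the TRANSIMS implementation or the paper's actual study network.

```python
# Sketch of iterative traffic assignment: alternate between a router
# (plan generation) and a micro-simulation (plan evaluation), re-planning
# a fraction of travellers each round. Toy network, illustrative only.
import random

def route(link_times, travellers):
    # Each traveller (re)plans onto the currently cheapest link.
    return {t: min(link_times, key=link_times.get) for t in travellers}

def microsimulate(plans, capacity=60.0, free_time=10.0):
    # Congested travel time grows with link load (a BPR-like cost curve).
    load = {"A": 0, "B": 0}
    for link in plans.values():
        load[link] += 1
    return {l: free_time * (1.0 + (load[l] / capacity) ** 2) for l in load}

random.seed(0)  # the random seed is exactly the variability knob discussed above
travellers = list(range(100))
plans = route({"A": 10.0, "B": 10.0}, travellers)  # initial all-or-nothing plans
for _ in range(50):
    link_times = microsimulate(plans)              # evaluate current plans
    for t in random.sample(travellers, 10):        # re-plan 10% of travellers
        plans[t] = route(link_times, [t])[t]
```

Starting from everyone on one link, the iterations relax toward a roughly even split; re-running with a different seed gives a different final plan set, which is the variability the paper measures.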
The Optimisation of Stochastic Grammars to Enable Cost-Effective Probabilistic Structural Testing
The effectiveness of probabilistic structural testing depends on the characteristics of the probability distribution from which test inputs are sampled at random. Metaheuristic search has been shown to be a practical method of optimising the characteristics of such distributions. However, the applicability of the existing search-based algorithm is limited by the requirement that the software’s inputs must be a fixed number of numeric values. In this paper we relax this limitation by means of a new representation for the probability distribution. The representation is based on stochastic context-free grammars but incorporates two novel extensions: conditional production weights and the aggregation of terminal symbols representing numeric values. We demonstrate that an algorithm which combines the new representation with hill-climbing search is able to efficiently derive probability distributions suitable for testing software with structurally-complex input domains.
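Sampling test inputs from a weighted stochastic grammar can be sketched as follows. This shows plain weighted-production sampling and an aggregated numeric terminal; the paper's conditional production weights and the hill-climbing optimisation of the weights are omitted, and the grammar and weight values are invented.

```python
# Sketch of sampling from a stochastic context-free grammar.
# Each nonterminal maps to (weight, right-hand side) productions;
# "digits" is an aggregated terminal covering a whole numeric range
# rather than a per-digit set of rules. Illustrative only.
import random

GRAMMAR = {
    "LIST": [(0.4, ["NUM"]), (0.6, ["NUM", ",", "LIST"])],
    "NUM":  [(1.0, ["digits"])],
}

def sample(symbol, rng, depth=0):
    if symbol == "digits":
        return str(rng.randint(0, 99))  # aggregated numeric terminal
    if symbol not in GRAMMAR:
        return symbol                   # literal terminal, e.g. ","
    productions = GRAMMAR[symbol]
    if depth > 20:                      # cut recursion: force non-recursive rule
        productions = productions[:1]
    weights = [w for w, _ in productions]
    _, rhs = rng.choices(productions, weights=weights)[0]
    return "".join(sample(s, rng, depth + 1) for s in rhs)

print(sample("LIST", random.Random(1)))
```

A search algorithm would then perturb the production weights and keep changes that improve a testing objective such as branch coverage; the sampler itself stays unchanged.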
Defining probability-based rail station catchments for demand modelling
The aggregate models commonly used in the UK to estimate demand for new local rail stations require the station catchment to be defined first, so that inputs into the model, such as the population from which demand will be generated, can be specified. The methods typically used to define the catchment implicitly assume that station choice is a deterministic process, and that stations exist in isolation from each other. However, studies show that pre-defined catchments account for only 50-60 percent of observed trips, choice of station is not homogeneous within zones, catchments overlap, and catchments vary by access mode and station type. This paper describes early work to implement an alternative probability-based approach, through the development of a station choice prediction model. To derive realistic station access journey explanatory variables, a routable multi-modal network, incorporating data from OpenStreetMap, the Traveline National Data Set and the National Rail timetable, was built using OpenTripPlanner and queried using an API wrapper developed in R. Results from a series of multinomial logit models are presented and a method for generating probabilistic catchments using estimated parameter values is described. An example probabilistic catchment is found to provide a realistic representation of the observed catchment, and to perform better than deterministic catchments.
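The multinomial logit step at the heart of this approach can be sketched briefly: a linear utility per candidate station, converted to choice probabilities. The explanatory variables, coefficient values and station names below are illustrative assumptions, not the paper's estimated parameters.

```python
# Sketch of multinomial logit station choice for one origin zone.
# Utilities are linear in access time and service frequency; the logit
# formula turns them into probabilities. Coefficients are invented.
import math

def station_probabilities(stations, beta_time=-0.08, beta_freq=0.05):
    utility = {
        name: beta_time * a["access_min"] + beta_freq * a["trains_per_hour"]
        for name, a in stations.items()
    }
    denom = sum(math.exp(u) for u in utility.values())
    return {name: math.exp(u) / denom for name, u in utility.items()}

# Two candidate stations for one zone (illustrative values).
probs = station_probabilities({
    "Central": {"access_min": 12, "trains_per_hour": 8},
    "Parkway": {"access_min": 25, "trains_per_hour": 4},
})
```

Because every zone gets a probability for every candidate station, summing these over zones yields overlapping, probabilistic catchments rather than the mutually exclusive deterministic ones the abstract criticises.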
On the interaction of Jupiter's Great Red Spot and zonal jet streams
In this paper, Jupiter's Great Red Spot (GRS) is used to determine properties
of the Jovian atmosphere that cannot otherwise be found. These properties
include the potential vorticity of the GRS and its neighboring jet streams, the
shear imposed on the GRS by the jet streams, and the vertical entropy gradient
(i.e., Rossby deformation radius). The cloud cover of the GRS, which is often
used to define the GRS's area and aspect ratio, is found to differ
significantly from the region of the GRS's potential vorticity anomaly. The
westward-going jet stream to the north of the GRS and the eastward-going jet
stream to its south are each found to have a large potential vorticity
``jump''. The jumps have opposite signs and, as a consequence of their
interaction with the GRS, the shear imposed on the GRS is reduced. The
east-west to north-south aspect ratio of the GRS's potential vorticity anomaly
depends on the ratio of the imposed shear to the strength of the anomaly. The
aspect ratio is found to be 2:1, but without the opposing jumps it
would be much greater. The GRS's high-speed collar and quiescent interior
require that the potential vorticity in the interior be approximately half that
in the collar. No other persistent geophysical vortex has a significant minimum
of potential vorticity in its interior and laboratory vortices with such a
minimum are unstable.
Comment: Manuscript accepted to the Journal of the Atmospheric Sciences, March
2007. v2: minor stylistic changes (after journal proofreading).
There's an App for That: Promoting Health App Use in Rural Ireland
Problem Statement: Smartphones and mobile applications (commonly referred to as apps) were first introduced in the late 20th and early 21st centuries. Due to the public’s time constraints, lack of transportation, lack of medical insurance, and a growing desire for healthier lifestyles, the total global mHealth market was forecast to reach 100 billion dollars in 2021, a fivefold increase from 21 billion in 2016. mHealth apps have been used successfully for health promotion activities, but barriers such as lack of knowledge of and comfort in using health apps exist.
Purpose: Evaluate readiness of a rural community and the effectiveness of health-related app educational sessions on increasing knowledge, comfort in using mHealth apps, and intent to use mHealth apps.
Methodology: A one-group pre-test/post-test design was used to evaluate mHealth app educational sessions offered at a community center in rural Ireland. A convenience sample of 56 individuals (middle/high school students and adults) who routinely access services at the community center participated in an mHealth app educational session.
Procedure: 56 participants were enrolled in the study, 36 females and 20 males. Participants ranged in age from 16 to 67, with a mode of 17. Nearly all participants (96.4%) reported having access to a smartphone, with time spent per day using the smartphone averaging 2 hours and 43 minutes. The majority (92.9%) of the participants used apps on their phones, but only 41.1% used health-related apps. Ninety-eight percent of the participants had internet access at home, but only 23.2% conducted health-related research online, and even fewer had electronic access to health records (7.1%) or communicated with a health care provider electronically (3.6%). Twenty-one percent of participants reported using “wearables” to monitor their personal health. There were no changes in procedures or to the anticipated risks or benefits.
Results: After the educational session, participants reported they were more knowledgeable about mHealth apps, more comfortable using mHealth apps and were more likely to use mHealth apps. The self-reported post education knowledge mean was 68.09% on a scale of 0-100. The self-reported knowledge mean increased by 30, statistically significant at p
Conclusion: Providing educational sessions with hands-on demonstrations and practice is an effective strategy to increase knowledge of, comfort with, and intent to use mHealth apps for health promotion activities. Removing some of the common barriers to the utilization of mHealth apps increased the likelihood of their use and offers an accessible tool for health promotion activities to underserved populations in rural communities.
Spin Amplification for Magnetic Sensors Employing Crystal Defects
Recently there have been several theoretical and experimental studies of the
prospects for magnetic field sensors based on crystal defects, especially
nitrogen vacancy (NV) centres in diamond. Such systems could potentially be
incorporated into an AFM-like apparatus in order to map the magnetic properties
of a surface at the single spin level. In this Letter we propose an augmented
sensor consisting of an NV centre for readout and an `amplifier' spin system
that directly senses the local magnetic field. Our calculations show that this
hybrid structure has the potential to detect magnetic moments with a
sensitivity and spatial resolution far beyond that of a simple NV centre, and
indeed this may be the physical limit for sensors of this class.