BioNessie(G) - a grid-enabled biochemical networks simulation environment
The simulation of biochemical networks provides insight and understanding about the underlying biochemical processes and pathways used by cells and organisms. BioNessie is a biochemical network simulator which has been developed at the University of Glasgow. This paper describes the simulator and focuses in particular on how it has been extended to benefit from a wide variety of high-performance compute resources across the UK through Grid technologies to support larger-scale simulations
Some Issues in the Methodology of Attitude Research. ESRI Policy Series No. 3. November 1980
Following the publication of the paper Attitudes in the Republic of Ireland Relevant to the Northern Ireland Problem (ESRI Paper No. 97), by E. E. Davis and R. Sinnott, the Executive Committee of the Institute proposed that a paper be devoted to the subject which would enable the authors to respond fully to criticisms of Paper No. 97, and would allow the methodological and scholarly issues that arise to be discussed in an appropriate academic forum. The present paper is the outcome
The pros and cons of using SDL for creation of distributed services
In a competitive market for the creation of complex distributed services, time to market, development cost, maintenance and flexibility are key issues. Optimizing the development process is very much a matter of optimizing the technologies used during service creation. This paper reports on the experience gained in the Service Creation projects SCREEN and TOSCA on the use of the language SDL for efficient service creation
Specifying ODP computational objects in Z
The computational viewpoint contained within the Reference Model of Open Distributed Processing (RM-ODP) shows how collections of objects can be configured within a distributed system to enable interworking. It prescribes certain capabilities that such objects are expected to possess and structuring rules that apply to how these objects can be configured with one another. This paper highlights how the specification language Z can be used to formalise these capabilities and the associated structuring rules, thereby enabling specifications of ODP systems from the computational viewpoint to be achieved
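Purely as an illustration of the kind of formalisation the paper describes (a toy sketch, not taken from the paper: the given sets OBJECT and INTERFACE, the schema name Configuration, and the use of the fuzz/zed-csp LaTeX notation are all assumptions), one simple structuring rule, that bindings may only connect interfaces actually offered by some object, could be written in Z as:

    \begin{zed}
      [OBJECT, INTERFACE]
    \end{zed}

    \begin{schema}{Configuration}
      offers   : OBJECT \rel INTERFACE \\
      bindings : INTERFACE \rel INTERFACE
    \where
      \dom bindings \subseteq \ran offers \\
      \ran bindings \subseteq \ran offers
    \end{schema}

Spelling the rules out as schema predicates is what makes them amenable to the usual Z tool support, such as type checking.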
Learning Points and Routes to Recommend Trajectories
The problem of recommending tours to travellers is an important and broadly studied area. Suggested solutions include various approaches of points-of-interest (POI) recommendation and route planning. We consider the task of recommending a sequence of POIs that simultaneously uses information about POIs and routes. Our approach unifies the treatment of various sources of information by representing them as features in machine learning algorithms, enabling us to learn from past behaviour. Information about POIs is used to learn a POI ranking model that accounts for the start and end points of tours. Data about previous trajectories are used for learning transition patterns between POIs that enable us to recommend probable routes. In addition, a probabilistic model is proposed to combine the results of POI ranking and the POI-to-POI transitions. We propose a new F score on pairs of POIs that capture the order of visits. Empirical results show that our approach improves on recent methods, and demonstrate that combining points and routes enables better trajectory recommendations
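One way to read the pairs-based measure, sketched here under assumed definitions rather than the paper's exact formula (the names ordered_pairs and pairs_f_score are illustrative), is to treat every ordered pair of POIs visited in sequence as an item and compute the harmonic mean of precision and recall over those pairs:

    from itertools import combinations

    def ordered_pairs(seq):
        """All ordered pairs (p, q) such that p is visited before q in seq."""
        return {(seq[i], seq[j]) for i, j in combinations(range(len(seq)), 2)}

    def pairs_f_score(recommended, actual):
        """Harmonic mean of precision and recall over ordered POI pairs.

        A sketch of a pairs-based F score that rewards getting the visiting
        order right; the paper's definition may differ in detail.
        """
        rec, act = ordered_pairs(recommended), ordered_pairs(actual)
        if not rec or not act:
            return 0.0
        overlap = len(rec & act)
        if overlap == 0:
            return 0.0
        precision = overlap / len(rec)
        recall = overlap / len(act)
        return 2 * precision * recall / (precision + recall)

    # Same POIs in a partially wrong order score below 1.0 but above 0.
    print(pairs_f_score(["A", "C", "B", "D"], ["A", "B", "C", "D"]))  # ~0.83

Unlike a plain set-overlap F score, this pairwise variant penalises a recommendation that visits the right POIs in the wrong order.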
HypTrails: A Bayesian Approach for Comparing Hypotheses About Human Trails on the Web
When users interact with the Web today, they leave sequential digital trails on a massive scale. Examples of such human trails include Web navigation, sequences of online restaurant reviews, or online music playlists. Understanding the factors that drive the production of these trails can be useful for, e.g., improving underlying network structures, predicting user clicks or enhancing recommendations. In this work, we present a general approach called HypTrails for comparing a set of hypotheses about human trails on the Web, where hypotheses represent beliefs about transitions between states. Our approach utilizes Markov chain models with Bayesian inference. The main idea is to incorporate hypotheses as informative Dirichlet priors and to leverage the sensitivity of Bayes factors to the prior for comparing hypotheses with each other. For eliciting Dirichlet priors from hypotheses, we present an adaptation of the so-called (trial) roulette method. We demonstrate the general mechanics and applicability of HypTrails by performing experiments with (i) synthetic trails for which we control the mechanisms that have produced them and (ii) empirical trails stemming from different domains including website navigation, business reviews and online music playlists. Our work expands the repertoire of methods available for studying human trails on the Web.
Comment: Published in the proceedings of WWW'1
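To make the mechanics concrete, the following sketch scores a hypothesis by the log marginal likelihood (evidence) of observed transition counts under row-wise Dirichlet priors, which is the Bayesian ingredient the abstract describes. It assumes NumPy and SciPy; the function name log_evidence, the toy matrices, and the convention of giving both hypotheses the same total pseudocount mass are illustrative choices, not the authors' implementation:

    import numpy as np
    from scipy.special import gammaln

    def log_evidence(transition_counts, prior_pseudocounts):
        """Log marginal likelihood of a first-order Markov chain whose rows of
        transition probabilities carry Dirichlet priors (Dirichlet-multinomial).

        The multinomial coefficient is omitted: it is constant across
        hypotheses for fixed data, so it cancels in comparisons.
        """
        n = np.asarray(transition_counts, dtype=float)
        a = np.asarray(prior_pseudocounts, dtype=float)
        return float(np.sum(
            gammaln(a.sum(axis=1)) - gammaln((n + a).sum(axis=1))
            + np.sum(gammaln(n + a) - gammaln(a), axis=1)
        ))

    # Toy comparison over three states: a uniform belief versus a hypothetical
    # structured belief, both scaled to the same total pseudocount mass.
    counts = np.array([[0, 8, 2], [1, 0, 9], [7, 3, 0]])
    h_uniform = np.ones((3, 3))
    h_structured = np.array([[0.1, 2.4, 0.5],
                             [0.3, 0.1, 2.6],
                             [2.2, 0.7, 0.1]])
    print(log_evidence(counts, h_uniform), log_evidence(counts, h_structured))

The hypothesis whose pseudocounts better anticipate the observed transitions receives the higher evidence, which is how the prior sensitivity of Bayes factors is turned into a ranking of hypotheses.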
UTOPIA—User-Friendly Tools for Operating Informatics Applications
Bioinformaticians routinely analyse vast amounts of information held both in large
remote databases and in flat data files hosted on local machines. The contemporary
toolkit available for this purpose consists of an ad hoc collection of data manipulation
tools, scripting languages and visualization systems; these must often be combined in
complex and bespoke ways, the result frequently being an unwieldy artefact capable of
one specific task, which cannot easily be exploited or extended by other practitioners.
Owing to the sizes of current databases and the scale of the analyses necessary,
routine bioinformatics tasks are often automated, but many still require the unique
experience and intuition of human researchers: this requires tools that support real-time
interaction with complex datasets. Many existing tools have poor user interfaces
and limited real-time performance when applied to realistically large datasets; much
of the user's cognitive capacity is therefore focused on controlling the tool rather
than on performing the research. The UTOPIA project is addressing some of these
issues by building reusable software components that can be combined to make
useful applications in the field of bioinformatics. Expertise in the fields of
human-computer interaction, high-performance rendering, and distributed systems is being
guided by bioinformaticians and end-user biologists to create a toolkit that is both
architecturally sound from a computing point of view, and directly addresses end-user
and application-developer requirements