BlenX-based compositional modeling of complex reaction mechanisms
Molecular interactions are wired in a fascinating way resulting in complex
behavior of biological systems. Theoretical modeling provides a useful
framework for understanding the dynamics and the function of such networks. The
complexity of the biological networks calls for conceptual tools that manage
the combinatorial explosion of the set of possible interactions. A suitable
conceptual tool to attack complexity is compositionality, already successfully
used in the process algebra field to model computer systems. We rely on the
BlenX programming language, which originated from the beta-binders process
calculus, to specify and simulate high-level descriptions of biological
circuits. The Gillespie stochastic framework underlying BlenX requires the
decomposition of phenomenological rate functions into elementary reactions.
This study shows how to systematically unpack complex reaction mechanisms into
BlenX templates, and discusses the estimation or derivation of missing
parameters and the challenges that emerge from compositional model building in
stochastic process algebras. A biological example, the circadian clock, is
presented as a case study of BlenX compositionality.
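The elementary reactions that BlenX templates are decomposed into are simulated with Gillespie's stochastic simulation algorithm. As a rough illustration (this is not BlenX code; the species name, rate constant, and function names below are invented for the example), the Gillespie direct method can be sketched as:

```python
import random

def gillespie(x, rates, stoich, t_end, seed=0):
    """Minimal Gillespie direct-method SSA for elementary reactions.

    x      -- dict of species counts (mutated in place)
    rates  -- list of propensity functions, each taking the state dict
    stoich -- list of dicts mapping species -> count change per firing
    """
    rng = random.Random(seed)
    t, trace = 0.0, [(0.0, dict(x))]
    while t < t_end:
        props = [r(x) for r in rates]
        total = sum(props)
        if total == 0:                     # no reaction can fire
            break
        t += rng.expovariate(total)        # time to next reaction event
        pick = rng.uniform(0, total)       # choose which reaction fires
        acc = 0.0
        for p, delta in zip(props, stoich):
            acc += p
            if pick <= acc:
                for species, d in delta.items():
                    x[species] += d        # apply stoichiometric change
                break
        trace.append((t, dict(x)))
    return trace

# Toy elementary reaction: degradation A -> 0 with propensity k * A.
trace = gillespie({"A": 100},
                  [lambda x: 0.5 * x["A"]],
                  [{"A": -1}],
                  t_end=20.0)
```

A phenomenological rate law (e.g. a Hill function) would first have to be unpacked into several such elementary reactions, which is exactly the decomposition step the abstract describes.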
An Empirical Investigation of the Level of Users’ Acceptance of E-Banking in Nigeria
Nigeria has been described as the fastest-growing telecommunications nation in Africa. At present, all members of the Nigerian banking industry use Information and Communication Technology (ICT) as a platform for conducting financial transactions effectively and efficiently. This paper focuses on determining the level of users' acceptance of electronic banking services and investigating the factors that determine users' behavioral intentions to use electronic banking systems in Nigeria.
The survey involved the design and administration of 500 questionnaires within the Lagos metropolis and its environs. An extended Technology Acceptance Model (TAM) was employed as a conceptual framework to investigate the factors that influence users' acceptance of and intention to use electronic banking. To test the model, data were collected from 292 customers of various commercial banks in Nigeria. The model measured the impact of Perceived Credibility (PC), Computer Self-Efficacy (CSE), Perceived Usefulness (PU), and Perceived Ease of Use (PEOU) on customer attitude, and of customer attitude on customer adaptation.
The results of this research show that the ATM remains the most widely used form of e-Banking service. Bank customers who are active users of e-Banking systems use them because they are convenient, easy to use, time-saving, and appropriate for their transaction needs. Network security and the privacy of the system are the major concerns of users and constitute a hindrance to prospective users.
Index to Library Trends Volume 38
published or submitted for publication
An Instantiation-Based Approach for Solving Quantified Linear Arithmetic
This paper presents a framework to derive instantiation-based decision
procedures for satisfiability of quantified formulas in first-order theories,
including its correctness, implementation, and evaluation. Using this framework
we derive decision procedures for linear real arithmetic (LRA) and linear
integer arithmetic (LIA) formulas with one quantifier alternation. Our
procedure can be integrated into the solving architecture used by typical SMT
solvers. Experimental results on standardized benchmarks from model checking,
static analysis, and synthesis show that our implementation of the procedure in
the SMT solver CVC4 outperforms existing tools for quantified linear
arithmetic.
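Instantiation-based procedures of this kind typically alternate between finding a candidate that satisfies a growing set of ground instances and searching for a counterexample that refutes the candidate. A toy counterexample-guided sketch over finite integer domains (real LIA solvers reason symbolically over unbounded integers; the function name and domains here are illustrative only):

```python
def cegqi(x_domain, y_domain, phi):
    """Toy counterexample-guided instantiation for: exists x. forall y. phi(x, y),
    restricted to finite integer domains."""
    instances = []                          # accumulated y-instantiations
    while True:
        # find a candidate x satisfying all ground instances collected so far
        cand = next((x for x in x_domain
                     if all(phi(x, y) for y in instances)), None)
        if cand is None:
            return None                     # no candidate left: unsatisfiable
        # search for a counterexample y refuting forall y. phi(cand, y)
        cex = next((y for y in y_domain if not phi(cand, y)), None)
        if cex is None:
            return cand                     # candidate survives: satisfiable
        instances.append(cex)               # refine with the new instance

# exists x. forall y in [0, 9]. x >= y  -- witnessed by x = 9
witness = cegqi(range(10), range(10), lambda x, y: x >= y)
```

Each counterexample is new (the candidate satisfied all previous instances), so the loop terminates on finite domains; the symbolic analogue of this refinement loop is what gets integrated into the SMT solving architecture.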
Can a computer be "pushed" to perform faster-than-light?
We propose to "boost" the speed of communication and computation by immersing
the computing environment into a medium whose index of refraction is smaller
than one, thereby trespassing the speed-of-light barrier.Comment: 7 pages, 1 figure, presented at the UC10 Hypercomputation Workshop
"HyperNet 10" at The University of Tokyo on June 22, 201
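The proposal rests on the elementary relation that the phase velocity of light in a medium is v = c/n, so an index of refraction n < 1 yields a phase velocity exceeding c (whether signals and computation can exploit this is precisely what the paper debates). As plain arithmetic:

```python
c = 299_792_458.0      # speed of light in vacuum, m/s
n = 0.8                # hypothetical index of refraction below one
v_phase = c / n        # phase velocity in the medium exceeds c
```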
Trends in Russian research output indexed in Scopus and Web of Science
Trends are analysed in the annual number of documents published by Russian
institutions and indexed in Scopus and Web of Science, giving special attention
to the time period starting in the year 2013 in which the Project 5-100 was
launched by the Russian Government. Numbers are broken down by document type,
publication language, type of source, research discipline, country and source.
It is concluded that Russian publication counts strongly depend upon the
database used, and upon changes in database coverage, and that one should be
cautious when using indicators derived from WoS, and especially from Scopus, as
tools in the measurement of research performance and international orientation
of the Russian science system. Comment: Author copy of a manuscript accepted for publication in the journal
Scientometrics, May 201
Towards MKM in the Large: Modular Representation and Scalable Software Architecture
MKM has been defined as the quest for technologies to manage mathematical
knowledge. MKM "in the small" is well-studied, so the real problem is to scale
up to large, highly interconnected corpora: "MKM in the large". We contend that
advances in two areas are needed to reach this goal. We need representation
languages that support incremental processing of all primitive MKM operations,
and we need software architectures and implementations that implement these
operations scalably on large knowledge bases.
We present instances of both in this paper: the MMT framework for modular
theory-graphs that integrates meta-logical foundations, which forms the base of
the next OMDoc version; and TNTBase, a versioned storage system for XML-based
document formats. TNTBase becomes an MMT database by instantiating it with
special MKM operations for MMT. Comment: To appear in The 9th International Conference on Mathematical
Knowledge Management: MKM 201