ZETA - Zero-Trust Authentication: Relying on Innate Human Ability, not Technology
Reliable authentication requires the devices and channels involved in the process to be trustworthy; otherwise authentication secrets can easily be compromised. Given the unceasing efforts of attackers worldwide, such trustworthiness is increasingly not a given. A variety of technical solutions, such as utilising multiple devices/channels and verification protocols, have the potential to mitigate the threat of untrusted communications to a certain extent. Yet such technical solutions make two assumptions: (1) users have access to multiple devices and (2) attackers will not resort to hacking the human using social engineering techniques. In this paper, we propose and explore the potential of using human-based computation instead of solely technical solutions to mitigate the threat of untrusted devices and channels. ZeTA (Zero Trust Authentication on untrusted channels) has the potential to allow people to authenticate despite compromised channels or communications and easily observed usage. Our contributions are threefold: (1) We propose the ZeTA protocol, with a formal definition and security analysis, that utilises semantics and human-based computation to ameliorate the problem of untrusted devices and channels. (2) We outline a security analysis to assess the envisaged performance of the proposed authentication protocol. (3) We report on a usability study that explores the viability of relying on human computation in this context.
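To make the idea concrete, here is a minimal sketch of a round-based semantic challenge-response in the spirit described above; the attribute table, function names, and parameters are illustrative assumptions, not the ZeTA protocol as specified in the paper.

```python
import secrets

# Toy semantic knowledge base (hypothetical; the real approach relies on
# broad human semantic knowledge, not a fixed table).
ATTRIBUTES = {
    "dog":   {"animal", "pet"},
    "apple": {"fruit", "edible", "red"},
    "sun":   {"hot", "bright", "yellow"},
    "coal":  {"black", "fuel"},
}
ALL_TAGS = sorted({t for tags in ATTRIBUTES.values() for t in tags})

def verifier_round(secret_words):
    """Pick a random attribute and return (challenge, expected_answer)."""
    tag = secrets.choice(ALL_TAGS)
    expected = any(tag in ATTRIBUTES[w] for w in secret_words)
    return tag, expected

def authenticate(secret_words, answer_fn, rounds=20):
    """Accept only if every semantic challenge is answered correctly.

    answer_fn(tag) stands in for the human: given an attribute, it answers
    whether one of the memorised secret words has that attribute. Only this
    yes/no travels over the untrusted channel, never the secret itself.
    """
    for _ in range(rounds):
        tag, expected = verifier_round(secret_words)
        if answer_fn(tag) != expected:
            return False
    return True

# Legitimate user: computes each answer "in the head" from the same secret.
secret = {"dog", "sun"}
print(authenticate(secret, lambda tag: any(tag in ATTRIBUTES[w] for w in secret)))
# An observer guessing answers at random succeeds only with probability
# roughly 2^-rounds in this toy setting.
print(authenticate(secret, lambda tag: bool(secrets.randbits(1))))
```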
KARL: A Knowledge-Assisted Retrieval Language
Data classification and storage are tasks typically performed by application specialists. In contrast, information users are primarily non-computer specialists who use information in their decision-making and other activities. Interaction efficiency between such users and the computer is often reduced by machine requirements and resulting user reluctance to use the system. This thesis examines the problems associated with information retrieval for non-computer specialist users, and proposes a method for communicating in restricted English that uses knowledge of the entities involved, relationships between entities, and basic English language syntax and semantics to translate user requests into formal queries. The proposed method includes an intelligent dictionary, syntax and semantic verifiers, and a formal query generator. In addition, the proposed system has a learning capability that can improve portability and performance. With the increasing demand for efficient human-machine communication, the significance of this thesis becomes apparent. As human resources become more valuable, software systems that assist in improving the human-machine interface will be needed, and research addressing new solutions will be of utmost importance. This thesis presents an initial design and implementation as a foundation for further research and development into the emerging field of natural language database query systems.
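A minimal sketch of the dictionary-driven translation step such a system might perform, assuming a toy schema and a single "list ... of ... in ..." request form; the grammar, schema, and all names below are hypothetical, not KARL's actual design.

```python
import re

# Hypothetical "intelligent dictionary": vocabulary mapped to relational
# schema elements (names are illustrative only).
ENTITIES = {"employees": "employee", "departments": "department"}
ATTRIBUTES = {"name": "name", "salary": "salary", "budget": "budget"}
RELATIONS = {"in": ("employee", "dept_id", "department", "id")}

def translate(request: str) -> str:
    """Translate a restricted-English request into a formal (SQL) query.

    Grammar assumed here: "list <attribute> of <entity> [in <value>]".
    A real system would also run syntax/semantic verifiers and learn new
    vocabulary; this only illustrates dictionary-driven translation.
    """
    m = re.fullmatch(r"list (\w+) of (\w+)(?: in (\w+))?", request.lower())
    if not m:
        raise ValueError("request not in the restricted grammar")
    attr, entity, value = m.groups()
    if attr not in ATTRIBUTES or entity not in ENTITIES:
        raise ValueError("unknown word; the dictionary would query the user")
    table, column = ENTITIES[entity], ATTRIBUTES[attr]
    if value is None:
        return f"SELECT {column} FROM {table};"
    # Resolve the relationship "in" via the dictionary's join knowledge.
    t1, fk, t2, pk = RELATIONS["in"]
    return (f"SELECT {t1}.{column} FROM {t1} JOIN {t2} "
            f"ON {t1}.{fk} = {t2}.{pk} WHERE {t2}.name = '{value}';")

print(translate("list name of employees in sales"))
```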
An Introduction to Mechanized Reasoning
Mechanized reasoning uses computers to verify proofs and to help discover new theorems. Computer scientists have applied mechanized reasoning to economic problems but -- to date -- this work has not been properly presented in economics journals. We introduce mechanized reasoning to economists in three ways. First, we introduce mechanized reasoning in general, describing both the techniques and their successful applications. Second, we explain how mechanized reasoning has been applied to economic problems, concentrating on the two domains that have attracted the most attention: social choice theory and auction theory. Finally, we present a detailed example of mechanized reasoning in practice by means of a proof of Vickrey's familiar theorem on second-price auctions.
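For readers unfamiliar with the example, the standard informal statement and proof sketch of Vickrey's theorem runs as follows; this is the textbook argument, not the paper's mechanized formalisation.

```latex
% Vickrey's theorem on second-price auctions (informal statement).
% Bidder $i$ with valuation $v_i$ and bid $b_i$ receives utility
\[
  u_i(b_i, b_{-i}) =
  \begin{cases}
    v_i - \max_{j \neq i} b_j & \text{if } b_i > \max_{j \neq i} b_j,\\
    0                         & \text{otherwise.}
  \end{cases}
\]
% Claim: truthful bidding $b_i = v_i$ weakly dominates every other bid.
% Sketch (ignoring ties): let $m = \max_{j \neq i} b_j$. If $v_i > m$,
% bidding $v_i$ wins and yields $v_i - m > 0$; since the price $m$ does not
% depend on $b_i$, no other bid yields more. If $v_i < m$, bidding $v_i$
% loses and yields $0$, whereas any winning bid yields $v_i - m < 0$.
% Hence $u_i(v_i, b_{-i}) \ge u_i(b_i, b_{-i})$ for all $b_i$ and $b_{-i}$.
```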
Ontology-based methodology for error detection in software design
Improving the quality of a software design with the goal of producing a high quality software product continues to grow in importance due to the costs that result from poorly designed software. It is commonly accepted that multiple design views are required in order to clearly specify the required functionality of software. There is universal agreement as to the importance of identifying inconsistencies early in the software design process, but the challenge is how to reconcile the representations of the diverse views to ensure consistency. To address the problem of inconsistencies that occur across multiple design views, this research introduces the Methodology for Objects to Agents (MOA). MOA utilizes a new ontology, the Ontology for Software Specification and Design (OSSD), as a common information model to integrate specification knowledge and design knowledge in order to facilitate the interoperability of formal requirements modeling tools and design tools, with the end goal of detecting inconsistency errors in a design. The methodology, which transforms designs represented using the Unified Modeling Language (UML) into representations written in formal agent-oriented modeling languages, integrates object-oriented concepts and agent-oriented concepts in order to take advantage of the benefits that both approaches can provide. The OSSD model is a hierarchical decomposition of software development concepts, including ontological constructs of objects, attributes, behavior, relations, states, transitions, goals, constraints, and plans. The methodology includes a consistency checking process that defines a consistency framework and an Inter-View Inconsistency Detection technique. MOA enhances software design quality by integrating multiple software design views, integrating object-oriented and agent-oriented concepts, and defining an error detection method that associates rules with ontological properties.
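As an illustration of what inter-view inconsistency detection means in practice, the following sketch checks one plausible rule between a class view and an interaction view; the rule, data structures, and names are assumptions made for exposition and are not taken from MOA or the OSSD ontology.

```python
from dataclasses import dataclass, field

# Toy design views (hypothetical; MOA works on UML models mapped into the
# OSSD ontology, not on dictionaries like these).
@dataclass
class ClassView:
    operations: dict = field(default_factory=dict)   # class name -> set of operations

@dataclass
class InteractionView:
    messages: list = field(default_factory=list)     # (sender, receiver, message)

def inter_view_inconsistencies(class_view, interaction_view):
    """One illustrative consistency rule: every message sent to an object in
    the interaction view must be declared as an operation of the receiver's
    class in the class view. Real inter-view detection would associate such
    rules with ontological properties (behavior, relations, states, ...)."""
    errors = []
    for sender, receiver, message in interaction_view.messages:
        declared = class_view.operations.get(receiver, set())
        if message not in declared:
            errors.append(f"{sender} -> {receiver}.{message}: "
                          f"'{message}' is not an operation of class {receiver}")
    return errors

classes = ClassView({"Order": {"addItem", "total"}, "Invoice": {"issue"}})
interactions = InteractionView([("Customer", "Order", "addItem"),
                                ("Order", "Invoice", "cancel")])   # inconsistent
for e in inter_view_inconsistencies(classes, interactions):
    print("inconsistency:", e)
```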
The use of data-mining for the automatic formation of tactics
This paper discusses the use of data-mining for the automatic formation of tactics. It was presented at the Workshop on Computer-Supported Mathematical Theory Development held at IJCAR in 2004. The aim of this project is to evaluate the applicability of data-mining techniques to the automatic formation of tactics from large corpuses of proofs. We data-mine information from large proof corpuses to find commonly occurring patterns. These patterns are then evolved into tactics using genetic programming techniques.
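A minimal sketch of the mining step, assuming proofs are recorded as sequences of tactic names; the genetic-programming stage that evolves patterns into tactics is omitted, and the corpus and thresholds below are invented for illustration.

```python
from collections import Counter

# Hypothetical corpus: each proof is the sequence of rule/tactic names applied.
corpus = [
    ["intro", "induction", "simp", "rewrite", "simp"],
    ["intro", "induction", "simp", "apply", "simp"],
    ["intro", "cases", "simp", "rewrite", "simp"],
]

def frequent_patterns(proofs, n=2, min_support=2):
    """Count length-n runs of consecutive proof steps and keep those occurring
    in at least min_support proofs. Such frequent patterns are the raw
    material a genetic-programming stage could evolve into tactics."""
    support = Counter()
    for proof in proofs:
        seen = {tuple(proof[i:i + n]) for i in range(len(proof) - n + 1)}
        support.update(seen)
    return [(p, c) for p, c in support.most_common() if c >= min_support]

for pattern, count in frequent_patterns(corpus):
    print(count, "proofs contain:", " ; ".join(pattern))
```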
Equivalence-based Security for Querying Encrypted Databases: Theory and Application to Privacy Policy Audits
Motivated by the problem of simultaneously preserving confidentiality and usability of data outsourced to third-party clouds, we present two different database encryption schemes that largely hide data but reveal enough information to support a wide range of relational queries. We provide a security definition for database encryption that captures confidentiality based on a notion of equivalence of databases from the adversary's perspective. As a specific application, we adapt an existing algorithm for finding violations of privacy policies to run on logs encrypted under our schemes and observe low to moderate overheads.
Comment: CCS 2015 paper technical report, in progress.
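The flavour of "hide the data but reveal enough to query" can be illustrated with a generic deterministic-token construction for equality queries; this is a common building block rather than the paper's schemes, and the key handling and names below are assumptions.

```python
import hmac, hashlib, secrets

KEY = secrets.token_bytes(32)   # held by the data owner, never by the cloud

def eq_token(value: str) -> bytes:
    """Deterministic token: equal plaintexts yield equal tokens, so the server
    can evaluate equality predicates (and joins/grouping on this column)
    without seeing the plaintext. The trade-off is that the adversary learns
    which rows collide, i.e. only an equivalence class of databases, which is
    the kind of leakage an equivalence-based security definition captures."""
    return hmac.new(KEY, value.encode(), hashlib.sha256).digest()

# Outsourced log: the queryable column is stored only as tokens. (A real
# scheme would also encrypt the remaining fields; kept plain here for brevity.)
rows = [("alice@example.com", "viewed"), ("bob@example.com", "edited"),
        ("alice@example.com", "deleted")]
server_table = [(eq_token(user), action) for user, action in rows]

# Client-side query: "actions by alice" becomes an equality test on tokens.
target = eq_token("alice@example.com")
print([action for tok, action in server_table if hmac.compare_digest(tok, target)])
# ['viewed', 'deleted']
```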
A Hybrid Analysis for Security Protocols with State
Cryptographic protocols rely on message-passing to coordinate activity among principals. Each principal maintains local state in individual local sessions only as needed to complete that session. However, in some protocols a principal also uses state to coordinate its different local sessions. Sometimes the non-local, mutable state is used as a means, for example with smart cards or Trusted Platform Modules. Sometimes it is the purpose of running the protocol, for example in commercial transactions.
Many richly developed tools and techniques, based on well-understood foundations, are available for design and analysis of pure message-passing protocols. But the presence of cross-session state poses difficulties for these techniques.
In this paper we provide a framework for modeling stateful protocols. We define a hybrid analysis method. It leverages theorem-proving---in this instance, the PVS prover---for reasoning about computations over state. It combines that with an "enrich-by-need" approach---embodied by CPSA---that focuses on the message-passing part. As a case study we give a full analysis of the Envelope Protocol, due to Mark Ryan.
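The state the Envelope Protocol depends on is a TPM platform configuration register, which can only be changed by hashing new values into it. A toy model of that state, sketched below under simplifying assumptions (and quite separate from the paper's PVS/CPSA analysis), shows why the "obtain" and "refuse" branches exclude each other.

```python
import hashlib

def extend(pcr: bytes, value: bytes) -> bytes:
    """TPM-style PCR extension: new state = H(old state || value). Extension
    is the only way the register changes and it cannot be rolled back, which
    is the cross-session mutable state the protocol relies on."""
    return hashlib.sha256(pcr + value).digest()

boot = b"\x00" * 32
nonce = b"alice-nonce"                 # fresh value extended into the PCR
p = extend(boot, nonce)

obtain_state = extend(p, b"obtain")    # decryption is bound to this state
refuse_state = extend(p, b"refuse")    # a signed quote attests this state

# From refuse_state, no sequence of further extends reaches obtain_state
# (hash chaining is not invertible in this toy model), so the recipient
# cannot both read the secret and produce the proof of refusal.
assert obtain_state != refuse_state
reachable = {refuse_state}
for _ in range(3):                     # explore a few further extensions
    reachable |= {extend(s, v) for s in reachable for v in (b"obtain", b"refuse")}
assert obtain_state not in reachable
print("obtain and refuse branches are mutually exclusive in this toy model")
```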