Biometrics-as-a-Service: A Framework to Promote Innovative Biometric Recognition in the Cloud
Biometric recognition, or simply biometrics, is the use of biological
attributes such as face, fingerprints or iris in order to recognize an
individual in an automated manner. A key application of biometrics is
authentication; i.e., using said biological attributes to provide access by
verifying the claimed identity of an individual. This paper presents a
framework for Biometrics-as-a-Service (BaaS) that performs biometric matching
operations in the cloud, while relying on simple and ubiquitous consumer
devices such as smartphones. Further, the framework promotes innovation by
providing interfaces for a plurality of software developers to upload their
matching algorithms to the cloud. When a biometric authentication request is
submitted, the system uses a set of criteria to automatically select an appropriate
matching algorithm. Every time a particular algorithm is selected, the
corresponding developer is rendered a micropayment. This creates an innovative
and competitive ecosystem that benefits both software developers and the
consumers. As a case study, we have implemented the following: (a) an ocular
recognition system using a mobile web interface providing user access to a
biometric authentication service, and (b) a Linux-based virtual machine
environment used by software developers for algorithm development and
submission.
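The selection-and-micropayment loop described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the registry entries, field names, and fee amounts are all hypothetical, and "select an appropriate matching algorithm" is reduced here to picking the highest benchmark accuracy for the requested modality.

```python
from dataclasses import dataclass

@dataclass
class MatcherEntry:
    developer: str
    modality: str       # e.g. "ocular", "fingerprint"
    accuracy: float     # benchmark accuracy on the operator's validation set
    fee_cents: int      # micropayment rendered to the developer per selection

# Hypothetical registry of developer-uploaded matching algorithms.
REGISTRY = [
    MatcherEntry("dev_a", "ocular", 0.97, 2),
    MatcherEntry("dev_b", "ocular", 0.95, 1),
    MatcherEntry("dev_c", "fingerprint", 0.99, 2),
]

# Running tally of micropayments owed to each developer.
payments: dict[str, int] = {}

def select_matcher(modality: str) -> MatcherEntry:
    """Pick the highest-accuracy registered matcher for the modality."""
    candidates = [m for m in REGISTRY if m.modality == modality]
    if not candidates:
        raise LookupError(f"no matcher registered for {modality!r}")
    return max(candidates, key=lambda m: m.accuracy)

def authenticate(modality: str) -> str:
    """Route a request to the selected matcher and credit its developer."""
    entry = select_matcher(modality)
    payments[entry.developer] = payments.get(entry.developer, 0) + entry.fee_cents
    return entry.developer
```

Each authentication request both routes to the winning algorithm and credits its developer, which is the incentive mechanism the framework relies on.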
Mobile Computing in Physics Analysis - An Indicator for eScience
This paper presents the design and implementation of a Grid-enabled physics
analysis environment for handheld and other resource-limited computing devices
as one example of the use of mobile devices in eScience. Handheld devices offer
great potential because they provide ubiquitous access to data and
round-the-clock connectivity over wireless links. Our solution aims to provide
users of handheld devices the capability to launch heavy computational tasks on
computational and data Grids, monitor the jobs status during execution, and
retrieve results after job completion. Users carry their jobs on their handheld
devices in the form of executables (and associated libraries). Users can
transparently view the status of their jobs and get back their outputs without
having to know where they are being executed. In this way, our system is able
to act as a high-throughput computing environment where devices ranging from
powerful desktop machines to small handhelds can employ the power of the Grid.
The results shown in this paper are readily applicable to the wider eScience
community.
Comment: 8 pages, 7 figures. Presented at the 3rd Int Conf on Mobile Computing
& Ubiquitous Networking (ICMU06), London, October 200
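The handheld-side job lifecycle described above (submit executable and libraries, poll status, retrieve output after completion) can be sketched as follows. This is an illustrative stand-in, not the paper's system: the `GridGateway` class, its method names, and the simulated scheduler are all assumptions.

```python
import enum
import uuid

class JobState(enum.Enum):
    QUEUED = "queued"
    RUNNING = "running"
    DONE = "done"

class GridGateway:
    """Hypothetical stand-in for the remote Grid service the handheld talks to."""

    def __init__(self):
        self._jobs = {}

    def submit(self, executable: bytes, libraries: list[bytes]) -> str:
        """Accept a job carried on the device; return an opaque job id."""
        job_id = str(uuid.uuid4())
        self._jobs[job_id] = {"state": JobState.QUEUED, "output": None}
        return job_id

    def advance(self, job_id: str) -> None:
        """Simulate one scheduler step (queued -> running -> done)."""
        job = self._jobs[job_id]
        if job["state"] is JobState.QUEUED:
            job["state"] = JobState.RUNNING
        elif job["state"] is JobState.RUNNING:
            job["state"] = JobState.DONE
            job["output"] = b"histogram-data"

    def status(self, job_id: str) -> JobState:
        """Let the user transparently view job status from the handheld."""
        return self._jobs[job_id]["state"]

    def fetch(self, job_id: str) -> bytes:
        """Retrieve results after completion, wherever the job ran."""
        job = self._jobs[job_id]
        if job["state"] is not JobState.DONE:
            raise RuntimeError("job still executing")
        return job["output"]
```

The point of the abstraction is that the device never learns where execution happens; it only holds a job id and polls the gateway.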
VXA: A Virtual Architecture for Durable Compressed Archives
Data compression algorithms change frequently, and obsolete decoders do not
always run on new hardware and operating systems, threatening the long-term
usability of content archived using those algorithms. Re-encoding content into
new formats is cumbersome, and highly undesirable when lossy compression is
involved. Processor architectures, in contrast, have remained comparatively
stable over recent decades. VXA, an archival storage system designed around
this observation, archives executable decoders along with the encoded content
it stores. VXA decoders run in a specialized virtual machine that implements an
OS-independent execution environment based on the standard x86 architecture.
The VXA virtual machine strictly limits access to host system services, making
decoders safe to run even if an archive contains malicious code. VXA's adoption
of a "native" processor architecture instead of type-safe language technology
allows reuse of existing "hand-optimized" decoders in C and assembly language,
and permits decoders access to performance-enhancing architecture features such
as vector processing instructions. The performance cost of VXA's virtualization
is typically less than 15% compared with the same decoders running natively.
The storage cost of archived decoders, typically 30-130KB each, can be
amortized across many archived files sharing the same compression method.
Comment: 14 pages, 7 figures, 2 tables
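The "strictly limits access to host system services" property amounts to a deny-by-default dispatch between the sandboxed decoder and the host. A minimal sketch, assuming hypothetical call names (VXA's actual service interface is not reproduced here):

```python
# Deny-by-default policy: only the calls a decoder legitimately needs.
ALLOWED_CALLS = {"read_input", "write_output", "exit"}

def dispatch_syscall(name: str, handlers: dict, *args):
    """Route a decoder's service request to the host only if whitelisted.

    Anything outside the whitelist is refused, so even a malicious
    decoder embedded in an archive cannot touch other host services.
    """
    if name not in ALLOWED_CALLS or name not in handlers:
        raise PermissionError(f"decoder attempted forbidden call: {name!r}")
    return handlers[name](*args)
```

Because the policy is enforced at the virtual-machine boundary rather than by a type-safe language, existing hand-optimized C and assembly decoders can run unmodified inside it.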
Evaluation of Malware Target Recognition Deployed in a Cloud-Based Fileserver Environment
Cloud computing, or the migration of computing resources from the end user to remotely managed locations where they can be purchased on-demand, presents several new and unique security challenges. One of these challenges is how to efficiently detect malware among files that may be spread across multiple locations on the Internet over congested network connections. This research studies how such an environment impacts the performance of malware detection. A simplified cloud environment is created in which network conditions are fully controlled. This environment includes a fileserver, a detection server, the detection mechanism, and clean and malicious file sample sets. The performance of a novel malware detection algorithm called Malware Target Recognition (MaTR) is evaluated and compared with several commercial detection mechanisms at various levels of congestion. The research evaluates performance in terms of file response time and detection accuracy rates. Results show that, at the 95% confidence level, there is no statistically significant difference between MaTR's true mean response time when scanning clean files under low to moderate congestion and the leading commercial response times. MaTR demonstrates a slightly faster response time, by roughly 0.1s to 0.2s, at detecting malware than the leading commercial mechanisms at these congestion levels, but MaTR is also the only mechanism that exhibits false positives, with a 0.3% false positive rate. When exposed to high levels of congestion, MaTR's response time is degraded by a factor of 88 to 817 for clean files and 227 to 334 for malicious files, losing its performance competitiveness with other leading detection mechanisms. MaTR's true positive detection rate remains highly competitive at 99.1%.
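The two metric families the evaluation rests on, the congestion impact factor on response time and the detection rates from a confusion matrix, are simple to state. The sample numbers below are illustrative only and are chosen to mirror the magnitudes quoted in the abstract, not taken from the paper's data.

```python
import statistics

def impact_factor(baseline_times, congested_times):
    """Ratio of mean response time under congestion to the uncongested mean."""
    return statistics.mean(congested_times) / statistics.mean(baseline_times)

def detection_rates(tp, fn, fp, tn):
    """True positive rate and false positive rate from confusion-matrix counts."""
    return tp / (tp + fn), fp / (fp + tn)
```

An impact factor of 88, for example, means a scan that took 0.2 s on an idle link takes roughly 17.6 s under heavy congestion.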
A Mobile Computing Architecture for Numerical Simulation
In the domain of numerical simulation, the parallelization of numerical code
is common. A numerical context is defined by the configuration of resources
such as memory, processor load and the communication graph, and it has an
evolving feature: resource availability. One property is often missing:
adaptability. Resource availability is not predictable, so the ability to
adapt is essential. Without calling the existing implementations of these
codes into question, we build an adaptive use of them. Because execution has
to be driven by the availability of the main resources, the components of a
numerical computation must react when their context changes. This paper
proposes a new architecture, a mobile computing architecture, based on mobile
agents and JavaSpace. We apply this architecture to several case studies and
present our first results.
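The core idea, a computation component that reacts when the availability of its resources changes, can be sketched as follows. This is a hypothetical reduction: the JavaSpace coordination layer is collapsed into plain Python objects, and the class and field names are illustrative.

```python
class Host:
    """A node in the evolving numerical context, with its current availability."""

    def __init__(self, name: str, free_memory_mb: int):
        self.name = name
        self.free_memory_mb = free_memory_mb

class MobileAgent:
    """A computation component that re-places itself when resources change."""

    def __init__(self, required_memory_mb: int):
        self.required_memory_mb = required_memory_mb
        self.host = None

    def place(self, hosts: list) -> str:
        """React to current availability: migrate to the best-fitting host."""
        viable = [h for h in hosts if h.free_memory_mb >= self.required_memory_mb]
        if not viable:
            raise RuntimeError("no host satisfies the resource requirement")
        self.host = max(viable, key=lambda h: h.free_memory_mb)
        return self.host.name
```

Calling `place` again after availability changes is the adaptive step: the existing numerical code is untouched, only its placement is driven by the context.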
Agent-based Vs Agent-less Sandbox for Dynamic Behavioral Analysis
Malicious software is detected and classified by either static analysis or dynamic analysis. In static analysis, malware samples are reverse engineered and analyzed so that signatures of the malware can be constructed. These techniques can be easily thwarted by polymorphic and metamorphic malware, obfuscation, and packing techniques. In dynamic analysis, by contrast, malware samples are executed in a controlled environment using the sandboxing technique in order to model the behavior of the malware. In this paper, we analyze Petya, Spyeye, VolatileCedar, PAFISH, etc., through agent-based and agentless dynamic sandbox systems in order to investigate and benchmark their efficiency in advanced malware detection.
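The practical difference between the two sandbox styles can be sketched with a toy evasion check of the kind tools like PAFISH perform: an agent-based sandbox installs a monitoring process inside the guest, which an evasive sample can look for, while agentless introspection from the hypervisor leaves no such in-guest artifact. The artifact names here are hypothetical.

```python
# Hypothetical in-guest artifacts an analysis agent might leave behind.
AGENT_ARTIFACTS = {"agent.exe", "monitor.dll"}

def sample_acts_maliciously(visible_processes: set) -> bool:
    """Model an evasive sample: it stays dormant if it spots an agent.

    With an agent-based sandbox the agent process is visible inside the
    guest, so the sample hides its behavior; with an agentless sandbox
    nothing extra is visible and the full behavior can be recorded.
    """
    return not (AGENT_ARTIFACTS & visible_processes)
```

This is the evasion surface the benchmark in the paper probes: whichever sandbox triggers fewer such checks observes more of the malware's true behavior.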