942 research outputs found
A personal identification biometric system based on back-of-hand vein patterns
This report describes research on the use of back-of-hand vein patterns as a means of uniquely identifying people. In particular, it describes a prototype biometric system developed by the Australian Institute of Security and Applied Technology (AISAT). This system comprises an infrared cold source, a monochrome CCD camera, a monochrome frame-grabber, a personal computer, and custom image acquisition, processing, registration, and matching software. The image processing algorithms are based on mathematical morphology. Registration is performed using rotation and translation with respect to the centroid of the two-dimensional domain of a hand. Vein patterns are stored as medial axis representations. Matching involves comparing a given medial axis pattern against a library of patterns using constrained sequential correlation. The matching is two-fold: a newly acquired signature is matched against a dilated library signature, and then the library signature is matched against the dilated acquired signature; this is necessary because of the positional noise exhibited by the back-of-hand veins. The results of a cross-matching experiment for a sample of 20 adults and more than 100 hand images are detailed. In addition, preliminary estimates of the false acceptance rate (FAR) and false rejection rate (FRR) for the prototype system are given. Fuzzy relaxation on an association graph is discussed as an alternative to sequential correlation for the matching of vein signatures. An example is provided (including a C program) illustrating the matching process for a pair of signatures obtained from the same hand. The example demonstrates the ability of the fuzzy relaxation method to deal with segmentation errors.
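The two-fold matching step described above can be sketched as follows. This is a minimal illustration, not the AISAT implementation: constrained sequential correlation is reduced here to a simple coverage fraction, and the dilation radius and acceptance threshold are assumed values.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def coverage(pattern, reference, dilation_iters=2):
    """Fraction of medial-axis pixels in `pattern` covered by the dilated `reference`."""
    ref_dilated = binary_dilation(reference, iterations=dilation_iters)
    return np.logical_and(pattern, ref_dilated).sum() / max(pattern.sum(), 1)

def two_fold_match(acquired, library, threshold=0.8, dilation_iters=2):
    """Two-fold match: the acquired signature against the dilated library signature,
    and the library signature against the dilated acquired signature. Dilation
    tolerates the positional noise of back-of-hand veins."""
    fwd = coverage(acquired, library, dilation_iters)
    bwd = coverage(library, acquired, dilation_iters)
    return fwd >= threshold and bwd >= threshold
```

A signature shifted by one pixel still matches after dilation, while a structurally different pattern does not.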
Novel chromatin texture features for the classification of Pap smears
This paper presents a set of novel structural texture features for quantifying nuclear chromatin patterns in cells on a conventional Pap smear. The features are derived from an initial segmentation of the chromatin into bloblike texture primitives. The results of a comprehensive feature selection experiment, including the set of proposed structural texture features and a range of different cytology features drawn from the literature, show that two of the four top-ranking features are structural texture features. They also show that a combination of structural and conventional features yields a classification performance of 0.954±0.019 (AUC±SE) for the discrimination of normal (NILM) and abnormal (LSIL and HSIL) slides. The results of a second classification experiment, using only normal-appearing cells from both normal and abnormal slides, demonstrate that a single structural texture feature measuring chromatin margination yields a classification performance of 0.815±0.019. Overall, the results demonstrate the efficacy of the proposed structural approach and that it is possible to detect malignancy associated changes (MACs) in Papanicolaou-stained cells.
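The AUC±SE figures above pair an area under the ROC curve with its standard error. One common closed-form estimate (not necessarily the one used in this paper) is the Hanley–McNeil formula, sketched here:

```python
import math

def auc_se_hanley_mcneil(auc, n_abnormal, n_normal):
    """Standard error of an AUC estimate via the Hanley-McNeil (1982) formula.

    n_abnormal and n_normal are the class sample sizes; the formula only
    needs these and the AUC itself.
    """
    q1 = auc / (2 - auc)                 # P(two abnormal outrank one normal)
    q2 = 2 * auc ** 2 / (1 + auc)        # P(one abnormal outranks two normal)
    num = (auc * (1 - auc)
           + (n_abnormal - 1) * (q1 - auc ** 2)
           + (n_normal - 1) * (q2 - auc ** 2))
    return math.sqrt(num / (n_abnormal * n_normal))
```

As expected, the standard error shrinks as the sample sizes grow.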
Data Descriptor: Simultaneous Acquisition of EEG and NIRS during Cognitive Tasks for an Open Access Dataset
Enhanced performance by a hybrid NIRS–EEG brain computer interface
Noninvasive Brain Computer Interfaces (BCI) have been promoted for use in neuroprosthetics. However, reports on applications with electroencephalography (EEG) show a need for better accuracy and stability. Here we investigate whether near-infrared spectroscopy (NIRS) can be used to enhance the EEG approach. In our study both methods were applied simultaneously in a real-time Sensory Motor Rhythm (SMR)-based BCI paradigm, involving executed movements as well as motor imagery. We tested how the classification of NIRS data can complement ongoing real-time EEG classification. Our results show that simultaneous measurements of NIRS and EEG can significantly improve the classification accuracy of motor imagery in over 90% of considered subjects and increase performance by 5% on average (p < 0.01). However, the long time delay of the hemodynamic response may hinder an overall increase of bit-rates. Furthermore, we find that EEG and NIRS complement each other in terms of information content and are thus a viable multimodal imaging technique, suitable for BCI.
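One simple way such a hybrid classifier can be built is by fusing the per-trial class posteriors of an EEG classifier and a NIRS classifier. The sketch below is a hypothetical weighted-average fusion, not the paper's actual real-time method; the weight `w_eeg` is an assumed value.

```python
import numpy as np

def fuse_posteriors(p_eeg, p_nirs, w_eeg=0.6):
    """Weighted average of per-trial posterior probabilities (class 1)
    from two independently trained classifiers."""
    return w_eeg * np.asarray(p_eeg) + (1 - w_eeg) * np.asarray(p_nirs)

def classify(p_eeg, p_nirs, w_eeg=0.6):
    """Hybrid decision: threshold the fused posterior at 0.5."""
    return (fuse_posteriors(p_eeg, p_nirs, w_eeg) >= 0.5).astype(int)
```

A trial on which EEG is nearly undecided (posterior 0.55) but NIRS is confident (0.9) is pushed to a confident fused decision, illustrating how the second modality can stabilize borderline EEG classifications.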
A Systematic Approach from a Comparison of Three Glucocorticoids
Solid lipid nanoparticles (SLNs) can enhance drug penetration into the skin, yet the mechanism of the improved transport is not fully understood. To unravel the influence of the drug–particle interaction on penetration enhancement, three glucocorticoids (GCs), prednisolone (PD), the diester prednicarbate (PC), and the monoester betamethasone 17-valerate (BMV), varying in structure and lipophilicity, were loaded onto SLNs. Theoretical permeability coefficients (cm/s) of the agents rank BMV (–6.38) > PC (–6.57) > PD (–7.30). GC–particle interaction, drug release, and skin penetration were investigated, including a conventional oil-in-water cream for reference. With both the SLNs and the cream, PD release was clearly superior to PC release, which in turn exceeded BMV release. With the cream, the rank order did not change when studying skin penetration, so skin penetration is predominantly governed by drug release. With the GCs loaded onto SLNs, however, the penetration profile changed completely, and differences between the steroids were almost lost. Thus, SLNs influence skin penetration by an intrinsic mechanism linked to a specific interaction of the drug–carrier complex with the skin surface, made possible by the lipid nature and nanosize of the carrier; this mechanism cannot be inferred from drug-release testing alone. Interestingly, PC and PD uptake from SLNs even resulted in epidermal targeting. Thus, SLNs are not only able to improve skin penetration of topically applied drugs, but may also be of particular interest when specifically aiming to influence epidermal dysfunction.
Verifying object-oriented programs with higher-order separation logic in Coq
We present a shallow Coq embedding of a higher-order separation logic with nested triples for an object-oriented programming language. Moreover, we develop novel specification and proof patterns for reasoning in higher-order separation logic with nested triples about programs that use interfaces and interface inheritance. In particular, we show how to use the higher-order features of the Coq formalisation to specify and reason modularly about programs that (1) depend on some unknown code satisfying a specification or that (2) return objects conforming to a certain specification. All of our results have been formally verified in the interactive theorem prover Coq
Engineering with logic: Rigorous test-oracle specification and validation for TCP/IP and the Sockets API
Conventional computer engineering relies on test-and-debug development processes, with the behavior of common interfaces described (at best) with prose specification documents. But prose specifications cannot be used in test-and-debug development in any automated way, and prose is a poor medium for expressing complex (and loose) specifications.
The TCP/IP protocols and Sockets API are a good example of this: they play a vital role in modern communication and computation, and interoperability between implementations is essential. But what exactly they are is surprisingly obscure: their original development focused on “rough consensus and running code,” augmented by prose RFC specifications that do not precisely define what it means for an implementation to be correct. Ultimately, the actual standard is the de facto one of the common implementations, including, for example, the 15 000 to 20 000 lines of the BSD implementation—optimized and multithreaded C code, time dependent, with asynchronous event handlers, intertwined with the operating system, and security critical.
This article reports on work done in the Netsem project to develop lightweight, mathematically rigorous techniques that can be applied to such systems: to specify their behavior precisely (but loosely enough to permit the required implementation variation) and to test whether implementations correspond with these specifications, using specifications that are executable as test oracles. We developed post hoc specifications of TCP, UDP, and the Sockets API, both of the service that they provide to applications (in terms of TCP bidirectional stream connections) and of the internal operation of the protocol (in terms of TCP segments and UDP datagrams), together with a testable abstraction function relating the two. These specifications are rigorous, detailed, readable, with broad coverage, and rather accurate. Working within a general-purpose proof assistant (HOL4), we developed language idioms (within higher-order logic) in which to write the specifications: operational semantics with nondeterminism, time, system calls, monadic relational programming, and so forth. We followed an experimental semantics approach, validating the specifications against several thousand traces captured from three implementations (FreeBSD, Linux, and WinXP). Many differences between these were identified, as were a number of bugs. Validation was done using a special-purpose symbolic model checker programmed above HOL4.
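The idea of a loose specification used as a test oracle can be conveyed by a toy analogue. The real checker is a symbolic model checker above HOL4; the sketch below is a drastically simplified Python version that tracks the set of all spec states consistent with an observed trace, rejecting only when no candidate state remains. The three-event socket lifecycle in the example is hypothetical, not taken from the TCP specification.

```python
def accepts(step_relation, init_states, trace):
    """Test-oracle check: is the observed event trace allowed by a
    nondeterministic specification?

    step_relation(state, event) returns the SET of possible successor
    states, so looseness in the spec is represented directly: we keep
    every state the implementation might be in, and fail only when the
    candidate set becomes empty.
    """
    states = set(init_states)
    for event in trace:
        states = {s2 for s in states for s2 in step_relation(s, event)}
        if not states:
            return False
    return True

# Hypothetical, tiny connection lifecycle used purely for illustration.
def socket_step(state, event):
    table = {
        ("CLOSED", "connect"): {"SYN_SENT"},
        ("SYN_SENT", "synack"): {"ESTABLISHED"},
        ("ESTABLISHED", "close"): {"FIN_WAIT", "CLOSED"},  # loose: two outcomes
    }
    return table.get((state, event), set())
```

A trace that follows the lifecycle is accepted even though the spec is nondeterministic about the final state; an out-of-order trace is rejected.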
Having demonstrated that our logic-based engineering techniques suffice for handling real-world protocols, we argue that similar techniques could be applied to future critical software infrastructure at design time, leading to cleaner designs and (via specification-based testing) more robust and predictable implementations. In cases where specification looseness can be controlled, this should be possible with lightweight techniques, without the need for a general-purpose proof assistant, at relatively little cost.

EPSRC Programme Grant EP/K008528/1 REMS: Rigorous Engineering for Mainstream Systems
EPSRC Leadership Fellowship EP/H005633 (Sewell)
Royal Society University Research Fellowship (Sewell)
St Catharine's College Heller Research Fellowship (Wansbrough)
EPSRC grant GR/N24872 Wide-area programming: Language, Semantics and Infrastructure Design
EPSRC grant EP/C510712 NETSEM: Rigorous Semantics for Real Systems
EC FET-GC project IST-2001-33234 PEPITO Peer-to-Peer Computing: Implementation and Theory
CMI UROP internship support (Smith)
EC Thematic Network IST-2001-38957 APPSEM 2
NICTA was funded by the Australian Government's Backing Australia's Ability initiative, in part through the Australian Research Council
Reshaping cortical activity with subthalamic stimulation in Parkinson's disease during finger tapping and gait mapped by near infrared spectroscopy
Exploration of motor cortex activity is essential to understanding the pathophysiology of Parkinson's disease (PD), but only simple motor tasks can be investigated using fMRI or PET. We aim to investigate the cortical activity of PD patients during a complex motor task (gait) to verify the impact of deep brain stimulation of the subthalamic nucleus (DBS-STN) using near-infrared spectroscopy (NIRS). NIRS is a neuroimaging method that measures brain cortical activity using low-energy optical radiation to detect local changes in (de)oxyhemoglobin concentration. We used a multichannel portable NIRS system during finger tapping (FT) and gait. To determine the signal activity, our methodology consisted of a pre-processing phase for the raw signal, followed by statistical analysis based on a general linear model. Processed recordings from 9 patients were statistically compared between the on and off states of DBS-STN. DBS-STN led to increased activity in the contralateral motor cortex areas during FT. During gait, we observed a concentration of activity towards the central cortical area in the "stimulation-on" state. Our study shows how NIRS can be used to detect functional changes in the cortex of patients with PD with DBS-STN and indicates its future use for applications unsuited to PET and fMRI.
Formalized Verification of Snapshotable Trees: Separation and Sharing
Abstract. We use separation logic to specify and verify a Java program that implements snapshotable search trees, fully formalizing the specification and verification in the Coq proof assistant. We achieve local and modular reasoning about a tree and its snapshots and their iterators, although the implementation involves shared mutable heap data structures with no separation or ownership relation between the various data. The paper also introduces a series of four increasingly sophisticated implementations and verifies the first one. The others are included as future work and as a set of challenge problems for full functional specification and verification, whether by separation logic or by other formalisms.
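The snapshot semantics the paper specifies can be illustrated with a much simpler design than the one it verifies. The verified Java implementation shares mutable heap data between a tree and its snapshots; the sketch below instead uses purely functional path copying, a hypothetical alternative that trivially delivers the same observable behavior (old snapshots are unaffected by later inserts) without any sharing of mutable state.

```python
class Node:
    """Immutable binary-search-tree node."""
    __slots__ = ("key", "left", "right")

    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def insert(root, key):
    """Path-copying insert: returns a new root while leaving the old root
    intact, so any previously taken reference acts as a snapshot."""
    if root is None:
        return Node(key)
    if key < root.key:
        return Node(root.key, insert(root.left, key), root.right)
    if key > root.key:
        return Node(root.key, root.left, insert(root.right, key))
    return root  # key already present

def keys(root):
    """In-order iteration over a tree (or a snapshot of one)."""
    if root is not None:
        yield from keys(root.left)
        yield root.key
        yield from keys(root.right)
```

Holding on to an old root after further inserts shows the snapshot behavior: the old iterator sees the old contents, the new one the new contents.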
Development and validation of the predicted heat strain model
Abstract
Eight laboratories participated in a concerted research project on the assessment of hot working conditions. The objectives were, among others, to co-ordinate the work of the main European research teams in the field of thermal factors and to improve the methods available to assess the risks of heat disorders at the workplace, in particular the "Required Sweat Rate" model as presented in International Standard ISO 7933 (1989). The scientific bases of this standard were thoroughly reviewed and a revised model, called "Predicted Heat Strain" (PHS), was developed. This model was then used to predict the minute-by-minute sweat rates and rectal temperatures during 909 laboratory and field experiments collected from the partners. The Pearson correlation coefficients between observed and predicted values were equal to 0.76 and 0.66 for laboratory experiments and 0.74 and 0.59 for field experiments, respectively, for the sweat rates and the rectal temperatures. The change in sweat rate with time was predicted more accurately by the PHS model than by the required sweat rate model. This suggests that the PHS model would provide an improved basis upon which to determine allowable exposure times from the predicted heat strain in terms of dehydration and increased core temperature.
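The validation statistic used above, the Pearson correlation between observed and predicted series, is straightforward to compute; a minimal self-contained sketch (not the project's analysis code) follows.

```python
import math

def pearson_r(observed, predicted):
    """Pearson correlation coefficient between two equal-length series,
    e.g. observed vs. PHS-predicted sweat rates."""
    n = len(observed)
    mx = sum(observed) / n
    my = sum(predicted) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(observed, predicted))
    sxx = sum((x - mx) ** 2 for x in observed)
    syy = sum((y - my) ** 2 for y in predicted)
    return sxy / math.sqrt(sxx * syy)
```

A perfectly proportional prediction yields r = 1, a perfectly inverted one r = -1; values such as the reported 0.76 indicate a strong but imperfect linear agreement.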