Longitudinal Study of Child Face Recognition
We present a longitudinal study of face recognition performance on the
Children Longitudinal Face (CLF) dataset, containing 3,682 face images of 919 subjects,
in the age group [2, 18] years. Each subject has at least four face images
acquired over a time span of up to six years. Face comparison scores are
obtained from (i) a state-of-the-art COTS matcher (COTS-A), (ii) an open-source
matcher (FaceNet), and (iii) a simple sum fusion of scores obtained from COTS-A
and FaceNet matchers. To improve the performance of the open-source FaceNet
matcher for child face recognition, we fine-tune it on an independent
training set of 3,294 face images of 1,119 children in the age
group [3, 18] years. Multilevel statistical models are fit to genuine
comparison scores from the CLF dataset to determine the decrease in face
recognition accuracy over time. Additionally, we analyze both the verification
and open-set identification accuracies in order to evaluate state-of-the-art
face recognition technology for tracing and identifying children lost at a
young age as victims of child trafficking or abduction.
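The sum fusion of COTS-A and FaceNet scores mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the score values and the min-max normalization step (commonly applied before summing scores from matchers with different ranges) are assumptions.

```python
# Hypothetical sketch of simple sum fusion of two matchers' comparison
# scores. Score ranges and the normalization step are illustrative.

def min_max_normalize(scores):
    """Scale a list of comparison scores to [0, 1]."""
    lo, hi = min(scores), max(scores)
    if hi == lo:
        return [0.0 for _ in scores]
    return [(s - lo) / (hi - lo) for s in scores]

def sum_fusion(cots_scores, facenet_scores):
    """Fuse per-pair scores from two matchers by normalized summation."""
    a = min_max_normalize(cots_scores)
    b = min_max_normalize(facenet_scores)
    return [x + y for x, y in zip(a, b)]

fused = sum_fusion([0.61, 0.32, 0.88], [0.42, 0.19, 0.77])
print(fused)  # highest fused score corresponds to the best-matching pair
```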
Multiple verification in computational modeling of bone pathologies
We introduce a model checking approach to diagnosing emerging bone
pathologies. The implementation of a new model of bone remodeling in PRISM has
led to an interesting characterization of osteoporosis as defective bone
remodeling dynamics with respect to other bone pathologies. Our approach
allows us to derive three types of model checking-based diagnostic estimators. The first
diagnostic measure focuses on the level of bone mineral density, which is
currently used in medical practice. In addition, we have introduced a novel
diagnostic estimator which uses the full patient clinical record, here
simulated using the modeling framework. This estimator detects rapid (on the
order of months) negative changes in bone mineral density. Independently of
the actual bone mineral density, when the decrease occurs rapidly it is
important to alert the patient and monitor them more closely to detect the
onset of other bone co-morbidities. A third estimator takes into account the variance of the bone
density, which could support the investigation of metabolic syndromes,
diabetes, and cancer. Our implementation could make use of different logical combinations
of these statistical estimators and could incorporate other biomarkers for
other systemic co-morbidities (for example, diabetes and thalassemia). The
combination of stochastic modeling with formal methods motivates a new
diagnostic framework for complex pathologies. In particular, our approach
takes into consideration important properties of biosystems such as
multiscale organization and self-adaptiveness. The multi-diagnosis could
be further expanded, inching towards the complexity of human diseases. Finally,
we briefly introduce self-adaptiveness in formal methods, a key property of
the regulative mechanisms of biological systems that is well known in other
mathematical and engineering areas.

Comment: In Proceedings CompMod 2011, arXiv:1109.104
Oscillation-based DFT for Second-order Bandpass OTA-C Filters
This document is the Accepted Manuscript version. Under embargo until 6 September 2018. The final publication is available at Springer via https://doi.org/10.1007/s00034-017-0648-9.

This paper describes a design-for-testability technique for second-order bandpass operational transconductance amplifier and capacitor (OTA-C) filters using an oscillation-based test topology. The oscillation-based test structure is a vectorless output test strategy easily extendable to built-in self-test. The proposed methodology converts the filter under test into a quadrature oscillator using very simple techniques and measures the output frequency. Using feedback loops with a nonlinear block, the filter-to-oscillator conversion techniques easily convert the bandpass OTA-C filter into an oscillator. With a minimum number of extra components, the proposed scheme requires a negligible area overhead. The validity of the proposed method has been verified by comparing faulty and fault-free simulation results of Tow-Thomas and KHN OTA-C filters. Simulation results in 0.25μm CMOS technology show that the proposed oscillation-based test strategy for OTA-C filters is suitable for catastrophic and parametric fault testing and is also effective in detecting single and multiple faults with high fault coverage.
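The test decision in an oscillation-based scheme like the one above reduces to comparing the measured oscillation frequency of the converted filter against its fault-free nominal value. The following sketch shows that decision step only; the tolerance band is a hypothetical test parameter, not taken from the paper.

```python
# Hedged sketch of the oscillation-based test decision: a fault is
# declared when the measured oscillation frequency deviates from the
# nominal (fault-free) frequency by more than a chosen tolerance.

def oscillation_test(measured_hz, nominal_hz, tolerance=0.05):
    """Pass (True) if measured frequency is within +/- tolerance of nominal."""
    deviation = abs(measured_hz - nominal_hz) / nominal_hz
    return deviation <= tolerance

print(oscillation_test(10_200.0, 10_000.0))  # 2% deviation -> fault-free
print(oscillation_test(12_500.0, 10_000.0))  # 25% deviation -> faulty
```

In practice the tolerance would be set from the fault-free simulation spread so that parametric faults shift the frequency outside the band while process variation does not.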
A Survey of Languages for Specifying Dynamics: A Knowledge Engineering Perspective
A number of formal specification languages for knowledge-based systems have been developed. Knowledge-based systems are characterized by a complex knowledge base and an inference engine which uses this knowledge to solve a given problem. Specification languages for knowledge-based systems have to cover both aspects: they have to provide the means to specify a complex and large amount of knowledge, and they have to provide the means to specify the dynamic reasoning behavior of a knowledge-based system. We focus on the second aspect. For this purpose, we survey existing approaches for specifying dynamic behavior in related areas of research. Specifically, we consider approaches for the specification of information systems (Language for Conceptual Modeling and TROLL), approaches for the specification of database updates and logic programming (Transaction Logic and Dynamic Database Logic), and the generic specification framework of abstract state machines.
Active Sampling-based Binary Verification of Dynamical Systems
Nonlinear, adaptive, or otherwise complex control techniques are increasingly
relied upon to ensure the safety of systems operating in uncertain
environments. However, the nonlinearity of the resulting closed-loop system
complicates verification that the system does in fact satisfy those
requirements at all possible operating conditions. While analytical proof-based
techniques and finite abstractions can be used to provably verify the
closed-loop system's response at different operating conditions, they often
produce conservative approximations due to restrictive assumptions and are
difficult to construct in many applications. In contrast, popular statistical
verification techniques relax the restrictions and instead rely upon
simulations to construct statistical or probabilistic guarantees. This work
presents a data-driven statistical verification procedure that instead
constructs statistical learning models from simulated training data to separate
the set of possible perturbations into "safe" and "unsafe" subsets. Binary
evaluations of closed-loop system requirement satisfaction at various
realizations of the uncertainties are obtained through temporal logic
robustness metrics, which are then used to construct predictive models of
requirement satisfaction over the full set of possible uncertainties. As the
accuracy of these predictive statistical models is inherently coupled to the
quality of the training data, an active learning algorithm selects additional
sample points in order to maximize the expected change in the data-driven model
and thus, indirectly, minimize the prediction error. Various case studies
demonstrate the closed-loop verification procedure and highlight improvements
in prediction error over both existing analytical and statistical verification
techniques.

Comment: 23 pages
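The active sampling loop described above can be sketched in simplified form: a binary "safe"/"unsafe" predictor is trained on simulated evaluations, and each iteration queries the point expected to change the model most. Everything here is an illustrative assumption: the 1-D parameter space, the threshold oracle standing in for a closed-loop simulation with a temporal-logic robustness label, the nearest-neighbor predictor, and the farthest-point query rule used as a proxy for maximal expected model change.

```python
# Simplified sketch of active sampling-based binary verification.
# Oracle, predictor, and query rule are illustrative stand-ins.

def oracle(x):
    """Stand-in for a closed-loop simulation: True iff requirement holds."""
    return x < 0.6   # hypothetical safe region of the uncertainty space

def predict(train, x):
    """Nearest-neighbor prediction of the binary safety label."""
    nearest = min(train, key=lambda pt: abs(pt[0] - x))
    return nearest[1]

def next_query(train, candidates):
    """Pick the candidate farthest from any labeled sample (a crude proxy
    for the point of maximal expected change in the model)."""
    return max(candidates, key=lambda x: min(abs(pt[0] - x) for pt in train))

candidates = [i / 100 for i in range(101)]        # discretized perturbations
train = [(0.0, oracle(0.0)), (1.0, oracle(1.0))]  # initial labeled samples
for _ in range(20):                               # active sampling iterations
    x = next_query(train, candidates)
    train.append((x, oracle(x)))                  # binary evaluation

print(predict(train, 0.55), predict(train, 0.65))  # either side of boundary
```

Because queries concentrate where the labeled data is sparsest, the sampled points end up roughly uniform over the space and the predicted safe/unsafe split converges toward the true boundary with far fewer simulations than exhaustive gridding.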