15,895 research outputs found
Predictive biometrics: A review and analysis of predicting personal characteristics from biometric data
Interest in the exploitation of soft biometrics information has continued to develop over the last decade or so. In comparison with traditional biometrics, which focuses principally on person identification, the idea of soft biometrics processing is to study the utilisation of more general information regarding a system user, which is not necessarily unique. There are increasing indications that this type of data will have great value in providing complementary information for user authentication. However, the authors have also seen a growing interest in broadening the predictive capabilities of biometric data, encompassing both easily definable characteristics such as subject age and, most recently, `higher level' characteristics such as emotional or mental states. This study will present a selective review of the predictive capabilities, in the widest sense, of biometric data processing, providing an analysis of the key issues still to be adequately addressed if this concept of predictive biometrics is to be fully exploited in the future
Self-Evaluation Applied Mathematics 2003-2008 University of Twente
This report contains the self-study for the research assessment of the Department of Applied Mathematics (AM) of the Faculty of Electrical Engineering, Mathematics and Computer Science (EEMCS) at the University of Twente (UT). The report provides the information for the Research Assessment Committee for Applied Mathematics, dealing with mathematical sciences at the three universities of technology in the Netherlands. It describes the state of affairs pertaining to the period 1 January 2003 to 31 December 2008
Bibliographic Review on Distributed Kalman Filtering
In recent years, a compelling need has arisen to understand the effects of distributed information structures on estimation and filtering. In this paper, a bibliographical review on distributed Kalman filtering (DKF) is provided.
The paper contains a classification of the different approaches and methods applied to DKF. The applications of DKF are also discussed and explained separately. A comparison of different approaches is briefly carried out. Contemporary research focuses are also addressed, with emphasis on the practical applications of the techniques. An exhaustive list of publications, linked directly or indirectly to DKF in the open literature, is compiled to provide an overall picture of the different developing aspects of this area
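The core idea behind most DKF schemes surveyed in such reviews is that each sensor node runs a local Kalman filter and then exchanges information with neighbours. As a hedged illustration only (the review covers many variants; this is the simplest consensus-style scheme, not any particular method from the paper), the sketch below shows two nodes filtering noisy measurements of a constant scalar state and averaging their estimates:

```python
import random

# Minimal consensus-style distributed Kalman filtering sketch (illustrative
# assumption: static scalar state, identity measurement model H = 1).
# Each node runs a local Kalman measurement update, then the nodes
# average their estimates as a single consensus step.

def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update for a static state."""
    K = P / (P + R)          # Kalman gain
    x = x + K * (z - x)      # corrected estimate
    P = (1 - K) * P          # corrected variance
    return x, P

random.seed(0)
true_state = 5.0
nodes = [{"x": 0.0, "P": 10.0, "R": 0.5} for _ in range(2)]

for _ in range(50):
    # Local measurement update at each node
    for n in nodes:
        z = true_state + random.gauss(0.0, n["R"] ** 0.5)
        n["x"], n["P"] = kalman_update(n["x"], n["P"], z, n["R"])
    # Consensus step: neighbouring nodes average their estimates
    avg = sum(n["x"] for n in nodes) / len(nodes)
    for n in nodes:
        n["x"] = avg

print(round(nodes[0]["x"], 2))  # both nodes converge near the true state 5.0
```

Real DKF algorithms differ in what is exchanged (estimates, innovations, or information-form quantities) and in the network topology, which is precisely the classification axis such reviews organise.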
Bibliometric studies on single journals: a review
This paper covers a total of 82 bibliometric studies on single journals (62 studies cover unique titles) published between 1998 and 2008, grouped into the following fields: Arts, Humanities and Social Sciences (12 items); Medical and Health Sciences (19 items); Sciences and Technology (30 items); and Library and Information Sciences (21 items). Under each field the studies are described according to their geographical location in the following order: United Kingdom, United States and the Americas, Europe, Asia (India, Africa and Malaysia). For each study, the elements described are (a) the journal's publication characteristics and indexation information; (b) the objectives; (c) the sampling and bibliometric measures used; and (d) the results observed. A list of journal titles studied is appended. The results show that (a) bibliometric studies cover journals in various fields; (b) there are several revisits of some journals which are considered important; (c) Asian and African contributions are high (41.4% of total studies; 43.5% covering unique titles), followed by the United States (30.4% of total; 31.0% on unique titles), Europe (18.2% of total and 14.5% on unique titles) and the United Kingdom (10% of total and 11% on unique titles); (d) a high number of bibliometricians are Indian and as such coverage of Indian journals is high (28% of total studies; 30.6% of unique titles); and (e) the quality of the journals and their importance either nationally or internationally are inferred from their indexation status
An Assessment of PIER Electric Grid Research 2003-2014 White Paper
This white paper describes the circumstances in California around the turn of the 21st century that led the California Energy Commission (CEC) to direct additional Public Interest Energy Research funds to address critical electric grid issues, especially those arising from integrating high penetrations of variable renewable generation with the electric grid. It contains an assessment of the beneficial science and technology advances of the resultant portfolio of electric grid research projects administered under the direction of the CEC by a competitively selected contractor, the University of California’s California Institute for Energy and the Environment, from 2003-2014
Assessing the Viability of Complex Electrical Impedance Tomography (EIT) with a Spatially Distributed Sensor Array for Imaging of River Bed Morphology: a Proof of Concept (Study)
This report was produced as part of a NERC funded ‘Connect A’ project to establish a new collaborative partnership between the University of Worcester (UW) and Q-par Angus Ltd. The project aim was to assess the potential of using complex Electrical Impedance Tomography (EIT) to image river bed morphology. An assessment of the viability of sensors inserted vertically into the channel margins to provide real-time or near real-time monitoring of bed morphology is reported. Funding has enabled UW to carry out a literature review of the use of EIT and existing methods used for river bed surveys, and outline the requirements of potential end-users. Q-par Angus has led technical developments and assessed the viability of EIT for this purpose.
EIT is one of a suite of tomographic imaging techniques and has already been used as an imaging tool for medical analysis, industrial processing and geophysical site survey work. The method uses electrodes placed on the margins or boundary of the entity being imaged, and a current is applied to some and measured on the remaining ones. Tomographic reconstruction uses algorithms to estimate the distribution of conductivity within the object and produce an image of this distribution from impedance measurements.
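The report does not specify the reconstruction algorithm, so as a hedged illustration of the generic linearised inverse step it describes (estimating a conductivity distribution from boundary impedance measurements), the sketch below uses Tikhonov-regularised least squares. The sensitivity matrix `J` is a random stand-in for one that would in practice be computed from the electrode geometry and a forward model:

```python
import numpy as np

# Illustrative linearised EIT reconstruction (assumption: a toy random
# sensitivity matrix replaces the physics-based Jacobian of a real
# electrode/mesh model).

rng = np.random.default_rng(42)
n_meas, n_pix = 64, 16                       # boundary measurements, image pixels
J = rng.standard_normal((n_meas, n_pix))     # toy sensitivity (Jacobian) matrix

sigma_true = np.zeros(n_pix)
sigma_true[5] = 1.0                          # one conductivity anomaly ("bed feature")
v = J @ sigma_true + 0.01 * rng.standard_normal(n_meas)  # noisy measurements

lam = 1e-2                                   # Tikhonov regularisation strength
# Solve (J^T J + lam*I) sigma = J^T v for the conductivity image
sigma_rec = np.linalg.solve(J.T @ J + lam * np.eye(n_pix), J.T @ v)

print(int(np.argmax(np.abs(sigma_rec))))     # anomaly located at pixel 5
```

The regularisation term is what the text alludes to in calling the reconstruction algorithms "sophisticated": the inverse problem is ill-posed for realistic sensitivity matrices, and the choice of `lam` trades spatial resolution against noise amplification.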
The advantages of EIT lie in the inherent simplicity, low cost and portability of the hardware, the high speed of data acquisition for real-time or near real-time monitoring, robust sensors, and the non-invasive manner in which the object is monitored. The need for sophisticated image reconstruction algorithms, and the provision of images with adequate spatial resolution, are key challenges.
A literature review of the use of EIT suggests that, despite its many other applications, to the best of our knowledge only one study to date has utilised EIT for river survey work (Sambuelli et al., 2002). That study supported the notion that EIT may provide an innovative way of describing river bed morphology in a cost-effective way. However, it used an invasive sensor array, and therefore the potential for using EIT in a non-invasive way in a river environment is still to be tested.
A review of existing methods to monitor river bed morphology indicates that a plethora of techniques have been applied by a range of disciplines, including fluvial geomorphology, ecology and engineering. However, none provide non-invasive, low-cost assessments in real-time or near real-time. Therefore, EIT has the potential to meet the requirements of end users that no existing technique can accomplish.
Work led by Q-par Angus Ltd. has assessed the technical requirements of the proposed approach, including probe design and deployment, sensor array parameters, data acquisition, image reconstruction and test procedure. Consequently, the success of this collaboration, literature review, identification of the proposed approach and potential applications of this technique have encouraged the authors to seek further funding to test, develop and market this approach through the development of a new environmental sensor
Grand Challenges of Traceability: The Next Ten Years
In 2007, the software and systems traceability community met at the first Natural Bridge symposium on the Grand Challenges of Traceability to establish and address research goals for achieving effective, trustworthy, and ubiquitous traceability. Ten years later, in 2017, the community came together to evaluate a decade of progress towards achieving these goals. These proceedings document some of that progress. They include a series of short position papers, representing current work in the community organized across four process axes of traceability practice. The sessions covered topics from Trace Strategizing, Trace Link Creation and Evolution, Trace Link Usage, real-world applications of Traceability, and Traceability Datasets and benchmarks. Two breakout groups focused on the importance of creating and sharing traceability datasets within the research community, and discussed challenges related to the adoption of tracing techniques in industrial practice. Members of the research community are engaged in many active, ongoing, and impactful research projects. Our hope is that ten years from now we will be able to look back at a productive decade of research and claim that we have achieved the overarching Grand Challenge of Traceability, which seeks for traceability to be always present, built into the engineering process, and for it to have "effectively disappeared without a trace". We hope that others will see the potential that traceability has for empowering software and systems engineers to develop higher-quality products at increasing levels of complexity and scale, and that they will join the active community of Software and Systems traceability researchers as we move forward into the next decade of research
Compact and accurate models of large single-wall carbon-nanotube interconnects
Single-wall carbon nanotubes (SWCNTs) have been proposed for very large scale integration interconnect applications, and their modeling is carried out using the multiconductor transmission line (MTL) formulation. Their time-domain analysis has simulation issues related to the high number of SWCNTs within each bundle, which results in a highly complex model and loss of accuracy in the case of long interconnects. In recent years, several techniques have been proposed to reduce the complexity of the model, but their accuracy decreases as the interconnect length increases. This paper presents a rigorous new technique to generate accurate reduced-order models of large SWCNT interconnects. The frequency response of the MTL is computed by using the spectral form of the dyadic Green's function of the 1-D propagation problem, and the model complexity is reduced using rational-model identification techniques. The proposed approach is validated by numerical results involving hundreds of SWCNTs, which confirm its capability of reducing the complexity of the model while preserving accuracy over a wide frequency range
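The abstract names rational-model identification as the reduction step but gives no algorithm, so as a hedged stand-in the sketch below uses the simplest such scheme, a Levy-style linear least-squares fit: given samples of a frequency response, linearise N(s) - H(s)D(s) = 0 and solve for the numerator and denominator coefficients. The "measured" response is synthesised from a known one-pole system rather than an actual MTL Green's-function computation:

```python
import numpy as np

# Illustrative rational-model identification (Levy linearisation).
# Fit H(s) ~= (b0 + b1*s) / (a0 + s) to sampled frequency-response data;
# the data here come from the known one-pole system H(s) = 1/(s + 1).

w = np.logspace(-2, 2, 200)          # angular frequency samples
s = 1j * w
H = 1.0 / (s + 1.0)                  # "measured" frequency response

# Linearise N(s_k) - H_k * D(s_k) = 0 with monic denominator D(s) = a0 + s:
#   b0 + b1*s_k - H_k*a0 = H_k*s_k
A = np.column_stack([np.ones_like(s), s, -H])
rhs = H * s
# Stack real and imaginary parts so the unknown coefficients stay real
A_ri = np.vstack([A.real, A.imag])
rhs_ri = np.concatenate([rhs.real, rhs.imag])
b0, b1, a0 = np.linalg.lstsq(A_ri, rhs_ri, rcond=None)[0]

print(round(a0, 3), round(b0, 3), round(abs(b1), 3))  # ~ 1.0 1.0 0.0
```

Production tools typically use iteratively reweighted variants (e.g. vector fitting) because the plain Levy linearisation biases the fit at high frequencies, but the linear-algebra core is the same.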