Pattern Recognition
Pattern recognition is a very broad research field. It involves elements as diverse as sensors, feature extraction, pattern classification, decision fusion and applications. The signals processed are commonly one-, two- or three-dimensional; processing may run in real time or take hours and days; some systems look for one narrow object class, while others search huge databases for entries with at least a small degree of similarity. No single person can claim expertise across the whole field, which develops rapidly, updates its paradigms and encompasses several philosophical approaches. This book reflects this diversity by presenting a selection of recent developments in pattern recognition and related fields. It covers theoretical advances in classification and feature extraction as well as application-oriented work. The authors of these 25 contributions present and advocate recent achievements of their research in the field of pattern recognition.
Unveiling the frontiers of deep learning: innovations shaping diverse domains
Deep learning (DL) enables the development of computer models that are capable of learning, visualizing, optimizing, refining, and predicting data. In recent years, DL has been applied in a range of fields, including audio-visual data processing, agriculture, transportation prediction, natural language, biomedicine, disaster management, bioinformatics, drug design, genomics, face recognition, and ecology. To explore the current state of deep learning, it is necessary to investigate its latest developments and applications in these disciplines. However, the literature lacks a survey of deep learning applications across all potential sectors. This paper therefore extensively investigates the potential applications of deep learning across all major fields of study, as well as the associated benefits and challenges. As evidenced in the literature, DL is accurate in prediction and analysis, which makes it a powerful computational tool, and it can learn representations and optimize itself, making it effective at processing data with little prior feature engineering. At the same time, deep learning requires massive amounts of data for effective analysis and processing. To handle the challenge of compiling huge amounts of medical, scientific, healthcare, and environmental data for use in deep learning, gated architectures such as LSTMs and GRUs can be utilized. For multimodal learning, a neural network needs shared neurons serving all tasks and specialized neurons for particular tasks.
Comment: 64 pages, 3 figures, 3 tables
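The multimodal recommendation above (shared neurons for all tasks plus specialized neurons per task) is commonly called hard parameter sharing. A minimal sketch of the idea, with all layer sizes, the seed, and the task names invented for illustration rather than taken from the paper:

```python
# Illustrative sketch only: one shared trunk of "neurons" feeds
# task-specific heads. Layer sizes and task names are assumptions.
import random

random.seed(0)

def linear(in_dim, out_dim):
    """Create a randomly initialised weight matrix (list of rows)."""
    return [[random.uniform(-1, 1) for _ in range(in_dim)]
            for _ in range(out_dim)]

def forward(weights, x):
    """Apply a linear layer followed by a ReLU non-linearity."""
    return [max(0.0, sum(w * v for w, v in zip(row, x))) for row in weights]

# Shared trunk: used by every task (the "shared neurons").
shared = linear(4, 8)
# Specialised heads: one per task (the "specialised neurons").
heads = {"audio": linear(8, 2), "vision": linear(8, 3)}

def predict(task, x):
    h = forward(shared, x)          # shared representation
    return forward(heads[task], h)  # task-specific output

out_audio = predict("audio", [0.5, -0.2, 0.1, 0.9])
out_vision = predict("vision", [0.5, -0.2, 0.1, 0.9])
print(len(out_audio), len(out_vision))  # 2 3
```

Each head can be trained on its own task while gradients from all tasks update the shared trunk, which is what makes the shared neurons serve every modality.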
Engineering derivatives from biological systems for advanced aerospace applications
The present study consisted of a literature survey, a survey of researchers, and a workshop on bionics. These tasks produced an extensive annotated bibliography of bionics research (282 citations), a directory of bionics researchers, and a workshop report on specific bionics research topics applicable to space technology. These deliverables are included as Appendix A, Appendix B, and Section 5.0, respectively. To provide organization to this highly interdisciplinary field and to serve as a guide for interested researchers, we have also prepared a taxonomy or classification of the various subelements of natural engineering systems. Finally, we have synthesized the results of the various components of this study into a discussion of the most promising opportunities for accelerated research, seeking solutions which apply engineering principles from natural systems to advanced aerospace problems. A discussion of opportunities within the areas of materials, structures, sensors, information processing, robotics, autonomous systems, life support systems, and aeronautics is given. Following the conclusions are six discipline summaries that highlight the potential benefits of research in these areas for NASA's space technology programs
Bioelectrical User Authentication
There has been tremendous growth in mobile devices, including mobile phones and tablets, in recent years. Mobile phones are especially prevalent owing to their increasing functionality and capacity. Most mobile phones available now are smartphones with substantial processing capability, and they are therefore used to process large volumes of information. The information contained in these smartphones needs to be protected against unauthorised persons getting hold of personal data. To verify a legitimate user before granting access to the phone's information, the user authentication mechanism must be robust enough to meet present security challenges. The current approach to user authentication is cumbersome and fails to consider the human factor. The point-of-entry mechanism is intrusive, forcing users to authenticate every time irrespective of the time interval. Biometrics are identified as a more reliable method for implementing transparent and non-intrusive user authentication. Transparent authentication using biometrics offers more convenient and secure authentication than secret-knowledge or token-based approaches. The ability to apply biometrics in a transparent manner improves authentication security by providing a reliable way to authenticate smartphone users. As such, research is required to investigate new modalities that can operate within the constraints of a continuous and transparent authentication system. This thesis explores the use of bioelectrical signals and contextual information for a non-intrusive approach to authenticating the user of a mobile device. From the fusion of bioelectrical signals and context-awareness information, three algorithms were created that discriminate subjects with overall Equal Error Rates (EER) of 3.4%, 2.04% and 0.27% respectively.
Based on the analysis of the multi-algorithm implementation, a novel architecture is proposed that uses a multi-algorithm biometric authentication system to authenticate the user of a smartphone. The framework is designed to be continuous and transparent, with the application of advanced intelligence to further improve the authentication result. The proposed framework removes the inconvenience of memorising passwords or passphrases, carrying tokens, or capturing a biometric sample in an intrusive manner. The framework is evaluated through simulation with the application of a voting scheme. The simulation of the voting scheme using majority voting improved the performance of the combined algorithm (security level 2) to an FRR of 22% and FAR of 0%, the active algorithm (security level 2) to an FRR of 14.33% and FAR of 0%, and the non-active algorithm (security level 3) to an FRR of 10.33% and FAR of 0%
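The fusion step of the voting scheme can be sketched briefly. The decision values and the strict-majority threshold below are illustrative assumptions; the thesis's actual classifiers are not reproduced here:

```python
# Hedged sketch of majority-voting fusion across several authentication
# algorithms: accept the user only if a strict majority accept.
from collections import Counter

def majority_vote(decisions):
    """Fuse boolean accept/reject decisions from several algorithms."""
    counts = Counter(decisions)
    return counts[True] > len(decisions) / 2

# Example: three algorithms (e.g. combined, active, non-active) each vote.
print(majority_vote([True, True, False]))   # True  (2 of 3 accept)
print(majority_vote([True, False, False]))  # False (1 of 3 accept)
```

Requiring a strict majority is what lets the fused decision reject impostors even when one algorithm is fooled, which is consistent with the FAR of 0% reported above.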
Technology 2002: the Third National Technology Transfer Conference and Exposition, Volume 1
The proceedings from the conference are presented. The topics covered include the following: computer technology, advanced manufacturing, materials science, biotechnology, and electronics
Securing clouds using cryptography and traffic classification
Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. Over the last decade, cloud computing has gained popularity and wide acceptance, especially within the health sector where it offers several advantages such as low costs, flexible processes, and access from anywhere.
Although cloud computing is widely used in the health sector, numerous issues remain unresolved. Several studies have attempted to review the state of the art in eHealth cloud privacy and security; however, some of these studies are outdated or do not cover certain vital features of cloud security and privacy, such as access control, revocation and data recovery plans. This study targets some of these problems and proposes protocols, algorithms and approaches to enhance the security and privacy of cloud computing, with particular reference to eHealth clouds.
Chapter 2 presents an overview and evaluation of the state of the art in eHealth security and privacy. Chapter 3 introduces different research methods and describes the research design methodology and processes used to carry out the research objectives. Of particular importance are authenticated key exchange and block cipher modes. In Chapter 4, a three-party password-based authenticated key exchange (TPAKE) protocol is presented and its security analysed. The proposed TPAKE protocol shares no plaintext data; all data shared between the parties are either hashed or encrypted. Using the random oracle model (ROM), the security of the proposed TPAKE protocol is formally proven based on the computational Diffie-Hellman (CDH) assumption. Furthermore, the analysis included in this chapter shows that the proposed protocol can ensure perfect forward secrecy and resist many kinds of common attacks, such as man-in-the-middle attacks, online and offline dictionary attacks, replay attacks and known-key attacks. Chapter 5 proposes a parallel block cipher (PBC) mode in which cipher blocks are processed in parallel. The results of speed performance tests for this PBC mode in various settings are presented and compared with the standard CBC mode. Compared to the CBC mode, the PBC mode is shown to give execution time savings of 60%. Furthermore, in addition to encryption based on AES-128, the hash value of the data file can be utilised to provide an integrity check. As a result, the PBC mode has a better speed performance while retaining the confidentiality and security provided by the CBC mode.
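The CDH assumption underpinning TPAKE rests on the ordinary Diffie-Hellman exchange: both sides derive the same key without ever transmitting it, and only derived (hashed) values would cross the wire. A toy illustration of that underlying idea (not the thesis's TPAKE protocol), using a deliberately small prime for readability where a real deployment would use a large standardised group:

```python
# Toy Diffie-Hellman sketch. The prime and generator are assumptions
# chosen for readability; they are far too small for real security.
import hashlib
import secrets

p = 2147483647  # a small Mersenne prime (2**31 - 1), toy-sized only
g = 7

a = secrets.randbelow(p - 2) + 1  # Alice's ephemeral secret
b = secrets.randbelow(p - 2) + 1  # Bob's ephemeral secret
A = pow(g, a, p)                  # sent in the clear
B = pow(g, b, p)                  # sent in the clear

# Each side combines the other's public value with its own secret;
# hashing the result yields the session key that never crosses the wire.
key_alice = hashlib.sha256(pow(B, a, p).to_bytes(32, "big")).hexdigest()
key_bob = hashlib.sha256(pow(A, b, p).to_bytes(32, "big")).hexdigest()
print(key_alice == key_bob)  # True
```

Because fresh secrets a and b are drawn per session and then discarded, compromising a long-term credential later does not reveal past session keys, which is the perfect-forward-secrecy property claimed for the protocol.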
Chapter 6 applies TPAKE and PBC to eHealth clouds. Related work on security, privacy preservation and disaster recovery is reviewed. Next, two approaches focusing on security preservation and privacy preservation, and a disaster recovery plan, are proposed. The security preservation approach is a robust means of ensuring the security and integrity of electronic health records and is based on the PBC mode, while the privacy preservation approach is an efficient authentication method which protects the privacy of personal health records and is based on the TPAKE protocol. A discussion about how these integrated approaches and the disaster recovery plan can ensure the reliability and security of cloud projects follows.
Distributed denial of service (DDoS) attacks are the second most common cybercrime attacks after information theft. The timely detection and prevention of such attacks in cloud projects are therefore vital, especially for eHealth clouds. Chapter 7 presents a new classification system for detecting and preventing DDoS TCP flood attacks (CS_DDoS) for public clouds, particularly in an eHealth cloud environment. The proposed CS_DDoS system offers a solution for securing stored records by classifying incoming packets and making a decision based on these classification results. During the detection phase, CS_DDoS identifies and determines whether a packet is normal or from an attacker. During the prevention phase, packets classified as malicious are denied access to the cloud service, and the source IP is blacklisted. The performance of the CS_DDoS system is compared using four different classifiers: a least-squares support vector machine (LS-SVM), naïve Bayes, K-nearest-neighbour, and multilayer perceptron. The results show that CS_DDoS yields the best performance when the LS-SVM classifier is used. This combination can detect DDoS TCP flood attacks with an accuracy of approximately 97% and a Kappa coefficient of 0.89 when under attack from a single source, and 94% accuracy and a Kappa coefficient of 0.9 when under attack from multiple attackers. These results are then discussed in terms of accuracy and time complexity, and are validated using a k-fold cross-validation model.
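The detection phase described above can be sketched with the K-nearest-neighbour classifier, one of the four classifiers compared. The feature set (packet rate, mean payload size) and the training samples below are invented for illustration and are not taken from the thesis:

```python
# Illustrative sketch of per-flow classification: label an incoming flow
# "normal" or "attack" by majority vote among its k nearest neighbours.
# Features and training data are assumptions made for the example.
import math

# (packets_per_second, mean_payload_bytes) -> label
training = [
    ((5, 400), "normal"), ((8, 350), "normal"), ((12, 500), "normal"),
    ((900, 40), "attack"), ((1200, 60), "attack"), ((800, 30), "attack"),
]

def classify(sample, k=3):
    """Label a flow by majority vote among its k nearest neighbours."""
    dists = sorted(
        (math.dist(sample, feats), label) for feats, label in training
    )
    nearest = [label for _, label in dists[:k]]
    return max(set(nearest), key=nearest.count)

print(classify((1000, 50)))  # attack
print(classify((10, 450)))   # normal
```

In the prevention phase, a flow labelled "attack" would be dropped and its source IP blacklisted, which is the decision step the chapter describes.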
Finally, a method to mitigate DoS attacks in the cloud and reduce excessive energy consumption through managing and limiting certain flows of packets is proposed. Instead of a system shutdown, the proposed method ensures the availability of service. The proposed method manages the incoming packets more effectively by dropping packets from the most frequent requesting sources. This method can process 98.4% of the accepted packets during an attack.
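The flow-limiting idea above can be sketched in a few lines: count requests per source and drop packets from sources that exceed a threshold, so service stays available instead of shutting down. The threshold value and the packet stream are illustrative assumptions:

```python
# Minimal sketch of frequency-based packet dropping. The per-source
# threshold and the example stream are assumptions for illustration.
from collections import Counter

def filter_packets(packets, max_per_source=3):
    """Yield packets, dropping any beyond max_per_source from one IP."""
    seen = Counter()
    for src_ip, payload in packets:
        seen[src_ip] += 1
        if seen[src_ip] <= max_per_source:
            yield (src_ip, payload)  # accepted
        # else: dropped -- the most frequent requester is throttled

# A flooding source sends six packets; a legitimate source sends one.
stream = [("10.0.0.9", i) for i in range(6)] + [("10.0.0.2", 0)]
accepted = list(filter_packets(stream))
print(len(accepted))  # 4  (3 from the flooder, plus the legitimate one)
```

Because only the excess packets from the most frequent sources are discarded, legitimate traffic continues to be served during the attack, which is the availability property the method aims for.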
Practicality and effectiveness are essential requirements of methods for preserving the privacy and security of data in clouds. The proposed methods successfully secure cloud projects and ensure the availability of services in an efficient way
Quantifying Quality of Life
Describes technological methods and tools for objective and quantitative assessment of QoL.
Appraises technology-enabled methods for incorporating QoL measurements in medicine.
Highlights the success factors for adoption and scaling of technology-enabled methods.
This open access book presents the rise of technology-enabled methods and tools for objective, quantitative assessment of Quality of Life (QoL), following the WHOQOL model. It is an in-depth resource describing and examining state-of-the-art, minimally obtrusive, ubiquitous technologies. Highlighting the factors required for adoption and scaling of technology-enabled methods and tools for QoL assessment, it also describes how these technologies can be leveraged for behaviour change, disease prevention, health management and long-term QoL enhancement in populations at large. Quantifying Quality of Life: Incorporating Daily Life into Medicine fills a gap in the field of QoL by providing assessment methods, techniques and tools. These assessments differ from current methods, which are mostly infrequent, subjective, qualitative, memory-based, context-poor and sparse. It is therefore an ideal resource for physicians, physicians in training, software and hardware developers, computer scientists, data scientists, behavioural scientists, entrepreneurs, healthcare leaders and administrators who are seeking an up-to-date resource on this subject.