10 research outputs found

    An Electroencephalogram (EEG) Based Biometrics Investigation for Authentication: A Human-Computer Interaction (HCI) Approach

    Electroencephalogram (EEG) devices are one of the active research areas in human-computer interaction (HCI). They provide a unique brain-machine interface (BMI) for interacting with a growing number of applications. EEG devices interface with computational systems, including traditional desktop computers and, more recently, mobile devices. These computational systems can be targeted by malicious users. There is clearly an opportunity to leverage EEG capabilities to strengthen access control mechanisms, which are the first line of defense in any computational system. Access control mechanisms rely on a number of authenticators, including “what you know”, “what you have”, and “what you are”. The “what you are” authenticator, formally known as a biometrics authenticator, is increasingly gaining acceptance. It uses an individual’s unique features, such as fingerprints and facial images, to properly authenticate users. An emerging approach in physiological biometrics is cognitive biometrics, which measures the brain’s response to stimuli. These stimuli can be measured by a number of devices, including EEG systems. This work presents an approach to authenticating users interacting with their computational devices through the use of EEG devices. The results demonstrate the feasibility of using a unique, hard-to-forge trait as a biometrics authenticator by exploiting the signals generated by different areas of the brain when exposed to visual stimuli. The outcome of this research highlights the importance of the prefrontal cortex and temporal lobes in capturing unique responses to images that trigger emotional reactions. Additionally, logarithmic band power processing combined with LDA as the machine learning algorithm provides higher accuracy than common spatial patterns or windowed means processing in combination with GMM and SVM machine learning algorithms. These results further validate the value of logarithmic band power processing and LDA when applied to oscillatory processes.
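As an illustration of the best-performing pipeline named above, log band power features feeding a minimal two-class LDA can be sketched as follows. The synthetic signals, sampling rate, frequency bands, and "two users" are all hypothetical stand-ins, not the study's data or exact method:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_band_power(signal, fs, band):
    """Logarithm of the signal power inside a frequency band (FFT periodogram)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return np.log(psd[mask].sum() + 1e-12)

def fit_lda(X, y):
    """Two-class LDA: w = pooled_cov^-1 (mu1 - mu0), threshold at the midpoint."""
    mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    cov = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)
    w = np.linalg.solve(cov + 1e-6 * np.eye(X.shape[1]), mu1 - mu0)
    b = -0.5 * w @ (mu0 + mu1)
    return w, b

# Synthetic "two users": alpha-band (8-13 Hz) amplitude differs per user.
fs, n = 128, 256
def trial(amp):
    t = np.arange(n) / fs
    return amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, n)

X = np.array([[log_band_power(trial(a), fs, (8, 13)),
               log_band_power(trial(a), fs, (13, 30))]
              for a in [1.0] * 40 + [3.0] * 40])
y = np.array([0] * 40 + [1] * 40)
w, b = fit_lda(X, y)
acc = ((X @ w + b > 0) == (y == 1)).mean()
```

On this toy data the two users separate cleanly in the alpha-band feature, which is the kind of oscillatory signature the abstract credits for LDA's advantage.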

    Current Challenges and Advances in Cataract Surgery

    This reprint focuses on new trials related to cataract surgery, intraocular lens (IOL) power calculations for cataracts after refractive surgery, problems related to high myopia, toric IOL power calculations, and related topics. Intraoperative use of the 3D viewing system and OCT, studies on the spectacle dependence of EDOF IOLs, IOL fixation status and visual function, and dry eye after FLAC are also discussed, as is proteomic analysis of aqueous humor proteins.

    Prenatal Diagnosis

    This book provides detailed and comprehensive coverage of various aspects of prenatal diagnosis, with particular emphasis on sonographic and molecular diagnostic issues. It features sections dedicated to the fundamentals of clinical, ultrasound, and genetic diagnosis of human diseases, as well as current and future health strategies related to prenatal diagnosis. The book highlights the importance of applying fetal ultrasound, clinical, and genetics knowledge to promote and achieve optimal health in fetal medicine. It will be a very useful resource for practitioners and scientists in fetal medicine.

    Safety Analyses At Signalized Intersections Considering Spatial, Temporal And Site Correlation

    Statistics show that signalized intersections are among the most dangerous locations in a roadway network. Different approaches, including crash frequency and severity models, have been used to establish the relationship between crash occurrence and intersection characteristics. To model crash occurrence at signalized intersections more efficiently, and ultimately to better identify the significant factors contributing to crashes, this dissertation investigated the temporal, spatial, and site correlations for total, rear-end, right-angle, and left-turn crashes. Using a basic regression model for correlated crash data leads to invalid statistical inference, because the misspecified variance yields incorrect test statistics and standard errors. This dissertation therefore applied Generalized Estimating Equations (GEEs), which extend generalized linear models to the analysis of longitudinal or clustered data. A series of frequency models is presented using GEEs with a Negative Binomial distribution and a log link. GEE models for the crash frequency per year (using four correlation structures) were fitted for longitudinal data; GEE models for the crash frequency per intersection (using three correlation structures) were fitted for signalized intersections along corridors; and GEE models were applied to the rear-end crash data with temporal or spatial correlation separately. For right-angle crash frequency, models at the intersection, roadway, and approach levels were fitted, and the roadway- and approach-level models were estimated using GEEs to account for site correlation. For left-turn crashes, approach-level crash frequencies were modeled using GEEs with a Negative Binomial distribution for most patterns and a binomial logit link for the pattern with a higher proportion of zeros and ones in its crash frequencies.
    All intersection geometry design features, traffic control and operational features, traffic flows, and crashes were obtained for the selected intersections, which required extensive data collection. The autoregressive structure was found to be the most appropriate correlation structure for both the temporal and spatial analyses: the correlation between multiple observations of a given intersection decreases as the time gap increases, and for spatially correlated signalized intersections along corridors, the correlation between intersections decreases as spacing increases. The unstructured correlation structure was applied for roadway- and approach-level right-angle crashes and for the different patterns of left-turn crashes at the approach level; two approaches on the same roadway usually have a higher correlation. At signalized intersections, traffic volumes, site geometry, signal operations, and safety performance differ across approaches. Therefore, modeling the total number of left-turn crashes at an intersection may obscure the real relationship between crash causes and their effects. The dissertation modeled crashes at different levels. In particular, intersection-, roadway-, and approach-level models were compared for right-angle crashes, and different crash assignment criteria (at-fault driver or near-side) were applied for the disaggregated models; for the roadway- and approach-level models, the near-side models outperformed the at-fault-driver models. Variables in traffic characteristics, geometric design features, traffic control and operational features, corridor-level factors, and location type were identified as significant in crash occurrence. In particular, the relationship between crash occurrence and traffic volume was investigated extensively across the different studies.
    The logarithm of traffic volume per lane for the entire intersection was found to be the best functional form for total crashes in both the temporal and spatial analyses. The studies of right-angle and left-turn crashes confirm the assumption that the frequency of collisions is related to the traffic flows to which the colliding vehicles belong, not to the sum of the entering flows; the logarithm of the product of conflicting flows is usually the most significant functional form in the model. This study found that left-turn protection on the minor roadway increases rear-end crash occurrence, while left-turn protection on the major roadway reduces rear-end crashes. In addition, left-turn protection specifically reduces Pattern 5 left-turn crashes (left-turning traffic colliding with on-coming through traffic) but increases Pattern 8 left-turn crashes (left-turning traffic colliding with near-side crossing through traffic), and it has no significant effect on the other patterns of left-turn crashes. This dissertation also investigated several factors that had not been considered before. The safety effectiveness of many variables identified in this dissertation is consistent with previous studies; some variables have unexpected signs, and a justification is provided. Injury severity was also studied for Pattern 5 left-turn crashes, which were assigned to the approach with the left-turning vehicles. The site correlation among crashes occurring at the same approach was considered, since these crashes may have a similar propensity in crash severity. Many methodologies and applications were attempted in this dissertation; the study therefore makes both theoretical and practical contributions to safety analysis at signalized intersections.
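The "product of conflicting flows" functional form described above can be sketched as a log-link mean function. The coefficients below are hypothetical placeholders for illustration, not estimates from the dissertation:

```python
import math

# Hypothetical coefficients (illustration only, not estimated values).
b0, b1 = -8.0, 0.8

def expected_right_angle_crashes(flow_a, flow_b):
    """E[crashes] = exp(b0 + b1 * ln(F_a * F_b)): the 'product of
    conflicting flows' form, rather than the sum of entering flows."""
    return math.exp(b0 + b1 * math.log(flow_a * flow_b))

# Under this form, doubling one conflicting flow multiplies expected
# crashes by 2**b1 regardless of the other flow, unlike a model built
# on total entering volume.
base = expected_right_angle_crashes(5000, 3000)
doubled = expected_right_angle_crashes(10000, 3000)
ratio = doubled / base  # equals 2 ** b1
```

The multiplicative behavior is what makes this form attractive for right-angle and left-turn crashes, where only the two conflicting streams matter.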

    Separator fluid volume requirements in multi-infusion settings

    INTRODUCTION. Intravenous (IV) therapy is a widely used method for the administration of medication in hospitals worldwide. ICU and surgical patients in particular often require multiple IV catheters due to the incompatibility of certain drugs and the high complexity of medical therapy. This considerably increases discomfort from painful invasive procedures, the risk of infections, and the costs of medication and disposables. When different drugs are administered through the same lumen, it is common ICU practice to flush with a neutral fluid between the administration of two incompatible drugs in order to use infusion lumens optimally. An important constraint for delivering multiple incompatible drugs is the volume of separator fluid that is sufficient to safely separate them. OBJECTIVES. In this pilot study we investigated whether the choice of separator fluid, solvent, or administration rate affects the separator volume required in a typical ICU infusion setting. METHODS. A standard ICU IV line (2 m, 2 ml, 1 mm internal diameter) was filled with methylene blue (40 mg/l) solution and flushed using an infusion pump with separator fluid. Independent variables were the solvent for methylene blue (NaCl 0.9% vs. glucose 5%), the separator fluid (NaCl 0.9% vs. glucose 5%), and the administration rate (50, 100, or 200 ml/h). Samples were collected using a fraction collector until <2% of the original drug concentration remained and were analyzed using spectrophotometry. RESULTS. We did not find a significant effect of administration rate on separator fluid volume. However, NaCl/G5% (solvent/separator fluid) required significantly less separator fluid than NaCl/NaCl (3.6 ± 0.1 ml vs. 3.9 ± 0.1 ml, p < 0.05), as did G5%/G5% (3.6 ± 0.1 ml vs. 3.9 ± 0.1 ml, p < 0.05). The decrease in required flushing volume might be due to differences in the viscosity of the solutions.
    However, the mean differences were small and were most likely caused by human interactions with the fluid collection setup. The average required flushing volume was 3.7 ml. CONCLUSIONS. The choice of separator fluid, solvent, or administration rate had no practically relevant impact on the required flushing volume in this experiment. Future research should take IV line length, diameter, and volume, as well as drug solution volumes, into account in order to provide a full account of the variables affecting the required separator fluid volume.
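As a rough sanity check on these numbers (textbook limiting cases, not a model from the study itself), two idealized flow regimes bracket the flushing volume for the line used here:

```python
import math

# Line from the study: 2 m length, ~2 ml dead volume, 1 mm internal diameter.
line_volume_ml = 2.0
target_fraction = 0.02  # flush until < 2% of the original concentration

# Plug-flow limit: the separator fluid pushes the drug out as a slug,
# so roughly one dead volume suffices.
plug_flow_ml = line_volume_ml

# Well-mixed limit: concentration decays exponentially with flushed
# volume, C/C0 = exp(-V_flush / V_line), so V_flush = V_line * ln(1/0.02).
well_mixed_ml = line_volume_ml * math.log(1.0 / target_fraction)
```

The observed average of 3.7 ml falls between these two bounds (2.0 ml and about 7.8 ml), which is consistent with partial mixing by laminar dispersion in a narrow tube.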

    An Intelligent Citizen-Centric Oriented Model for eGovernance: A UAE Case Study

    Tremendous advancements in information and communication technology, coupled with the usability of smart mobile devices, have brought enormous growth in the appeal of high-quality government services. This appeal has, in turn, encouraged governments to deliver services to citizens through electronic channels. Worldwide, governments have recognized the need to deliver better-integrated services to the public to meet their expectations. The transition from conventional modes of delivering government services to an electronic format therefore involves substantial considerations in the operational aspects of service delivery and drastic changes to existing core business systems across governmental public institutions. The concepts of eGovernance and smart services have emerged as new ways to deliver such services to meet citizens’ demands by developing tools and setting practical standards for service delivery. These tools comprise process reengineering, the setting of guidelines, the establishment of policies, the delegation of authority, and the continued monitoring of performance and control. From a research perspective, there is a need to identify the several factors that constitute online and mobile service delivery in the UAE and to measure the adoption of these services by the public. The extant literature includes very few studies that evaluate the delivery of online and mobile services in the context of eGovernance. This study highlights these gaps and addresses them through research conducted in the UAE. Its major aim is to develop and validate a citizen-centric oriented model that examines the factors affecting people’s acceptance of eGovernance services within governmental public sector organizations, such as those in health and education. The research adopted mixed methods for data collection, including a quantitative survey and qualitative semi-structured interviews.
    First, to test the proposed model, the research adopted structural equation modelling (SEM), a powerful tool that takes a confirmatory rather than exploratory approach to data analysis. Second, the validated and evaluated model was used as a roadmap for eGovernance services adoption and implementation, against which new initiatives can be evaluated. Third, this research provides an intelligent system for evaluating eGovernance implementation across government entities. The proposed novel system features an intelligent login module, provided as a service, that enables users to access multiple public government services through secured unified entry access (UEA) with a single account: users log in once to access many eGovernance services. In addition, the proposed system applies the model-view-controller (MVC) pattern to improve the system’s quality, efficiency, security, flexibility, and reusability. The system applies a collaborative filtering technique to improve the delivery of eGovernance services, measure entities’ performance, and rank government organizations. Finally, this research provides recommendations for future work, including validation of the developed model in other countries, consideration of G2B and G2E digital services, and approaches to solving technical challenges pertinent to big data, data sparsity, cold start, and scalability.
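The collaborative filtering step could, for instance, follow the classic user-based scheme. The usage matrix, similarity measure, and service indices below are entirely hypothetical, sketched only to show the shape of such a recommender:

```python
import numpy as np

# Hypothetical usage matrix: rows = citizens, columns = eGovernance
# services; entries are usage ratings (0 = service not yet used).
R = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

def cosine_sim(a, b):
    """Cosine similarity between two usage vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return a @ b / denom if denom else 0.0

def recommend(user, R):
    """User-based collaborative filtering: score services the user has
    not tried by the similarity-weighted ratings of the other users."""
    sims = np.array([cosine_sim(R[user], R[v]) if v != user else 0.0
                     for v in range(R.shape[0])])
    scores = sims @ R / (sims.sum() + 1e-12)
    scores[R[user] > 0] = -np.inf  # only recommend unseen services
    return int(np.argmax(scores))

service = recommend(0, R)  # index of the service suggested to citizen 0
```

The same similarity machinery can also rank entities by aggregate usage, which is how a recommender and a performance-ranking module can share one code path.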

    Design and Analysis of Security Schemes for Low-cost RFID Systems

    With the remarkable progress in microelectronics and low-power semiconductor technologies, Radio Frequency IDentification (RFID) technology has moved from obscurity into mainstream applications, providing an indispensable foundation for ubiquitous computing and machine perception. However, the inherent characteristics of RFID systems introduce growing security and privacy concerns. Addressing these issues is particularly challenging for low-cost RFID systems, where tags are extremely constrained in resources, power, and cost. The primary reasons are: (1) the security requirements of low-cost RFID systems are even more rigorous due to their large operation range and mass deployment; and (2) the passive tags' modest capabilities and the necessity of keeping their prices low present a novel problem that goes beyond the well-studied problems of traditional cryptography. This thesis presents our research results on the design and analysis of security schemes for low-cost RFID systems. Motivated by recent attention to exploiting physical-layer resources in the design of security schemes, we investigate how to counter eavesdropping, modification, and one particular type of relay attack against the tag-to-reader communication in passive RFID systems without requiring lightweight ciphers. To this end, we propose a novel physical-layer scheme, called Backscatter modulation- and Uncoordinated frequency hopping-assisted Physical Layer Enhancement (BUPLE). The idea is to use the amplitude of the carrier to transmit messages as normal, while utilizing its periodically varied frequency to hide the transmission from the eavesdropper/relayer and exploiting a random sequence modulated onto the carrier's phase to defeat malicious modifications.
    We further improve its eavesdropping resistance through coding at the physical layer, since BUPLE ensures that the tag-to-eavesdropper channel is strictly noisier than the tag-to-reader channel. Three practical Wiretap Channel Codes (WCCs) for passive tags are then proposed: two are constructed from linear error-correcting codes, and the third from a resilient vector Boolean function. The security and usability of BUPLE in conjunction with WCCs are further confirmed by our proof-of-concept implementation and testing. Eavesdropping on the communication between a legitimate reader and a victim tag to obtain raw data is a basic tool for the adversary, yet despite the fundamentality of eavesdropping attacks, there is little prior work investigating their scope and extensions in passive RFID systems. To this end, we first identify a brand-new attack at the physical layer against backscattered RFID communications, called unidirectional active eavesdropping, which defeats the customary impression that eavesdropping is a "passive" attack. To launch this attack, the adversary transmits an unmodulated carrier (the blank carrier) at one frequency while a valid reader and a tag interact on another frequency channel. When the tag modulates the amplitude of the reader's signal, it causes fluctuations on the blank carrier as well. By carefully examining the amplitudes of the backscattered versions of the blank carrier and the reader's carrier, the adversary can intercept the ongoing reader-tag communication with either a significantly lower bit error rate or from a significantly greater distance. Our concept is demonstrated and empirically analyzed on a popular low-cost RFID system, EPC Gen2.
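The coset-coding idea behind wiretap channel codes built from linear codes can be illustrated with a deliberately tiny toy (a parity-based sketch, not the thesis' actual WCC constructions): a secret bit is sent as a random word from one of two cosets of the even-weight code, so a clean receiver reads the parity while a noisy eavesdropper sees nearly a coin flip.

```python
import random

random.seed(1)  # reproducibility of the sketch
n = 8           # toy codeword length

def encode(bit):
    """Coset coding: a random word whose overall parity equals the secret
    bit, i.e. a random element of one of two cosets of the even-weight code."""
    word = [random.randint(0, 1) for _ in range(n - 1)]
    word.append((bit + sum(word)) % 2)  # force overall parity = bit
    return word

def decode(word):
    """The legitimate reader's channel is clean: the parity IS the bit."""
    return sum(word) % 2

def eavesdrop(word, p):
    """The eavesdropper's strictly noisier channel flips each bit w.p. p."""
    return [b ^ (random.random() < p) for b in word]

tx = encode(1)
# With p = 0.2 the eavesdropper's recovered parity is wrong with
# probability (1 - (1 - 2*p)**n) / 2 ~ 0.4997: essentially a coin flip.
guess = decode(eavesdrop(tx, 0.2))
```

The construction only works because the wiretap channel is guaranteed noisier than the main channel, which is exactly the condition BUPLE is designed to enforce.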
    Although active eavesdropping in general is not trivial to prevent, for one particular type of active eavesdropper, namely a greedy proactive eavesdropper, we propose a simple countermeasure that introduces no extra cost to current RFID systems. The need for cryptographic primitives on constrained devices keeps growing with the increasing pervasiveness of these devices. One recent lightweight block cipher design is Hummingbird-2. We study its cryptographic strength using a novel technique we developed, called the Differential Sequence Attack (DSA), and present the first cryptanalytic result on this cipher. Our full attack divides into two phases: a preparation phase and a key recovery phase. During the key recovery phase, we exploit the fact that the differential sequence for the last round of Hummingbird-2 can be retrieved by querying the full cipher, which significantly reduces the search space of the secret key. Thus, by attacking the encryption (resp. decryption) of Hummingbird-2, our algorithm recovers 36 bits (resp. another 28 bits) of the 128-bit key with 2^{68} (resp. 2^{60}) time complexity, provided particular differential conditions on the internal states and the keys can be imposed at one round. The remaining 64 bits of the key can then be exhaustively searched, so the overall time complexity is dominated by 2^{68}. During the preparation phase, by investing 2^{81} effort in time, the adversary can create the differential conditions required in the key recovery phase with probability at least 0.5. As an additional effort, we examine the cryptanalytic strength of another lightweight candidate, A2U2, the most lightweight cryptographic primitive proposed so far for low-cost tags.
    Our chosen-plaintext attack fully breaks this cipher, recovering its secret key by querying the encryption only twice on the victim tag and solving 32 sparse systems of linear equations (each with 56 unknowns, of which around 28 can be obtained directly without computation) in the worst case, which takes around 0.16 seconds on a Thinkpad T410 laptop.
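The final step of such an attack, solving sparse linear systems over GF(2), can be sketched generically. The "key" and the XOR relations below are hypothetical stand-ins, not A2U2 internals:

```python
import numpy as np

def solve_gf2(A, b):
    """Gaussian elimination over GF(2) for Ax = b (mod 2). Assumes the
    system is consistent and A has full column rank."""
    A, b = A.copy() % 2, b.copy() % 2
    n_rows, n_cols = A.shape
    row = 0
    for col in range(n_cols):
        pivot = next(r for r in range(row, n_rows) if A[r, col])
        A[[row, pivot]], b[[row, pivot]] = A[[pivot, row]], b[[pivot, row]]
        for r in range(n_rows):
            if r != row and A[r, col]:
                A[r] ^= A[row]  # XOR row reduction = addition mod 2
                b[r] ^= b[row]
        row += 1
    return b[:n_cols]  # pivot columns now hold the solution

# Tiny demo: recover a hypothetical 6-bit key from XOR relations on its
# bits (the identity rows guarantee full column rank for the demo).
rng = np.random.default_rng(7)
key = rng.integers(0, 2, 6)
A = np.vstack([np.eye(6, dtype=np.int64), rng.integers(0, 2, (2, 6))])
b = A @ key % 2
recovered = solve_gf2(A, b)
```

Elimination over GF(2) is cubic in the number of unknowns at worst, so 32 systems of 56 unknowns each are trivially cheap, which is why the reported attack time is a fraction of a second.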