7 research outputs found

    Universal Spam Detection using Transfer Learning of BERT Model

    Several machine learning and deep learning spam classifiers have been limited to a single dataset of spam emails or texts, which wastes valuable resources because a separate model must be built for each dataset. This research addresses efficient classification of ham and spam emails in real-time scenarios. Transformer-based deep learning models have become important because they train on text data using self-attention mechanisms. This manuscript demonstrates a novel universal spam detection model built on Google's pre-trained Bidirectional Encoder Representations from Transformers (BERT) base uncased model with multiple spam datasets. Models were first trained individually on the Enron, SpamAssassin, LingSpam, and SpamText message classification datasets. The combined model was then fine-tuned with the hyperparameters of each individual model. When each model was evaluated on its corresponding dataset, the F1-score was 0.9 for the given architecture. The "universal model," trained on all four datasets and leveraging hyperparameters from each individual model, reached an overall accuracy of 97%, with an F1-score of 0.96 across all four datasets combined.
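The pooling idea behind the "universal model" can be sketched in miniature: several labeled spam corpora are merged into one training set and a single classifier is fit on the union. The toy corpora below are invented, and a bag-of-words naive Bayes model stands in for BERT fine-tuning, which is far too heavy to reproduce here; only the dataset-pooling pattern mirrors the paper.

```python
import numpy as np

# Invented stand-ins for separate spam corpora (label 1 = spam, 0 = ham).
enron_like = [("win money now", 1), ("meeting at noon", 0)]
sms_like = [("free prize claim", 1), ("see you tonight", 0)]
pooled = enron_like + sms_like  # the "universal" training set

vocab = sorted({w for text, _ in pooled for w in text.split()})
idx = {w: i for i, w in enumerate(vocab)}

def featurize(text):
    # Bag-of-words count vector over the pooled vocabulary.
    v = np.zeros(len(vocab))
    for w in text.split():
        if w in idx:
            v[idx[w]] += 1
    return v

def fit(X, y):
    # Multinomial naive Bayes with Laplace smoothing.
    priors, likes = {}, {}
    for c in (0, 1):
        Xc = X[y == c]
        priors[c] = np.log(len(Xc) / len(X))
        counts = Xc.sum(axis=0) + 1.0
        likes[c] = np.log(counts / counts.sum())
    return priors, likes

def predict(priors, likes, text):
    v = featurize(text)
    scores = {c: priors[c] + v @ likes[c] for c in (0, 1)}
    return max(scores, key=scores.get)

X = np.array([featurize(t) for t, _ in pooled])
y = np.array([lab for _, lab in pooled])
priors, likes = fit(X, y)
```

Because the classifier sees examples from every corpus, it generalizes across them, which is the same motivation the abstract gives for training one BERT model on four datasets instead of four individual models.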

    Evaluation of M-Sites Using PDAs

    As mobile sites (m-sites) are introduced, a very relevant question to ask is “How should these sites differ from the typical websites developed for desktop PCs?” This paper presents an initial, exploratory attempt to address some issues related to m-sites. The evaluation of sites was conducted using wireless PDAs in a WLAN environment. The results indicated that regular sites and m-sites differed significantly in perceived search-engine functionality. The evaluated m-sites showed little difference across industries. A discussion of these results, as well as recommendations for managers and academic researchers, is provided.

    Kernel-Segregated Transpose Convolution Operation

    Transpose convolution has shown prominence in many deep learning applications. However, transpose convolution layers are computationally intensive because the feature map is enlarged by inserting zeros after each element in every row and column. Convolving over this expanded input feature map therefore leads to poor utilization of hardware resources: the zeros at predefined positions in the input are the main source of unnecessary multiplication operations. We propose an algorithmic-level optimization technique for effective transpose convolution implementation to solve these problems. Based on kernel activations, we segregate the original kernel into four sub-kernels. This scheme reduces memory requirements and eliminates unnecessary multiplications. Our proposed method achieved 3.09× (3.02×) faster computation on a Titan X GPU (Intel dual-core CPU) with a flower dataset from the Kaggle website. Furthermore, the proposed optimization can be generalized to existing devices without additional hardware requirements. A simple deep learning model containing one transpose convolution layer was used to evaluate the optimization method; it trained 2.2× faster on the MNIST dataset with an Intel dual-core CPU than the conventional implementation.
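The core observation can be sketched numerically. In a naive stride-2 transpose convolution, the input is upsampled with zeros and then convolved, so most kernel taps multiply zeros. Because an output position with a given row/column parity only ever touches kernel taps of one matching parity class, the kernel can be split into four parity sub-kernels and each output computed from input values alone, with no zero multiplications. The sketch below is a minimal illustration of that idea, not the paper's optimized implementation:

```python
import numpy as np

def transpose_conv_naive(x, k, s=2):
    # Reference: zero-insert upsampling followed by valid cross-correlation.
    H, W = x.shape
    kh, kw = k.shape
    up = np.zeros((s * (H - 1) + 1, s * (W - 1) + 1))
    up[::s, ::s] = x  # zeros sit between every pair of input elements
    Ho, Wo = up.shape[0] - kh + 1, up.shape[1] - kw + 1
    y = np.zeros((Ho, Wo))
    for i in range(Ho):
        for j in range(Wo):
            y[i, j] = np.sum(up[i:i + kh, j:j + kw] * k)  # many taps hit zeros
    return y

def transpose_conv_segregated(x, k, s=2):
    # Kernel-segregated version: each output (i, j) uses only the kernel taps
    # whose parity matches, i.e. one of the s*s sub-kernels, and reads x directly.
    H, W = x.shape
    kh, kw = k.shape
    Ho, Wo = s * (H - 1) + 1 - kh + 1, s * (W - 1) + 1 - kw + 1
    y = np.zeros((Ho, Wo))
    for i in range(Ho):
        pr = (-i) % s  # row-parity class of taps that see nonzero input
        for j in range(Wo):
            qr = (-j) % s
            acc = 0.0
            for p in range(pr, kh, s):       # sub-kernel rows only
                for q in range(qr, kw, s):   # sub-kernel cols only
                    acc += k[p, q] * x[(i + p) // s, (j + q) // s]
            y[i, j] = acc
    return y

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 5))
k = rng.standard_normal((3, 3))
assert np.allclose(transpose_conv_naive(x, k), transpose_conv_segregated(x, k))
```

For stride 2 and a 3×3 kernel, each output needs at most ⌈3/2⌉² = 4 multiplications instead of 9, which is where the reported speedups come from once the per-class work is vectorized on real hardware.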

    Privacy-Preserving Deep Learning Model for Covid-19 Disease Detection

    Recent studies have demonstrated that X-ray radiography achieves higher accuracy than Polymerase Chain Reaction (PCR) testing for COVID-19 detection. Therefore, applying deep learning models to X-ray and radiography images can increase the speed and accuracy of identifying COVID-19 cases. However, because of Health Insurance Portability and Accountability Act (HIPAA) compliance, hospitals are unwilling to share patient data owing to privacy concerns. To maintain privacy, we propose using differentially private deep learning models to secure patients' private information. A dataset from the Kaggle website is used to evaluate the designed model for COVID-19 detection. The EfficientNet model version was selected according to its highest test accuracy. Differential privacy constraints were then injected into the best-obtained model to evaluate performance. Accuracy is reported while varying the trainable layers, the privacy loss, and the amount of information taken from each sample. We obtained 84% accuracy with a privacy loss of 10 during the fine-tuning process.
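The standard way to inject differential privacy into deep learning fine-tuning is DP-SGD style gradient perturbation: clip each per-sample gradient so no single patient's record can dominate an update, then add Gaussian noise calibrated to that clipping bound. The abstract does not spell out its mechanism, so the following is a generic sketch of that technique with invented parameter names, not the paper's exact procedure:

```python
import numpy as np

def dp_average_gradients(per_sample_grads, clip_norm, noise_multiplier, rng):
    """Differentially private gradient averaging (DP-SGD style sketch).

    per_sample_grads: list of gradient vectors, one per training sample.
    clip_norm: L2 bound applied to each sample's gradient (sensitivity bound).
    noise_multiplier: Gaussian noise scale relative to clip_norm; larger
        values mean more privacy (lower privacy loss) but less accuracy.
    """
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose norm exceeds the bound.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    total = np.sum(clipped, axis=0)
    # Gaussian mechanism: noise proportional to the per-sample sensitivity.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_sample_grads)
```

The accuracy-versus-privacy-loss trade-off the abstract reports (84% accuracy at a privacy loss of 10) corresponds to tuning `noise_multiplier` and the number of trainable layers: more noise or fewer updated parameters strengthens privacy at the cost of accuracy.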

    A Workaround of EHR - A Logistics/Reporting System Development

    This project presents a use case: The Lab (TL) services multiple hospitals, medical centers, and physicians’ offices in the southern United States. Applying systematic methods of business process management, the project manager and development team clarified requirements, analyzed the processes, developed logistics, and created a reporting system for TL. The system had to be designed to retrieve data with limited time and cost from an inundated EHR system. In this project, the authors define TL's logistics requirements and its need for an electronic form within an information management system, for example, to automate processes and eliminate waste. Beyond the EHR software, the goal of this project is to improve a web-based logistics and reporting system while maintaining HIPAA-compliant controls. The project achieved its goals, though the workaround system remains cumbersome yet workable.

    Telecommunication Infrastructure Investments and Firm Performance

    No full text
    This research adopts Barua's [1] three-tier Business Value Complementarity (BVC) model to study the performance of telephone companies (Telcos). Our study integrates constructs such as spending on telecommunication infrastructure, assets, plant investment, and operating expenses into the bottom tier of the BVC model. The second tier incorporates measures of operational efficiency and customer satisfaction. The top tier includes market share as a performance measure of Telco firms. Data were extracted for the year 2001 from the FCC Automated Reporting Management Information System (ARMIS) provided by the Industry Analysis Division of the Common Carrier Bureau. As intermediary constructs, operational efficiency and customer satisfaction moderate the relationships between the bottom-tier variables and firm performance. Overall, our study provides support for the BVC model approach. Conclusions and implications of this research are discussed.