163 research outputs found

    Statistical analysis and design of subthreshold operation memories

    Get PDF
    This thesis presents novel methods, based on a combination of well-known statistical techniques, for faster estimation of memory yield, and their application in the design of energy-efficient sub-threshold memories. The emergence of size-constrained Internet-of-Things (IoT) devices and the proliferation of the wearable market have brought forward the challenge of achieving the maximum energy efficiency per operation in these battery-operated devices. This sought-after minimum-energy operation is achievable under sub-threshold operation of the circuit. However, reliable memory operation is currently unattainable at these ultra-low operating voltages because of the memory circuit's vanishing noise margins, which shrink further in the presence of random process variations. The statistical methods presented in this thesis make the yield optimization of sub-threshold memories computationally feasible by reducing the SPICE simulation overhead. We present novel modifications to statistical sampling techniques that reduce the SPICE simulation overhead of estimating memory failure probability. These sampling schemes provide a 40x reduction in the simulations needed to find the most probable failure point and a 10x reduction in those needed to estimate the failure probability, compared to existing proposals. We then provide a novel method to create surrogate models of the memory margins with better extrapolation capability than traditional regression methods. These models, based on Gaussian process regression, encode the sensitivity of the memory margins with respect to each individual threshold-variation source in a one-dimensional kernel. We find that our proposed additive-kernel models have 32% smaller out-of-sample error (that is, better extrapolation capability outside the training set) than models using a six-dimensional universal kernel such as the Radial Basis Function (RBF). The thesis also explores topological modifications to the SRAM bitcell to achieve faster read operation at sub-threshold operating voltages. We present a ten-transistor SRAM bitcell that achieves a 2x faster read operation than existing ten-transistor sub-threshold SRAM bitcells while ensuring similar noise margins. The bitcell provides a 70% reduction in dynamic energy at the cost of a 42% increase in leakage energy per read operation. Finally, we investigate the energy efficiency of eDRAM gain cells as an alternative to SRAM bitcells in size-constrained IoT devices. We find that reducing their write-path leakage current is the only way to reduce the read energy at the Minimum Energy operation Point (MEP). Further, we study the effect of transistor up-sizing, in the presence of threshold-voltage variations, on the mean MEP read energy through statistical analysis based on an ANOVA test of a full-factorial experimental design.
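    The additive-kernel surrogate idea summarized above can be illustrated with a short, self-contained sketch. The code below is not taken from the thesis; it is a minimal numpy implementation of Gaussian process regression whose kernel is a sum of one-dimensional squared-exponential terms, one per threshold-variation source, with toy data and hyperparameters chosen purely for illustration.

```python
# Minimal sketch of GP regression with an additive kernel: one 1-D
# squared-exponential term per threshold-variation source (six here, matching
# the six-dimensional input mentioned in the abstract). Toy data only.
import numpy as np

def additive_rbf_kernel(X1, X2, lengthscales, variances):
    """K(x, x') = sum_d var_d * exp(-(x_d - x'_d)^2 / (2 * l_d^2))."""
    K = np.zeros((X1.shape[0], X2.shape[0]))
    for d, (l, v) in enumerate(zip(lengthscales, variances)):
        diff = X1[:, d:d + 1] - X2[:, d:d + 1].T
        K += v * np.exp(-0.5 * (diff / l) ** 2)
    return K

def gp_posterior_mean(X_train, y_train, X_test, lengthscales, variances, noise=1e-3):
    """Standard GP predictive mean: K_* (K + noise*I)^-1 y."""
    K = additive_rbf_kernel(X_train, X_train, lengthscales, variances)
    K_star = additive_rbf_kernel(X_test, X_train, lengthscales, variances)
    alpha = np.linalg.solve(K + noise * np.eye(len(y_train)), y_train)
    return K_star @ alpha

rng = np.random.default_rng(0)
X_train = rng.normal(0.0, 30e-3, size=(200, 6))            # six Vth deviations (V)
margin = 0.25 - 0.8 * X_train[:, 0] + 0.5 * X_train[:, 3]  # toy "memory margin"
X_test = rng.normal(0.0, 60e-3, size=(10, 6))              # wider spread than training
pred = gp_posterior_mean(X_train, margin, X_test,
                         lengthscales=[50e-3] * 6, variances=[1.0] * 6)
print(pred)
```

    Because each kernel term depends on a single input dimension, the model's extrapolation behaviour along each threshold-variation axis is controlled independently, which is the property the abstract contrasts with a single six-dimensional RBF kernel.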

    Social Commerce Platform for Artists

    Get PDF
    In the digital age, many painters and artists use online platforms to share their work and engage with a global audience. However, despite the widespread use of social media and e-commerce, there has been a glaring need for a specialized solution designed to meet the specific requirements of artists who want to simultaneously build portfolios and sell their works. This research paper examines the conception and development of "ArtFeast," a social commerce platform designed specifically for artists looking for a centralized arena to showcase their abilities and profit from their works

    Exploring Sentiment Analysis in Social Media: A Natural Language Processing Case Study

    Get PDF
    Social media plays an integral role in our daily lives, influencing and reflecting global perspectives through the consumption and creation of content. Platforms like YouTube are incredibly active, with a constant influx of video uploads, views, and comments. While the YouTube app allows us to browse videos and comments, it offers only a limited glimpse into the interests and trends of others. Analysing this vast data pool, encompassing diverse language styles, presents a significant challenge. This article delves into the YouTube Data API and its application in Python for accessing raw data. The process involves data cleaning using advanced Natural Language Processing (NLP) techniques, harnessing Python-based machine learning to explore social media interactions, and automating the extraction of trends and influential factors. The journey towards trend analysis is meticulously detailed, featuring examples that leverage a variety of open-source Python tools
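    The article describes this pipeline in prose rather than reproducing its code; the sketch below is an assumption-based illustration of that kind of workflow, using google-api-python-client to pull comments from the YouTube Data API and NLTK's VADER analyzer to score their sentiment. The API key, video ID, page count, and choice of sentiment tool are placeholders and assumptions, not details taken from the article.

```python
# Sketch: fetch YouTube comments via the Data API and score their sentiment.
# Assumes: pip install google-api-python-client nltk
from googleapiclient.discovery import build
from nltk.sentiment.vader import SentimentIntensityAnalyzer
import nltk

API_KEY = "YOUR_API_KEY"      # hypothetical placeholder
VIDEO_ID = "YOUR_VIDEO_ID"    # any public video ID

def fetch_comments(video_id, api_key, max_pages=3):
    """Pull top-level comment text from the commentThreads endpoint."""
    youtube = build("youtube", "v3", developerKey=api_key)
    comments = []
    request = youtube.commentThreads().list(
        part="snippet", videoId=video_id, maxResults=100, textFormat="plainText")
    for _ in range(max_pages):
        response = request.execute()
        for item in response.get("items", []):
            comments.append(
                item["snippet"]["topLevelComment"]["snippet"]["textDisplay"])
        request = youtube.commentThreads().list_next(request, response)
        if request is None:
            break
    return comments

def score_sentiment(comments):
    """Attach a VADER compound polarity score to each comment."""
    nltk.download("vader_lexicon", quiet=True)
    analyzer = SentimentIntensityAnalyzer()
    return [(c, analyzer.polarity_scores(c)["compound"]) for c in comments]

if __name__ == "__main__":
    for text, score in score_sentiment(fetch_comments(VIDEO_ID, API_KEY))[:10]:
        print(f"{score:+.2f}  {text[:80]}")
```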

    Enhancing Data Security: A Comprehensive Study on the Efficacy of JSON Web Token (JWT) and HMAC SHA-256 Algorithm for Web Application Security

    Get PDF
    In today's digital era, data security is a critical aspect of many applications and services. To protect the integrity, confidentiality, and authenticity of data, security technologies such as JSON Web Token (JWT) and the HMAC-SHA-256 algorithm are widely used. JWT is an open standard (RFC 7519) for representing information as digitally signed tokens. This study follows a descriptive research method, which characterizes the collected data and records every aspect of the situation under investigation to obtain a clear picture of what is needed. It was found that several data leaks occurred when data security was not implemented in layers, including cases of important data being lost from websites and leaks of sensitive data that caused identities to be spread widely. The conclusion is that combining JSON Web Token (JWT) with the HMAC-SHA-256 algorithm provides a strong layer of protection against security threats that are common in the online environment
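    As a concrete illustration of the JWT and HMAC-SHA-256 combination discussed above, the following sketch uses the PyJWT library to issue and verify HS256-signed tokens. The secret, claims, and expiry window are illustrative assumptions rather than the paper's actual configuration.

```python
# Sketch: issuing and verifying an HS256 (HMAC-SHA-256) signed JWT with PyJWT.
# Assumes: pip install PyJWT
import datetime
import jwt

SECRET_KEY = "change-me"  # illustrative; in practice load from a secure store

def issue_token(user_id: str) -> str:
    """Sign a short-lived token whose integrity rests on HMAC-SHA-256."""
    now = datetime.datetime.now(datetime.timezone.utc)
    payload = {
        "sub": user_id,
        "iat": now,
        "exp": now + datetime.timedelta(minutes=15),
    }
    return jwt.encode(payload, SECRET_KEY, algorithm="HS256")

def verify_token(token: str) -> dict:
    """Reject tokens whose signature or expiry check fails."""
    try:
        return jwt.decode(token, SECRET_KEY, algorithms=["HS256"])
    except jwt.ExpiredSignatureError:
        raise ValueError("token expired")
    except jwt.InvalidTokenError:
        raise ValueError("token invalid")

if __name__ == "__main__":
    token = issue_token("user-42")
    print(verify_token(token)["sub"])  # -> user-42
```

    Pinning the accepted algorithm list to ["HS256"] during decoding is the step that actually enforces the HMAC-SHA-256 layer; accepting whatever algorithm the token header declares is a common source of the vulnerabilities the paper warns about.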

    Face Mask Detection System Using Machine Learning Algorithms

    Get PDF
    The project presented in this report is a real-time face mask detection system using computer vision and deep learning techniques. The primary objective of this project is to develop a system that can detect whether a person is wearing a face mask or not, with a focus on real-time performance. The system utilizes pre-trained deep learning models for face detection and mask classification. It leverages the MobileNetV2 architecture as a feature extractor and deploys the model in real-time video streams. When a face is detected, the system classifies it as "Mask" or "No Mask" with associated confidence scores. The project involves key components, including the use of OpenCV for image processing and real-time video capture, TensorFlow/Keras for deep learning, and the integration of pre-trained models. The code is well-structured, and it demonstrates proficiency in model loading, image preprocessing, and real-time video processing. The findings of the project showcase a practical application for face mask detection, which has gained significance in the context of public health and safety. The system provides a valuable tool for monitoring mask compliance in public spaces and can contribute to efforts to mitigate the spread of contagious diseases. The project demonstrates the importance of combining computer vision and deep learning in real-world applications, and it serves as a reference for those interested in similar projects or applications in the field of image processing and object detection. In summary, this project illustrates the successful implementation of a real-time face mask detection system and underscores its potential contributions to public health and safety measures
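    A compressed sketch of the detection loop described above follows. It assumes a Keras classifier saved as "mask_detector.h5" that takes 224x224 MobileNetV2-preprocessed crops and outputs (mask, no-mask) probabilities, and it substitutes an OpenCV Haar cascade for the face detector; the file name, class order, and detector choice are assumptions, not details confirmed by the report.

```python
# Sketch: real-time mask/no-mask classification on webcam frames.
import cv2
import numpy as np
from tensorflow.keras.models import load_model
from tensorflow.keras.applications.mobilenet_v2 import preprocess_input

model = load_model("mask_detector.h5")               # assumed 2-class classifier
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
        face = cv2.resize(frame[y:y + h, x:x + w], (224, 224))
        face = preprocess_input(face.astype("float32"))[np.newaxis, ...]
        mask_p, no_mask_p = model.predict(face, verbose=0)[0]
        label = "Mask" if mask_p > no_mask_p else "No Mask"
        conf = max(mask_p, no_mask_p) * 100
        color = (0, 255, 0) if label == "Mask" else (0, 0, 255)
        cv2.rectangle(frame, (x, y), (x + w, y + h), color, 2)
        cv2.putText(frame, f"{label}: {conf:.1f}%", (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, color, 2)
    cv2.imshow("Mask detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```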

    Bridging the Gap: Exploring new ways to deliver online Grocery shopping using Smart Software

    Get PDF
    With its great convenience and the availability of delivery and pickup options, online grocery shopping has transformed the procurement of food and household goods. These platforms frequently outperform brick-and-mortar businesses in terms of product selection. Technology, such as mobile apps and websites, has played a critical role in this process by delivering user-friendly interfaces, personalized recommendations, and smooth transactions. Artificial intelligence and data analytics improve the shopping experience even further by personalizing options and streamlining inventory management for shops. However, there are significant hurdles to online grocery buying, including delivery problems, the need for strong cybersecurity, and concerns about the environmental impact of packaging and transportation. Despite these obstacles, the online grocery business continues to be a dynamic and exciting component of modern retail as technology evolves and customer behaviors change

    Handling Large-Scale Document Collections using Information Retrieval in the Age of Big Data

    Get PDF
    This paper's primary goal is to present an overview of big data and its analysis using various methodologies, particularly evolutionary computing techniques, which improve information retrieval over standard search methods. Big data is defined as a huge, diverse collection of data that is difficult to handle with conventional computational methods and necessitates more sophisticated statistical approaches in order to extract pertinent information from the data. Along with providing an overview of evolutionary computational approaches, this study also discusses some of the main models used for information retrieval
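    The survey itself does not include code; as a rough, assumption-laden illustration of how evolutionary computation can be applied to retrieval, the sketch below uses a toy genetic algorithm to tune query-term weights against relevance feedback. The corpus, fitness measure, and GA settings are invented for illustration only.

```python
# Sketch: a tiny genetic algorithm that tunes query-term weights so that
# relevant documents rank near the top. Toy data and settings throughout.
import numpy as np

rng = np.random.default_rng(1)
docs = rng.random((50, 8))          # toy term-document matrix (docs x terms)
relevant = rng.random(50) > 0.7     # toy binary relevance feedback

def fitness(weights):
    """Reward rankings that place relevant documents near the top."""
    order = np.argsort(-(docs @ weights))
    ranks = np.nonzero(relevant[order])[0] + 1
    return 0.0 if len(ranks) == 0 else float(np.mean(1.0 / ranks))

def evolve(pop_size=30, generations=40, mutation=0.1):
    pop = rng.random((pop_size, docs.shape[1]))
    for _ in range(generations):
        fit = np.array([fitness(w) for w in pop])
        parents = pop[np.argsort(-fit)[: pop_size // 2]]        # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, docs.shape[1])                # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child += rng.normal(0, mutation, size=child.shape)  # Gaussian mutation
            children.append(np.clip(child, 0, None))
        pop = np.vstack([parents, children])
    best = max(pop, key=fitness)
    return best, fitness(best)

weights, score = evolve()
print("best fitness:", round(score, 3))
```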

    Survey and Analysis of Production Distributed Computing Infrastructures

    Full text link
    This report has two objectives. First, we describe a set of the production distributed infrastructures currently available, so that the reader has a basic understanding of them. This includes explaining why each infrastructure was created and made available and how it has succeeded and failed. The set is not complete, but we believe it is representative. Second, we describe the infrastructures in terms of their use, which is a combination of how they were designed to be used and how users have found ways to use them. Applications are often designed and created with specific infrastructures in mind, with both an appreciation of the existing capabilities provided by those infrastructures and an anticipation of their future capabilities. Here, the infrastructures we discuss were often designed and created with specific applications in mind, or at least specific types of applications. The reader should understand how the interplay between the infrastructure providers and the users leads to such usages, which we call usage modalities. These usage modalities are really abstractions that exist between the infrastructures and the applications; they influence the infrastructures by representing the applications, and they influence the applications by representing the infrastructures