1,385 research outputs found

    UMSL Bulletin 2023-2024

    Get PDF
    The 2023-2024 Bulletin and Course Catalog for the University of Missouri St. Louis.

    UMSL Bulletin 2022-2023

    Get PDF
    The 2022-2023 Bulletin and Course Catalog for the University of Missouri St. Louis.

    Investigating the learning potential of the Second Quantum Revolution: development of an approach for secondary school students

    Get PDF
    In recent years we have witnessed important changes: the Second Quantum Revolution is in the spotlight in many countries, and it is creating a new generation of technologies. To unlock its potential, several countries have launched strategic plans and research programs that finance and set the pace of research and development of these new technologies (such as the Quantum Flagship and the National Quantum Initiative Act). The increasing pace of technological change also challenges science education and institutional systems, requiring them to help prepare new generations of experts. This work is situated within physics education research and contributes to this challenge by developing an approach and a course about the Second Quantum Revolution. The aims are to promote quantum literacy and, in particular, to value the Second Quantum Revolution from a cultural and educational perspective. The dissertation is articulated in two parts. In the first, we unpack the Second Quantum Revolution from a cultural perspective and shed light on the main revolutionary aspects, which are elevated to the rank of principles and implemented in the design of a course for secondary school students and for prospective and in-service teachers. The design process and the educational reconstruction of the activities are presented, together with the results of a pilot study conducted to investigate the impact of the approach on students' understanding and to gather feedback for refining and improving the instructional materials. The second part explores the Second Quantum Revolution as a context for introducing basic concepts of quantum physics. We present the results of an implementation with secondary school students that investigates whether, and to what extent, external representations can promote students' understanding and acceptance of quantum physics as a personally reliable description of the world.

    Approximate Computing Survey, Part I: Terminology and Software & Hardware Approximation Techniques

    Full text link
    The rapid growth of demanding applications in domains applying multimedia processing and machine learning has marked a new era for edge and cloud computing. These applications involve massive data and compute-intensive tasks, and thus typical computing paradigms in embedded systems and data centers are stressed to meet the worldwide demand for high performance. Concurrently, the landscape of the semiconductor field over the last 15 years has established power as a first-class design concern. As a result, the computing-systems community is forced to find alternative design approaches that facilitate high-performance and/or power-efficient computing. Among the examined solutions, Approximate Computing has attracted ever-increasing interest, with research works applying approximations across the entire traditional computing stack, i.e., at the software, hardware, and architectural levels. Over the last decade, a plethora of approximation techniques has emerged in software (programs, frameworks, compilers, runtimes, languages), hardware (circuits, accelerators), and architectures (processors, memories). The current article is Part I of our comprehensive survey on Approximate Computing: it reviews its motivation, terminology, and principles, and it classifies and presents the technical details of state-of-the-art software and hardware approximation techniques.
    Comment: Under review at ACM Computing Surveys
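
    As a hedged, minimal illustration of one classic software-level approximation technique of the kind such surveys cover, the Python sketch below shows loop perforation: skipping iterations to trade accuracy for work. It is illustrative only and not taken from the survey; the function names and the skip factor are assumptions.

        def mean_exact(xs):
            """Exact mean: visits every element."""
            return sum(xs) / len(xs)

        def mean_perforated(xs, skip=8):
            """Approximate mean: visits only every `skip`-th element,
            trading accuracy for roughly a `skip`-fold reduction in work."""
            sampled = xs[::skip]
            return sum(sampled) / len(sampled)

        data = [float(i % 97) for i in range(1_000_000)]
        exact, approx = mean_exact(data), mean_perforated(data)
        print(f"exact={exact:.4f} approx={approx:.4f} "
              f"rel_err={abs(exact - approx) / exact:.2%}")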

    APPROXIMATE COMPUTING BASED PROCESSING OF MEA SIGNALS ON FPGA

    Get PDF
    The Microelectrode Array (MEA) is a collection of parallel electrodes that measure the extracellular potential of nearby neurons. It is a crucial tool in neuroscience for researching the structure, operation, and behavior of neural networks. Processing and evaluating the data streams obtained from MEAs with sophisticated signal processing techniques and architectural templates is a computationally demanding task that requires time and parallel processing. This thesis proposes enhancing the capability of MEA signal processing systems by using approximate computing-based algorithms, which can be implemented in systems that process parallel MEA channels on Field Programmable Gate Arrays (FPGAs). To develop approximate signal processing algorithms, three different types of approximate adders are investigated in various configurations. The objective is to maximize improvements in area, power consumption, and latency for real-time processing while accepting lower output accuracy within certain bounds. The methods are used to construct approximate processing systems on FPGAs, which are then compared with the exact system. Real biological signals are used to evaluate both the exact and approximate systems, and the findings reveal notable improvements, especially in speed and area: processing speed improves by up to 37.6% and area by up to 14.3% in some approximate system modes without sacrificing accuracy. Additional cases demonstrate how accuracy, area, and processing speed may be traded off. Approximate computing algorithms allow the design of real-time MEA processing systems with higher speeds and more parallel channels. The application of approximate computing algorithms to process biological signals on FPGAs in this thesis is a novel idea that has not been explored before.
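
    As a rough sketch of how an approximate adder trades accuracy for hardware cost, the Python behavioral model below implements a lower-part-OR adder (LOA), one well-known approximate-adder family: the low bits are ORed instead of added (no carry chain), and only the high bits use an exact adder. This is an illustration under assumed parameters, not the thesis's actual adder types or configurations.

        def loa_add(a: int, b: int, width: int = 16, approx_bits: int = 4) -> int:
            """Lower-part-OR add: OR the low `approx_bits` bits (cheap,
            carry-free), add the remaining high bits exactly."""
            mask = (1 << approx_bits) - 1
            low = (a & mask) | (b & mask)              # approximate lower part
            high = ((a >> approx_bits) + (b >> approx_bits)) << approx_bits
            return (high | low) & ((1 << width) - 1)

        a, b = 12347, 23459
        print(loa_add(a, b), a + b)   # approximate sum vs exact sum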

    A Phase Change Memory and DRAM Based Framework For Energy-Efficient and High-Speed In-Memory Stochastic Computing

    Get PDF
    Convolutional Neural Networks (CNNs) have proven highly effective in various fields of Artificial Intelligence (AI) and Machine Learning (ML). However, the significant computational and memory requirements of CNNs make their processing highly compute- and memory-intensive. In particular, the multiply-accumulate (MAC) operation, a fundamental building block of CNNs, requires enormous numbers of arithmetic operations. As input dataset sizes increase, the traditional processor-centric von Neumann computing architecture becomes ill-suited for CNN-based applications, resulting in exponentially higher latency and energy costs and making the processing of CNNs highly challenging. To overcome these challenges, researchers have explored the Processing-in-Memory (PIM) technique, which places the processing unit inside or near the memory unit. This approach shortens data migration and utilizes the internal memory bandwidth at the memory chip level. However, developing a reliable PIM-based system with minimal hardware modifications and design complexity remains a significant challenge. The solution proposed in this report utilizes different memory technologies, namely Dynamic RAM (DRAM) and phase change memory (PCM), with stochastic arithmetic and minimal add-on logic. Stochastic computing is a technique that uses random bitstreams to perform arithmetic operations instead of the traditional binary representation, reducing the hardware required for CNNs' arithmetic operations and making it possible to implement them with minimal add-on logic. The report details the workflow for performing the arithmetic operations used by CNNs, including MAC, activation, and floating-point functions. The proposed solution includes designs for a scalable Stochastic Number Generator (SNG), a DRAM-based CNN accelerator, a non-volatile memory (NVM) class PCRAM-based CNN accelerator, and DRAM-based stochastic-to-binary (StoB) conversion for in-situ deep learning. These designs utilize stochastic computing to reduce the hardware requirements of CNNs' arithmetic operations and enable energy- and time-efficient processing of CNNs. The report also identifies future research directions for the proposed designs, including an in-situ PCRAM-based SNG, ODIN (A Bit-Parallel Stochastic Arithmetic Based Accelerator for In-Situ Neural Network Processing in Phase Change RAM), ATRIA (A Bit-Parallel Stochastic Arithmetic Based Accelerator for In-DRAM CNN Processing), and AGNI (In-Situ, Iso-Latency Stochastic-to-Binary Number Conversion for In-DRAM Deep Learning), and presents initial findings for these ideas. In summary, the report offers a comprehensive approach to addressing the challenges of processing CNNs, and the proposed designs have the potential to significantly improve the energy and time efficiency of CNNs. Using stochastic computing and different memory technologies enables the development of reliable PIM-based systems with minimal hardware modifications and design complexity, providing a promising path for the future of CNN-based applications.
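
    To make the arithmetic concrete, the Python sketch below models unipolar stochastic multiplication, the core trick behind such stochastic MAC designs: a value p in [0, 1] is encoded as a random bitstream whose fraction of 1s equals p, so a single AND gate multiplies two independent streams. This is a software illustration under an assumed stream length; the report's SNG and in-memory accelerator designs are hardware circuits and are not reproduced here.

        import random

        def sng(p: float, n: int) -> list:
            """Stochastic number generator: n-bit stream with P(bit = 1) = p."""
            return [1 if random.random() < p else 0 for _ in range(n)]

        def stochastic_mul(p: float, q: float, n: int = 4096) -> float:
            """Multiply two unipolar values by ANDing their independent
            streams and counting the surviving 1s."""
            x, y = sng(p, n), sng(q, n)
            return sum(a & b for a, b in zip(x, y)) / n

        print(stochastic_mul(0.5, 0.8), 0.5 * 0.8)   # ~0.40, with random error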

    GPU devices for safety-critical systems: a survey

    Get PDF
    Graphics Processing Unit (GPU) devices and their associated software programming languages and frameworks can deliver the computing performance required to facilitate the development of next-generation high-performance safety-critical systems such as autonomous driving systems. However, integrating complex, parallel, and computationally demanding software functions with different safety-criticality levels on GPU devices with shared hardware resources raises several safety certification challenges. This survey categorizes and provides an overview of research contributions that address GPU devices' random hardware failures, systematic failures, and independence of execution.
    This work has been partially supported by the European Research Council with Horizon 2020 (grant agreements No. 772773 and 871465), the Spanish Ministry of Science and Innovation under grant PID2019-107255GB, the HiPEAC Network of Excellence, and the Basque Government under grant KK-2019-00035. The Spanish Ministry of Economy and Competitiveness has also partially supported Leonidas Kosmidis with a Juan de la Cierva Incorporación postdoctoral fellowship (FJCI-2020-045931-I).
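
    As a hedged aside, one classic mitigation for random hardware faults discussed in this literature is redundant execution with output comparison (dual modular redundancy). The Python sketch below only illustrates the principle; the helper name run_redundant is hypothetical, and real GPU safety mechanisms operate at the kernel, scheduler, or hardware level.

        def run_redundant(kernel, data, runs: int = 2):
            """Execute `kernel` repeatedly on the same input and compare
            results; any mismatch signals a possible transient fault."""
            results = [kernel(data) for _ in range(runs)]
            if any(r != results[0] for r in results[1:]):
                raise RuntimeError("redundant executions disagree: possible fault")
            return results[0]

        print(run_redundant(lambda xs: sum(x * x for x in xs), [1, 2, 3]))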

    Application of knowledge management principles to support maintenance strategies in healthcare organisations

    Get PDF
    Healthcare is a vital service that touches people's lives daily, providing treatment and resolving patients' health problems through its staff. Human lives ultimately depend on the skilled hands of the staff and of those who manage the infrastructure supporting the service's daily operations, a compelling reason for a dedicated research study. However, the UK healthcare sector is undergoing rapid change, driven by rising costs, technological advancements, changing patient expectations, and increasing pressure to deliver sustainable healthcare. With the global rise in healthcare challenges, the need for sustainable healthcare delivery has become imperative, and it requires the integration of practices that enhance the efficiency and effectiveness of healthcare infrastructural assets. One critical area that requires attention is the management of healthcare facilities. Healthcare facilities are considered a core element in the delivery of effective healthcare services, as shortcomings in the provision of facilities management (FM) services in hospitals may have far more drastic negative effects than in other types of buildings. An essential element in healthcare FM is the relationship between action and knowledge. With a full understanding of infrastructural assets, it is possible to improve and manage buildings, make them suitable to the needs of users, and ensure the functionality of the structure and its processes. The premise of FM is that an organisation's effectiveness and efficiency are linked to the physical environment in which it operates and that improving the environment can yield direct benefits in operational performance. The goal of healthcare FM is to support the achievement of the organisational mission and goals by designing and managing space and infrastructural assets in the best combination of suitability, efficiency, and cost. In operational terms, performance refers to how well a building contributes to fulfilling its intended functions. Comprehensive deployment of efficient FM approaches is therefore essential for ensuring quality healthcare provision while positively impacting overall patient experiences.

    In this regard, incorporating knowledge management (KM) principles into hospitals' FM processes contributes significantly to sustainable healthcare provision and to enhancing patient experiences. Organisations implementing KM principles are better positioned to navigate the constantly evolving business ecosystem, and KM plays a crucial role in process and service improvement, strategic decision-making, and organisational adaptation and renewal; KM principles can therefore be applied to improve hospital FM and thereby support sustainable healthcare delivery. Knowledge management assumes that organisations that manage their organisational and individual knowledge more effectively will cope more successfully with the challenges of the new business ecosystem. The goal of KM is to aid action: providing "a knowledge pull" rather than the information overload most people experience in healthcare FM. Other motivations for seeking better KM in healthcare FM include patient safety, evidence-based care, and cost efficiency as the dominant drivers. The strongest evidence for the success of such approaches exists at knowledge bottlenecks, such as infection prevention and control, working safely, compliance, automated systems and reminders, and recall based on best practices. The ability to cultivate, nurture, and maximise knowledge at multiple levels and in multiple contexts is one of the most significant challenges for those responsible for KM. Despite the potential benefits, however, the application of KM principles in hospital facilities is still limited. There is a lack of understanding of how KM can be effectively applied in this context, and few studies have explored the challenges and opportunities associated with implementing KM principles in hospital facilities for sustainable healthcare delivery.

    This study explores the application of KM principles to support maintenance strategies in healthcare organisations. It also explores the challenges and opportunities, for healthcare organisations and FM practitioners, in operationalising a framework which draws out the interconnectedness between healthcare FM and KM. The study begins by defining healthcare FM and its importance in the healthcare industry. It then discusses the concept of KM and the types of knowledge relevant to the healthcare FM sector. The study also examines the challenges that healthcare FM faces in managing knowledge and how the application of KM principles can help to overcome them, before exploring the different KM strategies that can be applied in healthcare FM. The benefits of KM include improved patient outcomes, reduced costs, increased efficiency, and enhanced collaboration among healthcare professionals. Additionally, issues such as creating a culture of innovation, technology, and benchmarking are considered, and a framework that integrates the essential concepts of KM in healthcare FM is presented and discussed. The field of KM is introduced as a complex adaptive system with numerous possibilities and challenges. In this context, and in consideration of healthcare FM, five objectives have been formulated to achieve the research aim, including appraising the concept of KM and how knowledge is created, stored, transferred, and utilised in healthcare FM; evaluating the impact of organisational structure on job satisfaction; and exploring how cultural differences affect knowledge sharing and performance in healthcare FM organisations.

    The study uses a combination of qualitative methods (meetings, observations, internal and external document analysis, and semi-structured interviews) to discover the subjective experiences and attitudes of healthcare FM employees and to understand the phenomenon within a real-world context, using open questions to allow probing where appropriate and to facilitate KM development in the delivery and practice of healthcare FM. The research methodology is described using the theoretical concept of the "research onion". The qualitative research was conducted in NHS acute and non-acute hospitals in Northwest England. Findings from the research revealed that while the concept of KM has grown significantly in recent years, KM in healthcare FM has received little or no attention. The target population was fifty (five FM directors, five academics, five industry experts, ten managers, ten supervisors, five team leaders, and ten operatives). These seven groups were purposively selected as the target population because they play a crucial role in KM enhancement in healthcare FM. Face-to-face interviews were conducted with participants based on their pre-determined availability; of the target population of 50, 25 were successfully interviewed, at which point saturation was reached. Data collected from the interviews were coded and analysed using NVivo to identify themes and patterns related to KM in healthcare FM.

    The study is divided into eight major sections. First, it discusses literature findings regarding healthcare FM and KM, including underlying trends in FM, KM in general, and KM in healthcare FM. Second, it establishes the study's methodology, introducing the five research objectives, questions, and hypotheses, and reviewing the literature on methodology elements, including philosophical views and inquiry strategies. The interview and data analysis chapters then examine the feedback from the interviews. Lastly, the conclusion and recommendations summarise the research objectives and suggest further research. Overall, this study highlights the importance of KM in healthcare FM and provides insights for healthcare FM directors, managers, supervisors, academics, researchers, and operatives on effectively leveraging knowledge to improve patient care and organisational effectiveness.

    SoC-based FPGA architecture for image analysis and other highly demanding applications

    Get PDF
    Nowadays, the development of algorithms focuses on performance-efficient and energy-efficient computations. Technologies such as the field programmable gate array (FPGA) and the FPGA-based system on chip (FPGA/SoC) have shown their ability to accelerate compute-intensive applications while saving power, owing to their high parallelism and architectural reconfigurability. Currently, existing design cycles for FPGA/SoC are time-consuming, owing to the complexity of the architecture. Therefore, to address the gap between applications and FPGA/SoC architectures and to obtain efficient hardware designs for image analysis and other highly demanding applications using high-level synthesis (HLS) tools, two complementary strategies are considered: ad-hoc techniques and performance estimation. Regarding ad-hoc techniques, three highly demanding applications were accelerated through HLS tools: a pulse shape discriminator for cosmic rays, automatic pest classification, and re-ranking for information retrieval, emphasizing the benefits when such applications are combined with compression techniques targeting FPGA/SoC devices. Furthermore, a comprehensive performance estimator for hardware acceleration is proposed in this thesis to effectively predict resource utilization and latency for FPGA/SoC, building a bridge between the application and architectural domains. The tool integrates analytical models for performance prediction and a design space exploration (DSE) engine that provides high-level insights to hardware developers, composed of two independent sub-engines: a DSE based on single-objective optimization and a DSE based on evolutionary multi-objective optimization.
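
    To illustrate the flavor of such an estimator-driven DSE, the Python sketch below pairs a toy analytical model (latency and DSP use of a loop as a function of its unroll factor, with memory ports capping useful parallelism) with a brute-force Pareto filter over candidate configurations. All formulas, names, and parameters here are illustrative assumptions, not the thesis's actual models.

        def estimate(trip_count: int, unroll: int, op_latency: int = 4,
                     dsp_per_lane: int = 2, mem_ports: int = 8):
            """Toy analytical model: (latency in cycles, resources in DSPs)."""
            lanes = min(unroll, mem_ports)        # ports cap useful parallelism
            iterations = -(-trip_count // lanes)  # ceiling division
            return iterations * op_latency, unroll * dsp_per_lane

        def pareto_front(points):
            """Keep the (latency, resources) points no other point dominates."""
            return [p for p in points
                    if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                               for q in points)]

        designs = {u: estimate(1024, u) for u in (1, 2, 4, 8, 16)}
        front = pareto_front(list(designs.values()))
        for u, (lat, dsp) in designs.items():
            tag = "pareto" if (lat, dsp) in front else "dominated"
            print(f"unroll={u:2d} latency={lat:5d} cycles  DSPs={dsp:3d}  {tag}")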

    Undergraduate and Graduate Course Descriptions, 2023 Spring

    Get PDF
    Wright State University undergraduate and graduate course descriptions from Spring 2023.