
    Resource-aware scheduling for 2D/3D multi-/many-core processor-memory systems

    This dissertation addresses the complexities of 2D/3D multi-/many-core processor-memory systems, focusing on two key areas: enhancing timing predictability in real-time multi-core processors and optimizing performance within thermal constraints. The integration of an increasing number of transistors into compact chip designs, while boosting computational capacity, presents challenges in resource contention and thermal management. The first part of the thesis improves timing predictability. We enhance shared-cache interference analysis for set-associative caches, advancing the calculation of Worst-Case Execution Time (WCET). This development enables accurate assessment of cache interference and of the effectiveness of partitioned schedulers in real-world scenarios. We introduce TCPS, a novel task- and cache-aware partitioned scheduler that optimizes cache partitioning based on task-specific WCET sensitivity, leading to improved schedulability and predictability. Our research explores various cache and scheduling configurations, providing insights into their performance trade-offs. The second part focuses on thermal management in 2D/3D many-core systems. Recognizing the limitations of Dynamic Voltage and Frequency Scaling (DVFS) in S-NUCA many-core processors, we propose synchronous thread migrations as a thermal management strategy. This approach culminates in the HotPotato scheduler, which balances performance and thermal safety. We also introduce 3D-TTP, a transient temperature-aware power budgeting strategy for 3D-stacked systems that reduces the need for Dynamic Thermal Management (DTM) activation. Finally, we present 3QUTM, a novel method for 3D-stacked systems that combines core DVFS and memory-bank Low Power Modes with a learning algorithm, optimizing response times within thermal limits. This research contributes significantly to enhancing performance and thermal management in advanced processor-memory systems.
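
    As an illustration of the kind of trade-off TCPS navigates, the sketch below is a toy greedy heuristic, not the dissertation's algorithm: it hands out cache ways according to each task's WCET sensitivity, then partitions tasks onto cores first-fit by utilization. All task names and WCET curves here are invented.

```python
# Illustrative sketch of cache-aware partitioned scheduling: NOT the TCPS
# algorithm from the dissertation, just a toy greedy heuristic in its spirit.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    period: float                 # period == deadline (implicit-deadline model)
    wcet_by_ways: list[float]     # hypothetical WCET when given 0..N cache ways

TOTAL_WAYS = 8
CORES = 2

tasks = [
    Task("sensor",  10.0, [6.0, 4.0, 3.0, 2.5, 2.3, 2.2, 2.2, 2.2, 2.2]),
    Task("control", 20.0, [9.0, 8.5, 8.2, 8.0, 7.9, 7.9, 7.9, 7.9, 7.9]),
    Task("logger",  50.0, [12.0, 11.9, 11.9, 11.9, 11.9, 11.9, 11.9, 11.9, 11.9]),
]

# Greedily hand out cache ways to whichever task gains the largest WCET
# reduction from one more way (its "WCET sensitivity").
ways = {t.name: 0 for t in tasks}
for _ in range(TOTAL_WAYS):
    best = max(tasks, key=lambda t: t.wcet_by_ways[ways[t.name]]
               - t.wcet_by_ways[min(ways[t.name] + 1, len(t.wcet_by_ways) - 1)])
    ways[best.name] = min(ways[best.name] + 1, len(best.wcet_by_ways) - 1)

# First-fit partitioning onto cores, keeping utilization <= 1 per core.
loads = [0.0] * CORES
assignment = {}
for t in sorted(tasks, key=lambda t: -t.wcet_by_ways[ways[t.name]] / t.period):
    u = t.wcet_by_ways[ways[t.name]] / t.period
    core = next((c for c in range(CORES) if loads[c] + u <= 1.0), None)
    assert core is not None, f"{t.name} does not fit: task set unschedulable here"
    loads[core] += u
    assignment[t.name] = core

print(ways, assignment, [round(l, 2) for l in loads])
```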

    Towards A Practical High-Assurance Systems Programming Language

    Writing correct and performant low-level systems code is a notoriously demanding job, even for experienced developers. To make matters worse, formally reasoning about its correctness properties introduces yet another level of complexity. It requires considerable expertise in both systems programming and formal verification, and development can be extremely costly due to the sheer complexity of the systems and their nuances, unless assisted by appropriate tools that provide abstraction and automation. Cogent is designed to alleviate the burden on developers when writing and verifying systems code. It is a high-level functional language with a certifying compiler, which automatically proves the correctness of the compiled code and also provides a purely functional abstraction of the low-level program to the developer. Equational reasoning techniques can then be used to prove functional correctness properties of the program on top of this abstract semantics, which is notably less laborious than directly verifying the C code. To make Cogent a more approachable and effective tool for developing real-world systems, we further strengthen the framework by extending the core language and its ecosystem. Specifically, we enrich the language to allow users to control the memory representation of algebraic data types, while retaining the automatic proof via a data layout refinement calculus. We repurpose existing tools in a novel way and develop an intuitive foreign function interface, which provides users with a seamless experience when using Cogent in conjunction with native C. We augment the Cogent ecosystem with a property-based testing framework, which helps developers better understand the impact formal verification has on their programs and enables a progressive approach to producing high-assurance systems. Finally, we explore refinement type systems, which we plan to incorporate into Cogent for more expressiveness and better integration of systems programmers into the verification process.
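
    To make the property-based testing idea concrete, here is a minimal sketch in Python (not Cogent's actual framework) using the hypothesis library: a low-level, loop-based implementation is checked against a purely functional specification on arbitrary generated inputs, mirroring the refinement relation the compiler certifies.

```python
# Refinement-style property-based testing: check a low-level, imperative
# implementation against a purely functional specification.
# Uses the Python `hypothesis` library; functions here are illustrative.
from hypothesis import given, strategies as st

def spec_sum(xs: list) -> int:
    # Functional specification: the abstract semantics we trust.
    return sum(xs)

def impl_sum(xs: list) -> int:
    # "Low-level" implementation: explicit loop and accumulator,
    # standing in for compiled C-like code.
    acc = 0
    i = 0
    while i < len(xs):
        acc += xs[i]
        i += 1
    return acc

@given(st.lists(st.integers()))
def test_impl_refines_spec(xs):
    # Refinement as a testable property: for all generated inputs,
    # the implementation agrees with the specification.
    assert impl_sum(xs) == spec_sum(xs)

if __name__ == "__main__":
    test_impl_refines_spec()
    print("property held on all generated inputs")
```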

    Towards Fast and Scalable Private Inference

    Privacy and security have rapidly emerged as first-order design constraints. Users now demand more protection over who can see their data (confidentiality) as well as how it is used (control). Here, existing cryptographic techniques for security fall short: they secure data when stored or communicated, but must decrypt it for computation. Fortunately, a new paradigm of computing exists, which we refer to as privacy-preserving computation (PPC). Emerging PPC technologies can be leveraged for secure outsourced computation or to enable two parties to compute without revealing either party's secret data. Despite their phenomenal potential to revolutionize user protection in the digital age, their realization has been limited due to exorbitant computational, communication, and storage overheads. This paper reviews recent efforts to address various PPC overheads, using private inference (PI) in neural networks as a motivating application. First, the problem and the main technologies, including homomorphic encryption (HE), secret sharing (SS), garbled circuits (GCs), and oblivious transfer (OT), are introduced. Next, a characterization of their overheads when used to implement PI is presented. The characterization motivates the need for both GC and HE accelerators. Then two solutions are presented: HAAC for accelerating GCs and RPU for accelerating HE. To conclude, results and effects are shown, with a discussion of what future work is needed to overcome the remaining overheads of PI.
    Comment: Appears in the 20th ACM International Conference on Computing Frontiers
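
    As a flavor of one of the PPC building blocks above, the following toy Python sketch shows additive secret sharing (SS): two parties hold uniformly random-looking shares, yet can add secrets share-wise. It is purely educational and omits everything a real protocol needs (secure channels, multiplication, malicious security).

```python
# Toy demonstration of additive secret sharing (SS). Neither share alone
# reveals the secret, but addition works share-wise without decryption.
import secrets

MOD = 2**64  # arithmetic over a fixed modulus

def share(x: int) -> tuple:
    r = secrets.randbelow(MOD)          # party 0's share is uniformly random
    return r, (x - r) % MOD             # party 1's share completes the secret

def reconstruct(s0: int, s1: int) -> int:
    return (s0 + s1) % MOD

a0, a1 = share(20)
b0, b1 = share(22)
# Each party adds its shares locally; no secret is ever exchanged.
c0, c1 = (a0 + b0) % MOD, (a1 + b1) % MOD
assert reconstruct(c0, c1) == 42
print("secure addition:", reconstruct(c0, c1))
```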

    Revisiting Language Support for Generic Programming: When Genericity Is a Core Design Goal

    Context: Generic programming, as defined by Stepanov, is a methodology for writing efficient and reusable algorithms by considering only the required properties of their underlying data types and operations. Generic programming has proven to be an effective means of constructing libraries of reusable software components in languages that support it. Generics-related language design choices play a major role in how conducive generic programming is in practice.
    Inquiry: Several mainstream programming languages (e.g. Java and C++) were first created without generics; features to support generic programming were added later, gradually. Much of the existing literature on supporting generic programming thus focuses on retrofitting generic programming into existing languages and identifying related implementation challenges. Is the programming experience significantly better, or different, when using a language designed for generic programming, free of limitations from prior design choices?
    Approach: We examine Magnolia, a language designed to embody generic programming. Magnolia is representative of an approach to language design rooted in algebraic specifications. We repeat a well-known experiment, putting Magnolia's generic programming facilities under scrutiny by implementing a subset of the Boost Graph Library, and reflect on our development experience.
    Knowledge: We discover that the idioms identified as key features for supporting Stepanov-style generic programming in previous studies and work on the topic do not tell the full story. We clarify which of them are more of a means to an end rather than fundamental features for supporting generic programming. Based on the development experience with Magnolia, we identify variadics as an additional key feature for generic programming and point out limitations and challenges of genericity by property.
    Grounding: Our work uses a well-known framework from the literature for evaluating the generic programming facilities of a language to evaluate the algebraic approach through Magnolia, and we draw comparisons with well-known programming languages.
    Importance: This work gives a fresh perspective on generic programming and clarifies which language properties are fundamental, and what their trade-offs are, when supporting Stepanov-style generic programming. This understanding of how to set the ground for generic programming will inform future language design.
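
    For readers unfamiliar with the style, here is a small, hypothetical Python sketch of Stepanov-style genericity (not Magnolia code): an exponentiation-by-squaring algorithm written against the only property it needs, an associative binary operation, and then instantiated at unrelated concrete types.

```python
# Stepanov-style generic programming: the algorithm depends only on the
# required property of its operation (associativity), not on any concrete type.
from typing import Callable, TypeVar

T = TypeVar("T")

def power(x: T, n: int, op: Callable[[T, T], T]) -> T:
    """Combine x with itself n >= 1 times under any associative op,
    in O(log n) applications (exponentiation by squaring)."""
    assert n >= 1
    acc = None
    while n > 0:
        if n & 1:
            acc = x if acc is None else op(acc, x)
        x = op(x, x)
        n >>= 1
    return acc

# The same generic algorithm instantiated at different "concepts":
print(power(2, 10, lambda a, b: a * b))        # integer multiplication -> 1024
print(power("ab", 3, lambda a, b: a + b))      # string concatenation -> 'ababab'
print(power([[1, 1], [1, 0]], 8,               # 2x2 matrix product -> Fibonacci numbers
             lambda A, B: [[sum(a * b for a, b in zip(r, c)) for c in zip(*B)]
                           for r in A]))
```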

    Development of Real-Time Detection for Indonesian Sign Language Using the Deep Learning Methods Long Short-Term Memory and Convolutional Neural Network

    The advent of Computer Vision has shaped the research field of Sign Language Recognition Systems (SLRS). SLRS research in Indonesia addresses two sign language standards: SIBI (Sistem Bahasa Isyarat Indonesia) and BISINDO (Bahasa Isyarat Indonesia). The challenge in this research lies in processing dynamic and static images after recognition preprocessing: moving and static images require different treatment during initial recognition, which affects how quickly results can be produced, so a model with good and fast training is needed. The aim of this research is to identify the factors that determine the accuracy of real-time object detection and image/video classification for BISINDO using the Deep Learning methods Long Short-Term Memory (LSTM) and Convolutional Neural Network (CNN). This research matters because its results can serve as a basis for accelerating the further development of sign language recognition applications specific to BISINDO, enabling people with disabilities and the general public to carry out two-way communication more easily, in real time, in the future.
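
    A minimal sketch of the CNN + LSTM pipeline described above, written in Keras: a CNN extracts per-frame features and an LSTM models the gesture over time. The input shape, layer sizes, and class count are assumptions for illustration, not the authors' configuration.

```python
# Sketch of a CNN + LSTM video classifier for sign recognition.
# Hyperparameters and the class count are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 26     # assumed: one class per signed letter
FRAMES = 30          # assumed: frames sampled per gesture clip

model = models.Sequential([
    layers.Input(shape=(FRAMES, 64, 64, 3)),              # video: T x H x W x C
    layers.TimeDistributed(layers.Conv2D(32, 3, activation="relu")),
    layers.TimeDistributed(layers.MaxPooling2D()),
    layers.TimeDistributed(layers.Conv2D(64, 3, activation="relu")),
    layers.TimeDistributed(layers.MaxPooling2D()),
    layers.TimeDistributed(layers.Flatten()),             # per-frame feature vector
    layers.LSTM(128),                                     # temporal dynamics of the sign
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```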

    QUALITY ANALYSIS OF A REDESIGNED PYROLYSIS DEVICE FOR CONVERTING PLASTIC WASTE INTO LIQUID FUEL

    Plastic is essentially made from natural petroleum, so it is highly desirable to return it to its original form, which can be done by pyrolysis. Pyrolysis is a conversion technology well suited to addressing the environmental harm and local problems caused by inefficient handling of plastic waste before and after use and by its mass accumulation. The pyrolysis technique converts an organic material, under controlled elevated temperature, into compounds of smaller molecules. The result of pyrolysis is a liquid product: liquid fuel. This study used polypropylene (PP) plastic, both colorless and colored. The colorless plastic yielded 58 ml and the colored plastic 78 ml; each required a burning time of 200 minutes, with a maximum temperature of 175°C. Keywords: plastic waste, pyrolysis, liquid fuel
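
    A quick back-of-the-envelope check of the reported yields, assuming both samples were processed under the same conditions:

```python
# Average production rate in ml per minute for each PP sample,
# from the yields and burn time reported in the abstract.
burn_minutes = 200
yield_ml = {"colorless PP": 58, "colored PP": 78}
for sample, ml in yield_ml.items():
    print(f"{sample}: {ml} ml / {burn_minutes} min = {ml / burn_minutes:.2f} ml/min")
# colorless PP: 0.29 ml/min; colored PP: 0.39 ml/min
```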

    A Survey of Large Language Models

    Language is essentially a complex, intricate system of human expressions governed by grammatical rules, and developing capable AI algorithms for comprehending and grasping a language poses a significant challenge. As a major approach, language modeling has been widely studied for language understanding and generation over the past two decades, evolving from statistical language models to neural language models. Recently, pre-trained language models (PLMs) have been proposed, built by pre-training Transformer models over large-scale corpora and showing strong capabilities in solving various NLP tasks. Since researchers have found that model scaling can lead to performance improvements, they have further studied the scaling effect by increasing the model size even more. Interestingly, when the parameter scale exceeds a certain level, these enlarged language models not only achieve a significant performance improvement but also exhibit special abilities that are not present in small-scale language models. To mark this difference in parameter scale, the research community has coined the term large language models (LLMs) for PLMs of significant size. Recently, research on LLMs has been largely advanced by both academia and industry, and a remarkable milestone is the launch of ChatGPT, which has attracted widespread attention from society. The technical evolution of LLMs has been making an important impact on the entire AI community and stands to revolutionize the way we develop and use AI algorithms. In this survey, we review the recent advances of LLMs by introducing the background, key findings, and mainstream techniques. In particular, we focus on four major aspects of LLMs: pre-training, adaptation tuning, utilization, and capacity evaluation. Besides, we summarize the available resources for developing LLMs and discuss remaining issues for future directions.
    Comment: ongoing work; 51 pages

    2023-2024 Boise State University Undergraduate Catalog

    This catalog is primarily for and directed at students, though it serves many audiences, such as high school counselors, academic advisors, and the public. In this catalog you will find an overview of Boise State University and information on admission, registration, grades, tuition and fees, financial aid, housing, student services, and other important policies and procedures. Most of the catalog, however, is devoted to describing the various programs and courses offered at Boise State.

    Explanatory note for the graduate thesis «Development of simulator software on the topic “Derivatives” of the distance learning course “Higher and applied mathematics”»

    The purpose of this graduate thesis is the development of software, in the C++ programming language, for applying derivatives. The object of the thesis is a distance learning system for students; its subject is the software for applying derivatives in C++. Samuel AMPAI. Bachelor's qualification thesis: «Development of simulator software on the topic “Derivatives” of the distance learning course “Higher and applied mathematics”». Poltava, 2023.
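
    The thesis implements its simulator in C++; purely as an illustration of the simulator idea, here is a compact Python sketch that poses differentiation exercises and checks a student's answer symbolically (the exercise list and checking routine are assumed, not taken from the thesis):

```python
# Toy derivative-drill simulator: check a student's answer against the
# symbolic derivative computed by sympy.
import sympy as sp

x = sp.symbols("x")
exercises = [x**2, sp.sin(x) * x, sp.exp(2 * x)]   # illustrative tasks

def check_answer(f, student_input: str) -> bool:
    """Return True iff the student's expression equals d f / d x."""
    try:
        answer = sp.sympify(student_input)
    except sp.SympifyError:
        return False
    return sp.simplify(answer - sp.diff(f, x)) == 0

print(check_answer(x**2, "2*x"))               # True
print(check_answer(sp.sin(x) * x, "cos(x)"))   # False: product rule forgotten
```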

    DESIGN AND DEVELOPMENT OF A COMPETENCY CERTIFICATE ISSUANCE SYSTEM AS A BLOCKCHAIN- AND WEB3-BASED NON-FUNGIBLE TOKEN (NFT) ASSET

    The Diploma Supplement (Surat Keterangan Pendamping Ijazah, SKPI) is highly valuable to university graduates looking for work, and it is therefore vulnerable to forgery and manipulation. Blockchain is a distributed database technology that supports data integrity, transparency, and traceability, making it well suited to recording and storing such credentials as open, interoperable digital assets. To that end, the ERC721 standard is used to represent the SKPI as a Non-Fungible Token. This research builds a system that issues the SKPI as a digital asset in the NFT standard, which requires a Web3 technology development ecosystem. The system was successfully built, implemented, and tested on a blockchain test network (testnet).
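
    To illustrate the ERC721 ownership semantics the system relies on, here is a toy, in-memory Python model (not the authors' smart contract, and not connected to any blockchain): it only shows the mint/ownerOf/transfer bookkeeping for certificate tokens.

```python
# Toy in-memory model of ERC721-style ownership for certificate (SKPI)
# tokens identified by an integer token ID. Educational sketch only.
class ToyERC721:
    def __init__(self):
        self._owner = {}                       # token_id -> owner address
        self._uri = {}                         # token_id -> metadata URI

    def mint(self, to: str, token_id: int, token_uri: str) -> None:
        assert token_id not in self._owner, "token already exists"
        self._owner[token_id] = to
        self._uri[token_id] = token_uri        # e.g. points at the SKPI document

    def owner_of(self, token_id: int) -> str:
        return self._owner[token_id]

    def transfer(self, frm: str, to: str, token_id: int) -> None:
        assert self._owner[token_id] == frm, "only the owner may transfer"
        self._owner[token_id] = to

registry = ToyERC721()
registry.mint("0xGraduateWallet", 1, "ipfs://skpi-metadata-hash")
assert registry.owner_of(1) == "0xGraduateWallet"
print("certificate token 1 owned by", registry.owner_of(1))
```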