
    Manufacturing of three dimensional integrated circuits

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Materials Science and Engineering, 2007. Includes bibliographical references (p. 221-231).

    Along with scaling down in size, novel materials have been introduced into the semiconductor industry to enable continued improvements in performance and cost as predicted by Moore's law. It has become important, now more than ever, to include an environmental impact evaluation of future technologies before they are introduced into manufacturing, in order to identify potentially environmentally harmful materials or processes and understand their implications, costs, and mitigation requirements. In this thesis, we introduce a methodology to compare alternative options on the environmental axis, along with the cost and performance axes, in order to create environmentally aware and benign technologies. This methodology also helps to identify potential performance and cost issues in novel technologies by taking a transparent, bottom-up assessment approach. The methodology is applied to the evaluation of the MIT 3D IC technology in comparison to a standard CMOS 2D IC approach. Both options are compared on all three axes: performance, cost, and environmental impact. The "handle wafer" unit process in the existing 3D IC technology, which is a crucial process for back-to-face integration, is found to have a large environmental impact because of its use of thick metal sacrificial layers and high energy consumption. We explore three different handle wafer options: a between-die channel, an oxide release layer, and an alternative low-temperature permanent bonding approach. The first two use a chemical handle wafer release mechanism, while the third explores solid-liquid inter-diffusion (SLID) bonding using copper-indium at 200°C. Preliminary results for copper-indium bonding indicate that a sub-micron-thick multi-layer copper-indium stack, when bonded to a 300 nm thick copper film, results in large voids in the bonding interface, primarily due to rough as-deposited films. Finally, we conduct an overall assessment of these and other proposed handle wafer technologies. The overall assessment shows that the oxide release layer approach appears promising; however, each process option has its strengths and weaknesses, which need to be understood and pursued accordingly.

    by Ajay Somani. Ph.D.

    Efficient means of Achieving Composability using Transactional Memory

    A major focus of software transactional memory systems (STMs) has been to facilitate multiprocessor programming and provide parallel programmers with an abstraction for the speedy and efficient development of parallel applications. To this end, several models for incorporating object-level or higher-level semantics into STM have recently been proposed, including transactional boosting, the transactional data structure library, open nested transactions, and abstract nested transactions. We build an alternative object-model STM (OSTM) by adopting the transactional tree model of Weikum et al., originally proposed for databases, and extend that work by proposing comprehensive legality definitions and a conflict notion that enable efficient composability of OSTM. We show, for the first time, that the proposed OSTM satisfies co-opacity. We build OSTM using a chained hash table data structure, with chaining implemented by a lazy skip-list. We observe that the major concurrency hotspot is the chaining data structure within the hash table; the lazy skip-list reduces traversal overhead compared with plain lists, giving average-case O(log n) access. We also optimise lookups: they are validated at the instant they execute and are not validated again in the commit phase. This makes lookup-dominated transactions more efficient while preserving co-opacity.
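
    The abstract's central data structure is a chained hash table whose buckets are traversed lazily and whose lookups are validated only at the instant they execute. The following Python sketch illustrates that layout under simplifying assumptions: a sorted lazy linked list stands in for the paper's lazy skip-list, keys are numeric, and names such as LazyList and OSTMHashTable are illustrative rather than taken from the paper.

        import threading

        class Node:
            """Bucket-chain node; 'marked' supports lazy (deferred) deletion."""
            def __init__(self, key, value=None):
                self.key = key
                self.value = value
                self.marked = False              # logically deleted?
                self.next = None
                self.lock = threading.Lock()

        class LazyList:
            """Sorted linked list with lazy traversal (stand-in for the lazy skip-list)."""
            def __init__(self):
                self.head = Node(float("-inf"))
                self.head.next = Node(float("inf"))

            def lookup(self, key):
                # Validated only at the instant it executes; never re-validated at commit.
                curr = self.head
                while curr.key < key:
                    curr = curr.next
                if curr.key == key and not curr.marked:
                    return curr.value
                return None

            def insert(self, key, value):
                while True:
                    pred, curr = self.head, self.head.next
                    while curr.key < key:
                        pred, curr = curr, curr.next
                    with pred.lock, curr.lock:
                        # Validate that neither node was deleted or bypassed, then link in.
                        if not pred.marked and not curr.marked and pred.next is curr:
                            if curr.key == key:
                                curr.value = value
                            else:
                                node = Node(key, value)
                                node.next = curr
                                pred.next = node
                            return

        class OSTMHashTable:
            """Chained hash table; each bucket chain is a lazily traversed list."""
            def __init__(self, n_buckets=64):
                self.buckets = [LazyList() for _ in range(n_buckets)]

            def put(self, key, value):
                self.buckets[hash(key) % len(self.buckets)].insert(key, value)

            def get(self, key):
                return self.buckets[hash(key) % len(self.buckets)].lookup(key)

    Lookups in this sketch never acquire locks, which mirrors why lookup-dominated transactions benefit from skipping commit-time re-validation.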

    Smart Street Light System using Embedded System

    In today’s world, energy saving has become a major need, and this project was developed with that problem in mind. A huge amount of electrical power in many countries is consumed in lighting the streets. However, there are periods during the night with low vehicle density, and even no vehicles late at night. The main principle of this system is to detect objects, trigger the corresponding circuit, and provide light only on the part of the road where it is needed. Logically, this system may save a large amount of electrical power. This paper proposes different possible architectures for such a system.
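
    As a concrete illustration of the detect-then-trigger principle described above, here is a minimal, hypothetical Python sketch of the control logic: it brightens only the segment where an object is detected (plus the segment ahead) and lets everything else fall back to a dim level after a timeout. The sensor and dimmer calls are placeholders for real embedded-hardware I/O, and all names and constants are assumptions for the example.

        import time

        DIM, BRIGHT = 10, 100        # lamp intensity levels (percent), assumed values
        HOLD_SECONDS = 30            # how long a segment stays bright after a detection

        class StreetSegment:
            """One lamp plus its object/motion sensor (hardware access is simulated)."""
            def __init__(self, segment_id):
                self.segment_id = segment_id
                self.level = DIM
                self.bright_until = 0.0

            def object_detected(self):
                # Placeholder: a real system would read an IR/ultrasonic/PIR sensor here.
                return False

            def set_level(self, level):
                # Placeholder: a real system would drive a relay or PWM dimmer here.
                self.level = level

        def control_loop(segments, now=time.time):
            """Brighten only the segments where an object was detected, plus the next one."""
            for i, seg in enumerate(segments):
                if seg.object_detected():
                    seg.bright_until = now() + HOLD_SECONDS
                    if i + 1 < len(segments):        # light the road ahead as well
                        segments[i + 1].bright_until = now() + HOLD_SECONDS
            for seg in segments:
                seg.set_level(BRIGHT if now() < seg.bright_until else DIM)

        if __name__ == "__main__":
            road = [StreetSegment(i) for i in range(10)]
            while True:                              # periodic polling of the sensors
                control_loop(road)
                time.sleep(0.5)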

    Comparative Study of Cost and Power Consumption of HVAC System Using Phase Change Material

    Phase change materials (PCMs) are materials which absorb heat from the surrounding air as latent heat. They typically operate over a near-constant temperature range and offer a high energy density when melting from the solidified state. Nowadays, heating, ventilation and air conditioning (HVAC) of commercial and domestic buildings, and of green rooms in pharmaceutical companies, is necessary for maintaining the desired atmospheric conditions inside a building for an optimum working environment, and a lot of energy is consumed in this process. Therefore, there is a need to reduce power consumption, and this is where PCMs find a huge market. These materials store part of the available heat as latent thermal energy without an increase in their temperature. Thus, there is large scope for using them to reduce the heat load, and hence the refrigerating effect required, for a specified area. This paper compares the heat load calculations for air-refrigerated areas, such as those in commercial buildings, domestic settings, and industrial applications, with and without the use of PCMs.
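
    To make the with/without-PCM comparison concrete, the following Python snippet sketches a simplified peak-load calculation in which a PCM buffer absorbs part of the heat as latent heat. The masses, latent heat, and loads are assumed example numbers, not figures from the paper.

        # Simplified peak cooling-load comparison with and without a PCM buffer.
        # All numbers are assumed for illustration, not taken from the paper.

        def peak_cooling_load_kw(sensible_kw, pcm_mass_kg=0.0,
                                 latent_heat_kj_per_kg=200.0, peak_hours=4.0):
            """Load the chiller must meet; the PCM absorbs part of it as latent heat."""
            pcm_absorbed_kw = (pcm_mass_kg * latent_heat_kj_per_kg) / (peak_hours * 3600.0)
            return max(sensible_kw - pcm_absorbed_kw, 0.0)

        baseline = peak_cooling_load_kw(sensible_kw=50.0)                      # no PCM
        with_pcm = peak_cooling_load_kw(sensible_kw=50.0, pcm_mass_kg=1000.0)  # 1 t of PCM

        print(f"Peak load without PCM: {baseline:.1f} kW")   # 50.0 kW
        print(f"Peak load with PCM:    {with_pcm:.1f} kW")   # about 36 kW
        print(f"Reduction:             {100 * (1 - with_pcm / baseline):.0f} %")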

    RecD: Deduplication for End-to-End Deep Learning Recommendation Model Training Infrastructure

    We present RecD (Recommendation Deduplication), a suite of end-to-end infrastructure optimizations across the Deep Learning Recommendation Model (DLRM) training pipeline. RecD addresses immense storage, preprocessing, and training overheads caused by feature duplication inherent in industry-scale DLRM training datasets. Feature duplication arises because DLRM datasets are generated from interactions: while each user session can generate multiple training samples, many features' values do not change across these samples. We demonstrate how RecD exploits this property, end-to-end, across a deployed training pipeline. RecD optimizes data generation pipelines to decrease dataset storage and preprocessing resource demands and to maximize duplication within a training batch. RecD introduces a new tensor format, InverseKeyedJaggedTensors (IKJTs), to deduplicate feature values in each batch. We show how DLRM model architectures can leverage IKJTs to drastically increase training throughput. RecD improves training throughput, preprocessing throughput, and storage efficiency by up to 2.48x, 1.79x, and 3.71x, respectively, in an industry-scale DLRM training system. Comment: Published in the Proceedings of the Sixth Conference on Machine Learning and Systems (MLSys 2023).
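
    The core idea behind IKJTs is to store each distinct feature value once per batch and keep only per-sample indices into that deduplicated table. The NumPy sketch below illustrates that idea under stated assumptions; it is not the actual IKJT or torchrec implementation, and the feature values and embedding table are made up for the example.

        import numpy as np

        def dedup_feature(batch_values):
            """Deduplicate one sparse feature across the batch: distinct values + inverse indices."""
            unique_values, inverse_idx = np.unique(batch_values, return_inverse=True)
            return unique_values, inverse_idx

        def lookup_embeddings(embedding_table, unique_values, inverse_idx):
            """Embed each distinct value once, then expand back to per-sample rows."""
            unique_emb = embedding_table[unique_values]   # one lookup per distinct value
            return unique_emb[inverse_idx]                # scatter to the full batch

        # Samples from the same user session often repeat the same feature value.
        batch = np.array([42, 42, 42, 7, 7, 42, 13])
        table = np.random.default_rng(0).normal(size=(100, 4))   # toy embedding table

        uniq, inv = dedup_feature(batch)
        per_sample = lookup_embeddings(table, uniq, inv)
        print(f"embedded {len(uniq)} distinct values instead of {len(batch)}")  # 3 vs 7
        assert per_sample.shape == (len(batch), 4)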

    Impact of safety-related dose reductions or discontinuations on sustained virologic response in HCV-infected patients: Results from the GUARD-C Cohort

    BACKGROUND: Despite the introduction of direct-acting antiviral agents for chronic hepatitis C virus (HCV) infection, peginterferon alfa/ribavirin remains relevant in many resource-constrained settings. The non-randomized GUARD-C cohort investigated baseline predictors of safety-related dose reductions or discontinuations (sr-RD) and their impact on sustained virologic response (SVR) in patients receiving peginterferon alfa/ribavirin in routine practice. METHODS: A total of 3181 HCV-mono-infected treatment-naive patients were assigned to 24 or 48 weeks of peginterferon alfa/ribavirin by their physician. Patients were categorized by time to first sr-RD (Week 4/12). Detailed analyses of the impact of sr-RD on SVR24 (HCV RNA <50 IU/mL) were conducted in 951 Caucasian, noncirrhotic genotype (G)1 patients assigned to peginterferon alfa-2a/ribavirin for 48 weeks. The probability of SVR24 was identified by a baseline scoring system (range: 0-9 points) on which scores of 5 to 9 and <5 represent high and low probability of SVR24, respectively. RESULTS: SVR24 rates were 46.1% (754/1634), 77.1% (279/362), 68.0% (514/756), and 51.3% (203/396) in G1, G2, G3, and G4 patients, respectively. Overall, 16.9% and 21.8% of patients experienced ≥1 sr-RD for peginterferon alfa and ribavirin, respectively. Among Caucasian noncirrhotic G1 patients: female sex, lower body mass index, pre-existing cardiovascular/pulmonary disease, and low hematological indices were prognostic factors for sr-RD; SVR24 was lower in patients with ≥1 vs. no sr-RD by Week 4 (37.9% vs. 54.4%; P = 0.0046) and Week 12 (41.7% vs. 55.3%; P = 0.0016); sr-RD by Week 4/12 significantly reduced SVR24 in patients with scores <5 but not ≥5. CONCLUSIONS: sr-RD to peginterferon alfa-2a/ribavirin significantly impacts SVR24 rates in treatment-naive, noncirrhotic, Caucasian G1 patients. Baseline characteristics can help select patients with a high probability of SVR24 and a low probability of sr-RD with peginterferon alfa-2a/ribavirin.