
    Character Projection Lithography for Application-Specific Integrated Circuits


    E-BLOW: E-Beam Lithography Overlapping aware Stencil Planning for MCC System

    Electron beam lithography (EBL) is a promising maskless solution for technology beyond the 14nm logic node. To overcome its throughput limitation, the traditional EBL system has recently been extended into the MCC system. In this paper, we present E-BLOW, a tool to solve the overlapping-aware stencil planning (OSP) problems in MCC systems. E-BLOW integrates several novel speedup techniques, namely successive relaxation, dynamic programming, and KD-Tree based clustering, to achieve good performance in terms of both runtime and solution quality. Experimental results show that, compared with previous works, E-BLOW demonstrates better performance for both the conventional EBL system and the MCC system.
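The KD-Tree based clustering mentioned in the E-BLOW abstract relies on fast nearest-neighbour queries over points in a plane. The following is a minimal, self-contained sketch of a 2-D KD-tree with a nearest-neighbour search; the points are hypothetical stand-ins for character placements, not data from the paper.

```python
# Minimal 2-D KD-tree nearest-neighbour sketch, illustrating the kind of
# spatial index that KD-Tree based clustering builds on.
# Each node: (point, left_subtree, right_subtree, split_axis).

def build(points, depth=0):
    """Recursively build a KD-tree, alternating the split axis per level."""
    if not points:
        return None
    axis = depth % 2
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return (points[mid],
            build(points[:mid], depth + 1),
            build(points[mid + 1:], depth + 1),
            axis)

def nearest(node, target, best=None):
    """Return (point, squared_distance) of the nearest point to target."""
    if node is None:
        return best
    point, left, right, axis = node
    d = (point[0] - target[0]) ** 2 + (point[1] - target[1]) ** 2
    if best is None or d < best[1]:
        best = (point, d)
    # Search the half containing the target first, then the far half
    # only if the splitting plane is closer than the best hit so far.
    near, far = (left, right) if target[axis] < point[axis] else (right, left)
    best = nearest(near, target, best)
    if (target[axis] - point[axis]) ** 2 < best[1]:
        best = nearest(far, target, best)
    return best

pts = [(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)]
tree = build(pts)
print(nearest(tree, (9, 2))[0])
```

The pruning test against the splitting plane is what gives the expected O(log n) average query time that makes this kind of clustering fast in practice.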

    Data mining and predictive analytics application on cellular networks to monitor and optimize quality of service and customer experience

    This research study focuses on application models of Data Mining and Machine Learning covering cellular network traffic, with the objective of arming Mobile Network Operators with a full view of the performance branches (Services, Devices, Subscribers). The purpose is to optimize and minimize the time to detect service and subscriber behaviour patterns. Different data mining techniques and predictive algorithms have been applied to real cellular network datasets to uncover data usage patterns using specific Key Performance Indicators (KPIs) and Key Quality Indicators (KQIs). The following tools are used to develop the concept: RStudio for Machine Learning and process visualization, Apache Spark and SparkSQL for data and big data processing, and clicData for service visualization. Two use cases have been studied during this research. In the first study, data and predictive analytics are applied in the field of telecommunications to efficiently address users’ experience, with the goal of increasing customer loyalty and decreasing churn, or customer attrition, which can result in revenue loss. Using real cellular network transactions, predictive analytics are used to identify customers who are likely to churn. Prediction algorithms and models including Classification Trees, Random Forests, Neural Networks, and Gradient Boosting have been used together with exploratory data analysis to determine the relationships between predictor variables. The data is segmented into two sets: a training set to train the model and a testing set to test it. The best-performing model is selected on the basis of prediction accuracy, sensitivity, specificity, and the confusion matrix on the test set.
The second use case analyses Service Quality Management (SQM) using modern data mining techniques and the advantages of in-memory big data processing with Apache Spark and SparkSQL to save cost on tool investment; a low-cost Service Quality Management model is thus proposed and analysed. With the increase in smartphone adoption and access to mobile internet services, applications such as streaming and interactive chat require a certain service level to ensure customer satisfaction. As a result, an SQM framework is developed with a Service Quality Index (SQI) and Key Performance Indicators (KPIs). The research concludes with recommendations and future studies around modern technology applications in telecommunications, including the Internet of Things (IoT), cloud computing, and recommender systems.
Cellular networks have evolved and are still evolving: from traditional circuit-switched GSM (Global System for Mobile Communication), which supported only voice services and extremely low data rates, to all-packet LTE networks accommodating high-speed data used for service applications such as video streaming, video conferencing, and heavy torrent downloads; and, in the near future, the roll-out of fifth-generation (5G) cellular networks, intended to support complex technologies such as IoT and High Definition video streaming and projected to cater for massive amounts of data. With the high demand for network services and easy access to mobile phones, billions of transactions are performed by subscribers. These transactions appear in the form of SMSs, handovers, voice calls, web browsing activities, video and audio streaming, and heavy downloads and uploads. Nevertheless, the stormy growth in data traffic and the high requirements of new services introduce bigger challenges for Mobile Network Operators (MNOs) in analysing the big data traffic flowing in the network. Quality of Service (QoS) and Quality of Experience (QoE) therefore turn into a challenge.
Inefficiency in mining and analysing data and in applying predictive intelligence to network traffic can produce a high rate of unhappy customers or subscribers, loss of revenue, and a negative perception of services. Researchers and service providers are investing in Data Mining, Machine Learning, and Artificial Intelligence (AI) methods to manage services and experience.
Electrical and Mining Engineering, M. Tech (Electrical Engineering)
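The churn study above evaluates models by accuracy, sensitivity, and specificity derived from the confusion matrix on the test set. A minimal sketch of that evaluation step follows; the actual/predicted labels are hypothetical (1 = churner), and the metric definitions are the standard ones, not code from the study (which used RStudio).

```python
# Confusion-matrix evaluation for a binary churn classifier (1 = churn).
# Labels below are illustrative, not real subscriber data.

def confusion_matrix(actual, predicted):
    """Return (tp, tn, fp, fn) counts for binary labels."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    return tp, tn, fp, fn

def metrics(actual, predicted):
    tp, tn, fp, fn = confusion_matrix(actual, predicted)
    return {
        "accuracy": (tp + tn) / len(actual),
        "sensitivity": tp / (tp + fn),   # true-positive rate: churners caught
        "specificity": tn / (tn + fp),   # true-negative rate: loyal correctly kept
    }

actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 0, 1, 1, 0]
print(metrics(actual, predicted))
```

Sensitivity matters most in this setting: a missed churner (false negative) is lost revenue, whereas a false positive merely costs a retention offer.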

    Data systems elements technology assessment and system specifications, issue no. 2

    The ability to satisfy the objectives of future NASA Office of Applications programs is dependent on technology advances in a number of areas of data systems. The hardware and software technologies of end-to-end systems (data processing elements through ground processing, dissemination, and presentation) are examined in terms of state of the art, trends, and projected developments in the 1980 to 1985 timeframe. Capability is considered in terms of elements that are either commercially available or that can be implemented from commercially available components with minimal development.

    Development of computational methods for electronic structural characterization of strongly correlated materials: from different ab-initio perspectives

    Electronic correlations in materials drive a variety of fascinating phenomena, from magnetism to metal-to-insulator transitions (MIT), arising from the coupling between electron spin, charge, ionic displacements, and orbital ordering. Although Density Functional Theory (DFT) successfully describes the electronic structure of weakly interacting material systems, being a static mean-field approach it fails to predict the properties of Strongly Correlated Materials (SCM), which include transition and rare-earth metals with prominent electron localization, as in the case of d and f orbitals, due to their spatial confinement. Dynamical Mean Field Theory (DMFT) is a Green’s function based method that has shown success in treating SCM. This dissertation focuses on the development of a user-friendly, open-source Python/Fortran framework, “DMFTwDFT”, combining DFT and DMFT to characterize properties of both weakly and strongly correlated materials. The DFT Kohn-Sham orbitals are projected onto Maximally Localized Wannier Functions (MLWFs), which essentially maps the Hubbard model onto a local impurity model that we solve numerically using quantum Monte Carlo methods to capture both the itinerant and localized nature of electrons. Additionally, we provide a library mode for computing the DMFT density matrix which can be linked and called internally from any DFT package, allowing developers of other DFT codes to interface with our package and achieve full charge self-consistency within DFT+DMFT. We then study the stability and diffusion of oxygen vacancies in the correlated material LaNiO3. By treating Ni-d as correlated orbitals along with a Ni-O hybridization manifold, we show that certain configurations undergo a MIT depending on the environment of their vacancies. We also compute the transition path energy of a single oxygen vacancy by means of the nudged elastic band (NEB) method.
We show that the diffusion energy profile calculated through DFT+U differs from that of DMFT, due to correlation effects that are not well captured by static mean-field theories. Additionally, DMFTwDFT was utilized to study strongly correlated alloys and materials useful for neuromorphic computing applications.
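The DFT+DMFT scheme described above is, at its core, a self-consistency loop between a lattice Green's function and a local impurity problem. As a toy illustration only, the sketch below runs that loop on the Bethe lattice, where the hybridisation obeys the standard condition Δ = t²·G, using a non-interacting (U = 0) "impurity solver" so it stays a few lines. DMFTwDFT itself uses quantum Monte Carlo solvers and Wannier projections; every number here is illustrative, not from the dissertation.

```python
# Toy DMFT self-consistency cycle on the Bethe lattice at one Matsubara
# frequency. Hybridisation Delta and Green's function G are iterated until
# Delta = t^2 * G holds. U = 0 makes the "impurity solver" a one-liner.

def dmft_loop(omega_n, t=0.5, tol=1e-12, max_iter=1000):
    """Iterate the Bethe-lattice self-consistency at frequency i*omega_n."""
    delta = 0.0 + 0.0j                       # initial hybridisation guess
    g = 0.0 + 0.0j
    for _ in range(max_iter):
        g = 1.0 / (1j * omega_n - delta)     # U = 0 impurity Green's function
        new_delta = t * t * g                # Bethe-lattice condition: Delta = t^2 G
        if abs(new_delta - delta) < tol:
            break
        delta = new_delta
    return g

g = dmft_loop(omega_n=1.0)
# At convergence, G satisfies the closed equation G = 1 / (i*omega_n - t^2 * G).
```

In a real DFT+DMFT calculation the cheap U = 0 line is replaced by an expensive many-body impurity solve, and the converged local quantities feed back into the DFT charge density, which is the charge self-consistency the abstract refers to.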

    The AURORA Gigabit Testbed

    AURORA is one of five U.S. networking testbeds charged with exploring applications of, and technologies necessary for, networks operating at gigabit-per-second or higher bandwidths. The emphasis of the AURORA testbed, distinct from the other four testbeds, BLANCA, CASA, NECTAR, and VISTANET, is research into the supporting technologies for gigabit networking. Like the other testbeds, AURORA is itself an experiment in collaboration, where government initiative (in the form of the Corporation for National Research Initiatives, which is funded by DARPA and the National Science Foundation) has spurred interaction among pre-existing centers of excellence in industry, academia, and government. AURORA has been charged with research into networking technologies that will underpin future high-speed networks. This paper provides an overview of the goals and methodologies employed in AURORA and points to some preliminary results from our first year of research, ranging from analytic results to experimental prototype hardware. It sets out our targets, which include new software architectures, network abstractions, and hardware technologies, as well as applications for our work.

    Rapid response to pandemic threats: immunogenic epitope detection of pandemic pathogens for diagnostics and vaccine development using peptide microarrays

    The emergence and re-emergence of pathogens bearing the risk of becoming a pandemic threat are on the rise. Increased travel and trade, growing population density, and changes in urbanization and climate have a critical impact on infectious disease spread. Currently, the world is confronted with the emergence of the novel coronavirus SARS-CoV-2, responsible to date for more than 800,000 deaths globally. Outbreaks caused by viruses such as SARS-CoV-2, HIV, Ebola, influenza, and Zika have increased over the past decade, underlining the need for rapid development of diagnostics and vaccines. Hence, the rational identification of biomarkers for diagnostic measures on the one hand, and of antigenic targets for vaccine development on the other, is of utmost importance. Peptide microarrays can display large numbers of putative target proteins translated into overlapping linear (and cyclic) peptides for multiplexed, high-throughput antibody analysis. This has enabled, for example, the identification of discriminant/diagnostic epitopes in Zika or influenza and the mapping of epitope evolution in natural infections versus vaccinations. In this review, we highlight synthesis platforms that facilitate the fast and flexible generation of high-density peptide microarrays. We further outline the multifaceted applications of these peptide array platforms for the development of serological tests and vaccines to quickly counter pandemic threats.

    Earth resources data processing center study. Volume 2 - Study findings Final report

    Basic objectives and requirements of the Earth Resources Program
