
    Driving the Network-on-Chip Revolution to Remove the Interconnect Bottleneck in Nanoscale Multi-Processor Systems-on-Chip

    The sustained demand for faster, more powerful chips has been met by the availability of chip manufacturing processes allowing for the integration of increasing numbers of computation units onto a single die. The resulting outcome, especially in the embedded domain, has often been called SYSTEM-ON-CHIP (SoC) or MULTI-PROCESSOR SYSTEM-ON-CHIP (MP-SoC). MPSoC design brings to the foreground a large number of challenges, one of the most prominent of which is the design of the chip interconnection. With the number of on-chip blocks presently ranging in the tens, and quickly approaching the hundreds, the novel issue of how best to provide on-chip communication resources is clearly felt. NETWORKS-ON-CHIP (NoCs) are the most comprehensive and scalable answer to this design concern. By bringing large-scale networking concepts to the on-chip domain, they guarantee a structured answer to present and future communication requirements. The point-to-point connection and packet-switching paradigms they involve also help minimize wiring overhead and physical routing issues. However, as with any technology of recent inception, NoC design is still an evolving discipline. Several main areas of interest require deep investigation for NoCs to become viable solutions:
    • The design of the NoC architecture needs to strike the best trade-off among performance, features, and the tight area and power constraints of the on-chip domain.
    • Simulation and verification infrastructure must be put in place to explore, validate, and optimize NoC performance.
    • NoCs offer a huge design space, thanks to their extreme customizability in terms of topology and architectural parameters. Design tools are needed to prune this space and pick the best solutions.
    • Even more so given their global, distributed nature, it is essential to evaluate the physical implementation of NoCs to assess their suitability for next-generation designs and their area and power costs.
    This dissertation performs a design space exploration of network-on-chip architectures in order to point out the trade-offs associated with the design of each individual network building block and with the design of the network topology overall. The design space exploration is preceded by a comparative analysis of state-of-the-art interconnect fabrics with each other and with early network-on-chip prototypes. The ultimate objective is to point out the key advantages that NoC realizations provide with respect to state-of-the-art communication infrastructures, and the challenges that lie ahead in order to make this new interconnect technology a reality. Among the latter, technology-related challenges are emerging that call for dedicated design techniques at all levels of the design hierarchy, in particular leakage power dissipation and the containment of process variations and their effects. The achievement of the above objectives was enabled by a NoC simulation environment for cycle-accurate modelling and simulation, and by a back-end facility for the study of NoC physical implementation effects. Overall, all the results provided by this work have been validated on actual silicon layout.
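
    The packet-switching paradigm mentioned above implies that every switch in the network makes a local routing decision for each packet. As a minimal illustration, the C sketch below implements dimension-ordered (XY) routing on a 2D mesh; the mesh topology, port names, and coordinates are assumptions made for this example and are not taken from the dissertation's architecture.

        #include <stdio.h>

        /* Output ports of a 2D-mesh switch as seen by dimension-ordered (XY) routing. */
        typedef enum { PORT_LOCAL, PORT_EAST, PORT_WEST, PORT_NORTH, PORT_SOUTH } port_t;

        /* Minimal XY routing: travel along X until the destination column is reached,
         * then along Y.  (cur_x, cur_y) is the switch holding the packet,
         * (dst_x, dst_y) is the destination tile. */
        static port_t xy_route(int cur_x, int cur_y, int dst_x, int dst_y)
        {
            if (dst_x > cur_x) return PORT_EAST;
            if (dst_x < cur_x) return PORT_WEST;
            if (dst_y > cur_y) return PORT_NORTH;
            if (dst_y < cur_y) return PORT_SOUTH;
            return PORT_LOCAL;                /* arrived: eject to the attached core */
        }

        int main(void)
        {
            static const char *name[] = { "local", "east", "west", "north", "south" };
            int x = 0, y = 0;                 /* route a packet from tile (0,0) to (2,1) */
            while (!(x == 2 && y == 1)) {
                port_t p = xy_route(x, y, 2, 1);
                printf("at (%d,%d) -> forward %s\n", x, y, name[p]);
                if (p == PORT_EAST) x++; else if (p == PORT_WEST) x--;
                else if (p == PORT_NORTH) y++; else y--;
            }
            return 0;
        }

    XY routing is a common baseline in mesh NoCs because it is deadlock-free without virtual channels, which illustrates the kind of trade-off (simplicity versus path diversity) that a topology and architecture design-space exploration has to weigh.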

    Doctor of Philosophy

    Precise optical neural stimulation is an essential element in the use of optogenetics to elicit predictable neural action potentials within the brain, but accessing specific neocortical layers, light scattering, columniation, and ease of tissue damage pose unique challenges to the device engineer. This dissertation presents the design, simulation, microfabrication, and characterization of the Utah Optrode Array (UOA) for precise neural tissue targeting through three main objectives:
    1. Maskless wafer-level microfabrication of optical penetrating neural arrays out of soda-lime glass: the Utah Optrode Array.
    2. Utah Optrode Array customization using stereotactic brain atlases and 3D CAD modeling for optogenetic neocortical interrogation in small rodents and nonhuman primates.
    3. Single-optrode characterization of the UOA for neocortical illumination.
    Maskless microfabrication techniques were used to create 169 individual 9 × 9, 3.85 mm × 3.85 mm arrays with 1.1 mm long optrodes from a single two-inch glass wafer. The 9 × 9 UOA was too large for precise targeting of the upper layers of the cortex in smaller animals such as mice, so an array customization method was developed using SolidWorks and off-the-shelf brain atlases to create 8 × 6, 3.45 mm × 2.45 mm arrays with 400 μm long optrodes. Stereotactic atlases were imported into SolidWorks, splined, and lofted together to create a single 3D CAD model of a specific region of interest in the brain. Chronic and acute brain-trauma assessments showed excellent results for the 8 × 6 arrays in C57BL/6 wild-type mice (Mus musculus) and macaque monkey (Macaca fascicularis). Simulation, characterization, and radiometric testing of a single optrode of the 9 × 9 array were necessary to prove the ability to transmit light directly to specific tissue. Zemax optical design software was used to predict the light-transmission capabilities, and these predictions were compared to bench-top measurements. Insertion loss was both predicted and measured to be 3.7 dB. Power budgeting showed that 9% of the light was lost at the interfaces of the UOA's backplane and tip in air, and 48% was lost through back-scattering, leaving 43% transmitted through the optrode with no measurable taper loss. Scanning electron microscopy showed small amounts of devitrification of the glass, and atomic force microscopy showed an average surface roughness of 13.5 nm and a root-mean-square roughness of 20.6 nm. The output beam was profiled in fluorescein dye, with a total divergence angle of 63° and a cross-over distance to adjacent beams of 255 μm.
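
    As a quick consistency check of the reported figures, the short C program below relates the 3.7 dB insertion loss to the roughly 43% transmitted fraction using the standard definition IL = -10 log10(P_out / P_in); the formula is textbook radiometry, not a method taken from the dissertation.

        #include <math.h>
        #include <stdio.h>

        int main(void)
        {
            /* Insertion loss in dB relates input and output optical power:
             *   IL_dB = -10 * log10(P_out / P_in)                            */
            double transmitted = 0.43;            /* fraction left after the power budget */
            printf("43%% transmission     -> %.2f dB insertion loss\n",
                   -10.0 * log10(transmitted));   /* about 3.67 dB */

            double frac = pow(10.0, -3.7 / 10.0); /* fraction implied by 3.7 dB */
            printf("3.7 dB insertion loss -> %.1f%% transmitted\n", 100.0 * frac);
            return 0;
        }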

    Cross-core Microarchitectural Attacks and Countermeasures

    In the last decade, multi-threaded systems and resource sharing have brought a number of technologies that facilitate our daily tasks in a way we never imagined. Among others, cloud computing has emerged to offer us powerful computational resources without having to physically acquire and install them, while smartphones have acquired almost the same importance that desktop computers had a decade ago. This has only been possible thanks to the ever-evolving performance optimizations made to modern microarchitectures to efficiently manage concurrent usage of hardware resources. One of these optimizations is the use of shared Last Level Caches (LLCs) to balance different CPU core loads and to maintain coherency between shared memory blocks utilized by different cores; the latter, for instance, has enabled concurrent execution of several processes on low-RAM devices such as smartphones. Although efficient hardware resource sharing has become the de facto model for several modern technologies, it also poses a major security concern. Some of the concurrently executed co-resident processes might in fact be malicious and try to take advantage of hardware proximity. New technologies usually claim to be secure by implementing sandboxing techniques and executing processes in isolated software environments, called Virtual Machines (VMs). However, the design of these isolated environments aims at preventing purely software-based attacks and usually does not consider hardware leakage. In fact, the malicious utilization of hardware resources as covert channels might have severe consequences for the privacy of customers. Our work demonstrates that malicious customers of such technologies can utilize the LLC as a covert channel to obtain sensitive information from a co-resident victim. We show that the LLC is an attractive resource for attackers to target, as it offers high resolution and, unlike previous microarchitectural attacks, does not require core co-location. Particularly concerning are the cases in which cryptography is compromised, as it is the main component of every security solution. In this sense, the presented work not only introduces three attack variants that are applicable in different scenarios, but also demonstrates the ability to recover cryptographic keys (e.g. AES and RSA) and TLS session messages across VMs, bypassing sandboxing techniques. Finally, two countermeasures are presented to prevent microarchitectural attacks in general, and LLC attacks in particular, from retrieving fine-grained information. Unlike previously proposed countermeasures, ours do not add permanent overhead to the system but can be used as preemptive defenses. The first identifies leakages in cryptographic software that can potentially lead to key extraction, and thus can be used by cryptographic code designers to check the sanity of their libraries before deployment. The second detects microarchitectural attacks embedded in innocent-looking binaries, preventing them from being posted in official application repositories that usually enjoy the full trust of the customer.
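
    For readers unfamiliar with LLC attacks, the C sketch below shows the probe step of a Prime+Probe-style measurement: the attacker re-reads an eviction set that was used to fill one LLC set and counts slow (evicted) lines as evidence of victim activity in that set. It assumes an x86_64 machine with GCC or Clang; the eviction-set construction, timing threshold, and synchronization with the victim are platform-specific and only stubbed out, so this is an illustrative sketch rather than the dissertation's attack code.

        #include <stdint.h>
        #include <stddef.h>
        #include <stdio.h>
        #include <x86intrin.h>   /* __rdtscp, _mm_lfence (GCC/Clang, x86_64) */

        /* Placeholder threshold separating cache hits from LLC misses; a real attack
         * calibrates this value per machine. */
        #define THRESHOLD_CYCLES 120

        /* Time a single load with the timestamp counter. */
        static uint64_t timed_load(volatile uint8_t *p)
        {
            unsigned aux;
            uint64_t t0 = __rdtscp(&aux);
            (void)*p;
            uint64_t t1 = __rdtscp(&aux);
            return t1 - t0;
        }

        /* Probe step of a Prime+Probe-style measurement: re-access the lines that were
         * used to fill (prime) one LLC set and count how many now miss.  A miss means
         * the line was evicted in the meantime, i.e. the victim touched that set. */
        static int probe(volatile uint8_t *eviction_set[], size_t n)
        {
            int evictions = 0;
            for (size_t i = 0; i < n; i++) {
                _mm_lfence();
                if (timed_load(eviction_set[i]) > THRESHOLD_CYCLES)
                    evictions++;
            }
            return evictions;
        }

        int main(void)
        {
            /* Demo only: these eight addresses live in one small buffer and do NOT
             * form a real eviction set; a real cross-core attack derives them from
             * the LLC's set and slice mapping, which is omitted here. */
            static uint8_t buf[8 * 4096];
            volatile uint8_t *set[8];
            for (size_t i = 0; i < 8; i++)
                set[i] = &buf[i * 4096];
            printf("slow lines on first probe: %d\n", probe(set, 8));
            return 0;
        }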

    Foundations and Technological Landscape of Cloud Computing

    The cloud computing paradigm has brought the benefits of utility computing to a global scale and has gained paramount attention in recent years. Companies are seriously considering adopting this new paradigm and expect to receive significant benefits. In fact, the concept of cloud computing is not a revolution in terms of technology; it has been established on the solid ground of virtualization, distributed systems, and web services. To comprehend cloud computing, its foundations and technological landscape need to be adequately understood. This paper provides a comprehensive review of the building blocks of cloud computing and the relevant technological aspects. It focuses on four key areas: architecture, virtualization, data management, and security issues.

    Sustain Water Conservation Behaviors Using Nonparametric Ranking and Social Marketing

    Circulation of water-use feedback and conservation messaging are strategies implemented to reduce water demand on a short-term (seasonal) basis. Often considered less impactful than other tactics such as price increases and usage restrictions, feedback is mostly used by authorities in informational campaigns focused on apprising users of their water use. Such conservation programs have had limited success, which has been attributed to the fact that the information provided in feedback campaigns was generic and did not motivate users enough to sustain their water-saving behaviors. However, the advent of disaggregation technologies that can provide appliance-level water-use data to households can drastically upgrade prior feedback and normative-messaging approaches. Nevertheless, usage information alone is not enough to encourage users to adopt water-saving actions. Message selection should depend on its motivational potency and on users' understanding of water conservation, which requires careful psychological analysis that feedback campaigns often ignore. Using the community-based social marketing strategy, a system that combines usage information and behavioral psychology to develop user-specific interventions, we developed a conceptual framework for a customized strategic messaging system that provides household-specific water-use information and behavior-specific water-saving tips, along with strategic messages to motivate households to adopt and sustain their water-saving habits. The results of this study will be beneficial to water managers who intend to use feedback and normative messaging for conservation purposes.

    Information Leakage Attacks and Countermeasures

    The scientific community has been consistently working on the pervasive problem of information leakage, uncovering numerous attack vectors and proposing various countermeasures. Despite these efforts, leakage incidents remain prevalent, as the complexity of systems and protocols increases and sophisticated modeling methods become more accessible to adversaries. This work studies how information leakage manifests in and impacts interconnected systems and their users. We first focus on online communications and investigate leakage in the Transport Layer Security (TLS) protocol. Using modern machine learning models, we show that an eavesdropping adversary can efficiently exploit meta-information (e.g., packet size) not protected by TLS encryption to launch fingerprinting attacks at an unprecedented scale, even under non-optimal conditions. We then turn our attention to ultrasonic communications and discuss their security shortcomings and how adversaries could exploit them to compromise users of anonymity networks (even though these networks aim to offer a greater level of privacy than TLS). Following up on these, we delve into physical-layer leakages that concern a wide array of (networked) systems such as servers, embedded nodes, Tor relays, and hardware cryptocurrency wallets. We revisit location-based side-channel attacks and develop a neural network for their exploitation. Our model demonstrates the capabilities of a modern adversary, but also provides an inexpensive tool that auditors can use to detect such leakages early in the development cycle. Subsequently, we investigate techniques that further minimize the impact of leakages found in production components. Our proposed system design distributes both the custody of secrets and the execution of cryptographic operations across several components, thus making the exploitation of leaks difficult.
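
    The fingerprinting results build on the observation that TLS protects payloads but not record or packet lengths. The C sketch below is a deliberately simplified stand-in for the machine-learning classifiers used in the work: it reduces a traffic trace to a normalized histogram of packet sizes and matches it against stored fingerprints by L1 distance. The example traces, the 64-byte bucket width, and the distance metric are illustrative assumptions.

        #include <math.h>
        #include <stdio.h>
        #include <string.h>

        /* TLS encrypts payloads but leaves record/packet lengths visible.  Reduce a
         * trace to a normalized histogram of packet sizes (64-byte buckets) and match
         * it to the closest stored fingerprint by L1 distance, a crude stand-in for
         * the machine-learning classifiers used in the dissertation. */
        #define BUCKETS 24                       /* covers packet sizes up to 1536 bytes */

        static void histogram(const int *sizes, int n, double h[BUCKETS])
        {
            memset(h, 0, BUCKETS * sizeof h[0]);
            for (int i = 0; i < n; i++) {
                int b = sizes[i] / 64;
                h[b >= BUCKETS ? BUCKETS - 1 : b] += 1.0 / n;
            }
        }

        static double l1_distance(const double a[BUCKETS], const double b[BUCKETS])
        {
            double d = 0.0;
            for (int i = 0; i < BUCKETS; i++)
                d += fabs(a[i] - b[i]);
            return d;
        }

        int main(void)
        {
            /* Hypothetical packet-size traces for two known pages and one observation. */
            int page_a[]   = { 512, 1460, 1460, 1460, 128 };
            int page_b[]   = { 256, 256, 640, 128, 128, 128 };
            int observed[] = { 512, 1460, 1460, 1380, 128 };

            double ha[BUCKETS], hb[BUCKETS], ho[BUCKETS];
            histogram(page_a, 5, ha);
            histogram(page_b, 6, hb);
            histogram(observed, 5, ho);

            printf("observed trace is closer to %s\n",
                   l1_distance(ho, ha) < l1_distance(ho, hb) ? "page A" : "page B");
            return 0;
        }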

    KAVUAKA: a low-power application-specific processor architecture for digital hearing aids

    The power consumption of digital hearing aids is very restricted due to their small physical size, and the available hardware resources for signal processing are limited. However, there is a demand for more processing performance to make future hearing aids more useful and smarter. Future hearing aids should be able to detect, localize, and recognize target speakers in complex acoustic environments to further improve the speech intelligibility of the individual hearing aid user. Computationally intensive algorithms are required for this task. To maintain acceptable battery life, the hearing aid processing architecture must be highly optimized for extremely low power consumption and high processing performance. The integration of application-specific instruction-set processors (ASIPs) into hearing aids enables a wide range of architectural customizations to meet the stringent power consumption and performance requirements. In this thesis, the application-specific hearing aid processor KAVUAKA is presented, which is customized and optimized for state-of-the-art hearing aid algorithms such as speaker localization, noise reduction, beamforming, and speech recognition. Specialized and application-specific instructions are designed and added to the baseline instruction set architecture (ISA). Among the major contributions are a multiply-accumulate (MAC) unit for real- and complex-valued numbers, architectures for power reduction during register accesses, co-processors, and a low-latency audio interface. With the proposed MAC architecture, the KAVUAKA processor requires 16 % fewer cycles for the computation of a 128-point fast Fourier transform (FFT) compared to related programmable digital signal processors. The power consumption during register file accesses is decreased by 6 % to 17 % with isolation and bypass techniques. The hardware-induced audio latency is 34 % lower compared to related audio interfaces for a frame size of 64 samples. The final hearing aid system-on-chip (SoC) with four KAVUAKA processor cores and ten co-processors is integrated as an application-specific integrated circuit (ASIC) using a 40 nm low-power technology. The die size is 3.6 mm2. Each of the processors and co-processors contains individual customizations and hardware features, with datapath widths varying between 24 bit and 64 bit. The core area of the 64-bit processor configuration is 0.134 mm2. The processors are organized in two clusters that share memory, an audio interface, co-processors, and serial interfaces. The average power consumption at a clock speed of 10 MHz is 2.4 mW for the SoC and 0.6 mW for the 64-bit processor. Case studies with four reference hearing aid algorithms are used to present and evaluate the proposed hardware architectures and optimizations. The program code for each processor and co-processor is generated and optimized with evolutionary algorithms for operation merging, instruction scheduling, and register allocation. The KAVUAKA processor architecture is compared to related processor architectures in terms of processing performance, average power consumption, and silicon area requirements.
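
    To make the complex-valued MAC contribution concrete, the following C program is a functional model of a complex multiply-accumulate in Q1.15 fixed point with a wide accumulator; the number format, accumulator width, and scaling step are assumptions for illustration and do not describe KAVUAKA's actual datapath.

        #include <stdint.h>
        #include <stdio.h>

        /* Functional model of a complex multiply-accumulate, acc += a * b, in Q1.15
         * fixed point with a 64-bit accumulator per component.  The Q-format and the
         * accumulator width are illustrative assumptions, not KAVUAKA's datapath. */
        typedef struct { int16_t re, im; } cq15_t;     /* Q1.15 complex sample */
        typedef struct { int64_t re, im; } cacc_t;     /* wide accumulator     */

        static void cmac(cacc_t *acc, cq15_t a, cq15_t b)
        {
            acc->re += (int32_t)a.re * b.re - (int32_t)a.im * b.im;
            acc->im += (int32_t)a.re * b.im + (int32_t)a.im * b.re;
        }

        int main(void)
        {
            /* Accumulate a short complex dot product: (0.5 + 0j)(0.5 + 0j)
             * plus (0 + 0.5j)(0 - 0.5j), which equals 0.5 + 0j. */
            cq15_t x[2] = { { 16384, 0 }, { 0,  16384 } };
            cq15_t w[2] = { { 16384, 0 }, { 0, -16384 } };
            cacc_t acc = { 0, 0 };
            for (int i = 0; i < 2; i++)
                cmac(&acc, x[i], w[i]);

            /* Scale the Q2.30 products back to Q1.15 (truncating, no saturation). */
            printf("re=%d im=%d\n", (int)(acc.re >> 15), (int)(acc.im >> 15));
            return 0;
        }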

    Identifying and combating cyber-threats in the field of online banking

    This thesis has been carried out in an industrial environment external to the University, as an industrial PhD. The results of this PhD have been tested, validated, and implemented in the production environment of Caixabank, and have been used as models by others who have followed the same ideas. The most pressing threats against banks across the Internet environment are based on software tools developed by criminal groups: applications running in the web environment, either on the victim's own computer (malware) or on their mobile device through the download of rogue applications (fake apps carrying malware). The method used in this thesis is a qualitative exploratory investigation of the problem, of the responses to it, and of preventive measures against it, such as the authentication systems in use. The method is based on samples, events, surveys, laboratory tests, experiments, and proofs of concept; in short, real data from which the thesis proposal could be derived, using both laboratory research and grounded-theory methods on data from pilot experiments conducted in real environments. I have been researching the various aspects of e-crime along a line of research focused on intrinsically related topics:
    - The methods, means, and systems of attack: malware, families of banker Trojans, malware use cases, and Zeus as a case study.
    - Fixed platforms, mobile devices, and their applications as vehicles for malware attacks.
    - Forensic methods for analyzing malware and its attack infrastructure.
    - Continuous improvement of customer and user authentication methods as a first line of anti-malware defense.
    - The use of biometrics as an innovative authentication factor.
    The line of research on malware and its attack systems is intrinsically related to authentication methods and to the systems used to infect customers (executables, apps, etc.), because the main purpose of the malware is precisely to steal the data entered into the "logon" of the authentication system in order to operate fraudulently and steal money from online banking customers. The experiments on the malware made it possible to establish a novel decryption method that set out guidelines for combating its fraudulent effects, describing its infection scheme and operation. I propose a general methodology to break the encryption of the malware's communications (the keystream), extracting the system used to encrypt those communications, and a generalization of the keystream technique. We show that this methodology can be used to respond to the threat of Zeus, and finally we provide lessons learned, highlighting some general principles of malware in general and of Zeus in particular, and proposing Cronus, an IDS that specifically targets the Zeus malware, testing it experimentally on a production network and discussing its effectiveness against the malware. The thesis reflects a progressive, interrelated evolution of research on malware infection systems and authentication methods, presented cumulatively across the research works, showing an evolution of the research output and seeking a progressive improvement of authentication methods, recommendations for preventing and avoiding infections, a review of the main app stores for mobile financial services, and a proposal for these stores. The most common eIDAMS (electronic identification and authentication methods) implemented in Europe and their robustness are analyzed.
    An analysis of their adequacy is also presented in terms of efficiency, usability, costs, types of operations, and segments, including the possibility of using biometrics as an innovative authentication method.
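
    The keystream methodology mentioned in the abstract rests on the basic stream-cipher relation ciphertext = plaintext XOR keystream (Zeus-family bots are known to use RC4, a stream cipher), so any known or predictable plaintext immediately yields the corresponding keystream bytes. The C sketch below illustrates only that relation with made-up traffic bytes; the thesis's actual recovery procedure is more involved.

        #include <stdio.h>
        #include <stddef.h>
        #include <string.h>

        /* For a stream cipher, C = P XOR K.  If part of the plaintext is known or
         * predictable (e.g. a fixed request header), the keystream bytes at those
         * positions follow directly and can be reused against other captures that
         * were encrypted with the same keystream.  All bytes below are made up. */
        static void xor_bytes(unsigned char *out, const unsigned char *a,
                              const unsigned char *b, size_t n)
        {
            for (size_t i = 0; i < n; i++)
                out[i] = a[i] ^ b[i];
        }

        int main(void)
        {
            const unsigned char known_plain[8] = { 'P', 'O', 'S', 'T', ' ', '/', 'g', 'a' };
            unsigned char keystream[8] = { 0x3a, 0x91, 0x5c, 0x07, 0xee, 0x42, 0xb8, 0x19 };

            unsigned char captured[8];             /* what the bot put on the wire */
            xor_bytes(captured, known_plain, keystream, 8);

            unsigned char recovered[8];            /* attacker side: C XOR P = K   */
            xor_bytes(recovered, captured, known_plain, 8);

            printf("keystream recovered: %s\n",
                   memcmp(recovered, keystream, 8) == 0 ? "yes" : "no");
            return 0;
        }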