12 research outputs found

    Zynq SoC based acceleration of the lattice Boltzmann method

    Cerebral aneurysm is a life-threatening condition: a weakness in a blood vessel that may enlarge and bleed into the surrounding area. To understand the surrounding conditions during interventions or surgical procedures, a simulation of blood flow in the cerebral arteries is needed. One effective simulation approach is the lattice Boltzmann (LB) method. Due to the computational complexity of the algorithm, the simulation is usually performed on high-performance computers. In this paper, efficient hardware architectures of the LB method on a Zynq system-on-chip (SoC) are designed and implemented. The proposed architectures were first simulated in the Vivado HLS environment and later implemented on a ZedBoard using the software-defined SoC (SDSoC) development environment. In addition, a set of evaluations of different hardware architectures of the LB implementation is discussed. The experimental results show that the proposed implementation accelerates the processing speed by a factor of 52 compared to a dual-core ARM processor-based software implementation.
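
    The abstract does not show the kernel itself; the following is a minimal sketch of the D2Q9 collide-and-stream update that LB accelerators typically implement, written with NumPy for clarity. The relaxation time and grid layout are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

# Standard D2Q9 lattice: 9 discrete velocities and their weights.
C = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
W = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def lb_step(f, tau=0.6):
    """One BGK collide-and-stream update on a periodic NX x NY grid.

    f has shape (9, NX, NY); tau is an illustrative relaxation time.
    """
    rho = f.sum(axis=0)                                   # macroscopic density
    u = np.einsum('qi,qxy->ixy', C, f) / rho              # macroscopic velocity

    cu = np.einsum('qi,ixy->qxy', C, u)                   # c_q . u per direction
    usq = (u ** 2).sum(axis=0)
    feq = W[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

    f_post = f - (f - feq) / tau                          # BGK collision
    for q in range(9):                                    # streaming
        f_post[q] = np.roll(f_post[q], shift=tuple(C[q]), axis=(0, 1))
    return f_post
```

    A hardware implementation would unroll the nine directions and pipeline the per-node update, which is what makes the method attractive for FPGA acceleration; the sketch only shows the arithmetic structure.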

    Encryption by Heart (EbH) – Using ECG for time-invariant symmetric key generation

    Wearable devices are a part of the Internet-of-Things (IoT) that may offer valuable data about the user wearing them. This paper explores the use of ElectroCardioGram (ECG) records to encrypt user data. Previous attempts have shown that ECG can be taken as a basis for key generation. However, these approaches do not consider time-invariant keys. Time invariance enables using the so-created keys for symmetrically encrypting data (e.g. smartphone pictures) and decrypting them later with the key derived from the current ECG readings. This paper addresses this challenge by proposing EbH, a mechanism for persistent key generation based on ECG. EbH produces seeds from which encryption keys are generated. Experimental results over 24 h for 199 users show that EbH, under certain settings, can produce permanent seeds (and thus time-invariant keys) computed on the fly and different for each user; up to 95.97% of users produce unique keys. In addition, EbH can be tuned to produce seeds of different lengths (up to 300 bits) and with variable min-entropy (up to 93.51). All this supports the workability of EbH in a real setting. (C) 2017 Elsevier B.V. All rights reserved. Funding: This work was supported by the MINECO grants TIN2013-46469-R (SPINY: Security and Privacy in the Internet of You) and TIN2016-79095-C2-2-R (SMOG-DEV); by the CAM grant S2013/ICE-3095 (CIBERDINE: Cybersecurity, Data, and Risks), which is co-funded by European Funds (FEDER); and by the Programa de Ayudas para la Movilidad of Carlos III University of Madrid, Spain (J. M. de Fuentes and L. Gonzalez-Manzano grants). Data used for this research was provided by the Telemetric and ECG Warehouse (THEW) of the University of Rochester, NY.
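
    The abstract states that EbH produces a time-invariant seed from which encryption keys are generated, but not how the key itself is derived. The sketch below assumes an HKDF-SHA256 expansion of the seed into a 256-bit symmetric key; the KDF choice, the info label, and the sample seed are all illustrative assumptions, not the paper's construction.

```python
import hashlib
import hmac

def hkdf_sha256(seed: bytes, info: bytes = b"EbH-demo", length: int = 32) -> bytes:
    """Expand a (time-invariant) seed into a fixed-length symmetric key.

    Minimal HKDF (RFC 5869) with an all-zero salt; EbH itself only specifies
    that keys are generated from the ECG-based seed, not which KDF is used.
    """
    prk = hmac.new(b"\x00" * 32, seed, hashlib.sha256).digest()        # extract
    okm, t, counter = b"", b"", 1
    while len(okm) < length:                                           # expand
        t = hmac.new(prk, t + info + bytes([counter]), hashlib.sha256).digest()
        okm += t
        counter += 1
    return okm[:length]

# Hypothetical 300-bit seed recovered from ECG features (placeholder, not real data).
seed_bits = "101" * 100
seed = int(seed_bits, 2).to_bytes((len(seed_bits) + 7) // 8, "big")
key = hkdf_sha256(seed)          # stays constant as long as the seed is reproduced
```

    Because the seed is reproducible from later ECG readings, the same key can be re-derived on demand, which is what allows symmetric encryption now and decryption later without storing the key.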

    The Extent and Coverage of Current Knowledge of Connected Health: Systematic Mapping Study

    Background: This paper examines the development of the Connected Health research landscape with a view to providing a historical perspective on existing Connected Health research. Connected Health has become a rapidly growing research field as our healthcare system faces pressure to become more proactive and patient-centred. Objective: We aimed to identify the extent and coverage of the current body of knowledge in Connected Health. With this, we want to identify which topics have drawn the attention of Connected Health researchers, and whether there are gaps or interdisciplinary opportunities for further research. Methods: We used a systematic mapping study that combines scientific contributions from research in medicine, business, computer science and engineering. We analysed the papers against seven classification criteria: publication source, publication year, research type, empirical type, contribution type, research topic, and the condition studied in the paper. Results: Altogether, our search resulted in 208 papers, which were analysed by a multidisciplinary group of researchers. Our results indicate a slow start for Connected Health research but a steady upswing since 2013. The majority of papers proposed healthcare solutions (37%) or evaluated Connected Health approaches (23%). Case studies (28%) and experiments (26%) were the most popular forms of scientific validation employed. Diabetes, cancer, multiple sclerosis, and heart conditions are among the most prevalent conditions studied. Conclusions: We conclude that Connected Health seems to be an established field of research that has been growing strongly during the last five years. The focus is more on technology-driven research, with a strong contribution from medicine, while business aspects of Connected Health are less studied.

    Recent Advances in Embedded Computing, Intelligence and Applications

    The recent proliferation of Internet of Things deployments and edge computing, combined with artificial intelligence, has led to exciting new application scenarios in which embedded digital devices are essential enablers. Moreover, new powerful and efficient devices are appearing to cope with workloads formerly reserved for the cloud, such as deep learning. These devices allow processing close to where data are generated, avoiding bottlenecks due to communication limitations. The efficient integration of hardware, software and artificial intelligence capabilities deployed in real sensing contexts empowers the edge intelligence paradigm, which will ultimately foster the offloading of processing functionalities to the edge. In this Special Issue, researchers have contributed nine peer-reviewed papers covering a wide range of topics in the area of edge intelligence. Among them are hardware-accelerated implementations of deep neural networks, IoT platforms for extreme edge computing, neuro-evolvable and neuromorphic machine learning, and embedded recommender systems.

    A Model-based Design Framework for Application-specific Heterogeneous Systems

    The increasing heterogeneity of computing systems enables higher performance and power efficiency. However, these improvements come at the cost of increasing the overall complexity of designing such systems. These complexities include constructing implementations for various types of processors, setting up and configuring communication protocols, and efficiently scheduling the computational work. The process for developing such systems is iterative and time-consuming, with no well-defined performance goal. Current performance estimation approaches use source code implementations that require experienced developers and time to produce. We present a framework to aid in the design of heterogeneous systems and the performance tuning of applications. Our framework supports system construction: integrating custom hardware accelerators with existing cores into processors, integrating processors into cohesive systems, and mapping computations to processors to achieve overall application performance and efficient hardware usage. It also facilitates effective design space exploration using processor models (for both existing and future processors) that do not require source code implementations to estimate performance. We evaluate our framework using a variety of applications and implement them in systems ranging from low-power embedded systems-on-chip (SoC) to high-performance systems consisting of commercial-off-the-shelf (COTS) components. We show how the design process is improved, reducing the number of design iterations and unnecessary source code development, ultimately leading to higher-performing, efficient systems.
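
    The abstract says performance is estimated from processor models without source code, but does not describe the models themselves. The sketch below shows one common form such a model could take, a roofline-style bound computed from operation and traffic counts; the class name, parameters, and numbers are hypothetical, not the framework's actual models.

```python
from dataclasses import dataclass

@dataclass
class ProcessorModel:
    """Hypothetical analytic model of a target processor (no source code needed)."""
    name: str
    peak_gflops: float      # peak compute throughput, GFLOP/s
    mem_bw_gbs: float       # sustainable memory bandwidth, GB/s

def estimate_runtime_s(model: ProcessorModel, flops: float, bytes_moved: float) -> float:
    """Roofline-style bound: the kernel is limited by compute or by memory traffic."""
    compute_time = flops / (model.peak_gflops * 1e9)
    memory_time = bytes_moved / (model.mem_bw_gbs * 1e9)
    return max(compute_time, memory_time)

# Compare candidate mappings of one kernel during design space exploration.
candidates = [ProcessorModel("embedded-SoC-core", 4.0, 6.4),
              ProcessorModel("COTS-accelerator", 500.0, 200.0)]
for m in candidates:
    t = estimate_runtime_s(m, flops=2e9, bytes_moved=1.6e9)
    print(f"{m.name}: ~{t * 1e3:.1f} ms")
```

    During design space exploration, each candidate mapping of a kernel can be scored this way before any implementation exists.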

    Towards trustworthy computing on untrustworthy hardware

    Historically, hardware was thought to be inherently secure and trusted due to its obscurity and the isolated nature of its design and manufacturing. In the last two decades, however, hardware trust and security have emerged as pressing issues. Modern-day hardware is surrounded by threats manifested mainly in undesired modifications by untrusted parties in its supply chain, unauthorized and pirated selling, injected faults, and system- and microarchitectural-level attacks. These threats, if realized, are expected to push hardware to abnormal and unexpected behaviour, causing real-life damage and significantly undermining our trust in the electronic and computing systems we use in our daily lives and in safety-critical applications. A large number of detective and preventive countermeasures have been proposed in the literature. Our knowledge of the potential consequences of real-life threats to hardware trust is, however, lacking, given the limited number of real-life reports and the plethora of ways in which hardware trust could be undermined. With this in mind, run-time monitoring of hardware combined with active mitigation of attacks, referred to as trustworthy computing on untrustworthy hardware, is proposed as the last line of defence. This last line of defence allows us to face the issue of live hardware mistrust rather than turning a blind eye to it or being helpless once it occurs. This thesis proposes three different frameworks towards trustworthy computing on untrustworthy hardware. The presented frameworks are adaptable to different applications, independent of the design of the monitored elements, based on autonomous security elements, and computationally lightweight. The first framework is concerned with explicit violations and breaches of trust at run-time, with an untrustworthy on-chip communication interconnect presented as a potential offender. The framework is based on the guiding principles of component guarding, data tagging, and event verification. The second framework targets hardware elements with inherently variable and unpredictable operational latency and proposes a machine-learning-based characterization of these latencies to infer undesired latency extensions or denial-of-service attacks. The framework is implemented on a DDR3 DRAM after showing its vulnerability to obscured latency-extension attacks. The third framework studies the possible deployment of untrustworthy hardware elements in the analog front end, and the consequent integrity issues that might arise at the analog-digital boundary of systems on chip. The framework uses machine learning methods and the unique temporal and arithmetic features of signals at this boundary to monitor their integrity and assess their trust level.
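
    The second framework characterizes normally variable latencies and flags undesired extensions. As a rough illustration of that idea only, the sketch below uses a simple mean/standard-deviation threshold as a stand-in for the machine-learning characterization described in the thesis; the class, the threshold factor, and the latency figures are invented for the example.

```python
import statistics

class LatencyMonitor:
    """Flags operations whose latency exceeds what was learned as normal.

    A plain mean/std threshold stands in for the ML-based characterization
    described in the thesis; all numbers are illustrative.
    """
    def __init__(self, k: float = 4.0):
        self.k = k
        self.mean = 0.0
        self.std = 1.0

    def train(self, normal_latencies_ns):
        # Learn the nominal latency distribution from trusted profiling runs.
        self.mean = statistics.fmean(normal_latencies_ns)
        self.std = statistics.stdev(normal_latencies_ns)

    def is_suspicious(self, latency_ns: float) -> bool:
        return latency_ns > self.mean + self.k * self.std

monitor = LatencyMonitor()
monitor.train([45.0, 47.5, 44.2, 46.8, 45.9, 48.1])   # trusted baseline measurements
print(monitor.is_suspicious(95.0))                     # a possible latency-extension attack
```

    A real deployment would learn the latency distribution per access pattern rather than a single global threshold.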

    SHELDON: Smart habitat for the elderly.

    An insightful document concerning active and assisted living from different perspectives: furniture and habitat, ICT solutions, and healthcare.

    Runtime Hardware Reconfiguration in Wireless Sensor Networks for Condition Monitoring

    The integration of miniaturized heterogeneous electronic components has enabled the deployment of tiny sensing platforms empowered by wireless connectivity, known as wireless sensor networks. Thanks to optimized duty-cycled activity, the energy consumption of these battery-powered devices can be reduced to a level where several years of operation is possible. However, the processing capability of currently available wireless sensor nodes does not scale well with the observation of phenomena requiring a high sampling resolution. The large amount of data generated by the sensors cannot be handled efficiently by low-power wireless communication protocols without a preliminary filtering of the information relevant for the application. For this purpose, energy-efficient, flexible, fast and accurate processing units are required to extract important features from the sensor data and relieve the operating system from computationally demanding tasks. Reconfigurable hardware is identified as a suitable technology to fulfill these requirements, balancing implementation flexibility with performance and energy efficiency. While both the static and dynamic power consumption of field-programmable gate arrays have often been pointed out as prohibitive for very-low-power applications, recent programmable logic chips based on non-volatile memory appear as a potential solution overcoming this constraint. This thesis first verifies this assumption with the help of a modular sensor node built around a field-programmable gate array based on Flash technology. Short and autonomous duty-cycled operation combined with hardware acceleration efficiently drops the energy consumption of the device in the considered context. However, Flash-based devices suffer from restrictions such as long configuration times and limited resources, which reduce their suitability for complex processing tasks. A template of a dynamically reconfigurable architecture built around coarse-grained reconfigurable function units is proposed in a second part of this work to overcome these issues. The module is conceived as an overlay of the sensor node FPGA, increasing the implementation flexibility and introducing a standardized programming model. Mechanisms for virtual reconfiguration tailored for resource-constrained systems are introduced to minimize the overhead induced by this genericity. The definition of this template architecture leaves room for design space exploration and application-specific customization. Nevertheless, this aspect must be supported by appropriate design tools which facilitate and automate the generation of low-level design files. For this purpose, a software tool is introduced to graphically configure the architecture and operation of the hardware accelerator. A middleware service is further integrated into the wireless sensor network operating system to bridge the gap between the hardware and the design tools, enabling remote reprogramming and scheduling of the hardware functionality at runtime. At last, this hardware and software toolchain is applied to real-world wireless sensor network deployments in the domain of condition monitoring. This category of applications often requires the complex analysis of signals in the considered range of sampling frequencies, such as vibrations or electrical currents, making the proposed system ideally suited for the implementation. The flexibility of the approach is demonstrated with examples having heterogeneous algorithmic specifications. Different data processing tasks executed by the sensor node hardware accelerator are modified at runtime according to application requests.
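
    The thesis describes a middleware that swaps the data-processing task running on the node's hardware accelerator at runtime. The sketch below mimics that flow in plain Python: an object standing in for the overlay accelerator is reconfigured between two hypothetical condition-monitoring features (RMS for vibration, peak for current). The class and function names are invented for illustration; in the real system the middleware would write a virtual configuration into the FPGA overlay rather than swap a Python callable.

```python
from typing import Callable, Optional, Sequence

class OverlayAccelerator:
    """Toy stand-in for the coarse-grained reconfigurable overlay."""

    def __init__(self) -> None:
        self._task: Optional[Callable[[Sequence[float]], float]] = None

    def reconfigure(self, task: Callable[[Sequence[float]], float]) -> None:
        # Runtime "virtual reconfiguration": load a new processing function.
        self._task = task

    def process(self, samples: Sequence[float]) -> float:
        assert self._task is not None, "no task configured"
        return self._task(samples)

# Hypothetical feature-extraction tasks for condition monitoring.
def rms(samples: Sequence[float]) -> float:          # vibration monitoring
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def peak(samples: Sequence[float]) -> float:         # electrical-current monitoring
    return max(abs(s) for s in samples)

acc = OverlayAccelerator()
acc.reconfigure(rms)                   # application requests an RMS feature
print(acc.process([0.1, -0.4, 0.3]))
acc.reconfigure(peak)                  # swapped at runtime, no FPGA resynthesis
print(acc.process([0.1, -0.4, 0.3]))
```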

    Real-time wireless communications for open environments (Comunicações sem-fios de tempo-real para ambientes abertos)

    Doctoral thesis in Informatics Engineering. Wireless communication technologies have become widely adopted, appearing in heterogeneous applications ranging from tracking victims, responders and equipment in disaster scenarios to machine health monitoring in networked manufacturing systems. Very often, applications demand a strictly bounded timing response, which, in distributed systems, is generally highly dependent on the performance of the underlying communication technology. These systems are said to have real-time timeliness requirements since data communication must be conducted within predefined temporal bounds, whose violation may compromise the correct behavior of the system and cause economic losses or endanger human lives. The potential adoption of wireless technologies for an increasingly broad range of application scenarios has made the operational requirements more complex and heterogeneous than they were for wired technologies. In parallel with this trend, there is an increasing demand for the provision of cost-effective distributed systems with improved deployment, maintenance and adaptation features. These systems tend to require operational flexibility, which can only be ensured if the underlying communication technology provides both time- and event-triggered data transmission services while supporting online, on-the-fly parameter modification. Generally, wireless-enabled applications have deployment requirements that can only be addressed through the use of batteries and/or energy harvesting mechanisms for power supply. These applications usually have stringent autonomy requirements and demand a small form factor, which hinders the use of large batteries. As the communication support may represent a significant part of the energy requirements of a station, the use of power-hungry technologies is not adequate. Hence, in such applications, low-range technologies have been widely adopted: although they provide lower data rates, they consume just a fraction of the energy of their higher-power counterparts. The timeliness requirements of data communications can, in general, be met by ensuring the availability of the medium for any station initiating a transmission. In controlled (closed) environments this can be guaranteed, as there is a strict regulation of which stations are installed in the area and for which purpose. Nevertheless, in open environments this is hard to control, because no a priori knowledge is available of which stations and technologies may contend for the medium at any given instant. Hence, the support of wireless real-time communications in unmanaged scenarios is a highly challenging task. Wireless low-power technologies have been the focus of a large research effort, for example in the Wireless Sensor Network domain. Although bringing extended autonomy to battery-powered stations, such technologies are known to be negatively influenced by similar technologies contending for the medium and, especially, by technologies using higher-power transmissions over the same frequency bands. A frequency band that is becoming increasingly crowded with competing technologies is the 2.4 GHz Industrial, Scientific and Medical band, encompassing, for example, Bluetooth and ZigBee, two low-power communication standards that are the basis of several real-time protocols.
    Although these technologies employ mechanisms to improve their coexistence, they are still vulnerable to transmissions from uncoordinated stations using similar technologies or higher-power technologies such as Wi-Fi, which hinders the support of dependable wireless real-time communications in open environments. The Wireless Flexible Time-Triggered Protocol (WFTT) is a master/multi-slave protocol that builds on the flexibility and timeliness provided by the FTT paradigm and on the deterministic medium capture and maintenance provided by the bandjacking technique. This dissertation presents the WFTT protocol and argues that it allows supporting wireless real-time communication services with high dependability requirements in open environments where multiple contention-based technologies may dispute the medium access. It further claims that it is feasible to provide wireless communications that are at once flexible and timely in open environments. The WFTT protocol was inspired by the FTT paradigm, from which higher-layer services such as admission control have been ported. After realizing that bandjacking was an effective technique to ensure medium access and maintenance in open environments crowded with contention-based communication technologies, it was recognized that the mechanism could be used to devise a wireless medium access protocol bringing the features offered by the FTT paradigm to the wireless domain. The performance of the WFTT protocol is reported in this dissertation with a description of the implemented devices, the test-bed, and a discussion of the obtained results.
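
    WFTT keeps the FTT structure of master-scheduled elementary cycles: the master broadcasts a trigger message listing which registered streams may transmit in the cycle's synchronous window, after the medium has been captured through bandjacking. The sketch below shows only that trigger-message scheduling step; the Stream fields, the window length, and the ordering policy are illustrative assumptions rather than the protocol's actual parameters.

```python
from dataclasses import dataclass
from typing import List, Sequence

@dataclass
class Stream:
    """A time-triggered message stream registered with the master (illustrative)."""
    ident: int
    period_ecs: int      # period, in elementary cycles
    tx_time_us: int      # air time of one transmission

def build_trigger_message(streams: Sequence[Stream], ec_index: int,
                          sync_window_us: int) -> List[int]:
    """Select the streams that fit into this elementary cycle's synchronous window."""
    budget = sync_window_us
    scheduled = []
    for s in sorted(streams, key=lambda s: s.period_ecs):    # shorter periods first
        if ec_index % s.period_ecs == 0 and s.tx_time_us <= budget:
            scheduled.append(s.ident)
            budget -= s.tx_time_us
    return scheduled

streams = [Stream(1, 1, 300), Stream(2, 2, 500), Stream(3, 4, 800)]
for ec in range(4):                        # four consecutive elementary cycles
    print(ec, build_trigger_message(streams, ec, sync_window_us=1000))
```

    Admission control, ported from FTT, would reject new streams whose worst-case demand cannot fit the window; that step is omitted here.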

    Proceedings of the VI Jornadas Nacionales (JNIC2021 LIVE)

    This conference has become a meeting forum for the most relevant actors in the field of cybersecurity in Spain. It not only presents some of the leading scientific work in the various areas of cybersecurity, but also pays special attention to training and educational innovation in cybersecurity, as well as to the connection with industry through technology transfer proposals. So much so that, this year, some modifications to the operation and development of the Transfer Programme are presented, designed with the intention of improving it and making it more valuable for the whole cybersecurity research community.