
    Analysis of Single Board Architectures Integrating Sensors Technologies

    Development boards, Single-Board Computers (SBCs) and Single-Board Microcontrollers (SBMs) integrating sensors and communication technologies have become a very popular solution in the last decade. They are of interest for their simplicity, versatility, adaptability, and ease of use and prototyping, which allow them to serve as a starting point for projects and as a reference for all kinds of designs. They are increasingly used in innumerable applications integrating sensors and communication technologies, including robotics, domotics, testing and measurement, Do-It-Yourself (DIY) projects, Internet of Things (IoT) devices in the home or workplace, and education in STEAM (Science, Technology, Engineering, Arts and Mathematics) skills in both the academic and hobbyist worlds. The interest in single-board architectures and their applications has led all major electronics manufacturers to develop low-cost single-board platform solutions. In this paper we present an analysis of the most important topics related to single-board architectures integrating sensors. We analyse the most popular platforms against characteristics such as cost, processing capacity, integrated processing technology and open-source licensing, as well as power consumption (mA@V), reliability (%), programming flexibility, support availability and electronics utilities. For the evaluation, an experimental framework was designed and implemented with six sensors (temperature, humidity, CO2/TVOC, pressure, ambient light and CO) and different data storage and monitoring options: locally on a µSD (Micro Secure Digital) card, on a cloud server, on a web server, or on a mobile application. This research was partially supported by the Centro Científico Tecnológico de Huelva (CCTH), University of Huelva.
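A framework of the kind described — periodic sampling of several sensors with fan-out to local and remote storage — can be sketched in a few lines. This is an illustrative sketch only; the `SensorLogger` class and the stand-in sensor callables are hypothetical, not the authors' implementation:

```python
import csv
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class Reading:
    timestamp: float
    sensor: str     # e.g. "temperature", "humidity", "co2"
    value: float
    unit: str

class SensorLogger:
    """Toy sampler: polls a set of sensor callables and fans the
    readings out to storage back-ends (a local CSV file here; a
    cloud/web/mobile sink would receive the same JSON payloads)."""

    def __init__(self, sensors, csv_path):
        self.sensors = sensors          # {name: (read_fn, unit)}
        self.csv_path = csv_path

    def sample_once(self):
        readings = [Reading(time.time(), name, fn(), unit)
                    for name, (fn, unit) in self.sensors.items()]
        # Local storage: append one CSV row per reading.
        with open(self.csv_path, "a", newline="") as f:
            writer = csv.writer(f)
            for r in readings:
                writer.writerow([r.timestamp, r.sensor, r.value, r.unit])
        # Remote storage would POST these JSON payloads instead.
        return [json.dumps(asdict(r)) for r in readings]

# Fake sensor callables standing in for real temperature/humidity devices.
logger = SensorLogger({"temperature": (lambda: 21.5, "degC"),
                       "humidity": (lambda: 40.2, "%RH")},
                      "samples.csv")
payloads = logger.sample_once()
print(len(payloads))   # → 2
```

The same `sample_once` loop, run on a timer, is all a minimal SBC/SBM logging firmware needs; only the sink changes between the µSD, cloud, web and mobile options.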

    Security and trust in cloud computing and IoT through applying obfuscation, diversification, and trusted computing technologies

    Cloud computing and the Internet of Things (IoT) are widely spread and commonly used technologies nowadays. The advanced services offered by cloud computing have made it a highly demanded technology, and enterprises and businesses rely more and more on the cloud to deliver services to their customers. The prevalent use of the cloud means that more data is stored outside the organization's premises, which raises concerns about the security and privacy of the stored and processed data. This highlights the significance of effective security practices to secure the cloud infrastructure. The number of IoT devices is growing rapidly, and the technology is being employed in a wide range of sectors including smart healthcare, industry automation, and smart environments. These devices collect and exchange a great deal of information, some of which may contain critical and personal data of the users of the device. Hence, it is highly important to protect the data collected and shared over the network; notwithstanding, studies show that attacks on these devices are increasing, while a high percentage of IoT devices lack proper security measures to protect the devices, the data, and the privacy of the users. In this dissertation, we study the security of cloud computing and IoT and propose software-based security approaches, supported by hardware-based technologies, to provide robust measures for enhancing the security of these environments. To achieve this goal, we use obfuscation and diversification as the software security techniques. Code obfuscation protects the software from malicious reverse engineering, and diversification mitigates the risk of large-scale exploits. We study trusted computing and Trusted Execution Environments (TEE) as the hardware-based security solutions. The Trusted Platform Module (TPM) provides security and trust through a hardware root of trust and assures the integrity of a platform.
We also study Intel SGX, a TEE solution that guarantees the integrity and confidentiality of the code and data loaded into its protected container, the enclave. More precisely, through obfuscation and diversification of the operating systems and APIs of IoT devices, we secure them at the application level, and through obfuscation and diversification of the communication protocols, we protect the communication of data between them at the network level. To secure cloud computing, we employ obfuscation and diversification techniques on the client-side cloud software. For an enhanced level of security, we employ the hardware-based security solutions TPM and SGX. In addition to security, these solutions ensure layered trust across the various layers from the hardware up to the application. As a result of this PhD research, this dissertation addresses a number of security risks targeting IoT and cloud computing through the delivered publications, and presents a brief outlook on future research directions.
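The diversification idea above — giving each device instance its own mapping of internal API symbol names so that a single hard-coded exploit does not transfer across a fleet — can be illustrated with a toy sketch. This is not the dissertation's tooling; the function name and the seed-based scheme are illustrative assumptions:

```python
import random
import string

def diversify_symbols(symbols, seed):
    """Derive a per-device renaming of internal API symbols.
    Each device is built with a different seed, so symbol names
    baked into a mass-produced exploit fail to resolve elsewhere."""
    rng = random.Random(seed)
    mapping = {}
    for name in symbols:
        # Deterministic per-seed alias; legitimate callers are
        # rebuilt against the same mapping at diversification time.
        mapping[name] = "".join(rng.choices(string.ascii_lowercase, k=12))
    return mapping

api = ["read_sensor", "send_packet", "update_firmware"]
device_a = diversify_symbols(api, seed=1)
device_b = diversify_symbols(api, seed=2)

# Same interface on both devices, but different concrete names.
assert set(device_a) == set(device_b) == set(api)
assert device_a["send_packet"] != device_b["send_packet"]
```

The point of the sketch is the security property, not the renaming mechanism: an attacker who reverse-engineers one device learns nothing reusable about another device's interface.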

    Design And Evaluation Of Wireless Technology Systems For Data Analysis

    Dissertation. The internet and cloud storage are becoming increasingly important to researchers, hobbyists and commercial developers, including for the reliable transmission of data as remote sensors and IoT devices become more common. The availability of high-speed internet connections, such as fibre-optic cable, LTE and digital radio, has changed the playing field and enabled users to transmit data to cloud storage quickly. With these various technologies available, the question arises: which technology is more reliable and efficient for IoT sensors and for users transmitting data to a cloud server? This project investigates the reliability and transmission delay of data transmitted from Wi-Fi, GPRS Class 10 and digital radio networks to cloud storage. A sampling unit was designed to read analogue inputs periodically and send the recorded data over the three technologies under test. It also records the data, together with an index number, to an on-board micro SD card. The systems then transmit the sampled data and index number to a cloud storage server via the communication technologies under test. The cloud-stored data is compared with the recorded data of the sampler unit to determine data integrity, and the transmission delays are calculated from the cloud storage server's timestamp and the original timestamp of each data message. The results showed that digital radio is a very reliable and stable means of data communication, but it lacks a direct connection to the internet. Although both Wi-Fi and GPRS Class 10 are permanently connected to the internet, Wi-Fi connectivity was observed to be susceptible to external factors such as interruptions in supply from the national power grid and load shedding.
The XBee digital radio system lost 0.21% of packets, compared to 0.31% for Wi-Fi and 1.46% for GPRS Class 10. On the other hand, although GPRS Class 10 may be somewhat less reliable than digital radio and Wi-Fi, it is relatively cheap to use and can connect to multiple communication towers for redundancy. The outcome of this research may help researchers, hobbyists and commercial developers make a better-informed decision about the technology to use for their particular projects.
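The two evaluation metrics used above — packet loss from gaps in the index sequence, and transmission delay from timestamp pairs — reduce to a simple comparison of the on-board SD-card log against the cloud-side log. A minimal sketch with made-up numbers (not the study's data):

```python
def loss_and_delay(sent, received):
    """sent: {index: send_timestamp} from the sampler's SD-card log.
    received: {index: server_timestamp} from the cloud storage log.
    Returns (loss percentage, mean transmission delay in seconds)."""
    lost = [i for i in sent if i not in received]
    loss_pct = 100.0 * len(lost) / len(sent)
    delays = [received[i] - sent[i] for i in received if i in sent]
    mean_delay = sum(delays) / len(delays) if delays else float("nan")
    return loss_pct, mean_delay

sent = {1: 0.0, 2: 10.0, 3: 20.0, 4: 30.0}
received = {1: 1.2, 2: 11.1, 4: 31.3}      # index 3 lost in transit
loss, delay = loss_and_delay(sent, received)
print(loss, round(delay, 2))   # → 25.0 1.2
```

In practice the two clocks must be synchronised (e.g. via NTP) for the delay figure to be meaningful; the loss figure needs only the index numbers.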

    Fully Programming the Data Plane: A Hardware/Software Approach

    Software-Defined Networking (SDN) has emerged in recent years as a new network paradigm to de-ossify communication networks. By offering a clear separation of network concerns between the management, control, and data planes, SDN allows each of these planes to evolve independently, breaking the rigidity of traditional networks. However, while well established in the control and management planes, this de-ossification has only recently reached the data plane with the advent of packet processing languages, e.g. P4, and novel programmable switch architectures, e.g. the Protocol Independent Switch Architecture (PISA). In this work, we focus on leveraging the PISA architecture, mainly by exploiting FPGA capabilities for efficient packet processing. We address the problem at three abstraction levels: i) microarchitectural; ii) programming; and iii) architectural. At the microarchitectural level, we have proposed an efficient FPGA-based packet parser architecture, the parser being a major component of PISA. The proposed packet parser follows a feed-forward pipeline architecture whose internal microarchitecture has been meticulously optimized for FPGA implementation. The architecture is automatically generated by a P4-to-C++ compiler after several rounds of graph optimizations. The proposed solution achieves a 100 Gb/s line rate with latency comparable to that of hand-written packet parsers, and throughput scales from 10 Gb/s to 160 Gb/s with a moderate increase in resource consumption. Both the compiler and the packet parser codebase have been open-sourced to permit reproducibility. At the programming level, we have proposed a novel High-Level Synthesis (HLS) design methodology aiming at jointly improving software and hardware quality, and we employed this methodology when designing the packet parser.
In our work, we have exploited features of modern C++ that improve both code modularity and readability while keeping (or improving) the quality of the generated hardware. Design examples using our methodology, including the packet parser, have been publicly released.
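The feed-forward parsing model that PISA-style parsers implement — each stage extracts one header and selects the next parser state from a field value, with no backtracking — can be mimicked in a few lines. This is an illustration of the parse-graph idea only, not the thesis's generated C++:

```python
import struct

def parse_ethernet(frame: bytes):
    """One parse-graph stage: extract the Ethernet header, then
    branch on EtherType to decide which parser state comes next."""
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
    headers = {"eth": {"dst": dst.hex(), "src": src.hex(),
                       "type": ethertype}}
    if ethertype == 0x0800:            # transition: next state is IPv4
        version_ihl = frame[14]
        headers["ipv4"] = {"version": version_ihl >> 4,
                           "ihl": version_ihl & 0xF}
    return headers

# Minimal IPv4-over-Ethernet frame: zeroed MACs, EtherType 0x0800,
# then the first IPv4 byte (version=4, IHL=5).
frame = bytes(6) + bytes(6) + b"\x08\x00" + b"\x45"
h = parse_ethernet(frame)
print(h["eth"]["type"], h["ipv4"]["version"])   # → 2048 4
```

In hardware, each such state becomes a pipeline stage operating on a fixed slice of the packet, which is what makes the feed-forward structure amenable to line-rate FPGA implementation.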

    Fundamental Approaches to Software Engineering

    This open access book constitutes the proceedings of the 24th International Conference on Fundamental Approaches to Software Engineering, FASE 2021, which took place during March 27–April 1, 2021, and was held as part of the Joint Conferences on Theory and Practice of Software, ETAPS 2021. The conference was planned to take place in Luxembourg but changed to an online format due to the COVID-19 pandemic. The 16 full papers presented in this volume were carefully reviewed and selected from 52 submissions. The book also contains 4 Test-Comp contributions.

    Knowledge management in learning software SMMEs in KwaZulu-Natal, South Africa.

    Doctor of Philosophy in Information Studies, University of KwaZulu-Natal, Pietermaritzburg, 2017. The study investigated the nature and causes of software development failures, and the knowledge management practices adopted to mitigate those failures, in small, micro, and medium software developing enterprises (SMMEs) in the province of KwaZulu-Natal, South Africa. The study adopted an interpretive, qualitative multiple case study approach. Twelve software development SMMEs were involved in the study. Interviews were conducted with 12 information technology (IT)/software development project managers and eight software developers identified through purposive sampling, and qualitative content analysis was used to analyse and interpret the data. The findings reveal that software development SMMEs in KwaZulu-Natal experience software development failures. Ten causes of failure were identified: bureaucracy in IT departments, compatibility issues, complacency of developers, involvement of the wrong people in the planning stages of projects, a lack of detailed documentation, lack of resources, lack of user commitment/non-adoption of systems, miscommunication/misrepresentation of requirements, unrealistic customer expectations, and work overload. The results also indicate that software organisations and individual software developers experience knowledge gaps during the course of their work. Six knowledge management practices are adopted by the organisations and individual developers to fill these gaps: knowledge acquisition, creation, storage, sharing, organisation and application. These practices are supported by internet technologies such as blogs, wikis, search engines, social networks and organisational databases, and by computer hardware such as servers and personal computers.
The study reveals two important knowledge management practices that are ignored by software organisations: post-mortem reviews, which are essential in software development, and formal training of developers. The findings further reveal that knowledge management has enabled the organisations and individual developers to save time, retain their intellectual property (IP), and become more efficient and effective at knowledge reuse. Organisations face a number of knowledge management-related challenges: a lack of formal knowledge management procedures, difficulty protecting knowledge, expensive knowledge storage, increasing information needs, a lack of time to fully adopt knowledge management practices, difficulty finding information, and the ever-changing nature of knowledge. The study concluded that software development failures are prevalent in software SMMEs and that the organisations have informally adopted knowledge management. Moreover, knowledge management has brought benefits to the organisations, but the role it plays in eliminating project failures is not clear. It is recommended that software organisations consider formally adopting knowledge management, so that knowledge management specialists can be employed to drive knowledge management initiatives and help conduct post-mortem reviews and staff training. In addition, further research is recommended to investigate the role of knowledge management in reducing or eliminating software project failures. Quantitative studies are also recommended to objectively measure the benefits brought by knowledge management, such as how much time and cost are saved by adopting it. The study contributes to theory and to practice in the software development industry.
Theoretically, the study developed and used a conceptual framework, drawn from software engineering and knowledge management, that can be used to investigate knowledge management activities in organisations. The study also contributes to the existing body of knowledge on software learning organisations from a developing-country perspective. It is envisaged that software development organisations will adopt the recommendations proffered to improve their knowledge management practices.

    Next generation automotive embedded systems-on-chip and their applications

    It is a well-known fact in the automotive industry that critical and costly delays in the development cycle of powertrain[1] controllers are unavoidable due to the complex nature of the systems-on-chip used in them. The primary goal of this portfolio is to show the development of new methodologies for the fast and efficient implementation of next-generation powertrain applications and the associated automotive-qualified systems-on-chip. A general guideline for rapid automotive application development, promoting the integration of the necessary state-of-the-art tools and techniques, is presented. The methods developed in this portfolio demonstrate a new and better approach to the co-design of automotive systems that also raises the level of design abstraction.

    An integrated business plan for the development of a camless engine controller platform is presented. The plan provides details of the marketing plan, management and financial data.

    A comprehensive real-time system-level development methodology for the implementation of an electromagnetic-actuator-based camless internal combustion engine is developed. The proposed development platform enables developers to complete complex software and hardware development before moving to silicon, significantly shortening the development cycle and improving confidence in the design.

    A novel high-performance internal combustion engine knock processing strategy using the next-generation automotive system-on-chip is presented, particularly highlighting the capabilities of the first-of-its-kind single-instruction-multiple-data micro-architecture. A patent application has been filed for the methodology, and the details of the invention are also presented.

    Enhancements required for the optimisation of several resource properties — such as memory accesses, energy consumption and execution time — of embedded powertrain applications running on the developed system-on-chip and its next generation of devices are proposed.
The approach used allows various software segments to be replaced by hardware units to speed up processing.

    [1] Powertrain: the group of components used to transmit engine power to the driving wheels. It can consist of the engine, clutch, transmission, universal joints, drive shaft, differential gear, and axle shafts.
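Knock processing of the kind described is, at its core, windowed signal-energy computation over vibration or cylinder-pressure samples — a multiply-accumulate over many samples at once, which is exactly the data-parallel pattern a single-instruction-multiple-data datapath accelerates. A vectorized sketch of that pattern (illustrative only; this is not the patented method):

```python
import numpy as np

def knock_intensity(signal, window):
    """Per-window mean signal energy: the same element-wise
    square-and-accumulate a SIMD micro-architecture performs
    on whole vectors of samples per instruction."""
    n = len(signal) // window
    frames = signal[:n * window].reshape(n, window)
    return (frames ** 2).mean(axis=1)   # one energy value per window

# Synthetic sensor trace: low-level noise with a burst injected
# into window 4 to stand in for a knock event.
rng = np.random.default_rng(0)
sig = rng.normal(0.0, 0.1, 1024)
sig[512:640] += np.sin(np.linspace(0.0, 40.0 * np.pi, 128))

energy = knock_intensity(sig, window=128)
print(int(np.argmax(energy)))   # → 4
```

A real strategy would band-pass filter around the knock resonance frequency before the energy step, but the per-window reduction shown here is the part that maps naturally onto SIMD lanes.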