
    Augmenting data warehousing architectures with Hadoop

    Dissertation presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Information Systems and Technologies Management. As the volume of available data increases exponentially, traditional data warehouses struggle to transform this data into actionable knowledge. Data strategies that include the creation and maintenance of data warehouses have much to gain by incorporating technologies from the Big Data spectrum. Hadoop, as a transformation tool, can add a theoretically infinite dimension of data processing, feeding transformed information into traditional data warehouses that will ultimately retain their value as central components in organizations' decision support systems. This study explores the potential of Hadoop as a data transformation tool in the setting of a traditional data warehouse environment. Hadoop's execution model, oriented toward distributed parallel processing, offers great capabilities when the amount of data to be processed requires the infrastructure to expand. Horizontal scalability, a key aspect of a Hadoop cluster, allows processing power to grow in proportion to the volume of data. Using Hive on Tez in a Hadoop cluster, this study transforms television viewing events, extracted from Ericsson's Mediaroom Internet Protocol Television infrastructure, into pertinent audience metrics such as Rating, Reach and Share. These measurements are then made available in a traditional data warehouse, supported by a traditional Relational Database Management System, where they are presented through a set of reports. The main contribution of this research is a proposed augmented data warehouse architecture in which the traditional ETL layer is replaced by a Hadoop cluster, running Hive on Tez, to perform the heaviest transformations that convert raw data into actionable information. Through a typification of the SQL statements responsible for the data transformation processes, we show that Hadoop and its distributed processing model deliver outstanding performance in the analytical layer, namely in the aggregation of large data sets. Ultimately, we demonstrate empirically the performance gains that Hadoop offers over an RDBMS in terms of speed, storage usage and scalability potential, and suggest how this can be used to evolve data warehouses into the age of Big Data.
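    The audience metrics named above reduce to simple aggregations over viewing events, which is why they parallelize so well in Hive on Tez. As a hedged illustration only (the event schema, universe size and metric definitions below are simplified assumptions, not the thesis's actual Mediaroom data model), the aggregation pattern looks like this:

```python
from collections import defaultdict

# Hypothetical, simplified viewing events: (viewer_id, channel, seconds_viewed).
# Real Mediaroom events carry far more detail; this only illustrates the
# aggregation pattern that the Hive-on-Tez layer performs at scale.
events = [
    ("v1", "ch1", 600), ("v1", "ch2", 300),
    ("v2", "ch1", 1200), ("v3", "ch2", 900),
]
universe = 10          # total measurable audience (assumed)
period_seconds = 3600  # length of the measurement window (assumed)

def audience_metrics(events, universe, period_seconds):
    viewers = set()
    per_channel = defaultdict(float)
    total = 0.0
    for viewer, channel, secs in events:
        viewers.add(viewer)
        per_channel[channel] += secs
        total += secs
    reach = len(viewers) / universe              # distinct viewers / universe
    rating = (total / period_seconds) / universe # avg concurrent audience / universe
    share = {ch: t / total for ch, t in per_channel.items()}  # slice of all viewing
    return reach, rating, share
```

    In the proposed architecture the same GROUP BY-style aggregation is expressed in HiveQL and executed across the cluster, which is where the horizontal scalability pays off.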

    Application architecture and typical cases of big data technology in health management of civil aircraft system

    Against the background of the big data era, the concepts and technologies of civil aircraft health management have been continuously innovated. This paper surveys the progress of the health management technologies of mainstream manufacturers and airlines. Based on an analysis of operational data, it proposes a big data platform architecture for civil aircraft health management applications. In addition, a case applying QAR (Quick Access Recorder) data to predict a typical fault of the air conditioning system is introduced, and the effectiveness of an AR (autoregressive) model for short-term trend prediction is verified.
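    As a sketch of the short-term trend prediction mentioned above: an AR model predicts the next value of a series from its recent past. The minimal AR(1) fit below is purely illustrative; the paper's actual model order and QAR parameters are not reproduced here.

```python
# Illustrative AR(1) fit: x[t] ≈ phi * x[t-1], with phi estimated by
# least squares over a short sensor series. Data are hypothetical.
def fit_ar1(series):
    """Least-squares estimate of the AR(1) coefficient phi."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(x * x for x in series[:-1])
    return num / den

def predict_next(series):
    """One-step-ahead prediction from the fitted AR(1) model."""
    return fit_ar1(series) * series[-1]
```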

    Model-based Approaches to Privacy Compliance

    In the last decade, information technologies have developed dramatically, and the data harvested via the Internet has grown rapidly. This technological change has a negative impact on privacy due to the sensitivity of the data collected and shared without adequate control or monitoring. The General Data Protection Regulation (GDPR) of the European Union has been in effect for more than three years, limiting how organizations collect, manage, and handle personal data. The GDPR poses both new challenges and opportunities for technological institutions. In this work, we address various aspects of privacy and propose approaches that can overcome some challenges of the GDPR. We focus on improving two currently adopted approaches to leverage them to enforce some of the GDPR's requirements by design. The first part of this work is devoted to developing an access control model that effectively captures the nature of information accessed and shared in online social networks (OSNs), which can raise serious problems for users' privacy. One privacy risk arises from accessing and sharing co-owned data items: when a user posts a data item that involves other users, those users' privacy might be disclosed. Another risk is caused by the privacy settings offered by OSNs, which in general do not allow fine-grained enforcement. We propose a collaborative access control framework to deal with such privacy issues, and present a proof-of-concept implementation of our approach. In the second part of the thesis, we adopt Data Flow Diagrams (DFDs) as a convenient representation for integrating privacy engineering activities into software design. DFDs are inadequate as a modeling tool for privacy, and they need to evolve into a privacy-aware approach. The first gap we address is the automatic insertion of privacy requirements during design. Secondly, since DFDs have a hierarchical structure, we propose a refinement framework for DFDs that preserves structural and functional properties as well as the underlying privacy concepts. Finally, we take a step towards modeling privacy properties, in particular purpose limitation, in DFDs, by defining a mathematical framework that elaborates how the purpose of a DFD should be specified, verified, or inferred. We provide proof-of-concept tools for all the proposed frameworks and evaluate them through case studies.
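    To illustrate the kind of check a privacy-aware DFD enables, the sketch below models flows as labelled edges and flags personal data moving without a declared purpose. The representation and field names are hypothetical simplifications, not the thesis's actual formalism.

```python
# Hypothetical DFD as a list of labelled flows; a simple purpose-limitation
# check in the spirit described above. All names here are illustrative.
flows = [
    {"src": "User", "dst": "AppServer", "data": "email",
     "purpose": "account-creation"},
    {"src": "AppServer", "dst": "Analytics", "data": "email",
     "purpose": None},  # personal data leaving with no declared purpose
]

PERSONAL_DATA = {"email", "location"}  # assumed classification

def unverified_flows(flows):
    """Return flows that move personal data without a declared purpose."""
    return [f for f in flows if f["data"] in PERSONAL_DATA and not f["purpose"]]
```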

    Enhancing and integration of security testing in the development of a microservices environment

    In the last decade, web application development has been moving toward the adoption of Service-Oriented Architecture (SOA). In line with this trend, Software as a Service (SaaS) and Serverless providers are embracing DevOps with the latest tools to facilitate the creation, maintenance and scalability of microservices system configuration. Even within this trend, however, security remains an open point that is too often underestimated. Many companies still think of security as a set of controls to be checked before the software goes to production. In reality, security needs to be taken into account along the entire Software Development Lifecycle (SDL). In this thesis, state-of-the-art security recommendations for microservice architecture are reviewed, and useful improvements are proposed. The main goal is for security to become better integrated into a company's workflow, increasing security awareness and simplifying the integration of security measures throughout the SDL. With this background, best practices and recommendations are compared with what companies are currently doing to secure their service-oriented infrastructures, the assumption being that there is still much ground to cover security-wise. Lastly, a small case study is presented as proof of how small and dynamic startups can be the front runners of high cybersecurity standards. The results of the analysis show that it is easier to integrate up-to-date security measures in a small company.

    Optimising a defence-aware threat modelling diagram incorporating a defence-in-depth approach for the internet-of-things

    Modern technology has proliferated into just about every aspect of life while improving its quality. IoT technology, for instance, improves significantly over traditional systems, offering convenience, time savings, cost savings, and security benefits. However, the security weaknesses associated with IoT technology can pose a significant threat to the people using it. A smart doorbell, for example, can make household life easier, save time and money, and provide surveillance security; nevertheless, its security weaknesses could be exploited by a criminal and endanger the life and money of the household. IoT technology is constantly advancing, expanding, and rapidly becoming ubiquitous in modern society, and this increased usage and technological advancement create security weaknesses that attract cybercriminals looking to satisfy their agendas. Perfect security solutions do not exist in the real world, because modern systems are continuously improving while intruders keep trying various techniques to discover security flaws and bypass existing security controls. Threat modelling is therefore a great starting point for understanding the threat landscape of a system and its weaknesses. The threat modelling field in computer science has been significantly advanced by various frameworks for identifying threats and addressing them for mitigation. However, most mature threat modelling frameworks were designed for traditional IT systems: they consider only software-related weaknesses and do not address physical attributes. This approach may not be practical for IoT technology, which inherits both software and physical security weaknesses. Scholars have nonetheless applied mature frameworks such as STRIDE to IoT technology, because mature frameworks still embody security concepts that remain significant for modern technology. Mature frameworks therefore cannot be ignored, but they are not sufficient to address the threats associated with modern systems. As a solution, this research aims to extract the significant security concepts of mature threat modelling frameworks and use them to build a robust IoT threat modelling framework. The study selected fifteen threat modelling frameworks from the research literature, together with the defence-in-depth security concept, to extract threat modelling techniques. Three independent reviews were then conducted to discover valuable threat modelling concepts and their usefulness for IoT technology. The first review found that integrating the software-centric, asset-centric, attacker-centric and data-centric threat modelling approaches with defence-in-depth is valuable and delivers distinct benefits; PASTA and TRIKE demonstrated all four approaches under the classification scheme. The second review identified the features of a threat modelling framework that achieve a high level of satisfaction of the defence-in-depth security architecture; under the evaluation criteria, the PASTA framework scored highest. Finally, the third review surveyed systematic IoT threat modelling techniques in recent research; STRIDE emerged as the most popular framework, while other frameworks demonstrated capabilities valuable to IoT technology. Based on these findings, this study introduces Defence-aware Threat Modelling (DATM), an IoT threat modelling framework built on the identified threat modelling and defence-in-depth security concepts. The steps of the DATM framework are described with figures for better understanding. A smart doorbell case study is then threat-modelled with the DATM framework for validation, and the outcome of the case study is assessed against the findings of the three reviews, validating the framework. The outcome of this thesis is helpful for researchers who want to conduct threat modelling in IoT environments and design novel threat modelling frameworks suitable for IoT technology.
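    The starting point of most STRIDE-style analyses, which DATM extends to cover physical attributes as well, can be sketched as crossing every asset with every threat category. The assets and their classification below are hypothetical examples, not output of the DATM framework itself:

```python
# Illustrative only: enumerate the raw STRIDE threat space for a smart
# doorbell, mixing software and physical assets as an IoT analysis must.
STRIDE = ["Spoofing", "Tampering", "Repudiation", "Information disclosure",
          "Denial of service", "Elevation of privilege"]

assets = {
    "camera-feed": {"kind": "software"},
    "doorbell-housing": {"kind": "physical"},
    "cloud-api": {"kind": "software"},
}

def enumerate_threats(assets):
    """Cross every asset with every STRIDE category: the raw threat space
    that a per-asset analysis then prunes and prioritises."""
    return [(name, category) for name in assets for category in STRIDE]
```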

    Analysing Integration and Information Security: Enterprise Service Bus Solution for Smart Grid

    Electricity is the lifeline of modern society. Without major improvements and new technology, the current electric grid cannot meet the future demand for safe, reliable, sustainable, and affordable electricity. A proposed solution is the Smart Grid that utilises advanced information and communication technologies (ICT). The Smart Grid will help to change the ways electricity is produced and consumed. This thesis focuses on two important areas in the Smart Grid: the integration of existing and new information systems, and the information security of the integration solutions. The Smart Grids and Energy Markets (SGEM) is a project for extensive research on the future of electric energy. As part of the SGEM project, this thesis focuses on the integration of information systems within the distribution domain. Earlier research suggests that concepts such as Service-Oriented Architecture (SOA), Enterprise Service Bus (ESB), and Common Information Model (CIM) are essential for a successful Smart Grid integration. The goal of this work was to study these topics and to provide an integration component to be used in a concrete demonstration environment. The theoretical background section consists of research on various integration architectures and their characteristics, and provides details of their functionality and performance. The integration landscape includes an introduction to the Smart Grid, the electricity distribution domain and related information systems, and the most important standards in the field. An introduction is provided to Microsoft BizTalk Server, the integration platform used in this project. Information security is a key aspect that cross-cuts the entire work. A specific section for related information security aspects is included for each of the discussed topics. The experimental part of this work started from an example ICT architecture and three use cases as described previously within the SGEM project. 
The use cases are analysed in detail using a data flow approach to define the specific integration and information security requirements. A BizTalk-based demonstration environment was designed and implemented; it will serve as a foundation for future work and allow for the integration of other parts of the example architecture. The main result of this work is that, although SOA, ESB, and CIM are beneficial concepts, they are no silver bullet for integration issues. Further, they fundamentally change the approach to information security; this is particularly true of service orientation. BizTalk offers a viable platform for integration but, as an ESB, has certain limitations that must be carefully considered. A guideline for implementing these concepts is offered to aid future integration work. It can be used to lower the barriers to collaboration between experts in the fields of electricity, integration, and information security, whose cooperation is crucial for building the secure, reliable, and efficient integration that the Smart Grid requires.

Electric energy is vital to the functioning of modern society. In the future, ever more safe, reliable, environmentally sustainable, and sufficiently affordable electric energy will be needed, and the current electric grid requires significant development and improvement to meet these needs. The proposed solution is the intelligent electric grid, the Smart Grid: the goal is to develop new ways of producing and consuming electricity by making extensive use of information and communication technologies in the grid's implementation. This thesis addresses two topics important to the Smart Grid: the integration of information systems and information security. The Smart Grids and Energy Markets (SGEM) project conducts broad research into the future of electric energy. As part of the SGEM project, this thesis focuses on the integration of the information systems used in managing the electricity distribution network, and on the related information security. Earlier research has identified a service-oriented architecture based on an enterprise service bus, together with an information model common to all parties, as the most important elements of an integration solution. The goal of this work is to provide concrete guidance and examples of applying these concepts, demonstrating the reference architecture proposed earlier in the project by building a test environment and implementing the required integration solution in it. One main objective was to study integration theory and the different architectures, and to present the essential differences in their functionality and performance. Many vendors offer software platforms on which practical implementations of the different integration architectures can be built; a second main objective was to evaluate one such product, Microsoft BizTalk Server, through a detailed analysis and the construction of a BizTalk-based demonstration environment. The aim was to run simple tests in this environment and to create a foundation for future testing; the BizTalk environment must allow new systems to be integrated later. Information security must be taken into account at every stage of the integration process; it is thus a theme that cuts across the entire work and receives particular emphasis. The first part of the thesis presents the theoretical background and the operating environment. The second chapter briefly introduces the operation of the electric grid to readers without a background in electrical engineering; an essential part is the treatment of the Smart Grid's information security aspects. As an environment, the Smart Grid is a unique combination of traditional IT and automation systems, and owing to its scale and complexity it is an unprecedentedly challenging environment for information security. The special characteristics of automation systems, such as real-time requirements, must also be taken into account in the design and implementation of security. The third chapter covers the evolution of integration and its architectures, introducing the concepts essential to this work: Service-Oriented Architecture (SOA) and the Enterprise Service Bus (ESB), together with the key differences between an ESB and more traditional Enterprise Application Integration (EAI). Issues affecting the testing and selection of middleware are reviewed, as is security: service orientation in particular brings great changes, since many security mechanisms used in traditional application architectures are no longer applicable. The fourth chapter first presents the research problem and the operating environment, namely the many information systems of the electricity distribution network and the communication needs between them. The most important information systems of a Distribution System Operator (DSO) and the Common Information Model (CIM) are briefly introduced, and the most important standards and recommendations are reviewed, since they play an essential role in the development of any large and complex system; the viewpoints considered are the Smart Grid, integration in general, and information security in the Smart Grid. Finally, data flows and Data Flow Diagrams (DFDs) are introduced; they offer a good basis for analysing the data transfer needs between systems and also facilitate the analysis of information security requirements. The integration solution used in this work, Microsoft BizTalk Server, is introduced in the fifth chapter, which briefly describes what BizTalk does, what it can be used for, and how it is implemented technically. BizTalk is fundamentally a message broker, and presenting the components and logic that implement message delivery gives a good picture of its operation and possible uses. In addition, BizTalk's installation, application development, runtime environment and administration are briefly covered. BizTalk was originally developed as an EAI product, but with the ESB Toolkit extension it can also serve as the basis for building an ESB; the development and functionality of the ESB Toolkit are reviewed, and finally BizTalk's security features are discussed. Like many middleware and integration products, BizTalk is a complex software suite; deep familiarity with it requires considerable experience, and within a single thesis it can be presented only superficially. The second part of the thesis describes the example architecture, the test environment that was built, and the three use cases on which the testing was based; the architecture and use cases build on earlier results of the SGEM project. The purpose of the test environment is to implement part of the reference architecture, the particular goal of this work being the implementation of a BizTalk-based service bus acting as the integration component. The test environment therefore does not include all parts of the reference architecture, and it must be possible to add new systems to it later. The use cases serve as examples, and new use cases must be testable with the demonstration environment in the future. The testing part is based on a detailed analysis of the use cases and on implementing them to the extent possible in the test environment. The analysis started from the data transfer needs between the systems to be integrated in each use case, illustrated with data flow diagrams; data flows are also a useful aid in analysing the information security risks and requirements related to integration. The third part of the thesis presents the results and conclusions. Building the test environment and analysing the use cases revealed that the overall system still has major gaps. The integration component, BizTalk, was installed and simple tests were run with it; the implementation of the use cases remained incomplete, partly because many of the environment's other systems were not available, yet the analysis phase alone exposed many problem areas. The identified problems and the related development proposals are covered use case by use case in the seventh chapter. The eighth chapter presents a guideline, based on the lessons learned from the use case analysis, that can be applied when designing future use cases; together with the theory of the BizTalk chapter and the installed BizTalk environment, it facilitates further development of the environment and makes it easier to analyse and design new use cases, and thus to build secure integration. Secure and functional integration of the distribution network's information systems is a key factor in realising the Smart Grid. A service-oriented architecture, an enterprise service bus and a common information model can offer solutions to the integration challenges, but the conclusion is that they require significant changes both in ways of thinking and in how software and integration are implemented. They are not silver bullets for integration, nor components that can simply be glued on top of an existing architecture to solve its integration problems. Moreover, service orientation in particular undermines many long-established security practices and demands new thinking in security solutions as well. It is essential to understand how an ESB differs from more traditional integration solutions and to compare these implementation options against the requirements set for the integration. A DSO's information systems are monolithic and will not become service-based overnight; the field also evolves slowly, partly because of the criticality of grid operation, and the operating environment will remain relatively stable even if future changes come faster than before. In such an environment a traditional, monolithic message broker may also be a good integration solution. Integration solutions are evolving towards service orientation and the use of a dynamic service bus, but the significant changes required by a practical implementation must be understood and accounted for. Based on this work, adopting an ESB-based, service-oriented integration solution in the electricity distribution environment requires considerable further development. The theoretical part of the thesis serves as an introduction to the topic, and the guideline process developed as its result provides a basis for developing the practical implementation.
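    The message-broker pattern at the core of both BizTalk and an ESB can be reduced to a publish/subscribe bus that decouples the integrated systems. The sketch below is a deliberately minimal illustration; the message types are hypothetical, and none of BizTalk's adapters, pipelines or orchestration appear here.

```python
# Minimal illustration of the message-broker pattern: publishers and
# subscribers never reference each other, only the bus and a message type.
class Bus:
    def __init__(self):
        self.routes = {}  # message type -> list of handler callables

    def subscribe(self, msg_type, handler):
        self.routes.setdefault(msg_type, []).append(handler)

    def publish(self, msg_type, payload):
        """Deliver payload to every subscriber; return delivery count."""
        delivered = 0
        for handler in self.routes.get(msg_type, []):
            handler(payload)
            delivered += 1
        return delivered
```

    The decoupling is the point: a new system is integrated by subscribing to existing message types rather than by wiring point-to-point connections, which is what an ESB offers over traditional EAI.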

    CIB W115 Green Design Conference: Sarajevo, Bosnia and Herzegovina, 27 - 30 September 2012


    Threat Assessment and Risk Analysis (TARA) for Interoperable Medical Devices in the Operating Room Inspired by the Automotive Industry

    Prevailing trends in the automotive and medical device industries, such as life-cycle-spanning configurability, connectivity, and automation, require an adaptation of development processes, especially regarding their security and safety. The changing requirements imply that interfaces are more exposed to the outside world, making them more vulnerable to cyberattacks or data leaks. Consequently, not only do development processes need to be revised, but cybersecurity countermeasures and a focus on safety, as well as privacy, have become vital. While vehicles are especially exposed to cybersecurity and safety risks, the medical device industry faces similar issues. In the automotive industry, proposals and draft regulations exist for security-related risk assessment processes. The medical device industry, which has less experience with these topics and is more heterogeneous, may benefit from drawing inspiration from these efforts. We examined and compared current standards, processes, and methods in both the automotive and medical industries. Based on the safety and security requirements for risk analysis in the medical device industry, we propose adopting methods already established in the automotive industry, and we present an example based on an interoperable Operating Room table (OR table).
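    A common building block of automotive TARA-style methods is a risk matrix combining attack feasibility and impact. The sketch below illustrates the idea only; the scales and thresholds are hypothetical examples, not values taken from the paper or from any standard.

```python
# Hypothetical TARA-style risk rating: combine attack feasibility and
# impact (each on an assumed 1..4 scale) into a coarse risk class.
def risk_level(feasibility, impact):
    """Return a risk class for a threat scenario; thresholds are illustrative."""
    score = feasibility * impact
    if score >= 12:
        return "critical"
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"
```

    In practice each threat scenario (e.g. tampering with an OR table's motion interface) would be rated this way, and the risk class drives the choice of countermeasures.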

    The BrightEyes-TTM: an open-source time-tagging module for fluorescence lifetime imaging microscopy applications

    The aim of this Ph.D. work is to show how an open-source, multi-channel, standalone time-tagging device was developed, validated and used in combination with a new generation of single-photon array detectors to pursue super-resolved, time-resolved fluorescence lifetime imaging measurements. Among time-resolved fluorescence laser scanning microscopy (LSM) techniques, fluorescence lifetime imaging microscopy (FLIM) plays a relevant role in the life sciences thanks to its ability to detect functional changes within the cellular micro-environment. Recent advancements in photon detection technologies, such as the introduction of asynchronous read-out single-photon avalanche diode (SPAD) array detectors, allow imaging a fluorescent sample with spatial resolution below the diffraction limit while giving access to the single-photon information content needed for time-resolved FLIM measurements. Thus, super-resolved FLIM experiments can be accomplished using SPAD array detectors in combination with pulsed laser sources and special data acquisition systems (DAQs) capable of handling a multiplicity of inputs and dealing with the single-photon readouts generated by SPAD array detectors. The commercial market currently lacks a true standalone, multi-channel, single-board, time-tagging and affordable DAQ device specifically designed for super-resolved FLIM experiments, and the scientific community has not yet built a device that fills this gap. That is why, within this Ph.D. project, an open-source, low-cost device, the BrightEyes-TTM (time-tagging module), was developed and validated both for fluorescence lifetime and for time-resolved measurements in general. The BrightEyes-TTM belongs to a niche of DAQ devices called time-to-digital converters (TDCs). Field-programmable gate array (FPGA) technology was chosen for implementing the BrightEyes-TTM thanks to its reprogrammability and low cost. The literature reports several different FPGA-based TDC architectures; the differential delay-line TDC architecture turned out to be the most suitable for this project, as it offers an optimal trade-off between temporal precision, temporal range, temporal resolution, dead-time, linearity, and FPGA resources, all crucial characteristics for a TDC device. The goal of a cost-effective, further-upgradable, open-source time-tagging device was achieved: the BrightEyes-TTM was developed and assembled from low-cost, commercially available electronic development kits, allowing the architecture to be easily reproduced. It was deployed on an FPGA development board equipped with a USB 3.0 chip for communicating with a host processing unit and a multi-input/output custom-built interface card for interconnecting the TTM with the outside world. Licence-free software was used for acquiring, reconstructing and analyzing the BrightEyes-TTM time-resolved data. To characterize its performance and, at the same time, validate the developed multi-channel TDC architecture, the TTM was first tested on a bench and then integrated into a fluorescence LSM system. Yielding 30 ps single-shot precision and linearity performance suitable for actual FLIM measurements, the BrightEyes-TTM, which also proved able to acquire data from many channels in parallel, was ultimately used with a SPAD array detector to perform fluorescence imaging and spectroscopy on biological systems. As the output of this Ph.D. work, the BrightEyes-TTM was released on GitHub as a fully open-source project with two aims. The principal aim is to give any microscopy and life science laboratory the possibility to implement and further develop single-photon-based time-resolved microscopy techniques. The second aim is to trigger the interest of the microscopy community and establish the BrightEyes-TTM as a new standard for single-photon FLSM and FLIM experiments.
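    What the time-tagged data enables can be illustrated with a toy lifetime estimate: for an ideal mono-exponential decay, the maximum-likelihood lifetime is simply the mean photon delay relative to the excitation pulse. The sketch below is a simplified illustration with synthetic data; real FLIM analysis must also account for the instrument response function, background and TDC bin widths.

```python
import random

# Toy FLIM sketch: photon arrival times (ns, relative to the laser pulse)
# as a TTM would record them, drawn from an assumed mono-exponential decay.
def estimate_lifetime(arrival_times_ns):
    """Maximum-likelihood lifetime for an ideal mono-exponential decay:
    the sample mean of the photon delays."""
    return sum(arrival_times_ns) / len(arrival_times_ns)

random.seed(42)
true_tau = 3.0  # ns, hypothetical fluorophore lifetime
photons = [random.expovariate(1.0 / true_tau) for _ in range(100_000)]
tau_hat = estimate_lifetime(photons)
```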

    FPGA based Embedded System to control an electric vehicle and the driver assistance systems

    This Master's thesis involves the development of an FPGA-based embedded system for controlling an electric vehicle built on a kart platform, together with its electronic driving aids. The hardware-software co-design process consists of two distinct stages. The first is hardware development, which includes all the elements of the processor's periphery and the communication elements, all developed in VHDL; an important part of this stage is the development of the electronic driving aids, namely traction control and a torque-vectoring differential, as hardware coprocessors, also written in VHDL. The other part of the co-design is the development of the control software to be executed by the embedded system's processor. This work will be used in a range of new electric vehicles to be built in the near future and also provides a basis for future theses in the fields of automotive engineering, electronics and computing.
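    As a hedged sketch of the torque-vectoring idea implemented in such coprocessors: a torque request is split between the wheels as a function of steering angle, shifting torque toward the outer wheel in a turn. The gain, units and clamping below are illustrative assumptions, not values from the thesis, and the fixed-point VHDL implementation would of course look quite different.

```python
# Illustrative torque-vectoring split. Convention (assumed): positive
# steering angle = right turn, so torque shifts to the outer (left) wheel.
def split_torque(total_nm, steering_deg, gain=0.005):
    """Split a torque request between left and right wheels.
    The shift is clamped so neither wheel receives negative torque."""
    shift = max(-0.5, min(0.5, gain * steering_deg))
    left = total_nm * (0.5 + shift)
    right = total_nm * (0.5 - shift)
    return left, right
```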