540 research outputs found

    Preparing Laboratory and Real-World EEG Data for Large-Scale Analysis: A Containerized Approach.

    Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain-computer interface models. However, the absence of standardized vocabularies for annotating events in a machine-understandable manner, the welter of collection-specific data organizations, the difficulty of moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a "containerized" approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining, and (meta-)analysis. The EEG Study Schema (ESS) comprises three data "Levels," each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. The ESS schema and tools are freely available at www.eegstudy.org, and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org).
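    As a rough illustration of how a study packaged this way might be consumed programmatically, the Python sketch below walks a hypothetical study-description XML file and lists its recordings. The element and attribute names (session, recording, id, file) and the file name are illustrative assumptions, not the actual ESS schema, which is documented at www.eegstudy.org.

# Illustrative sketch only: element/attribute names are assumed, not taken from the ESS schema.
import xml.etree.ElementTree as ET

def list_sessions(study_xml_path):
    """Return (session_id, recording_file) pairs from a study description file."""
    tree = ET.parse(study_xml_path)
    root = tree.getroot()
    sessions = []
    for session in root.iter("session"):            # assumed element name
        session_id = session.get("id")              # assumed attribute name
        for recording in session.iter("recording"): # assumed element name
            sessions.append((session_id, recording.get("file")))
    return sessions

if __name__ == "__main__":
    # "study_description.xml" is a placeholder path for this example.
    for sid, path in list_sessions("study_description.xml"):
        print(f"session {sid}: {path}")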

    Creating a building operating system: a holistic approach

    The purpose of this thesis is to examine the requirements for a building operating system from a holistic perspective. To understand the context of the subject, an extensive literature review was carried out that explores the evolution of operating systems alongside the history of computing, unravelling the concept of an operating system. In addition, various building information systems, including building automation systems and Internet of Things systems, are reviewed in order to understand current and future trends in building technology. Furthermore, the literature review investigates telecommunications and digital identity authentication through their evolution and standardisation towards interoperability, to provide knowledge on how to achieve interoperability in building systems. An interview study was conducted as the empirical part of the work in order to complement its theoretical framework. Twelve building digitalisation experts were interviewed about the current and future state of building systems. More specifically, open systems, open data, platform ownership, disruption, killer applications, user-centredness, and Finland's opportunities were discussed with respect to the building operating system. A building operating system requires connections between the various technologies inside a building, and collaboration between the parties who use and manage the building. The system should exploit open standards and enable open data. User-centred development should be encouraged for the benefit of end users. The system needs to expand globally to achieve critical mass and unleash its full potential as a platform. Buildings with similar properties should offer the same features, so that the same services and applications can be used in any building with an operating system, thus enabling portability. The system requires convenient software development kits, application programming interfaces, and abstractions for the needs of software and service developers. A vibrant developer community is required to expand the platform and enable a wide range of services and applications.

    A decentralized framework for cross administrative domain data sharing

    Federation of messaging and storage platforms located in remote datacenters is an essential functionality for sharing data among geographically distributed platforms. When systems are administered by the same owner, data replication reduces data access latency by bringing data closer to applications and enables fault tolerance for disaster recovery of an entire location. When storage platforms are administered by different owners, data replication across different administrative domains is essential for enterprise application data integration. Contents and services managed by different software platforms need to be integrated to provide richer contents and services. Clients may need to share subsets of data in order to enable collaborative analysis and service integration. Platforms usually include proprietary federation functionalities and specific APIs to let external software and platforms access their internal data. These techniques may not be applicable to all environments and networks due to security and technological restrictions. Moreover, the federation of dispersed nodes under a decentralized administration scheme is still a research issue. This thesis is a contribution along this research direction, as it introduces and describes a framework, called "WideGroups", directed towards the creation and management of an automatic federation and integration of widely dispersed platform nodes. It is based on groups to exchange messages among distributed applications located in different remote datacenters. Groups are created and managed using client-side programmatic configuration without touching servers. WideGroups enables the extension of software platform services to nodes belonging to different administrative domains in a wide area network environment. It lets different nodes form ad-hoc overlay networks on the fly depending on message destinations located in distinct administrative domains. It supports multiple dynamic overlay networks based on message groups, dynamic discovery of nodes, and automatic setup of overlay networks among nodes with no server-side configuration. I designed and implemented platform connectors to integrate the framework as the federation module of Message Oriented Middleware and Key Value Store platforms, which are among the most widespread paradigms supporting data sharing in distributed systems.
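    As a rough illustration of the client-side, group-based model described above, the following Python sketch shows how a hypothetical client might join a message group and publish to it. The class and method names (GroupClient, join_group, publish) are invented for illustration and are not the framework's actual API.

# Schematic sketch of client-side, group-based federation in the spirit of WideGroups.
# All names here are hypothetical; no server-side configuration is assumed.
class GroupClient:
    def __init__(self, local_node, remote_endpoints):
        # Configuration happens entirely on the client side: the list of remote
        # datacenter endpoints is supplied by the application, not by the servers.
        self.local_node = local_node
        self.remote_endpoints = remote_endpoints
        self.groups = {}

    def join_group(self, group_name):
        # Joining a group implicitly builds an ad-hoc overlay among the nodes
        # (possibly in different administrative domains) that share the group.
        self.groups.setdefault(group_name, set()).update(self.remote_endpoints)

    def publish(self, group_name, message):
        # Messages are routed only to the overlay members of the named group.
        for endpoint in self.groups.get(group_name, ()):
            print(f"send {message!r} from {self.local_node} to {endpoint}")

client = GroupClient("dc-eu-1", ["dc-us-2:9000", "dc-ap-3:9000"])
client.join_group("sensor-data")
client.publish("sensor-data", {"reading": 42})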

    The European Industrial Data Space (EIDS)

    This research work has been performed in the framework of the Boost 4.0 Big Data lighthouse initiative, a project that has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement no. 780732. This data-driven digital transformation research is also endorsed by the Digital Factory Alliance (DFA). The path that the European Commission foresees to leverage data in the best possible way for the sake of European citizens and the digital single market clearly addresses the need for a European Data Space. This data space must follow rules derived from European values. The European Data Strategy rests on four pillars: (1) a governance framework for access and use; (2) investments in Europe's data capabilities and infrastructures; (3) competences and skills of individuals and SMEs; (4) common European Data Spaces in nine strategic areas such as industrial manufacturing, mobility, health, and energy. The project BOOST 4.0 developed a prototype for the industrial manufacturing sector, called the European Industrial Data Space (EIDS), an endeavour of 53 companies. This publication presents the architectural pattern and components developed for the EIDS and introduces the required infrastructure. Additionally, the population of such a data space with Big Data enabled services and platforms is described, enriched with the perspective of the pilots that have been built on EIDS.

    A cloud adoption framework for South African SMEs

    Small to Medium Enterprises (SMEs) have been touted as key enablers of the economic development of most countries. Despite growing evidence that most SMEs fail within their initial years, ICTs have been found to add substantial value in facilitating their success. However, in most developing countries, ICT adoption by SMEs has been plagued by a plethora of challenges, ranging from poor electricity supply, high ICT costs, and a lack of ICT expertise to a lack of government support. While this might seem problematic for SMEs, the adoption and use of cloud services mitigates some of these challenges. The problem, however, is that only a limited amount of literature has provided guidance on how the cloud adoption process should be carried out by SMEs. The objective of this research was therefore to address this by developing a framework that can be used by SMEs to guide them through the cloud adoption process. To this end, thirteen (13) semi-structured interviews were conducted across nine (9) SMEs in the Eastern Cape. The resultant interview transcripts were analysed using an established thematic approach, the result of which allowed for the development of a rich interpretive narrative about SME cloud adoption. Combined with theory from extant literature, this culminated in the development of a framework for cloud services adoption for SMEs in the Eastern Cape.

    Attribute Based Encryption for Secure Data Access in Cloud

    Cloud computing is a progressive computing paradigm that enables flexible, on-demand, and low-cost use of Information Technology resources. However, data are transmitted to third-party cloud servers, and various privacy concerns arise from this. Different schemes based on attribute-based encryption have been proposed to secure cloud storage. Most of this work, however, focuses on data content privacy and access control, while paying less attention to privilege control and identity privacy. In this paper, a semi-anonymous privilege control scheme, AnonyControl, is presented to address data privacy as well as user identity privacy in existing access control schemes. AnonyControl decentralizes the central authority to limit identity leakage and thereby achieves semi-anonymity. Furthermore, it generalizes file access control to privilege control, by which the privileges of all operations on the cloud data are managed in a fine-grained way. We then present AnonyControl-F, which fully prevents identity leakage and achieves full anonymity. Our security analysis demonstrates that both AnonyControl and AnonyControl-F are secure under the decisional bilinear Diffie-Hellman assumption, and our performance evaluation shows the feasibility of our schemes. Index Terms: Anonymity, multi-authority, attribute-based encryption
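    To illustrate the access-control idea behind attribute-based encryption, the Python sketch below evaluates a simple AND/OR policy over a user's attribute set. It is a schematic example only: it shows policy satisfaction, not the pairing-based cryptography on which AnonyControl's security (under the decisional bilinear Diffie-Hellman assumption) actually rests, and the attribute names are invented.

# Schematic illustration: a ciphertext carries a policy over attributes, and only
# users whose attribute set satisfies the policy can decrypt. This sketch evaluates
# the policy tree only; it does not perform any encryption.
def satisfies(policy, attributes):
    op = policy[0]
    if op == "ATTR":
        return policy[1] in attributes
    if op == "AND":
        return all(satisfies(child, attributes) for child in policy[1:])
    if op == "OR":
        return any(satisfies(child, attributes) for child in policy[1:])
    raise ValueError(f"unknown operator: {op}")

# Example policy: role:doctor AND (dept:cardiology OR dept:oncology)
policy = ("AND", ("ATTR", "role:doctor"),
                 ("OR", ("ATTR", "dept:cardiology"), ("ATTR", "dept:oncology")))
print(satisfies(policy, {"role:doctor", "dept:cardiology"}))  # True
print(satisfies(policy, {"role:nurse", "dept:cardiology"}))   # False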

    Strategies of development and maintenance in supervision, control, synchronization, data acquisition and processing in light sources

    Programa Oficial de Doutoramento en Tecnoloxías da Información e as Comunicacións. 5032V01. Particle accelerators and photon sources are constantly evolving, adopting cutting-edge technologies to push the limits forward and explore new domains. Control systems are a crucial part of these scientific installations and are required to provide flexible solutions for new, challenging experiments with different kinds of detectors, setups, sample environments, and procedures. Experiment proposals become more ambitious at each call and often go a step beyond the capabilities of the instrumentation. Detectors must be faster and more efficient, with more resolution and bandwidth, and able to synchronize with other detectors of all kinds (scalar, one-dimensional, or two-dimensional), taking into account their singularities and homogenizing the data acquisition. This work examines the control and data acquisition systems of particle accelerators and X-ray / light sources, and explores new requirements and challenges regarding synchronization and data acquisition bandwidth, as well as optimization and cost-efficiency in design, operation, and support. It also studies different solutions depending on the environment.

    Case study: an evaluation of the implementation of an Enterprise Resource Planning System (ERP) at a South African municipal entity.

    Master of Commerce in Information Systems and Technology. University of KwaZulu-Natal, Pietermaritzburg, 2018. Enterprise resource planning (ERP) is a computer-based software application that is widely implemented in many business organisations. These systems have evolved over the years into component-based modules with the ability to easily integrate with other systems, provide real-time information, and improve information sharing and collaboration. Choosing an ERP system is a complex process, and the literature clearly illustrates the failure of organisations to effectively specify, select, and implement ERP systems, resulting in an inability to effectively harness the associated benefits. This study focuses on a South African water utility and the processes followed in procuring and implementing an ERP system. It is interesting to note that in this study, despite having experienced a failed ERP implementation, the utility decided to replace the same ERP system. A rigorous process was followed to find a replacement ERP system, only to set aside all alternative commercial off-the-shelf systems and re-implement the original failed ERP system. To gain insight into the processes followed, the COTS theoretical framework is presented to bring the reader's attention to associated theoretical studies. In this study, we conduct a systematic literature review on ERP systems, their background, implementation processes, and associated implementation outcomes. This research presents a case study that describes and explores the process of ERP implementation at the water utility. We document the process the utility followed in acquiring and setting up functional and non-functional evaluation criteria for the ERP system. Further, we consider the processes of preparation, evaluation, selection, and implementation. The analysis of the implementation process has brought to light the importance of defining ERP scope based on business requirements, specifications based on the business scope, and evaluation criteria. The findings and results from this case study will contribute to the conceptual and contextual understanding of the specification, selection, and implementation of ERP systems.

    Advanced Digital Auditing

    This open access book discusses the most modern approach to auditing complex digital systems and technologies. It combines proven auditing approaches, advanced programming techniques and complex application areas, and covers the latest findings on theory and practice in this rapidly developing field. Especially for those who want to learn more about novel approaches to testing complex information systems and related technologies, such as blockchain and self-learning systems, the book will be a valuable resource. It is aimed at students and practitioners who are interested in contemporary technology and managerial implications