13 research outputs found

    Real-time Monitoring of Low Voltage Grids using Adaptive Smart Meter Data Collection


    Políticas de Copyright de Publicações Científicas em Repositórios Institucionais: O Caso do INESC TEC [Copyright Policies of Scientific Publications in Institutional Repositories: The Case of INESC TEC]

    The progressive transformation of scientific practices, driven by the development of new Information and Communication Technologies (ICT), has increased access to information and is gradually opening up the research cycle. In the long term, this openness can remove a persistent obstacle for researchers: the barriers, whether geographical or financial, that limit the conditions of access. Although scientific production is largely dominated by large commercial publishers and subject to the rules they impose, the Open Access movement, whose first public declaration, the Budapest Declaration (BOAI), dates from 2002, proposes significant changes that benefit both authors and readers. The movement has gained importance in Portugal since 2003, when the first institutional repository at the national level was established. Institutional repositories emerged as tools for disseminating an institution's scientific production, opening up research results both before publication and peer review (preprints) and after (postprints), and thereby increasing the visibility of the work of a researcher and of his or her institution.
    The study presented here, based on an analysis of the copyright policies of the most relevant scientific publications of INESC TEC, showed not only that publishers increasingly adopt policies that allow self-archiving of publications in institutional repositories, but also that considerable awareness-raising work remains to be done, not only with researchers but also with the institution and society as a whole. The resulting set of recommendations, including the implementation of an institutional policy that encourages self-archiving in the repository of publications produced in the institutional context, serves as a starting point for a greater appreciation of the scientific production of INESC TEC.

    Human Computer Interaction and Emerging Technologies

    The INTERACT Conferences are an important platform for researchers and practitioners in the field of human-computer interaction (HCI) to showcase their work. They are organised biennially by the International Federation for Information Processing (IFIP) Technical Committee on Human–Computer Interaction (IFIP TC13), an international committee of 30 member national societies and nine Working Groups. INTERACT is truly international in its spirit and has attracted researchers from several countries and cultures. With an emphasis on inclusiveness, it works to lower the barriers that prevent people in developing countries from participating in conferences. As a multidisciplinary field, HCI requires interaction and discussion among diverse people with different interests and backgrounds. The 17th IFIP TC13 International Conference on Human-Computer Interaction (INTERACT 2019) took place during 2-6 September 2019 in Paphos, Cyprus. The conference was held at the Coral Beach Hotel Resort and was co-sponsored by the Cyprus University of Technology and Tallinn University, in cooperation with ACM and ACM SIGCHI. This volume contains the Adjunct Proceedings of the 17th INTERACT Conference, comprising a series of selected papers from workshops, the Student Design Consortium and the Doctoral Consortium. The volume follows the INTERACT conference tradition of submitting adjunct papers after the main publication deadline, to be published by a university press with a connection to the conference itself. In this case, both the Adjunct Proceedings Chair of the conference, Dr Usashi Chatterjee, and the lead Editor of this volume, Dr Fernando Loizides, work at Cardiff University, which is the home of Cardiff University Press.

    Models and applications for the Bitcoin ecosystem

    Cryptocurrencies are widely known and used, principally as a means of investment and payment, by more and more users outside the restricted circle of technologists and computer scientists. However, like fiat money, they can also be used for illegal activities, exploiting their pseudo-anonymity and the ease and speed with which capital can be moved. This thesis provides a suite of tools and models to better analyze and understand several aspects of the Bitcoin blockchain. In particular, we developed a visual tool that highlights transaction islands, i.e., the sub-graphs disconnected from the super-graph that represents the whole blockchain. We also show the distributions of Bitcoin transaction types and define new classes of non-standard transactions. We analyze address reuse in Bitcoin, showing that it corresponds to malicious activity in the Bitcoin ecosystem. We then investigate whether solid or weak forms of arbitrage are possible by trading across different Bitcoin exchanges, and find that the Bitcoin price/exchange rate is influenced by future and past events. Finally, we present a stochastic model for the quantitative analysis of different consensus protocols; in particular, the probabilistic analysis of the Bitcoin model highlights how forks happen and how they depend on specific parameters of the protocol.
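    The "transaction islands" mentioned above are, in graph terms, connected components that are cut off from the main body of the transaction graph. The following minimal sketch shows one way such islands could be detected; the toy edge list, the function name, and the use of networkx are illustrative assumptions, not the thesis's actual tooling.

```python
# Illustrative sketch: detect "transaction islands", i.e. connected components
# disconnected from the largest component of a Bitcoin transaction graph.
# The edge list and the networkx-based approach are assumptions for illustration.
import networkx as nx

def find_islands(edges):
    """Return every connected component except the largest one."""
    graph = nx.Graph()
    graph.add_edges_from(edges)
    components = sorted(nx.connected_components(graph), key=len, reverse=True)
    return components[1:]  # the largest component stands in for the "super-graph"

if __name__ == "__main__":
    edges = [("tx1", "tx2"), ("tx2", "tx3"),  # main graph
             ("tx9", "tx10")]                 # a small, disconnected island
    for island in find_islands(edges):
        print(sorted(island))
```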

    Applying patterns in embedded systems design for managing quality attributes and their trade-offs

    Embedded systems comprise one of the most important types of software-intensive systems, as they are pervasive and used in daily life more than any other type, e.g., in cars or in electrical appliances. When such systems operate under hard constraints whose violation can lead to catastrophic events, they are classified as critical embedded systems (CESs). The quality attributes related to these hard constraints are called critical quality attributes (CQAs). For example, the performance of cruise-control or self-driving software in a car is critical, as failures can potentially harm human lives. Despite the growing body of knowledge on engineering CESs, there is still a lack of approaches that can support their design while managing CQAs and their trade-offs with non-critical quality attributes (e.g., maintainability and reusability). To address this gap, the state of research and practice on designing CESs and managing quality trade-offs was explored, approaches to improve CES design were identified, and the merit of these approaches was empirically investigated. When designing software, one common approach is to organize its components according to well-known structures, named design patterns. However, these patterns may be avoided in some classes of systems, such as CESs, as they are sometimes associated with a detriment to CQAs. In short, the findings reported in the thesis suggest that, when applicable, design patterns can promote CQAs while supporting the management of trade-offs. The thesis also reports on a phenomenon, namely pattern grime, and on factors that can influence the extent of the observed benefits.
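    As a purely illustrative sketch of the kind of trade-off at stake (the concrete patterns and systems studied in the thesis are not reproduced here), the example below uses a Strategy-style pattern: decoupling an algorithm from its caller aids maintainability and reusability, but the extra level of indirection is exactly the sort of overhead a performance-critical embedded loop may not tolerate.

```python
# Illustrative only; not taken from the thesis. A Strategy-style pattern:
# the caller is decoupled from the concrete filtering algorithm, which helps
# maintainability, at the cost of an indirect call on every reading.
from typing import Callable, Sequence

FilterStrategy = Callable[[Sequence[float]], float]

def mean_filter(samples: Sequence[float]) -> float:
    return sum(samples) / len(samples)

def median_filter(samples: Sequence[float]) -> float:
    ordered = sorted(samples)
    return ordered[len(ordered) // 2]

class SensorReader:
    """Reads smoothed sensor values through an interchangeable filter strategy."""
    def __init__(self, filter_strategy: FilterStrategy) -> None:
        self.filter_strategy = filter_strategy

    def read(self, samples: Sequence[float]) -> float:
        return self.filter_strategy(samples)

reader = SensorReader(median_filter)  # strategies can be swapped without touching the caller
print(reader.read([3.0, 1.0, 2.0]))   # -> 2.0
```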

    Trading Indistinguishability-based Privacy and Utility of Complex Data

    The collection and processing of complex data, such as structured data or infinite streams, facilitates novel applications. At the same time, it raises privacy requirements on the part of the data owners. Consequently, data administrators use privacy-enhancing technologies (PETs), frequently based on indistinguishability-based privacy definitions, to sanitize the data. When engineering PETs, a well-known challenge is the privacy-utility trade-off. Although the literature is aware of a number of such trade-offs, there are still combinations of involved entities, privacy definition, type of data, and application for which valuable trade-offs are missing. In this thesis, for two important groups of applications processing complex data, we study (a) which indistinguishability-based privacy and utility requirements are relevant, (b) whether existing PETs solve the trade-off sufficiently, and (c) propose novel PETs that extend the state of the art substantially in terms of methodology as well as achieved privacy or utility. Overall, we provide four contributions divided into two parts. In the first part, we study applications that analyze structured data with distance-based mining algorithms. We reveal that an essential utility requirement is the preservation of the pair-wise distances of the data items. Consequently, we propose distance-preserving encryption (DPE), together with a general procedure to engineer respective PETs by leveraging existing encryption schemes. As a proof of concept, we apply it to SQL log mining, useful for database performance tuning. In the second part, we study applications that monitor query results over infinite streams. To this end, w-event differential privacy is the state of the art. Here, PETs use mechanisms that typically add noise to query results. First, we study state-of-the-art mechanisms with respect to the utility they provide. Conducting the largest benchmark so far that fulfills requirements derived from the limitations of prior experimental studies, we contribute new insights into the strengths and weaknesses of existing mechanisms. One of the most unexpected, yet explainable, results is a baseline supremacy: one of the two baseline mechanisms delivers high, or even the best, utility. A natural follow-up question is whether baseline mechanisms already provide reasonable utility, so, second, we perform a case study from the area of electricity grid monitoring, revealing two results. First, achieving reasonable utility is only possible under weak privacy requirements. Second, the utility measured with application-specific utility metrics decreases faster than the sanitization error, which is used as the utility metric in most studies, suggests. As a third contribution, we propose a novel differential-privacy-based privacy definition called Swellfish privacy. It allows tuning utility beyond incremental w-event mechanism design by supporting time-dependent privacy requirements. Formally, as well as experimentally, we show that it increases utility significantly. In total, our thesis contributes substantially to the research field and reveals directions for future research.
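    For context, the noise-adding mechanisms mentioned above typically follow the standard differential-privacy recipe of perturbing each query result with calibrated random noise. The sketch below shows the textbook Laplace mechanism only; it is a general illustration, not one of the w-event stream mechanisms benchmarked in the thesis.

```python
# Textbook Laplace mechanism from differential privacy, for illustration only.
# Noise is calibrated to the query's sensitivity and the privacy budget epsilon.
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release a query result perturbed so the output is epsilon-differentially private."""
    return true_value + laplace_noise(sensitivity / epsilon)

# Example: a count query (sensitivity 1) released under a budget of epsilon = 0.5.
print(laplace_mechanism(true_value=42.0, sensitivity=1.0, epsilon=0.5))
```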

    Technology Assessment of Dual-Use ICTs - How to Assess Diffusion, Governance and Design

    Technologies that can be used in both military and civilian applications are referred to as dual-use. The dual-use nature of many information and communication technologies (ICTs) raises new questions for research and development concerning national, international, and human security. Measures to deal with the risks associated with the various dual-use technologies, including proliferation control, design approaches, and policy measures, vary widely. For example, Autonomous Weapon Systems (AWS) have not yet been regulated, while cryptographic products are subject to export and import controls. Innovations in artificial intelligence (AI), robotics, cybersecurity, and the automated analysis of publicly available data raise new questions about their respective dual-use risks. So far, dual-use risks have been discussed systematically mainly in the life sciences, which has contributed to the development of methods for assessment and risk management. Dual-use risks arise, among other things, from the fact that safety-critical technologies can easily be disseminated or modified, as well as used as part of a weapon system. The development and adaptation of robots and software therefore requires an independent consideration that builds on the insights of related dual-use discourses. Accordingly, this dissertation considers the management of such risks in terms of the proliferation, regulation, and design of individual dual-use information technologies. Technology Assessment (TA) is the epistemological framework of this work, bringing together concepts and approaches from Critical Security Studies (CSS) and Human-Computer Interaction (HCI) to help evaluate and shape dual-use technologies. In order to identify dual-use diffusion at an early stage, the dissertation first examines the diffusion of dual-use innovations between civilian and military research, both in expert networks on LinkedIn and on the basis of AI patents in a patent network. The results show low diffusion and tend to confirm existing studies on diffusion in patent networks. The dissertation then examines the regulation of dual-use technologies through two case studies. The first uses a discourse analysis to show the value conflicts surrounding the regulation of autonomous weapon systems via the concept of Meaningful Human Control (MHC), while the second, a long-term comparative case study, analyzes the change and consequences of the regulation of strong cryptography in the U.S. as well as intelligence agencies' programs for mass surveillance. Both cases point to the central role of private companies, both as producers of AWS and as intermediaries for the dissemination of encryption, as well as surveillance intermediaries. Subsequently, the dissertation examines the design of a dual-use technology using an Open Source Intelligence (OSINT) system for cybersecurity. For this purpose, conceptual, empirical, and technical studies are conducted within the Value-Sensitive Design (VSD) framework. These studies yield implications for research on and design of OSINT systems. For example, a representative survey of the German population showed that transparency of use, combined with reduced mistrust, is associated with higher acceptance of such systems. It was also shown that data sparsity achieved through the use of expert networks has many positive effects: it not only improves the performance of the system but is also preferable for legal and social reasons.
    The work thus contributes to the understanding of specific dual-use risks of AI, the regulation of AWS and cryptography, and the design of OSINT in cybersecurity. By combining concepts from CSS with participatory design methods from HCI, it provides an interdisciplinary and multi-method contribution.
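    As a purely illustrative sketch (the dissertation's actual network measures are not reproduced here), one crude proxy for diffusion between civilian and military research in an expert or patent network is the share of links that cross the two sectors:

```python
# Illustrative only: a crude proxy for civil-military diffusion in a collaboration
# or patent network. Node labels and the edge list are hypothetical examples.
from typing import Dict, Iterable, Tuple

def cross_sector_share(edges: Iterable[Tuple[str, str]],
                       sector: Dict[str, str]) -> float:
    """Fraction of edges connecting a civilian node with a military node."""
    edge_list = list(edges)
    crossing = sum(1 for a, b in edge_list if sector[a] != sector[b])
    return crossing / len(edge_list) if edge_list else 0.0

sector = {"lab_a": "civilian", "lab_b": "civilian", "agency_x": "military"}
edges = [("lab_a", "lab_b"), ("lab_b", "agency_x")]
print(cross_sector_share(edges, sector))  # 0.5: half of the links cross sectors
```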