17 research outputs found

    Representing Data Visualization Goals and Tasks through Meta-Modeling to Tailor Information Dashboards

    Information dashboards are everywhere. They support knowledge discovery in a huge variety of contexts and domains. Although powerful, these tools can be complex, not only for end-users but also for developers and designers. Information dashboards encode complex datasets into different visual marks to ease knowledge discovery. Choosing the wrong design could compromise the entire dashboard's effectiveness, so selecting the appropriate encoding or configuration for each potential context, user, or data domain is a crucial task. For these reasons, it is necessary to automate the recommendation of visualizations and dashboard configurations in order to deliver tools adapted to their context. Recommendations can be based on different aspects, such as user characteristics, the data domain, or the goals and tasks to be achieved or carried out through the visualizations. This work presents a dashboard meta-model that abstracts all these factors, together with the integration of a visualization task taxonomy to account for the different actions that can be performed with information dashboards. The meta-model has been used to design a domain-specific language for specifying dashboard requirements in a structured way. The ultimate goal is to obtain a dashboard generation pipeline that delivers dashboards adapted to any context, such as the educational context, in which large amounts of data are generated and several actors are involved (students, teachers, managers, etc.) who want to reach different insights regarding their learning performance or learning methodologies.
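    As a rough illustration of how a dashboard meta-model of this kind can abstract users, tasks, and visual encodings, the short Python sketch below models a few of those concepts and applies a toy recommendation rule. The class names, attributes, and the encoding rule are illustrative assumptions, not the meta-model or the domain-specific language defined in this work.

        # Hypothetical sketch of a dashboard meta-model; entities and the rule below
        # are assumptions for illustration, not the paper's meta-model or DSL.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class UserProfile:
            role: str                                  # e.g. "student", "teacher", "manager"
            goals: List[str] = field(default_factory=list)

        @dataclass
        class VisualizationTask:
            action: str                                # e.g. "compare", "trend", "distribution"
            data_domain: str                           # e.g. "learning-performance"

        @dataclass
        class Visualization:
            mark: str                                  # visual mark: "bar", "line", "histogram", ...
            task: VisualizationTask

        @dataclass
        class Dashboard:
            owner: UserProfile
            visualizations: List[Visualization] = field(default_factory=list)

        def recommend_mark(task: VisualizationTask) -> str:
            """Toy encoding recommendation keyed on the task action."""
            return {"compare": "bar", "trend": "line", "distribution": "histogram"}.get(task.action, "table")

        # Example: a teacher who wants to compare learning performance across students.
        task = VisualizationTask(action="compare", data_domain="learning-performance")
        dashboard = Dashboard(owner=UserProfile(role="teacher", goals=["monitor class progress"]),
                              visualizations=[Visualization(mark=recommend_mark(task), task=task)])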

    NOVA mobility assistive system: Developed and remotely controlled with IOPT-tools

    In this paper, a Mobility Assistive System (NOVA-MAS) and a model-driven development approach are proposed to support data acquisition and analysis, infrastructure control, and the dissemination of information along public roads. A literature review showed that work on mobility assistance for pedestrians in wheelchairs has a gap in ensuring their safety on the road. The problem is that pedestrians in wheelchairs and scooters often do not have adequate and safe lanes for circulating on public roads, sometimes having to travel side by side with vehicles moving at high speed. With NOVA-MAS, city infrastructures can obtain information about the environment and provide it to users and vehicles, increasing road safety in an inclusive way and contributing to a decrease in accidents involving pedestrians in wheelchairs. NOVA-MAS supports not only information dissemination but also data acquisition from sensors and infrastructure control, such as traffic lights. To this end, a development approach is proposed that supports the acquisition of data from the environment and its control using a tool framework named IOPT-Tools (Input-Output Place-Transition Tools). IOPT-Tools supports the specification, validation, and implementation of controllers, with remote operation capabilities. The infrastructure controllers are specified through IOPT Petri net models, which are then simulated using computational tools and verified using state-space-based model-checking tools. In addition, an automatic code generator produces the C code that supports the controllers' implementation, avoiding manual coding errors. A set of prototypes was developed and tested to validate the proposals and assess their feasibility.
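    IOPT-Tools itself generates C code from graphical Petri net models; as a language-neutral illustration of the execution semantics such a controller follows, the Python sketch below steps a tiny input-output place-transition net for a pedestrian crossing. The net, its place and transition names, and the input signals are assumptions made only for this example.

        # Toy input-output place-transition net driving a pedestrian crossing light.
        marking = {"cars_green": 1, "cars_red": 0}

        transitions = {
            # name: (input places, output places, guard over external input signals)
            "request_crossing": (["cars_green"], ["cars_red"], lambda inp: inp["button_pressed"]),
            "resume_traffic":   (["cars_red"], ["cars_green"], lambda inp: inp["timer_expired"]),
        }

        def step(marking, inputs):
            """Fire the first enabled transition whose guard holds (one controller cycle)."""
            for name, (pre, post, guard) in transitions.items():
                if all(marking[p] > 0 for p in pre) and guard(inputs):
                    for p in pre:
                        marking[p] -= 1
                    for p in post:
                        marking[p] += 1
                    return name
            return None

        # A wheelchair user presses the crossing button sensed by the infrastructure.
        fired = step(marking, {"button_pressed": True, "timer_expired": False})
        print(fired, marking)  # request_crossing {'cars_green': 0, 'cars_red': 1}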

    Constructing and interrogating actor histories

    Complex systems, such as organizations, can be represented as executable simulation models using actor-based languages. Decision-making can be supported by system simulation, so that different configurations provide a basis for what-if analysis. Actor-based models are expressed in terms of large numbers of concurrent actors that communicate using asynchronous messages, leading to complex non-deterministic behaviour. This chapter addresses the problem of analyzing the results of model executions and proposes a general approach that can be added to any actor-based system. The approach uses a logic programming language with temporal extensions to query execution traces. The approach has been implemented and is shown to support a representative system model.
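    The chapter relies on a logic programming language with temporal extensions; purely to illustrate the underlying idea of querying a recorded actor history, the Python sketch below checks a simple "every request is eventually answered" property over a timestamped message trace. The event format and the property are assumptions, not the query language used in the chapter.

        # Toy temporal query over an actor execution trace (illustrative assumption).
        trace = [
            {"time": 1, "sender": "client", "receiver": "server", "msg": "request", "id": 7},
            {"time": 2, "sender": "server", "receiver": "db",     "msg": "query",   "id": 7},
            {"time": 5, "sender": "server", "receiver": "client", "msg": "reply",   "id": 7},
        ]

        def eventually_replied(trace, request_id):
            """True if a 'request' with this id is eventually followed by a 'reply'."""
            t_req = next((e["time"] for e in trace
                          if e["msg"] == "request" and e["id"] == request_id), None)
            return t_req is not None and any(
                e["msg"] == "reply" and e["id"] == request_id and e["time"] >= t_req
                for e in trace)

        assert eventually_replied(trace, 7)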

    Towards a Technological Ecosystem to Provide Information Dashboards as a Service: A Dynamic Proposal for Supplying Dashboards Adapted to Specific Scenarios

    Data are crucial to improve decision-making and obtain greater benefits in any type of activity. However, the large amount of information generated by new technologies has made data analysis and knowledge generation a complex task. Numerous tools, such as dashboards, have emerged to facilitate this generation of knowledge. Although dashboards are useful, their effectiveness can be undermined by poor design or by failing to take into account the context in which they are deployed. It is therefore necessary to design and create custom dashboards according to the audience and the data domain. This paper presents an application of the software product line paradigm and its integration into a web service that allows users to request source code for customized information dashboards. The main goal is to introduce the idea of a holistic ecosystem of different services to craft and integrate information visualizations in a variety of contexts. One context that can especially benefit from this approach is education, where learning analytics, the analysis of student performance data, and didactic tools are becoming very relevant. Three use cases are presented to illustrate the benefits of the developed generative service.
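    As a minimal sketch of the software-product-line idea behind such a generative service, the Python snippet below assembles dashboard source code from a set of requested features. The feature identifiers and template strings are hypothetical and stand in for the real feature model and code artifacts of the described service.

        # Hypothetical feature-to-code assembly for a "dashboards as a service" request.
        TEMPLATES = {
            "bar_chart":  "function renderBarChart(data) { /* ... */ }",
            "line_chart": "function renderLineChart(data) { /* ... */ }",
            "filter":     "function applyFilter(data, field, value) { /* ... */ }",
        }

        def generate_dashboard_source(requested_features):
            """Assemble dashboard source code from the features selected by the user."""
            unknown = [f for f in requested_features if f not in TEMPLATES]
            if unknown:
                raise ValueError(f"Unsupported features: {unknown}")
            return "\n\n".join(TEMPLATES[f] for f in requested_features)

        # Example: a teacher requests a dashboard with a bar chart and a data filter.
        source_code = generate_dashboard_source(["bar_chart", "filter"])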

    Security Management Framework for the Internet of Things

    The increase in the design and development of wireless communication technologies offers multiple opportunities for the management and control of cyber-physical systems, with connections between smart, autonomous devices that deliver simplified data through cloud computing. This relationship with the Internet of Things (IoT) established the concept of pervasive computing, which allows any object to communicate with services, sensors, people, and other objects without human intervention. However, the rapid growth of connectivity between smart applications and autonomous systems connected to the internet has exposed numerous vulnerabilities in IoT systems to malicious users. This dissertation developed a novel ontology-based cybersecurity framework to improve security in IoT systems, using ontological analysis to adapt appropriate security services to the threats they address. The proposal explores two approaches: (1) design time, which offers a dynamic method to build security services through a model-driven methodology that considers existing business processes; and (2) execution time, which involves monitoring the IoT environment, classifying vulnerabilities and threats, and acting on the environment to ensure the correct adaptation of the existing services. Two validation approaches were used to demonstrate the feasibility of the proposed cybersecurity framework: a qualitative evaluation of the ontology based on the analysis of several criteria, and a proof of concept implemented and tested in specific industrial scenarios. The dissertation has been validated by adopting a methodology accepted in the research community, through the technical validation of the concept in an industrial setting.
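    To give a concrete flavour of the execution-time behaviour described above (monitor, classify, adapt), the Python sketch below maps a naively classified IoT event to a mitigating security service. The threat names, classification rules, and services are illustrative assumptions, not the ontology or services developed in the dissertation.

        # Toy run-time loop: classify an observed IoT event and pick a mitigation.
        THREAT_TO_SERVICE = {
            "brute_force_login":   "enable_account_lockout",
            "unencrypted_traffic": "enforce_tls",
            "firmware_outdated":   "schedule_ota_update",
        }

        def classify(event):
            """Very naive threat classification based on event fields."""
            if event.get("failed_logins", 0) > 10:
                return "brute_force_login"
            if event.get("protocol") == "http":
                return "unencrypted_traffic"
            return None

        def adapt(event):
            return THREAT_TO_SERVICE.get(classify(event), "no_action")

        print(adapt({"device": "gateway-01", "failed_logins": 42}))  # enable_account_lockout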

    Automatic generation of software interfaces for supporting decision-making processes. An application of domain engineering & machine learning

    Data analysis is a key process to foster knowledge generation in particular domains or fields of study. With a strong informative foundation derived from the analysis of collected data, decision-makers can make strategic choices with the aim of obtaining valuable benefits in their specific areas of action. However, given the steady growth of data volumes, data analysis needs to rely on powerful tools to enable knowledge extraction. Information dashboards offer a software solution for analyzing large volumes of data visually, in order to identify patterns and relations and make decisions according to the presented information. But decision-makers may have different goals and, consequently, different needs regarding their dashboards. Moreover, the variety of data sources, structures, and domains can hamper the design and implementation of these tools. This Ph.D. thesis tackles the challenge of improving the development process of information dashboards and data visualizations while enhancing their quality and features in terms of personalization, usability, and flexibility, among others. Several research activities have been carried out to support this thesis. First, a systematic literature mapping and review was performed to analyze different methodologies and solutions related to the automatic generation of tailored information dashboards. The outcomes of the review led to the selection of a model-driven approach in combination with the software product line paradigm to deal with the automatic generation of information dashboards. In this context, a meta-model was developed following a domain engineering approach. This meta-model represents the skeleton of information dashboards and data visualizations through the abstraction of their components and features, and it has been the backbone of the subsequent generative pipeline for these tools. The meta-model and generative pipeline have been tested through their integration in different scenarios, both theoretical and practical. Regarding the theoretical dimension of the research, the meta-model has been successfully integrated with another meta-model to support knowledge generation in learning ecosystems, and used as a framework to conceptualize and instantiate information dashboards in different domains. In terms of practical applications, the focus has been on how to transform the meta-model into an instance adapted to a specific context, and how to finally transform this latter model into code, i.e., the final, functional product. These practical scenarios involved the automatic generation of dashboards in the context of a Ph.D. programme, the application of Artificial Intelligence algorithms in the process, and the development of a graphical instantiation platform that combines the meta-model and the generative pipeline into a visual generation system. Finally, different case studies have been conducted in the employment and employability, health, and education domains. The number of applications of the meta-model across theoretical and practical dimensions and domains is itself a result. Every outcome associated with this thesis is driven by the dashboard meta-model, which also proves its versatility and flexibility when it comes to conceptualizing, generating, and capturing knowledge related to dashboards and data visualizations.
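    As a minimal sketch of the final model-to-code step of such a generative pipeline, the Python snippet below turns a dashboard model instance (a plain dictionary whose keys are assumed here, not the thesis meta-model) into simplified HTML scaffolding.

        # Hypothetical model-to-code transformation for a dashboard instance.
        model_instance = {
            "title": "PhD Programme Dashboard",
            "components": [
                {"type": "bar",  "data_source": "publications_per_year"},
                {"type": "line", "data_source": "citations_over_time"},
            ],
        }

        def model_to_code(model):
            """Transform a dashboard model instance into (simplified) HTML code."""
            divs = "\n".join(
                f'  <div class="viz" data-type="{c["type"]}" data-src="{c["data_source"]}"></div>'
                for c in model["components"])
            return f"<h1>{model['title']}</h1>\n<section class=\"dashboard\">\n{divs}\n</section>"

        print(model_to_code(model_instance))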

    Copyright Policies of Scientific Publications in Institutional Repositories: The Case of INESC TEC

    The progressive transformation of scientific practices, driven by the development of new Information and Communication Technologies (ICT), has increased access to information, gradually moving towards an opening of the research cycle. In the long term, this will help resolve a difficulty researchers face: the existence of barriers, whether geographical or financial, that limit access. Although scientific production is largely dominated by big commercial publishers and subject to the rules they impose, the Open Access movement, whose first public declaration, the Budapest Declaration (BOAI), dates from 2002, proposes significant changes that benefit both authors and readers. This movement has gained importance in Portugal since 2003, with the creation of the first institutional repository at the national level. Institutional repositories emerged as a tool for disseminating an institution's scientific production, opening up research results both before publication and peer review (preprint) and after (postprint), and consequently increasing the visibility of the work carried out by researchers and their institutions. The study presented here, based on an analysis of the copyright policies of the most relevant scientific publications at INESC TEC, showed not only that publishers are increasingly adopting policies that allow the self-archiving of publications in institutional repositories, but also that a great deal of awareness-raising work remains to be done, with researchers, the institution, and society as a whole. The resulting set of recommendations, including the implementation of an institutional policy that encourages the self-archiving of publications produced within the institution in its repository, serves as a starting point for a greater appreciation of the scientific production of INESC TEC.

    Methodological approaches and techniques for designing ontologies in information systems requirements engineering

    The way we interact with the world around us is changing as new challenges arise, embracing innovative business models, rethinking organizations and processes to maximize results, and evolving change management. Currently, and considering the projects executed, the methodologies in use do not fully respond to companies' needs. On the one hand, organizations are not familiar with the languages used in Information Systems; on the other hand, they are often unable to validate requirements or business models. These are some of the difficulties that led us to formulate a new approach. The state of the art presented in this work therefore includes a study of the models involved in the software development process, covering traditional methods and the rival agile methods. In addition, a survey is made of ontologies and of the existing methods to conceive, transform, and represent them. After analyzing several of the possibilities currently available, we began the process of evolving a method and developing an approach for designing ontologies. The method we evolved and adapted derives terminologies from a specific domain and aggregates them in order to facilitate the construction of a terminology catalogue. The definition of an approach to designing ontologies then allows the construction of a domain-specific ontology. This approach first integrates and stores data from the different information systems of a given organization; second, the rules for mapping and building the ontology database are defined. Finally, a technological architecture is also proposed that maps an ontology through the construction of complex networks, allowing terminologies to be mapped and related. This doctoral work encompasses numerous Research & Development (R&D) projects from different domains, such as the software industry, textile industry, robotics industry, and smart cities. Finally, a critical and descriptive analysis of the work is performed, and perspectives for possible future work are also pointed out.
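    As a deliberately naive illustration of the terminology-catalogue idea, the Python sketch below extracts recurring terms from domain text and relates them by co-occurrence, yielding a small graph that could later seed a domain ontology. The tokenisation, thresholds, and example sentences are assumptions for illustration only, not the method developed in this doctoral work.

        # Toy terminology catalogue and co-occurrence graph over domain documents.
        from collections import Counter
        from itertools import combinations

        documents = [
            "the loom controller reports thread tension to the production system",
            "the production system stores thread tension history per loom",
        ]

        def build_catalogue(docs, min_count=2):
            """Terms appearing at least min_count times across the documents."""
            counts = Counter(w for d in docs for w in d.split())
            return {w for w, c in counts.items() if c >= min_count and len(w) > 3}

        def relate_terms(docs, catalogue):
            """Edges between catalogue terms that co-occur in the same document."""
            edges = set()
            for d in docs:
                present = sorted(catalogue & set(d.split()))
                edges.update(combinations(present, 2))
            return edges

        catalogue = build_catalogue(documents)
        print(sorted(catalogue))                      # e.g. ['loom', 'production', ...]
        print(sorted(relate_terms(documents, catalogue)))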