17,793 research outputs found

    Linking an integrated framework with appropriate methods for measuring QoE

    Quality of Experience (QoE) has recently gained recognition as an important determinant of the success of new technologies. Despite the growing interest in QoE, research in this area is still fragmented. Similar, but separate, efforts are being carried out in technical as well as user-oriented research domains, which rarely communicate with each other. In this paper, we take a multidisciplinary approach and review both user-oriented and technical definitions of Quality of Experience (including the related concept of User Experience). We propose a detailed and comprehensive framework that integrates both perspectives. Finally, we take a first step at linking methods for measuring QoE with this framework.

    Information and communication technology solutions for outdoor navigation in dementia

    INTRODUCTION: Information and communication technology (ICT) is potentially mature enough to empower outdoor and social activities in dementia. However, current ICT-based devices have limited functionality and impact, mainly restricted to safety. What is an ideal operational framework to advance this field in support of outdoor and social activities? METHODS: Review of the literature and cross-disciplinary expert discussion. RESULTS: A situation-aware ICT requires flexible fine-tuning by stakeholders of system usability and complexity of function, and of user safety and autonomy. It should operate by artificial intelligence/machine learning and should reflect harmonized stakeholder values, social context, and the user's residual cognitive functions. ICT services should be offered at the prodromal stage of dementia and should be carefully validated within the life space of users in terms of quality of life, social activities, and costs. DISCUSSION: The operational framework has the potential to produce ICT and services with high clinical impact but requires substantial investment.

    Toward a collective intelligence recommender system for education

    The development of Information and Communication Technology (ICT) has revolutionized the world and moved us into the information age; however, accessing and handling this large amount of information causes valuable losses of time. Teachers in Higher Education especially use the Internet as a tool to consult materials and content for the development of their subjects. The Internet offers very broad services, and sometimes it is difficult for users to find content in an easy and fast way. This problem is growing over time, causing students to spend much of their time searching for information rather than on synthesis, analysis, and the construction of new knowledge. In this context, several questions have emerged: Is it possible to design learning activities that allow us to value information search and encourage collective participation? What conditions must an ICT tool that supports an information-search process meet to optimize the student's time and learning? This article presents the use and application of a Recommender System (RS) designed on paradigms of Collective Intelligence (CI). The RS encourages collective learning and the authentic participation of students. The research combines a literature study with an analysis of the ICT tools that have emerged in the fields of CI and RS. Design-Based Research (DBR) was also used to compile and summarize the collective intelligence approaches and filtering techniques reported in the Higher Education literature, as well as to incrementally improve the tool. Several benefits were evidenced in the exploratory study carried out. Among them, the following stand out:
    ‱ It improves student motivation, as it helps students discover new content of interest in an easy way.
    ‱ It saves time in the search and classification of teaching material of interest.
    ‱ It fosters specialized reading and inspires competence as a means of learning.
    ‱ It gives the teacher the ability to generate reports on trends and behaviors of their students, and a real-time assessment of the quality of learning material.
    The authors consider that ICT tools combining the paradigms of CI and RS presented in this work improve the construction of student knowledge and motivate collective development in cyberspace; in addition, the content-filtering model used supports the design of models and strategies of collective intelligence in Higher Education.

    From Social Simulation to Integrative System Design

    As the recent financial crisis showed, today there is a strong need to gain an "ecological perspective" of all relevant interactions in socio-economic-techno-environmental systems. For this, we suggest setting up a network of Centers for Integrative Systems Design, which shall be able to run all potentially relevant scenarios, identify causality chains, explore feedback and cascading effects for a number of model variants, and determine the reliability of their implications (given the validity of the underlying models). They will be able to detect possible negative side effects of policy decisions before they occur. The Centers belonging to this network of Integrative Systems Design Centers would each be focused on a particular field, but they would be part of an attempt to eventually cover all relevant areas of society and economy and integrate them within a "Living Earth Simulator". The results of all research activities of such Centers would be turned into informative input for political Decision Arenas. For example, Crisis Observatories (for financial instabilities, shortages of resources, environmental change, conflict, spreading of diseases, etc.) would be connected with such Decision Arenas for the purpose of visualization, in order to make complex interdependencies understandable to scientists, decision-makers, and the general public.
    Comment: 34 pages, Visioneer White Paper, see http://www.visioneer.ethz.c

    A framework and tool to manage Cloud Computing service quality

    Cloud Computing has generated considerable interest both in companies specialized in Information and Communication Technology and in the business context in general. The Sourcing Capability Maturity Model for service (e-SCM) is a capability model for offshore outsourcing services between clients and providers that offers appropriate strategies to enhance Cloud Computing implementation. It intends to achieve the required quality of service and develop an effective working relationship between clients and providers. Moreover, a quality evaluation framework is a framework to control the quality of any product and/or process. It offers tool support that can generate software artifacts to manage any type of product and service efficiently and effectively. Thus, the aim of this paper is to make this framework and tool support available for managing Cloud Computing service quality between clients and providers by means of e-SCM.
    Ministerio de Ciencia e InnovaciĂłn TIN2013-46928-C3-3-R; Junta de AndalucĂ­a TIC-578

    An Ontology-Based Recommender System with an Application to the Star Trek Television Franchise

    Collaborative filtering based recommender systems have proven to be extremely successful in settings where user preference data on items is abundant. However, collaborative filtering algorithms are hindered by their weakness against the item cold-start problem and a general lack of interpretability. Ontology-based recommender systems exploit hierarchical organizations of users and items to enhance browsing, recommendation, and profile construction. While ontology-based approaches address the shortcomings of their collaborative filtering counterparts, ontological organizations of items can be difficult to obtain for items that mostly belong to the same category (e.g., television series episodes). In this paper, we present an ontology-based recommender system that integrates the knowledge represented in a large ontology of literary themes to produce fiction content recommendations. The main novelty of this work is an ontology-based method for computing similarities between items and its integration with the classical Item-KNN (K-nearest neighbors) algorithm. As a case study, we evaluated the proposed method against other approaches by performing the classical rating prediction task on a collection of Star Trek television series episodes in an item cold-start scenario. This transverse evaluation provides insights into the utility of different information resources and methods for the initial stages of recommender system development. We found our proposed method to be a convenient alternative to collaborative filtering approaches for collections of mostly similar items, particularly when other content-based approaches are not applicable or otherwise unavailable. Aside from the new methods, this paper contributes a testbed for future research and an online framework to collaboratively extend the ontology of literary themes to cover other narrative content.
    Comment: 25 pages, 6 figures, 5 tables, minor revision
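    The integration of an item-to-item similarity with Item-KNN rating prediction described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Jaccard overlap of theme sets standing in for the ontology-based similarity, and the names `theme_jaccard` and `predict_rating`, are assumptions for the sake of the example.

    ```python
    def theme_jaccard(themes_a, themes_b):
        """Hypothetical ontology-based similarity: Jaccard overlap of the
        literary-theme sets annotating two episodes (the paper's actual
        measure may weight themes by their position in the ontology)."""
        a, b = set(themes_a), set(themes_b)
        return len(a & b) / len(a | b) if a | b else 0.0

    def predict_rating(user_ratings, target_item, item_sim, k=5):
        """Item-KNN rating prediction: weight the user's known ratings by
        item-item similarity and keep only the k most similar neighbours."""
        neighbours = sorted(
            ((item_sim.get((item, target_item),
                           item_sim.get((target_item, item), 0.0)), r)
             for item, r in user_ratings.items() if item != target_item),
            reverse=True)[:k]
        num = sum(s * r for s, r in neighbours)
        den = sum(abs(s) for s, _ in neighbours)
        return num / den if den else 0.0
    ```

    Because the similarity comes from item annotations rather than co-rating patterns, a brand-new episode with known themes can be recommended immediately, which is the cold-start advantage the abstract highlights.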

    Big data analytics:Computational intelligence techniques and application areas

    Big Data has a significant impact on developing functional smart cities and supporting modern societies. In this paper, we investigate the importance of Big Data in modern life and economy, and discuss challenges arising from Big Data utilization. Different computational intelligence techniques have been considered as tools for Big Data analytics. We also explore the powerful combination of Big Data and Computational Intelligence (CI) and identify a number of areas where novel applications to real-world smart city problems can be developed by utilizing these powerful tools and techniques. We present a case study on intelligent transportation in the context of a smart city, and a novel data modelling methodology based on a biologically inspired universal generative modelling approach called the Hierarchical Spatial-Temporal State Machine (HSTSM). We further discuss various implications of policy, protection, valuation, and commercialization related to Big Data, its applications, and its deployment.

    Method versatility in analysing human attitudes towards technology

    Various research domains are facing new challenges brought about by growing volumes of data. To make optimal use of these data, and to increase the reproducibility of research findings, method versatility is required. Method versatility is the ability to flexibly apply widely varying data-analytic methods depending on the study goal and the dataset characteristics. It is an essential characteristic of data science, but in other areas of research, such as educational science or psychology, its importance is yet to be fully accepted. Versatile methods can enrich the repertoire of specialists who validate psychometric instruments, conduct data analysis of large-scale educational surveys, and communicate their findings to the academic community, which corresponds to three stages of the research cycle: measurement, research per se, and communication. In this thesis, the studies related to these stages share the common theme of human attitudes towards technology, as this topic becomes vitally important in our age of ever-increasing digitization. The thesis is based on four studies, in which method versatility is introduced in four different ways: the consecutive use of methods, the toolbox choice, the simultaneous use, and the range extension. In the first study, different methods of psychometric analysis are used consecutively to reassess the psychometric properties of a recently developed scale measuring affinity for technology interaction. In the second, the random forest algorithm and hierarchical linear modeling, as tools from the machine learning and statistical toolboxes, are applied to data analysis of a large-scale educational survey on students’ attitudes to information and communication technology. In the third, the challenge of selecting the number of clusters in model-based clustering is addressed by the simultaneous use of model fit, cluster separation, and stability-of-partition criteria, so that generalizable, separable clusters can be selected in data on teachers’ attitudes towards technology. The fourth reports the development and evaluation of a scholarly knowledge-graph-powered dashboard aimed at extending the range of scholarly communication means. The findings of the thesis can help increase method versatility in various research areas. They can also facilitate the methodological advancement of academic training in data analysis and aid the further development of scholarly communication in accordance with open science principles.
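    The idea of judging a candidate number of clusters simultaneously by model fit, cluster separation, and partition stability can be sketched as follows. This is a minimal illustration with scikit-learn Gaussian mixtures, not the thesis's actual pipeline: the helper name `score_k` and the bootstrap-refit stability scheme are assumptions.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from sklearn.metrics import silhouette_score, adjusted_rand_score

    def score_k(X, k, n_boot=5, seed=0):
        """Return (BIC, silhouette, stability) for a k-component Gaussian
        mixture: BIC measures model fit (lower is better), silhouette
        measures cluster separation, and stability is the mean adjusted
        Rand index between the reference partition and partitions from
        bootstrap refits (closer to 1 means more stable)."""
        rng = np.random.default_rng(seed)
        ref = GaussianMixture(n_components=k, random_state=seed).fit(X)
        labels = ref.predict(X)
        stab = []
        for b in range(n_boot):
            idx = rng.integers(0, len(X), len(X))   # bootstrap resample
            gm = GaussianMixture(n_components=k,
                                 random_state=seed + 1 + b).fit(X[idx])
            stab.append(adjusted_rand_score(labels, gm.predict(X)))
        sil = silhouette_score(X, labels) if k > 1 else -1.0
        return ref.bic(X), sil, float(np.mean(stab))
    ```

    A k that looks good on only one criterion (say, the lowest BIC) but partitions unstably across bootstrap refits would be rejected, which is the point of using the three criteria together.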

    Towards a better understanding of the e-health user: comparing USE IT and Requirements study for an Electronic Patient Record.

    This paper compares a traditional requirements study with 22 interviews for the design of an electronic patient record (EPR) and a USE IT analysis with 17 interviews trying to understand the end-user of an EPR. Developing, implementing, and using information technology in organizations is a complex social activity. It is often characterized by ill-defined problems or vague goals, conflicts, and disruptions that result from organizational change. Successfully implementing information systems in healthcare organizations appears to be a difficult task. Information technology is regarded as an enabler of change in healthcare organizations, but (information) technology adoption decisions in healthcare are complex because of the uncertainty of benefits and the rate of change of technology. (Job) relevance is recognized as an important determinant of IS success but still does not find its way into the systems design process.

    Semantic discovery and reuse of business process patterns

    Patterns currently play an important role in modern information systems (IS) development, but their use has mainly been restricted to the design and implementation phases of the development lifecycle. Given the increasing significance of business modelling in IS development, patterns have the potential to provide a viable solution for promoting the reusability of recurrent generalized models in the very early stages of development. As a statement of research in progress, this paper focuses on business process patterns and proposes an initial methodological framework for the discovery and reuse of business process patterns within the IS development lifecycle. The framework borrows ideas from the domain engineering literature and proposes the use of semantics to drive both the discovery of patterns and their reuse.
    • 
