
    10301 Executive Summary and Abstracts Collection -- Service Value Networks

    From 25.07.2010 to 30.07.2010, the Dagstuhl Perspectives Workshop 10301 "Service Value Networks" was held at Schloss Dagstuhl, the Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided where available.

    Information visualisation and data analysis using web mash-up systems

    A thesis submitted in partial fulfilment for the degree of Doctor of Philosophy. The arrival of e-commerce systems has contributed greatly to the economy and has played a vital role in collecting a huge amount of transactional data. It is becoming harder by the day to analyse business and consumer behaviour amid the production of such a colossal volume of data. Enterprise 2.0 systems can store and create an enormous amount of transactional data, yet the purpose for which the data was collected is easily lost as the essential information goes unnoticed in large and complex data sets. Information overflow is a major contributor to this dilemma. In the current environment, where hardware systems can store such large volumes of data and software systems can produce data on a substantial scale, data exploration problems are on the rise. The problem lies not in the production or storage of data but in the effectiveness of the systems and techniques by which essential information can be retrieved from complex data sets in a comprehensive and logical way as data questions are asked. With existing information retrieval systems and visualisation tools, the more specific the questions asked, the more definitive and unambiguous the visualised results that can be attained; but for complex and large data sets there are no elementary or simple questions. A profound information visualisation model and system is therefore required to analyse complex data sets through data analysis and information visualisation, making it possible for decision makers to identify the expected and discover the unexpected. To address complex data problems, a comprehensive and robust visualisation model and system is introduced.
The visualisation model consists of four major layers: (i) acquisition and data analysis, (ii) data representation, (iii) user and computer interaction, and (iv) results repositories. There are major contributions in all four layers, particularly in data acquisition and data representation. Multiple-attribute and multidimensional data visualisation techniques are identified in Enterprise 2.0 and Web 2.0 environments. Transactional tagging and linked data are unearthed, which is a novel contribution to information visualisation. The visualisation model and system are first realised as a tangible software system, which is then validated against several large data sets of different types in three experiments. The first experiment is based on the large Royal Mail postcode data set. The second experiment is based on a large transactional data set in an enterprise environment, while the same data set is also processed in a non-enterprise environment. System interaction, facilitated through new mashup techniques, enables users to interact more fluently with the data and the representation layer. The results are exported into various reusable formats and retrieved for further comparison and analysis. The information visualisation model introduced in this research is a compact process for data sets of any size and type, which is a major contribution to information visualisation and data analysis. Advanced data representation techniques are employed using various web mashup technologies. New visualisation techniques have emerged from the research, such as transactional tagging visualisation and linked data visualisation. The information visualisation model and system are highly useful in addressing complex data problems, with strategies that are easy to interact with and integrate.
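The four-layer model described in the abstract can be pictured as a simple pipeline. The sketch below is purely illustrative; all function names and the postcode data are hypothetical, not taken from the thesis.

```python
# Illustrative sketch of a four-layer visualisation pipeline:
# (i) acquisition/analysis, (ii) representation, (iii) interaction,
# (iv) results repository. All names and data are hypothetical.

def acquire_and_analyse(raw_records):
    """Layer (i): aggregate raw transactions by postcode."""
    totals = {}
    for record in raw_records:
        postcode = record["postcode"]
        totals[postcode] = totals.get(postcode, 0) + record["amount"]
    return totals

def represent(totals):
    """Layer (ii): map aggregates to visual attributes (here, tag weights in [0, 1])."""
    if not totals:
        return {}
    peak = max(totals.values())
    return {k: round(v / peak, 2) for k, v in totals.items()}

def interact(weights, selected_postcode):
    """Layer (iii): a user drill-down on one postcode."""
    return {"selected": selected_postcode, "weight": weights.get(selected_postcode)}

def store_result(repository, result):
    """Layer (iv): keep results in a reusable form for later comparison."""
    repository.append(result)
    return repository

transactions = [
    {"postcode": "EC1A", "amount": 120.0},
    {"postcode": "SW1A", "amount": 60.0},
    {"postcode": "EC1A", "amount": 30.0},
]
repo = []
weights = represent(acquire_and_analyse(transactions))
store_result(repo, interact(weights, "EC1A"))
print(repo)  # [{'selected': 'EC1A', 'weight': 1.0}]
```

Each layer consumes only the previous layer's output, which is the property that lets the representation or interaction layer be swapped (for example, for a map mashup) without touching acquisition.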

    An Effective End-User Development Approach Through Domain-Specific Mashups for Research Impact Evaluation

    Over the last decade, there has been growing interest in assessing the performance of researchers, research groups, universities and even countries. The assessment of productivity is an instrument to select and promote personnel, assign research grants and measure the results of research projects. One particular assessment approach is bibliometrics, i.e., the quantitative analysis of scientific publications through citation and content analysis. However, there is little consensus today on how research evaluation should be performed, and it is commonly acknowledged that the quantitative metrics available today are largely unsatisfactory. A number of different scientific data sources available on the Web (e.g., DBLP, Google Scholar) are used for such analysis purposes. Taking data from these diverse sources, performing the analysis and visualizing the results in different ways is not a trivial and straightforward task. Moreover, the people involved in such evaluation processes are not always IT experts and hence are not able to crawl data sources, merge them and compute the needed evaluation procedures. The recent emergence of mashup tools has refueled research on end-user development, i.e., on enabling end-users without programming skills to produce their own applications. We believe that the heart of the problem is that it is impractical to design tools that are generic enough to cover a wide range of application domains, powerful enough to enable the specification of non-trivial logic, and simple enough to be actually accessible to non-programmers. This thesis presents a novel approach to effective end-user development, specifically for non-programmers: we introduce a domain-specific approach to mashups that "speaks the language of the user", i.e., that is aware of the terminology, concepts, rules, and conventions (the domain) the user is comfortable with. Comment: This PhD dissertation consists of 206 pages.
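The citation-based metrics the abstract refers to are simple to compute once the data has been gathered; the difficulty the thesis addresses lies in sourcing and merging that data. As a concrete illustration, the widely used h-index can be computed as below (the citation counts are hypothetical).

```python
def h_index(citations):
    """h-index: the largest h such that the author has h papers
    with at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts, as might be merged from sources
# such as DBLP or Google Scholar.
print(h_index([25, 8, 5, 3, 3, 1]))  # 3
```

Here three papers have at least 3 citations each, but fewer than four papers have at least 4, so the index is 3. Metrics like this are exactly the kind of evaluation procedure the proposed domain-specific mashups would let non-programmers assemble.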

    Mash-ups of Information Services for Promoting Higher Education Institution in Hanoi, Vietnam

    Today is the Information Era, with its defining symbol, Internet technology. Anyone can become an Internet user at any time; the Internet is everywhere, and users rely on it to search for, find and collect any information they need. By now, almost everyone has heard of Web 2.0 technologies and applications. The development from Web 1.0 to Web 2.0 was considered a breakthrough in information technology: Web 2.0 enables far more efficient information sharing, collaboration and business processes. Mash-ups are one outcome of the Web 2.0 paradigm that has been widely accepted and used around the world, and their role and effect in modern life is obvious. At the moment, mash-ups are mainly used for less fundamental tasks, such as customized queries and map-based visualizations; but compared to a few years ago, their development and application are becoming more popular by the day, with higher demands, and in the future they have the potential to be used for more fundamental, complex and sophisticated tasks. Finding, searching, collecting and using information on the Internet remains a technology challenge for Vietnam in general and Hanoi in particular, a developing country focused on agriculture. Combining these two observations, the developers created "Mash-ups of Information Services for Promoting Higher Education Institutions in Hanoi, Vietnam" as their Final Year Project. To use it, users only need an Internet connection; they can then find, search, collect and compare all the important and necessary information about universities in Hanoi, Vietnam. This is useful for users because it minimizes the effort, time and money spent on searching, and the system is also easy to understand and use.
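The core of such a mash-up is merging records about the same institution from multiple services into one comparable view. A minimal sketch of that merge step, with entirely hypothetical source data and field names:

```python
# Minimal sketch of the merge step in a university-information mash-up.
# Both sources and all field values are hypothetical.

SOURCE_A = {"Hanoi University of Science": {"tuition_usd": 900}}
SOURCE_B = {"Hanoi University of Science": {"district": "Thanh Xuan"}}

def merge_sources(*sources):
    """Combine per-university records from several services into one view."""
    merged = {}
    for source in sources:
        for university, info in source.items():
            merged.setdefault(university, {}).update(info)
    return merged

print(merge_sources(SOURCE_A, SOURCE_B))
# {'Hanoi University of Science': {'tuition_usd': 900, 'district': 'Thanh Xuan'}}
```

A real mash-up would fetch each source over HTTP and reconcile naming differences between services, but the aggregation pattern is the same.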

    A human factors perspective on volunteered geographic information

    This thesis takes a multidisciplinary approach to understanding the unique abilities of Volunteered Geographic Information (VGI) to enhance the utility of online mashups in ways not achievable with Professional Geographic Information (PGI). The key issues currently limiting the successful use of VGI are concerns about the quality, accuracy and value of the information, as well as the polarisation and bias of views within the user community. This thesis reviews different theoretical approaches in Human Factors, Geography, Information Science and Computer Science to help understand the notion of user judgements of VGI within an online environment (Chapter 2). Research methods relevant to a human factors investigation are also discussed (Chapter 3). The scoping study established fundamental insights into the terminology and nature of VGI and PGI; a range of users were engaged through a series of qualitative interviews. This led to the development of a framework on VGI (Chapter 4) and a comparative description of users in relation to one another through a value framework (Chapter 5). Study Two was a qualitative multi-method investigation into how users perceive VGI and PGI in use (Chapter 6), demonstrating both similarities and the unique ability of VGI to provide utility to consumers. Study Three brought insight into the specific abilities of VGI to enhance user judgements of online information within an information relevance context (Chapters 7 and 8). Drawing on the outcomes of these studies, the thesis discusses how users perceive VGI as different from PGI in terms of its benefit to consumers from a user-centred design perspective (Chapter 9), in particular the degree to which user concerns are valid, the limitations of VGI in application, and its potential strengths in enriching the experiences of consumers engaged in an information search.
In conclusion, specific contributions and avenues for further work are highlighted (Chapter 10).

    Improved clinical investigation and evaluation of high-risk medical devices: the rationale and objectives of CORE-MD (Coordinating Research and Evidence for Medical Devices).

    In the European Union (EU), the delivery of health services is a national responsibility, but there are concerted actions between member states to protect public health. Approval of pharmaceutical products is the responsibility of the European Medicines Agency, while authorising the placing on the market of medical devices is decentralised to independent 'conformity assessment' organisations called notified bodies. The first legal basis for an EU system of evaluating medical devices and approving their market access was the Medical Device Directive, from the 1990s. Uncertainties about clinical evidence requirements, among other reasons, led to the EU Medical Device Regulation (2017/745) that has applied since May 2021. It provides general principles for clinical investigations but few methodological details, which challenges responsible authorities to set appropriate balances between regulation and innovation, pre- and post-market studies, and clinical trials and real-world evidence. Scientific experts should advise on methods and standards for assessing and approving new high-risk devices, and safety, efficacy, and transparency of evidence should be paramount. The European Commission recently awarded a Horizon 2020 grant to a consortium led by the European Society of Cardiology and the European Federation of National Associations of Orthopaedics and Traumatology, which will review methodologies of clinical investigations, advise on study designs, and develop recommendations for aggregating clinical data from registries and other real-world sources. The CORE-MD project (Coordinating Research and Evidence for Medical Devices) will run until March 2024. Here, we describe how it may contribute to the development of regulatory science in Europe. Cite this article: EFORT Open Rev 2021;6:839-849. DOI: 10.1302/2058-5241.6.210081