
    Information Outlook, December 2005

    Volume 9, Issue 12

    Can Machines Learn to Detect Fake News? A Survey Focused on Social Media

    Through a systematic literature review, in this work we searched classical electronic libraries for the most recent papers on fake news detection on social media. Our goal is to map the state of the art of fake news detection, to define fake news, and to identify the most useful machine learning techniques for the task. We conclude that the most used method for automatic fake news detection is not a single classical machine learning technique, but rather an amalgamation of classic techniques coordinated by a neural network. We also identified the need for a domain ontology that would unify the differing terminology and definitions of the fake news domain; this lack of consensual information may mislead opinions and conclusions.
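    The "amalgamation of classic techniques coordinated by a neural network" that the survey identifies can be illustrated as a stacked ensemble: several classical text classifiers whose predictions are combined by a small neural-network meta-learner. The toy dataset, the TF-IDF features, and all model parameters below are illustrative assumptions, not the setup of any surveyed paper.

    ```python
    # Sketch of a stacked ensemble for fake news detection:
    # classical base learners coordinated by an MLP meta-learner.
    from sklearn.pipeline import make_pipeline
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.ensemble import StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.svm import LinearSVC
    from sklearn.neural_network import MLPClassifier

    # Tiny hypothetical dataset: 1 = fake, 0 = genuine.
    texts = [
        "official sources confirm the new policy details",
        "press release: quarterly figures published today",
        "city council approves budget after public hearing",
        "study results released in peer-reviewed journal",
        "SHOCKING cure doctors don't want you to know",
        "you won't believe what this celebrity did next",
        "secret plot revealed, share before it is deleted",
        "miracle trick makes you rich overnight",
    ]
    labels = [0, 0, 0, 0, 1, 1, 1, 1]

    # The cross-validated outputs of the classical base learners become
    # the input features of the neural network that coordinates them.
    model = make_pipeline(
        TfidfVectorizer(),
        StackingClassifier(
            estimators=[
                ("nb", MultinomialNB()),
                ("lr", LogisticRegression(max_iter=1000)),
                ("svm", LinearSVC()),
            ],
            final_estimator=MLPClassifier(
                hidden_layer_sizes=(8,), max_iter=2000, random_state=0
            ),
            cv=2,
        ),
    )
    model.fit(texts, labels)
    preds = model.predict(texts)
    ```

    In the surveyed papers the base learners, features, and coordinating network vary widely; the sketch only shows the overall stacked structure the survey reports as most common.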

    Stimulating Personal Development and Knowledge Sharing

    Koper, R., Stefanov, K., & Dicheva, D. (Eds.) (2009). Proceedings of the 5th International TENCompetence Open Workshop "Stimulating Personal Development and Knowledge Sharing", October 30-31, 2008, Sofia, Bulgaria: TENCompetence Workshop. The fifth open workshop of the TENCompetence project took place in Sofia, Bulgaria, from 30th to 31st October 2008. These proceedings contain the papers that were accepted for publication by the Program Committee. The work on this publication has been sponsored by the TENCompetence Integrated Project, which is funded by the European Commission's 6th Framework Programme, priority IST/Technology Enhanced Learning, Contract 027087 [http://www.tencompetence.org]

    From Social Data Mining to Forecasting Socio-Economic Crisis

    Socio-economic data mining has great potential for gaining a better understanding of problems that our economy and society are facing, such as financial instability, shortages of resources, or conflicts. Without large-scale data mining, progress in these areas seems hard or impossible. Therefore, a suitable, distributed data mining infrastructure and research centers should be built in Europe. It also appears appropriate to build a network of Crisis Observatories. They can be imagined as laboratories devoted to the gathering and processing of enormous volumes of data on both natural systems such as the Earth and its ecosystem, as well as on human techno-socio-economic systems, so as to gain early warnings of impending events. Reality mining provides the chance to adapt more quickly and more accurately to changing situations. Further opportunities arise from individually customized services, which however should be provided in a privacy-respecting way. This requires the development of novel ICT (such as a self-organizing Web), but most likely new legal regulations and suitable institutions as well. As long as such regulations are lacking on a world-wide scale, it is in the public interest that scientists explore what can be done with the huge data available. Big data do have the potential to change or even threaten democratic societies. The same applies to sudden and large-scale failures of ICT systems. Therefore, dealing with data must be done with a large degree of responsibility and care. Self-interests of individuals, companies or institutions have limits where the public interest is affected, and public interest is not a sufficient justification to violate the human rights of individuals. Privacy is a high good, as confidentiality is, and damaging it would have serious side effects for society. Comment: 65 pages, 1 figure, Visioneer White Paper, see http://www.visioneer.ethz.c

    Exploring Volunteered Geographic Information (VGI) for Emergency Management: Toward a Wiki GIS Framework

    The past three years have witnessed unprecedented growth of user-generated volunteered geographic information (VGI) on the Web. Although scholars, decision makers, and citizens have recognized the potential value of VGI in emergency management, there exists no rigorous study on the availability, quality, and feasibility of VGI for applications related to emergency management. This dissertation applies methodologies of GIScience and computer science to present an overview of VGI and explore its value in emergency management, with the goal of developing a wiki GIS approach for community emergency preparedness. This dissertation research concludes that VGI and wiki GIS represent a new development in public participation in the production and use of geographic information. In emergency management, VGI and wiki GIS suggest a new approach to incorporating the general public in emergency response activities. By incorporating VGI in emergency management, official agencies and the general public gain better situational awareness.

    Social Computing: Study on the Use and Impacts of Collaborative Content

    Collaborative content, created with Web 2.0 technologies, is part of the social computing phenomenon. The key feature of collaborative content is that it is created, reviewed, refined, enhanced and shared through the interactions and contributions of a number of people. The report provides an assessment of the use, adoption and impact of collaborative content applications, giving an in-depth description of YouTube, Wikipedia and blogging, and discussing the socio-economic impacts and challenges of the collaborative content phenomenon. The great variety of collaborative content applications is providing people with access to a great diversity of content and information, new relations to other people based on common interests, and a new tool for collaboration. Organizations cannot avoid responding to the challenges arising, but there are various ways in which they can also benefit from the opportunities available. A major challenge is how to nurture a responsible digital culture, where users adopt a critical attitude in both creating and using the content, and where the collaborative communities have sustainable models for participation and content quality management.

    Mobile Learning with Micro-content: A Framework and Evaluation

    Micro-learning (ML) combines micro-content delivery with a sequence of micro interactions which enable users to learn without information overload. This has the potential to enable better learning results in terms of retention of propositional content. Learners familiar with Web 2.0 technologies, like Tweets and SMS, expect a personalized learning solution, and the KnowledgePulse (KP) system researched and developed by the RSA FG delivers this in a work context. ML has potential for enhancing mobile learning, which has lacked success despite the explosive popularity of mobile devices. This paper presents the micro-learning approach and the KP system that delivers micro-content on mobile devices and allows learning anytime, anyplace, and at any pace. Three case studies of different product stages of KP are reported, with 100+ users in three settings. Results show high usage levels and good learner satisfaction. These preliminary results provide encouraging signs for the further development of micro-learning systems. Future research needs to expand to a much larger scale and also develop an evaluation framework which can serve as a standard to investigate how micro and mobile learning can be integrated to create more effective learning.

    Dagstuhl News January - December 2008

    "Dagstuhl News" is a publication edited especially for the members of the Foundation "Informatikzentrum Schloss Dagstuhl" to thank them for their support. The News gives a summary of the scientific work being done in Dagstuhl. Each Dagstuhl Seminar is presented by a short abstract describing the contents and scientific highlights of the seminar, as well as the perspectives or challenges of the research topic.

    A treatise on Web 2.0 with a case study from the financial markets

    There has been much hype in vocational and academic circles surrounding the emergence of web 2.0, or social media; however, relatively little work has been dedicated to substantiating the actual concept of web 2.0. Many have dismissed it as not deserving of this new title, since the term web 2.0 assumes a certain interpretation of web history, including enough progress in a certain direction to trigger a succession [i.e. web 1.0 → web 2.0]. Others have provided arguments in support of this development, and there has been a considerable amount of enthusiasm in the literature. Much research has focused on evaluating the current use of web 2.0 and analysing user-generated content, but an objective and thorough assessment of what web 2.0 really stands for has been to a large extent overlooked. More recently, the idea of collective intelligence facilitated via web 2.0, and its potential applications, have raised interest among researchers, yet a more unified approach and further work in the area of collective intelligence are needed. This thesis identifies and critically evaluates a wider context for the web 2.0 environment and what caused it to emerge, providing a rich literature review on the topic, a review of existing taxonomies, a quantitative and qualitative evaluation of the concept itself, and an investigation of the collective intelligence potential that emerges from application usage. Finally, a framework for harnessing collective intelligence in a more systematic manner is proposed. In addition to the presented results, novel methodologies are also introduced throughout this work. In order to provide interesting insight and to illustrate the analysis, a case study of the recent financial crisis is considered. Some interesting results relating to the crisis are revealed within user-generated content data, and relevant issues are discussed where appropriate.