7,450 research outputs found

    The three 'W' of the World Wide Web call for the three 'M' of a Massively Multidisciplinary Methodology

    Get PDF
    This position paper defends the idea that developing the Web to its full potential requires addressing the challenge of massive multidisciplinarity. It was triggered by the topic of a joint panel at the conferences CLOSER and WEBIST 2014. The topic of the panel was "social, political and economic implications of the cloud and the Web". I focused on the Web, and the position that I defended during the panel, and that I report here, is that, while implications of the Web can be identified in the social, political and economic domains, the global challenge raised by the Web is the need for massive multidisciplinarity to lead it to its full potential, which goes beyond any individual prediction. This article starts with three sections respectively confirming the social, political and economic impacts of the Web. The fourth section shows that many other domains are in fact impacted, and the fifth section proposes that this spread is due to several existential characteristics of the Web. The sixth section concludes by insisting on the importance of preserving the open nature of the Web, of assisting it, and of addressing the challenge of a massive multidisciplinary approach to developing the Web.
    Macroservers: An Execution Model for DRAM Processor-In-Memory Arrays

    Get PDF
    The emergence of semiconductor fabrication technology allowing a tight coupling between high-density DRAM and CMOS logic on the same chip has led to the important new class of Processor-In-Memory (PIM) architectures. Newer developments provide powerful parallel processing capabilities on the chip, exploiting the facility to load wide words in single memory accesses and supporting complex address manipulations in memory. Furthermore, large arrays of PIMs can be arranged into a massively parallel architecture. In this report, we describe an object-based programming model based on the notion of a macroserver. Macroservers encapsulate a set of variables and methods; threads, spawned by the activation of methods, operate asynchronously on the variables' state space. Data distributions provide a mechanism for mapping large data structures across the memory region of a macroserver, while work distributions allow explicit control of bindings between threads and data. Both data and work distributions are first-class objects of the model, supporting the dynamic management of data and threads in memory. This offers the flexibility required for fully exploiting the processing power and memory bandwidth of a PIM array, in particular for irregular and adaptive applications. Thread synchronization is based on atomic methods, condition variables, and futures. A special type of lightweight macroserver allows the formulation of flexible scheduling strategies for access to resources, using a monitor-like mechanism.
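
    The report's model is described above only in prose; the following is a rough, hypothetical sketch in Python (not the paper's host language) of the ingredients the abstract names: an object encapsulating a state space, methods whose activation spawns asynchronous threads, atomic access to shared data, and futures for synchronization. All identifiers (MacroServer, invoke, scale_chunk) are invented for illustration.

```python
# Minimal sketch only: mirrors the macroserver ingredients named in the
# abstract (encapsulated state, asynchronously executed methods, atomic
# updates, futures) using standard-library threading primitives.
from concurrent.futures import Future, ThreadPoolExecutor
import threading


class MacroServer:
    """Hypothetical analogue of a macroserver: owns a state space and
    spawns a worker thread for every activated method."""

    def __init__(self, state: dict):
        self._state = state
        self._lock = threading.Lock()            # stands in for atomic methods
        self._pool = ThreadPoolExecutor(max_workers=8)

    def invoke(self, method, *args) -> Future:
        """Activate a method asynchronously; the returned future plays the
        role of the model's futures for thread synchronization."""
        return self._pool.submit(method, self._state, self._lock, *args)


def scale_chunk(state, lock, key, factor):
    """Example 'method': update one named chunk of the shared state; the
    chunk stands in for a piece of a distributed data structure."""
    with lock:                                   # atomic update of the chunk
        state[key] = [x * factor for x in state[key]]
    return len(state[key])


if __name__ == "__main__":
    server = MacroServer({"chunk0": [1, 2, 3], "chunk1": [4, 5, 6]})
    futures = [server.invoke(scale_chunk, k, 10) for k in ("chunk0", "chunk1")]
    print([f.result() for f in futures])         # block on the futures
```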

    Living Innovation Laboratory Model Design and Implementation

    Full text link
    The Living Innovation Laboratory (LIL) is an open and recyclable way for multidisciplinary researchers to remotely control resources and co-develop user-centered projects. In the past few years, several papers about LIL have been published that discuss and define its model and architecture. Three characteristics of LIL are widely acknowledged: it is user centered, based on co-creation, and context aware, which distinguishes it from test platforms and other innovation approaches. Its existing model consists of five phases: initialization, preparation, formation, development, and evaluation. Goal Net is a goal-oriented methodology for formalizing a process. In this thesis, Goal Net is adopted to derive a detailed and systematic methodology for LIL. The LIL Goal Net Model breaks the five phases of LIL into more detailed steps. Big data, crowdsourcing, crowdfunding and crowd testing take place in suitable steps to realize UUI, MCC and PCA throughout the innovation process in LIL 2.0. It can serve as a guideline for any company or organization to develop a project in the form of an LIL 2.0 project. To demonstrate the feasibility of the LIL Goal Net Model, it was applied to two real cases: one project is a Kinect game and the other is an Internet product. Both were successfully transformed to LIL 2.0 using the Goal Net based methodology. The two projects were evaluated by phenomenography, a qualitative research method for studying human experiences and their relations in the hope of finding better ways to improve them. Through this phenomenographic study, the positive evaluation results showed that the new generation of LIL has advantages in terms of effectiveness and efficiency. Comment: this is a book draft.
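
    The abstract names the five LIL phases and states that Goal Net refines them into more detailed steps. As an illustration only (not taken from the thesis), the sketch below assumes a Goal Net can be viewed as goal states connected by transitions and models the five phases as top-level goals; all class and variable names are hypothetical.

```python
# Illustrative sketch: a Goal-Net-like structure with the five LIL phases
# as top-level goals connected by transitions (assumed representation).
from dataclasses import dataclass, field


@dataclass
class Goal:
    name: str
    subgoals: list = field(default_factory=list)   # refinement into detailed steps


@dataclass
class GoalNet:
    goals: list
    transitions: list                               # (from_goal, to_goal) name pairs

    def next_goals(self, current: str):
        """Goals reachable from the current goal via a single transition."""
        return [dst for src, dst in self.transitions if src == current]


# The five LIL phases named in the abstract, chained in order.
phases = [Goal(p) for p in
          ("initialization", "preparation", "formation", "development", "evaluation")]
lil_net = GoalNet(goals=phases,
                  transitions=[(a.name, b.name) for a, b in zip(phases, phases[1:])])

print(lil_net.next_goals("formation"))              # ['development']
```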

    Web Science: expanding the notion of Computer Science

    No full text
    Academic disciplines which practice in the context of rapid external change face particular problems when seeking to maintain timely, current and relevant teaching programs. In different institutions faculty will tune and update individual component courses, while more radical revisions are typically department-wide strategic responses to perceived needs. Internationally, the ACM has sought to define curriculum recommendations since the 1960s and recognizes the diversity of the computing disciplines with its 2005 overview volume. The consequent rolling program of revisions is demanding in terms of time and effort, but an inevitable response to the change inherent in our family of specialisms. Preparation for the Computer Curricula 2013 is underway, so it seems appropriate to ask what place Web Science will have in the curriculum landscape. Web Science has been variously described, the most concise definition being the ‘science of decentralized information systems’. Web Science is fundamentally interdisciplinary, encompassing the study of the technologies and engineering which constitute the Web, alongside emerging associated human, social and organizational practices. Furthermore, to date little teaching of Web Science takes place at undergraduate level. Some questions emerge: is Web Science a transient artifact? Can Web Science claim a place in the ACM family? Is Web Science an exotic relative with a home elsewhere? This paper discusses the role and place of Web Science in the context of the computing disciplines. It provides an account of work which has been established towards defining an initial curriculum for Web Science, with plans for future developments utilizing novel methods to support and elaborate curriculum definition and review. The findings of a desk survey of existing related curriculum recommendations are presented. The paper concludes with recommendations for future activities which may help us determine whether we should expand the notion of computer science.

    "Needless to Say My Proposal Was Turned Down": The Early Days of Commercial Citation Indexing, an "Error-making" (Popper) Activity and Its Repercussions Till Today

    Get PDF
    In today’s neoliberal audit cultures, university rankings, the quantitative evaluation of publications by JIF (Journal Impact Factor) and of researchers by h-index (Hirsch index) are believed to be indispensable instruments for “quality assurance” in the sciences. Yet there is increasing resistance against “impactitis” and “evaluitis”. What is usually overlooked is that trivial errors in Thomson Reuters’ citation indexes (SCI, SSCI, AHCI) produce severe non-trivial effects: their victims are authors, institutions and journals with names beyond the ASCII code, and scholars of the humanities and social sciences. Analysing the “Joshua Lederberg Papers” (provided by the National Library of Medicine), I want to show that the eventually successful ‘invention’ of science citation indexing (more precisely: its transfer from the juridical field to the field of science) is a product of contingent factors. To overcome severe resistance, Eugene Garfield, the “father” of citation indexing, had to foster overoptimistic attitudes and to downplay the severe problems connected to global and multidisciplinary citation indexing. The difficulties of handling different formats of references and footnotes, non-Anglo-American names, and publications in non-English languages were known to the pioneers of citation indexing. Nowadays the huge for-profit North American media corporation Thomson Reuters is the owner of the citation databases founded by Garfield. Thomson Reuters’ influence on funding decisions, individual careers, departments, universities, disciplines and countries is immense and ambivalent. Huge technological systems exhibit heavy inertia. This insight from technology studies applies to the large citation indexes of Thomson Reuters, too.
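
    For readers unfamiliar with the metric under discussion, the h-index has a simple operational definition: the largest h such that an author has at least h publications with at least h citations each. A minimal sketch of that computation (illustrative only, not from the paper):

```python
# h-index (Hirsch index): largest h such that at least h papers
# have at least h citations each.
def h_index(citations):
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):   # rank-th most cited paper
        if cites >= rank:
            h = rank
        else:
            break
    return h


print(h_index([10, 8, 5, 4, 3]))   # 4: four papers each have at least 4 citations
```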

    Helping Business Schools Engage with Real Problems: The Contribution of Critical Realism and Systems Thinking

    Get PDF
    The world faces major problems, not least climate change and the financial crisis, and business schools have been criticised for their failure to help address these issues and, in the case of the financial meltdown, for being causally implicated in it. In this paper we begin by describing the extent of what has been called the rigour/relevance debate. We then diagnose the nature of the problem in terms of the historical, structural and contextual mechanisms that initiated and now sustain the inability of business schools to engage with real-world issues. We then propose a combination of mutually reinforcing measures that are necessary to break this vicious circle: critical realism as an underpinning philosophy that supports and embodies the following points; holism and transdisciplinarity; multimethodology (mixed-methods research); and a critical and ethically committed stance. OR and management science have much to contribute in terms of both powerful analytical methods and problem structuring methods.

    Massively Multiplayer Online Gamers’ Language: Argument for an M-Gamer Corpus

    Get PDF
    The past few decades have seen a steady, and sometimes rapid, rise in the production and consumption of Massively Multiplayer Online Games (MMOGs), spanning a global arena. Players from a wide variety of demographic, economic, geographical, cultural and linguistic backgrounds congregate under the banner of MMOGs and spend a considerable amount of time interacting and communicating with one another, in the context of playing and socializing through such playing. It is only logical, then, to see such players become part of larger and extended socio-communal landscapes, wherein they may appropriate multiple roles in conjunction with their MMOG player roles, such as teachers, learners, family members and workplace cohorts. It is equally logical for a curious mind to speculate about the effects of the communication and language characteristics of such gamers on themselves and the greater communities they may inhabit, to investigate the realms of such possibilities, and to share the knowledge garnered from such investigations. That is precisely what this study and paper are about. In this paper, I report the findings of an investigation of the communication and language characteristics of MMOG players, using 23 participants for interviews and journal writing, as well as multiple online documents. The findings suggest that MMOG players share some unique communication and language patterns, on the basis of which they can justifiably be categorized as a subculture with their own corpus. Implications for researchers and practitioners are also discussed.

    Web Science, Artificial Intelligence and Intelligence Augmentation (in Dagstuhl Perspectives Workshop 18262 - 10 Years of Web Science: Closing The Loop)

    Get PDF
    This abstract (from Dagstuhl Perspectives Workshop 18262 - 10 Years of Web Science: Closing The Loop) summarizes some challenges and opportunities at the intersection of Web Science, Artificial Intelligence and Intelligence Augmentation.

    CGAMES'2009

    Get PDF