
    Comparing user experiences in using Twiki & Mediawiki to facilitate collaborative learning

    This research seeks to determine the perceived effectiveness of using TWiki and MediaWiki for collaborative work and knowledge management, and to compare the two platforms in terms of user experience at the master’s level of study at the University of Hong Kong. Through a multiple case study approach, the study adopted a mixed methods research design, using both quantitative and qualitative methods to analyze findings from specific user groups in two study programmes. Both wiki platforms were regarded as suitable tools for the co-construction of group work and were found to be effective in improving group collaboration and work quality. Wikis were also viewed as enabling tools for knowledge management. MediaWiki was rated more favorably than TWiki, especially in ease of use and enjoyment. The paper should be of interest to educators who may want to explore wikis as a platform to enhance students’ collaborative group work. Postprint. The 6th International Conference on Knowledge Management (ICKM 2009), Hong Kong, 3-4 December 2009. In Proceedings of ICKM, 2009, p. 1-1.

    Reduce Response Time: Get Hooked on a Wiki

    Managing the flow of information both within the IT department and to our customers is one of our greatest challenges in the Office of Technology Information at Valparaiso University. To be successful, IT staff first need to acquire the right information from colleagues to provide excellent service. Then, the staff must determine the most effective way to communicate that information to internal and external customers to encourage the flow of information. To advance the IT department’s goals, how can we best use “information” and “communication” vehicles to exchange information, improve workflow, and ultimately communicate essential information to our internal and external customers? We’ve asked ourselves this question and have resolved that “information” and “communication” need to work cooperatively! How better than with a wiki? Recent changes in departmental structure gave us the opportunity to examine our communication vehicles, specifically the software tools we use to facilitate the flow of information. Our previous knowledge base, First Level Support (FLS), a module of the HEAT support software produced by FrontRange Solutions, once met our needs as an internal knowledge base solution. We realized we had outgrown FLS and needed a more robust alternative, and our student employees asked for a newer, more interactive method of sharing information. With the assistance of our UNIX systems administrator, we investigated various options and decided to implement the MediaWiki system. As we had anticipated, the wiki reduced the time customers must wait for an answer to an inquiry. What we didn’t realize was that it would meet many more needs than we had expected: it has also given us increased collaboration, an online knowledge base, and a training tool for staff. Come see how a sprinkle of pixie dust improved communication through adoption of the wiki and brought information to the forefront of our operations.
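    The abstract does not describe any programmatic interface to the knowledge base, but as an illustration of the more interactive access a MediaWiki installation can offer, here is a minimal sketch that queries a wiki's built-in search through the standard MediaWiki action API. The wiki endpoint and search term below are hypothetical placeholders, not details from the presentation.

        # Hedged sketch: search a departmental MediaWiki knowledge base through
        # the standard MediaWiki action API (api.php). The endpoint and search
        # term are placeholders.
        import requests

        API_URL = "https://wiki.example.edu/w/api.php"  # placeholder wiki endpoint

        def search_knowledge_base(term, limit=5):
            """Return (title, snippet) pairs for pages matching the search term."""
            params = {
                "action": "query",
                "list": "search",
                "srsearch": term,
                "srlimit": limit,
                "format": "json",
            }
            response = requests.get(API_URL, params=params, timeout=10)
            response.raise_for_status()
            return [(hit["title"], hit["snippet"])
                    for hit in response.json()["query"]["search"]]

        if __name__ == "__main__":
            for title, snippet in search_knowledge_base("password reset"):
                print(title, "-", snippet)

    In a setup like this, first-level support staff could script routine lookups against the same wiki content that customers browse, which is one way the reduced response time claimed above could be realized.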

    Project Oriented Immersion Learning: "Building online digital products for a cyberspace publishing house"


    Wikis supporting PLM and Technical Documentation

    Over the last few years, Wikis have emerged as powerful tools for collaborative documentation on the Internet. The encyclopaedia Wikipedia has become a reference, and the power of community editing in a Wiki allows knowledge to be captured from contributors all over the world. Using a Wiki for technical documentation, with hyperlinks to other data sources such as a Product Lifecycle Management (PLM) system, provides a very effective collaboration tool, as information can easily be fed into the system throughout the project life cycle. The Wiki approach has proved particularly successful for software and hardware projects with rapidly evolving documentation. Certain Wiki implementations, such as TWiki, are project-oriented and include functionality such as automatic page revisioning. This paper addresses the use of TWiki to document hardware and software projects at CERN, from the requirements and brainstorming phase to end-product documentation. Two examples are covered: large-scale engineering for the ATLAS Experiment, and a network management software project.
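    The paper's central idea is hyperlinking TWiki topics to records in a PLM system. As a hedged illustration of how such links could be audited, the sketch below fetches a topic's raw markup (many TWiki installations expose it via the view script's raw=on parameter) and lists the outbound links that point at a PLM host. The topic URL and PLM hostname are placeholders, not values from the paper.

        # Hedged sketch: list the PLM hyperlinks contained in one TWiki topic.
        # The topic URL and PLM hostname are illustrative placeholders.
        import re
        import requests

        TOPIC_URL = "https://twiki.example.org/twiki/bin/view/Docs/ProductManual"  # placeholder
        PLM_HOST = "plm.example.org"  # placeholder PLM server

        def plm_links_in_topic(topic_url, plm_host):
            """Return the PLM URLs referenced in the topic's raw markup."""
            # Request the raw topic text rather than the rendered HTML page.
            raw = requests.get(topic_url, params={"raw": "on"}, timeout=10).text
            # Catch both bare URLs and URLs inside [[url][label]] TWiki links.
            urls = re.findall(r"https?://[^\s\]|]+", raw)
            return sorted({u for u in urls if plm_host in u})

        if __name__ == "__main__":
            for url in plm_links_in_topic(TOPIC_URL, PLM_HOST):
                print(url)

    A periodic check of this kind would show which documentation pages still reference a given PLM item as the product definition evolves.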

    Review of ATLAS Software Documentation (February 8-9, 2006)

    Review of the ATLAS Offline Documentation: Web pages, WorkBook, TWiki, HyperNews, Doxygen

    Distributed Data Analysis in ATLAS

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data is managed by Don Quixote 2 (DQ2), a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the GangaRobot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the developers of the support burden.
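    To make the Ganga front end mentioned above concrete, the sketch below shows roughly what an Athena analysis job definition looks like in a Ganga session. The class and attribute names follow typical GangaAtlas usage but are assumptions rather than details taken from this abstract, and the job options file and dataset name are placeholders.

        # Illustrative sketch of a Ganga (GPI) job definition for an Athena
        # analysis submitted to PanDA-managed resources. Attribute names are
        # assumptions based on common GangaAtlas usage and may differ between
        # versions; the option file and dataset below are placeholders.
        j = Job()
        j.name = 'example_analysis'
        j.application = Athena()                          # Athena analysis application
        j.application.option_file = ['AnalysisJobOptions.py']   # placeholder job options
        j.inputdata = DQ2Dataset()                        # input dataset resolved through DQ2
        j.inputdata.dataset = 'data.physics_MinBias.ESD'  # placeholder dataset name
        j.splitter = DQ2JobSplitter(numsubjobs=10)        # split the job over the dataset
        j.backend = Panda()                               # PanDA backend; a local or batch backend uses the same interface
        j.submit()

    The same Job object can be pointed at a local or batch backend for testing before grid submission, which is the "common interface to the multitude of execution backends" described above.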