
    Neuroimaging study designs, computational analyses and data provenance using the LONI pipeline.

    Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges: management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large-scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu.
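
    To make the workflow idea concrete in code, the sketch below represents a small chain of processing steps as ordered modules that each record basic provenance when they run. This is a minimal plain-Python sketch under invented step names and placeholder commands; the LONI Pipeline itself defines, connects, and executes modules through its own graphical environment.

        # Illustrative sketch only: a toy linear workflow with simple provenance
        # tracking. Step names and command strings are invented placeholders and
        # do not reflect the LONI Pipeline's actual module definitions.
        from dataclasses import dataclass, field
        from datetime import datetime, timezone

        @dataclass
        class Step:
            name: str        # e.g. "skull_strip", "register_to_atlas"
            command: str     # placeholder for the external tool invocation this step wraps
            provenance: dict = field(default_factory=dict)

            def run(self, upstream_outputs):
                # Record which command produced this step's output, from which inputs, and when.
                self.provenance = {
                    "command": self.command,
                    "inputs": upstream_outputs,
                    "executed_at": datetime.now(timezone.utc).isoformat(),
                }
                return f"{self.name}.out"   # placeholder for a real output file

        def run_workflow(steps):
            """Execute steps in order, threading each output into the next step."""
            outputs = []
            for step in steps:
                outputs = [step.run(outputs)]
            return outputs, [s.provenance for s in steps]

        workflow = [
            Step("skull_strip", "skullstrip subject_T1.nii"),
            Step("register_to_atlas", "register stripped.nii atlas.nii"),
            Step("segment_tissue", "segment registered.nii"),
        ]
        results, provenance_log = run_workflow(workflow)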

    The Partial Evaluation Approach to Information Personalization

    Information personalization refers to the automatic adjustment of information content, structure, and presentation tailored to an individual user. By reducing information overload and customizing information access, personalization systems have emerged as an important segment of the Internet economy. This paper presents PIPE ('Personalization is Partial Evaluation'), a systematic modeling methodology for personalization. Personalization systems are designed and implemented in PIPE by modeling an information-seeking interaction in a programmatic representation. The representation supports the description of information-seeking activities as partial information and their subsequent realization by partial evaluation, a technique for specializing programs. We describe the modeling methodology at a conceptual level and outline representational choices. We present two application case studies that use PIPE for personalizing web sites and describe how PIPE suggests a novel evaluation criterion for information system designs. Finally, we mention several fundamental implications of adopting the PIPE model for personalization and when it is (and is not) applicable. Comment: Comprehensive overview of the PIPE model for personalization.
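
    To make the central idea concrete, the sketch below treats a site's information-seeking dialogue as a nested tree of labeled questions and specializes it against the inputs that are already known, leaving a residual dialogue for the user. The attribute names, pages, and the specialize helper are assumptions made for illustration and are not the PIPE implementation itself.

        # A minimal sketch of personalization as partial evaluation, assuming the
        # site's dialogue is modeled as a nested tree. All names below are invented.
        # Each interior node asks about one attribute; leaves are content pages.
        site = ("platform", {
            "linux":   ("experience", {"novice": "linux-quickstart.html",
                                       "expert": "linux-internals.html"}),
            "windows": ("experience", {"novice": "win-quickstart.html",
                                       "expert": "win-powertools.html"}),
        })

        def specialize(node, known):
            """Partially evaluate the dialogue: resolve questions whose answers are
            already known, and keep the rest as a residual dialogue for the user."""
            if isinstance(node, str):          # leaf: a content page
                return node
            attribute, branches = node
            if attribute in known:             # static input: evaluate the question away
                return specialize(branches[known[attribute]], known)
            # dynamic input: the question survives into the residual program
            return (attribute, {answer: specialize(child, known)
                                for answer, child in branches.items()})

        # A user known to run Linux gets a residual (personalized) site that only
        # asks about experience level.
        print(specialize(site, {"platform": "linux"}))
        # -> ('experience', {'novice': 'linux-quickstart.html', 'expert': 'linux-internals.html'})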

    Collaboration in the Semantic Grid: a Basis for e-Learning

    The CoAKTinG project aims to advance the state of the art in collaborative mediated spaces for the Semantic Grid. This paper presents an overview of the hypertext and knowledge-based tools which have been deployed to augment existing collaborative environments, and the ontology which is used to exchange structure, promote enhanced process tracking, and aid navigation of resources before, during, and after a collaboration. While the primary focus of the project has been supporting e-Science, this paper also explores the similarities and application of CoAKTinG technologies as part of a human-centred design approach to e-Learning.
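
    As a rough illustration of what exchanging collaboration structure through an ontology can look like, the sketch below records an issue raised during a collaboration session as RDF triples using the rdflib Python library. The namespace, class, and property names are invented for this sketch and are not the CoAKTinG ontology.

        # Hypothetical example: annotating a collaboration session with structured
        # metadata so that other tools can track and navigate it later.
        from rdflib import Graph, Literal, Namespace, URIRef
        from rdflib.namespace import RDF

        MEET = Namespace("http://example.org/meeting#")   # invented namespace, not CoAKTinG's

        g = Graph()
        session = URIRef("http://example.org/sessions/42")
        g.add((session, RDF.type, MEET.Discussion))
        g.add((session, MEET.raisedIssue, Literal("Which dataset should the pilot use?")))
        g.add((session, MEET.raisedBy, URIRef("http://example.org/people/alice")))

        # Serialize to Turtle so the structure can be shared between tools.
        print(g.serialize(format="turtle"))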

    Reflecting on the usability of research on culture in designing interaction

    The concept of culture has been attractive to producers of interactive systems who are willing to design useful and relevant solutions to users increasingly located in culturally diverse contexts. Despite a substantial body of research on culture and technology, interaction designers have not always been able to apply these research outputs to effectively define requirements for culturally diverse users. This paper frames this issue as one of understanding the different paradigms underpinning the cultural models being applied to interface development and research. Drawing on different social science theories, the authors discuss top-down and bottom-up perspectives in the study of users' cultural differences and discuss the extent to which each provides usable design knowledge. The case is made for combining bottom-up and top-down perspectives into a sociotechnical approach that can produce knowledge useful and usable by interaction designers. This is illustrated with a case study about the design of interactive systems for farmers in rural Kenya.

    The Cathedral and the bazaar: (de)centralising certitude in river basin management


    Using Technology Enabled Qualitative Research to Develop Products for the Social Good, An Overview

    This paper discusses the potential benefits of the convergence of three recent trends for the design of socially beneficial products and services: the increasing application of qualitative research techniques in a wide range of disciplines, the rapid mainstreaming of social media and mobile technologies, and the emergence of software as a service. A scenario is presented that facilitates the complex data collection, analysis, storage, and reporting required for the qualitative research recommended for designing relevant solutions that address the needs of the underserved. A pilot study is used as a basis for describing the infrastructure and services required to realize this scenario. Implications for the innovation of enhanced forms of qualitative research are presented.

    Infrastructure for Detector Research and Development towards the International Linear Collider

    The EUDET project was launched to create an infrastructure for developing and testing new and advanced detector technologies to be used at a future linear collider. The aim was to make experimentation and data analysis possible for institutes that could not otherwise realize them for lack of resources. The infrastructure comprised an analysis and software network, and instrumentation infrastructures for tracking detectors as well as for calorimetry. Comment: 54 pages, 48 pictures.

    Building a Tailored Text Messaging System for Smoking Cessation in Native American Populations

    A simple text message can be beneficial when starting new, healthy habits or encouraging vigilance against returning to poor ones. Text messages also have the advantage of being easily accessible for lower-income populations spread over a rural area, who may not be able to afford smartphones with apps or data plans. Users benefit the most from text messages that are customized for them, but personalization requires time and effort on the part of the user and the counselor. However, personalization that focuses on the cultural background of a pool of recipients, in addition to general personal preferences, can be a low-cost method of ensuring the best experience for patients interested in taking up new habits. In this paper, we discuss the development of a system for motivating users to quit smoking, designed for Native American users in South Dakota, which uses text messaging as a daily intervention method for patients. Our results show that focusing on modular message customization options and messages with a conversational tone best serves our goal of providing users with customization options that motivate them to live happy and healthy lifestyles.
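
    As a rough sketch of what modular message customization can look like, the toy example below fills conversational templates from a user's stored preferences. The template text, category names, and preference fields are invented for illustration; they are not the messages or data model used in the study described here.

        # Hedged sketch: hypothetical message modules. A real system would draw on
        # culturally tailored, clinician-reviewed content rather than these examples.
        import random

        TEMPLATES = {
            "encouragement": [
                "Good morning, {name}! Day {day} smoke-free. Keep it going!",
                "{name}, every day without a cigarette is a win. You're on day {day}!",
            ],
            "craving_support": [
                "Cravings usually pass within a few minutes, {name}. Try a short walk, or reach out to {support_contact}.",
            ],
        }

        def tailor_message(kind, user):
            """Pick a template of the requested kind and fill its slots from the
            user's stored preferences."""
            template = random.choice(TEMPLATES[kind])
            return template.format(**user)

        user = {"name": "Dana", "day": 12, "support_contact": "your quit coach"}
        print(tailor_message("encouragement", user))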