
    Neuroimaging study designs, computational analyses and data provenance using the LONI pipeline.

    Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges: management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include a distributed, grid-enabled infrastructure, a virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large-scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu.
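
    As a rough illustration of the workflow idea described above, the sketch below models a processing protocol as a directed acyclic graph of modules and records a provenance trail as it executes. It is a minimal sketch, not the LONI Pipeline API: the Module and execute names and the toy processing steps (dicom_to_nifti, tissue_segmentation) are all hypothetical.

```python
# Minimal sketch (not the actual LONI Pipeline API) of a neuroimaging
# workflow modeled as a DAG of processing modules, with a provenance
# trail recorded per execution. All names here are hypothetical.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Module:
    name: str                      # e.g. "skull_strip", "register_to_atlas"
    run: Callable[[dict], dict]    # transforms an input record into outputs
    inputs: list["Module"] = field(default_factory=list)

def execute(module: Module, data: dict, provenance: list[str]) -> dict:
    """Depth-first execution: run upstream modules first, then this one,
    appending each step to the provenance log for later replication."""
    for parent in module.inputs:
        data = execute(parent, data, provenance)
    data = module.run(data)
    provenance.append(f"{module.name} applied; outputs now {sorted(data)}")
    return data

# Toy two-step protocol: format conversion feeding a segmentation step.
convert = Module("dicom_to_nifti", lambda d: {**d, "nifti": True})
segment = Module("tissue_segmentation", lambda d: {**d, "labels": "GM/WM/CSF"},
                 inputs=[convert])

log: list[str] = []
result = execute(segment, {"subject": "ICBM-0001"}, log)
print("\n".join(log))   # the provenance trail: one line per module applied
```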

    Designing a novel virtual collaborative environment to support collaboration in design review meetings

    Project review meetings are part of the project management process and are organised to assess progress and resolve any design conflicts in order to avoid delays in construction. One of the key challenges during a project review meeting is to bring the stakeholders together and use this time effectively to address design issues as quickly as possible. Current technology solutions based on BIM or CAD are information-centric and do not allow project teams to collectively explore the design from a range of perspectives and brainstorm ideas when design conflicts are encountered. This paper presents a system architecture that can be used to support multi-functional team collaboration more effectively during such design review meetings. The proposed architecture illustrates how information-centric BIM or CAD systems can be made human- and team-centric to enhance team communication and problem solving. An implementation of the proposed system architecture has been tested for its utility, likability and usefulness during design review meetings. The evaluation results suggest that the collaboration platform has the potential to enhance collaboration among multi-functional teams.

    Fine Grained Component Engineering of Adaptive Overlays: Experiences and Perspectives

    Recent years have seen significant research carried out into peer-to-peer (P2P) systems. This work has focused on the styles and applications of P2P computing, from grid computation to content distribution; however, little investigation has been performed into how these systems are built. Component-based engineering is an approach that has seen successful deployment in the field of middleware development: functionality is encapsulated in ‘building blocks’ that can be dynamically plugged together to form complete systems, allowing efficient, flexible and adaptable systems to be built with lower overhead and development complexity. This paper presents an investigation into the potential of component-based engineering in the design and construction of peer-to-peer overlays, highlighting that the quality of these properties is dictated by the component architecture used to implement the system. Three reusable decomposition architectures are designed and evaluated using Chord and Pastry case studies. These demonstrate that significant improvements can be made over traditional design approaches, resulting in much more reusable, (re)configurable and extensible systems.
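
    To make the ‘building blocks’ idea concrete, the sketch below shows one way an overlay node might bind its routing behaviour behind an abstract component interface and rebind it at runtime. It is a minimal illustration under assumed interfaces (RoutingStrategy, OverlayNode); it is not the decomposition architectures evaluated in the paper.

```python
# Sketch of component-based overlay engineering: routing is one pluggable
# 'building block' behind an interface, so a node can be reconfigured
# dynamically. Interfaces and names are assumptions, not the paper's framework.
from abc import ABC, abstractmethod

class RoutingStrategy(ABC):
    """One building block: how the next hop for a key is chosen."""
    @abstractmethod
    def next_hop(self, key: int, neighbours: list[int]) -> int: ...

class ChordLikeRouting(RoutingStrategy):
    # Greedy clockwise routing on a 2^m identifier ring (Chord-style).
    def __init__(self, ring_bits: int = 8):
        self.size = 2 ** ring_bits
    def next_hop(self, key: int, neighbours: list[int]) -> int:
        # Pick the neighbour that most closely precedes the key on the ring.
        return min(neighbours, key=lambda n: (key - n) % self.size)

class OverlayNode:
    """Host that composes components; swapping one reconfigures the node
    without touching the rest of the system."""
    def __init__(self, node_id: int, routing: RoutingStrategy):
        self.node_id, self.routing = node_id, routing
        self.neighbours: list[int] = []
    def replace_component(self, routing: RoutingStrategy) -> None:
        self.routing = routing            # dynamic reconfiguration point
    def route(self, key: int) -> int:
        return self.routing.next_hop(key, self.neighbours)

node = OverlayNode(42, ChordLikeRouting())
node.neighbours = [50, 100, 200]
print(node.route(90))   # greedy choice among current neighbours -> 50
```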

    Methodological Approaches to Modeling Information Architecture of the Organization in the Conditions of Digital Economy

    For businesses, especially in the digital economy, it is important to develop theoretical and methodological justifications and practical recommendations for building an organization's information architecture as a holistic description of its key strategies relating to business, information, application systems and technologies, and of their impact on the organization's functions and business processes. The article discusses methodological approaches to modeling an organization's information architecture, using information management tools to help manage innovation in information systems (IS) and information technologies (IT). It substantiates the relevance of organizational provisions that determine how a business entity's business model is functionally integrated with the IS architecture. Analysis of the industrial standards for describing organizational architecture adopted by institutions such as the International Organization for Standardization (ISO), The Open Group and the Institute of Electrical and Electronics Engineers (IEEE) reveals that none of these standards is dominant, and none provides the teams responsible for architecture development with all the necessary tools, either from a methodological point of view or in terms of the templates used to describe the architecture. Recommendations are given on the theoretical and methodological substantiation and construction of an organization's information architecture as a complete description of its key strategies related to business, information, application systems and technologies, and of their impact on the organization's functions and business processes.

    Grid Global Behavior Prediction

    Complexity has always been one of the most important issues in distributed computing. From the first clusters to grid and now cloud computing, dealing correctly and efficiently with system complexity is the key to taking the technology a step further. In this sense, global behavior modeling is an innovative methodology aimed at understanding grid behavior. The main objective of this methodology is to synthesize the grid's vast, heterogeneous nature into a simple but powerful behavior model, represented in the form of a single, abstract entity with a global state. Global behavior modeling has proved very useful in managing grid complexity effectively but, in many cases, deeper knowledge is needed: it produces a descriptive model that could be greatly improved if extended not only to explain behavior but also to predict it. In this paper we present a prediction methodology whose objective is to define the techniques needed to create global behavior prediction models for grid systems. Such global behavior prediction can benefit grid management, especially in areas such as fault tolerance and job scheduling. The paper presents experimental results obtained in real scenarios in order to validate this approach.
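
    As a hedged illustration of the methodology, the sketch below collapses per-node measurements into a single global state and fits a simple one-step predictor over its history. Both the mean-load aggregation and the least-squares line fit are placeholder assumptions for illustration; the paper's actual prediction models are not specified here.

```python
# Sketch of global behavior prediction: per-node metrics are synthesized
# into one abstract global state, and a history of such states feeds a
# simple predictor. Aggregation and model choice are illustrative assumptions.
def global_state(node_loads: list[float]) -> float:
    """Collapse heterogeneous per-node data into a single abstract value,
    here just the mean load across the grid."""
    return sum(node_loads) / len(node_loads)

def predict_next(history: list[float]) -> float:
    """One-step forecast via a least-squares line fit over the state history;
    a real deployment would use richer time-series or learning models."""
    n = len(history)
    xs = range(n)
    x_mean, y_mean = (n - 1) / 2, sum(history) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history))
             / sum((x - x_mean) ** 2 for x in xs))
    return y_mean + slope * (n - x_mean)   # extrapolate to the next epoch x = n

snapshots = [[0.2, 0.4], [0.3, 0.5], [0.5, 0.7], [0.6, 0.8]]  # loads per epoch
history = [global_state(s) for s in snapshots]      # 0.3, 0.4, 0.6, 0.7
print(predict_next(history))   # forecast a scheduler could act on (~0.85)
```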

    Model Based Development of Quality-Aware Software Services

    Modelling languages and development frameworks support the functional and structural description of software architectures. But quality-aware applications require languages that can express QoS as a first-class concept during architecture design and service composition, and require existing tools and infrastructures to be extended with support for modelling, evaluating, managing and monitoring QoS aspects. In addition to its functional behaviour and internal structure, the developer of each service must consider the fulfilment of its quality requirements. If the service is flexible, the output quality depends both on the input quality and on the available resources (e.g., amounts of CPU execution time and memory). From a software engineering point of view, modelling quality-aware requirements and architectures requires support for describing quality concepts, support for analysing quality properties (e.g., model checking, consistency of quality constraints, and assembly of quality), and tool support for the transition from quality requirements to quality-aware architectures, and from quality-aware architectures to service run-time infrastructures. Quality management in run-time service infrastructures must support handling quality concepts dynamically. QoS-aware modelling frameworks and QoS-aware runtime management infrastructures must evolve together to achieve their integration.
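
    One way to picture QoS as a first-class concept is to attach declared quality contracts to services and check them at composition time, as in the minimal sketch below. The QoS dimensions, the Service type and the composable check are all invented for illustration and do not correspond to any specific framework's API.

```python
# Sketch of QoS as a first-class design concept: each service declares the
# input quality it requires and the output quality it offers under an assumed
# resource budget, so a composer can verify contracts before wiring services.
from dataclasses import dataclass

@dataclass(frozen=True)
class QoS:
    resolution: int      # e.g. level of detail the service consumes/produces
    latency_ms: int      # worst-case response time

@dataclass
class Service:
    name: str
    requires: QoS        # minimum acceptable input quality
    offers: QoS          # output quality under the assumed resource budget

def composable(upstream: Service, downstream: Service) -> bool:
    """Contract check: the upstream output must satisfy the downstream input
    requirement in every quality dimension (consistency of QoS constraints)."""
    out, need = upstream.offers, downstream.requires
    return out.resolution >= need.resolution and out.latency_ms <= need.latency_ms

decoder = Service("decoder", requires=QoS(240, 500), offers=QoS(480, 40))
analyser = Service("analyser", requires=QoS(480, 50), offers=QoS(480, 90))
print(composable(decoder, analyser))   # True: the assembly meets its QoS
```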

    Salford postgraduate annual research conference (SPARC) 2012 proceedings

    These proceedings bring together a selection of papers from the 2012 Salford Postgraduate Annual Research Conference (SPARC). They reflect the breadth and diversity of the research interests showcased at the conference, at which over 130 researchers from Salford, the North West and other UK universities presented their work. Twenty-one papers are collated here from the humanities, arts, social sciences, health, engineering, environment and life sciences, built environment, and business.
