    ASTRO Journals' Data Sharing Policy and Recommended Best Practices.

    Transparency, openness, and reproducibility are important characteristics in scientific publishing. Although many researchers embrace these characteristics, data sharing has yet to become common practice. Nevertheless, data sharing is becoming an increasingly important topic among societies, publishers, researchers, patient advocates, and funders, especially as it pertains to data from clinical trials. In response, ASTRO developed a data policy and guide to best practices for authors submitting to its journals. ASTRO's data sharing policy is that authors should indicate, in data availability statements, whether the data are being shared and, if so, how the data may be accessed.

    Standards and Costs for Quality Management of E-Learning Services

    The scale of technological development in the field of communications and information represents an irrefutable premise for significant changes in all spheres of human life. Combined with the advance of the Internet over the last decade, these changes form the perfect recipe for the emergence (ever since the 1990s), functioning, and development of flexible forms of remote work using information and communication technology. Education, marketing, advertising, and commerce are among the first domains where the impact of technology is strongest; the forms of manifestation are e-learning, cyber-marketing, online advertising, and electronic commerce. But the simple use of technology does not automatically ensure the success of these new forms of activity. The transformation of the traditional into the digital, of the classic into the virtual, must be accompanied by adequate support with respect to the quality of services, standards, platforms, and hardware and software technologies. In the educational domain, we must analyze the e-learning (or tele-education) phenomenon in its spectacular recent evolution. Quality is a landmark of major importance in every field of a modern, knowledge-based society. From the perspective of tele-education, quality assurance must focus on three main directions: the quality of the educational process itself (class/course materials, platform, technology, etc.); the quality of the instructor (professional training, qualification, specialization, pedagogic ability, teaching method, etc.); and the quality of the learner (prior training, knowledge base, involvement, willingness to learn, etc.). As in any activity, reporting against quality standards also implies an economic approach through quality costs: good products and quality services in e-learning are strongly linked to educational multimedia production and their associated costs.
    Keywords: quality, standards, e-learning, technology, cost

    Migrating agile methods to standardized development practice

    Situated process and quality frameworks offer a way to resolve the tensions that arise when introducing agile methods into standardized software development. For these to be successful, however, organizations must grasp the opportunity to reintegrate software development management, theory, and practice.

    A hybrid EAV-relational model for consistent and scalable capture of clinical research data

    Many clinical research databases are built for specific purposes, and their design is often guided by the requirements of their particular setting. Not only does this lead to issues of interoperability and reusability between research groups in the wider community but, within the project itself, changes and additions to the system may be implemented in an ad hoc way, making the system difficult to maintain and even more difficult to share. In this paper, we outline a hybrid Entity-Attribute-Value and relational approach for modelling data in light of frequently changing requirements, which enables the back-end database schema to remain static, improving the extensibility and scalability of an application. The model also facilitates data reuse. The methods used build on the modular architecture previously introduced in the CURe project.
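A hybrid EAV-relational design of the kind this abstract describes can be sketched as follows; the table and attribute names here are illustrative assumptions, not taken from the CURe project itself. Stable, frequently queried entities get conventional relational tables, while open-ended study attributes live in an Entity-Attribute-Value table, so new fields require no schema migration.

```python
import sqlite3

# Minimal sketch of a hybrid EAV-relational schema (illustrative names).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Relational part: a stable core entity with fixed, typed columns.
cur.execute("""
    CREATE TABLE patient (
        patient_id INTEGER PRIMARY KEY,
        enrolled   TEXT NOT NULL
    )""")

# EAV part: open-ended attributes stored as rows, not columns,
# so the schema stays static as research requirements change.
cur.execute("""
    CREATE TABLE observation (
        patient_id INTEGER REFERENCES patient(patient_id),
        attribute  TEXT NOT NULL,
        value      TEXT
    )""")

cur.execute("INSERT INTO patient VALUES (1, '2024-01-15')")
cur.executemany(
    "INSERT INTO observation VALUES (?, ?, ?)",
    [(1, "systolic_bp", "120"), (1, "smoker", "no")],
)

# Adding a new attribute is just a new row, not an ALTER TABLE.
cur.execute("INSERT INTO observation VALUES (1, 'hba1c', '5.4')")

rows = cur.execute(
    "SELECT attribute, value FROM observation WHERE patient_id = 1"
).fetchall()
print(rows)  # [('systolic_bp', '120'), ('smoker', 'no'), ('hba1c', '5.4')]
```

The trade-off is typical of EAV designs: queries over many attributes require pivoting rows back into columns, which is why the hybrid keeps heavily queried fields relational.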

    Streaming Analytics and Workflow Automation for DFS

    Researchers reuse data from past studies to avoid costly re-collection of experimental data. However, large-scale data reuse is challenging due to a lack of consensus on metadata representations among research groups and disciplines. Dataset File System (DFS) is a semi-structured data description format that promotes such consensus by standardizing the semantics of data description, storage, and retrieval. In this paper, we present analytic-streams, a specification for streaming data analytics with DFS, and streaming-hub, a visual programming toolkit built on DFS to simplify data analysis workflows. Analytic-streams facilitate higher-order data analysis with less computational overhead, while streaming-hub enables storage, retrieval, manipulation, and visualization of data and analytics. We discuss how they simplify data pre-processing, aggregation, and visualization, and their implications for data analysis workflows.
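The streaming-analytics style the abstract refers to can be illustrated with a generic generator-based pipeline; this is a hedged sketch of the general technique, and the function name is an assumption, not part of the DFS or analytic-streams API. Each analytic consumes a stream of samples and yields derived values incrementally, which keeps memory overhead bounded.

```python
from collections import deque

def sliding_mean(stream, window):
    """Yield the mean of the last `window` samples for each incoming sample.

    A bounded deque keeps memory constant regardless of stream length,
    which is the key property of streaming (as opposed to batch) analytics.
    """
    buf = deque(maxlen=window)
    for x in stream:
        buf.append(x)
        yield sum(buf) / len(buf)

# Usage: analytics compose lazily, so nothing is buffered beyond the window.
samples = [2, 4, 6, 8]
means = list(sliding_mean(samples, window=2))
print(means)  # [2.0, 3.0, 5.0, 7.0]
```

Because generators compose, a second analytic can consume `sliding_mean`'s output directly, mirroring the higher-order composition of analytics described above.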