
    Addendum to Informatics for Health 2017: Advancing both science and practice

    This article presents the presentation and poster abstracts that were mistakenly omitted from the original publication.

    Doctor of Philosophy

    Dissertation. Health information technology (HIT), in conjunction with quality improvement (QI) methodologies, can promote higher quality care at lower costs. Unfortunately, most inpatient hospital settings have been slow to adopt HIT and QI methodologies. Successful adoption requires close attention to workflow. Workflow is the sequence of tasks and processes, together with the set of people or resources needed for those tasks, that is necessary to accomplish a given goal. Assessing the impact on workflow is an important component of determining whether a HIT implementation will be successful, but little research has been conducted on the impact of eMeasure (electronic performance measure) implementation on workflow. One solution to implementation challenges such as the lack of attention to workflow is an implementation toolkit: an assembly of instruments such as checklists, forms, and planning documents. We developed an initial eMeasure Implementation Toolkit for the heart failure (HF) eMeasure to allow QI and information technology (IT) professionals and their teams to assess the impact of implementation on workflow. During the development phase of the toolkit, we undertook a literature review to determine the components of the toolkit. We conducted stakeholder interviews with HIT and QI key informants and subject matter experts (SMEs) at the US Department of Veterans Affairs (VA). Key informants provided a broad understanding of the context of workflow during eMeasure implementation. Using snowball sampling, we also interviewed additional SMEs recommended by the key informants, who suggested tools and provided information essential to the toolkit's development. The second phase involved evaluation of the toolkit for relevance and clarity by experts in non-VA settings, who assessed the sections of the toolkit containing the tools via a survey. The final toolkit provides a distinct set of resources and tools, iteratively developed during the research and available to users in a single source document. The research methodology provided a strong, unified, overarching implementation framework in the form of the Promoting Action on Research Implementation in Health Services (PARIHS) model, combined with a sociotechnical model of HIT, which strengthened the overall design of the study.
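
    As a purely illustrative aside (nothing below comes from the toolkit itself), the abstract's definition of workflow, a sequence of tasks plus the people and resources those tasks require, can be made concrete with a small data model. All task names, roles, and resources here are hypothetical:

        from dataclasses import dataclass, field

        @dataclass
        class Task:
            name: str
            performed_by: list[str]                              # roles responsible for the task
            resources: list[str] = field(default_factory=list)   # systems, forms, documents used

        @dataclass
        class Workflow:
            goal: str
            tasks: list[Task]     # ordered: the sequence of tasks matters

        # Hypothetical fragment of a heart-failure eMeasure reporting workflow.
        hf_emeasure = Workflow(
            goal="report the heart-failure eMeasure",
            tasks=[
                Task("document discharge medications", ["nurse"], ["EHR order entry"]),
                Task("extract eMeasure data elements", ["IT analyst"], ["EHR database"]),
                Task("review measure exceptions", ["QI professional"], ["exception report"]),
            ],
        )

    A model like this is only a convenience for discussion; it makes explicit the two things an implementation team must inventory before assessing how an eMeasure changes work: who performs each task, and what each task consumes.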

    THE RISE OF AI IN CONTENT MANAGEMENT: REIMAGINING INTELLIGENT WORKFLOWS

    As content management systems (CMS) become indispensable for managing digital experiences, AI integration promises new levels of automation and intelligence to streamline workflows. This paper surveys how AI techniques such as machine learning, natural language processing, computer vision, and knowledge graphs are transforming CMS capabilities across the content lifecycle. We analyze key use cases, including automated metadata tagging, natural language generation, smart recommendations, predictive search, personalized experiences, and conversational interfaces. The benefits include enhanced content discoverability, accelerated creation, improved optimization, simplified governance, and amplified team productivity. However, adoption remains low due to challenges such as opaque AI, poor workflow integration, unrealistic expectations, bias risks, and skills gaps. Strategic priorities include starting with focused pilots, evaluating multiple AI approaches, emphasizing transparent and fair AI models, and upskilling teams. Benefits are maximized through hybrid human-AI collaboration rather than full automation. While AI integration is maturing, the outlook is cautiously optimistic: leading CMS platforms are accelerating development of no-code AI tools, but mainstream adoption may take two to five years as skills and best practices evolve around transparent and ethical AI. Wise data practices, change management, and participatory design will be key. If implemented thoughtfully, AI can reimagine workflows by expanding human creativity rather than replacing it. The future points to creative synergies between empowered users and AI assistants, but pragmatic pilots, continuous improvement, and participatory strategies are necessary to navigate the hype and deliver value. The promise warrants measured experimentation.
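
    To ground one of the use cases above, the sketch below shows automated metadata tagging in its simplest possible form: term-frequency keyword extraction in plain Python. It is a toy, not any CMS vendor's API; production systems would use trained NLP models, and the stop-word list and thresholds here are arbitrary:

        import re
        from collections import Counter

        STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "for",
                      "is", "are", "on", "with", "across", "from", "that"}

        def suggest_tags(text: str, k: int = 5) -> list[str]:
            """Return the k most frequent non-stop-word terms as candidate tags."""
            words = re.findall(r"[a-z]+", text.lower())
            counts = Counter(w for w in words if w not in STOP_WORDS and len(w) > 3)
            return [term for term, _ in counts.most_common(k)]

        article = ("Headless CMS platforms decouple content storage from presentation, "
                   "letting teams reuse content across web, mobile, and voice channels.")
        print(suggest_tags(article))   # e.g. ['content', 'headless', 'platforms', ...]

    Even at this fidelity, the human-in-the-loop pattern the paper recommends is visible: the function suggests candidate tags, and an editor accepts or rejects them.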

    Automation and Adaptation: Information Technology, Work Practices, and Labor Demand at Three Firms

    The use of information and communication technology to automate routine tasks involves two types of innovation: technological and organizational. Together, improvements in technological capabilities and the complementary changes firms make in how they organize work and implement work practices constitute the conditions under which machines substitute for or complement human workers. Building on the prevailing model of routine-biased technical change and recent insights into organizational complementarities, I conduct three qualitative case studies in health care and real estate to assess the relationship between technology and firm-level labor demand. Unique combinations of technological innovation, organizational complementarity, and decision-making at each firm produce differential impacts on labor demand, with even similar technologies exhibiting quite different patterns of substitution for workers of all skill types. In addition, studying firm-level complementarities illuminates how and why the scope of the routine task may be growing, with particularly important implications for relatively higher-skill workers.

    Developing the Quantitative Histopathology Image Ontology: A case study using the hot spot detection problem

    Interoperability across data sets is a key challenge for quantitative histopathological imaging. There is a need for an ontology that can support effective merging of pathological image data with associated clinical and demographic data. To foster organized, cross-disciplinary, information-driven collaborations in the pathological imaging field, we propose to develop an ontology representing the imaging data and methods used in pathological imaging and analysis, which we call the Quantitative Histopathological Imaging Ontology (QHIO). We apply QHIO to breast cancer hot spot detection with the goal of enhancing the reliability of detection by promoting the sharing of data between image analysts.
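
    As a hint of what such an ontology enables, the sketch below builds a tiny RDF graph linking a detected hot spot to its slide image and detection method, using the rdflib library. Every class, property, and individual name here is a placeholder invented for illustration, not taken from the actual QHIO vocabulary:

        from rdflib import Graph, Literal, Namespace, RDF, RDFS

        QHIO = Namespace("http://example.org/qhio#")   # placeholder namespace, not the real IRI
        g = Graph()
        g.bind("qhio", QHIO)

        # Hypothetical class hierarchy: a hot spot is a kind of image region.
        g.add((QHIO.HotSpot, RDFS.subClassOf, QHIO.ImageRegion))

        # One hot spot detected in a breast-cancer slide image.
        g.add((QHIO.hotspot_01, RDF.type, QHIO.HotSpot))
        g.add((QHIO.hotspot_01, QHIO.detectedIn, QHIO.slide_image_42))
        g.add((QHIO.hotspot_01, QHIO.detectedBy, QHIO.ki67_density_method))
        g.add((QHIO.hotspot_01, QHIO.mitoticCount, Literal(17)))

        print(g.serialize(format="turtle"))   # shareable, merge-ready representation

    Because every statement is a triple against shared terms, two labs using the same vocabulary can merge their detection results mechanically, which is the interoperability problem the abstract targets.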

    Smart City: Concepts and two Relevant Components

    Over the last 30 years, definitions of the Smart City (SC) have changed and have meant different things to different people, yet there is still no universally accepted definition. This paper summarizes the existing relevant definitions and proposes a concept for characterizing the smartness of a city through intelligent planning and monitoring, guided by actionable information that underpins computer-assisted decisions and institutional digital transformation. As a practical approach, the SC concept is illustrated through two components: spatial urban and territorial planning, and cultural heritage via virtual exhibitions. The article presents a schematic diagram of cross-sectoral interactions between stakeholders grouped by role and the expected impact of these interactions, a proposed functional system architecture for cultural heritage digital transformation, and concrete steps for implementing virtual exhibitions.

    Design considerations for workflow management systems use in production genomics research and the clinic

    The changing landscape of genomics research and clinical practice has created a need for computational pipelines capable of efficiently orchestrating complex analysis stages while handling large volumes of data across heterogeneous computational environments. Workflow Management Systems (WfMSs) are the software components employed to fill this gap. This work provides an approach to, and a systematic evaluation of, key features of popular bioinformatics WfMSs in use today: Nextflow, CWL, and WDL (and some of their executors), along with Swift/T, a workflow manager commonly used in high-scale physics applications. We employed two use cases: a variant-calling genomic pipeline and a scalability-testing framework, both run locally, on an HPC cluster, and in the cloud. This allowed the four WfMSs to be evaluated in terms of language expressiveness, modularity, scalability, robustness, reproducibility, interoperability, and ease of development, along with adoption and usage in research labs and healthcare settings. This article attempts to answer the question: which WfMS should be chosen for a given bioinformatics application, regardless of analysis type? The choice of a given WfMS is a function of both its intrinsic language and engine features. Within bioinformatics, where analysts are a mix of dry- and wet-lab scientists, the choice is also governed by collaborations and adoption within large consortia and by the technical support provided by the WfMS team and community. As the community and its needs continue to evolve along with computational infrastructure, WfMSs will also evolve, especially those with permissive licenses that allow commercial use. In much the same way as the dataflow paradigm and containerization are now well understood to be very useful in bioinformatics applications, we will continue to see innovations in tools and utilities for other purposes, such as big data technologies, interoperability, and provenance.
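
    To illustrate the dataflow paradigm the abstract refers to, the core idea behind all of these systems, here is a deliberately minimal engine in plain Python rather than in any real WfMS language. The tasks, commands, and file names are invented; a real WfMS layers containerization, scheduling, retries, and provenance on top of the same rule:

        import subprocess
        from pathlib import Path

        # Each task: shell command template, declared input files, output file.
        TASKS = {
            "align": ("echo aligned-reads > {out}", [], "sample.bam"),
            "sort":  ("cat {inp} > {out}", ["sample.bam"], "sample.sorted.bam"),
            "call":  ("cat {inp} > {out}", ["sample.sorted.bam"], "sample.vcf"),
        }

        def run_workflow(tasks):
            """Dataflow rule: run a task as soon as all its declared inputs exist."""
            done = set()
            while len(done) < len(tasks):
                progressed = False
                for name, (cmd, inputs, output) in tasks.items():
                    if name not in done and all(Path(p).exists() for p in inputs):
                        inp = inputs[0] if inputs else ""   # sketch handles single-input tasks only
                        subprocess.run(cmd.format(inp=inp, out=output), shell=True, check=True)
                        done.add(name)
                        progressed = True
                if not progressed:
                    raise RuntimeError("workflow stalled: some inputs can never be produced")

        run_workflow(TASKS)   # produces sample.bam -> sample.sorted.bam -> sample.vcf

    The point of the sketch is that the pipeline is driven by data availability rather than by an explicit call order, which is what lets real WfMSs parallelize independent stages and resume cleanly after failures.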