
    A Framework for Learning During an Honors Internship: Embold Health Inc.

    Embold Health Inc. is a health care analytics start-up that identifies top-performing providers using large data sets and clinically nuanced measurement. Embold Health delivers a comprehensive assessment of quality, appropriateness, and cost, giving payers more control in choosing providers within their region. Identifying top provider behaviors through rigorous, data-driven measurement is significant both clinically and commercially: it provides the opportunity to increase the value of care, decrease waste, and drive down cost. Over the past year as an intern, I've had a first-hand look into the complexity that surrounds data-driven health care improvement.

    A maturity model for the information-driven SME

    Purpose: This article presents a maturity model for evaluating the information-driven decision-making process (DMP) in small and medium enterprises (SMEs), called the "Simplified Holistic Approach to DMP Evaluation" (SHADE). The SHADE model is based on the "Circumplex Hierarchical Representation of the Organization Maturity Assessment" (CHROMA) framework for characterizing the information-driven DMP in organizations. Design/methodology/approach: CHROMA-SHADE provides a methodology for evaluating an SME's competency in using data to make better-informed decisions. The model groups the main factors influencing the information-driven DMP into five dimensions: data availability, data quality, data analysis and insights, information use, and decision-making. It addresses these dimensions systematically, delivering a framework for positioning the organization on a scale from an uninitiated to a completely embedded stage. The assessment consists of interviews with key company personnel based on a standardized open-ended questionnaire, followed by an analysis and scoring of the answers by an expert evaluator. Findings: The results of its application indicate that the model is well adapted to SMEs and useful for identifying strengths and weaknesses, thereby providing insights for prioritizing improvement actions. Originality/value: The CHROMA-SHADE model follows a novel, holistic approach that embraces the complexities inherent in the multiplicity of technological and management factors that converge to enable more objective, better-supported decisions through the intelligent use of information. Peer reviewed.
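The five dimensions and the uninitiated-to-embedded scale suggest a simple aggregation step. The sketch below is a hypothetical illustration of such a scoring rule only: the 0-4 scale, the averaging, and the intermediate stage names are assumptions, not the published CHROMA-SHADE rubric.

```python
# Hypothetical SHADE-style aggregation: average expert scores over the five
# dimensions named in the abstract and map the result to a maturity stage.
# The 0-4 scale, averaging rule and intermediate stage names are assumptions.

DIMENSIONS = [
    "data availability",
    "data quality",
    "data analysis and insights",
    "information use",
    "decision-making",
]

STAGES = ["uninitiated", "initiated", "developing", "advanced", "embedded"]

def maturity_stage(scores):
    """Map per-dimension scores on a 0-4 scale to an overall stage label."""
    missing = set(DIMENSIONS) - set(scores)
    if missing:
        raise ValueError(f"missing dimensions: {sorted(missing)}")
    avg = sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)
    return STAGES[min(round(avg), len(STAGES) - 1)]

print(maturity_stage({d: 2.0 for d in DIMENSIONS}))  # developing
```

In practice the expert evaluator's per-question scores would feed the per-dimension values; the point of the sketch is only that five dimension scores collapse to one position on the maturity scale.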

    Invest for Impact: Maximizing the Impact of W.K. Kellogg Foundation's Mission Driven Investment Portfolio

    Impact investing is an exciting, fast-growing field that holds promise to generate both social and financial value. However, its full potential has yet to be realized. For that to happen, impact investors need to adopt a systems perspective when allocating capital; only then can they catalyse change with the potential to produce disruptive change. The imperative to disrupt and catalyse positive change has guided the W.K. Kellogg Foundation (WKKF) to expand its innovative work to deploy more of its endowment for impact through the creation of its Mission Driven Investment portfolio. The Invest for Impact Framework, introduced in this report, is a management and governance tool created by KKS Advisors on behalf of WKKF that seeks to help mission-driven investors in four ways: first, as an assessment tool for systems-level impacts during investment due diligence; second, as a diagnostic tool post-investment for understanding portfolio-level interactions and synergies across investments; third, as a monitoring tool for assessing whether investees are delivering the expected impact; and fourth, as a collaboration tool giving teams working across investment and grant-making functions a platform for cross-learning and information exchange. While tremendous data deficiencies plague the field, we find promising pathways for aligning mission and investments with rigor. The framework is therefore evidence-based and research-driven. It focuses on the assessment of four categories: alignment and evidence, measurement quality, scalability of impact, and potential for disruption. We apply the framework to three investees to verify and showcase its practicality and decision usefulness. This could not have been done without the help of the investees, who provided us with valuable insights, data and their time. The framework can enable all types of mission-oriented investors to better weigh alternative opportunities for impact in the marketplace. Although originally developed for WKKF's Mission Driven Investment Portfolio, the learnings and approach are designed to be applicable to any impact investor. We hope you will find it useful.

    Virtual OSCE Delivery and Quality Assurance During a Pandemic: Implications for the Future

    Background: During 2020, the COVID-19 pandemic caused worldwide disruption to the delivery of clinical assessments, requiring medical schools to rapidly adjust the design of established tools. Derived from the traditional face-to-face Objective Structured Clinical Examination (OSCE), the virtual OSCE (vOSCE) was delivered online using a range of school-dependent designs. The quality of these new formats was evaluated remotely through virtual quality assurance (vQA). This study synthesizes the vOSCE and vQA experiences of stakeholders from participating Australian medical schools based on a Quality framework. Methods: This study utilized a descriptive phenomenological qualitative design. Focus group discussions (FGDs) were held with 23 stakeholders, including examiners, academics, simulated patients, professional staff, students and quality assurance examiners. The data were analyzed using a theory-driven conceptual Quality framework. Results: The vOSCE was perceived as a relatively fit-for-purpose assessment during pandemic physical-distancing mandates. It was also identified as value-for-money and noted to provide procedural benefits that led to an enhanced experience for those involved. However, despite being largely delivered fault-free, the current designs are limited in the scope of skills they can assess, and thus do not meet the established quality of the traditional OSCE. Conclusions: Whilst virtual clinical assessments are limited in their scope for assessing clinical competency compared with the traditional OSCE, their integration into programs of assessment has significant potential. Scholarly review of stakeholder experiences has elucidated quality aspects that can inform iterative improvements to the design and implementation of future vOSCEs.

    Community infrastructure and repository for marine magnetic identifications

    Magnetic anomaly identifications underpin plate tectonic reconstructions and form the primary data set from which the age of the oceanic lithosphere and seafloor spreading regimes in the ocean basins can be determined. Although these identifications are an invaluable resource, their usefulness to the wider scientific community has been limited by the lack of a central community infrastructure to organize, host, and update these interpretations. We have developed an open-source, community-driven online infrastructure as a repository for quality-checked magnetic anomaly identifications from all ocean basins. We provide a global sample data set comprising 96,733 individually picked magnetic anomaly identifications organized by ocean basin and publication reference, with accompanying Hellinger-format files where available. Our infrastructure is designed to facilitate research in plate tectonic reconstructions, or research that relies on an assessment of plate reconstructions, for experts and nonexperts alike. To further enhance the existing repository and strengthen its value, we encourage others in the community to contribute to this effort.
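As an illustration of how a consumer might slice such a repository, the snippet below groups pick records by ocean basin and by publication reference. The field names and example records are invented for this sketch; the actual archive defines its own schema and file formats.

```python
# Toy picks table grouped by ocean basin and by publication reference.
# Field names ("basin", "chron", "ref", ...) and values are assumptions for
# this sketch, not the repository's real schema.
from collections import defaultdict

picks = [
    {"basin": "South Atlantic", "chron": "C34n", "ref": "Example1988", "lat": -30.1, "lon": -14.2},
    {"basin": "South Atlantic", "chron": "C21n", "ref": "Example1988", "lat": -28.7, "lon": -12.9},
    {"basin": "Indian", "chron": "C34n", "ref": "Example1991", "lat": -35.4, "lon": 78.0},
]

def group_by(field, rows):
    """Bucket rows by the value of `field`."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[field]].append(row)
    return dict(groups)

by_basin = group_by("basin", picks)
print({basin: len(rows) for basin, rows in by_basin.items()})
# {'South Atlantic': 2, 'Indian': 1}
```

The same grouping by `ref` recovers the per-publication organization the abstract describes.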

    A national harmonised data collection network for neurodevelopmental disorders: A transdiagnostic assessment protocol for neurodevelopment, mental health, functioning and well-being

    BACKGROUND: Children with neurodevelopmental disorders share common phenotypes, support needs and comorbidities. Such overlap suggests the value of transdiagnostic assessment pathways that contribute to knowledge about the research and clinical needs of these children and their families. Despite this, large transdiagnostic data collection networks for neurodevelopmental disorders are not well developed. This paper describes the development of a nationally supported transdiagnostic clinical and research assessment protocol across Australia. The vision is to establish a harmonised network for data collection and collaboration that promotes transdiagnostic clinical practice and research. METHODS: Clinicians, researchers and community groups across Australia were consulted using surveys and national summits to identify assessment instruments and unmet needs. A national research committee was formed and, using a consensus approach, selected assessment instruments according to pre-determined criteria to form a harmonised transdiagnostic assessment protocol. RESULTS: Identified assessment instruments were clustered into domains of transdiagnostic assessment need: child functioning/quality of life, child mental health, caregiver mental health, and family background information. From these, the research committee identified a core set of nine measures and an extended set of 14 measures that capture the domains, with potential for further modification as recommended by clinicians, researchers and community members. CONCLUSION: The protocol proposed here was established through a strong partnership between clinicians, researchers and the community. It will enable (i) consensus-driven transdiagnostic clinical assessments for children with neurodevelopmental disorders, and (ii) research studies that will build large transdiagnostic datasets across neurodevelopmental disorders, which can be used to inform research and policy beyond narrow diagnostic groups. The long-term vision is to use this framework to facilitate collaboration across clinics to enable large-scale data collection and research. Ultimately, the transdiagnostic assessment data can be used to inform practice and improve the lives of children with neurodevelopmental disorders and their families.

    Scalable Quality Assessment of Linked Data

    In a world where the information economy is booming, poor data quality can have adverse consequences, including social and economic problems such as decreased revenue. Furthermore, data-driven industries do not rely only on their own (proprietary) data silos, but also continuously aggregate data from different sources. This aggregation could then be re-distributed back to "data lakes". However, this data (including Linked Data) is not necessarily checked for quality prior to use. Large volumes of data are exchanged in a standard, interoperable format between organisations and published as Linked Data to facilitate re-use. Some organisations, such as government institutions, go a step further and open their data; the Linked Open Data Cloud is a witness to this. However, as with data in data lakes, it is challenging to determine the quality of this heterogeneous data, and subsequently to make this information explicit to data consumers. Despite the availability of a number of tools and frameworks for assessing Linked Data quality, current solutions do not offer a holistic approach that both enables the assessment of datasets and provides consumers with quality results that can be used to find, compare and rank datasets' fitness for use. In this thesis we investigate methods to assess the quality of (possibly large) linked datasets so that data consumers can use the assessment results to find datasets that are fit for use, that is, the right dataset for the task at hand. Moreover, the benefits of quality assessment are two-fold: (1) data consumers do not need to rely blindly on subjective measures to choose a dataset, but can base their choice on multiple factors such as the intrinsic structure of the dataset, fostering trust and reputation between publishers and consumers on more objective foundations; and (2) data publishers are encouraged to improve their datasets so that they can be re-used more. Furthermore, our approach scales to large datasets. In this regard, we also look into improving the efficiency of quality metrics using various approximation techniques. The trade-off is that consumers do not get the exact quality value, but a close estimate that still provides the required guidance towards fitness for use. The central point of this thesis is not data quality improvement; nonetheless, we still need to understand what data quality means to the consumers who are searching for potential datasets. This thesis examines the challenges of detecting quality problems in linked datasets and of presenting quality results in a standardised, machine-readable and interoperable format that agents can interpret to help human consumers identify datasets fit for use. Our approach is consumer-centric in that it aims at (1) making the assessment of quality as easy as possible, allowing stakeholders, possibly non-experts, to identify and easily define quality metrics and to initiate the assessment; and (2) making results (quality metadata and quality reports) easy for stakeholders to understand, or at least interoperable with other systems to facilitate a possible data quality pipeline. Finally, our framework is used to assess the quality of a number of heterogeneous (large) linked datasets, where each assessment returns a quality metadata graph that can be consumed by agents as Linked Data. In turn, these agents can intelligently interpret a dataset's quality with regard to multiple dimensions and observations, and thus provide further insight to consumers regarding its fitness for use.
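One way to picture the approximation idea is metric estimation by sampling: compute a quality metric over a random sample of triples rather than a full scan. The snippet below is a toy sketch under that assumption; the example metric (well-formed xsd:integer literals) and the tuple layout are invented here and are not the thesis's actual framework or metrics.

```python
# Estimate a quality metric from a reservoir sample of triples instead of a
# full scan: the result is approximate, trading exactness for speed.
# The (s, p, o, datatype) layout and the example check are assumptions.
import random

def sample_metric(triples, check, k=1000, seed=42):
    """Reservoir-sample up to k items and return the fraction passing `check`."""
    rng = random.Random(seed)
    reservoir = []
    for i, t in enumerate(triples):
        if i < k:
            reservoir.append(t)
        else:
            j = rng.randint(0, i)
            if j < k:
                reservoir[j] = t  # replace with decreasing probability
    return sum(1 for t in reservoir if check(t)) / len(reservoir)

def valid_integer_literal(triple):
    """Quality check: objects typed xsd:integer must actually parse as ints."""
    _s, _p, obj, dtype = triple
    if dtype != "xsd:integer":
        return True  # nothing to validate for other datatypes
    try:
        int(obj)
        return True
    except ValueError:
        return False
```

When k is at least the dataset size the estimate is exact; below that it is a close estimate whose accuracy improves with sample size, which mirrors the exact-value-versus-estimate trade-off the abstract describes.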

    Assessment of river water quality using an expert system

    This study develops an expert system for assessing river water quality, enabling a user to arrive at a decision equivalent to an expert's. The ES-RWQ was developed as a simple yet reliable tool to assist users in assessing the status of water quality and relating it to the sources contributing to the water quality problem. The main objective was to build a prototype of ES-RWQ with an assessment method and the ability to derive a recommendation on the water quality problem. While conventional models can provide insight into a quality problem, which is complex in itself, an effective method of extracting value-added information from all sources to support decisions on cost-effective pollution prevention and control measures still requires expert consultation. A re-examination of the water quality management process is therefore critical and timely, to ensure that planned strategies and actions lead to measurable water quality improvement. Reliable assessment tools are needed to communicate water quality data effectively, so that the data become an important part of the solution-finding and decision-making process. ES-RWQ aims to assist decision makers and water quality managers in choosing the most appropriate action when confronting a situation that requires an immediate response. ES-RWQ was developed using the Visual Basic programming language and consists of a user interface, a knowledge base, and an inference engine. The user interface can be constructed using menus or natural language as the communication mode between user and system. The load duration curve was identified as an assessment tool that offers a practical approach to watershed management. The duration curve adds value through its ability to perform quick, reliable statistical analysis of data, targeting the sources and linking potential implementation efforts to the hydrologic condition of the watershed. The assessment of water quality was based on a pollutant loading model establishing the load capacity of the river, which determines a specific limit on the pollutants the river can receive. Tools were selected for their combination of prediction reliability, ease of use and reduced requirement for field data. A prototype was developed using Microsoft Visual Basic and validated on two sub-catchments from Melaka and one sub-catchment of the upper Sungai Langat, Selangor. Four variables under different river classification standards, Biochemical Oxygen Demand (BOD), Chemical Oxygen Demand (COD), Ammoniacal Nitrogen (AN) and Total Suspended Solids (TSS), were also selected to validate the ES-RWQ. Twenty years of flow data, the variables, and other information on the sub-catchments were input into the prototype to exercise the knowledge base and inference engine and confirm that the system produced the expected results. ES-RWQ produced a load duration curve as the assessment of water quality status, with a graphical representation, a map extracted from the geographic information system, and a recommendation for controlling the pollution sources identified. When the sub-catchment information was processed, the system performed to its design specification. A test run with the user agencies was conducted on three rivers: Sg. Pinang, Pulau Pinang; Sg. Jelai, Pahang; and Sg. Kanowit, Sarawak. The user agencies accepted ES-RWQ as an assessment tool, and loading capacity as a new approach to determining the status of water quality and identifying sources of pollution. An effective knowledge management tool such as ES-RWQ will enhance the decision-making process, employing computer-based technology to capture human expert knowledge as an added tool for river water quality management.
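The load-capacity calculation at the heart of that assessment can be sketched briefly. The snippet below builds a load duration curve from flow records and a class-standard concentration; the Weibull plotting position and the 3 mg/L BOD example standard are illustrative assumptions, not ES-RWQ's actual class standards or implementation (which is in Visual Basic).

```python
# Minimal load-duration-curve sketch: rank flows, assign flow-exceedance
# percentiles, and convert flow x standard concentration into an allowable
# daily load. The 3 mg/L example standard is an assumption for illustration.

def load_duration_curve(flows_m3s, std_mg_per_l):
    """Return (exceedance %, allowable load kg/day) pairs for ranked flows.

    Unit conversion: m3/s * mg/L * 86.4 = kg/day
    (1 m3/s = 1000 L/s; mg/L * 1000 L/s = 1 g/s; 86400 s/day -> 86.4 kg/day).
    """
    ranked = sorted(flows_m3s, reverse=True)
    n = len(ranked)
    curve = []
    for rank, q in enumerate(ranked, start=1):
        exceedance = 100.0 * rank / (n + 1)  # Weibull plotting position
        curve.append((exceedance, q * std_mg_per_l * 86.4))
    return curve

# Example: five flow observations (m3/s) against a hypothetical 3 mg/L
# BOD standard; the highest flow yields the largest allowable load.
curve = load_duration_curve([12.0, 4.5, 8.2, 20.1, 2.3], 3.0)
```

Plotting observed pollutant loads against this curve shows where loading exceeds capacity at a given flow regime, which is how the duration curve links sources to the hydrologic condition of the watershed.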