
    Employing dynamic fuzzy membership functions to assess environmental performance in the supplier selection process

    The proposed system illustrates that fuzzy logic can be used to aid management in assessing a supplier's environmental performance in the supplier selection process. A user-centred hierarchical system employing scalable fuzzy membership functions implements human priorities in the supplier selection process, with particular focus on a supplier's environmental performance. Traditionally, when evaluating supplier performance, companies have considered criteria such as price, quality and flexibility. These criteria are of varying importance to individual companies, depending on their own specific objectives. However, with environmental pressures increasing, many companies have begun to give more attention to environmental issues and, in particular, to their suppliers' environmental performance. The framework presented here was developed to introduce environmental criteria efficiently into the existing supplier selection process and to reflect their relative importance to individual companies. The system attempts to simulate the human preference given to particular supplier selection criteria, with particular focus on environmental issues. It considers environmental data from multiple aspects of a supplier's business and, based on the relative impact this will have on a buying organization, reaches a decision on the suitability of the supplier. This enables a particular supplier's strengths and weaknesses to be considered, as well as their significance and relevance to the buying organization. Peer reviewed.
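The abstract does not give the authors' membership functions, but the basic mechanism it describes can be sketched as follows: triangular fuzzy membership functions score each environmental criterion, and buyer-assigned weights encode human priorities. All criterion names, weights, and breakpoints below are hypothetical illustrations, not the paper's system.

```python
def triangular(x, a, b, c):
    """Triangular membership function rising over [a, b], falling over [b, c]."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Fuzzy set for the linguistic term "good" on a 0-100 score (hypothetical).
GOOD = (50.0, 75.0, 100.0)

# Hypothetical environmental criteria with buyer-assigned priority weights.
criteria = {
    "emissions_score": 0.5,
    "recycling_score": 0.3,
    "certification_score": 0.2,
}

def environmental_rating(supplier):
    """Weighted degree to which each criterion is 'good' (0..1)."""
    return sum(
        weight * triangular(supplier[name], *GOOD)
        for name, weight in criteria.items()
    )

supplier = {"emissions_score": 80.0, "recycling_score": 60.0,
            "certification_score": 90.0}
print(round(environmental_rating(supplier), 3))  # → 0.6
```

A real system of this kind would typically layer such scores hierarchically, aggregating criterion-level memberships into category-level and then overall supplier suitability.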

    Creating national weights for a patient-level longitudinal database

    Onur Başer (MEF Author). The objective was to create a nationally representative estimate from longitudinal data by controlling for sociodemographic factors and health status. The Agency for Healthcare Research and Quality's (AHRQ) Medical Expenditure Panel Survey (MEPS) was used as the basis for the adjustment methodology. MEPS is a data source representing health insurance coverage, cost and utilization, and comprises several large-scale surveys of families, individuals, employers, and health care providers. Using these data, we created subset populations. We then used multivariate logistic regression to construct demographics- and case-mix-based weights, which were applied to create a population sample similar to the national population. The weight was derived using an inverse probability weighting approach, as well as a raking mechanism. We compared the results with the projected number of persons in the US population in the same categories to examine the validity of the weights. The following variables were used in the logistic regression: age group, gender, race, location, income level and health status (Charlson Comorbidity Index scores and chronic condition diagnosis). Relative to MEPS data, patients included in the private insurance data were more likely to be male, older, to have a chronic condition, and to be white (p=0.0000). Adjusted weighted values for patients in the commercial group ranged from 15.47 to 36.36 (median: 16.91). Commercial insurance and MEPS data populations were similar in terms of their socioeconomic and clinical categories. As an outcomes measure, the predicted annual number of patients with prescription claims from private insurance data was 6 963 034. The annual number of statin users was predicted as 6 709 438 using weighted MEPS data. National projections of large-scale patient longitudinal databases require adjustment for demographic factors and case-mix differences related to health status.
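The raking step mentioned in the abstract is iterative proportional fitting: sample cell weights are rescaled until the weighted margins match national (e.g. MEPS-derived) control totals. A minimal sketch on a hypothetical 2x2 table (say, age group by gender), with made-up counts and targets:

```python
def rake(table, row_targets, col_targets, iters=100):
    """Iterative proportional fitting: rescale table cells until
    row and column sums match the given control totals."""
    t = [row[:] for row in table]
    for _ in range(iters):
        # Scale each row to match its target total (e.g. age-group counts).
        for i, target in enumerate(row_targets):
            s = sum(t[i])
            t[i] = [v * target / s for v in t[i]]
        # Scale each column to match its target total (e.g. gender counts).
        for j, target in enumerate(col_targets):
            s = sum(t[i][j] for i in range(len(t)))
            for i in range(len(t)):
                t[i][j] *= target / s
    return t

sample = [[30.0, 70.0], [60.0, 40.0]]          # hypothetical sample cells
adjusted = rake(sample, row_targets=[120.0, 80.0], col_targets=[90.0, 110.0])
print([[round(v, 1) for v in row] for row in adjusted])
```

In the paper's workflow this would follow the logistic-regression step, which supplies the inverse-probability base weights that raking then calibrates to national margins.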

    Development of a Short Trauma Screening Tool (STST) to Measure Child Trauma Symptoms: Establishing Content Validity

    Purpose: The purpose of the study was to identify major symptom domain variables common to child trauma and create a prototype short trauma symptom screening tool (STST) intended for use in pediatric medical settings. Methods: This manuscript describes the first two phases of an ongoing prospective mixed-method instrument development study. Phase 1 exploratory factor analysis was conducted with an archived LONGSCAN CBCL dataset to: (1) identify behavioral symptoms endorsed by children with known trauma exposure; and (2) generate a preliminary STST item pool. During Phase 2, researchers convened an expert panel (N = 10) and conducted Content Validity Index (CVI) procedures with the 20-item preliminary STST item pool to further inform item retention, elimination and modification for an updated prototype STST. Findings: Expert quantitative scores yielded a CVI of 0.90 for the overall preliminary STST. The first two phases of this study assisted researchers in identifying 12 items that represent nine child trauma symptom domain variables: (1) aggression/anger; (2) anxiety/fear; (3) sexual concerns; (4) elimination concerns; (5) somatic concerns; (6) depression; (7) dissociation; (8) physical acting out; and (9) dysregulation. Conclusions: The first two phases of STST development resulted in a brief, empirically derived prototype screening tool that features 12 items operationalizing nine domains of child trauma symptoms. Developers can now advance to the next phase of STST development: feasibility assessment and psychometric testing.
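The CVI figure reported above is easy to make concrete. Under the common (Lynn-style) definition, item-level CVI is the proportion of experts rating an item relevant (3 or 4 on a 4-point relevance scale), and the scale-level CVI averages the item CVIs. The ratings below are hypothetical, not the study's panel data:

```python
def item_cvi(ratings):
    """Share of experts rating the item relevant (>= 3 on a 1-4 scale)."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

def scale_cvi(items):
    """Scale-level CVI: mean of the item-level CVIs."""
    return sum(item_cvi(r) for r in items) / len(items)

# Three hypothetical items, each rated by a ten-expert panel.
ratings = [
    [4, 4, 3, 4, 3, 4, 4, 3, 4, 4],  # 10/10 relevant -> item CVI 1.0
    [4, 3, 3, 2, 4, 4, 3, 4, 3, 4],  #  9/10 relevant -> item CVI 0.9
    [2, 3, 4, 3, 2, 4, 3, 3, 4, 2],  #  7/10 relevant -> item CVI 0.7
]
print(round(scale_cvi(ratings), 3))
```

With a ten-expert panel as in the study, an overall CVI of 0.90 means that, averaged across items, nine of ten experts judged each item relevant.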

    Comparing Design Ground Snow Load Prediction in Utah and Idaho

    Snow loads in the western United States are largely undefined due to complex geography and climates, leaving the individual states to publish detailed studies for their regions, usually through the local Structural Engineers Associations (SEAs). These associations are typically made up of engineers not formally trained to develop or evaluate spatial statistical methods for their regions, and there is little guidance from ASCE 7. Furthermore, little has been written to compare the independently developed design ground snow load prediction methods used by various western states. This paper addresses this topic by comparing the accuracy of a variety of spatial methods for predicting 50-year (i.e., design) ground snow loads in Utah and Idaho. These methods include, among others, the current Utah snow load equations, Idaho's normalized ground snow loads based on inverse distance weighting, two forms of kriging, and the authors' adaptation of the Parameter-elevation Relationships on Independent Slopes Model (PRISM). The accuracy of each method is evaluated by measuring the mean absolute error using 10-fold cross validation on data sets obtained from Idaho's 2015 snow load report, Utah's 1992 snow load report, and a new Utah ground snow load data set. The results show that regression-based kriging and PRISM methods have the lowest cross-validated errors across all three data sets. They also show that normalized ground snow loads, which are a common way of accounting for elevation in traditional interpolation methods, do not fully account for the effect of elevation on ground snow loads within the considered data sets. The methodologies and cautions outlined in this paper provide a framework for an objective comparison of snow load estimation methods for a given region as state SEAs look to improve their future design ground snow load predictions. Such comparisons will aid states looking to amend or improve their current ground snow load requirements.
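The evaluation protocol described above, mean absolute error under 10-fold cross validation, can be sketched generically. The snippet below uses simple inverse distance weighting (one of the families the paper compares, though not the states' exact published methods) on hypothetical (x, y, load) stations; the station data and trend are illustrative only.

```python
import math

def idw(stations, x, y, power=2.0):
    """Inverse-distance-weighted estimate of ground snow load at (x, y)."""
    num = den = 0.0
    for sx, sy, load in stations:
        d = math.hypot(x - sx, y - sy)
        if d == 0.0:
            return load            # exact hit on a station
        w = d ** -power
        num += w * load
        den += w
    return num / den

def kfold_mae(stations, k=10):
    """Mean absolute error of IDW under k-fold cross validation."""
    folds = [stations[i::k] for i in range(k)]
    errors = []
    for i, held_out in enumerate(folds):
        train = [s for j, f in enumerate(folds) if j != i for s in f]
        for sx, sy, load in held_out:
            errors.append(abs(idw(train, sx, sy) - load))
    return sum(errors) / len(errors)

# Hypothetical stations: load rises with x to mimic an elevation trend.
stations = [(float(i), float(i % 5), 10.0 + 2.0 * i) for i in range(30)]
print(round(kfold_mae(stations, k=10), 3))
```

Swapping `idw` for a kriging or PRISM-style predictor while keeping `kfold_mae` fixed is exactly the kind of like-for-like comparison the paper advocates.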

    Dispersion monitoring for high-speed WDM networks via two-photon absorption in a semiconductor microcavity

    Due to the continued demand for bandwidth, network operators have to increase the data rates at which individual wavelengths operate. As these data rates will exceed 100 Gbit/s in the next 5-10 years, it will be crucial to be able to monitor and compensate for the amount of chromatic dispersion encountered by individual wavelength channels. This paper focuses on the use of the novel nonlinear optical-to-electrical conversion process of two-photon absorption (TPA) for dispersion monitoring. By incorporating a specially designed semiconductor microcavity, the TPA response becomes wavelength dependent, thus allowing simultaneous channel selection and monitoring without the need for external wavelength filtering.
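The monitoring principle behind TPA can be sketched numerically: the TPA photocurrent scales with the square of instantaneous optical power, so dispersion-induced pulse broadening (lower peak power at constant pulse energy) reduces the time-integrated TPA signal. The Gaussian pulse model and numbers below are illustrative assumptions, not taken from the paper.

```python
import math

def tpa_signal(peak_power, width, n=2001, span=10.0):
    """Time-integrated P(t)^2 for a Gaussian pulse (arbitrary units).
    TPA photocurrent is proportional to instantaneous power squared."""
    dt = 2.0 * span * width / (n - 1)
    total = 0.0
    for i in range(n):
        t = -span * width + i * dt
        p = peak_power * math.exp(-(t / width) ** 2)
        total += p * p * dt
    return total

energy = 1.0
for width in (1.0, 2.0, 4.0):
    # Gaussian pulse energy: E = P0 * width * sqrt(pi), so fixing E
    # means the peak power drops as the pulse broadens.
    peak = energy / (width * math.sqrt(math.pi))
    print(width, round(tpa_signal(peak, width), 4))
```

Analytically the integrated signal is E^2 / (width * sqrt(2*pi)) for this model, so doubling the dispersion-broadened width halves the TPA response; measuring that drop is what makes TPA usable as a dispersion monitor.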